Why would my LOPimport camera have a 100x multiplier on FL and aperture?
No scaling
This is it (thanks @.grahama )
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20264701/21/26/image.png
General PSA when rendering VDBs with Karma (XPU/CPU) or the Vulkan Viewport
Currently when rendering VDB volumes in a Houdini renderer, Houdini will read the entire VDB file from disk regardless of the number of fields within the VDB that are actually used.
Say, for example, when working in SOPs you exported a VDB with the following fields to disk:
- `density` (200MB)
- `temperature` (200MB)
- `scatter` (300MB)
- `vel` (350MB)
- `rest` (300MB)
- `flame` (150MB)
(1500MB file in total)
However, on the USD Stage, either through pruning or selective loading with a Volume LOP, your final stage looks like:
```
/fx/geo/explosion/ [ Volume ]
    density [OpenVDBAsset]
    vel [OpenVDBAsset]
    scatter [OpenVDBAsset]
```
Since only 850MB of data is needed to render, ideally that is all that would be loaded from the VDB file (since the format supports random access). However, with Karma / Vulkan this isn't the case and all the fields will be read from disk, which can cause a lot of extra network I/O.
As for other renderers:
- RenderMan 26 will only read the fields from disk that are referenced on the stage. (850MB)
- V-Ray 7 will only read the fields from disk that are referenced on the stage and used within the volume shaders. (850MB)
tl;dr – Make sure you only save the VDB fields you intend to render with; pruning on the stage doesn’t reduce I/O with Karma / Vulkan.
Technically you could have one field per VDB file and assemble them under one Volume prim on the stage; Karma would be okay with that, and you’d only get I/O for what’s actually on the stage. However, other renderers (V-Ray especially) will have an utter shit-fit if your fields are spread across multiple VDBs, so it’s not really recommended.
(This was verified by using a file page monitor on Linux)
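Following the tl;dr, here’s a minimal sketch of what “only save the fields you intend to render with” could look like in a Python SOP dropped just before the file cache. The kept field names match the example above; the node placement and the reliance on the VDB prims’ `name` attribute are assumptions, not something from the original post.
```python
# Hypothetical Python SOP: delete VDB primitives for fields we never intend
# to render, so the cached .vdb only contains what the stage will reference.
import hou

node = hou.pwd()
geo = node.geometry()

keep = {"density", "vel", "scatter"}  # fields referenced on the stage

to_delete = [
    prim for prim in geo.prims()
    if prim.stringAttribValue("name") not in keep  # VDB prims carry the field name in @name
]
geo.deletePrims(to_delete)
```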
I started looking into this: how to make editing material parameters user friendly rather than the ugly flat list. Found this in the docs, but I got waylaid and didn’t get back to working out how I can get that created for materials already authored without it, or for an HDA material, for example.
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20251809/15/25/IMG_7793.png
@eckxter This is how I usually set up an animated .bgeo sequence with the Geometry Clip Sequence node. I’ve left some notes in the hipfile that should hopefully note any gotchas. I’m curious if anyone else does it any differently or has more info around the process. If so then I’d be really keen to hear it. I’m also curious if @erikovic has a different process.
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20254206/26/25/geo_clip_sequence_example.hiplc
this 👆 (changed my life)
Just sharing something that annoyed me for too long: if you work on a small team that uses Dropbox as the project server (potentially other platforms too?), when working in Solaris the Dropbox desktop app might get stuck syncing “rendergallery.db”, and oftentimes it simply blocks the queue so you can’t download / upload other files. I’ve worked around that by NOT using the /stage context, and instead using a lopnet at obj level to start my Solaris work. That way you have direct access to the lopnet parms and can change the render gallery source parm; just add `$USER` somewhere in the path and it should fix the sync issues. Hopefully none of you have had this sucker before.
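For reference, a rough sketch of the same workaround in Python; the lopnet path and the internal parameter name are assumptions (check the real parm name on your lopnet, e.g. by hovering the Render Gallery Source parameter):
```python
# Hypothetical: point each artist at their own render gallery database so
# Dropbox isn't stuck syncing (and locking) a single shared rendergallery.db.
import hou

lopnet = hou.node("/obj/solaris_work")        # assumed obj-level lopnet
parm = lopnet.parm("rendergallerysource")     # assumed internal name of "Render Gallery Source"
if parm is not None:
    parm.set("$HIP/galleries/rendergallery_$USER.db")
```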
This is slightly different than what you need, because I’m looping the clip too (was doing animated grass clips), but it should be good to get you going. In your case I would create a short clip with no loop, then instance and manage the offsets.
this is a 60-frame clip for a 360 spin with random offsets….
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20250304/10/25/usd_clip_example.hip
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20250304/10/25/image.png
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20250304/10/25/Instancer_anim.mp4
cock, thank you @chris_gardner
needs a bit of a cleanup/UI, but got Deadline submission working
For anyone interested, here’s how I’m getting start/end and output frames for Deadline:
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20255603/19/25/image.png
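The screenshot above shows the actual setup; as a rough stand-in, here’s a hedged Python sketch of pulling a frame range and output path off a ROP for a submission script (the node path and the output parameter name are assumptions, not what’s in the image):
```python
# Hypothetical: read the frame range and first output image from a USD Render ROP
# to fill in a Deadline job's frame list and output filename fields.
import hou

rop = hou.node("/stage/usdrender_rop1")               # assumed ROP path
start = int(rop.evalParm("f1"))
end = int(rop.evalParm("f2"))
output = rop.parm("outputimage").evalAtFrame(start)   # assumed output parm name on this ROP

frame_list = "{}-{}".format(start, end)               # e.g. "1001-1100" for the Deadline job info
```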
Yeah pretty much.
I’ll try some sneaky stuff tomorrow with symlinks
Ok, so far the easiest thing I found was the Assign Material node: adding a parameter override basically clones the material for you, sets the attr, and assigns the copy of the material to the geo. You can have it reference primvars from the geometry as well, so I can do `s@inputs:file = s@txpath;`. But it can’t do `**_geo` and loop over the prims, so I still have to do it in a for-each (which I haven’t done yet).
But… this in itself ‘kinda’ makes the point imho that you can definitely clearly define a primvar/attr AS a texture up front, and collect them ahead of time for the renderers
I take it back. It works with ** so yay.
Easy enough for me
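As a side note, a rough Python LOP sketch of the “primvar defines the texture up front” idea, done directly with the USD API rather than the Assign Material override; the prim and shader paths are made up, and unlike the override above this bakes the primvar value into the shader input instead of referencing it at bind time:
```python
# Hypothetical Python LOP: copy a texture path authored as a primvar in SOPs
# (primvars:txpath) onto a texture shader's file input.
from pxr import UsdShade, Sdf
import hou

node = hou.pwd()
stage = node.editableStage()

mesh = stage.GetPrimAtPath("/geo/asset/mesh01")                          # assumed mesh prim
shader = UsdShade.Shader.Get(stage, "/materials/asset_mat/mtlximage1")   # assumed texture shader

txpath = mesh.GetAttribute("primvars:txpath").Get()
shader.CreateInput("file", Sdf.ValueTypeNames.Asset).Set(txpath)
```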
…and a final thing… when importing to SOPs, will the @name attribute created always be the mesh component of the @path attribute? `@path = '/path/to/mesh'`, `@name = 'mesh'`
For anyone interested… I’ve just RFE’d for some documentation improvements for the RBD procedural workflow, but here’s my summary as well:
```
Incoming geometry
LOP geometry in SOPs will contain @name and @path attributes.

@path will be the USD prim path and should be maintained for the round trip to SOPs and back to LOPs.
    "/root/myGeometry/mesh01"

@name will be equivalent to the mesh component of the full @path (and aids in the construction of USD hierarchies in the absence of @path).
    "mesh01"

Outgoing geometry
Input geometry transformation will now be driven by the RBD simulation's point transforms. As such we need to establish a relationship between the RBD simulation points and the newly fractured geometry pieces, while creating no additional pieces in the USD hierarchy. While the fracture creates many pieces of geometry, if those pieces carry the same @path attribute they will have a singular mesh path in USD, maintaining the original input path hierarchy upon return to LOPs.

FOR THE STATIC FRACTURED GEOMETRY
We establish this relationship with a primitive @piecename attribute, which is the concatenated string of the incoming mesh name (i.e. @name) + the piece naming derived from your fracturing process.
    @piecename = "mesh01-piece0-1"  (while maintaining @path = "/root/myGeometry/mesh01")
    @piecename = "mesh01-piece0-2"  (while maintaining @path = "/root/myGeometry/mesh01")
    ...

FOR THE SIMULATION POINTS
We establish this relationship with a point @piecename attribute that differs from the fractured geometry in that it is the concatenated string of the full path (i.e. @path) + the piece naming derived from your fracturing process.
    @piecename = "/root/myGeometry/mesh01-piece0-1"
    @piecename = "/root/myGeometry/mesh01-piece0-2"
```
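To make the two conventions above concrete, here’s a rough sketch of how the @piecename strings could be assembled in a Python SOP; the fracture piece attribute name (`piece`) and the node context are assumptions, not part of the summary:
```python
# Hypothetical Python SOP: build @piecename for the static fractured geometry
# as described above (@name + "-" + fracture piece suffix).
import hou

node = hou.pwd()
geo = node.geometry()

if geo.findPrimAttrib("piecename") is None:
    geo.addAttrib(hou.attribType.Prim, "piecename", "")

for prim in geo.prims():
    name = prim.stringAttribValue("name")      # e.g. "mesh01"
    piece = prim.stringAttribValue("piece")    # assumed fracture piece attr, e.g. "piece0-1"
    prim.setAttribValue("piecename", "{}-{}".format(name, piece))

# For the simulation points the same idea applies, except @piecename is built
# from the full @path instead of @name:
#   piecename = "{}-{}".format(path, piece)    # e.g. "/root/myGeometry/mesh01-piece0-1"
```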
I feel if the docs kinda spelt this out a little more for the dummies in the back like me, they might make a little more sense than just a rote learning exercise in dropping nodes
I feel like the above falls under ‘assumed knowledge’ – but it’s only assumed by USD eggheads, not your average SOPsy artist looking to bring their workflows to LOPs
This is my take on: www.sidefx.com/docs/houdini/solaris/houdini_rbd_procedural.html
In case anyone’s interested, I actually collate a lot of my learning and discovery into these kinds of master files.
The green nodes have content – the yellow are placeholders
There are text-based step throughs and descriptions for doing different stuff
I share these with my colleagues and add more Zoic-pertinent info, but they are files I develop in my own time
And for me, it’s an invaluable resource when I can’t remember just how to constrain something.. do projections.. or set up a scatter/instance workflow
actually sorry need to edit the file – left some junk in there
ok there we go @chris_gardner and @davebr
Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20245309/11/24/image.png