Archived post by midgaard78

With regard to the color space talk, I happened to investigate exactly what H20 is doing just yesterday. The issue I was running into is that Karma’s output respects the Input File Rules set up in the OCIO editor (Edit > OCIO Settings…). So even if your working space is set to ACEScg, the rendered image will still be linear Rec.709 unless you take some step to prevent it.
– You can put the string “ACEScg” in the render’s filename, and Karma will obey the file rule.
– You can change the file rule for *.exr to ACEScg (see the sketch below). If you’re already pre-processing your inputs, this is what you should do. This is the “correct” ACES workflow.
– You can set the Output Colorspace parameter for each AOV that should be ACEScg.
– Or you can use the OCIO filter in the render settings to apply a post conversion. I recommend against that one, though, as I do not believe the conversion is lossless, and it risks a double conversion.
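
If you go the file-rule route, the underlying OCIO v2 config entry looks roughly like this. This is only a sketch of what Houdini’s OCIO editor is writing for you; the rule name here is made up and the exact config Houdini ships may differ:

```yaml
# Sketch of OCIO v2 file rules: treat *.exr inputs as ACEScg,
# let everything else fall through to the default rule.
file_rules:
  - !<Rule> {name: exr_as_acescg, extension: exr, pattern: "*", colorspace: ACEScg}
  - !<Rule> {name: Default, colorspace: default}
```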
The SDR 1.0 Video view transform from ACES 1.3 ought to be sufficiently visually similar to Output – Display – sRGB from ACES 1.2, but I haven’t verified that yet. The Un-tone-mapped view should be the same as Input – Texture – sRGB.
The ACES 1.2 config can be found here: github.com/colour-science/OpenColorIO-Configs/tree/master/aces_1.2

Archived post by technically_artist

Posting this for future reference and so others won’t walk my foolish path:

```c
// Vertex Wrangle
vector quad[] = {{0,0,0}, {0,1,0}, {1,1,0}, {1,0,0}};
vector tri[]  = {{0,0,0}, {1,0,0}, {0,1,0}};

// Index of this vertex within its primitive, and that primitive's vertex count
int vtx      = vertexprimindex(0, i@vtxnum);
int vtxcount = primvertexcount(0, i@primnum);

if (vtxcount == 3)
    v@__intrinsic_uv = tri[vtx];
else if (vtxcount == 4)
    v@__intrinsic_uv = quad[vtx];
else // Probably should do NGons at some point
    v@__intrinsic_uv = -1.0;
```

Archived post by mattiasmalmer

Did you guys see this neat SHARD Noise function that @ENDESGA posted on twitter? x.com/ENDESGA/status/1725827957061759092?s=20
I did a quick Houdini variant and also extended it to 4D so that we can phase it.
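
The actual shard function lives in the attached hip file; purely as a sketch of the 4D-phasing idea (not the SHARD algorithm itself), VEX noise accepts a vector4, so time can drive the fourth component. The attribute and parameter names below are just placeholders:

```c
// Point Wrangle sketch: phase a 4D noise through time.
// Illustrates the vector4 idea only; swap in the shard function from the hip file.
float freq  = chf("freq");
float phase = chf("phase");   // e.g. @Time * speed

vector4 p = set(@P.x, @P.y, @P.z, phase) * freq;
float n = noise(p);
f@shard = n;
```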

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20233711/21/23/SHARDNOISE.hiplc
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20233711/21/23/image.png

Archived post by ogvfx

I’ve done some of that custom surface tension work, calculating adhesion and cohesion on surfaces, when I was working for Pixar on Elemental.
I pretty much used these as references and incorporated it into FLIP.
https://vimeo.com/203706350 https://vimeo.com/299769390
One other thing I did was based on an example from odforce (which I’m trying to find again) that does an SDF calculation on a point cloud. I used that essentially as an air-field mask, and I also used velocity and vorticity to mask areas for reseeding, keeping the points at a uniform spacing, and turned off regular reseeding on the FLIP solver.
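For what it’s worth, here is a minimal sketch of one common way to approximate a signed distance from a FLIP point cloud in a wrangle. This is not necessarily the odforce setup being referred to, and the parameter names and thresholds are made up:

```c
// Point Wrangle sketch: approximate signed distance to a particle fluid.
// Run over the points to classify, with the FLIP particles wired into input 1.
float searchrad = chf("searchrad");   // e.g. ~2x the particle separation
float pscale    = chf("pscale");      // particle radius

int handle = pcopen(1, "P", @P, searchrad, 16);
if (pcnumfound(handle) > 0)
{
    vector avgP = pcfilter(handle, "P");   // filtered centre of nearby particles
    f@sdf = length(@P - avgP) - pscale;    // negative inside the fluid, positive outside
}
else
    f@sdf = searchrad;                     // nothing nearby: treat as air
pcclose(handle);
```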
This was the end result on the first shot here: https://vimeo.com/854754642
It ended up being two separate sims when his head explodes and all the splashing on the shelves. I even used @jake rice’s optimal transport example and built a guided splash setup that directed and shaped all the splashes from the sources, so the result at final resolution was predictable. The reseeding for the FLIP was sometimes a pain to turn around, but I ended up making sure it only resampled if a point was deep within the SDF calculation and the FLIP wasn’t calm. I remember Alejandro had a very early example I kept around; I built all this stuff inside a SOP Solver and just calculated the point-cloud SDF in a wrangle before it fed into the SOP Solver.
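
As a rough sketch of that reseeding gate (inside the SOP Solver), with the attribute names and thresholds being assumptions rather than the actual setup:

```c
// Point Wrangle sketch: only allow reseeding where the fluid is deep and not calm.
// f@sdf comes from the point-cloud SDF wrangle above; thresholds are illustrative.
float min_depth = chf("min_depth");   // how far inside the surface counts as "deep"
float min_speed = chf("min_speed");   // below this the fluid is considered calm

i@reseed = (f@sdf < -min_depth && length(v@v) > min_speed) ? 1 : 0;
```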