[hou-cops] Archived post by mattiasmalmer

So anyway, I got that normal map to depth map conversion using the DFT working in Nuke too, if anyone is interested:

Here is the Nuke implementation. (It is a group gizmo, so you do not have to worry about using it; it just becomes actual nodes in your scene.)

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20243610/23/24/image.png
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20243610/23/24/Normal2Depth.gizmo

Archived post by fabriciochamon

Here's the Pong game hip! (Houdini 20.5)
I did some cleaning/commenting if anyone is curious about the inner workings. But basically:
– a CHOP “keyboard” listens for a couple of key presses (in momentary mode)
– a dopnet takes care of moving the blocks at every CHOP signal; since it is a simulation, the movement is cumulative
– RBD-wise, I have:
  – ball (active=1)
  – player 1 and 2 blocks (active=0; they still collide, but their movement is driven by the dopnet)
  – top/bottom walls (active=0)
  – power-ups (made into RBDs for easy collision checking against the player blocks)
– all elements have collisiongroup / collisionignore Bullet attrs (see the sketch below)
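A minimal Python SOP sketch of how those Bullet attributes could be tagged per element; the group names and the “name” values are hypothetical (in the hip this is just as easily done in a wrangle):

```python
# Minimal sketch (hypothetical names): tag each RBD element with the Bullet
# attributes described above, assuming one packed prim per element and a
# point "name" attribute like "ball", "player1", "wall_top" or "powerup3".
node = hou.pwd()
geo = node.geometry()

geo.addAttrib(hou.attribType.Point, "active", 0)
geo.addAttrib(hou.attribType.Point, "collisiongroup", "")
geo.addAttrib(hou.attribType.Point, "collisionignore", "")

for pt in geo.points():
    name = pt.attribValue("name")
    if name == "ball":
        pt.setAttribValue("active", 1)                      # only the ball is freely simulated
        pt.setAttribValue("collisiongroup", "ball")
    elif name.startswith("player"):
        pt.setAttribValue("collisiongroup", "players")      # moved by the dopnet, active=0
    elif name.startswith("wall"):
        pt.setAttribValue("collisiongroup", "walls")
    elif name.startswith("powerup"):
        pt.setAttribValue("collisiongroup", "powerups")
        pt.setAttribValue("collisionignore", "walls ball")  # only react to the player blocks
```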
Power ups: (this was the most fun part of this little project for me!)
– two types available: “timed” and “one-shot”. “Timed” has a duration; “one-shot” shoots a projectile that hits the enemy block
– the behaviors are mostly defined in POP wrangles or SOP solvers, and are easy to manage (except for the one that rotates the enemy block, which has to be injected into the block-moving logic 😓)
– spawn rate/seed etc., along with the other game controls, are available in a ctrl null
– available power-ups:
  – L (Large) = grows the player block to 2x its size for a couple of seconds
  – F (Freeze) = shoots a snowball that freezes the enemy block for a couple of seconds on hit
  – S (Speed) = makes the block move faster
  – R (Rotate) = shoots a projectile that makes the enemy block rotate for a couple of seconds
  – M (Magnet) = attracts the ball towards your block, then the action key shoots it back with lots of energy (see the sketch below)
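For flavour, a rough plain-Python sketch of the per-step update the Magnet power-up implies (hypothetical numbers; in the hip this kind of logic lives in the POP wrangles / SOP solvers):

```python
# Rough sketch (hypothetical values): while the Magnet power-up is active,
# pull the ball's velocity towards the owning player's block every step.
def magnet_step(ball_pos, ball_vel, block_pos, strength=4.0, dt=1.0 / 24.0):
    to_block = [b - a for a, b in zip(ball_pos, block_pos)]
    dist = max(sum(c * c for c in to_block) ** 0.5, 1e-6)
    pull = [c / dist * strength * dt for c in to_block]   # acceleration towards the block
    return [v + p for v, p in zip(ball_vel, pull)]

# one step: ball at the origin drifting right, player block above it
new_vel = magnet_step([0.0, 0.0], [1.0, 0.0], [0.0, 2.0])
```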
Skins:
– I have a post-sim tree that takes care of applying skins to each element
– the ball has trail flames
– the blocks have different animations/embellishments according to the power-ups acquired
– projectile visuals, etc.
– easy to extend and/or change!
The theme backgrounds were a nice exercise.. they are all made in Copernicus, check the “backrgound” node.

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20244710/17/24/Pong.hiplc

[hou-cops] Archived post by mattiasmalmer

Aight bois! I talked to TinyTexel on Shadertoy and he showed me his really cool implementation of Normal2Depth using Fourier-space filter inversion: www.shadertoy.com/view/XcjXDc
I spent a bit of time wrapping my smooth brain around the code and managed to cobble together the same thing in COPs using OpenCL:
It is very, very fast.
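For anyone who wants to poke at the idea outside of Houdini, here is a small NumPy sketch of the general frequency-domain trick (not the exact OpenCL kernel in the hip, just the textbook Frankot-Chellappa-style inversion it is built on): turn the normals into slopes, then divide by the gradient operator in Fourier space instead of integrating spatially.

```python
# Standalone NumPy sketch of depth-from-normals via Fourier-space inversion
# (illustrates the general technique, not the exact kernel in the hip file).
import numpy as np

def depth_from_normals(n):
    """n: float array (H, W, 3), tangent-space normals in [-1, 1].
    Returns a relative depth map (zero mean, arbitrary scale)."""
    nz = np.clip(n[..., 2], 1e-4, None)       # avoid divide-by-zero at grazing normals
    p = -n[..., 0] / nz                        # slope dz/dx implied by the normal
    q = -n[..., 1] / nz                        # slope dz/dy

    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2.0 * np.pi       # angular frequencies per axis
    wy = np.fft.fftfreq(h) * 2.0 * np.pi
    WX, WY = np.meshgrid(wx, wy)

    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                          # the DC term is undefined, pin it
    Z = (-1j * WX * P - 1j * WY * Q) / denom   # invert the gradient operator
    Z[0, 0] = 0.0                              # zero-mean depth

    return np.real(np.fft.ifft2(Z))
```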

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20240410/17/24/image.png
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20240410/17/24/fftNormal2Depth.hiplc

[hou-cops] Archived post by mattiasmalmer

Here is an alternative for dealing with displacement from normals or “slant lit” images:

I can't remember where I got the OpenCL code from. It was made a while back. Maybe I wrote it? Maybe Entagma? Or I might have stolen it from someone here?

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20241610/08/24/disp_from_normal.zip
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20241610/08/24/houdini_LCDRX96IQ1.mp4

Archived post by pixel_bender

Hey guys, for anyone who's ever needed to go on set, wanted to, or just wanted to do their own shoot: I recently put this together based on the content we used to train our lighting students with at SVA. It's not a 'public' document (mainly because I'm using images stolen without permission), but I wanted to share it here for VFX boffins.
Also open to feedback or further input.

It doesn't cover 'being a VFX supervisor', or on-set etiquette, at this point… it's more of a technical guide.

I wouldn't click the embedded hyperlinks, they'll only take you to a Notion page you can't access.

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20241810/02/24/VFX_Shooting_Guide_v0.1.pdf

Archived post by jim.meston

Nah, f that. It's as easy as making a subnet, promoting it to an HDA, and changing the view state. Then the user pins the scene view at the container level, and you basically have an empty utility HDA that people can use if they choose to.

Call it rigpose_multi or something catchy and get 5 cool points.

+ if memory serves me right, the rigpose vis controls are multiparms, so the user could theoretically make visibility sets.

Here you go…

By default the container has no parms, but if you drag the relevant multiparm root into the container's edit-parms window, it propagates all the parms for you. You can do this for the transforms as well. It takes a few seconds. Or just pin the view at the top level and edit from within the container. Should work for custom HDAs with rigpose states also.

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20245810/01/24/rigpose_multiviewer.gif
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20245810/01/24/rigpose_multi.zip

Archived post by pixel_bender

…and a final thing… on import to SOPs, the @name attribute created will always be the mesh component of the path attribute? e.g. @path = ‘/path/to/mesh’, @name = ‘mesh’

For anyone interested.. I've just RFE'd some documentation improvements for the RBD procedural workflow, but here is my summary as well:
Incoming LOP geometry in SOPs will contain @name and @path attributes.
@path will be the USD prim path and should be maintained for the round trip to SOPs and back to LOPs:
  @path = “/root/myGeometry/mesh01”
@name will be equivalent to the mesh component of the full @path (and aids in the construction of USD hierarchies in the absence of @path):
  @name = “mesh01”

Outgoing geometry
Input geometry transformation will now be driven by the RBD simulation's point transforms. As such, we need to establish a relationship between the RBD simulation points and the newly fractured geometry pieces, while creating no additional pieces in the USD hierarchy. While the fracture creates many pieces of geometry, if those pieces carry the same @path attribute they will have a single mesh path in USD, maintaining the original input path hierarchy upon return to LOPs.

FOR THE STATIC FRACTURED GEOMETRY
We establish this relationship with a primitive @piecename attribute, which is the concatenation of the incoming mesh name (i.e. @name) and the piece naming derived from your fracturing process.
  @piecename = “mesh01-piece0-1” (while maintaining @path = “/root/myGeometry/mesh01”)
  @piecename = “mesh01-piece0-2” (while maintaining @path = “/root/myGeometry/mesh01”)
  …

FOR THE SIMULATION POINTS
We establish this relationship with a point @piecename attribute that differs from the fractured geometry in that it is the concatenation of the full path (i.e. @path) and the piece naming derived from your fracturing process.
  @piecename = “/root/myGeometry/mesh01-piece0-1”
  @piecename = “/root/myGeometry/mesh01-piece0-2”
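And a tiny plain-Python illustration of that naming convention (the paths are just the example values from above; the string assembly is the whole point):

```python
# Hypothetical example of the @piecename convention described above.
path = "/root/myGeometry/mesh01"      # @path: the USD prim path
name = path.rsplit("/", 1)[-1]        # @name: the mesh component -> "mesh01"

piece = "piece0-1"                    # suffix produced by the fracturing process

# static fractured geometry: primitive @piecename = @name + piece suffix
prim_piecename = f"{name}-{piece}"    # "mesh01-piece0-1"

# simulation points: point @piecename = full @path + piece suffix
point_piecename = f"{path}-{piece}"   # "/root/myGeometry/mesh01-piece0-1"
```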

I feel that if the docs kinda spelt this out a little more for the dummies in the back like me, they might make a little more sense than just a rote-learning exercise in dropping nodes.

I feel like the above falls under ‘assumed knowledge’ – but it’s only assumed by USD eggheads, not your average SOPsy artist looking to bring their workflows to LOPs

This is my take on: www.sidefx.com/docs/houdini/solaris/houdini_rbd_procedural.html