Archived post by mattiasmalmer

how do i get the camera's uv space in a materialx shader? i want to do projective texturing in the shader.

This works, half-assedly, but only on Karma CPU. I load the hit position in world space, transform it to camera space (using space:camera in the transformpoint), then divide that by the hit distance to get a projection (using ray:hitPz in the mtlxdot). But the space:camera thing does not work in XPU.
I thought this would be the easiest thing ever. Anyone have any better tricks?
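In VEX terms, the math that node chain boils down to is roughly this (just a sketch in shading-context VEX, not the actual mtlx network; the variable names are mine):

vector Pcam  = ptransform("space:current", "space:camera", P);  // hit position in camera space
float  depth = abs(Pcam.z);                                     // roughly what ray:hitPz supplies
vector proj  = Pcam / depth;                                     // perspective divide
// proj.x / proj.y still need the camera focal length and aperture applied
// to land in a 0..1 uv range.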

Generating precooked uv data on the geo from the camera is not as good, since you tend to get projection errors over larger polygons and such.
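For completeness, the precooked version is basically just a point wrangle like this (a sketch; /obj/cam1 is a placeholder for whatever camera you project from):

// bake the camera projection into a uv attribute per point
vector ndc = toNDC("/obj/cam1", @P);
v@uv = set(ndc.x, ndc.y, 0);

The errors come from those baked uvs being interpolated linearly across each face, which is why big polygons drift away from the true projection.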

OK, after the longest time mucking about, I found how it is actually done:
You use the coordsys node in LOPs to define the coordinate system and reference the camera. Then use that coordsys in an mtlxposition, with myCoordinateSysName:ndc as the space, to get the position in the camera's projection space. Neat, because then you can use any camera as a projector and so forth.
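So if the coordsys ends up named, say, projectorCam (made-up name), the space parameter on the mtlxposition reads:

projectorCam:ndc

and the x/y of that position are effectively your projection uvs, ready to feed into the texture lookup.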

aw frekk. does not work in XPU.

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20232109/07/23/image.png