Archived post by paqwak

yup, probably very ugly ‘coding’ wise, but … it works :O) (still need to add the break/stop condition) … kinda hypnotic 🙂 (I should rename it “retarded postman” or something 🙂 )

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20232307/14/23/PseudoPostman3.hiplc
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20232307/14/23/GIF_7-14-2023_1-24-27_PM.gif

Archived post by swalsch

totally, seems that OpenImageIO won’t area-sample pngs

well, multiple calls of colormap it is

totally works with the mandrill

for posterity, this is how I solved it with multiple calls of colormap:
```c
// average several colormap() samples around the point, since a single
// call won't area-sample the png
vector uvw = set(v@P.x + 0.5, v@P.z + 0.5, 0.0);
string texPath = chs("tex_path");
vector colors[] = {};
// NOTE: the loop body was lost in the archive; a jittered sample pattern
// is assumed here, with 'samples' and 'sample_radius' as stand-ins
int samples = 9;
float radius = chf("sample_radius");
for (int i = 0; i < samples; i++) {
    vector2 jitter = rand(i + 1);                               // random in [0,1)
    vector offset = set(jitter.x - 0.5, jitter.y - 0.5, 0.0) * radius;
    append(colors, colormap(texPath, uvw + offset));
}
v@Cd = avg(colors);
```

Attachments in this post:
http://fx-td.com/houdiniandchill/wp-content/uploads/discord/20230707/12/23/image.png

Archived post by petersanitra

@squidbean yes and yes, and use .tx, works great for me. You can also use {"KARMA_XPU_OPTIX_SPARSE_TEXTURES" : "1"} to force it to sparse
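A minimal sketch of that setup, assuming OpenImageIO's maketx is on PATH and the textures live in a hypothetical tex/ folder:

```python
# convert textures to tiled, mipmapped .tx with maketx, and set the env
# var from the post above; the var has to be in the environment Houdini
# is launched from (e.g. a wrapper script or houdini.env)
import glob, os, subprocess

os.environ["KARMA_XPU_OPTIX_SPARSE_TEXTURES"] = "1"  # force Karma XPU sparse textures

for png in glob.glob("tex/*.png"):
    tx = os.path.splitext(png)[0] + ".tx"
    subprocess.run(["maketx", png, "-o", tx], check=True)  # tiled + mipmapped output
```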

re mipmapping, it picks the correct mipmap level from the tx. You can check with this map for example, if you don't have one around already… dgruwier.gumroad.com/l/LJehG

It's a fun map to use, for example for checking displacement, comparing engines etc, fun fun 🙂

Archived post by lwwwwwws

this is literally my specialist subject, strap in 😈 there is indeed a lot of eyeballing with various blurs, convolves, glows, chroma spread filters, but there are a few more accurate things you can do:
✺ extract out-of-focus discs from real images and use them as convolution kernels, either in-render or in nuke with ZDefocus/pgBokeh (there's a sketch of this one after the list)
✺ match distortion and the way it changes with focus by shooting grids, crucial for wide anamorphics
✺ both zeiss and cooke are slowly getting into the 21st century and providing distortion and vignetting data for their lenses, either online or via the data connection to the camera: cincraft.zeiss.com/us/home, cookeoptics.com/wp-content/uploads/2021/11/Cooke-i-Technology-Part-III-2021.pdf
✺ karma has a physical lens shader and octane has similar controls in its universal camera, which at least try to render with aberrations (they are sooooo not based on actual lens designs tho)

✺ there are a few tools for properly simulating flares from real lens designs, e.g. dl.acm.org/doi/pdf/10.1145/3329715.3338881, beatreichenbach.github.io/realflare/, www.maxon.net/en/red-giant/vfx/real-lens-flares
✺ there's another set of flare approaches that consider the whole image rather than single-point light flares, which are less physically based but look awesome, e.g. the video midway down www.fxguide.com/fxfeatured/the-batman-movie-keeping-it-real/ and to some extent the FFT-based bloom in game engines
✺ over the last decade or so there's been a bunch of research into lens simulation for vfx which feels like it's almost ready to get used, most obviously lentil.xyz/… I summarised a lot of the relevant papers in the pinned tweet thread on https://twitter.com/dearlensform and i'm perennially trying to make something based on those ideas – the hope is to simulate things that currently have to be guessed at, like local contrast effects, astigmatism, pupil occlusion, veiling glare, colour effects beyond just scaling the blue channel a bit…
✺ there's also a TON of research going on right now about simulating lenses for machine vision, because obviously the lens has a massive effect on how ML models “see” the world, and some of it might become applicable for vfx as well… i'm way behind with my reading, but e.g. ieeexplore.ieee.org/document/9919421, dl.acm.org/doi/10.1145/3197517.3201333, quan-zheng.github.io/publication/NeuroLens-paper.pdf
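A minimal sketch of the first idea in the list above (convolving a plate with a bokeh disc pulled from a real photo), numpy-only, with hypothetical input files standing in for the plate and the extracted kernel:

```python
import numpy as np

def fft_convolve(channel, kernel):
    """Circularly convolve one image channel with the kernel via FFT."""
    kh, kw = kernel.shape
    pad = np.zeros_like(channel)
    pad[:kh, :kw] = kernel / kernel.sum()           # normalise: preserve brightness
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # centre kernel at origin
    return np.real(np.fft.ifft2(np.fft.fft2(channel) * np.fft.fft2(pad)))

plate = np.load("plate.npy")            # H x W x 3 image, linear light (hypothetical)
kernel = np.load("bokeh_kernel.npy")    # small 2D crop of a defocused highlight (hypothetical)
defocused = np.stack([fft_convolve(plate[..., c], kernel) for c in range(3)], axis=-1)
np.save("defocused.npy", defocused)
```

This is the same trick the ZDefocus/pgBokeh route does inside nuke, just done by hand with a circular FFT convolution.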

oh also i’m extremely curious about whether weta have some absolutely sick in-house tools for all this, given how exacting both PhysLight and Manuka sound, the fact that they never talk about the obvious connection between the two (the lens), and the fact that they have a solid section of names in the credits labelled “Optics”…

Archived post by lwwwwwws

on mac it’s super easy using instruments.app, which comes with xcode… just pick “time profiler”, choose a running process at the top, hit record for a few seconds, then fiddle with the viewing options at the bottom ⏱️ actually there’s even a basic version you can get from activity monitor by hitting “sample process” from the three-dots menu. on linux you can do `perf record -p <pid>` and then pick one of many options for viewing the output: perf.wiki.kernel.org/index.php/Tutorial#Flame_Graph
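A rough sketch of that linux workflow as a script, assuming perf is installed and taking the target pid as an argument; the 10-second window and output name are assumptions:

```python
# sample a running process with perf, then dump the samples as text for
# the flame graph scripts in the tutorial linked above
import subprocess, sys

pid = sys.argv[1]
# -g records call graphs; the trailing 'sleep 10' bounds the recording window
subprocess.run(["perf", "record", "-g", "-p", pid, "--", "sleep", "10"], check=True)
with open("out.perf", "w") as out:
    subprocess.run(["perf", "script"], stdout=out, check=True)  # flamegraph-ready text
```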

for windows i have no clue though, i’d love to know… i’m a clueless baby about windows development, you might have to install full visual studio