
ojrgb


@bettercallsalva @AnatoliKopadze It's TouchDesigner and everything OP wrote is a lie. I'm the guy who made the vid

@AnatoliKopadze If that's three.js plus MediaPipe under the hood, the work is library glue. Still impressive, just credit goes more to the libs than the model itself.

this guy built this animation in 15 minutes with Claude.
a 3D skull that reacts to his face.
every head move changes the visual in real time.
normally this takes years of 3D design school.
today anyone can do it.
we're living in a moment where the only real barrier is your idea.
not your degree. not your budget. not your team.
the people winning right now share 3 things:
> they move fast.
> they spot trends early.
> they know how to use AI.
that last one is the one most people skip.
all you need is an idea and knowing how to talk to Claude.
the article below shows you exactly how.
Anatoli Kopadze@AnatoliKopadze

@jacodesby @browomo i swear I've got no clue what they even tryna sell

This guy built a refraction effect in TouchDesigner in his bedroom and in 1 weekend reproduced what Disney spends $100 million and a team of 50 people on.
His project is called "Refraction Ball with Particles" and runs on a regular MacBook with 1 USB camera.
No Hollywood pipeline, no render farm, and no VFX agency behind him.
He launched MediaPipe Hands on a USB camera in TouchDesigner.
21 points on each hand are tracked at 60 frames per second. One scene with real liquid glass physics renders right in the application window, without a separate build and without render time.
And his project is built on this setup:
FPS: 65.0
Hand Tracker: MediaPipe Hands, 21 landmarks per hand
Refraction Shader: real-time glass material with chromatic aberration
Particle System: 8,000 particles attached to skeleton
Glass Material: dynamic IOR, light reflection in real-time
Camera: USB camera, 1080p input
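The "8,000 particles attached to skeleton" line means each particle is parented to one of the 42 tracked landmarks (21 per hand). A minimal pure-Python sketch of that attachment step, outside TouchDesigner, with made-up landmark positions (in a real patch MediaPipe supplies the positions every frame):

```python
import random

NUM_LANDMARKS = 21 * 2   # 21 points per hand, two hands
NUM_PARTICLES = 8000     # matches the stack description

def attach_particles(num_particles, num_landmarks, jitter=0.02, seed=0):
    """Assign each particle a parent landmark and a small fixed offset.

    Every frame, a particle's position is its parent landmark's current
    position plus this offset, so the cloud follows the hand skeleton.
    """
    rng = random.Random(seed)
    return [
        {
            "landmark": i % num_landmarks,  # spread particles evenly
            "offset": (
                rng.uniform(-jitter, jitter),
                rng.uniform(-jitter, jitter),
                rng.uniform(-jitter, jitter),
            ),
        }
        for i in range(num_particles)
    ]

def update_positions(particles, landmark_positions):
    """Recompute world positions from the current landmark positions."""
    return [
        tuple(landmark_positions[p["landmark"]][k] + p["offset"][k] for k in range(3))
        for p in particles
    ]

particles = attach_particles(NUM_PARTICLES, NUM_LANDMARKS)
# Fake one frame of landmark data; MediaPipe would supply this live.
landmarks = [(i * 0.01, 0.5, 0.0) for i in range(NUM_LANDMARKS)]
positions = update_positions(particles, landmarks)
```

The per-particle offset is what keeps the cloud looking like a volume instead of 42 clumps; the GPU version in TouchDesigner does the same lookup in a shader.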
And this stack does exactly what it needs to: the camera catches every finger, and the preview renders without delay.
The guy iterates on one effect over and over.
On his laptop are the files Refraction Ball.16.toe, Bendy Fabric.52.toe, and dozens of others; each iteration is an attempt to get closer to the quality VFX studios used to be paid for.
Gradually the stack reaches a state where his effects look like frames from expensive films.
And this is where it gets interesting: where is the line right now between a big studio and a guy with TouchDesigner?
Every new effect is built from scratch in one room, and most of the VFX pipeline now fits on one MacBook on a desk.
Here is what is rendering on his screen right now:
"Refraction Ball with Particles, a glass ball with 8,000 particles attached to the skeleton of the hands"
"Prism Cube, an orange-pink cube with chromatic aberration that reacts to the tilt of the palms"
"Glass Face, a liquid glass mask with real light reflection and background refraction"
"Bendy Fabric, rainbow textile with rainbow refraction that bends between the hands"
The entire stack is open: 1 USB camera, a regular MacBook, a TouchDesigner project that downloads in just 30 seconds, and a cat on the shelf behind him lazily watching the whole process.
From what I have observed, this is one of the most polished one-man VFX stacks I have seen recently.
Would you be able to tell an effect like this shot with a USB camera in a room apart from a real VFX frame in a movie, if you did not know how it was made?
Anatoli Kopadze@AnatoliKopadze

@stephlon_b @browomo I made this IN MY OFFICE FFS, I'm the guy behind the original vid. Why are all these X accounts trying to paint me as some broke college student? For christ's sake, I'm 34

@CaptainCodeOnX @browomo You’re right on all counts. Signed, THE GUY WHO MADE THE ORIGINAL VIDEO

Genuinely one of the most asinine takes I've seen this year, and what's wild is how many ppl have been posting about this dude praising it as "studio level" without actually looking at what's on screen. lemme go through it:
1. MediaPipe Hands is a free Google library. 21 landmarks per hand in real-time is the default, not an achievement
2. chromatic aberration is like 3 lines of GLSL. refraction is a built-in shader function. both are in every beginner TD tutorial on youtube
3. 8000 GPU particles is nothing, modern GPUs push millions. the number is just there to sound impressive
a real time webcam shader and a finished VFX shot aren't the same thing. one is a toy running in a preview window, the other has to composite into live action, match plate lighting frame by frame, and hold up at 4K on a cinema screen. this is a bedroom demo not a studio replacement.
ppl glazing a beginner tutorial as the death of VFX is LinkedInfication of X type shi fr
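On the "refraction is a built-in shader function" point: GLSL's `refract(I, N, eta)` really is one formula, and chromatic aberration falls out of calling it with a slightly different `eta` per color channel. A sketch of both in plain Python (vectors as 3-tuples; `I` and `N` are assumed normalized; the glass IORs here are illustrative):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def refract(I, N, eta):
    """Python port of GLSL refract(): I = incident direction, N = surface
    normal, eta = ratio of indices of refraction. Vectors must be normalized."""
    d = dot(N, I)
    k = 1.0 - eta * eta * (1.0 - d * d)
    if k < 0.0:
        return (0.0, 0.0, 0.0)  # total internal reflection
    scale = eta * d + math.sqrt(k)
    return tuple(eta * i - scale * n for i, n in zip(I, N))

# Chromatic aberration: refract each color channel with its own eta,
# the way a dispersive glass shader does per pixel.
I = (0.6, -0.8, 0.0)        # incident ray, normalized
N = (0.0, 1.0, 0.0)         # surface normal
red   = refract(I, N, 1.0 / 1.51)
green = refract(I, N, 1.0 / 1.52)
blue  = refract(I, N, 1.0 / 1.53)
# Three slightly different directions -> the color fringing on screen.
```

In a fragment shader the three calls become three texture lookups at the three refracted offsets, which is the "3 lines of GLSL" being referred to.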

@jacodesby from the tweet it seems like he knows more details about the project than you do haha


Vision Pro: $3,500, 600 g on your face, a 2-hour battery. His version: a webcam, bare hands, no headset, infinite battery.
And the wildest part: his version actually feels like the future, while Apple's just feels like a brick.
Watch as he spreads his fingers, images burst into 3D, photos and screenshots from his archive fanning between his palms like a cosmic accordion.
Pinch closer, the stack collapses into a single point, pull wider, the images stretch all the way to the corners of his bedroom.
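The pinch/spread gesture described above is usually just the distance between two tracked fingertips remapped onto an effect parameter. A toy sketch, assuming normalized thumb-tip and index-tip coordinates like the ones MediaPipe outputs (the `min_d`/`max_d` range is made up):

```python
import math

def pinch_amount(thumb_tip, index_tip, min_d=0.02, max_d=0.30):
    """Map thumb-index distance to a 0..1 spread parameter.

    0.0 = pinched shut (the stack collapses to a point),
    1.0 = fully spread (the images stretch to the corners).
    Landmark coords are normalized to [0, 1] like MediaPipe's output.
    """
    d = math.dist(thumb_tip, index_tip)
    t = (d - min_d) / (max_d - min_d)
    return max(0.0, min(1.0, t))  # clamp to [0, 1]

# Pinched: fingertips almost touching -> parameter near 0
closed = pinch_amount((0.50, 0.50), (0.51, 0.50))
# Spread: fingertips far apart -> parameter clamps to 1
open_ = pinch_amount((0.20, 0.50), (0.80, 0.50))
```

That single scalar then drives whatever the patch animates: the radius of the image fan, a particle force, a shader uniform.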
The whole thing runs on TouchDesigner, hand tracking pulled straight from a regular webcam, zero hardware strapped to his body, zero accounts, zero subscriptions, zero buttons to press.
Apple wants you to wear the future; he wants you to grab it. One of these belongs in a museum.
The other belongs in your living room. That's some Iron Man vibes.
Noisy@noisyb0y1

@defileo He is my brother-in-law, and he's very grateful for everyone's positive energy and feedback.
And he is not a 20-year-old, American, or in college

If you made it here from one of those posts calling me a 20 yr old American student who sold his software for $230k, welcome. I’ll be drip feeding some vids so you can see it’s actually me. Not vibe coded, just coded. Peace! ✌️ #touchdesigner #vfx #visuals

@aionfork @defileo I made the original. First of all, the post is cap. Second, it uses a custom GLSL mat I built based on research by @MaximeHeckel . You’re correct re: mediapipe.

@defileo The light is actually refracting through the silk and imitating spectral refraction… probably the most impressive part.
It’s using mediapipe. I can’t tell if the silk is real. Or anything anymore. Cam could be inferring the surface … which would definitely require heavy GPUs

Apple spent 7 years and $3,500 per unit building Vision Pro; this guy did something better with a webcam in his bedroom.
And no, this isn't a render, it's running live on his laptop right now with a webcam pointed at his hands.
The webcam tracks 21 points on each of his hands in real time, every fingertip, every knuckle, every joint, mapped to a 3D skeleton at 60 frames per second.
A piece of digital silk is rigged to that skeleton, so when he opens his palm the fabric drapes across his fingers, when he closes his hand it crumples in his fist, when he tilts his wrist the cloth slides off and folds with real physics, light reflecting off it like actual material.
He can grab it, stretch it, throw it, catch it, all with his bare hands and zero hardware on his body.
No headset, no gloves, no $3,500 face computer, no 600 gram brick on his skull, no 2 hour battery, no Apple ID, no App Store.
Just a webcam, TouchDesigner and a kid who saw the Vision Pro keynote and thought, I can do that for free.
Apple has 3,000 engineers, $200 billion in cash and 7 years of development time; they built a face computer most people will never own.
This guy has a laptop, a webcam and a weekend; he built a future most people can actually use.
The mixed reality industry just got embarrassed by a guy in a white t-shirt.
Defileo🔮@defileo

@shirtweiner @defileo It was me. It wasn't vibe coded; it was built in TouchDesigner. You're right on all other counts

@stillwritescode @defileo Correct. I made the original. It uses a refractive material as opposed to most liquid glass effects that use displacement maps. Pure cap. 🧢

@defileo This is really a recreation of Liquid Glass - the UI design system created with Vision Pro in mind.
What this guy built is cool but it really has nothing to do with VR or VR headsets.

@juliussland @defileo No they don’t, I made the original video and it has very little utility outside of the effect I recorded.

@travisbuhler @defileo Literally nothing. I'm the guy in the vid. Zero correlation.
