ojrgb

52 posts

@jacodesby

Coding

Brooklyn, NY · Joined February 2026
5 Following · 16 Followers
Thiago Salvador
Thiago Salvador@bettercallsalva·
@AnatoliKopadze If that's three.js plus MediaPipe under the hood, the work is library glue. Still impressive, just credit goes more to the libs than the model itself.
English
1
0
2
206
Anatoli Kopadze
Anatoli Kopadze@AnatoliKopadze·
this guy built this animation in 15 minutes with Claude. a 3D skull that reacts to his face. every head move changes the visual in real time. normally this takes years of 3D design school. today anyone can do it.

we're living in a moment where the only real barrier is your idea. not your degree. not your budget. not your team.

the people winning right now share 3 things:
> they move fast.
> they spot trends early.
> they know how to use AI.

that last one is the one most people skip. all you need is an idea and knowing how to talk to Claude. the article below shows you exactly how.
Anatoli Kopadze@AnatoliKopadze

x.com/i/article/2051…

English
34
33
515
70.3K
ojrgb
ojrgb@jacodesby·
Juggling 3 agents is now a core engineering skill btw
English
0
0
0
17
ojrgb
ojrgb@jacodesby·
Honestly so much of being a good engineer these days is having a good enough spidey sense to know when the plans Claude comes up with are finished or not
English
0
0
0
21
Blaze
Blaze@browomo·
This guy built a refraction effect in TouchDesigner in his bedroom and in 1 weekend reproduced what Disney spends $100 million and a team of 50 people on. His project is called "Refraction Ball with Particles" and runs on a regular MacBook with 1 USB camera. No Hollywood pipeline, no render farm, and no VFX agency behind him.

He launched MediaPipe Hands on a USB camera in TouchDesigner. 21 points on each hand are tracked at 60 frames per second. One scene with real liquid glass physics renders right in the application window, without a separate build and without render time. And his project is built on this setup:

FPS: 65.0
Hand Tracker: MediaPipe Hands, 21 landmarks per hand
Refraction Shader: real-time glass material with chromatic aberration
Particle System: 8,000 particles attached to skeleton
Glass Material: dynamic IOR, light reflection in real-time
Camera: USB camera, 1080p input

And this stack knows exactly what it is doing. It knows the camera catches every finger, that the preview renders without delay. The guy iterates on 1 effect over and over. On his laptop are the files: Refraction Ball.16.toe, Bendy Fabric.52.toe, and dozens of others; each iteration is an attempt to get closer to the quality that VFX studios used to be paid for. Gradually the stack reaches a state where his effects look like frames from expensive films.

And then it gets interesting: where is the line right now between a big studio and a guy with TouchDesigner? Every new effect is built from scratch in 1 room, and most of the VFX pipeline now fits in 1 MacBook on a desk.

Here is what is rendering on his screen right now:
"Refraction Ball with Particles, a glass ball with 8,000 particles attached to the skeleton of the hands"
"Prism Cube, an orange-pink cube with chromatic aberration that reacts to the tilt of the palms"
"Glass Face, a liquid glass mask with real light reflection and background refraction"
"Bendy Fabric, rainbow textile with rainbow refraction that bends between the hands"

The entire stack is open: 1 USB camera, a regular MacBook, a TouchDesigner project that downloads in just 30 seconds, and a cat on the shelf behind him lazily watching the whole process. From what I have observed, this is 1 of the most polished one-man VFX stacks I have seen recently.

Would you be able to tell an effect like this shot with a USB camera in a room apart from a real VFX frame in a movie, if you did not know how it was made?
Anatoli Kopadze@AnatoliKopadze

x.com/i/article/2050…

English
9
5
43
4.9K
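For anyone wondering what "21 landmarks per hand" and "8,000 particles attached to skeleton" mean mechanically, here is a minimal sketch in plain Python. It assumes MediaPipe Hands' standard landmark indexing (0 = wrist, 4 = thumb tip, 8 = index tip, and so on) and uses fake landmark data; a real setup would feed live coordinates from the tracker into a function like this every frame.

```python
# Sketch: scatter particles along the bones of a 21-landmark hand skeleton.
# Landmark indexing follows MediaPipe Hands: 0 = wrist, 4 = thumb tip,
# 8 = index tip, 12 = middle tip, 16 = ring tip, 20 = pinky tip.
import random

# Bone list: (parent, child) landmark index pairs for the 5 fingers.
BONES = [
    (0, 1), (1, 2), (2, 3), (3, 4),         # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),         # index
    (0, 9), (9, 10), (10, 11), (11, 12),    # middle
    (0, 13), (13, 14), (14, 15), (15, 16),  # ring
    (0, 17), (17, 18), (18, 19), (19, 20),  # pinky
]

def particles_on_skeleton(landmarks, n_particles=8000, jitter=0.01, seed=0):
    """landmarks: list of 21 (x, y) tuples in [0, 1] screen space.
    Returns n_particles (x, y) points spread along the bones."""
    rng = random.Random(seed)
    points = []
    for _ in range(n_particles):
        a, b = BONES[rng.randrange(len(BONES))]   # pick a random bone
        t = rng.random()                          # position along the bone
        x = landmarks[a][0] + t * (landmarks[b][0] - landmarks[a][0])
        y = landmarks[a][1] + t * (landmarks[b][1] - landmarks[a][1])
        # small jitter so particles form a cloud around the bone, not a line
        points.append((x + rng.uniform(-jitter, jitter),
                       y + rng.uniform(-jitter, jitter)))
    return points

# Fake a hand: 21 made-up landmark positions in normalized coordinates.
fake_hand = [(i / 40.0, (i % 4) / 20.0) for i in range(21)]
cloud = particles_on_skeleton(fake_hand)
print(len(cloud))  # 8000
```

In a GPU pipeline (TouchDesigner, a compute shader) the same sampling runs per particle per frame, which is why 8,000 particles is a light workload.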
ojrgb
ojrgb@jacodesby·
@stephlon_b @browomo I made this IN MY OFFICE FFS, I'm the guy behind the original vid. Why are all these X accounts trying to paint me as some broke college student? For christ's sake, I'm 34
English
0
0
1
14
Stephen
Stephen@stephlon_b·
@browomo The line keeps moving now bedroom rigs are basically the new studios.
English
1
0
0
14
NMhao
NMhao@asukahaox·
@browomo This guy built a refraction effect in TouchDesigner in his bedroom; it would take a Disney team a week to reproduce with $100 million and 50 staff. Amazing!
Japanese
1
0
0
129
Captain Code ᯅ
Captain Code ᯅ@CaptainCodeOnX·
Genuinely one of the most asinine takes I've seen this year, and what's wild is how many ppl have been posting about this dude praising it as "studio level" without actually looking at what's on screen. lemme go through it:
1. MediaPipe Hands is a free Google library. 21 landmarks per hand in real-time is the default, not an achievement
2. chromatic aberration is like 3 lines of GLSL. refraction is a built-in shader function. both are in every beginner TD tutorial on youtube
3. 8000 GPU particles is nothing, modern GPUs push millions. the number is just there to sound impressive

a real time webcam shader and a finished VFX shot aren't the same thing. one is a toy running in a preview window, the other has to composite into live action, match plate lighting frame by frame, and hold up at 4K on a cinema screen. this is a bedroom demo not a studio replacement. ppl glazing a beginner tutorial as the death of VFX is LinkedInfication of X type shi fr
English
1
0
1
61
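The "3 lines of GLSL" claim about chromatic aberration checks out in spirit: the effect is just sampling the red and blue channels at slightly shifted positions. A CPU translation of the idea in plain Python (a real GLSL shader does the same per-pixel work on the GPU):

```python
# Chromatic aberration: sample R and B at horizontally shifted positions,
# leave G in place. Image is a list of rows of (r, g, b) tuples.
def chromatic_aberration(image, shift=1):
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            r = image[y][min(x + shift, w - 1)][0]  # red sampled to the right
            g = image[y][x][1]                      # green stays put
            b = image[y][max(x - shift, 0)][2]      # blue sampled to the left
            row.append((r, g, b))
        out.append(row)
    return out

# A 1x4 test strip: one white pixel on black smears into colored fringes.
strip = [[(0, 0, 0), (255, 255, 255), (0, 0, 0), (0, 0, 0)]]
print(chromatic_aberration(strip))
# [[(255, 0, 0), (0, 255, 0), (0, 0, 255), (0, 0, 0)]]
```

The single white pixel splits into separate red, green, and blue fringes, which is exactly the lens-fringing look the shader fakes.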
ojrgb
ojrgb@jacodesby·
@browomo Every iteration of this story about me gets a little more unhinged. No, I didn’t build it from my bedroom, I built it IN MY OFFICE.
English
0
0
1
20
Maxime
Maxime@MaximeHeckel·
@jacodesby from the tweet it seems like he knows more details about the project than you do haha
English
1
0
2
1.5K
ojrgb
ojrgb@jacodesby·
@OzAIHub @defileo On my page, but there are significant disadvantages to my approach over the vision pro. 0 correlation!
English
0
0
0
9
Defileo🔮
Defileo🔮@defileo·
Vision Pro: $3,500, 600g on your face, 2 hour battery. His version: webcam, bare hands, no headset, infinite battery. And the wildest part: his version actually feels like the future, Apple's just feels like a brick.

Watch as he spreads his fingers, images burst into 3D, photos and screenshots from his archive fanning between his palms like a cosmic accordion. Pinch closer and the stack collapses into a single point; pull wider and the images stretch all the way to the corners of his bedroom.

The whole thing runs on TouchDesigner, hand tracking pulled straight from a regular webcam, zero hardware strapped to his body, zero accounts, zero subscriptions, zero buttons to press. Apple wants you to wear the future, he wants you to grab it. One of these belongs in a museum. The other one belongs in your living room. That's some Iron Man vibes.
Noisy@noisyb0y1

x.com/i/article/2049…

English
12
8
90
2.7K
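The pinch-to-collapse gesture described above reduces to one measurement: the distance between the thumb tip and index fingertip. A minimal sketch, assuming MediaPipe Hands' landmark indices (4 = thumb tip, 8 = index tip) and normalized screen coordinates; the `closed`/`open_` thresholds are made-up tuning values, not anything from the original project:

```python
# Pinch detection from hand landmarks: thumb-tip to index-tip distance,
# remapped to a 0..1 "spread" factor for the image stack.
import math

THUMB_TIP, INDEX_TIP = 4, 8  # MediaPipe Hands landmark indices

def pinch_amount(landmarks, closed=0.03, open_=0.25):
    """Map thumb-index distance to 0..1:
    0 = fully pinched (stack collapses), 1 = fully open (images fan out)."""
    (x1, y1), (x2, y2) = landmarks[THUMB_TIP], landmarks[INDEX_TIP]
    d = math.hypot(x2 - x1, y2 - y1)
    t = (d - closed) / (open_ - closed)
    return max(0.0, min(1.0, t))  # clamp into [0, 1]

hand = [(0.5, 0.5)] * 21             # dummy skeleton
hand[THUMB_TIP] = (0.40, 0.50)
hand[INDEX_TIP] = (0.54, 0.50)       # 0.14 apart: partway open
print(round(pinch_amount(hand), 2))  # 0.5
```

Each frame, that scalar would drive the fan-out radius of the image stack, which is why the effect needs no hardware beyond the camera.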
Charlie Chess Penguin
Charlie Chess Penguin@chanceryknight·
@defileo He is my brother-in-law, and he is very grateful for everyone's positive energy and feedback. And he is not a 20 year old, American, or in college
English
1
0
3
22
ojrgb
ojrgb@jacodesby·
@defileo DUDE STOP POSTING ABOUT ME PLS
English
0
0
0
9
ojrgb
ojrgb@jacodesby·
If you made it here from one of those posts calling me a 20 yr old American student who sold his software for $230k, welcome. I’ll be drip feeding some vids so you can see it’s actually me. Not vibe coded, just coded. Peace! ✌️ #touchdesigner #vfx #visuals
English
0
0
2
156
ojrgb
ojrgb@jacodesby·
@aionfork @defileo I made the original. First of all, the post is cap. Second, it uses a custom GLSL mat I built based on research by @MaximeHeckel . You’re correct re: mediapipe.
English
1
0
1
48
David Conner
David Conner@aionfork·
@defileo The light is actually refracting through the silk and imitating spectral refraction… probably the most impressive part. It’s using mediapipe. I can’t tell if the silk is real. Or anything anymore. Cam could be inferring the surface … which would definitely require heavy GPUs
English
2
0
1
726
Defileo🔮
Defileo🔮@defileo·
Apple spent 7 years and $3,500 per unit building Vision Pro; this guy did something better with a webcam in his bedroom. And no, this isn't a render, it's running live on his laptop right now with a webcam pointed at his hands.

The webcam tracks 21 points on each of his hands in real time, every fingertip, every knuckle, every joint, mapped to a 3D skeleton at 60 frames per second. A piece of digital silk is rigged to that skeleton: when he opens his palm the fabric drapes across his fingers, when he closes his hand it crumples in his fist, when he tilts his wrist the cloth slides off and folds with real physics, light reflecting off it like actual material. He can grab it, stretch it, throw it, catch it, all with his bare hands and zero hardware on his body.

No headset, no gloves, no $3,500 face computer, no 600 gram brick on his skull, no 2 hour battery, no Apple ID, no App Store. Just a webcam, TouchDesigner, and a kid who saw the Vision Pro keynote and thought: I can do that for free.

Apple has 3,000 engineers, $200 billion in cash, and 7 years of development time, and they built a face computer most people will never own. This guy has a laptop, a webcam, and a weekend, and he built a future most people can actually use. The mixed reality industry just got embarrassed by a guy in a white t-shirt.
Defileo🔮@defileo

x.com/i/article/2048…

English
87
189
3.2K
3M
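Stripped of the hype, "a piece of digital silk rigged to that skeleton" is a skinning problem: each cloth vertex follows a weighted blend of nearby hand landmarks. A toy version under that assumption (a real cloth setup would add springs, gravity, and collision on top of this):

```python
# Toy skinning: each cloth vertex follows a distance-weighted blend of
# hand landmarks, so moving the hand drags the cloth with it.
import math

def skin_vertex(vertex, landmarks, falloff=0.1):
    """Weighted average of landmark positions; closer landmarks pull harder."""
    total, wx, wy = 0.0, 0.0, 0.0
    for lx, ly in landmarks:
        d = math.hypot(vertex[0] - lx, vertex[1] - ly)
        w = math.exp(-d / falloff)  # exponential falloff with distance
        total += w
        wx += w * lx
        wy += w * ly
    return (wx / total, wy / total)

# One landmark only: the vertex is pulled straight onto it.
print(skin_vertex((0.2, 0.2), [(0.7, 0.7)]))  # ~(0.7, 0.7)
```

Run over a grid of cloth vertices every frame, this is enough to make fabric drape over an open palm and crumple into a fist; the spring and lighting passes are what make it read as silk.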
ojrgb
ojrgb@jacodesby·
@shirtweiner @defileo It was me, it wasn't vibe coded, it was built in TouchDesigner. You're right on all other counts
English
1
0
0
14
big bob
big bob@shirtweiner·
@defileo Ahhh yes the daily post about some kid who made a shitty point cloud filter. He didn't sell it for hundreds of thousands, it has nothing to do with Apple. It's a shit vibe coded filter and you're comparing it to hardware
English
1
0
10
1.6K
ojrgb
ojrgb@jacodesby·
@stillwritescode @defileo Correct. I made the original. It uses a refractive material as opposed to most liquid glass effects that use displacement maps. Pure cap. 🧢
English
0
0
1
12
Steven Williams
Steven Williams@stillwritescode·
@defileo This is really a recreation of Liquid Glass - the UI design system created with Vision Pro in mind. What this guy built is cool but it really has nothing to do with VR or VR headsets.
English
1
0
12
2.8K
ojrgb
ojrgb@jacodesby·
@juliussland @defileo No they don’t, I made the original video and it has very little utility outside of the effect I recorded.
English
0
0
0
13
Julius
Julius@juliussland·
@defileo It’s interesting how simple ideas sometimes outperform heavily resourced projects, it really comes down to execution and timing
English
1
0
1
1.8K
ojrgb
ojrgb@jacodesby·
@fuhQthx @defileo Not only AI generated but also straight up wrong. I’m not a kid, I’m 34.
English
0
0
3
12
!
!@fuhQthx·
@defileo So sick of these endless AI generated descriptions on every post these days. It's so fucking blatantly obvious you didn't even write that.
English
1
0
70
3.7K