Captain Code ᯅ

609 posts


@CaptainCodeOnX

Bio-digital jazz, man! gotchi gotchi gotchi @opengotchi XRmaxxing @BlendedBldrs

the grid · Joined September 2022
80 Following · 179 Followers
Pinned Tweet
Captain Code ᯅ@CaptainCodeOnX·
"A new species would bless me as its creator and source; many happy and excellent natures would owe their being to me."
[image]
Vicharak@Vicharak_In·
@CaptainCodeOnX you'll be able to learn it, that's the whole point. don't worry.
Vicharak@Vicharak_In·
Ladies and gentlemen, as we all know, @Qualcomm’s IoT SoC will empower the entire ecosystem out there. We’re introducing something exceptional in the single-board computer domain, and with it we’ve also explored modularity, much like @FrameworkPuter. It’s powered by the QCS6490, delivering 12 TOPS of performance.
Captain Code ᯅ@CaptainCodeOnX·
the driver's watching tiktoks and driving. send help
[image]
Fluent@fluentxyz·
May the 4th be with you.
[image]
Blaze@browomo·
This guy built a refraction effect in TouchDesigner in his bedroom, and in one weekend reproduced what Disney spends $100 million and a team of 50 people on. His project is called "Refraction Ball with Particles" and runs on a regular MacBook with one USB camera. No Hollywood pipeline, no render farm, and no VFX agency behind him.

He runs MediaPipe Hands on a USB camera feed inside TouchDesigner. 21 points on each hand are tracked at 60 frames per second. One scene with real liquid-glass physics renders right in the application window, with no separate build and no render time.

His project is built on this setup:
- FPS: 65.0
- Hand tracker: MediaPipe Hands, 21 landmarks per hand
- Refraction shader: real-time glass material with chromatic aberration
- Particle system: 8,000 particles attached to the hand skeleton
- Glass material: dynamic IOR, light reflection in real time
- Camera: USB, 1080p input

And this stack just works: the camera catches every finger, and the preview renders without delay. The guy iterates on one effect over and over. On his laptop are files like Refraction Ball.16.toe, Bendy Fabric.52.toe, and dozens of others; each iteration is an attempt to get closer to the quality VFX studios used to be paid for. Gradually the stack reaches a state where his effects look like frames from expensive films.

And then it gets interesting: where is the line right now between a big studio and a guy with TouchDesigner? Every new effect is built from scratch in one room, and most of the VFX pipeline now fits in one MacBook on a desk.

Here is what is rendering on his screen right now:
- "Refraction Ball with Particles": a glass ball with 8,000 particles attached to the skeleton of the hands
- "Prism Cube": an orange-pink cube with chromatic aberration that reacts to the tilt of the palms
- "Glass Face": a liquid-glass mask with real light reflection and background refraction
- "Bendy Fabric": rainbow textile with rainbow refraction that bends between the hands

The entire stack is open: one USB camera, a regular MacBook, a TouchDesigner project that downloads in just 30 seconds, and a cat on the shelf behind him lazily watching the whole process. From what I have observed, this is one of the most polished one-man VFX stacks I have seen recently. Would you be able to tell an effect like this, shot with a USB camera in a room, apart from a real VFX frame in a movie if you did not know how it was made?
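The "8,000 particles attached to skeleton" detail implies a small bookkeeping step: splitting a fixed particle budget evenly across MediaPipe's 21 landmarks per hand (42 anchors for two hands). A minimal Python sketch of that step, purely illustrative — the names (`assign_particles`, `PARTICLES`) are not from the actual project, which is built in TouchDesigner:

```python
# Hypothetical sketch: distribute a particle budget across hand landmarks.
LANDMARKS_PER_HAND = 21   # MediaPipe Hands tracks 21 landmarks per hand
PARTICLES = 8_000         # particle budget mentioned in the tweet

def assign_particles(n_particles: int, n_hands: int = 2) -> list[int]:
    """Return a per-landmark particle count, spreading the budget evenly.

    The first `extra` anchors each get one additional particle so the
    counts sum exactly to the requested budget.
    """
    anchors = LANDMARKS_PER_HAND * n_hands
    base, extra = divmod(n_particles, anchors)
    return [base + (1 if i < extra else 0) for i in range(anchors)]

counts = assign_particles(PARTICLES)
assert sum(counts) == PARTICLES          # budget is preserved exactly
assert max(counts) - min(counts) <= 1    # distribution is near-uniform
```

In a real TouchDesigner setup the per-anchor counts would feed a particle SOP or GPU instancing, with each group parented to its landmark's tracked position each frame.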
Anatoli Kopadze@AnatoliKopadze

x.com/i/article/2050…
