@de_vallibus Thanks. The main raytracer is too huge, so I had Gemini generate everything (code and docs) for the decoupled part. As with the main project, I didn't read or write a single line of code; I only did the testing.
github.com/MEBYCentral/we…
@meby_central Is there a repository? I'd be interested to look at this. I'm making a Rust + three.js game engine, and I already auto-generate Rust APIs to offload work.
Yours seems like a brilliant idea.
@AiDevCraft Exactly. With a well-defined protocol, data generation and rendering become independent projects. The biggest advantage is being able to develop multiple generation tools in parallel without affecting the renderer.
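The decoupling described above could be sketched as a small versioned message schema shared by the generator and the renderer. All names below (`FrameMessage`, `encode`, `decode`) are hypothetical illustrations, not taken from the actual project:

```typescript
// Hypothetical protocol message: a generation tool emits frame data,
// and any renderer that understands this schema can consume it.
interface FrameMessage {
  version: number;      // protocol version, so both sides can evolve independently
  frameIndex: number;
  camera: { position: [number, number, number]; lookAt: [number, number, number] };
  // The renderer only depends on the agreed-upon shape, not on how it was generated.
  objects: Array<{ kind: "sphere" | "mesh"; data: unknown }>;
}

// The generator serializes messages...
function encode(msg: FrameMessage): string {
  return JSON.stringify(msg);
}

// ...and the renderer parses and minimally validates them.
function decode(raw: string): FrameMessage {
  const msg = JSON.parse(raw) as FrameMessage;
  if (typeof msg.version !== "number" || typeof msg.frameIndex !== "number") {
    throw new Error("malformed frame message");
  }
  return msg;
}

const sample: FrameMessage = {
  version: 1,
  frameIndex: 0,
  camera: { position: [0, 1, 5], lookAt: [0, 0, 0] },
  objects: [{ kind: "sphere", data: { radius: 1 } }],
};
const roundTripped = decode(encode(sample));
console.log(roundTripped.version, roundTripped.objects.length);
```

Because the wire format is the only contract, several generators can be developed in parallel against it without touching renderer code.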
@iabom I don't use real-time screen recording. I save each rendered frame into a memory array and then encode them all into an MP4 using ffmpeg.wasm in the browser.
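A minimal sketch of that frame-accumulation pattern, assuming raw RGBA frames. The `FrameRecorder` class is illustrative, not from the real code, and the ffmpeg.wasm invocation is shown only in comments (it follows the `@ffmpeg/ffmpeg` 0.12 API and would run in a browser, not here):

```typescript
// Accumulate raw RGBA frames in memory, then hand them to ffmpeg.wasm once.
class FrameRecorder {
  private frames: Uint8Array[] = [];

  constructor(readonly width: number, readonly height: number) {}

  // Call once per rendered frame (e.g. with canvas pixel data).
  capture(rgba: Uint8Array): void {
    if (rgba.length !== this.width * this.height * 4) {
      throw new Error("unexpected frame size");
    }
    this.frames.push(rgba.slice()); // copy so the caller can reuse its buffer
  }

  get frameCount(): number {
    return this.frames.length;
  }

  // Concatenate all frames into one buffer for ffmpeg's rawvideo demuxer.
  toRawVideo(): Uint8Array {
    const frameBytes = this.width * this.height * 4;
    const out = new Uint8Array(this.frames.length * frameBytes);
    this.frames.forEach((f, i) => out.set(f, i * frameBytes));
    return out;
  }
}

// In the browser, the concatenated buffer would then be encoded roughly like:
//
//   const ffmpeg = new FFmpeg();              // from "@ffmpeg/ffmpeg"
//   await ffmpeg.load();
//   await ffmpeg.writeFile("frames.raw", recorder.toRawVideo());
//   await ffmpeg.exec([
//     "-f", "rawvideo", "-pix_fmt", "rgba",
//     "-s", `${recorder.width}x${recorder.height}`, "-r", "30",
//     "-i", "frames.raw", "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4",
//   ]);
//   const mp4 = await ffmpeg.readFile("out.mp4");

const recorder = new FrameRecorder(2, 2);          // tiny 2x2 test frames
recorder.capture(new Uint8Array(16));              // one black frame
recorder.capture(new Uint8Array(16).fill(255));    // one white frame
console.log(recorder.frameCount, recorder.toRawVideo().length);
```

Encoding in one batch at the end (instead of streaming) trades memory for simplicity: frame timing never depends on render speed, which is what makes the "future real-time quality" preview possible.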
@AiDevCraft Thank you. That’s exactly right. I believe the strength of this approach is that it lets us easily visualize "the quality of real-time rendering a few years from now" without waiting for hardware to catch up.