Startracker 🔺

8.7K posts

@startracker

Vibe coding a game engine, with updates every day! Plus some genAI movies. Creative. Tech. Co-hosting 'The Forum' every Thursday from 7 AM PST on @TheArena

Second Reality · Joined October 2012
1.7K Following · 5.8K Followers
Pinned Tweet
Startracker 🔺@startracker·
FORGET MY PREVIOUS POST, LET ME BLOW YOUR MINDS! 🤯 NEW METHOD! AI 2D SPRITES ARE SOLVED. PERIOD. Day 50 of building a WebGL game engine using @cursor_ai 🔽🔽🔽

I was wrong. The previous system had flaws; it wasn't the best. On VFX-heavy sprites it would start falling apart, either with grain/artifacts or with displacement. But I just couldn't give up. I knew I needed something more; this was the limit the models could give me. So I consulted science: Zheng et al., "Bilateral Reference for High-Resolution Dichotomous Image Segmentation" (CAAI AIR 2024, arXiv:2401.03407), i.e. BiRefNet. If any of the brilliant minds who wrote this paper read this post: I'm extremely thankful for your research. It let me solve a really big problem with 2D sprites, and I'm extremely grateful! Thank you! Please reach out if you see this!

Now, how the solution works:
1⃣ Generate your Kling animation (still king for anime 2D style).
2⃣ Run BiRefNet (the HR-matting variant rocks) → get a solid character alpha.
3⃣ Compute a simple brightness/luminance alpha (luma → alpha curve, easy in any tool).
4⃣ Final alpha = max(BiRefNet_alpha, brightness_alpha). That's it; no fancy weights needed.
5⃣ Feed it to your engine / Comfy / whatever. Dynamic lights now play nice, with zero artifacts.

Mind you, this solution requires an NVIDIA GPU for local inference. Alternatively, you can likely get it done on Colab with the free tier! You can try CPU, but no guarantees; try a lighter model like BiRefNet_lite-2K.

The demo is purposely at 30 fps, even though I wouldn't run an anime 2D game above 12 fps, for both style and performance. I just wanted to show you the absolute results.

Please bookmark this! ☑️ If you love 2D and sprite work, this is truly the best method I could find. Use it, steal it, enjoy it! Repost if it saves your workflows too!

I think I need a break... almost 40 hours straight solving this. Next up are dynamic shadows, and I'm not looking forward to that...
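Steps 3 and 4 of the pipeline above boil down to a few lines of per-pixel math. Here's a minimal sketch; the function names and the Rec. 709 luma weights are my own illustration, not the author's actual tooling, and "identity curve" is the simplest possible choice for the luma → alpha mapping mentioned in the post.

```python
# Sketch of "final alpha = max(BiRefNet_alpha, brightness_alpha)".
# Pixels are (r, g, b) tuples in 0-255; alphas are floats in 0.0-1.0.

def luma_alpha(pixel):
    """Brightness-based alpha: Rec. 709 luma mapped straight to 0-1
    (an identity luma -> alpha curve; any tool can apply a fancier one)."""
    r, g, b = pixel
    return (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0

def final_alpha(birefnet_alpha, pixel):
    """Combine the segmentation matte with the brightness alpha via max,
    so solid character regions AND bright additive VFX both stay opaque."""
    return max(birefnet_alpha, luma_alpha(pixel))

# Dark body pixel: the BiRefNet matte already covers it, luma is irrelevant.
print(final_alpha(1.0, (40, 40, 40)))     # 1.0

# Bright glow outside the character matte: BiRefNet says 0, luma rescues it.
print(final_alpha(0.0, (255, 255, 255)))  # ~1.0
```

The `max` is what makes this work on VFX-heavy sprites: segmentation alone clips glows and particles, while brightness alone makes dark body parts translucent; taking the maximum keeps both.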
Startracker 🔺@startracker

I MADE THE BEST 2D SPRITE AI PIPELINE EVER! After testing so much, doing tons of math, and trying various methods, I finally got truly the cleanest sprite result I could ask for! You should bookmark this post! ✅ I think this is the happiest 'aha' moment I've had since starting this journey. Day 48 of making a custom WebGL game engine using @cursor_ai 🔽🔽🔽

I've been trying to solve sprites since I started this project. I came close a couple of times, but it was always something: either dynamic lights wouldn't work, or the genAI results would come out grainy, or I would end up with incorrect alpha, etc. After so much struggling, wanting to give up, and thinking it was impossible, it was finally done! Perfect matte interpolation of black and white backgrounds, resulting in a clean outline of the sprite, with half-transparent effects keeping their full quality.

But this was not enough for me. I wanted more. I wanted those dynamic lights for the most badass anime effects that I'll be creating over the next many weeks! And somehow, I managed to come up with the perfect balance: nice to the eye, with almost no visible imperfections or artifacts under any kind of dynamic lights. See for yourself! 🔽🔽🔽

How does this magic happen?
1⃣ First, the animation is created using @Kling_ai. I found it the best at keeping the quality anime 2D style I wanted, but I guess others may work as well.
2⃣ Then, the frames are carefully extracted at lossless quality and processed, via automation, through a Nano Banana pass to create an opposite background (white->black or black->white).
3⃣ Once we have a proper spritesheet, I use MiDaS to compute the depth maps, and then algorithmically create PBR maps for lighting. The remaining depth maps can also be used for various effects, although I haven't explored that much yet.
4⃣ Finally, all elements are cut into single frames and overlapped with their respective normal maps. If done right, they're pretty much ready for runtime.

I like my heroes to be 512 px to truly stand out like fighting game characters, but all of this works with far smaller or bigger sprites. Mind you, it's important to note that Nano Banana has a max output of 4K. Therefore you don't want to make your atlases larger than 4K, which is still quite a lot, since it allows you 8x8 = 64 frames at 512 px. 💡💡💡

I believe the output quality is beyond anything currently available, and I'll be looking into packaging the whole thing into a web app so people can use it. If you find this useful, please repost it! I really wanted to share this method with the world. Thank you for reading! Look forward to the day 49 update!
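The "matte interpolation of black & white backgrounds" in step 2 reads like classic difference matting: generate the same frame over black and over white, then recover alpha and color from the two exposures. A minimal sketch of that math, under the assumption that this is indeed the technique; every name here is mine, not the author's tool:

```python
# Difference matting from a black-background and a white-background composite.
# Over black:  b = C * a                 (premultiplied color)
# Over white:  w = C * a + 255 * (1 - a)
# Subtract:    w - b = 255 * (1 - a)  =>  a = 1 - (w - b) / 255

def recover_pixel(black_px, white_px):
    """Return (r, g, b, alpha) for one pixel given its two composites.
    Channels are floats in 0-255; alpha comes back in 0.0-1.0."""
    # Alpha is the same in every channel; average to damp generation noise.
    diffs = [w - b for b, w in zip(black_px, white_px)]
    alpha = 1.0 - (sum(diffs) / len(diffs)) / 255.0
    if alpha <= 0.0:
        return (0.0, 0.0, 0.0, 0.0)  # fully transparent; color is undefined
    # Un-premultiply: the black composite IS the premultiplied color.
    color = tuple(b / alpha for b in black_px)
    return (*color, alpha)

# A 50%-transparent pure-red pixel: over black it reads (127.5, 0, 0),
# over white it reads (255, 127.5, 127.5).
print(recover_pixel((127.5, 0.0, 0.0), (255.0, 127.5, 127.5)))
# -> (255.0, 0.0, 0.0, 0.5)
```

This is also why half-transparent effects survive: unlike a binary cut-out, the subtraction recovers fractional alpha per pixel, so glows and smoke keep their gradients.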

Sam@samtechcoded·
@startracker @katedeyneka I use GPT 5.4 whenever I get stuck in one of those loops where nothing seems to fix it. Worked every time so far.
Kate Deyneka@katedeyneka·
GPT-5.4 << Opus 4.6. Yesterday I used only GPT-5.4 - not a very productive day, unfortunately 🥺
Startracker 🔺@startracker·
@milichab Just saw you joined too! This is amazing, you guys are now building the future! You did amazing stuff with Cursor - now bring us to the stars! Rooting for you!
Andrew Milich@milichab·
Best time to join
Umesh Khanna 🇨🇦🇺🇸@forwarddeploy

Wave after wave: in just the last few days, founders and founding engineers from incredible startups have joined @xai. The talent density is insane, and getting stronger every week. If you've gone 0→1 before and want your next moonshot, join forces with @milichab @JasonBud @dchaplot @AmanGotchu @wwzeng1 and the rest of us who embrace the startup DNA. DMs open. Come make history 🔥

Startracker 🔺@startracker·
@henrydaubrez Finally, someone took the time to explain it. Great job, honestly. I'm already tired of explaining to folks that creating good stuff takes hundreds of hours; it just reduces the work to one creative instead of a whole big on-set/VFX team. And no one even mentions the compute costs.
Henry Daubrez 🌸💀@henrydaubrez·
Junkyard King - Ep. 0 | MAKING OF ⚙️ HERE’S HOW I MADE THIS, STEP BY STEP 🧵👇 1/14
Startracker 🔺@startracker·
The real bottleneck is vision. I aim to solve it. I'll be back in action soon!
Sam@samtechcoded·
@startracker @avax I should have gone with a game entry haha, but nah, I wanted to do something different. I understand my entry is disappointing, but we move forward.
Sam@samtechcoded·
NGMI - out of the @avax build games. But seriously, I learned a lot from the workshops and appreciated getting all that info from the team. I know the way I built my product needs to change A LOT. Also, I NEED other people on a team; I have some things I'm just not good at (yet).
Startracker 🔺@startracker·
@alightinastorm That crowd is actually a minority, but they're loud. Most consumers don't care and don't know. And if they bother you, you can always build for Asian markets; they're quite open to it!
カゲ@NEON_KAGE·
@TheRundownAI Hey! Thanks for the article, but some of this is inaccurate! I didn't care what people were saying; the minute this gained traction, I wanted to make it real for the fans. I began having conversations to do this in late September, less than two months into the band's formation. 🤘
The Rundown AI@TheRundownAI·
Someone used Suno AI to generate a Japanese metal band called Neon Oni. Fake member bios, AI-generated music videos, "Based in Tokyo" on Spotify. 80,000+ monthly listeners. Fans had it in their Spotify Wrapped top 5. Merch was selling. Then, community sleuths exposed it. Traced the creator's account to Europe. Spotted AI-generated hands in the music videos. The creator's response? Recruit 7 real musicians from actual Tokyo bands to perform the AI-generated songs live. They've now played several live shows and have more on the books. From an interview with the band's creator: "In an age where AI is taking everyone's jobs, this has actually created jobs. It's done the complete opposite." The AI --> real band transformation is a wild one.
Startracker 🔺@startracker·
@exQUIZitely This one, Aztec, and Inferno were in regular rotation in most cybercafes during the 1.3 era. But Assault was the go-to when you weren't playing pubs, just with a few friends.
exQUIZitely 🕹️@exQUIZitely·
The bomb has been planted. The most played map in FPS history is Dust2. What sets it apart in discussions of “most played” or “greatest” is its longevity and sheer volume: it’s been a staple since Counter-Strike 1.6 (early 2000s), carried through Source, CS:GO, and CS2. No other FPS map has been played that much over such a long time - now in its third decade.
Todd Grilliot@GrilliotTodd·
Pixel Engine v1.1 (left) vs Nano Banana 2 (right). I gave both models basically the same prompt. Unless you've tried to create spritesheets with image models before, you probably don't realize how hard it is to get good motion. Image models don't want to animate things; they want to make images. I trained my own animation model (Pixel Engine 1.1) to solve this problem. It's a completely novel approach, and it works surprisingly well. You can try it for free right now in the Pixel Engine app.
dothackzero@dothackzero72·
@techhalla Interesting, but how does it handle PBR workflows for sprites? I'm looking for something that can look at the diffuse data and derive the material data from it. youtube.com/watch?v=2VyKnZ…
TechHalla@techhalla·
Indie game devs are about to love me (or hate me) for this... I built an AI workflow (app included) that spits out spritesheets in minutes, from assets created on Freepik. Breaking it all down below 👇
Startracker 🔺@startracker·
@chongdashu Thank you! Feel free to steal my workflows! Although I'm pretty sure once nano banana or gpt image start being able to generate good transparency we'll have everything solved easily!
Chong-U@chongdashu·
@startracker Big fan of your stuff, more advanced than mine!
Chong-U@chongdashu·
The article's technique for spritesheets works well. But it DOESN'T work as well for one thing... ➡️🚶‍♂️ Walkcycles. No matter what, it just wouldn't get it right. So I tried what others suggested: → Generate video (I used Sora 2) → Extract frames + stitch into spritesheet The results are actually quite promising.
Chong-U@chongdashu

x.com/i/article/2031…
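The video → frames → spritesheet route described above mostly comes down to grid arithmetic once the frames are extracted (e.g. with ffmpeg). A small sketch of the placement math; the function and its defaults are my own illustration, with the 512 px frame / 4096 px atlas sizes borrowed from the sprite discussion elsewhere in this thread:

```python
# Where does frame i land in a fixed-grid spritesheet atlas?
# With 512 px frames and a 4096 px atlas you get an 8x8 grid (64 frames).

def frame_rect(i, frame_px=512, atlas_px=4096):
    """Pixel rect (x, y, w, h) of frame i, laid out in row-major order."""
    cols = atlas_px // frame_px          # frames per row
    col, row = i % cols, i // cols       # grid cell of frame i
    return (col * frame_px, row * frame_px, frame_px, frame_px)

print(frame_rect(0))   # (0, 0, 512, 512)
print(frame_rect(10))  # (1024, 512, 512, 512): column 2, row 1
```

The same arithmetic works in both directions: use it to paste extracted frames into the atlas, and again at runtime to compute each frame's UV rectangle.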

Startracker 🔺@startracker·
@alightinastorm There's an answer to 'everyone is building apps in 2 mins but nothing actually gets shipped'. I had some of those offers too. Always told them it doesn't align with what I'm doing
robot@alightinastorm·
just so you know, replit is hiring promotion agencies, which in turn pay 1-2k for viral posts. they give you a full briefing, including post hooks and anatomy, to make it go peak viral. these agencies never require disclosure. i know because they approach me. lock them all up nikita
jordwalke@jordwalke

You've never vibe coded like this before. @replit is now your canvas for building & creating. In this video I'm building a graphics shader app. I click Replit's canvas button and see the live app running on infinite editable canvas, generate several design variants in parallel side by side, bring one back into my app. Just 1 of the things we shipped.

robot@alightinastorm·
thanks guys next stop, 10k
Startracker 🔺@startracker·
@pablope62300434 Problem solving is all about the brain. AI has nothing to do with it. It doesn't solve problems; it just turns solutions into executables.
synthian wall facer@pablope62300434·
@startracker Love yourself, you are better than AI. Yes, it hurts to learn and use the brain, but it's the only way to improve.
Startracker 🔺@startracker·
If you see Cursor only as an IDE you're doing it wrong. In my eyes it becomes an OS. In fact most power users treat it as one. I wouldn't be surprised if they're actually building a real OS internally. Skills are the apps.