Jeroen Pixel

516 posts

@pixelprotest_

Developer and 3D artist - AI/ML/VR

London, UK · Joined November 2016

189 Following · 2.8K Followers

Pinned Tweet
Jeroen Pixel @pixelprotest_ ·
My fox-shooting, garden-defending AI robot is finally done and WORKING! 🤩 (Don't worry, it only shoots 💦 water.)

After months of slowly moving forward with each part, I finished the last step: training a TensorFlow model on footage of the 🦊 fox. I collected hours of footage 📹 of the fox roaming around my garden, and from this I labeled around 2,000 images by hand ✋.

Honestly, I was quite skeptical that training the model was actually gonna work; maybe this was partly the reason I avoided it until the very end. If I couldn't train a model to detect the fox, this whole robot would never be able to function properly. On the flip side, with no previous experience in hardware or electronics there was a bit of a learning curve, and I didn't want to end up labeling thousands of images and training a TensorFlow model, only to fail at building the hardware.

As I started building, I realized that mixing hardware and software adds quite another dimension to debugging. At times I wasted hours debugging code in my IDE, only to realize the issue was somewhere in the electronics. Furthermore, combining this side project with a full-time job and a young family is not always easy. It can be quite frustrating to know you only need 4 hours of concentrated effort for a small task, yet have to spread it out across a week of 20-minute increments.

Then, a few months into the build, I noticed the fox had stopped coming to my garden. One day I recorded her walking with 3 cute little 🐶 pups, and the next day I saw her moving out of my garden completely. Did she know I was building a robot? I had this strange mix of feelings: happy my garden was safe from poop and digging, happy she was safe with her pups, but how was I gonna finish this project if my robot had no fox to detect?
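Matching a detector's predicted boxes against hand-labeled ones, as in the labeling-and-training loop above, is conventionally done with IoU (intersection over union). A minimal sketch of that standard metric, with made-up boxes (this is not the author's code):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x, y, w, h) bounding boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Overlap rectangle: max of the top-left corners, min of the bottom-right.
    ix = max(ax, bx)
    iy = max(ay, by)
    ix2 = min(ax + aw, bx + bw)
    iy2 = min(ay + ah, by + bh)
    inter = max(0, ix2 - ix) * max(0, iy2 - iy)
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

# A prediction that overlaps the hand-labeled box well scores near 1.0;
# mAP then aggregates precision over such matches at various IoU cutoffs.
print(iou((0, 0, 10, 10), (0, 0, 10, 10)))  # identical boxes -> 1.0
print(iou((0, 0, 10, 10), (5, 0, 10, 10)))  # half-shifted -> 50/150 ≈ 0.333
```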
I figured for sure they would be back next year, so I could postpone the whole thing until next winter, but I also knew it was gonna be much harder to pick up momentum if I let it sit there for six months. So I decided to keep working, hoping the fox would reappear... but she never did.

As I finished labeling the footage and started training my model, I could finally see the mAP results, quantifying the precision of my object detection model. It was measuring at 78% across different metrics on detecting my fox. I quickly ran the model on some of the video footage I had of the fox. Inference speed took a hit, but it did a near-perfect job detecting her, even when she was deep down in the grass or whizzing past in a motion blur. It took me by surprise how well it worked. With the default model I had to drop my confidence threshold way down to 15% just to recognize the fox as a 🦜 "bird" in one or two frames; with my custom model it followed the fox all the way down to the back of the garden!

Still, this didn't solve the issue of there being no actual fox in my garden, or how I was gonna wrap up this project in a short timeframe. I played with the idea of putting a fox toy 🧸 on an RC 🚗 car, or borrowing a dog to run around the garden for testing. Friends suggested I run around the garden in a fox costume... what a ridiculous idea. I wasn't really feeling the idea of running around in a floppy cloth fox 🎭 costume, but I had a look anyway and came across these self-inflating costumes. This could actually be perfect: since it's inflated, it holds its shape super well, making it much easier to label, train on, and be recognized by my robot.

So I got the costume and shot a time lapse of myself as a fox walking around the garden. I labeled around 600 images from it, ran the model training again, and got a mAP result of 82%. This was even better than my real fox! At this point I knew this was gonna work. So here's the final 🎥 video, just having some fun with it.
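The confidence-threshold tuning described above (15% for the generic model vs. a sane cutoff for the custom one) boils down to a simple filter over detections. A sketch with hypothetical detection tuples, not the author's actual pipeline:

```python
# Each detection is (label, confidence, (x, y, w, h)). Hypothetical data.

def filter_detections(detections, threshold):
    """Keep only detections at or above the confidence threshold."""
    return [d for d in detections if d[1] >= threshold]

# With a generic pretrained model, the fox may only surface as a
# low-confidence "bird", forcing the threshold way down (e.g. 0.15):
raw = [("bird", 0.16, (120, 80, 60, 40)),
       ("dog", 0.09, (300, 200, 80, 50))]
print(filter_detections(raw, 0.15))   # only the 0.16 "bird" survives

# A model fine-tuned on labeled fox images returns "fox" with high
# confidence, so a normal threshold (e.g. 0.5) still tracks it:
custom = [("fox", 0.91, (118, 82, 62, 41))]
print(filter_detections(custom, 0.5))
```

The trade-off: a low threshold on a generic model floods you with false positives, while a fine-tuned model lets you keep the threshold high and the squirter honest.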
I'll update here whenever the real fox does come back.

On a final note, I'm now looking for (remote) jobs in these fields of AI:
- object detection
- visual generative AI
- 3D (NeRFs + Gaussian splats)

So if you know of anything, let me know! My DMs are open 😊
Jeroen Pixel @pixelprotest_

✨ I'm building an AI lawn defender robot. Spring is coming. Last year I fixed my lawn by 🌱 seeding it from scratch, with great results, and I highly recommend it. Sadly it also attracts animals 🐱🦊 that love digging holes and pooping everywhere. So I'm building an AI lawn defender robot that can detect animals and spray them with 💦 water 💦 while letting people enjoy the lawn undisturbed. Although I've seen a 🦊, I'm not 100% sure it's the only animal, so I'm taking the naive approach and figuring it out step by step. For now I'm starting simple with a camera, a Raspberry Pi, and OpenCV to collect some insights into what's actually happening in my garden.
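The "camera + RPi + OpenCV" starting point above usually means deciding which frames are worth keeping by detecting motion. In practice OpenCV's background subtractors handle this; the core idea, shown here in plain Python over tiny made-up grayscale frames (not the author's code):

```python
# Frame-differencing sketch: compare consecutive grayscale frames and
# flag the frame if enough pixels changed brightness. Frames are 2D lists
# of 0-255 brightness values (hypothetical toy data).

def motion_score(prev_frame, next_frame):
    """Fraction of pixels whose brightness changed by more than a threshold."""
    changed = 0
    total = 0
    for row_a, row_b in zip(prev_frame, next_frame):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > 25:   # per-pixel brightness change threshold
                changed += 1
    return changed / total

frame1 = [[10, 10, 10],
          [10, 10, 10]]
frame2 = [[10, 10, 10],
          [200, 200, 10]]        # something bright moved into the bottom row

score = motion_score(frame1, frame2)
print(score)  # 2 of 6 pixels changed; above e.g. 0.1 -> worth saving the clip
```

Saving only high-scoring clips keeps the Pi's storage manageable while still capturing every garden visitor for later labeling.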

Jeroen Pixel retweeted
@levelsio ·
🧑‍⚖️ A giant AI company called @recraftai launched an upscaler this month. But they named it after one of the most popular AI upscalers that already exists, made by an indie maker: 💎 Clarity by @philz1337x. They've been notified but are refusing to stop using the name. What should he do?
[3 images attached]
Recraft @recraftai

@vasya_omelchuk Hello! We recommend using the Clarity Upscaler. If you’d like enhanced details, the Creative Upscaler is a great choice, though it may slightly adjust the pattern’s appearance.

Jeroen Pixel @pixelprotest_ ·
@mhyrr Had to look it up, so right!
[image attached]
Greg Olsen @mhyrr ·
@pixelprotest_ Huh, at first I really thought the one on the right was the well-known woodworker Erik Curtis... look up encurtis on YouTube.
Jeroen Pixel @pixelprotest_ ·
Wanted to compare the new Stable Diffusion 3.5 to Flux dev. SD3.5 on the left, Flux on the right: "hipster man with a beard, building a chair, in a wood shop"
[2 images attached]
Jeroen Pixel @pixelprotest_ ·
"a bear building a log cabin in the snow covered mountains"
[2 images attached]
Jeroen Pixel @pixelprotest_ ·
"a man showing off his cool new t shirt at the beach, a shark is jumping out of the water in the background"
[2 images attached]
Jeroen Pixel @pixelprotest_ ·
Here's Haiper having a go.
Jeroen Pixel @pixelprotest_ ·
Looking forward to the Kling results, but they tend to take 3 days to generate, so fingers crossed.
Jeroen Pixel @pixelprotest_ ·
I used AI to help clean the shed, sound on.
Jeroen Pixel @pixelprotest_ ·
@PurzBeats Bro such a big fan of your comfy tutorials ❤️