petey

411 posts

@ossamandere

T4 sw engineer

free state of FL · Joined April 2023
317 Following · 63 Followers
Space and Technology
Space and Technology@spaceandtech_·
Allonic Robotics has introduced a new robotic hand built without screws, cables, or complex joints. It uses braided fibers to create both tendons and structure in a single automated step, making it strong, flexible, and smooth in motion. This approach could simplify manufacturing and speed up the production of robotic hands.
English
48
772
3.5K
290.5K
petey
petey@ossamandere·
@hyprturing didn't even get out of red. got to the meeting in the shielded section and just started laughing so hard
English
0
0
0
160
lachlan
lachlan@hyprturing·
i don’t think I’ve ever met anyone who has finished green mars or even started blue mars. there might be no one in the whole world who’s done it
English
34
0
87
8K
petey
petey@ossamandere·
I am bothered by how much caffeine affects my life
English
0
0
1
8
petey
petey@ossamandere·
i may sub just to grok all over vagueposts
English
0
0
0
7
Yuchen Jin
Yuchen Jin@Yuchenj_UW·
Friends at both big tech and startups tell me they’re spending more than $1000 per day on Claude Code or Codex tokens. That’s $365,000/year. We’re not far from companies spending more on LLM tokens than on human employees.
English
322
110
2.3K
217.1K
Kyle Walker
Kyle Walker@kyle_e_walker·
I'm struggling with an answer to "Can you share the code?"

I've always been a big believer in open source software. I built my career on it. When I post a cool project / demo, inevitably I get the question "can you share the code?"

Before LLMs, I would typically share - as this question was a request to be taught. Even if someone had the code, it would take time to reproduce what I had done.

With LLMs, however, the question feels different. If I share a project repo, LLMs can often reproduce or adapt it in one shot. So the question intent changes from "can you teach me how to do this?" to "can I have your work product for free?" Even if that's not intentional by the question asker.

I'm still going to keep open sourcing tools to build things - I've released more packages in the past few months than I ever have. But the value in open sourcing project code itself seems to have diminished.

I'm still working through where the line is. But I'm pretty sure the answer has changed from where it was two years ago.
English
14
8
81
10.6K
petey
petey@ossamandere·
@svpino it kinda did, everybody's job is turning into prompt engineering
English
0
0
0
33
Santiago
Santiago@svpino·
I still remember when people thought "prompt engineering" was going to become a real career.
English
398
368
9.6K
420K
petey
petey@ossamandere·
experiment failed
the AMD Radeon AI Pro R9700 is basically not supported by vLLM yet. every change resulted in lower TPS
better luck nxt time 💀
English
0
0
0
61
petey
petey@ossamandere·
oof 8.5 TPS max. ok nvm fuck
English
1
0
0
15
petey
petey@ossamandere·
2 days of work, dozens of flags
40b Qwen3.5 FT Q4
vLLM docker container
dual-r9700 setup
trying to beat 26 TPS
flags in nxt comment if it works. 🙏 @TheAhmadOsman for the rec in an article I can't find right now
English
1
0
0
59
petey
petey@ossamandere·
@CJHandmer Crazy how your timing in building Terraform is working out. Congrats on the groundbreaking!
English
0
0
3
100
Casey Handmer
Casey Handmer@CJHandmer·
Australia (and 100 other countries) will need to rebuild domestic fuel production capacity, and will be doing so as fast as their state capacity and competence allows. For the few that have many decades of local oil left, the path forward is obvious. For the rest, it's time to build solar synthetic fuel.
Hot Rails — oz/acc@hot_rails

Australia lost 44 merchant ships to enemy action in World War II. Many of these were tankers, either importing fuel from abroad (Australia then, as now, had little domestic production) or transporting it domestically from southern population centres to northern operating theatres. Each loss represented weeks of supply; the "small" overall number masks the outsize impact - both practical and psychological - on a nation suddenly exposed and far from its great-power protector.

After the war, Australia moved decisively to reduce that vulnerability. We developed oil and gas resources. We built refineries in every major city. By the 1970s, we had a level of fuel security that previous generations would have recognised as hard-won.

But the long peace made us complacent. As globalisation and large offshore refineries eroded the competitiveness of local plants, we allowed them to close. Efficiency improved on paper. Resilience declined in reality.

Today therefore, as the world again enters a great period of geopolitical upheaval, Australia inexcusably finds herself in a position of vulnerability, uncomfortably similar to that faced in 1940. "We can just use our LNG exports as leverage to secure diesel shipments" is the same logic that once assumed supply lines would always hold. We now know how that story ends.

We need to restore sovereign, domestic capability in the fuels that keep our country running.

English
13
8
100
9.8K
petey
petey@ossamandere·
@karpathy also, every point of view has tradeoffs, which is probably the better takeaway
English
0
0
0
3
Andrej Karpathy
Andrej Karpathy@karpathy·
- Drafted a blog post
- Used an LLM to meticulously improve the argument over 4 hours.
- Wow, feeling great, it’s so convincing!
- Fun idea let’s ask it to argue the opposite.
- LLM demolishes the entire argument and convinces me that the opposite is in fact true.
- lol

The LLMs may elicit an opinion when asked but are extremely competent in arguing almost any direction. This is actually super useful as a tool for forming your own opinions, just make sure to ask different directions and be careful with the sycophancy.
English
1.8K
2.4K
31.4K
3.4M
petey
petey@ossamandere·
@LottoLabs llama-serve is the highest ROI for a user getting into locals beyond using ollama
English
0
0
0
109
Lotto
Lotto@LottoLabs·
So llama.cpp just seemed way more efficient for single users than sglang idk?
English
10
0
30
4.4K
petey
petey@ossamandere·
I am locking my phone in a box at night that only opens after my first prompt
English
0
0
0
32
Tenobrus
Tenobrus@tenobrus·
i feel like i assumed everyone else in the world just kinda aged out of watching youtubers at around the same time i did but im starting to worry that may not be the case? are there still people out there who have like double digit youtube hours watched in 2026??
English
122
0
450
31.8K
petey
petey@ossamandere·
@hibakod I'm using 2 of these with llama.cpp and i can't get above 26 t/s with qwen3.5 finetunes at Q4_K_M using rocm or vulkan as a backend. If you pass that please brag about it and share a config
English
0
0
1
47
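[editor's note: the dual-R9700 llama.cpp setup described in the tweet above could be launched with something like the sketch below. The flags (`-ngl`, `--split-mode`, `--tensor-split`, `-c`) are real llama.cpp `llama-server` options, but the model filename, split ratio, and context size are placeholder assumptions, not petey's actual config.]

```shell
# Hypothetical dual-GPU llama-server launch for a Q4_K_M GGUF quant.
# Model path and numeric values are illustrative placeholders.
llama-server \
  -m qwen3.5-40b-ft-Q4_K_M.gguf \
  -ngl 99 \
  --split-mode layer \
  --tensor-split 1,1 \
  -c 8192 \
  --host 127.0.0.1 --port 8080
```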
petey
petey@ossamandere·
@sudoingX Hermes setup was simple and unintimidating, and it didn't have all the baggage of the hype train.
English
0
0
0
59
Sudo su
Sudo su@sudoingX·
what agent harness are you using and why? drop your reasoning below. lets find out what's keeping you on your current setup or what made you switch.
English
144
4
80
15K
petey
petey@ossamandere·
@Teknium Hey qq are you doing anything to mitigate against possible Python supply chain attacks like the liteLLM one that showed up a few days back?
English
2
0
1
84