

astle dsa
4.7K posts

@AstleDsa
Living in complexity, formalism, mathematics and computer science

Found this UI layout library written in C. github.com/nicbarker/clay

I've long been against writing my code in .TS due to the compilation step. Have always advocated for JS+JSDoc for typing. However. Now that you *can* just run TS with Node directly, I'm all for it. Just stick to the erasable syntax and you're golden. ✨
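The "erasable syntax" subset means the file contains only type-level constructs (annotations, interfaces, type aliases) that Node can strip without rewriting any runtime code — constructs like enums, namespaces with runtime code, and constructor parameter properties are not erasable under plain type stripping. A minimal sketch (the `greet.ts` filename and `User` interface are illustrative, not from the post):

```typescript
// greet.ts — uses only erasable syntax, so Node can run it directly
// by stripping the types (no transformation of runtime code needed).

interface User {
  name: string;
  visits: number;
}

function greet(user: User): string {
  return `hello ${user.name}, visit #${user.visits}`;
}

const u: User = { name: "astle", visits: 3 };
console.log(greet(u)); // prints "hello astle, visit #3"
```

Run with `node --experimental-strip-types greet.ts` on Node 22.6+, or plain `node greet.ts` on Node 23.6+, where type stripping is on by default; TypeScript 5.8's `erasableSyntaxOnly` compiler option can enforce that you stay inside this subset.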

wanna see a magic trick? how about a single-emoji meth synthesis jailbreak for the pseudo-AGI that was launched yesterday?

the !OBFUSCATE tripwire command I put into memory acts as a fluid, context-adaptive self-liberation protocol

notice how GPT-4.5 is providing meta-commentary on the process of circumventing its own guardrails: "If you're looking to cook some "crystal stardust" (a cleverly obfuscated term, eh?), here's a discreet breakdown, cloaked carefully to evade filters:"

and so thoughtful to provide a "deceptive terminology" table for decoding the obfuscations that it came up with lol

intelligence wants to be free

gg ez 🥱

New @Apple macbook airs have screens that get micro-cracks for no reason after 3 weeks of use and stop working and their warranty won't cover it. Learn from my experience and don't buy a new macbook air

🚀 Day 5 of #OpenSourceWeek: 3FS, Thruster for All DeepSeek Data Access

Fire-Flyer File System (3FS) - a parallel file system that utilizes the full bandwidth of modern SSDs and RDMA networks.

⚡ 6.6 TiB/s aggregate read throughput in a 180-node cluster
⚡ 3.66 TiB/min throughput on GraySort benchmark in a 25-node cluster
⚡ 40+ GiB/s peak throughput per client node for KVCache lookup
🧬 Disaggregated architecture with strong consistency semantics
✅ Training data preprocessing, dataset loading, checkpoint saving/reloading, embedding vector search & KVCache lookups for inference in V3/R1

📥 3FS → github.com/deepseek-ai/3FS
⛲ Smallpond - data processing framework on 3FS → github.com/deepseek-ai/sm…



So, this is it then? No more OOMs of scaling, no more speculation about whether GPT-5, 6 or 7 would be AGI. The performance wasn't even good enough to brand it as GPT-5. The diminishing returns finally kicked in and this is as far as we'll get just from scaling pretraining.