uglybyte
@uglybyte
222 posts

GPT per capita leadership
/dev/zero · Joined March 2019
3.1K Following · 92 Followers
Keith
Keith@gnukeith·
Time to try out Bazzite
14 replies · 0 reposts · 68 likes · 3.3K views
uglybyte
uglybyte@uglybyte·
A clean LPE like that only comes around once in a while (before AI, at least..)
Tim Becker@tjbecker

Very cool Linux bug found by @xint_official 100% reliable, instant LPE from a portable python script that works on all platforms and distros. Root cause is a subtle logic bug at the intersection of several subsystems. I highly recommend patching and checking out the details!

0 replies · 0 reposts · 1 like · 42 views
Tim Becker
Tim Becker@tjbecker·
Very cool Linux bug found by @xint_official 100% reliable, instant LPE from a portable python script that works on all platforms and distros. Root cause is a subtle logic bug at the intersection of several subsystems. I highly recommend patching and checking out the details!
Xint@xint_official

Patch your Linux boxes! Copy.Fail is a trivially exploitable logic bug in Linux, reachable on all major distros released in the last 9 years. A small, portable python script gets root on all platforms. Found by the teams at @theori_io and @xint_official More details below xint.io/blog/copy-fail…

7 replies · 12 reposts · 115 likes · 20.2K views
uglybyte
uglybyte@uglybyte·
@moyix most probably, I also heard so.. “No published CVE entries” is still factually correct
0 replies · 0 reposts · 1 like · 84 views
uglybyte
uglybyte@uglybyte·
@gnukeith how many shifts until I can buy this? also are the double pounders free for lunch?
0 replies · 0 reposts · 0 likes · 56 views
Keith
Keith@gnukeith·
I want to thank my mom for this accomplishment, my family and everyone who has supported me throughout the years
[attached image]
32 replies · 9 reposts · 379 likes · 5.8K views
Simon Vans-Colina
Simon Vans-Colina@simonvc·
I've realized that running local models scratches that Pokemon "gotta catch them all" itch. I'm very happy with this catch. Actually useful, fast, free.
[attached image]
6 replies · 0 reposts · 27 likes · 1.9K views
uglybyte
uglybyte@uglybyte·
I’m honestly really sad to announce I couldn’t reach a partnership with them to secure just a single goddamn rack of GB200s as a goodwill gesture. I told them Rubin is coming up and these chips will be obsolete, presented growth graphs and comparisons vs my RTX 3060 but all in vain.
0 replies · 0 reposts · 0 likes · 61 views
Keith
Keith@gnukeith·
I'm sad to announce that I could not reach a partnership agreement with OpenAI, for some reason them giving me money for no reason was deemed "Not a good business move" by their business team
Sam Altman@sama

we have updated our partnership with microsoft. microsoft will remain our primary cloud partner, but we are now able to make our products and services available across all clouds. will continue to provide them with models and products until 2032, and a revenue share through 2030.

2 replies · 1 repost · 53 likes · 2.6K views
uglybyte
uglybyte@uglybyte·
@Yuchenj_UW It’s being released to the public in two weeks, apparently
0 replies · 0 reposts · 0 likes · 112 views
Yuchen Jin
Yuchen Jin@Yuchenj_UW·
GPT-5.5 in Codex is really good. Frontier coding models are converging fast, and soon the differences will be less about raw model capability and more about harness, UX, reliability, price, and rate limits. Anthropic: release Mythos! What are you waiting for?
50 replies · 10 reposts · 362 likes · 16.9K views
uglybyte
uglybyte@uglybyte·
@Teknium just filter “UNION” that’ll work :)
0 replies · 0 reposts · 1 like · 95 views
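The joke above lands because keyword blacklists are a classic SQL-injection non-fix: case changes, inline comments, or nested keywords all slip past them. A minimal sketch of why, using a hypothetical `naive_filter` (not any real WAF):

```python
def naive_filter(query: str) -> str:
    """Hypothetical 'sanitizer' that strips the literal keyword UNION."""
    return query.replace("UNION", "")

payloads = [
    "1 union SELECT password FROM users",       # lowercase is untouched
    "1 UNIUNIONON SELECT password FROM users",  # removing the inner UNION reassembles one
    "1 UNI/**/ON SELECT password FROM users",   # inline comment splits the keyword
]

for p in payloads:
    # every "sanitized" payload still carries a usable UNION variant
    print(naive_filter(p))
```

Parameterized queries, not keyword filtering, are the actual fix; a blacklist only moves the goalposts.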
uglybyte
uglybyte@uglybyte·
@HaifeiLi nah, real engineering doesn’t get you a ticket for the hype cycle. The workforce cut they had at X must still be felt, and they prob have to use Grok to fill the gaps :)
0 replies · 0 reposts · 1 like · 33 views
Haifei Li
Haifei Li@HaifeiLi·
@uglybyte Yeah I found it’s “amazing” he owns this platform which owns a lot of data but he wasn’t able to train out a frontier model. But hey I would prefer he focus on real engineering first like fixing the X search!
1 reply · 0 reposts · 1 like · 95 views
Haifei Li
Haifei Li@HaifeiLi·
How does Mr. Musk posting how awesome his Grok AI is everyday but the search function of this platform is still so sh*tty?? You are guaranteed you won’t find anything what you want to find but all the garbage. Can’t someone at X just vibe coding the feature using their “awesome” Grok?
3 replies · 0 reposts · 5 likes · 1.3K views
uglybyte
uglybyte@uglybyte·
@gnukeith I’ve been thinking about that platform too but honestly the gigantic models will run super slow on RAM (let alone the current cost of the required DDR5), and the small models can fit on normal GPUs. It’s either 2-4X 24-32GB cards or a 512GB Mac imo
0 replies · 0 reposts · 1 like · 69 views
Keith
Keith@gnukeith·
@uglybyte A total of 0, I don't have a threadripper or anything which is why I am asking what I could do with it haha
1 reply · 0 reposts · 0 likes · 161 views
Keith
Keith@gnukeith·
Ok I need ideas, what can I do with a threadripper?
61 replies · 0 reposts · 76 likes · 5.2K views
uglybyte
uglybyte@uglybyte·
@gnukeith about a kidneys market rate. How much RAM do you have on that TRX5?
1 reply · 0 reposts · 2 likes · 92 views
Keith
Keith@gnukeith·
@uglybyte How much would 4x RTX 6000s cost......?
1 reply · 0 reposts · 1 like · 317 views
uglybyte
uglybyte@uglybyte·
@N3mes1s hehe. We limited the models to 15 tokens of reasoning before answers since we ran out of compute, covered it up and now kind of made it better but worse than before. We also fixed some random bugs in the SDK!
0 replies · 0 reposts · 1 like · 34 views
uglybyte
uglybyte@uglybyte·
@halvarflake I do believe the distills of the large Qwens is what brought us here, which were probably partially distilled from Claude :) Did you have any success finding vulnerabilities with these 30B class models? could be a good benchmark for the abilities. I’m about to buy some hardware
0 replies · 0 reposts · 0 likes · 526 views
Halvar Flake
Halvar Flake@halvarflake·
Ok, so we are seeing ever stronger smaller models. That's good. What's entirely unclear to me: 1) do we need to train massive models to then distill the smaller models? 2) have we learnt to train smaller models better? What developments are responsible for having a 27b model...
7 replies · 0 reposts · 29 likes · 6.9K views