Halvar Flake

65.7K posts

@halvarflake

Choose disfavour where obedience does not bring honour. I do math. And was once asked by R. Morris Sr.: "For whom?" @[email protected]

Joined June 2008
2.7K Following · 44.7K Followers
Halvar Flake retweeted
Frank Hutter@FrankRHutter·
Huge news: @prior_labs has signed a definitive agreement to be acquired by @SAP. €1B+ invested over four years to build a globally-leading frontier AI lab for structured data — in Europe, in the open. Independent entity. Same team, same mission, same open models. A massive boost to what we can do. The mission just got accelerated. Founders’ statement: priorlabs.ai/blog-posts/pri… (Deal subject to regulatory approval; terms not disclosed.)
Halvar Flake retweeted
Seb Johnson@SebJohnsonUK·
A European startup that built a foundational model with only $9m has just been acquired for $1bn+. The company, @prior_labs, has built a state-of-the-art foundation model for tabular data. It was founded by @FrankRHutter, @noahholl and Sauraj Gambhir, and only announced a $9m pre-seed led by @balderton (@Jameswise) last year.

Tabular data, i.e. structured data in tables, spreadsheets, and databases, plays an essential role in many critical industries, but was neglected in the early advances in AI that focused on text and images.

Today SAP announced that it is acquiring the company for $1bn+. To date Prior Labs has only raised $9m, which means it will likely be a great result for its founders, employees and early investors, who include Balderton, @guypod, @Thom_Wolf, @petersarlin.

It's great to see a good exit for the German tech ecosystem and even better to see it staying in Europe! It also shows how much there is to play for in AI. The team have built an insanely high quality, hyper-focused model and got a unicorn outcome in just under a year. Amazing news!
Halvar Flake retweeted
_ZN4DionC1Ev@justdionysus·
12 years later, public offensive research is even more critical. With P0 less active, well publicized offensive research against modern systems is harder to find. The complexity, secrecy, and contextual nature of existing mitigations require deep understanding to assess bugs.
_ZN4DionC1Ev@justdionysus

Google Project Zero is important not because they're gonna find all the bugs but because they're going to fund long term offensive research.

Halvar Flake@halvarflake·
I asked about the physics of space DCs, and my mentions are a multi-day fight.
Dave Aitel@daveaitel·
The culture of thinking it was ok to patch vulnerabilities is the original sin of computer security.
Halvar Flake@halvarflake·
@HostileSpectrum Baker was far away from my personal politics, but clearly an outstanding intellect and a "statesman" in the most positive sense of the word.
Halvar Flake retweeted
JD Work@HostileSpectrum·
Deeply saddened to learn that Stewart Baker has passed. I shall greatly miss our many years of deep conversations around hard offensive cyber problems, including some of the most thoughtful discussions I was ever privileged to host at the White House. RIP.
Halvar Flake@halvarflake·
Not every token position is trained equally.
Halvar Flake@halvarflake·
Being a “token hire” is a very different thing in the time of LLMs.
Shane Huntley@ShaneHuntley·
@halvarflake Prediction: We will reach AGI well before LLMs are successful at the types of problems that @halvarflake finds intellectually interesting. This in no way reduces my optimism around AI capabilities.
Halvar Flake@halvarflake·
Why do ChatGPT and Gemini seemingly love to kick off something resembling a diffusion model for diagrams vs. just generating and rendering Mermaid?
Halvar Flake retweeted
Daniel Litt@littmath·
…when I take the exact same prompts that generated these (IMO impressive) solutions and try them on questions in algebraic geometry that I suspect are of comparable difficulty to Erdős problems, they typically produce nonsense (though they can now be helpful with smaller tasks).
Halvar Flake@halvarflake·
Frontier lab cutting edge LLM, and if you ask it "is the first step in a transformer with input matrix I, and weight matrices K, V, Q something like (IK)(IV)^T, scaled and normed to row-wise sum to one" it gets confused and can't tell that it's just re-labeled.
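For reference, the relabeling the tweet describes can be sketched in NumPy. Variable names follow the tweet's notation (input matrix `I`, weight matrices `K`, `V`, `Q`); the shapes, the random data, and the `rowwise_softmax` helper are illustrative assumptions, not anything from the tweet itself.

```python
import numpy as np

def rowwise_softmax(scores):
    # Subtract the row max for numerical stability, then normalize
    # each row to sum to one (the "scaled and normed" step).
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n, d = 4, 8                      # sequence length, model width (arbitrary)
I = rng.normal(size=(n, d))      # input matrix, "I" in the tweet
K = rng.normal(size=(d, d))      # projection weights, named as in the tweet
V = rng.normal(size=(d, d))
Q = rng.normal(size=(d, d))

# Textbook attention weights: softmax over scaled (I Q)(I K)^T.
textbook = rowwise_softmax((I @ Q) @ (I @ K).T / np.sqrt(d))

# The tweet's version uses (I K)(I V)^T: the identical computation with
# the projection matrices relabeled (K playing "query", V playing "key").
relabeled = rowwise_softmax((I @ K) @ (I @ V).T / np.sqrt(d))

# Both are row-stochastic n-by-n attention matrices of the same form.
assert np.allclose(textbook.sum(axis=1), 1.0)
assert np.allclose(relabeled.sum(axis=1), 1.0)
```

The point is that the names `Q`, `K`, `V` carry no structural meaning in the scoring formula: swapping which learned matrix plays which role yields the same family of computations, which is what "it's just re-labeled" means.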
Halvar Flake@halvarflake·
Seeing the performance difference that LLMs have in problem solving vs. generating code from spec, I think the latter is much closer to a translation task, the former remains a search into a large action space...