One of the most requested GitHub features in years and the website looks like it was designed by someone 9 years into a 2 year community college program. github.github.com/gh-stack/
Tools are tools, but when you start calling it a god, you’re out of your mind and a blasphemer. This used to be common knowledge, but now it’s commonplace. Tech won’t save you. Only Jesus can. Ask me how I know, 🥰.
Finished listening to Bryan Johnson on Theo Von’s pod.
Hot takes, but room temp AI claims straight outta the valley.
Bryan: “We assign these powers of… anything! [to God]. But on some kind of time scale, AI’s kinda that. When it’s that and how it’s that is TBD” 🧵
@vimtor 😭, bro, stop working so hard. SST is literally like half of the release notes I have to read for my weekly version bump at work. I can't keep up with 11 releases a week
you are correct.
it's all dunning-kruger. people who don't write well think it writes well. people who don't code well think it codes well. people with no taste think it does art well.
whereas the reality is that it does fine at all these things, which does improve a lot of people's stuff. they're just too ignorant (myself included for many domains) to recognize it's actually not very good.
I agree that they don't care. I just think they do have at least some responsibility, so Sam's frustration is understandable as the CEO of a company that's actively being affected by this issue.
In which case I don't understand your original tweet's intention. But maybe we both misunderstand Sam's tweet, 😆.
Not trying to disagree here, but frankly I don’t think they care. It’s also naive to think they would.
And I don’t think this is limited to Anthropic. Grok, OpenAI, Gemini are all the same, regardless of what they portray to the public. Knowledge cutoff is a known issue, hallucinations are a known issue.
There are a lot of known limitations with LLMs. I’m sure the impact on other companies’ revenue is at the bottom of their list of problems to deal with.
Claude told a user that PlanetScale had shut our service down. This is unsafe by any definition and Anthropic have made no effort to correct this situation.