Helin

168 posts

@helin

Physics, Silicon, Data, AI

Portland, OR · Joined January 2008
572 Following · 276 Followers
Helin
Helin@helin·
@RepLuna Tonight, LLMs will be busy reading these page images and videos.
0 replies · 0 reposts · 1 like · 10 views

Helin
Helin@helin·
I used #notebooklm to generate a 35-minute podcast discussing 4 top stories on Hacker News. The flow: pick the top stories, stitch their audio overviews together, make smooth transitions between topics, and add intro and outro music (generated by @suno_ai_ ). youtu.be/NsOnGt8d9fA?si…
[YouTube video]
0 replies · 0 reposts · 1 like · 956 views

Helin
Helin@helin·
@paulg @fat Art is about what the audience feels, not how impressive the artist's technique is. Personally I don't feel any connection to the given piece. I would rank it much lower than lots of recent work I've seen from James Jean, Miyamura Gen, and many more.
0 replies · 0 reposts · 0 likes · 28 views

Paul Graham
Paul Graham@paulg·
@fat Ok, post some examples of recent work you feel is better than this.
[attached image]
91 replies · 3 reposts · 178 likes · 187K views

Jacob
Jacob@fat·
Saying modern art today is crap is just lazy. There are amazing new artists; it just takes a lot of sifting through the noise to find them. Finding classic art that's incredible is *really easy* because a hundred-plus years of filtering the best stuff to the top has already been done by museums.
Paul Graham@paulg

The people who think modern art and architecture are crap are often right. But then they undermine their own case by picking terrible examples of "good" old stuff. They can't resist a Victorian pastiche.

10 replies · 6 reposts · 172 likes · 199.4K views

Logan Kilpatrick
Logan Kilpatrick@OfficialLoganK·
One of the biggest points of feedback I have seen is ensuring that the JSON output does not hallucinate the function parameters. What else?
7 replies · 1 repost · 57 likes · 5.8K views

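The hallucinated-parameter problem above can be guarded against application-side by checking the model's emitted arguments against the declared schema before executing anything. A minimal sketch, assuming a hypothetical `get_weather` tool (the tool name, fields, and types are invented for illustration, not any real API):

```python
import json

# Hypothetical tool definition, in the style of a function-calling schema.
WEATHER_TOOL = {
    "name": "get_weather",
    "parameters": {"city": str, "unit": str},
}

def validate_call(raw_json, tool):
    """Check model-emitted arguments against the declared schema:
    reject hallucinated names, missing fields, and wrong types."""
    args = json.loads(raw_json)
    errors = []
    for name in args:
        if name not in tool["parameters"]:
            errors.append("hallucinated parameter: " + name)
    for name, typ in tool["parameters"].items():
        if name not in args:
            errors.append("missing parameter: " + name)
        elif not isinstance(args[name], typ):
            errors.append("wrong type for: " + name)
    return (len(errors) == 0, errors)
```

A production version would validate against a full JSON Schema rather than this flat name/type check, but the gate sits in the same place: between the model's output and the actual function call.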
Logan Kilpatrick
Logan Kilpatrick@OfficialLoganK·
Developers using @OpenAI: how can we improve function calling for you? Please share your use cases and feature requests! 🧵👇
74 replies · 15 reposts · 173 likes · 95.2K views

Helin
Helin@helin·
@jimkxa Totally! We need more computing, not “smaller transistors + better instruction set + branch prediction”. 1000x can come from: once a piece of software is developed, it’s compiled to an ASIC design; then the foundry builds the chip and hosts my software. “Serious software people build their own hardware.”
0 replies · 0 reposts · 2 likes · 497 views

Jim Keller
Jim Keller@jimkxa·
Moore's law is still not dead!! My guess: 1000x to go on known physics. Probably not EUV.
57 replies · 146 reposts · 1.3K likes · 427.8K views

Helin
Helin@helin·
@garrytan Big AI teaches small ones specific skills for minimal token spending.
0 replies · 0 reposts · 0 likes · 176 views

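The "big AI teaches small ones" idea is essentially knowledge distillation: train the small model to match the large model's temperature-softened output distribution. A minimal framework-free sketch of the soft-target loss (logits and temperature below are illustrative assumptions, not anyone's real training setup):

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between temperature-softened distributions.
    Minimizing this pushes the small model toward the big model's behavior."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(ti * math.log(ti / si) for ti, si in zip(t, s))
```

The loss is zero when the student exactly matches the teacher and grows as their distributions diverge; a higher temperature exposes more of the teacher's "dark knowledge" about the relative ranking of wrong answers.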
Garry Tan
Garry Tan@garrytan·
Little AI is a good development for founders: you can treat massively expensive foundational models like GPT-4 as a prototyping tool, then cheaply train your own models to higher accuracy. If it were all Big AI, all the power would go to the biggest players, and that’s not what I’m hoping for.
Ali Ghodsi@alighodsi

We are open-sourcing Dolly, a ChatGPT-like model that can do instruction following, created for $30 and trained in 3 hours on 1 server. The secret to magical human-like interactivity probably lies in a small dataset. databricks.com/blog/2023/03/2…

8 replies · 20 reposts · 129 likes · 41.8K views

Helin
Helin@helin·
@natfriedman This is great! We had internal scripts to run the comparison, but the problem was the time spent getting access to all the APIs. Now I can run a small test on your tool and decide which models are of interest. As a result, I just submitted an access application to Anthropic. Thank you!
0 replies · 0 reposts · 1 like · 128 views

Nat Friedman
Nat Friedman@natfriedman·
Claude-instant is now in nat.dev and it's insanely fast
[attached image]
23 replies · 26 reposts · 292 likes · 110.1K views

Helin
Helin@helin·
@sawyermidddd Easy to make, delicious, and sounds good in ASMR?
0 replies · 0 reposts · 0 likes · 28 views

Sawyer Middeleer
Sawyer Middeleer@sawyermidddd·
Why are all the influencers making tanghulu (糖葫芦) now???
1 reply · 0 reposts · 0 likes · 178 views

Helin
Helin@helin·
@megankao_ Agreed. That’s the future we are enabling with a GPT-powered data analysis platform: helping non-SQL speakers talk to data.
0 replies · 0 reposts · 0 likes · 51 views

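A platform like the one described typically wraps the model's generated SQL in a guardrail before touching data, for example executing only a single read-only SELECT. A minimal sketch using SQLite; the table, data, and "generated" query string below are invented for illustration:

```python
import sqlite3

def run_readonly(conn, sql):
    """Guardrail for model-generated SQL: execute only a single SELECT."""
    stripped = sql.strip().rstrip(";")
    if not stripped.lower().startswith("select") or ";" in stripped:
        raise ValueError("only a single SELECT statement is allowed")
    return conn.execute(stripped).fetchall()

# Toy in-memory database standing in for the user's warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("west", 120.0), ("east", 80.0), ("west", 50.0)])

# Pretend this string came back from the model for the question
# "Which region sold the most?"
generated = "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY 2 DESC"
print(run_readonly(conn, generated))
```

Real deployments add schema-aware prompting and proper SQL parsing, but the execute-only-SELECT gate is a common baseline for letting non-SQL speakers query data safely.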
Megan Kao
Megan Kao@megankao_·
If every department knew how to write SQL, companies could run 10x more efficiently.
85 replies · 71 reposts · 1K likes · 298.2K views

Helin
Helin@helin·
@OfficialLoganK @OpenAI A data security feature that promises prompts and results will not be used for model training.
0 replies · 0 reposts · 0 likes · 95 views

Logan Kilpatrick
Logan Kilpatrick@OfficialLoganK·
If you are using @OpenAI to build your startup or product, how can we help you? 🧵⬇️
176 replies · 56 reposts · 617 likes · 221.3K views

Sawyer Middeleer
Sawyer Middeleer@sawyermidddd·
@petergyang 1) writing SQL scripts 2) writing marketing/sales copy 3) structuring form emails
4 replies · 0 reposts · 46 likes · 14.4K views

Peter Yang
Peter Yang@petergyang·
What is your go-to ChatGPT use case?
274 replies · 61 reposts · 656 likes · 353.6K views

Helin
Helin@helin·
Propaganda is building an echo chamber at scale.
0 replies · 0 reposts · 0 likes · 1.2K views

Helin
Helin@helin·
Looking forward to the release of GPT-4. Feels like waiting for the iPhone 4 in the old days.
[attached image]
0 replies · 0 reposts · 1 like · 1.2K views

Helin
Helin@helin·
@EdeyJulia I spent 6 years in West Lafayette for my PhD study. You, Zach, and the team make the town magical!
0 replies · 0 reposts · 0 likes · 96 views

Helin
Helin@helin·
@erik_nijkamp @nathanbenaich If LLM API vendors are willing to provide read/customization access to the parameters of the last 2-3 layers of their models, would it be sufficient for RLHF?
0 replies · 0 reposts · 0 likes · 39 views

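Tuning only the last 2-3 layers while everything else stays frozen can be sketched in miniature: treat the model as a list of weights and apply the gradient step only to the trailing `k` entries. This is a toy update rule, not a real RLHF loop; the layer and gradient values are made up:

```python
def update_last_k(layers, grads, lr=0.1, k=2):
    """One SGD step that touches only the last k layers;
    everything earlier stays frozen, mimicking partial-access tuning."""
    n = len(layers)
    return [
        w - lr * g if i >= n - k else w  # frozen layers pass through unchanged
        for i, (w, g) in enumerate(zip(layers, grads))
    ]
```

In a real framework the same effect comes from disabling gradients on the frozen parameters; whether last-layer access alone is enough for RLHF is exactly the open question the tweet raises.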
Erik Nijkamp
Erik Nijkamp@erik_nijkamp·
@nathanbenaich I’m afraid the RLHF opportunity is only viable for players who own the rest of the vertical stack (model, serving, training, etc.), making those players even stronger.
2 replies · 0 reposts · 2 likes · 930 views

Nathan Benaich
Nathan Benaich@nathanbenaich·
Here’s a version of the LLM future:
- A few large models maintained and improved by deep-pocketed AI cos
- Others use RLHF with their own use case + data to build end products
If true, there’s an opportunity in RLHF SaaS… but as an on-ramp to using more LLMs, the big cos might do it themselves.
13 replies · 7 reposts · 81 likes · 21.6K views

Helin
Helin@helin·
@kevinyang @OpenAI Great work! If you fine-tune models for each user, do you see that actually giving better personalization compared with in-context learning? Great to see LLMs disrupting how we work. I’m experimenting with them for analytics, and I’m amazed by how much they can do in many different ways.
1 reply · 0 reposts · 0 likes · 242 views

Kevin Yang
Kevin Yang@kevinyang·
OMG it finally works!! 🚀🚀🚀 I got @OpenAI to draft emails 📧 for me in the background 🤯 This Christmas I built an email assistant (EmailTriager.com) that automatically drafts email replies behind the scenes, no Chrome extension necessary. Here's the story 👇
170 replies · 552 reposts · 5.1K likes · 1.2M views

Helin
Helin@helin·
I’m pro-Elon when engineering topics are discussed.
Elon Musk@elonmusk

@Rainmaker1973 If you know the solution is a whole number, doing mental cube approximations & guessing works great

1 reply · 0 reposts · 2 likes · 710 views

Helin
Helin@helin·
@_paulshen Natto is a wonderful piece of software that I used to teach my kid programming. However much energy you decide to spend on future Natto development, I will appreciate it, even if it’s 0.
1 reply · 0 reposts · 1 like · 49 views

Paul Shen
Paul Shen@_paulshen·
i'll keep working and building on natto but i'm going to wander a little more. it's my work that i'm most proud of (natto and maybe drawbattle.io ✏️). please keep using it and spreading the word! thanks again 🙏
5 replies · 0 reposts · 41 likes · 2.2K views

Paul Shen
Paul Shen@_paulshen·
my motivation to work on natto.dev is falling. without twitter, there would be no natto. it's vain but the likes and having an audience kept me going. really appreciate you all!
26 replies · 6 reposts · 247 likes · 58.6K views