ameyab

213 posts

@ameyab

now: @braintrust; was: AI @ Dropbox, AI @Office, AI @facebookai; built SmartScreen

Seattle · Joined April 2008
205 Following · 64 Followers
ameyab retweeted
WorkOS
WorkOS@WorkOS·
You can't write evals for what you haven't seen yet. We talked with @braintrust VP Field CTO, Ameya Bhatawdekar, at HumanX about why production observability is what makes evals actually work.
Jay Hack
Jay Hack@mathemagic1an·
Can we compress large language models for better perf? "SparseGPT: Massive Language Models can be Accurately Pruned in One Shot" Eliminates the need to use/store 50% of weights for a 175B param model with no significant sacrifice in perf arxiv.org/pdf/2301.00774… Here's how 👇
Jay Hack tweet media
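SparseGPT's one-shot criterion relies on approximate second-order information, but the basic idea of zeroing a fixed fraction of weights can be sketched with a much simpler magnitude-based rule (a toy illustration in pure Python, not the paper's method):

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.

    A toy stand-in for one-shot pruning: SparseGPT uses a far more
    sophisticated second-order criterion, but the end state is similar --
    a weight list where ~`sparsity` of the entries are exactly zero.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)                 # how many weights to drop
    threshold = flat[k - 1] if k > 0 else float("-inf")
    # Keep weights strictly above the threshold; zero the rest.
    # (Ties at the threshold may prune slightly more than `sparsity`.)
    return [w if abs(w) > threshold else 0.0 for w in weights]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.1]
print(magnitude_prune(w))  # -> [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

Half the entries survive untouched; the rest become zeros that need not be stored, which is the storage saving the tweet describes.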
Gergely Orosz
Gergely Orosz@GergelyOrosz·
My first year as an engineering manager, I’d try to collect as much data as I could about what people did. I spent ~6-8 hrs per person writing perf reviews, trying to make sure I didn’t miss a thing, working through weekends. It was exhausting. Then I made a change (cont’d):
Gergely Orosz@GergelyOrosz

Performance reviews are coming up. An objective way to summarize your own achievements is to use numbers. Those numbers are facts, hard to argue with, and can be compared, should anyone want to do so. Some ideas for numbers. What else have you seen used in perf reviews?

Davis Blalock
Davis Blalock@davisblalock·
Here are all the ways to get around ChatGPT's safeguards: [1/n]
elvis
elvis@omarsar0·
There is a lot of interest in large language models. As a developer or researcher, you are probably looking for a guide on how to ramp up on LLMs. Here is a study plan you could try:
Aleksandr Volodarsky
Aleksandr Volodarsky@volodarik·
ChatGPT has crossed 1M+ users in just 5 days. To compare, it took Netflix 41 months, FB - 10 months, and Instagram - 2.5 months. But many haven’t yet realized its full potential. Here are the 10 mindblowing things you can do using it right now:
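The adoption comparison is easy to sanity-check with back-of-envelope arithmetic (assuming a rough 30-day month for the conversion):

```python
DAYS_PER_MONTH = 30  # rough conversion for a back-of-envelope comparison

# Approximate time each product took to reach 1M users, in days.
time_to_1m_days = {
    "Netflix": 41 * DAYS_PER_MONTH,
    "Facebook": 10 * DAYS_PER_MONTH,
    "Instagram": 2.5 * DAYS_PER_MONTH,
    "ChatGPT": 5,
}

for name, days in time_to_1m_days.items():
    ratio = days / time_to_1m_days["ChatGPT"]
    print(f"{name}: {days:g} days ({ratio:g}x ChatGPT)")
```

By this rough estimate, ChatGPT hit the milestone some 246x faster than Netflix and 15x faster than Instagram.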
Leandro von Werra
Leandro von Werra@lvwerra·
With the ChatGPT wave rolling over Twitter, I also want to celebrate that the TRL library crossed 500 stars⭐️🎉 If you want to play with reinforcement learning for language models (the tech behind ChatGPT) yourself, check out the repo & library: github.com/lvwerra/trl
Leandro von Werra tweet media
Divam Gupta
Divam Gupta@divamgupta·
ChatGPT is blowing up the internet. Here is a thread showing some interesting examples I found. 🧵
317070
317070@317070·
Did you know that you can build a virtual machine inside ChatGPT? And that you can use this machine to create files, program, and even browse the internet? engraved.blog/building-a-vir…
Sahil Bloom
Sahil Bloom@SahilBloom·
How to retain everything you learn. The Spaced Repetition Method (science-backed):
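Spaced repetition tools typically grow the review interval after each successful recall and reset it after a lapse; a minimal sketch of that scheduling rule (a simplification, not the full SM-2 algorithm):

```python
def next_interval(interval_days, remembered, factor=2.0):
    """Schedule the next review: grow the gap after a successful recall,
    reset to one day after a lapse (a simplified SM-2-style rule)."""
    return interval_days * factor if remembered else 1.0

# Review gaps for a card recalled successfully five times in a row:
gap = 1.0
schedule = []
for _ in range(5):
    schedule.append(gap)
    gap = next_interval(gap, remembered=True)
print(schedule)  # -> [1.0, 2.0, 4.0, 8.0, 16.0]
```

Each review lands just before the material would otherwise be forgotten, which is what makes the method efficient.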
Andrew Wilkinson
Andrew Wilkinson@awilkinson·
Every Founder in Year 1: "I love my company. These people are my family and I will run this business forever." Every Founder in Year 8: "I hate my life and my employees hate me. Please buy my business and let me go away for a very long time."
Sebastian Raschka
Sebastian Raschka@rasbt·
The new slogan of AI might as well be "show, don't tell"! Finding the BLOOM paper was really hard. Searching Google & multiple blogs, all I found were demos. If someone's still interested in papers: "BLOOM: A 176B-Parameter...Language Model" arxiv.org/abs/2211.05100
Sebastian Raschka tweet media
elvis
elvis@omarsar0·
🐙TorchScale - A Library for Transformers at (Any) Scale TorchScale is a PyTorch library to scale up Transformers efficiently and effectively. Something worth checking out if you are working on scaling Transformer models. github.com/microsoft/torc…
elvis tweet media
Emad
Emad@EMostaque·
Some user notes on V2! 🧵 V2 prompts differently and will take a while for folks to get used to. V2 is trained with two models, a generator model and an image-to-text model (CLIP). V1 used the CLIP-L14 model open sourced by @OpenAI, which was great, but nobody knew what was in it 1/
Stability AI@StabilityAI

We are excited to announce the release of Stable Diffusion Version 2! Stable Diffusion V1 changed the nature of open source AI & spawned hundreds of other innovations all over the world. We hope V2 also provides many new possibilities! Link → stability.ai/blog/stable-di…

Mujeeb Ahmed
Mujeeb Ahmed@hey_mujeebahmed·
Websites to create Resume/CV for free🤩👇🏻
Sebastian Raschka
Sebastian Raschka@rasbt·
Debugging your PyTorch code today? Here's a lovely, little open-source project for a lovely Sunday! Lovely tensors -- tensors, ready for human consumption: github.com/xl0/lovely-ten…
Sebastian Raschka tweet media
Patrick Collison
Patrick Collison@patrickc·
Google, Microsoft, Adobe, IBM, Palo Alto Networks, and now Twitter run by CEOs who grew up in India. Wonderful to watch the amazing success of Indians in the technology world and a good reminder of the opportunity America offers to immigrants. 🇮🇳🇺🇸 (Congrats, @paraga!)
Ramsri Goutham Golla
Ramsri Goutham Golla@ramsri_goutham·
I am putting together a list of platforms you can explore if you are looking for economical AI/ML training and inference: 1. banana.dev - Ship ML to Prod, instantly. Scalable inference hosting for your machine learning models on serverless GPUs. Thread ->
Liron Shapira
Liron Shapira@liron·
Today @pmarca was asked by @tylercowen to explain a Web3 use case. I clipped this gem from 28:08 of Conversations With Tyler. Highly recommended...