Sergio F. Gonzalez
@sergiofgonzalez
1.1K posts
Madrid, Spain · Joined May 2011
231 Following · 56 Followers
Sergio F. Gonzalez (@sergiofgonzalez):
"If you should ever leave me / Well, life would still go on, believe me / The world could show nothing to me / So what good would living do me?" — lots of ❤️ for all the Valentines out there
0 replies · 0 reposts · 1 like · 33 views
bardieru (@bardieru):
Goodbye Master... David Lynch
[GIF]
1 reply · 1 repost · 1 like · 57 views
Sergio F. Gonzalez retweeted
FastAPI (@FastAPI):
@denicmarko 🥲🙋‍♂️
26 replies · 18 reposts · 551 likes · 15K views
Sergio F. Gonzalez (@sergiofgonzalez):
If I ever go to Seattle I won't say anything, but there would be signs 🥰
[image]
2 replies · 0 reposts · 4 likes · 53 views
Lucas VM 🏳️‍🌈 (@Princ3sConsu3la):
I love it, I laugh so hard, when people try to use set phrases and get them wrong:
- "Estoy entre la espalda y la pared"
- "Siento que soy un cerdo a la izquierda"
Have you heard any like these?
1.9K replies · 908 reposts · 17.3K likes · 2.3M views
Sergio F. Gonzalez retweeted
TRÄW🤟 (@thatstraw):
Did you?
[image]
111 replies · 344 reposts · 2.9K likes · 176.7K views
Sergio F. Gonzalez retweeted
CISA Cyber (@CISACyber):
We're responding to CVE-2024-3094, a reported supply chain compromise affecting XZ Utils versions 5.6.0 and 5.6.1. XZ Utils may be present in Linux distributions. See our additional guidance at cisa.gov/news-events/al….
21 replies · 527 reposts · 1.1K likes · 284.8K views
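The advisory above names exactly two affected releases (5.6.0 and 5.6.1), so a quick local check boils down to comparing the installed version against that set. A minimal sketch follows; the `xz --version` output format assumed here ("xz (XZ Utils) 5.6.1" on the first line) is an assumption, and the official CISA/distro guidance, not this snippet, should drive any remediation.

```python
# Sketch of a local check against CVE-2024-3094. Per the advisory,
# only XZ Utils 5.6.0 and 5.6.1 are affected.
import subprocess

AFFECTED_VERSIONS = {"5.6.0", "5.6.1"}

def parse_xz_version(first_line: str) -> str:
    """Take the last whitespace-separated token as the version number,
    assuming output like 'xz (XZ Utils) 5.4.5'."""
    return first_line.split()[-1]

def is_affected(version: str) -> bool:
    """True if the version string is in the known-affected set."""
    return version.strip() in AFFECTED_VERSIONS

def check_local_xz() -> bool:
    """Run `xz --version` and report whether the install looks affected."""
    out = subprocess.run(["xz", "--version"], capture_output=True, text=True)
    return is_affected(parse_xz_version(out.stdout.splitlines()[0]))
```

An exact-match set is deliberate here: ordinary version-range comparisons would wrongly flag older releases, since versions before 5.6.0 do not contain the compromised code.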
Sergio F. Gonzalez (@sergiofgonzalez):
You have to ❤️ the people at @ManningBooks. I filed an issue with their Livebook platform and they added the book to my account as compensation 😊 It completely took me by surprise!
[image]
1 reply · 1 repost · 9 likes · 660 views
Sergio F. Gonzalez retweeted
Philipp Schmid (@_philschmid):
We got a late Christmas gift from @Microsoft! 🎁🤗 Microsoft just changed the license for their small LLM Phi-2 to MIT! 🚀 Phi-2 is a 2.7 billion parameter LLM trained on 1.4T tokens, including synthetic data, achieving 56.7 on MMLU and outperforming Google Gemini Nano.
TL;DR:
🧮 2.7B-parameter base model
📱 ~1 GB memory requirement with quantization
📜 Context length of 2048 tokens
🚀 Outperforms Google Gemini Nano (3.2B)
🧑🏻‍💻 Good at coding, especially Python
🗣️ Licensed under MIT
🤗 Available on Hugging Face
Thank you, @SebastienBubeck and team, for this change! The open-source community will benefit a lot from it! 🤗
[image]
8 replies · 66 reposts · 410 likes · 56K views
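The "~1 GB with quantization" figure in the thread can be sanity-checked with back-of-envelope arithmetic: weights-only memory is just parameter count times bytes per parameter. This sketch ignores activations, KV cache, and runtime overhead, and the 4-bit (0.5 bytes/param) level is an assumption, not something stated in the thread.

```python
# Rough weights-only memory estimate for a model from its parameter
# count. Real usage is higher (activations, KV cache, runtime overhead).
def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Weights-only memory in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

PHI2_PARAMS = 2.7e9  # 2.7B parameters, as stated in the thread

fp16_gb = model_memory_gb(PHI2_PARAMS, 2.0)   # half precision: ~5.4 GB
int4_gb = model_memory_gb(PHI2_PARAMS, 0.5)   # 4-bit quantized: ~1.35 GB
```

At an assumed 4-bit quantization the weights land in the ~1.35 GB range, which is consistent with the thread's rough "~1 GB" claim once you allow for the hand-wavy nature of both estimates.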