송준 Jun Song

2.3K posts

@jun_song

Super-Tune | Making local LLMs/agents easy for everyone | DMs open | Useful posts in highlights

Seoul, South Korea · Joined September 2024
149 Following · 9.4K Followers

Pinned Tweet
송준 Jun Song @jun_song
The era of Personal Sovereign AI is coming. We're going to use personal hardware to build personal SaaS. We stop handing our data over to corporations and start owning our own intelligence. People are already waking up to this and spinning up Local LLMs. As Big AI adds more guardrails and raises prices, the push for AI sovereignty will only explode. Bookmark this tweet.
17 replies · 16 reposts · 120 likes · 10.1K views
송준 Jun Song @jun_song
Local LLM: M5 Max vs. M6 Max (expected)
Expected changes:
> 3nm → 2nm process: better power efficiency, thermals, and performance
> LPDDR5X → LPDDR6: improved bandwidth (750-950 GB/s expected)
> Prefill/decode speed: expected 1.3-1.8x increase
Expected release: 2027 (MacBook)
[image]
0 replies · 0 reposts · 1 like · 10 views
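A rough way to sanity-check the 1.3-1.8x decode claim (my own sketch, not the author's method): decode is typically memory-bandwidth bound, so sustained tokens/s is roughly bandwidth divided by the bytes read per token, which is about the size of the quantized weights. The bandwidth and model-size figures below are assumptions for illustration.

```python
# Back-of-the-envelope decode estimate: in the bandwidth-bound regime,
# tokens/s ~= memory bandwidth / bytes read per token (~= weight size).

def decode_tps(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Estimated decode throughput in tokens/s (bandwidth-bound rule of thumb)."""
    return bandwidth_gb_s / model_size_gb

MODEL_GB = 40.0                        # hypothetical quantized model, 40 GB of weights
m5_tps = decode_tps(546.0, MODEL_GB)   # assumed LPDDR5X-class bandwidth
m6_tps = decode_tps(850.0, MODEL_GB)   # midpoint of the rumored 750-950 GB/s range
speedup = m6_tps / m5_tps              # lands inside the tweet's 1.3-1.8x window
```

The model size cancels out of the ratio, so the projected speedup depends only on the bandwidth jump.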
송준 Jun Song @jun_song
What's the best way to lower the temperature of a MacBook Pro? I'm still searching and haven't found an answer.
9 replies · 0 reposts · 3 likes · 834 views
송준 Jun Song @jun_song
@DavidBilbie I think the 2nm process will be a huge leap. But for it to arrive in a Mac Studio, we'll have to wait another two years.
0 replies · 0 reposts · 1 like · 18 views
David Bilbie @DavidBilbie
@jun_song You're not going to wait until then for the M6? I heard it's going to be a big leap?
1 reply · 0 reposts · 0 likes · 26 views
송준 Jun Song @jun_song
One thing that frustrates me: even if the M5 Ultra Mac Studio drops in October, getting a 256GB or 512GB model will mean at least a three-month wait. I can't believe I won't get my hands on one until next year 😮‍💨
10 replies · 0 reposts · 33 likes · 2.6K views
송준 Jun Song @jun_song
Local LLM: MacBook 14-inch vs. 16-inch
> Same performance from a cold start.
> Due to its structural design, the 14-inch heats up much faster.
> The heat causes the fans to kick in quickly, generating noise.
> Because of thermal throttling, performance drops compared to the 16-inch during long tasks.
Personal conclusion: the 14-inch is fine if you only run agents for short periods and prioritize portability, but if you plan to use them for extended sessions, the 16-inch is the better choice.
[image]
0 replies · 2 reposts · 9 likes · 554 views
송준 Jun Song @jun_song
@gabrieldtmx I think 128GB is the better value for the money, since it can run large models like JANGtQ DSV4. Throttling does occur, but it's not a big problem.
0 replies · 0 reposts · 0 likes · 92 views
GabrielDTM @gabrieldtmx
@jun_song Thinking about a 16” M5 Max for local LLM stuff—does it throttle under longer runs or stay pretty stable? And what’s the “sweet spot” config (64GB vs 128GB)?
1 reply · 0 reposts · 0 likes · 107 views
송준 Jun Song @jun_song
Local LLM: M3 Ultra vs. M5 Ultra (expected), under MLX optimization
• Bandwidth: 819 GB/s vs. 1,228 GB/s (1.5x faster decode)
• Prefill: slow vs. Neural Accelerator (~4x faster prefill, solving the Mac's biggest weakness)
• Price: used-market premium (~2x) vs. expected M5 Ultra price hike (still expected to be cheaper than the current M3 Ultra premium)
• Availability: available via the used market (new is out of stock) vs. expected release in June/Oct 2026, plus 3+ months for CTO shipping
Personal opinion: unless you are in an absolute rush, it makes much more sense to wait rather than pay the massive premium right now. However, if the disaggregated inference technology that pairs it with a high-prefill machine like the DGX Spark is fully realized, the M3 could still be a great option (though this is still a work in progress).
[image]
5 replies · 2 reposts · 31 likes · 2.4K views
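To see why the ~4x prefill gain can matter more than the 1.5x decode gain for agent workloads, here is a toy latency model (my own sketch; the tokens/s figures are illustrative placeholders, not benchmarks of either machine):

```python
# Toy turn-latency model: prefill is compute-bound, decode is bandwidth-bound.
# All throughput numbers below are illustrative assumptions, not measurements.

def turn_latency_s(prompt_toks: int, output_toks: int,
                   prefill_tps: float, decode_tps: float) -> float:
    """Seconds to ingest the prompt plus generate the reply."""
    return prompt_toks / prefill_tps + output_toks / decode_tps

# Long-context agent turn: 30k-token prompt, 500 generated tokens.
m3_like = turn_latency_s(30_000, 500, prefill_tps=100.0, decode_tps=20.0)
m5_like = turn_latency_s(30_000, 500, prefill_tps=400.0, decode_tps=30.0)
```

With long prompts, prefill dominates the turn time, so the hypothetical 4x prefill machine finishes the whole turn several times faster even though its decode is only 1.5x quicker.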
송준 Jun Song reposted
송준 Jun Song @jun_song
The easiest and fastest way to set up a local LLM: a Codex/Claude Code subscription works, and if you don't have one, Google Antigravity's free tier is enough. Throw the agent a Hugging Face link and say, "Set this local LLM up as a Hermes Agent." Everything gets configured for your hardware. Don't be intimidated.
10 replies · 29 reposts · 348 likes · 23.4K views
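For reference, what the agent ends up doing is roughly a two-command job: fetch the weights, then serve them. The sketch below builds (but does not run) those commands; the tool choice (`huggingface-cli` plus llama.cpp's `llama-server`) and the repo/file names are my assumptions for illustration, not the author's exact stack:

```python
# Sketch of the commands an agent might generate from a Hugging Face link.
# Tool and model names here are illustrative assumptions; nothing is executed.

def setup_commands(repo_id: str, gguf_file: str, port: int = 8080) -> list[list[str]]:
    """Return [download_cmd, serve_cmd] for running a local GGUF model."""
    download = ["huggingface-cli", "download", repo_id, gguf_file,
                "--local-dir", "models"]
    serve = ["llama-server", "-m", f"models/{gguf_file}", "--port", str(port)]
    return [download, serve]

cmds = setup_commands("NousResearch/Hermes-3-Llama-3.1-8B-GGUF",
                      "Hermes-3-Llama-3.1-8B.Q4_K_M.gguf")
```

The point of handing this to an agent is that it picks a quantization that fits your RAM and wires up the serve flags for you, rather than you memorizing them.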
송준 Jun Song @jun_song
@bogyuni That's why I'm waiting it out on an M5 Max MacBook ㅜ
0 replies · 0 reposts · 1 like · 128 views
맥주소년 @bogyuni
@jun_song If I'd known it would launch this late, I would have just bought the M4 and waited...
1 reply · 0 reposts · 1 like · 134 views
송준 Jun Song @jun_song
The OpenAI phone could reportedly enter mass production as early as Q1 2027. There are also rumors the iPhone 18 will ship with Starlink, and Apple's in-house chip is being developed with on-device AI optimization in mind. Can an OpenAI phone without hardware expertise really beat the iPhone?
[image]
7 replies · 2 reposts · 15 likes · 1K views
Un_akh @Ntdhnl
@jun_song Will it be the same price at launch?
1 reply · 0 reposts · 0 likes · 135 views
송준 Jun Song @jun_song
I know this might sound crazy. But I truly believe the final contender against the Chinese AI alliance will ultimately be @xai . Right now, all Chinese tech companies are essentially moving as one unified team under the government. 🧵
7 replies · 1 repost · 24 likes · 1.6K views
Justinian @Publius_____
@jun_song @sama They are lying. It’s about you advancing open models and what that means long term. 100%.
1 reply · 0 reposts · 1 like · 49 views
송준 Jun Song @jun_song
This AI supremacy war isn't a simple LLM benchmark fight anymore. It’s a full-scale race with a definitive finish line: AGI. Don't get me wrong—my heart is with local models and I still firmly believe Open Source MUST win. I'm just objectively analyzing the macro landscape here.
0 replies · 0 reposts · 7 likes · 362 views
송준 Jun Song @jun_song
Only @elonmusk and the xAI ecosystem are fighting back with top-tier technology across ALL these diverse fields simultaneously. Some might say executing this grand vision is impossible, but China is actually making it a reality as we speak.
1 reply · 0 reposts · 5 likes · 380 views