Sematic

68 posts

@SematicAI

The open-source Continuous Machine Learning platform. Automate recurrent end-to-end ML training pipelines for better, safer models, and faster teams.

San Francisco, CA · Joined May 2022
167 Following · 139 Followers
Sematic retweeted
Emmanuel Turlay
Emmanuel Turlay@neutralino1·
The only playground that lets you query dozens of open-source and proprietary models at once and compare quality, cost, throughput. Get started for free 👇
Airtrain AI@AirtrainAI

We are excited to announce the Airtrain.ai March 2024 LLM Playground release & Wednesday’s @ProductHunt launch! 🚀🚀🚀 ✅ Simultaneously prompt multiple models ✅ 18 supported LLMs ✅ Inference Metrics ✅ Persisted Sessions 👉Learn More: airtrain.ai/blog/the-llm-p…

Sematic retweeted
Airtrain AI
Airtrain AI@AirtrainAI·
We are excited to announce the Airtrain.ai March 2024 LLM Playground release & Wednesday’s @ProductHunt launch! 🚀🚀🚀 ✅ Simultaneously prompt multiple models ✅ 18 supported LLMs ✅ Inference Metrics ✅ Persisted Sessions 👉Learn More: airtrain.ai/blog/the-llm-p…
Sematic retweeted
Airtrain AI
Airtrain AI@AirtrainAI·
Our batch evaluation tool comes with a scoring model that closely reproduces academic MMLU benchmark results. Check it out 👇 app.airtrain.ai/public/job/c21…
Sematic retweeted
Emmanuel Turlay
Emmanuel Turlay@neutralino1·
Google's Gemini demo would be breathtaking if it were not heavily doctored and produced. Check my hot take below 👇 youtu.be/mt4FtV-Zk_w
Sematic retweeted
Airtrain AI
Airtrain AI@AirtrainAI·
The Thanksgiving chaos at OpenAI highlights the importance of open-source AI. Relying exclusively on OpenAI for your AI strategy is risky and unwise. @neutralino1's take below 👇 youtu.be/g98bkdqWPL4
Sematic retweeted
Airtrain AI
Airtrain AI@AirtrainAI·
We're super excited to share our new free AI-assisted batch evaluation tool for LLMs. You can evaluate, score, and compare LLMs for your specific application, on your own eval dataset. Reach out for an invite link. youtu.be/O-Uquvmbt-U
Sematic retweeted
Emmanuel Turlay
Emmanuel Turlay@neutralino1·
The gist of my talk at @MLOpsWorld was that there are two emerging classes of #LLMs: Oracle models, which are very large, hard, and pricey to train and run, and Worker Bee models, which are cheap, fast, good at specific tasks, and fine-tuned on synthetic data.
Sematic retweeted
Max Rumpf
Max Rumpf@maxrumpf·
🚀 These YC companies make building with AI 10x easier! 🧑‍💻 An underrated aspect of @ycombinator is the wealth of resources the community has created for building with LLMs & AI. 🔖 Here's a list of tools you can start using today. You don’t want to miss bookmarking this! 🧵From testing & fine-tuning to infrastructure, the range of AI dev tools crafted by YC founders is incredible. ❓What’s a hard thing you wish was easy when building AI apps?
Sematic retweeted
Max Rumpf
Max Rumpf@maxrumpf·
🥁 Orchestration @SematicAI: The open-source orchestrator loved by ML teams. It enables end-to-end pipelines to reduce model turnaround time by 80%. @DAGWorks’s Hamilton: Open-source micro-orchestration framework for describing data flows. Companies use it for modeling data and feature engineering pipelines, prompt engineering, and LLM application workflows. Arakoo's EdgeChains: Open Source SDK that models generative AI applications as config management. Built on top of Jsonnet as the orchestration grammar.
Sematic retweeted
Emmanuel Turlay
Emmanuel Turlay@neutralino1·
Exciting to see a potential alternative to transformers that yields lower latencies and costs, and on @AMD #GPUs. This is what the #AI industry needs to move faster. The main bottlenecks at this time are access to GPUs, training and inference costs, and inference latency. Seems like RetNet could address all of the above. Of course it doesn't come without tradeoffs, but these can be mitigated or even removed over time.
Aran Komatsuzaki@arankomatsuzaki

Retentive Network: A Successor to Transformer for Large Language Models Proposes RetNet as a foundation architecture for LLMs, simultaneously achieving training parallelism, low-cost inference, and good performance. arxiv.org/abs/2307.08621

Sematic
Sematic@SematicAI·
By abstracting away infrastructure, and guaranteeing visualizations, traceability, and observability, we’ve measured an 80% speed-up in #ML model development and retraining time with Sematic. The result? Faster innovation and competitive edge. #AI #Productivity
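The tweet above attributes the speed-up to traceability and observability: each pipeline step's inputs and outputs are recorded so a run can be inspected end to end. A minimal plain-Python sketch of that idea (illustrative only, not the actual Sematic API; every name here is hypothetical):

```python
# A minimal plain-Python sketch of a "traced pipeline". NOTE: this is NOT
# the actual Sematic API -- traced, TRACE, pipeline, and the step functions
# are all illustrative stand-ins.
from typing import Any, Callable

TRACE: list = []  # records (step name, inputs, output) for each executed step

def traced(func: Callable) -> Callable:
    """Wrap a step so its inputs and output are recorded when it runs."""
    def wrapper(*args: Any) -> Any:
        result = func(*args)
        TRACE.append((func.__name__, args, result))
        return result
    return wrapper

@traced
def load_data(n: int) -> list:
    return list(range(n))

@traced
def train(data: list) -> float:
    # Stand-in for a training step: the "metric" is just the mean.
    return sum(data) / len(data)

@traced
def evaluate(metric: float) -> str:
    return "ship" if metric > 1.0 else "retrain"

def pipeline(n: int) -> str:
    # Chaining traced steps gives an end-to-end run whose every step
    # can be inspected afterwards via TRACE.
    return evaluate(train(load_data(n)))

print(pipeline(5))                     # -> ship  (mean of 0..4 is 2.0)
print([name for name, _, _ in TRACE])  # -> ['load_data', 'train', 'evaluate']
```

In a real orchestrator the decorator would also handle scheduling, caching, and remote execution; here it only records step lineage, which is the part the tweet's "traceability" claim is about.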
Sematic
Sematic@SematicAI·
📢We're proud to publish our first Customer Case Study 📢 Voxel reported an 80% reduction in model training time thanks to Sematic's productivity gains. Traceability, visualizations, and scaling all contribute to cutting down turnaround time for #MachineLearning. sematic.dev/blog/how-voxel…
Sematic
Sematic@SematicAI·
@michaeljschock Thanks for starring our repo Michael 🙂 Let us know if you have any questions.
Sematic retweeted
Emmanuel Turlay
Emmanuel Turlay@neutralino1·
Are business cards still a thing? Love my new ones anyway.
Oakland, CA 🇺🇸