Nii Osae

426 posts

@Stark_Osae

Founder & CEO @ Mindbeam | AI scientist solving the puzzle of intelligence | Forbes 30 under 30

New York, USA · Joined May 2015
2.4K Following · 852 Followers
Pinned Tweet
Nii Osae @Stark_Osae
I am thrilled to formally introduce Mindbeam as we emerge from stealth. We have built the fastest AI training framework to date, achieving groundbreaking performance benchmarks and an 86% reduction in energy consumption. This milestone reflects our core mission to accelerate the future of AI while significantly reducing the environmental impact of large-scale model training. We are proud to be working closely with the amazing team at @awscloud to deploy this product to thousands of customers across AI research labs and Fortune 100 enterprises. Our flagship product, Litespark, is now available on the AWS Marketplace. Visit mindbeam.ai to learn more.
Mindbeam @MindbeamAI

🚀 BREAKING: We just revolutionized AI pre-training. We are excited to come out of stealth and launch our revolutionary product: Litespark, a language model framework that uses advanced algorithms to speed up training and inference workloads for generative AI applications. Litespark reduces training time from months to days. With just 16 H100s, you can train your own billion-parameter foundation model from scratch in 26 hours. Fortune 100 companies are already transforming their AI workflows with our breakthrough framework on AWS.

9 replies · 19 reposts · 46 likes · 5.3K views
Nii Osae @Stark_Osae
Stress test across different physics regimes: from highly entangled, frustrated quantum states (left) to simple aligned spins (right). Our method nails the ground state energy across the board—no manual tuning for different problems. That's the power of learned circuit design.
0 replies · 0 reposts · 1 like · 29 views
Nii Osae @Stark_Osae
Watch a quantum circuit evolve from rough sketch to precision instrument. The AI generates an initial circuit (left), continuous optimization tunes the angles (middle), then smart rewiring unlocks the right connections (right)—dropping energy from -22 to -64 (near-perfect!).
1 reply · 0 reposts · 1 like · 42 views
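The middle stage described in the tweet above (continuous optimization of the rotation angles) can be sketched with a toy objective. Everything below is illustrative: `energy` is a stand-in cost function written for this sketch, not an actual circuit evaluation.

```python
import numpy as np

def energy(theta):
    """Toy stand-in for a circuit's energy; minimized when every angle is pi."""
    return -np.sum(np.cos(theta - np.pi) + 1.0)

def tune_angles(theta, lr=0.2, steps=200):
    """Plain gradient descent on the toy energy, using its analytic gradient."""
    for _ in range(steps):
        grad = np.sin(theta - np.pi)   # d(energy)/d(theta) for the toy objective
        theta = theta - lr * grad
    return theta

theta0 = np.array([0.5, 2.0, 4.0])   # rough initial angles
theta = tune_angles(theta0)
assert energy(theta) < energy(theta0)   # tuning lowers the energy
```

The same pattern, evaluate the energy and nudge the angles downhill, is what a parameter-shift or finite-difference loop would do against a real circuit simulator.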
Nii Osae @Stark_Osae
We are thrilled to share our latest innovation, SpinGQE, a generative quantum eigensolver for spin Hamiltonians. Ground state search is fundamental to quantum computing, but VQE suffers from barren plateaus and limited expressivity. In our solution, we reframe circuit design as generative modelling: a transformer learns distributions over quantum circuits by matching logits to energies at each gate subsequence. Paper: arxiv.org/abs/2603.24298 Code: github.com/Mindbeam-AI/Sp…
1 reply · 1 repost · 2 likes · 64 views
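The "matching logits to energies" idea can be shown in miniature. The sketch below is a deliberate simplification: a flat logit vector stands in for the transformer's per-gate output head, and random numbers stand in for measured prefix energies.

```python
import numpy as np

# Illustrative sketch only: regress per-gate logits onto negative energies,
# so that softmax sampling favors low-energy gate choices.
rng = np.random.default_rng(0)
n_gates = 4                          # hypothetical gate vocabulary size
energies = rng.normal(size=n_gates)  # stand-in for measured prefix energies
logits = np.zeros(n_gates)           # stand-in for the model's output head

for _ in range(500):                 # least-squares fit: logits -> -energies
    logits -= 0.1 * (logits - (-energies))

probs = np.exp(logits) / np.exp(logits).sum()
assert np.argmax(probs) == np.argmin(energies)  # lowest-energy gate is likeliest
```

In the actual framework the logits are conditioned on the gate subsequence generated so far; that conditioning is dropped here to keep the fit to one line.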
Nii Osae @Stark_Osae
@JustinLin610 Thank you for your amazing contribution to the community! We are excited for what you’ll build next
0 replies · 0 reposts · 2 likes · 1.4K views
Junyang Lin @JustinLin610
me stepping down. bye my beloved qwen.
1.7K replies · 731 reposts · 13.5K likes · 6.5M views
Nii Osae retweeted
Mindbeam @MindbeamAI
Announcing our Litespark Technical Report! 🚀 Mindbeam Litespark makes LLM training up to 6x faster & cuts energy use by 83%. Works with any transformer architecture. Smarter, greener AI is here. Dive into the details: arxiv.org/abs/2510.02483
0 replies · 1 repost · 2 likes · 230 views
Nii Osae retweeted
Mindbeam @MindbeamAI
🚀 Proud moment alert! AWS Startups just spotlighted our founder, Nii Osae, sharing how enterprise teams can turbocharge pre-training models by 6x, reducing training time from months to days. Check out how Mindbeam Litespark transforms AI development—now available on the AWS Marketplace. Get started: bit.ly/litespark
AWS Startups @AWSstartups

💡 @MindbeamAI helps Fortune 500 companies pre-train LLMs on #AWS. We caught up with Nii Osae, the company’s Founder & CEO, at the AWS NYC Summit to learn more about the future of #AI infrastructure. Find us at AWS Summit LA to connect. 🚀 go.aws/45TUAyC

0 replies · 1 repost · 7 likes · 566 views
Nii Osae @Stark_Osae
We are bringing unprecedented AI training speed to enterprises on @awscloud 🚀
AWS Startups @AWSstartups

💡 @MindbeamAI helps Fortune 500 companies pre-train LLMs on #AWS. We caught up with Nii Osae, the company’s Founder & CEO, at the AWS NYC Summit to learn more about the future of #AI infrastructure. Find us at AWS Summit LA to connect. 🚀 go.aws/45TUAyC

2 replies · 11 reposts · 17 likes · 1.3K views
Nii Osae @Stark_Osae
Super excited to share @MindbeamAI Litespark's record-breaking benchmarks! With Litespark, you can train a 3-billion-parameter model with 16 H200 GPUs in less than 2 days, reducing training costs from millions of dollars to under $20K. Bigger models see up to 6x improvements in training speed! This marks an inflection point in AI model training, and we are happy to share it with the world. 🚀🚀🚀
Mindbeam @MindbeamAI

Litespark is bringing unprecedented training speed and performance to Generative AI. These benchmarks are redefining the status quo and unlocking new potentials for AI infrastructure.🚀

1 reply · 8 reposts · 15 likes · 1.1K views
Nii Osae retweeted
François Chollet @fchollet
The arrival of the first AGI will not be the world-shattering event most people expect. Instead, it will go entirely unnoticed by the general public.
26 replies · 25 reposts · 387 likes · 19.5K views
Nii Osae @Stark_Osae
@fchollet You hit the nail right on the head!
0 replies · 0 reposts · 0 likes · 164 views
François Chollet @fchollet
Quantum physics only starts to make intuitive sense when you take the perspective of God: what choices would you make if you were to build a universe and make it run efficiently?
Lazy evaluation -> observer effect
Caching -> quantum entanglement
42 replies · 26 reposts · 414 likes · 38.5K views
Nii Osae retweeted
François Chollet @fchollet
In quantum mechanics, the observer effect boils down to the idea that the world state is lazily evaluated (i.e. you only compute something when the result is causally needed to evaluate something else).

This makes the many-world interpretation very icky: MWI requires "compute" and "storage" in the below-world substrate to be free and in infinite supply. But the entire point of lazy evaluation is to save compute and storage!

Basically: if the universe has made the design choice of lazy evaluation, it implies that whatever resources it runs on must cost something, which in turn rules out the infinitely resource-hungry many-world design choice.
115 replies · 85 reposts · 1K likes · 158.7K views
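The lazy-evaluation/caching analogy in the tweets above maps directly onto a few lines of code. The `Lazy` class below is an illustrative helper written for this sketch, not anything from the thread:

```python
class Lazy:
    """A value computed only when first observed, then cached."""
    def __init__(self, compute):
        self._compute = compute   # deferred computation ("unobserved state")
        self._value = None
        self.evaluated = False

    def observe(self):
        # Lazy evaluation: do the work only when the result is causally needed.
        if not self.evaluated:
            self._value = self._compute()
            self.evaluated = True  # caching: later observers see the same value
        return self._value

spin = Lazy(lambda: "up")    # stand-in for an unmeasured property
assert not spin.evaluated     # nothing computed until an observer asks
assert spin.observe() == "up"
assert spin.evaluated         # subsequent observations hit the cache
```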
Nii Osae retweeted
François Chollet @fchollet
Interesting work on reviving RNNs. arxiv.org/abs/2410.01201 -- in general the fact that there are many recent architectures coming from different directions that roughly match Transformers is proof that architectures aren't fundamentally important in the curve-fitting paradigm (aka deep learning).

Curve-fitting is about embedding a dataset on a curve. The critical factor is the dataset, not the specific hard-coded bells and whistles that constrain the curve's shape. As long as your curve is sufficiently expressive, all architectures will converge to the same performance in the large-data regime.
67 replies · 252 reposts · 1.9K likes · 222.5K views
Niklas Muennighoff @Muennighoff
Excited to start a PhD in AI @Stanford today🌲 Grateful for help from many people! In the LLM era, many rightly questioned me doing a PhD, but the points in @karpathy's great PhD Guide still hold i think. Regardless feel free to reach out if you have extra H100s😁 (or to collab!)
55 replies · 44 reposts · 1.3K likes · 124.6K views
Nii Osae retweeted
Niko McCarty. @NikoMcCarty
This paper reports a CRISPR-based method to wipe data stored in DNA. Single-stranded DNA that should *not* be erased is converted into double-stranded DNA. Everything else gets chewed up. "It is possible to wipe 10^11.7 files of up to 10^7 GB with 100% file sanitization."
4 replies · 16 reposts · 86 likes · 9.2K views
Nii Osae retweeted
Dileep George @dileeplearning
Hippocampus is phenomena-rich, all confusing! What if they arise from the wrong space-centric view? We offer a different story of how space is represented in the brain
* there are no place cells! cognitive map is not spatial!
* Kant is wrong, Leibniz is right!
* place field methodology is usefully wrong
🧵 1/ science.org/doi/10.1126/sc…
15 replies · 85 reposts · 404 likes · 66.2K views
Nii Osae @Stark_Osae
Biologically-inspired AI architecture is the way to go 🦾
Ted Werbel @tedx_ai

OpenAI is about to blow your mind with Active Inference... So what is it?

> First introduced in the early 2000s in a series of papers by neuroscientist and theoretical neurobiologist Karl Friston, active inference is a theory of how the brain uses statistical inference and generative world models to predict sensory inputs and guide actions to minimize prediction errors - helping explain human perception, action and learning

> Perception updates our generative world model to reduce errors in prediction, while actions change our environment to align with our predictions - minimizing the probability of errors in our predictions

> It is likely that with a combination of enough compute, advancements in continuous learning / information retrieval with causal grounding and layers of active inference methods like GoT, AoT, CoV and MCTS - we may be inching closer and closer to a generative, continually learning model that operates at near-human levels of cognition ✨

I suspect that we might start to see the emergence of "energy-based" models (EBM) that operate more dynamically and continuously learn with hot-swappable memory partitions that evolve over time. They will intelligently route and adjust the level of compute needed for more sophisticated means of active inference based on the complexity of a given query. These models will also allow users to explicitly define how much energy / reasoning strength is needed at inference time.

I'll be posting more on how this works under the hood soon with layered graphs-of-thought (GoT) and algos like monte-carlo tree search (MCTS)...

For more on active inference check out these incredible papers:
> Friston, K. (2003). "Learning and inference in the brain."
> Friston, K. (2005). "A theory of cortical responses."
> Friston, K. (2006). "Free-energy principle for perception and action."

0 replies · 0 reposts · 5 likes · 349 views
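The prediction-error-minimization loop at the heart of the active-inference description above can be reduced to a toy: a scalar belief, an identity generative model, and gradient descent on the squared prediction error. This is purely illustrative, not from Friston's papers or the quoted thread.

```python
# Toy sketch: perception as belief updating that shrinks prediction error.
def perceive(belief, sensory_input, lr=0.1, steps=100):
    """Gradient descent on 0.5 * (belief - input)^2; the prediction is the belief itself."""
    for _ in range(steps):
        error = belief - sensory_input   # prediction error
        belief -= lr * error             # update belief to reduce it
    return belief

belief = perceive(belief=0.0, sensory_input=2.0)
assert abs(belief - 2.0) < 1e-3   # the belief converges to the input
```

Action, in this picture, would be the complementary move: changing the sensory input toward the prediction rather than the belief toward the input.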