Imen Grida Ben Yahia

1K posts


@Imengby

passionate, curious, mission-driven | PhD | Principal AI/ML specialist @aws | ex-Orange | Public speaker | researcher | #networks #networkdata #ML #DL #MLOps

London, England · Joined October 2009
896 Following · 480 Followers
Imen Grida Ben Yahia reposted
Akshay 🚀
Akshay 🚀@akshay_pachaar·
Google just dropped "Attention is all you need (V2)".

This paper could solve AI's biggest problem: catastrophic forgetting. When AI models learn something new, they tend to forget what they previously learned. Humans don't work this way, and now Google Research has a solution: Nested Learning.

This is a new machine learning paradigm that treats models as a system of interconnected optimization problems running at different speeds, just like how our brain processes information.

Here's why this matters: LLMs don't learn from experience; they remain limited to what they learned during training. They can't learn or improve over time without losing previous knowledge. Nested Learning changes this by viewing the model's architecture and training algorithm as the same thing, just different "levels" of optimization.

The paper introduces Hope, a proof-of-concept architecture that demonstrates this approach:
↳ Hope outperforms modern recurrent models on language modeling tasks
↳ It handles long-context memory better than state-of-the-art models
↳ It achieves this through "continuum memory systems" that update at different frequencies

This is similar to how our brain manages short-term and long-term memory simultaneously. We might finally be closing the gap between AI and the human brain's ability to continually learn.

I've shared the link to the paper in the next tweet!
[image attached]
257 replies · 1K reposts · 6K likes · 511K views
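The "continuum memory systems" idea in the tweet, memory levels that update at different frequencies so slow levels retain old knowledge while fast levels adapt, can be illustrated with a toy sketch. All class names, learning rates, and update schedules below are invented for illustration and are not taken from the Nested Learning paper.

```python
# Toy illustration of multi-frequency memory: a fast level that updates
# every step and a slow level that updates every 10 steps. The slow level
# keeps a trace of earlier data after a distribution shift, which is the
# intuition behind resisting catastrophic forgetting.

class MemoryLevel:
    def __init__(self, update_every: int, lr: float):
        self.update_every = update_every  # how often this level updates
        self.lr = lr                      # how strongly it moves per update
        self.state = 0.0                  # scalar "memory" for simplicity

    def maybe_update(self, step: int, signal: float) -> None:
        if step % self.update_every == 0:
            # exponential moving average toward the incoming signal
            self.state += self.lr * (signal - self.state)

class ContinuumMemory:
    def __init__(self):
        self.levels = [MemoryLevel(1, 0.5), MemoryLevel(10, 0.1)]

    def observe(self, step: int, signal: float) -> None:
        for level in self.levels:
            level.maybe_update(step, signal)

    def read(self) -> list[float]:
        return [level.state for level in self.levels]

mem = ContinuumMemory()
for step in range(1, 101):
    # distribution shift halfway through the stream
    signal = 1.0 if step <= 50 else -1.0
    mem.observe(step, signal)

fast, slow = mem.read()
# The fast level tracks the new signal closely; the slow level still
# carries a trace of the earlier one.
print(f"fast={fast:.3f} slow={slow:.3f}")
```

After the shift, the fast level sits near the new signal (-1.0) while the slow level has only partially moved, mimicking short-term versus long-term memory operating simultaneously.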
Imen Grida Ben Yahia reposted
Werner Vogels
Werner Vogels@Werner·
No data, no AI, no progress. My @AmazonScience article explores how multi-layered mapping + petabyte-scale cloud infrastructure helps save lives in times of crisis. Building AI without addressing the fundamental data divide means solving the wrong problems. amazon.science/blog/why-ai-fo…
3 replies · 16 reposts · 44 likes · 7.5K views
Imen Grida Ben Yahia
Imen Grida Ben Yahia@Imengby·
4) The graph-floor model is the smallest, at 13M parameters, and models inter-robot dependencies directly. It combines graph neural networks with temporal attention. It was trained on two million robot-hours.
0 replies · 0 reposts · 0 likes · 53 views
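The tweet's combination of a graph neural network with temporal attention can be pictured with a toy, dependency-free sketch: each robot node aggregates messages from its graph neighbours, then attends over its own recent history. The topology, scalar features, and mixing weights below are invented for illustration and have nothing to do with the 13M-parameter model itself.

```python
import math

def graph_message_pass(features, adjacency):
    """One round of mean-aggregation message passing over robot nodes."""
    out = []
    for i, f in enumerate(features):
        neigh = [features[j] for j in adjacency[i]] or [f]
        mean_neigh = sum(neigh) / len(neigh)
        out.append(0.5 * f + 0.5 * mean_neigh)  # mix self and neighbours
    return out

def temporal_attention(history, query):
    """Scaled dot-product attention over a node's scalar history."""
    scores = [h * query for h in history]
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    return sum(w / z * h for w, h in zip(weights, history))

# three robots; robot 0 talks to 1 and 2, robots 1 and 2 only to 0
adjacency = {0: [1, 2], 1: [0], 2: [0]}
history = []
features = [1.0, 0.0, 0.0]
for _ in range(3):
    features = graph_message_pass(features, adjacency)
    history.append(features)

# attend over robot 0's history with its latest value as the query
robot0_history = [step[0] for step in history]
context = temporal_attention(robot0_history, robot0_history[-1])
print(robot0_history, round(context, 3))
```

The message-passing rounds show inter-robot dependencies diffusing through the graph; the attention step then summarizes a node's trajectory over time.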
Imen Grida Ben Yahia
Imen Grida Ben Yahia@Imengby·
3) The image-floor model has 900M parameters and reasons over full spatial layouts. It uses a convolutional encoder over multi-channel floor grids with sixty-second context. It was trained on three million robot-hours.
1 reply · 0 reposts · 0 likes · 43 views
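The phrase "a convolutional encoder over multi-channel floor grids" can be made concrete with a minimal single-channel example: a 3x3 mean filter slid over a small occupancy grid, producing a local-density feature map. Grid size, the filter, and the occupancy values are all invented for illustration, not from the 900M-parameter model.

```python
def conv2d_mean(grid):
    """Valid 3x3 mean convolution over a 2D grid (one channel)."""
    h, w = len(grid), len(grid[0])
    out = []
    for r in range(h - 2):
        row = []
        for c in range(w - 2):
            patch = [grid[r + i][c + j] for i in range(3) for j in range(3)]
            row.append(sum(patch) / 9)
        out.append(row)
    return out

# a 4x4 occupancy channel: 1.0 where a robot is present
occupancy = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 1.0, 0.0],
    [0.0, 1.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
feature_map = conv2d_mean(occupancy)
print(feature_map)  # 2x2 map of local robot density
```

A real encoder would stack many learned filters over many channels (occupancy, heading, task state, ...) and pool over the sixty-second temporal context, but the sliding-window aggregation is the same idea.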
Imen Grida Ben Yahia reposted
Anthropic
Anthropic@AnthropicAI·
New Anthropic research: Persona vectors. Language models sometimes go haywire and slip into weird and unsettling personas. Why? In a new paper, we find "persona vectors": neural activity patterns controlling traits like evil, sycophancy, or hallucination.
[image attached]
228 replies · 895 reposts · 5.8K likes · 1.4M views
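The basic mechanics behind a "persona vector", a direction in activation space computed as a difference of mean activations and then added (or subtracted) to steer behaviour, can be sketched with synthetic numbers. The real method operates on transformer hidden states; the 2-dimensional activations and the steering coefficient here are purely illustrative.

```python
# Toy sketch of activation steering with a difference-of-means direction.
# "sycophantic_acts" and "neutral_acts" stand in for hidden states collected
# while the model exhibits each persona; real vectors are thousands of dims.

def mean(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

sycophantic_acts = [[1.0, 0.2], [0.8, 0.0], [1.2, 0.1]]
neutral_acts = [[0.1, 0.2], [0.0, 0.1], [-0.1, 0.0]]

# persona vector = mean(trait activations) - mean(neutral activations)
mu_p = mean(sycophantic_acts)
mu_n = mean(neutral_acts)
persona_vec = [p - q for p, q in zip(mu_p, mu_n)]

# steer a new activation away from the persona (negative coefficient)
h = [0.5, 0.1]
alpha = -1.0
steered = [x + alpha * d for x, d in zip(h, persona_vec)]
print(persona_vec, steered)
```

Projecting activations onto such a direction can also be used to monitor how strongly a trait is active, which is part of what makes the approach attractive for safety work.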
Imen Grida Ben Yahia reposted
Steve Jarrett
Steve Jarrett@stevejarrett·
We will demo 'Network Genius' with @awscloud at #MWC2024 next week in Barcelona. It's a graph 'digital twin' representation of the Orange network running on AWS Neptune with hard-core, AI-based root cause analysis. Built by my team, my brilliant peer Orange CTO @lleboucher's team, our graph research team led by @TailhardatL, and Amazon 5G software lead @Imengby (who recently left Orange). If you want to see the future of network analytics, stop by the Amazon booth…
[image attached]
Paris, France 🇫🇷 · 2 replies · 7 reposts · 32 likes · 2.5K views
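One simple intuition for root cause analysis over a network "digital twin" graph: given a dependency graph of network elements and a set of alarmed nodes, a likely root cause is an alarmed node none of whose own dependencies are alarmed. The topology and node names below are invented for illustration; the actual Network Genius system on AWS Neptune is of course far richer.

```python
# Toy dependency graph: child -> list of elements it relies on.
depends_on = {
    "service_A": ["router_1"],
    "service_B": ["router_1"],
    "router_1": ["fiber_link_7"],
    "fiber_link_7": [],
}
alarmed = {"service_A", "service_B", "router_1", "fiber_link_7"}

def root_causes(depends_on, alarmed):
    """Alarmed nodes whose dependencies are all healthy: candidate roots."""
    roots = set()
    for node in alarmed:
        deps = depends_on.get(node, [])
        if not any(d in alarmed for d in deps):
            roots.add(node)
    return roots

print(root_causes(depends_on, alarmed))  # {'fiber_link_7'}
```

Here the two service alarms and the router alarm are all symptoms propagating up from a single failed fiber link, which is exactly the fan-out pattern that graph-based RCA is good at collapsing.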
Imen Grida Ben Yahia
Imen Grida Ben Yahia@Imengby·
Very important objective indeed. Operationalising LLMs is the key to production-grade pipelines. Sharing two blogs here with insights on the steps and stacks. The first explains the differences between MLOps and LLMOps, which matters because in production you often need to combine an LLM and a classical ML model for an end-to-end use case. aws.amazon.com/blogs/machine-… The second zooms in on evaluation as a key step in the lifecycle of an LLM. aws.amazon.com/blogs/machine-…
1 reply · 1 repost · 7 likes · 290 views
Steve Jarrett
Steve Jarrett@stevejarrett·
My goal for 2024 is to design and run an elegant #LLMOps workflow, including reliably optimizing, distilling, and maintaining a few of the leading foundation models for a number of critical use cases. If any of you have recommendations on how best to do this, I am happy to hear them!
Andrew Ng@AndrewYNg

What do you want to learn in 2024?

Paris, France 🇫🇷 · 6 replies · 1 repost · 17 likes · 2.7K views
Imen Grida Ben Yahia reposted
LlamaIndex 🦙
LlamaIndex 🦙@llama_index·
A Comprehensive Survey of Advanced RAG 📖

If you're looking for a one-stop shop for advanced RAG concepts, look no further 💫 @ivanilin9 details all the key concepts in this blog post 📚, and each section highlights @llama_index resources/guides that you can check out.

Here are twelve core concepts:
1. Chunking and vectorization
2. Hierarchical Indexing
3. Query Generation/Rewriting (HyDE)
4. Context Enrichment (Sentence Window Retrieval, Auto-merging Retrieval)
5. Fusion Retrieval
6. Query Transformations
7. Adding Conversational History with @llama_index chat engines
8. Routing
9. Agents
10. Response Synthesis
11. Fine-tuning
12. Evaluation

Each concept is covered in textual and visual detail.

Blog post: pub.towardsai.net/advanced-rag-t…
[image attached]
5 replies · 111 reposts · 645 likes · 98K views
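The first of those concepts, chunking and vectorization, plus a basic retrieval step, fits in a tiny stdlib-only sketch: split a document into overlapping chunks, embed each as a normalized bag-of-words vector, and retrieve the best chunk by cosine similarity. Real pipelines (e.g. with LlamaIndex) would use a tokenizer and a learned embedding model; everything below is a simplified stand-in.

```python
import math

def chunk(text, size=5, overlap=2):
    """Split text into overlapping word chunks."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

def embed(text, vocab):
    """Unit-norm bag-of-words vector over a fixed vocabulary."""
    counts = [text.lower().split().count(w) for w in vocab]
    norm = math.sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

def cosine(a, b):
    # both vectors are already unit-norm, so the dot product suffices
    return sum(x * y for x, y in zip(a, b))

doc = ("graph models capture robot dependencies while image models "
       "capture spatial layout of the floor")
vocab = sorted(set(doc.lower().split()))
chunks = chunk(doc)
index = [(c, embed(c, vocab)) for c in chunks]

query_vec = embed("robot dependencies", vocab)
best = max(index, key=lambda item: cosine(query_vec, item[1]))
print(best[0])
```

The overlap between chunks is what keeps a sentence that straddles a chunk boundary retrievable, which is the same motivation behind the sentence-window and auto-merging retrieval techniques listed above.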
Imen Grida Ben Yahia
Imen Grida Ben Yahia@Imengby·
Roaming in beautiful Richmond Park and following the deer 🦌 - a moment of meditation and «zenitude»
[3 photos attached]
1 reply · 0 reposts · 3 likes · 345 views
Bruno Zerbib
Bruno Zerbib@bruno_zerbib·
40 years of innovation, research and excellence! Our Caen site is a true symbol of success and commitment to the region's development 🤩 Let's keep designing and deploying tomorrow's innovations! See you for the 50th anniversary! @fallacher @JMEscalettes
4 replies · 19 reposts · 75 likes · 5.4K views