dennylee
18.7K posts

dennylee
@dennylee
geek, scribe, coffee snob, and wanna-be cyclist. Apache Spark contributor and Delta Lake maintainer. Developer Relations at Databricks (opinions r my own)

Databricks is proud to be a Founding Gold Sponsor of @TheOfficialACM Conference on AI and Agentic Systems—the first ACM conference dedicated to compound AI and agentic systems, with our co-founder @matei_zaharia on the organizing committee. Join us May 26–29 in San Jose for the premier event for rigorous, reproducible research in compound AI architectures, optimization, and deployment. Register today: caisconf.org


Can LLMs adapt continually without losing base skills? Fast-Slow Training (FST) pairs "slow" weights with "fast" context.

FST vs. RL:
• 3x more sample-efficient
• Higher performance ceiling
• Less KL drift (better plasticity)
• Continual learning: succeeds where RL stalls
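A toy sketch of the fast-slow idea on a linear model (purely illustrative — the update rules, learning rates, and function names here are my assumptions, not the actual FST method): "slow" weights take small gradient steps so base behavior is preserved, while a "fast" context vector takes large steps and absorbs the task-specific part, and is reset between tasks.

```python
import numpy as np

# Hypothetical fast-slow split (illustrative sketch, not the FST paper's
# algorithm): slow weights W change gradually; fast context c adapts
# quickly to the current task and is reset when the task changes.

rng = np.random.default_rng(0)

def predict(W, c, x):
    # Prediction conditioned on slow weights and fast context.
    return x @ W + c

def fast_slow_step(W, c, x, y, lr_fast=0.5, lr_slow=0.01):
    err = predict(W, c, x) - y
    # Fast context: large step, tracks the task at hand.
    c_new = c - lr_fast * err.mean(axis=0)
    # Slow weights: small step, preserves the shared "base skill".
    W_new = W - lr_slow * x.T @ err / len(x)
    return W_new, c_new

# Two "tasks" sharing one slow mapping but differing by an offset.
W_true = rng.normal(size=(3, 2))
W = np.zeros((3, 2))
for offset in (np.array([1.0, -1.0]), np.array([-2.0, 0.5])):
    c = np.zeros(2)  # reset fast state at each task boundary
    for _ in range(200):
        x = rng.normal(size=(16, 3))
        y = x @ W_true + offset
        W, c = fast_slow_step(W, c, x, y)
```

After both tasks, the slow weights end up near the shared mapping while the fast context holds only the latest task's offset — the intuition behind adapting continually without overwriting base skills.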
