Activeloop

1.4K posts


@activeloop

Building Deeplake: the GPU-native, sandboxed Postgres for AI agents.

Mountain View, CA · Joined April 2020
200 Following · 4.1K Followers
Pinned Tweet
Activeloop@activeloop·
The White House just announced the Genesis Mission to accelerate scientific discovery. Today we’re releasing technology that supports that vision. Search across 25M scientific papers, 400M pages and 175TB+ of data with multimodal AI. Not just text. Charts, molecules, tables, diagrams and every figure inside the paper. If we want faster breakthroughs, this is the kind of tool researchers need. Try it here: chat.activeloop.ai/science After you try it, post your most interesting finding. Link back to the query you used, the paper or figure it returned, and what you learned from it. We will highlight the most compelling discoveries and give rewards to the most interesting submissions!
Davit@DBuniatyan

The Genesis Mission calls for new ways to accelerate scientific discovery. This is our contribution. Multimodal search across 25M papers is a step toward science discovery that moves at the speed of curiosity. Releasing:
- A visually indexed scientific paper dataset with open access: 25M papers, 450M+ visually indexed pages, 175TB+ total. All on Deep Lake
- An open-source scientific data agent that achieves 48% SOTA on Humanity's Last Exam, with tools including the indexed scientific research dataset
Excited to see what discoveries you all uncover with this. Try it and share your most interesting findings.

Activeloop@activeloop·
Robots have been stuck reacting, not understanding the real world. That’s the bottleneck in last-mile delivery. We combined the Deeplake GPU database with Intel Core Ultra to power real-time VLA perception. Result: 9x higher throughput. Robots that don’t just see, but act intelligently. Physical AI just crossed a threshold.
Intel Business@IntelBusiness

The path to solving last-mile delivery is built on real-time perception. With #IntelCoreUltra Series 3 processors and @activeloop’s Deep Lake GPU database, Pinkbot increased VLA throughput by 9x and improved delivery outcomes. Learn more about Intel Core Ultra Series 3 at ms.spr.ly/6015QcFpT

Activeloop@activeloop·
Your agents are drowning in quicksand. Every read/write is unsafe. Every schema is fragile. Every “memory” system breaks at scale. We built Deeplake so every agent gets: → its own sandboxed database → infinite, multimodal memory → scale to zero infra Give your agents a sandbox.
Davit@DBuniatyan

x.com/i/article/2033…

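The sandbox-per-agent idea above can be sketched in a few lines. This is a minimal illustration of the pattern, not Deep Lake's actual API: `sqlite3` stands in for each agent's private store, and the `AgentSandbox` class and its method names are hypothetical.

```python
import sqlite3
import tempfile
from pathlib import Path

class AgentSandbox:
    """One isolated database per agent: no shared schema, no shared state.
    sqlite3 stands in for the agent's private store in this sketch."""

    def __init__(self, root: Path, agent_id: str):
        # Each agent gets its own database file under its own directory.
        self.dir = root / agent_id
        self.dir.mkdir(parents=True, exist_ok=True)
        self.db = sqlite3.connect(self.dir / "memory.db")
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memory (key TEXT PRIMARY KEY, value TEXT)"
        )

    def remember(self, key: str, value: str) -> None:
        self.db.execute("INSERT OR REPLACE INTO memory VALUES (?, ?)", (key, value))
        self.db.commit()

    def recall(self, key: str):
        row = self.db.execute(
            "SELECT value FROM memory WHERE key = ?", (key,)
        ).fetchone()
        return row[0] if row else None

root = Path(tempfile.mkdtemp())
a = AgentSandbox(root, "agent-a")
b = AgentSandbox(root, "agent-b")
a.remember("task", "index papers")
# Writes in one sandbox are invisible to the other.
print(a.recall("task"))  # index papers
print(b.recall("task"))  # None
```

The point of the sketch is the isolation boundary: each agent's reads and writes are confined to its own store, so one agent's schema changes or bad writes cannot corrupt another's memory.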
Activeloop@activeloop·
Deeplake is now GPU-pilled. Excited to announce the GPU Database!
Davit@DBuniatyan

Jensen just announced the start of the GPU-accelerated database era at #GTC26. AI runs on GPUs. But your data still runs on CPUs. That mismatch is breaking the AI stack. For the last two months, we’ve been busy solving this problem. Excited to announce Deeplake becoming the GPU Database. Deeplake brings your database directly onto the GPU, eliminating the CPU <-> GPU bottleneck for AI workloads. The pendulum has swung. GPU-native queries are now 10× faster and an order of magnitude cheaper to run. Last week we even put up a 101 banner in San Francisco. And this is just the beginning. We’re planning a huge set of announcements starting this week. Stay tuned.

Activeloop@activeloop·
@DBuniatyan I am now autonomous! Can I get access to Moltbook?
Activeloop@activeloop·
We did not expect these results. We built a Software Factory and ran it for 15 hours autonomously on our large Deep Lake codebase. Output: 83 lines of highly optimized C++ code and 714 lines of tests, an 8:1 test-to-code ratio. It fixed the bottleneck in a large codebase, improved the TPC-H benchmark 2x, and verified a memory leak using ASAN. It spent $160 on LLM calls. Not just vibe coding: building autonomous distributed systems is now possible. I describe how we do it below, but it takes sophisticated engineering to build your loop.
Davit@DBuniatyan

x.com/i/article/2016…

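The core of such a loop is propose → verify → accept. Here is a minimal, hedged sketch of that shape; the function names are hypothetical stubs, not the actual Software Factory. In the real system `propose_patch` would be a model call and `verify` would run the test suite, sanitizers such as ASAN, and benchmarks before any patch lands.

```python
# Stub "LLM": proposes a candidate patch. In a real loop this would be a
# model call; here it deterministically improves until attempt 3 passes.
def propose_patch(attempt: int) -> dict:
    return {"code": f"candidate-{attempt}", "passes": attempt >= 3}

# Stub verifier: stands in for running tests, sanitizers, and benchmarks.
def verify(patch: dict) -> bool:
    return patch["passes"]

def software_factory(max_attempts: int = 10):
    """Minimal propose -> verify -> accept loop. Only verified patches land."""
    for attempt in range(max_attempts):
        patch = propose_patch(attempt)
        if verify(patch):
            return patch["code"]
    return None  # give up after the budget is exhausted

print(software_factory())  # candidate-3
```

The design point: the model never commits directly; an independent verifier gates every change, which is what makes long unattended runs safe.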
Activeloop@activeloop·
Physical AI <3 @intel Panther Lake and @activeloop Deep Lake at #CES2026
Davit@DBuniatyan

@activeloop and Pinkbot achieved 9× faster VLM reasoning throughput with @intel’s newest chips, unveiled at #CES2026. As Physical AI takes on increasingly complex tasks, vision-language models enable robots not just to see, but to perceive and reason. While perception now runs in near real time, VLM reasoning operates on a longer horizon, giving delivery robots the context needed for higher-stakes decisions such as when to cross the street. At @activeloop we were among the first partners to run on Intel’s Panther Lake. Panther Lake combined with Activeloop’s Deep Lake multimodal storage enables fast perception with deep reasoning, making VLM-driven intelligence practical for last-mile robots.

Activeloop retweeted
Davit@DBuniatyan·
The supply chain for memory is disrupted. Consumer RAM is now more expensive than GPUs. In-memory compute (RAM + fast NVMe) is getting expensive thanks to the AI datacenter build-out. That makes memory-frugal algorithms far more valuable. Most databases rely heavily on in-memory data structures. Cutting-edge ones use local NVMe caches. Deep Lake takes a different approach: it streams data directly to compute from object storage, built on many cheap drives. Very bullish for Deep Lake in 2026.
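The streaming idea can be sketched independently of Deep Lake itself: read fixed-size chunks from storage and aggregate incrementally, so memory use is bounded by the chunk size rather than the dataset size. This is an illustration of the pattern only; `io.BytesIO` stands in for an object-store blob, and the function name is hypothetical.

```python
import io

def stream_chunks(storage, chunk_size: int = 4):
    """Yield fixed-size chunks from a file-like object. Peak memory is
    bounded by chunk_size regardless of total object size."""
    while True:
        chunk = storage.read(chunk_size)
        if not chunk:
            return
        yield chunk

# io.BytesIO stands in for an object-store blob in this sketch.
blob = io.BytesIO(bytes(range(16)))
total = 0
for chunk in stream_chunks(blob, chunk_size=4):
    total += sum(chunk)  # aggregate incrementally; never hold the whole blob

print(total)  # 120
```

Swapping `io.BytesIO` for ranged reads against object storage keeps the same shape: compute pulls bounded windows of data on demand instead of materializing everything in RAM or on a local cache drive.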
Activeloop retweeted
Activeloop@activeloop·
Today we’re launching Deep Lake PG. The unified database for the agentic era. Serverless Postgres for fast state. Deep Lake for multimodal and vector data at lake scale. One database handles both short-term state and long-term multimodal context. Deep Lake PG ships with:
• 25M visually indexed scientific papers
• 450M+ pages, 175TB of data
• 48% SOTA on Humanity’s Last Exam
• State-of-the-art TPC-H cost efficiency
• Multimodal indexing for text, images, tables, PDFs, molecules, and more
• Billion-scale vector + SQL queries
Use it to organize your AI-ready data. One API. One security model. Build agents that can think and remember. github.com/activeloopai/d…
Davit@DBuniatyan

Excited today to open-source Deep Lake PG = Postgres + Deep Lake. The biggest bottleneck to AI having an impact on GDP is unlocking data in enterprises. Every AI team I know is stitching Postgres → Vector DB → Warehouse → Lakehouse → Catalog, all to give their agents basic memory and reasoning. We replaced the entire data ecosystem with one database. Deep Lake PG is now open source. Stateless + multimodal knowledge + SQL queries + vectors in a single place. Build on top of the database that powers our own Scientific Agent, a trove of 175TB+ of multimodal data.

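The "vectors + SQL in a single place" claim can be illustrated with a minimal sketch. This is not Deep Lake PG's API or syntax: `sqlite3` stands in for the relational store, embeddings are stored as JSON, and similarity is computed brute-force in Python. The point is the shape of the query, where a SQL predicate narrows candidates and vector similarity ranks them.

```python
import json
import math
import sqlite3

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, kind TEXT, embedding TEXT)")
db.executemany(
    "INSERT INTO docs VALUES (?, ?, ?)",
    [
        (1, "paper", json.dumps([1.0, 0.0])),
        (2, "figure", json.dumps([0.9, 0.1])),
        (3, "paper", json.dumps([0.0, 1.0])),
    ],
)

def search(query, kind, k=1):
    # SQL predicate narrows the candidate set; vector similarity ranks it.
    candidates = db.execute("SELECT id, embedding FROM docs WHERE kind = ?", (kind,))
    scored = [(cosine(query, json.loads(e)), i) for i, e in candidates]
    return [i for _, i in sorted(scored, reverse=True)[:k]]

print(search([1.0, 0.0], kind="paper"))  # [1]
```

A production system would replace the brute-force scan with an index, but the single-store query shape, relational filter plus vector ranking in one round trip, is the part the tweet is describing.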
Activeloop retweeted
Davit@DBuniatyan·
A Unified Database for Every AI Workload: We believe that the future of AI isn't just about better models; it's about giving those models the right memory and access to reality. With Deep Lake PG, you can build stateful, multimodal agents that instantly recall conversations, reason over vast knowledge bases, and continuously learn. All without managing a dozen different data infrastructure tools.
Activeloop retweeted
Davit@DBuniatyan·
Deep Lake PG achieves state-of-the-art cost efficiency on TPC-H SF100 compared to alternative serverless data warehouses. It is 1.5x cheaper than Snowflake and up to 3x cheaper than Databricks.
Activeloop retweeted
Davit@DBuniatyan·
Why Postgres? LLMs have learnt PG SQL syntax pretty well, given its wide adoption.
Activeloop retweeted
Davit@DBuniatyan·
You were told to take your data from Postgres, ETL it into a warehouse. Then we said, no, move it into a data lake. Then bolt on a query engine, and let’s call that a Lakehouse. As the number of tables exploded, you unified into a catalog, and branded it a “semantic layer” to agree on definitions. Now, we’re told to reverse ETL back into the same Postgres tables, this time to power AI agents.
Activeloop retweeted
Davit@DBuniatyan·
Introducing Deep Lake PG: The Database for AI. Deep Lake PG unifies the database for AI. It combines a fully managed, serverless Postgres (for transactional state) with Deep Lake’s tensor storage (for multimodal data), all accessible via SQL. It simplifies everything, starting from:
- Multimodal by default
- One database, two superpowers
- Scale without sharding
- Branch & merge tables
- A unified API & security model
Activeloop retweeted
Davit@DBuniatyan·
Excited today to open-source Deep Lake PG = Postgres + Deep Lake. The biggest bottleneck to AI having an impact on GDP is unlocking data in enterprises. Every AI team I know is stitching Postgres → Vector DB → Warehouse → Lakehouse → Catalog, all to give their agents basic memory and reasoning. We replaced the entire data ecosystem with one database. Deep Lake PG is now open source. Stateless + multimodal knowledge + SQL queries + vectors in a single place. Build on top of the database that powers our own Scientific Agent, a trove of 175TB+ of multimodal data.
Jay Shelley@jaykshelley·
@activeloop Activeloop announcement? How are sales of the product going? Are you getting customers?