ApertureData

567 posts


@ApertureData

Foundational Data Layer for AI: Combine scalable vector search with memory-optimized graph and multimodal data management

Mountain View, CA · Joined November 2018
29 Following · 412 Followers
Pinned Tweet
ApertureData @ApertureData
New website, fresh documentation, and a MAJOR release on deck! Managing multimodal data is a challenge, but we're making it easier with our newly launched ApertureDB Cloud. With experience deploying ApertureDB to Fortune 100 customers, we now bring those features to ApertureDB Cloud, which enables faster insights and streamlined workflows. Big thanks to @VentureBeat & @mr_bumss for featuring the release!! Key figures: • 35x faster at mobilizing multimodal datasets • 2-4x faster than some other open-source vector search solutions • 66% of enterprise data remains unused; time to change that. venturebeat.com/data-infrastru…
4
3
7
2.4K
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
I was learning about 10+ new AI tools a day and realized I had no way to filter the noise. The spark for a better map came from the agentic stack sessions at @PWVentures. Watching founders build real-world "plumbing" forced a shift in perspective. 1/4 🧵
2
2
4
95
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
I'll be at @NVIDIAGTC next week. Jensen Huang has talked about AI as a "five-layer cake": energy → chips/compute → cloud infrastructure → models → applications, and it's becoming the industry's shared map. What's interesting is how quickly the software layers are exposing a gap: they need a data + memory substrate that gives AI continuity, grounding, and the ability to operate over real-world context. Verticalization is accelerating this shift. Physical AI is pushing it even faster. Stateless prompts don't cut it when systems need to see, remember, and act across time. A few questions I'm bringing into GTC: • Do small, domain-native models win once memory becomes the anchor, or do frontier models keep stretching upward? • How long can teams absorb token costs before architecture becomes the real constraint? • And what exactly is "memory" now: a log file, a vector store, or the core infrastructure layer that binds the stack? At @ApertureData, we've been thinking about this layer for a long time. If you're exploring it too, let's connect.
Vishakha Gupta-Cledat tweet media
0
1
3
61
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
The PoC days of AI are over. The time for production is now! Across industries, and especially inside AI-native tech companies, the same cracks are showing: • Retrieval latency • Multimodal sprawl • Glue code overload • Graph + vector fragmentation Before context graphs and AI memory, getting the data foundation layer right is the first order of business today. @ApertureData
Vishakha Gupta-Cledat tweet media (4 images)
1
2
3
121
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
Hey, AI models are everywhere, getting smarter as we speak, but we talk far less about how scalable, persistent, multimodal memory is designed in AI systems. In practice, "memory" shows up as a combination of things teams already struggle to manage: stored embeddings, structured metadata, relationships between entities, and the raw multimodal assets themselves. When these pieces live in different systems, memory becomes fragmented. Context is hard to retrieve consistently. Long-running workflows become brittle. Treating memory as infrastructure means designing the data layer so that embeddings, metadata, and multimodal content can be stored, queried, and related in one place. It means supporting retrieval patterns that go beyond single queries, and making it possible for agents and applications to build context over time using the same underlying system. As AI systems move from one-off calls to multi-step, multi-agent workflows, memory stops being a bolt-on. It becomes part of how the system is structured from day one.
And that's where our inspiration and direction at @ApertureData come from, as we continue to design and refine the infrastructure for multimodal data, vector search, and relationship-aware queries in a single layer.
Vishakha Gupta-Cledat tweet media
0
1
3
127
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
๐—–๐—ผ๐—ป๐˜๐—ฒ๐˜…๐˜ ๐—ด๐—ฟ๐—ฎ๐—ฝ๐—ต๐˜€ are getting a lot of attention, and for good reason (@JayaGup10 @FoundationCap )! They point to a future where human and agent decisions can be captured, understood, and revisited with far more fidelity than todayโ€™s systems allow. ๐˜‰๐˜ถ๐˜ต ๐˜ต๐˜ถ๐˜ณ๐˜ฏ๐˜ช๐˜ฏ๐˜จ ๐˜ต๐˜ฉ๐˜ข๐˜ต ๐˜ท๐˜ช๐˜ด๐˜ช๐˜ฐ๐˜ฏ ๐˜ช๐˜ฏ๐˜ต๐˜ฐ ๐˜ณ๐˜ฆ๐˜ข๐˜ญ๐˜ช๐˜ต๐˜บ ๐˜ณ๐˜ฆ๐˜ฒ๐˜ถ๐˜ช๐˜ณ๐˜ฆ๐˜ด ๐˜ฎ๐˜ฐ๐˜ณ๐˜ฆ ๐˜ต๐˜ฉ๐˜ข๐˜ฏ ๐˜ข ๐˜ฏ๐˜ฆ๐˜ธ ๐˜ข๐˜ฃ๐˜ด๐˜ต๐˜ณ๐˜ข๐˜ค๐˜ต๐˜ช๐˜ฐ๐˜ฏ. If context graphs are going to function as a system of record for reasoning, ๐—ผ๐—ฟ๐—ด๐—ฎ๐—ป๐—ถ๐˜‡๐—ฎ๐˜๐—ถ๐—ผ๐—ป๐˜€ ๐—ป๐—ฒ๐—ฒ๐—ฑ ๐˜๐—ต๐—ฒ ๐˜๐—ฒ๐—ฐ๐—ต๐—ป๐—ถ๐—ฐ๐—ฎ๐—น ๐˜€๐˜‚๐—ฏ๐˜€๐˜๐—ฟ๐—ฎ๐˜๐—ฒ ๐—ฎ๐—ป๐—ฑ ๐˜๐—ต๐—ฒ ๐—ฐ๐˜‚๐—น๐˜๐˜‚๐—ฟ๐—ฎ๐—น ๐—ฝ๐—ฟ๐—ฎ๐—ฐ๐˜๐—ถ๐—ฐ๐—ฒ๐˜€ ๐˜๐—ผ ๐˜€๐˜‚๐—ฝ๐—ฝ๐—ผ๐—ฟ๐˜ ๐˜๐—ต๐—ฒ๐—บ. ApertureDB provides the data layer. Agent frameworks provide the reasoning. Integrations provide the raw material. The harder part is ๐—ฐ๐˜‚๐—น๐˜๐˜‚๐—ฟ๐—ฎ๐—น: ๐—น๐—ฒ๐—ฎ๐—ฟ๐—ป๐—ถ๐—ป๐—ด ๐˜๐—ผ ๐—ท๐˜‚๐˜€๐˜๐—ถ๐—ณ๐˜† ๐—ฑ๐—ฒ๐—ฐ๐—ถ๐˜€๐—ถ๐—ผ๐—ป๐˜€, ๐—ฎ๐—ป๐—ป๐—ผ๐˜๐—ฎ๐˜๐—ฒ ๐—ฟ๐—ฒ๐—ฎ๐˜€๐—ผ๐—ป๐—ถ๐—ป๐—ด, ๐—ฒ๐˜ƒ๐—ฎ๐—น๐˜‚๐—ฎ๐˜๐—ฒ ๐—ผ๐˜‚๐˜๐—ฐ๐—ผ๐—บ๐—ฒ๐˜€, ๐—ฝ๐—ฟ๐—ผ๐˜๐—ฒ๐—ฐ๐˜ ๐˜€๐—ฒ๐—ป๐˜€๐—ถ๐˜๐—ถ๐˜ƒ๐—ฒ ๐—ฐ๐—ผ๐—ป๐˜๐—ฒ๐˜…๐˜, ๐—ฎ๐—ป๐—ฑ ๐˜๐—ฟ๐—ฒ๐—ฎ๐˜ ๐—ฑ๐—ฒ๐—ฐ๐—ถ๐˜€๐—ถ๐—ผ๐—ป ๐˜๐—ฟ๐—ฎ๐—ฐ๐—ฒ๐˜€ ๐—ฎ๐˜€ ๐—ณ๐—ถ๐—ฟ๐˜€๐˜โ€๐—ฐ๐—น๐—ฎ๐˜€๐˜€ ๐—ฎ๐—ฟ๐˜๐—ถ๐—ณ๐—ฎ๐—ฐ๐˜๐˜€. If we get this right, weโ€™ll build organizations where agents understand human reasoning, humans understand agent reasoning, decisions become auditable, knowledge compounds, and the ๐—ฒ๐—ป๐˜๐—ถ๐—ฟ๐—ฒ ๐—ฐ๐—ผ๐—บ๐—ฝ๐—ฎ๐—ป๐˜† ๐—ฏ๐—ฒ๐—ฐ๐—ผ๐—บ๐—ฒ๐˜€ ๐—บ๐—ผ๐—ฟ๐—ฒ ๐—ถ๐—ป๐˜๐—ฒ๐—น๐—น๐—ถ๐—ด๐—ฒ๐—ป๐˜ ๐—ผ๐˜ƒ๐—ฒ๐—ฟ ๐˜๐—ถ๐—บ๐—ฒ. 
Thatโ€™s the real trillionโ€‘dollar opportunity.... aperturedata.io/resources/contโ€ฆ
Vishakha Gupta-Cledat tweet media
0
2
5
117
ApertureData @ApertureData
This collaboration also gave us an opportunity to launch UI support for our text vector search features!
Vishakha Gupta-Cledat @vishakha041

Recently, the team at iSonic.ai (Praneel Panchigar, Nisshutosh Sharma, Torlach Rush, and Ankesh Kumar) decided to move from MongoDB to ApertureDB for their text- and metadata-heavy workloads and replaced a vector-only setup that struggled under real retrieval patterns. @isonic_ai builds AI assistants that turn a creator's entire content library into an interactive experience that answers questions, surfaces the right material, and helps creators monetize more effectively. What stood out wasn't just that they switched to ApertureDB, it was why. In their benchmarks, ApertureDB consistently outperformed Chroma on retrieval, but performance was only part of the story. They needed reliability under load, richer metadata models without arbitrary limits, and a clean path toward graph-based retrieval as their product evolves. And they valued choosing infrastructure that won't constrain them if they need to introduce other modalities. This journey mirrors what we see across teams building real production systems.
Vector search is a powerful building block, but very few workloads stay vector-only for long. As products mature, teams start needing: • metadata that can grow without ceilings • relationship-aware retrieval • graph-structured context • and the option to go multimodal when the time is right It's encouraging to see teams make that transition intentionally, not because a tool is trendy, but because their workloads demand more flexibility, reliability, and room to grow. Curious how others are thinking about this shift as their retrieval and data layers mature. @ApertureData
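The progression the thread describes, vector-only retrieval outgrown by metadata and relationship needs, can be sketched with a toy example. This is pure Python and purely illustrative (the item fields and the `filtered_search` helper are invented for this sketch, not any product's API): metadata narrows the candidate pool first, then similarity ranks the survivors.

```python
import math

# Toy corpus: each item carries an embedding plus metadata (hypothetical fields).
ITEMS = [
    {"id": 1, "tag": "talk",  "year": 2024, "emb": [1.0, 0.0]},
    {"id": 2, "tag": "talk",  "year": 2025, "emb": [0.9, 0.1]},
    {"id": 3, "tag": "paper", "year": 2025, "emb": [0.0, 1.0]},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def filtered_search(query_emb, k, **constraints):
    """Metadata filter first, then rank the survivors by cosine similarity."""
    pool = [it for it in ITEMS
            if all(it.get(f) == v for f, v in constraints.items())]
    pool.sort(key=lambda it: cosine(query_emb, it["emb"]), reverse=True)
    return [it["id"] for it in pool[:k]]

print(filtered_search([1.0, 0.0], k=2, year=2025))  # → [2, 3]
```

Once the metadata model grows (nested fields, relationships between items), this filter step is exactly what pushes teams past a vector-only store.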

0
0
2
97
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
I was recently asked about my predictions for vector databases over the next year, and the more I think about it, the clearer one thing feels. Vector databases alone have never been enough. There is still room to innovate on algorithms and scale, but the bigger shift is how we think about data management as a whole. Moving beyond text-only workloads toward truly multimodal data. Treating memory as a first-class concept for the future of AI agents. And questioning whether throwing more compute or long context at the problem is actually the right long-term answer. What's surprised me most is how little emphasis there's been on efficiency. We've been comfortable scaling power and cost, instead of demanding better system design and more thoughtful infrastructure choices. Vector search is just one part of the equation. The real inflection point happens when teams need to support scale, multimodal data, and relationship-aware queries in a way that traditional systems weren't designed for. Feels like we are entering a phase where the real differentiation won't come from models alone, but from how well we design memory, infrastructure, and efficiency into the stack (@ApertureData). Curious how you are thinking about this as you plan for the year ahead.
Vishakha Gupta-Cledat tweet media
0
1
2
95
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
A full day at the Post-Industrial Summit, and the signal was unmistakable: it's not if we deploy agents in every org but when, and how systematically. (Plus I had the chance to chair the session on Organizational Intelligence, Context Graphs, and Hyper-Adaptive Enterprises with such insightful speakers as Prukalpa Sankar, Dipanjan Ghosh, and Tatyana Mamut.) Across conversations, the shift was clear: We're moving from "Can we automate this?" to "What does automation change about how we work, govern, and coordinate?" Themes that kept surfacing: • Everyone is automating something, but the real leverage is in supervisor agents • Context is queen, but culture has to be encoded too • Evals and observability are finally first-class • As hallucinations drop, the bottleneck shifts to system design and governance • Bounded autonomy + control loops are becoming productivity tools • Cross-org agent execution? Not yet • Agent sprawl is real • High-value agents are coming fast The spicy takes that made people sit up: • "Our teams aren't allowed to code anymore, they can only train models." Fascinating, but hard to imagine for infrastructure where correctness and reliability matter as much as intelligence. • "Models evolve, compute gets better, humans struggle to change." A brutally accurate observation. • "No need to centralize data, let each org's agents handle their own." Beautiful in theory with the right super agent in charge. In large enterprises with fragmented data estates, this requires deep structural change. I thought about these spicy takes and the common theme running through them, and all these points took me back to the enigma of memory for agents! If agents are going to supervise other agents, operate across modalities, and make decisions with bounded autonomy, persistent multimodal memory becomes the real substrate: not just retrieval, but continuity. Follow along on what we are building at @ApertureData to see how we achieve this! Appreciated this opportunity from the Post Industrial Institute team.
Vishakha Gupta-Cledat tweet media (3 images)
0
1
2
66
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
** Multimodal Infra FAQs ** Choosing the right infrastructure can save you months, if not years, of heartache and wasted money. But doing it for AI, especially as you work on building production AI systems, is not easy. Here are some fundamental multimodal infrastructure Q&As worth bookmarking as a checklist if you're doing AI development beyond tabular data (which incidentally is a modality too!). Q: What is multimodal data? A: I wish people would ask me that before assuming that if you don't work with images or videos, you don't have multimodal data. Nope! Technically, as soon as you start using more than one type or "mode" of data, say text + tables, you are already multimodal. Q: Is there a reason to tie metadata, embeddings, and raw assets together? Can you query them together? A: Search and retrieval always require you to consider multiple factors. Sometimes you know the exact metadata value to look for, e.g. give me all pictures of Scarlett Johansson. But sometimes you only have a vague recollection: show me all movies with "blue aliens". You want to find your data assets nonetheless. If the data layer doesn't stitch them together, your workflows will always be stitched together by hand and hard to scale. And yes, it can be done, with the right infrastructure in place. Q: Why can I not just convert everything to text and use it? A: When you listen to a dialog or watch a scene, is it just words that shape your understanding of the situation, or also the tone, the sentiment, and the emotions? These are hard to capture in a transcript. There is also an inherent structure within other modalities, e.g. various scenes in a video, diagrams in a document, and so on. Capturing and understanding the modality can give you a much better and more correct view of what you are dealing with. Q: Can you actually visualize multimodal data? A: This is quite a valuable capability, to be honest.
Viewing PDFs, images, and video frames directly speeds up debugging and iteration significantly. And yes, with the right representation and UI, it is possible. Q: What happens when your dataset grows into millions of objects? A: Builders often ask about concurrency, hardware memory behavior, and performance, especially under sustained load. Underlying object stores are very well optimized for high throughput. The right representation and parallel compute setup can create a scalable platform to handle multimodal data for AI usage patterns. Q: Can a platform built for multimodal AI support agentic memory requirements? A: Short-term vs long-term retrieval, cross-modality context, and persistence patterns are quickly becoming real requirements, and yes, it is possible to build such a layer on the right data foundation. These are the kinds of questions that reveal whether a system can support true multimodal AI in production. If you are evaluating infrastructure in this space, they are a solid starting point. (Check out @ApertureData to see how all this is done.)
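The "query them together" idea from the FAQ can be made concrete with a sketch of a single request that pairs a metadata constraint with a vector search. The command name (`FindWithVector`) and field names below are invented for illustration, loosely modeled on JSON-style database query languages; they are not ApertureDB's actual API, so consult the real docs before building on this shape.

```python
# Hypothetical command/field names: one JSON payload that filters by metadata,
# then ranks the surviving items by embedding similarity, and returns both
# metadata and a handle to the raw asset.
def build_query(collection, constraints, query_vector, k=5):
    """Build a single combined filter + vector-search request."""
    return [{
        "FindWithVector": {
            "set": collection,
            "constraints": constraints,            # e.g. {"genre": ["==", "sci-fi"]}
            "k_neighbors": k,
            "vector": query_vector,
            "results": {"list": ["title", "uri"]},  # metadata + raw-asset pointer
        }
    }]

payload = build_query("movies", {"genre": ["==", "sci-fi"]}, [0.1, 0.9], k=3)
```

The point of the shape is atomicity: because the constraint and the neighbor search travel in one request, the server can filter before ranking instead of the client stitching two systems together.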
Vishakha Gupta-Cledat tweet media
0
1
3
68
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
As a data product we had to move fast in 2025; after all, the market required it, moving from vanilla RAG to graph RAG to Agentic RAG, each within a few months. Now, 2026 is off to a great start at @ApertureData with petabytes of multimodal data, increasing belief in the power of knowledge graphs, and enterprise-scale agentic memory use cases to deploy on the foundation we have built in ApertureDB. Check out how our year went and what's coming in 2026: aperturedata.io/resources/refl…
Vishakha Gupta-Cledat tweet media
0
1
4
92
ApertureData @ApertureData
What's also interesting are the lessons around the role of SQL as outlined towards the end of the blog! JSON made functionality more expressible across data modalities while SQL and MCP plug-ins helped offer compatibility...
Vishakha Gupta-Cledat @vishakha041

MLOps Community - Beyond SQL: The Query Language Multimodal AI Really Needs - Why multimodal AI broke SQL, and why ApertureDB went JSON-first instead. Live now on @mlopscommunity. Learnt at @ApertureData home.mlops.community/public/blogs/b… #MLOpsCommunity

0
0
3
84
ApertureData @ApertureData
What went right with AI in 2025? What didn't get enough attention in 2025 but should (and will) pick up momentum in 2026? This final newsletter edition of the year brings together insights from AI leaders across the industry on how adoption matured, why Agentic AI took off, and what must change in 2026 to move from experimentation to impact. Sharing the full state of the union here: linkedin.com/pulse/19-apert… Thank you for your thoughtful responses - Matthias Spycher, Stephanie Cannon, Manasvi Sharma, and Yuanbo Wang!
0
2
6
1.8K
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
In Part 1, @ApertureData valued community member @ayesha_imr focused on the data foundation: designing a multimodal graph in ApertureDB that lets an AI agent understand and traverse conference content using natural language. That structure set the stage for something deeper: giving the agent the tools it needs to reason. Part 2 is where she demonstrates how those query patterns evolve into seven well-defined tools, and a LangGraph-based ReAct agent learns how to combine them to interpret real user intent, to help you navigate complex data collections and gather the insights you need. In this case, from conference data shared by @MLOpsWorld. Here are the top three takeaways from this phase: • Tool design determines agent capability. The seven tools define what the agent can truly accomplish, more than swapping in a larger LLM ever could. • Unified graph + vector + metadata storage matters. ApertureDB (from ApertureData) enables constrained semantic search (filter → embedding search) in a single atomic query, simplifying the entire workflow. • Good examples make better agents. Detailed few-shot examples significantly improved tool selection, parameter accuracy, and multi-step reasoning. The result is an AI agent that can answer complex, multi-part questions about years of MLOps conference content with accuracy and context.
Blog and agent link in comments below @langchain
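The "tool design determines agent capability" takeaway can be sketched without any framework: each tool is a named, described function, and a router picks one from the user's question. The real post uses LangGraph and a ReAct agent with an LLM doing the selection; the tool names and the keyword router below are stand-ins invented for this sketch.

```python
# Minimal, framework-free sketch of a tool registry for a retrieval agent.
# (Illustrative only: the real system uses LangGraph + ReAct with LLM routing.)
TOOLS = {}

def tool(name, description):
    """Decorator registering a function as a named agent tool."""
    def register(fn):
        TOOLS[name] = {"fn": fn, "description": description}
        return fn
    return register

@tool("find_talks_by_topic", "Semantic search over talk transcripts")
def find_talks_by_topic(topic):
    return f"searching transcripts for '{topic}'"

@tool("find_talks_by_speaker", "Exact-match lookup on speaker/org metadata")
def find_talks_by_speaker(speaker):
    return f"filtering talks where speaker == '{speaker}'"

def route(question):
    """Crude intent routing: a stand-in for the LLM's tool-selection step."""
    return ("find_talks_by_speaker" if " by " in question
            else "find_talks_by_topic")

print(route("Show presentations on RAG by Databricks engineers"))
# → find_talks_by_speaker
```

Whatever replaces the toy router, the agent can only ever do what its registered tools allow, which is the takeaway's point.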
Vishakha Gupta-Cledat tweet media
1
1
6
294
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
As someone who's often invited to speak at events, and attends my fair share of conferences, I know firsthand how hard it is to find the right talk, the right clip, or even the right year something happened. Slides live in one place, videos in another, transcripts somewhere else. Great content gets created, but not always surfaced! That's the problem we set out to solve with our new AI Query Agent. Give it a try at lnkd.in/gVNk7cWi and then head over to the blog to read how we set up the data layer for it. In this first blog, @ayesha_imr talks about how we partnered with the 6th Annual @MLOpsWorld | GenAI Summit committee to build a structured, multimodal data foundation that spans three years of conference content: titles, descriptions, transcripts, speaker info, and videos, all stored inside ApertureDB from @ApertureData. With that architecture in place, the agent can answer questions like: "Which talks covered AI agents with memory?" "Show presentations on RAG by Databricks engineers." No manual digging. Just natural language → precise retrieval (isn't that what we all like to do nowadays). This first post in the series focuses on the data layer, and the schema and graph design choices that make intelligent query decomposition possible, as the effectiveness of any agent is ultimately limited by the structure of its data. Stay tuned for Part 2, where we will explore how these patterns turn into tools inside a LangGraph-based ReAct agent.
Vishakha Gupta-Cledat tweet media
0
5
9
811
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
When you hear "multimodal data for AI," do you immediately think "embeddings", or "hey, it doesn't include my text data"? Most people do. But multimodality is so much more than generating embeddings from different data types and running approximate searches. • It's about understanding structure: the layout in a PDF, the relationships between fields, the hierarchy of metadata. • It's about visualization: actually seeing the document, the image, the table, the text block you're working with. • It's about navigation: moving through components, discovering how each element connects to its metadata and to its embedding space. Most importantly, real-world multimodal data rarely fits in a single node's memory. That's where the real engineering challenges show up. We have been thinking deeply about this at @ApertureData as we support PDF viewing, multimodal processing, and semantic search in one place, and this distinction keeps coming up. If "multimodal" still feels like "just embeddings," or relevant only when you have more than text, you are only seeing 20% of the picture.
Vishakha Gupta-Cledat tweet media
0
1
4
58
ApertureData @ApertureData
tldr; A few years ago, I kept seeing the same scene play out in every AI team I met. A room full of brilliant engineers, state-of-the-art models, and GPUs humming in the background, yet half the day was spent moving data between systems. Images in one store, metadata in another, embeddings somewhere else. Endless connectors. Fragile pipelines. It wasn't a lack of innovation, it was the architecture. The more multimodal our AI systems became, the more disconnected the underlying data got. That's when we started asking a simple question: what if the database itself understood multimodal data? What if it treated documents, images, videos, embeddings, and metadata as first-class citizens, not as files to be passed around? That question became ApertureDB (from @ApertureData), a database built for the next era of AI, where agents need to see, connect, and reason in the multimodal world. Intelligence isn't just retrieval; it is understanding the relationships that tie information together: context through relationships and hierarchies. That's the mindset guiding every product decision we make. And because I believe in sharing what we learn: • Keep asking: What relationships do I expect my models/agents to reason over?
• Challenge yourself: Does my data layer allow me to query those relationships as easily as I query "select * from …"?
ApertureData tweet media
0
1
4
63
ApertureData retweeted
Vishakha Gupta-Cledat @vishakha041
Testing the limits of the product and imagining what is possible are what drive a startup in the right direction. At this month's HackerSquad by Developer Events Hack Day in SF, we saw both happening in real time. Some builders went straight into the technical depth. They wanted to understand how ApertureDB handles querying metadata, embeddings, and raw frames together, what happens as datasets scale, and how concurrency, memory, and performance behave under real workloads. These are the kinds of questions that push us to refine and harden the system at @ApertureData. Others looked at the same platform and immediately started imagining. Multimodal search across PDFs, images, video, and text sparked ideas around knowledge graphs, video semantic search, genomic embeddings, dataset prep, cloud ingestion, and more. Seeing all modalities connected in a single system opens doors quickly. Both are equally important. One keeps us honest. The other keeps us inspired. Thank you to @itsajchan and the HackerSquad community for bringing together a room full of builders who do both so naturally.
Vishakha Gupta-Cledat tweet media (4 images)
0
1
4
174