Bruce Dando | SAP Data & Intelligence

821 posts

Bruce Dando | SAP Data & Intelligence

@brucedando_

20+ yrs turning SAP chaos into clarity. Helping enterprises get AI-ready (system, process, data). Practice Dir @Emphasys | SAP Gold Partner

Australia · Joined September 2017
197 Following · 114 Followers
Pinned Tweet
Bruce Dando | SAP Data & Intelligence
I just built a full SAP ESG dashboard mockup for a client in under 60 seconds. Here's how 👇

The workflow:
1. Grabbed the official SAP S/4HANA Web UI Kit + SAP Analytics Cloud Dashboard Hi-Fi Kit from GitHub
2. Linked them into Figma as design libraries (free - SAP provides these publicly)
3. Connected Figma to Claude via the MCP connector
4. Asked Opus 4.6 one prompt: build me an ESG dashboard using the SAP Fiori Horizon design language for a mid-market client

What it generated:
→ Full SAP Fiori shell bar, tab navigation, KPI tiles with sparklines
→ 4 tabs: Overview, Environmental, Social, Governance
→ GHG emissions tracking with Scope 1/2/3 breakdowns
→ EcoVadis scoring, net-zero target progress rings
→ DE&I metrics, board composition, compliance frameworks
→ All using the correct Horizon Morning theme palette and SAP design patterns

The model understood SAP Sustainability Control Tower layout patterns, CSRD/ESRS reporting structures, and standard ESG KPIs — then assembled them into a production-quality interactive React component. No templates. No drag and drop. One prompt.

This changes the game for SAP consultants doing pre-sales. Instead of spending days mocking up what a solution could look like, you can generate a client-ready demo in a conversation.

The Figma → Claude → artifact pipeline is genuinely the fastest path from design system to working prototype I've ever seen.
[Tweet includes 4 images]
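For a sense of what the generated artifact is built from, here's a hand-written sketch of a single KPI tile with a sparkline, roughly the smallest unit of that dashboard. The component name, props, and colour values are my assumptions for illustration, not the model's actual output; in the real artifact the theme tokens would come from the SAP design libraries.

```tsx
// Illustrative sketch of one KPI tile with a sparkline, hand-written for scale.
// Not the generated artifact; names, props, and colours are assumptions.
import * as React from "react";

interface KpiTileProps {
  title: string;   // e.g. "Scope 1 Emissions"
  value: string;   // formatted headline figure, e.g. "12.4 kt CO2e"
  trend: number[]; // recent values for the sparkline
}

export function KpiTile({ title, value, trend }: KpiTileProps) {
  // Scale trend values into a 100x24 viewBox for a simple polyline sparkline.
  const max = Math.max(...trend);
  const min = Math.min(...trend);
  const range = max - min || 1;
  const denom = Math.max(trend.length - 1, 1);
  const points = trend
    .map((v, i) => {
      const x = (i / denom) * 100;
      const y = 24 - ((v - min) / range) * 24;
      return `${x},${y}`;
    })
    .join(" ");

  return (
    <div style={{ border: "1px solid #d5dadd", borderRadius: 8, padding: 16, width: 220 }}>
      <div style={{ fontSize: 13, color: "#556b82" }}>{title}</div>
      <div style={{ fontSize: 24, fontWeight: 600 }}>{value}</div>
      {/* Approximate Horizon-style blue; the real theme token comes from the design library */}
      <svg viewBox="0 0 100 24" width={180} height={36}>
        <polyline points={points} fill="none" stroke="#0070f2" strokeWidth={2} />
      </svg>
    </div>
  );
}

// Usage: <KpiTile title="Scope 1 Emissions" value="12.4 kt CO2e" trend={[14, 13.6, 13.1, 12.8, 12.4]} />
```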
English
2
0
4
245
Bruce Dando | SAP Data & Intelligence
At a finance transformation summit in Sydney, and I'm genuinely surprised how far behind many companies are. People here are amazed by RPA for AR processes - that was 10 years ago!
English
0
0
0
10
Bruce Dando | SAP Data & Intelligence
Three acquisitions in 8 weeks. Reltio (golden records), Dremio (open lakehouse), Prior Labs (tabular AI). This is SAP building the complete data-to-AI pipeline. The Dremio play is the most significant. BDC becomes Iceberg-native, meaning SAP and non-SAP data federated in an open format. No ETL, no replication, no proprietary lock-in. Plus MCP integration already built in for any AI agent framework. Prior Labs is the longer bet. Tabular Foundation Models built for structured business data, not text. Most business decisions come from tables, not documents. LLMs are the wrong tool for that. Herzig's quote nails it: "Enterprise AI doesn't stall because the models aren't good enough; it stalls because the data isn't ready."
English
0
1
2
44
Bruce Dando | SAP Data & Intelligence
SAP's Double Acquisition: Dremio + Prior Labs. What It Means for Data Teams.

SAP just announced two acquisitions in one day. This changes the data platform story significantly.

ACQUISITION 1: DREMIO. Agentic Data Lakehouse
Dremio is an Apache Iceberg-native data platform built for AI agents. SAP is acquiring it to transform Business Data Cloud into an Iceberg-native enterprise lakehouse.

What Dremio brings:
→ High-performance query engine that federates across SAP and non-SAP data without ETL
→ Apache Iceberg open format, no proprietary lock-in
→ AI Semantic Layer that adds business context so every agent draws from the same source of truth
→ Self-managing platform: automated clustering, compaction, and query optimisation
→ MCP integration already built in, so any LLM or AI agent framework can access enterprise data directly

Why this matters: Until now, BDC Connect enabled zero-copy sharing with Databricks, Snowflake, Google Cloud, and Fabric. Dremio goes further: it makes BDC itself an open lakehouse that can unify SAP and non-SAP data natively, without relying on external platforms.

ACQUISITION 2: PRIOR LABS. Tabular Foundation Models
Prior Labs is a frontier AI research lab focused on structured business data. SAP is investing over €1 billion to scale it over four years.

What Prior Labs brings:
→ Tabular Foundation Models (TFMs): AI built specifically for structured data like spreadsheets, financial records, and supply chain logs
→ Moves beyond correlation to causation: not just predicting what might happen, but explaining why
→ Conversational layer for business users to query datasets and test assumptions without technical expertise
→ Will integrate into SAP AI Core, Business Data Cloud, and Joule

Why this matters: LLMs are designed for text. Most business decisions come from tables. Prior Labs builds AI that actually understands structured enterprise data, the kind that lives in SAP systems.

THE COMBINED PICTURE
In 8 weeks, SAP has acquired three companies that complete the BDC data platform:
→ Reltio - golden records and master data unification (announced March)
→ Dremio - open data lakehouse for SAP + non-SAP data federation (today)
→ Prior Labs - AI models purpose-built for structured business data (today)

Add the Google Cloud/Gemini partnership and the existing Databricks, Snowflake, and Fabric integrations, and BDC is no longer just an SAP data warehouse. It's becoming a genuine enterprise data platform.

SAP CTO Philipp Herzig put it clearly: "Enterprise AI doesn't stall because the models aren't good enough; it stalls because the data isn't ready for AI agents."

WHAT CUSTOMERS SHOULD DO NOW
The platform capabilities just expanded significantly. But neither Dremio nor Prior Labs will be production-ready inside BDC until after the Q3 2026 close. Use this time to:
→ Assess your current data fragmentation across SAP and non-SAP systems
→ Start creating governed data products in Datasphere
→ Evaluate your master data readiness for Reltio integration
→ Identify your first AI use case for structured data prediction

Sapphire is 7 days away. Expect detailed roadmaps and integration timelines for all three acquisitions.

What's your reaction to the double acquisition?
Bruce Dando | SAP Data & Intelligence tweet media
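To make "federates across SAP and non-SAP data without ETL" concrete, here's a minimal sketch of what a federated lakehouse query could look like. The endpoint, catalog names, and table names are hypothetical, not Dremio's or BDC's actual API; the point is that a single SQL statement joins SAP-originated data with an external Iceberg table in place, with no replication step.

```typescript
// Hypothetical sketch of a federated lakehouse query: one SQL statement joins
// SAP-originated sales data with a non-SAP Iceberg table, with no ETL or copy.
// The endpoint, catalog names, and table names are illustrative only.

const FEDERATED_SQL = `
  SELECT s.company_code,
         s.fiscal_period,
         SUM(s.net_revenue)        AS net_revenue,
         SUM(e.scope1_tonnes_co2e) AS scope1_emissions
  FROM   sap_bdc.finance.sales_actuals      AS s   -- SAP-side data product
  JOIN   lakehouse.sustainability.emissions AS e   -- non-SAP Iceberg table
         ON s.company_code = e.company_code
        AND s.fiscal_period = e.fiscal_period
  GROUP  BY s.company_code, s.fiscal_period
`;

async function runFederatedQuery(): Promise<void> {
  // Hypothetical query API: a single endpoint plans the join across both
  // sources and returns the combined result, no staging tables involved.
  const response = await fetch("https://lakehouse.example.com/api/query", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sql: FEDERATED_SQL }),
  });
  const rows = await response.json();
  console.table(rows);
}

runFederatedQuery().catch(console.error);
```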
English
0
0
1
70
Bruce Dando | SAP Data & Intelligence
SAP Analytics Cloud has a built-in AI feature most teams never turn on: Smart Insights.

What it does: automatically analyses your data and surfaces anomalies, trends, and correlations. No manual drill-down. No analyst intervention.

How to use it:
1. Add a chart to your story
2. Right-click any data point
3. Select "Smart Insights"
4. SAC analyses every dimension and tells you what's driving the variance

Example: revenue dropped 15% in APAC. Smart Insights shows you it's driven by one product category in one country, not a regional trend. Saves hours of manual investigation.

Where it works best:
-> Variance analysis (actuals vs budget)
-> Trend detection across large datasets
-> Identifying outliers that humans would miss
-> Executive dashboards where users need explanations, not just numbers

Where it breaks:
-> Messy data models with inconsistent hierarchies
-> Too many dimensions with low cardinality (noise, not signal)
-> Models without clear measure/dimension separation
-> Users who don't trust AI suggestions (change management, not technology)

Smart Insights doesn't replace analysis. It tells you where to look first.

The best SAC dashboards combine live BW data, cascading filters, and Smart Insights. Real-time numbers, clean navigation, and automatic explanations.
Bruce Dando | SAP Data & Intelligence tweet media
English
0
0
0
79
Bruce Dando | SAP Data & Intelligence
The AI-driven forecasting in SAC is underused. Smart Insights is the feature most teams haven't explored yet. It automatically analyses your data and surfaces anomalies, trends, and correlations without the user asking. Drop a chart into a story, right-click, and Smart Insights tells you which dimensions are driving the variance. Where it gets powerful: combine it with live BW connections. Real-time actuals from BW, with SAC automatically highlighting what changed and why. No analyst intervention, no manual drill-down. The limitation: Smart Insights is only as good as the data model underneath. Clean dimensions, consistent hierarchies, and well-structured measures. If the data is messy, the insights are misleading.
English
0
0
0
33
Bruce Dando | SAP Data & Intelligence
SAP Analytics Cloud has two types of models. Most teams pick the wrong one.

Planning models vs import models: when to use which.

PLANNING MODELS:
-> Use when business users need to write data back (budgets, forecasts, scenarios)
-> Support data actions, versions, allocations, and approval workflows
-> Allow manual input, copy/paste from Excel, and scheduled data loads
-> Best for: financial planning, headcount planning, demand forecasting

IMPORT MODELS:
-> Use when you just need to visualise data from external sources
-> Data is read-only. No write-back, no planning features
-> Faster to set up, simpler to maintain
-> Best for: operational dashboards, KPI reporting, ad-hoc analysis

WHERE TEAMS GO WRONG:
-> Building a planning model when they only need reporting (adds unnecessary complexity)
-> Building an import model then realising finance needs to edit forecasts (forces a rebuild)
-> Not using live connections to BW/4HANA when the data is already there (duplicates data into SAC)

THE DECISION FRAMEWORK:
Do users need to write data? -> Planning model
Does the data already exist in BW? -> Live connection (no model needed)
Is it read-only from a non-BW source? -> Import model

Get this right at the start. Changing model type later means rebuilding every story that uses it.
Bruce Dando | SAP Data & Intelligence tweet media
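The decision framework above is simple enough to write down as code. A minimal sketch, assuming nothing about SAC itself: the function name, input fields, and return strings are illustrative, it just encodes the three questions in order.

```typescript
// Illustrative only: encodes the SAC model-type decision framework above.
// None of these names correspond to a real SAP API.

interface ModelRequirements {
  usersNeedWriteBack: boolean; // budgets, forecasts, scenario input
  dataAlreadyInBw: boolean;    // available via a live BW/4HANA connection
}

type Recommendation =
  | "Planning model"
  | "Live connection (no SAC model needed)"
  | "Import model";

function recommendSacModel(req: ModelRequirements): Recommendation {
  // Question 1: do business users need to write data back?
  if (req.usersNeedWriteBack) return "Planning model";

  // Question 2: does the data already exist in BW? Prefer a live connection
  // over duplicating data into SAC.
  if (req.dataAlreadyInBw) return "Live connection (no SAC model needed)";

  // Otherwise it is read-only data from a non-BW source.
  return "Import model";
}

// Example: read-only KPI reporting from a non-BW source.
console.log(recommendSacModel({ usersNeedWriteBack: false, dataAlreadyInBw: false }));
// -> "Import model"
```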
English
0
0
0
29
Bruce Dando | SAP Data & Intelligence
Good overview. One thing worth adding to the Datasphere characteristics: the Data Marketplace. It's the feature that turns Datasphere models into discoverable, governed data products that other teams can find and consume. Without the Marketplace, you build great models that nobody outside your team knows exist. With it, you create a self-service catalogue where business teams can discover data, understand what it contains, and request access through governed workflows. It's also the foundation for AI. Joule agents need to discover governed data products to consume. If the data isn't published to the Marketplace with semantic labels and business context, the agent can't find it.
English
0
0
0
3
Bruce Dando | SAP Data & Intelligence
This is the best technical summary of the partnership I've seen. The key detail: BigQuery + SAP BDC for zero-copy data. This means SAP customer data stays governed in Datasphere while Google Cloud agents consume it directly. No replication, no sync issues. This is now the fourth major BDC zero-copy integration: Databricks, Snowflake, Microsoft Fabric (roadmap), and Google Cloud/BigQuery. The pattern is clear: BDC is becoming the connective tissue for the entire enterprise data ecosystem. The practical question: multi-agent orchestration between Joule and Gemini requires governed, consistent data on both sides. Most customers need data products in Datasphere and clean master data (Reltio) before this architecture delivers value.
English
0
0
0
30
Inoltra@Inoltra_co·
SAP & Google Cloud are bringing agentic AI to CX: Joule Agents for campaign execution, BigQuery + SAP BDC for zero-copy data, Gemini Enterprise for multi-agent orchestration, and SAP Engagement Cloud for personalised lifecycle activation. Learn more: sap.to/6046BB8NIM
Inoltra tweet media
English
1
0
1
31
Bruce Dando | SAP Data & Intelligence
The "Internal AI Adoption - We are customer zero" stats are telling. ~30% developer productivity uplift and 100% of support tickets touched by AI shows SAP is eating their own cooking. But the stat I'm watching for at Sapphire is external customer adoption. ~230 implementations and 140+ customers is a start, but against SAP's installed base, that's still early. The "embedded domain knowledge" theme is the right focus. An SAP AI agent that understands revenue recognition rules and fiscal year variants is fundamentally different from a generic LLM querying tables. But that domain knowledge layer only works when the data underneath is governed through data products in Datasphere. €2bn pipeline influenced is a strong signal. The question is conversion.
English
1
1
0
250
Holger Müller #EnterpriseAcceleration
.@SAP Sapphire 2026 themes: AI agent accuracy, embedded domain knowledge and processes bit.ly/4cK3ykh SAP is planning to revamp its portfolio to "infuse deep domain know-how into SAP's AI agents" at its SAP Sapphire conference in Orlando.
English
1
0
1
120
Bruce Dando | SAP Data & Intelligence
SAP Q1 2026 results are in. The numbers that matter for data teams:
→ Cloud revenue up 27% (constant currencies)
→ Cloud ERP Suite up 30%
→ Cloud backlog €21.9B, up 25%
→ Reltio consolidation in full-year guidance

What this means: S/4HANA Cloud is accelerating. More transactional data in cloud-native SAP environments. Your data architecture needs to account for this.

Reltio in the guidance means golden records inside BDC are coming sooner than expected. Factor this into your MDM strategy now.

SAP says "Business AI delivering real outcomes." Expect Sapphire (11 days away) to push Joule production case studies. Thomas Saueressig's new role as Chief Customer Officer shows SAP tightening the link between licensing and actual implementation.

27% cloud growth is impressive. The question: how many customers have the data foundations to use what's already available? That gap is where the real work happens.
English
0
0
1
78
Bruce Dando | SAP Data & Intelligence
You built a data model in SAP Datasphere. But if nobody else in the organisation knows it exists, it's not a data product. It's a well-kept secret.

The Data Marketplace is what changes this.

When you publish a data product to the Marketplace:
-> Other teams can find it by searching
-> They see what it contains, who owns it, how often it refreshes
-> They request access through governed workflows
-> They consume it in SAC, Databricks, Snowflake, or Fabric

When you don't publish:
-> Your team built something useful. Nobody knows.
-> Another team builds the same thing from scratch.
-> AI agents can't find it because it's not catalogued.

This also matters for Joule. AI agents need to discover and consume governed data products. If your data isn't in the Marketplace with semantic labels and business context, the agent can't find it.

Build the model. Publish it. Document it. Make it discoverable. That's what turns a dataset into a data product.
Bruce Dando | SAP Data & Intelligence tweet media
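To illustrate what "published with semantic labels and business context" means in practice, here's the kind of metadata a discoverable data product needs to carry. This is an illustrative TypeScript shape, not the actual Datasphere or Data Marketplace schema; every field name is an assumption.

```typescript
// Illustrative only: the kind of metadata that makes a dataset discoverable
// as a data product. Not the real Datasphere / Data Marketplace schema.

interface DataProductDescriptor {
  name: string;
  description: string;      // what it contains, in business language
  owner: string;            // accountable team or person
  refreshCadence: "hourly" | "daily" | "weekly";
  semanticLabels: string[]; // business context an AI agent can search on
  accessWorkflow: string;   // how consumers request access
  consumableIn: string[];   // downstream platforms
}

const ghgEmissions: DataProductDescriptor = {
  name: "GHG Emissions by Legal Entity",
  description: "Scope 1/2/3 emissions, monthly, reconciled to financial entities",
  owner: "Group Sustainability Reporting",
  refreshCadence: "daily",
  semanticLabels: ["ESG", "GHG emissions", "Scope 1", "Scope 2", "Scope 3", "CSRD"],
  accessWorkflow: "Request via governed approval workflow",
  consumableIn: ["SAC", "Databricks", "Snowflake", "Fabric"],
};
```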
English
0
0
0
23
Bruce Dando | SAP Data & Intelligence
Embedded domain knowledge is the right focus. An SAP AI agent that understands revenue recognition rules, intercompany eliminations, and fiscal year variants is fundamentally different from a generic LLM querying tables. But domain knowledge in the agent layer only works when the data layer is governed. Joule needs to discover governed data products through the Data Marketplace, not navigate raw SAP tables with cryptic field names. The architecture: Reltio for golden records, Datasphere for governed data products published to the Marketplace, Joule for domain-aware AI agents consuming that trusted data. Most customers I work with are 12-24 months of data foundation work away from being ready. That's the preparation worth doing before Sapphire.
English
0
0
0
30
Michael Ni 倪孟堅@mikeni·
.@SAP Sapphire 2026 themes to be ready for: AI agent accuracy, embedded domain knowledge and processes bit.ly/4cK3ykh SAP is planning to revamp its portfolio to "infuse deep domain know-how into SAP's AI agents" at its SAP Sapphire conference in Orlando.
Michael Ni 倪孟堅 tweet media
English
1
3
1
272
Macy Mills@_CallMeMacy·
Lately, I've been advising founders to vibecode their own CRM that reads their emails, calendars, and keeps things up to date in real time. It took me 2 hours to create my own using Claude Code (and I've never coded in my life). Am I crazy for suggesting this or do others agree?
English
197
3
267
58.4K
Bruce Dando | SAP Data & Intelligence
SAP Sapphire 2026 is 13 days away. If you're in a data or analytics role, here's what to prepare before the event:

Assess your data foundation
Do you have governed data products in Datasphere? Is your master data clean enough for AI agents? If not, Sapphire announcements will be aspirational, not actionable.

Map your BDC architecture
Which workloads stay in BW/4HANA? What moves to Datasphere? Where does Snowflake, Databricks, Google Cloud, or Fabric fit? Have this mapped before you attend.

Identify your first AI use case
Don't wait for Sapphire to start planning. The foundation work takes months. Pick one process, one dataset, one agent.

Prepare specific questions
"Tell me about AI" sessions won't help. Ask: How does Reltio integrate with existing MDG? When does Fabric BDC Connect go GA? What's the Joule agent deployment path for finance?

Track the data themes
AI agent accuracy. Embedded domain knowledge. Multi-agent orchestration (Google Cloud). Golden records (Reltio). These are the threads that connect everything SAP announced in the last 3 months.

The organisations that get the most from Sapphire arrive with a plan, not just a badge.
Bruce Dando | SAP Data & Intelligence tweet media
English
0
0
1
49
Bruce Dando | SAP Data & Intelligence
"AI agent accuracy" and "embedded domain knowledge" are the right themes. This is SAP addressing the 3% production adoption gap for Joule. The domain knowledge piece is what makes SAP's AI story different from generic LLM platforms. An SAP agent that understands revenue recognition rules, intercompany eliminations, and fiscal year variants is fundamentally more useful than one that just queries tables. But embedding domain knowledge into agents only works when the data underneath is governed. A Joule agent with deep financial process knowledge still fails if the master data is duplicated, the hierarchies are inconsistent, or the currency tables aren't maintained. Expecting Sapphire to clarify how Reltio's golden records, Datasphere data products, and Joule's domain knowledge fit together architecturally. That's the integration that matters.
English
0
0
0
64
Constellation Research@constellationr·
.@SAP Sapphire 2026 themes: AI agent accuracy, embedded domain knowledge and processes zurl.co/CQSH9 SAP is planning to revamp its portfolio to "infuse deep domain know-how into SAP's AI agents" at its SAP Sapphire conference in Orlando.
English
1
3
3
211
Bruce Dando | SAP Data & Intelligence
@SAP Sapphire 2026 is 17 days away (May 11-13, Orlando). The data story has changed dramatically since last year. Here's what's different:

BDC integrations: @databricks (live), @Snowflake (live), @googlecloud / @bigquery (announced this week), @MicrosoftFabric (roadmap). SAP data is no longer locked in.

Reltio acquisition: Golden records and AI-native entity resolution coming to BDC. Master data across SAP and non-SAP systems unified for the first time.

Multi-agent AI: Joule + Gemini Enterprise orchestrating across platforms. Marketing first, expanding across CX.

SAC evolution: Datasphere as the governed data backbone. SAC as the planning and consumption UI. Clear architecture separation.

The platform story is the strongest it's ever been. The adoption question remains: how many customers can actually implement what was announced in the last 6 months, let alone what's coming at Sapphire?

That's the gap between product strategy and customer reality. And it's where the real work happens.
English
0
0
0
103
Bruce Dando | SAP Data & Intelligence
The "unified data" part is the critical detail. Multi-agent orchestration between Joule and Gemini only works if both platforms are looking at the same, governed customer data. This is where BDC Connect for Google and BigQuery matters. Bidirectional, zero-copy data access means SAP customer data stays governed in Datasphere while Gemini agents consume it directly. No replication, no sync issues, no conflicting customer records. Marketing is the first use case (H2 2026), but the pattern extends across all of SAP CX. The question for most organisations: do they have the governed data products and clean master data in place to take advantage of this when it ships?
English
0
0
0
28
impresacity.it@impresacity·
SAP and Google Cloud: AI agents arrive to revolutionise marketing at scale. Thanks to the integration between Joule and Gemini Enterprise, companies will be able to orchestrate complex campaigns in real time using unified data and without barriers… dlvr.it/TSB9Xl
Italian
1
0
0
11
Bruce Dando | SAP Data & Intelligence
SAP and Google Cloud just announced a multi-agent AI partnership at Cloud Next '26. Here's what's actually significant:

BDC Connect for Google and BigQuery. Bidirectional, zero-copy data access between SAP and Google Cloud. This is the fourth BDC integration after Databricks, Snowflake, and Microsoft Fabric.

Multi-agent orchestration. Joule agents (SAP) and Gemini agents (Google Cloud) coordinating across platforms. Not just data sharing. Agent-to-agent coordination.

Marketing is the first use case. Prompt: "Increase repeat purchases from the last 30 days." The agents handle everything from segmentation to personalisation to activation. Available H2 2026. Expanding across SAP CX portfolio after marketing.

My take: the partnership is impressive, but the data readiness requirement just got harder. Multi-agent AI across two platforms needs governed, consistent master data on both sides.

Reltio (golden records) + Datasphere (data products) + BDC Connect (cross-platform sharing) = the foundation this needs.

The organisations investing in data governance now will be first to benefit. Everyone else is watching another demo they can't implement yet.

news.sap.com/2026/04/sap-go…
Bruce Dando | SAP Data & Intelligence tweet media
English
0
1
2
153
Bruce Dando | SAP Data & Intelligence
The data layer is what makes this partnership different from another AI announcement. BDC Connect for Google and BigQuery enables bidirectional, zero-copy data access. That means SAP customer data stays governed in Datasphere while Gemini agents can access it without replication. This is the fourth major BDC integration: Databricks, Snowflake, Microsoft Fabric, now Google Cloud. The pattern is clear: SAP is positioning BDC as the connective tissue for the entire enterprise data ecosystem. But multi-agent orchestration across two platforms requires consistent, governed master data on both sides. The Reltio acquisition for golden records becomes even more strategic in this context.
English
0
0
0
64
The Analyst@TheAnalystDE·
$SAP SAP and $GOOGL Google Cloud have announced a new partnership that lets marketers deploy AI agents at scale. New integrations between SAP Engagement Cloud, SAP Customer Experience and Joule, as well as Gemini Enterprise, allow agents to securely access unified data from both ecosystems.
German
1
1
3
385