As Data Platform Lead at Sunfire, Ronny Kober is responsible for the company-wide analytics infrastructure – and has himself navigated the path from chaotic data silos to a modern, AI-supported data platform. In conversation with Scavenger AI, he shares the insights gained along the way: no hype, no parade of buzzwords, but concrete principles that any company can apply. This article summarizes the key points.
The Problem That Almost Every Company Knows
Having data and using data are two fundamentally different things.
Most medium-sized companies sit on enormous amounts of data – from ERP systems (Enterprise Resource Planning, that is, software for corporate management), CRM systems (Customer Relationship Management, to manage customer relationships), from production plants, sales tools, and dozens of other sources. The problem: This data is scattered in so-called data silos – isolated islands that do not communicate with each other. Marketing has its numbers, sales have theirs, the finance department has its own Excel sheet.
As soon as users see incorrect or inconsistent numbers in a dashboard, they lose trust. And then the real chaos begins: Each team builds its own solution, its own report, its own truth. In the end, each department measures the same KPIs (Key Performance Indicators) differently – and no one knows which numbers to trust anymore. In the expert world, this is called Shadow BI: a shadow reporting system that emerges parallel to the official data infrastructure and undermines it.
According to a study from 2024/2025, lack of data quality is the most common reason why data projects fail – in almost 44% of all cases. This exactly matches what experienced data platform teams report from practice.
Data Strategy Does Not Start with Tools – but with a Question
Ronny Kober names one of the most important principles – and one of the most frequently ignored in practice: a data strategy does not start with the choice of a tool.
Many companies make exactly this mistake. They evaluate platforms for weeks, compare providers, invest in licenses – and lose sight of what they actually want to achieve with the data. The right order is different: First, ask what the data should achieve for the company. What decisions should be made better? Which processes should become more transparent? Which questions remain unanswered today because of missing data?
Only then comes the second step: a concrete use case.
The Skateboard Principle: Start Lean, Create Value Quickly
Anyone who wants to get from A to B quickly does not immediately build a car. They build a skateboard – simple, works immediately, and gets you to your destination faster than having nothing at all. Applied to data projects, this means: do not build a complete data platform right away. Instead, identify a use case, export the relevant data from existing systems – as CSV files if necessary – and visualize it with simple means. Then check: does it create value? Does it answer the questions we are asking?
If so, you take the next step. If not, you have lost little time and money.
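A skateboard-stage analysis really can be this small. The sketch below – with hypothetical column names `customer` and `amount`, standing in for whatever a real ERP or CRM export contains – aggregates revenue per customer from a plain CSV export using only the Python standard library:

```python
import csv
import io
from collections import defaultdict

def revenue_by_customer(csv_file):
    """Aggregate revenue per customer from a raw CSV export.

    Expects columns 'customer' and 'amount' (hypothetical names,
    chosen here for illustration).
    """
    totals = defaultdict(float)
    for row in csv.DictReader(csv_file):
        totals[row["customer"]] += float(row["amount"])
    return dict(totals)

# Skateboard test with a small in-memory export instead of a real file:
sample = io.StringIO(
    "customer,amount\n"
    "ACME,1200.50\n"
    "Globex,800.00\n"
    "ACME,300.00\n"
)
print(revenue_by_customer(sample))  # {'ACME': 1500.5, 'Globex': 800.0}
```

If a script like this already answers the question, the use case is validated; if not, little time was lost – exactly the point of the skateboard principle.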
This lean principle from agile software development is still applied far too rarely in the data field – and is exactly the approach that allows even smaller teams to deliver real results quickly. It is also the thought behind Scavenger AI: Ready for use in a few days, without months of BI projects, without weeks of dashboard programming – just ask a question and get an answer.
Trust in Data: The Underestimated Foundation of Every AI Analytics
Before a company thinks about AI analytics, another question must be answered: Can we trust our data?
Data quality is not a technical side issue, but the foundation on which everything else is built. Poor data quality not only means incorrect dashboards – it means incorrect decisions. And poor decisions based on incorrect data can be more costly than making no data-driven decisions at all.
How does one ensure data quality? The approach that has proven successful in practice is strongly oriented towards modern software development: automated tests for data pipelines, clear standards for data processing, and an architectural principle that ensures that raw data is always available and errors in processing – not in the original data – are corrected.
Specifically: When an error in data processing is discovered, the raw data is not touched. Instead, the transformation – that is, the process that converts raw data into usable metrics – is corrected and executed again. This allows reproducible, reliable results to be generated at any time.
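The principle can be made concrete in a few lines. In this minimal sketch (with invented order records and tax fields), the raw data is never edited; a buggy transformation is replaced by a corrected one and simply re-run over the unchanged raw records:

```python
# Sketch of "correct the transformation, not the raw data":
# the raw records stay untouched; a fixed transform is re-run over them.

raw_orders = [  # bronze-style raw data, never modified
    {"order_id": 1, "net": 100.0, "tax_rate": "19%"},
    {"order_id": 2, "net": 50.0,  "tax_rate": "7%"},
]

def transform_v1(order):
    # Buggy first version: forgets the tax entirely.
    return {"order_id": order["order_id"], "gross": order["net"]}

def transform_v2(order):
    # Corrected transformation; the raw data itself was never edited.
    rate = float(order["tax_rate"].rstrip("%")) / 100
    return {"order_id": order["order_id"],
            "gross": round(order["net"] * (1 + rate), 2)}

# Re-running the fixed transform reproduces correct results at any time:
gold = [transform_v2(o) for o in raw_orders]
print(gold)  # [{'order_id': 1, 'gross': 119.0}, {'order_id': 2, 'gross': 53.5}]
```

Because the raw data is immutable, every rerun of a corrected transformation yields the same, reproducible result – which is exactly what makes the pipeline trustworthy.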
The Medallion Architecture: Bronze, Silver, Gold
Anyone dealing with modern data platforms will sooner or later encounter the medallion architecture – an approach pioneered by Databricks that has established itself as a best practice in the data engineering world.
The principle is simple to understand: Data goes through three layers.
The bronze layer contains the raw data – unaltered, directly from the source systems. This layer is the safety net: No matter what happens, the original data is always available.
The silver layer contains cleaned, prepared data. Duplicates are removed, missing values filled in, inconsistencies resolved. This is where the actual data quality work takes place.
Finally, the gold layer contains fully prepared metrics and datasets that can be used directly for business intelligence and reporting. A BI tool (Business Intelligence Tool, that is, software for data visualization and analysis) like Power BI, Grafana, or Looker accesses this layer – and does not perform any calculations or transformations anymore.
The result: All departments see the same numbers, calculated according to the same rules. The shadow BI problem dissolves. One central principle here: The entire transformation occurs in the data platform – not in the BI tool. The BI tool only visualizes what the gold layer has already prepared.
SQL: The Underestimated Standard in the Age of AI
SQL (Structured Query Language) is a query language for retrieving and processing data in databases. It is over 50 years old – and more relevant than ever.
According to Ronny Kober, SQL remains the central standard even in the age of AI analytics – for several reasons. First, SQL is declarative: You describe what you want, not how to calculate it. This makes it readable and understandable, even for people without a deep technical background. Second, SQL is the common denominator between data engineers and analysts – both speak this language, which greatly simplifies collaboration in small teams. Third, all modern data platforms are based on SQL – from Databricks to Snowflake to BigQuery.
And AI? It does not make SQL obsolete but makes people proficient in SQL more productive. Generative AI can suggest SQL queries, explain errors, and simplify complex joins. But the result must be understood and checked – especially for less experienced analysts, critically checking the AI-generated queries is essential.
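This "trust but verify" stance toward AI-generated SQL can be sketched in a few lines. Here an assumed AI-suggested query is treated as untrusted and cross-checked against a result computed independently in plain Python (table and column names are invented for the example):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
rows = [("ACME", 100.0), ("ACME", 50.0), ("Globex", 70.0)]
con.executemany("INSERT INTO orders VALUES (?, ?)", rows)

# Declarative: the query states *what* we want, not *how* to compute it.
ai_suggested_sql = """
    SELECT customer, SUM(amount) AS total
    FROM orders GROUP BY customer ORDER BY total DESC
"""
sql_result = con.execute(ai_suggested_sql).fetchall()

# Independent cross-check in Python before trusting the query:
expected = {}
for customer, amount in rows:
    expected[customer] = expected.get(customer, 0.0) + amount
assert dict(sql_result) == expected

print(sql_result)  # [('ACME', 150.0), ('Globex', 70.0)]
```

The cross-check is trivial here, but the habit scales: an AI-suggested query should always be validated against a known subset of the data before its numbers reach a decision-maker.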
Scavenger AI goes one step further: The platform automatically translates natural language into SQL, executes the queries, and presents the results as graphics, tables, or text – without the user having to write a single line of code. The semantic layer that Scavenger builds is the crucial factor: It ensures that the AI truly understands the data structure, business logic, and meaning of individual fields – thereby delivering reliable answers.
Generative BI: The End of the Dashboard Factory?
Most companies know the pattern: endless dashboard libraries that no one maintains and hardly anyone truly uses. Dashboards are built on request, end up somewhere in the BI tool, are called up two or three times – and then disappear into digital nirvana. At the same time, new requests pile up for the data team.
Most of these requests are structurally simple. "How have my top 10 customers performed in the last quarter?" "Which products are performing best in region X?" For such questions, you do not need an elaborate dashboard – you need a quick, reliable answer.
This is exactly where generative BI comes in: AI systems that answer questions in natural language based on a well-maintained semantic data model, generate spontaneous visualizations, and enable ad-hoc analyses – without an analyst needing to intervene. This frees data teams from monotonous standard requests and creates capacity for what truly adds value: complex analyses, data modeling, strategic projects.
Important dashboards for north star metrics – that is, the overarching metrics that measure business success – will continue to exist. But the majority of daily data work will increasingly be answered by AI analytics. Scavenger AI is precisely designed for this paradigm shift.
Management Buy-in: The Underestimated Prerequisite
Technology alone is not enough. A successful data strategy always requires a clear commitment from management – not as lip service, but as a strategic decision: Data is a central tool for achieving corporate goals. This means specifically: a dedicated data team, clear responsibilities, resources for data work – and the readiness to treat data projects not as nice-to-haves, but as strategic investments.
Without this commitment, even the best technical solutions will fail. With it – and the right tools – small teams can achieve enormous impact.
Practical Recommendations: How to Start Today
Three principles that have proven themselves in practice time and again:
First: Strategy before tools. Clarify what should be achieved with the data. Which decisions should improve? Which processes should become more transparent? Only then evaluate tools.
Second: Start lean. Identify a concrete use case, test with simple means whether the approach creates value. Skateboard before car.
Third: Adhere to technical principles. Adopt a medallion architecture, keep transformations in the data platform, add automated tests for data quality, and carry proven agile processes over from software development to data work.
Anyone who follows these three principles lays a foundation on which AI analytics – whether with Scavenger AI or another solution – can truly work. Because AI can only be as good as the data it accesses. And data can only be as good as the infrastructure in which it is managed.
Conclusion: The Path to Real Data Intelligence
Working data-driven is not a question of company size, budget, or technology alone. It is a matter of strategy, culture – and the first step.
Anyone who starts today to break down data silos, build trust in data quality, and lay the right technical foundations creates the prerequisites for what business intelligence will mean tomorrow: AI analytics that truly works. Not as a promise – but as a tool that enables better decisions every day.
The skateboard comes first. The rest follows.
Scavenger AI is an AI-powered analytics platform that enables companies to ask questions of their data in natural language – and receive answers in seconds. The insights in this article stem from a conversation with Ronny Kober (Data Platform Lead, Sunfire).