14 April 2026, by Tobias Jasinski
Beyond the platform: Why architecture and tools must go beyond data mesh
In recent years, many companies have done their homework: modern data platforms are in place, the cloud is established, and the first AI applications are running productively. And yet the feeling remains: ‘We're not getting the most out of it.’
The cause is rarely a lack of technology, but rather an architecture that looks modern but doesn't really ‘work’ in everyday life. Data products are not used consistently, governance only works on paper, and the interaction between the platforms involved remains bumpy.
Status quo of data architecture: good platform, little impact
We see similar patterns time and again among our customers:
- A powerful data warehouse or data lakehouse is established.
- Self-service tools for analytics and reporting are available.
- Initial data products such as ‘Customer 360’ or ‘Sales Dashboard’ exist.
Nevertheless:
- departments continue to use their Excel shadow worlds
- new data products are slow to gain traction
- and the number of different ‘truths’ within the company is increasing rather than decreasing.
The reason: architecture has too often been understood as a pure technology project rather than as the operating model of a data ecosystem.
Data mesh: From monolith to ecosystem
Modern data architecture today consists of an interplay of specialised platforms: analytics, machine learning, IoT, real-time streaming, generative AI. The crucial step is to move away from ‘one big system’ towards a harmonised ecosystem that does three things:
1. Seamless integration across system boundaries: Data flows in a controlled but frictionless manner between operational systems (ERP, CRM, shop, ticketing) and analytical platforms – and back again.
2. Domain-oriented responsibility: Teams from marketing, sales, operations, etc. become domains that are responsible for their data products – with clear schemas, quality requirements and service levels.
3. Federated governance instead of centralised bottleneck IT: Central guidelines (security, compliance, standards) define the framework within which domains can act quickly without having to coordinate every detail.
Concepts such as data mesh encapsulate these principles in an architectural philosophy. In practice, however, implementation often fails at precisely two points: internal interaction and active governance.
Challenges of data mesh: interaction & governance in everyday life
Even if the platform is technically well set up, typical areas of tension remain:
- Interfaces between platforms and teams: Marketing needs a new segment quickly, Finance needs valid reporting for the board, the data science team wants to run an experiment – and all three are competing for the same data and resources.
- Unclear responsibilities: Who ‘owns’ a data product? Who decides on changes to the schema? Who prioritises the requirements of different stakeholders?
- Governance as a stumbling block or blind spot: Either governance is implemented too late (‘we'll sort it out when we're bigger’) or it is implemented so rigidly that innovation is hardly possible.
- No central data catalogue or marketplace: Either there is no data catalogue tool at all, or there are several that do not communicate with each other. What remains are silos and unconsumable assets.
The result is new silos based on modern technology: marketing, sales, operations and IT each build their own data islands, only this time in the cloud.
Structured setup for data mesh: Architecture as an enabler of value
To ensure that architecture does not become an end in itself, but rather supports the value orientation described in the first blog post, a structured setup consisting of three levels is required:
1. Infrastructure & platforms
- Cloud-native basis: Automatic scaling, flexible resource utilisation, standardised security mechanisms.
- Self-service capabilities: Domains can create, test and deploy data products without blocking central IT every time.
- Standardised interfaces: API and event-based design that provides defined paths for data flows – instead of individual point-to-point integrations.
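The idea of event-based, standardised interfaces can be sketched in a few lines. This is a minimal illustration only: the event envelope fields (`domain`, `event_type`, `payload`) and the in-memory bus are assumptions standing in for a real streaming platform such as a message broker, not any specific product's API.

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Callable

@dataclass
class DataEvent:
    """Illustrative event envelope; field names are assumptions."""
    domain: str       # producing domain, e.g. "sales"
    event_type: str   # e.g. "order.created"
    payload: dict
    occurred_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def to_json(self) -> str:
        return json.dumps(asdict(self))

class EventBus:
    """Minimal in-memory stand-in for a streaming-platform topic."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[DataEvent], None]]] = {}

    def subscribe(self, event_type: str,
                  handler: Callable[[DataEvent], None]) -> None:
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event: DataEvent) -> None:
        # Fan out to all handlers registered for this event type.
        for handler in self._subscribers.get(event.event_type, []):
            handler(event)
```

The point of the sketch: producers and consumers agree only on the envelope and the event type, not on each other's internals – which is exactly what replaces point-to-point integrations.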
2. Data products & domains
Data-as-a-product: Each data product has:
- a unique owner
- defined consumers
- documented quality criteria (timeliness, completeness, latency)
- clear service level agreements
Domain orientation: Teams such as ‘Sales & Marketing’, ‘Customer Service’ or ‘Supply Chain’ take business responsibility for their data products – from definition to further development.
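The data-as-a-product attributes above can be made concrete as a small descriptor. The structure below is a hedged sketch: the field names and the `missing_metadata` check are illustrative assumptions, not a standard schema or a specific catalogue tool's model.

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    """Illustrative data product descriptor; field names are assumptions."""
    name: str
    owner: str                        # unique owner (team or role)
    consumers: list[str]              # defined consuming teams or systems
    quality_criteria: dict[str, str]  # e.g. {"timeliness": "daily refresh"}
    sla: str                          # e.g. "available 99.5%, refreshed by 06:00"

    def missing_metadata(self) -> list[str]:
        """Return which required attributes are still empty."""
        missing = []
        if not self.owner:
            missing.append("owner")
        if not self.consumers:
            missing.append("consumers")
        if not self.quality_criteria:
            missing.append("quality_criteria")
        if not self.sla:
            missing.append("sla")
        return missing
```

Such a check could, for example, gate publication to a catalogue: a product with missing owner, consumers, quality criteria or SLA is simply not listable.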
3. Governance & collaboration
Federated governance models:
- Central policies (security, data protection, compliance, naming standards)
- Decentralised responsibility for implementation and further development in the domains
Living artefacts:
- Data catalogues that show which data products exist and how they can be used
- Data contracts that specify which fields, formats and quality standards apply between producer and consumer
- Automated quality checks and data lineage that create transparency about origin, transformation and use
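A data contract between producer and consumer can be checked automatically. The sketch below assumes a deliberately simple contract format (field name to expected type); real contract tooling would also cover formats, ranges and quality thresholds, so treat this as an illustration of the principle only.

```python
# Illustrative contract: field names and types are assumptions,
# not a specific tool's schema.
CONTRACT = {
    "customer_id": int,
    "email": str,
    "created_at": str,
}

def violations(record: dict, contract: dict = CONTRACT) -> list[str]:
    """Return human-readable contract violations for one record."""
    problems = []
    for field_name, expected_type in contract.items():
        if field_name not in record:
            problems.append(f"missing field: {field_name}")
        elif not isinstance(record[field_name], expected_type):
            problems.append(
                f"wrong type for {field_name}: "
                f"expected {expected_type.__name__}, "
                f"got {type(record[field_name]).__name__}"
            )
    return problems
```

Run against every delivery, a check like this turns the contract from a document into an enforced agreement – the consumer sees violations before they corrupt a downstream report.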
This creates an environment in which data products are not only built, but also used with trust and security – precisely where the ‘actions’ described in the first blog post take place: in CRM systems, shops, marketing automation, service tools and operational workflows.
Data architecture in the service of the ‘insight-to-action’ chain
Perhaps the most important change in perspective: good data architecture is not a collection of modern tools, but the organised ability to translate data into value.
This means that:
- Use cases drive architecture – not the other way around: instead of ‘We need a new data lake,’ the question becomes ‘We want to reduce churn, improve upselling and identify risks early – which architecture enables us to do that?’
- Operational systems are an integral part of the architecture: it is not enough to pull data into a platform. Relevant insights must flow back into the operational systems so that campaigns, prices, inventories or service processes can be aligned with them.
- Governance supports implementation rather than hindering it: policies and standards are deliberately designed to meet regulatory requirements on the one hand while allowing rapid iteration and experimentation on the other.
How adesso supports: From platform project to data ecosystem
In customer projects, we see time and again that the step from a ‘well-developed technology stack’ to a functioning data ecosystem is the decisive one. Typical components of our support:
- Analysis of the existing platform setup and identification of interaction and governance gaps
- Design of a target-oriented data architecture that clearly structures subject domains, data products and governance
- Development of self-service and data-as-a-product concepts, including roles, processes and tooling
- Introduction of federated governance with clear responsibilities, data contracts and automated quality monitoring
The goal is always the same: an architecture that not only permits value cases and value measurement, but actively enables them.
Next step: moving from reading to action
If you feel that your data architecture looks good ‘on paper’ but is not yet working properly in everyday life, there are two simple options:
- Non-binding pre-call: 30 minutes in which we assess the current situation and outline possible next steps, or discuss architecture patterns, governance approaches and practical examples for your scenario.
- A short email or contact request with the keyword ‘data ecosystem’ – I will then get back to you with suggestions.