Datavault Builder

Datavault Builder is the award-winning, enterprise-oriented data warehouse automation solution that lets you trust your data and your decisions.

It is an efficient visual data integration solution that replaces up to nine different software tools. Datavault Builder relies on standardization and collaboration between business and IT staff as the key to success.

Increase your productivity and benefit from the fastest possible time to insights, without compromising on full auditability.

Hierarchical Links can be modeled using a virtual alias hub.
This makes the relation explicit: which side is the child and which side is the parent.
The green double line represents an identity, so the connection can be traversed from child to parent as many times as needed.
In the Dimensional Model this cycle is flattened into a path from which you can select at any hierarchy level.
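The flattening described above can be sketched in a few lines of Python. This is an illustrative sketch only, not Datavault Builder's implementation; the organizational units and the fixed four-level depth are assumptions for the example.

```python
def flatten_hierarchy(parent_of, max_levels=4):
    """Flatten a child->parent relation into fixed-level path columns,
    the way a hierarchical link is unrolled for a dimensional model."""
    rows = {}
    for child in parent_of:
        path = [child]
        node = child
        # walk the self-referencing link from child to parent
        while parent_of.get(node) is not None and len(path) < max_levels:
            node = parent_of[node]
            path.append(node)
        # pad to a fixed number of levels so every row has the same columns
        rows[child] = path + [None] * (max_levels - len(path))
    return rows

# hypothetical org units: child -> parent
parent_of = {"team_a": "dept_1", "dept_1": "division_x", "division_x": None}
print(flatten_hierarchy(parent_of)["team_a"])
# ['team_a', 'dept_1', 'division_x', None]
```

Each flattened row exposes every hierarchy level as its own column, which is what allows selecting at any level in the dimensional output.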

Hierarchical Links (1)

Multi-Active Satellites with business timelines are modeled using a sub-partition hub.
In this hub the business key is defined as BK + sub-partition key.
This allows delta loading and linking to specific entries.
As an additional feature, Multi-Active Satellites for data with an external technical timeline are supported by creating a Bi-Temporal Satellite, without needing a sub-partition hub.
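The composite key of a sub-partition hub can be illustrated as follows. The `||` delimiter and the MD5 hash are assumptions for this sketch, not the tool's documented key format.

```python
import hashlib

def hub_key(business_key, sub_partition_key):
    """Hash of BK + sub-partition key: each multi-active entry gets its
    own hub record, so deltas and links can target a specific entry."""
    composite = f"{business_key}||{sub_partition_key}"  # '||' is an assumed delimiter
    return hashlib.md5(composite.encode("utf-8")).hexdigest()

# e.g. a customer with several phone numbers, sub-partitioned by phone type
print(hub_key("CUST-42", "mobile") != hub_key("CUST-42", "office"))  # True
```

Because each sub-partition gets a distinct key, a link or a delta load can address one entry of the multi-active set without touching the others.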

Multi-Active Satellite (2)

Driving Keys are captured by modeling the expected cardinality between objects.
This information is evaluated automatically when the output interface is created: it prevents you from walking from “Order” to “Order Line” (avoiding the fanning trap), and it outputs either the history or the latest information from “Order Line” to “Order”, based on your requirements.
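The fanning trap the cardinality check prevents can be demonstrated with a few lines of Python. The order and cost values here are invented purely for illustration.

```python
# one order with an order-level attribute (e.g. shipping cost) ...
orders = {"O1": 100.0}
# ... and three order lines belonging to it
order_lines = {"O1": ["L1", "L2", "L3"]}

# naive join from "Order" to "Order Line": the order-level attribute
# repeats once per line, so an aggregate over the join triple-counts it
joined = [(o, line, orders[o]) for o, lines in order_lines.items() for line in lines]
naive_sum = sum(cost for _, _, cost in joined)   # 300.0 -> fanning trap
correct_sum = sum(orders.values())               # 100.0 -> aggregate at order level

print(naive_sum, correct_sum)
```

Modeling the 1:n cardinality tells the interface generator which direction of the walk is safe to aggregate over.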

Validity in Relationships (3)

Processes as hubs
We model the delivery as a hub, as it represents a core business concept.
In fact, our modeling test showed that there is a key, even if not otherwise specified by the business user.
Many-to-many relations without attributes are modeled as links.
Many-to-many relations with attributes are currently modeled as a dummy hub. This will change in 2023.

m:n Tables (4)

Just define more than one load into a hub
Passive integration works completely automatically: simply define more than one load for the same hub. Datavault Builder supports the concatenation of columns to create a unique key, as well as normalization of de-normalized source data. If data should not be passively integrated, the business keys can be prefixed with a fixed value.
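Key concatenation and the fixed-value prefix can be sketched as below. The `|` delimiter, the whitespace trimming, and the `CRM` prefix are assumptions for this example, not the tool's actual key rules.

```python
def business_key(parts, prefix=None, delimiter="|"):
    """Concatenate key columns into one business key; an optional fixed
    prefix keeps a source's keys out of passive integration."""
    key = delimiter.join(str(p).strip() for p in parts)
    return f"{prefix}{delimiter}{key}" if prefix else key

# two sources delivering the same customer integrate on the same key ...
assert business_key(["DE", "4711"]) == business_key(["DE ", " 4711"])
# ... unless one load is deliberately prefixed to keep it separate
print(business_key(["DE", "4711"], prefix="CRM"))  # CRM|DE|4711
```

Loading both sources into the same hub with matching keys is all that passive integration requires; the prefix is the opt-out.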

Early Integration (5)

Modeled as Hub and Satellite
It takes just seconds to store a reference table as a hub with a satellite. The key parts are also included in the satellite, and an SCD type 1 materialization of the satellite is created automatically for the lookup, so you don't need to apply any logic and still get the full history in case it is needed.
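The SCD type 1 lookup derived from a historized satellite amounts to "latest row per key". A minimal sketch, with invented currency data and field names:

```python
def scd1_view(history):
    """Reduce a full satellite history to the latest value per key
    (SCD type 1), while the history itself stays untouched."""
    latest = {}
    for row in sorted(history, key=lambda r: r["load_time"]):
        latest[row["key"]] = row["value"]  # later load times overwrite earlier ones
    return latest

history = [
    {"key": "EUR", "load_time": "2023-01-01", "value": "Euro"},
    {"key": "EUR", "load_time": "2023-06-01", "value": "Euro (EUR)"},
]
print(scd1_view(history))  # {'EUR': 'Euro (EUR)'}
```

The lookup consumers only ever see the latest value, while the satellite keeps every version for auditing.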

Historized Reference Table (6)

Perfect duplicates can be loaded if the PK check is turned off; in that case the values are de-duplicated.
If duplicates are not expected business-wise and the primary key check is turned on, the hub load in question is stopped and, by design, all dependent loads such as satellites and links are not executed.
The business key checker shows the relevant duplicate keys, and you can drill down to sample data sets:
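The condition the primary key check tests for can be sketched as follows; the staging rows and the `id` column are invented for the example.

```python
from collections import Counter

def duplicate_keys(staging_rows, key_column):
    """Return business keys occurring more than once - the condition
    under which the primary key check stops the hub load."""
    counts = Counter(row[key_column] for row in staging_rows)
    return {key: n for key, n in counts.items() if n > 1}

rows = [{"id": "A"}, {"id": "B"}, {"id": "A"}]
print(duplicate_keys(rows, "id"))  # {'A': 2}
```

A non-empty result is what would halt the hub load and, with it, the dependent satellite and link loads.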

Duplicate Data (7)

If all business key parts are missing, the row is not loaded, as we assume it is a different type.
If only some parts are missing, they are assumed to be empty.
If such cases can occur as errors in the data source, use a persistent staging area (PSA) load within Datavault Builder first, with a technical key, to prevent data loss.
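The two rules for missing key parts can be expressed compactly. Returning `None` to signal "skip the row" is a convention of this sketch, not the tool's interface.

```python
def prepare_key(parts):
    """If all key parts are missing, skip the row; if only some are
    missing, treat those parts as empty strings."""
    if all(p is None for p in parts):
        return None  # None -> row is not loaded
    return tuple("" if p is None else p for p in parts)

print(prepare_key([None, None]))  # None -> row skipped
print(prepare_key(["DE", None]))  # ('DE', '')
```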

All types of attribute changes are handled automatically in the satellites.

Changes of Attributes (9)

Implicit deletions
Implicit deletes are tracked automatically for every full load in tracking satellites and offered as virtual columns in the output. Delta loads of attributes can be combined with full loads of business keys to detect implicit deletions on huge data sets as well.
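At its core, detecting implicit deletes in a full load is a set difference between the keys seen so far and the keys delivered now. A minimal sketch with invented order keys:

```python
def implicit_deletes(previous_keys, full_load_keys):
    """Keys seen before but absent from the current full load are
    implicitly deleted and can be flagged in a tracking satellite."""
    return set(previous_keys) - set(full_load_keys)

print(sorted(implicit_deletes({"O1", "O2", "O3"}, {"O1", "O3"})))  # ['O2']
```

Combining this key-level full load with attribute-level delta loads keeps the comparison cheap even on huge data sets.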

Deletion of Business Keys (10)

Missing References
Missing references are added automatically to the target hub so the link can be created without breaking referential integrity. No entry is created in the satellite. The hub record is marked with the source of the record.
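The ghost-record behavior for missing references can be sketched as below; the dictionary-based hub and the `record_source` field name are assumptions of this example.

```python
def ensure_reference(hub, foreign_key, source="link-load"):
    """If a link references a key missing in the target hub, insert it
    there, marked with its source, so referential integrity holds.
    No satellite entry is created for such a record."""
    if foreign_key not in hub:
        hub[foreign_key] = {"record_source": source}
    return hub

hub = {"P1": {"record_source": "erp"}}
print(ensure_reference(hub, "P9")["P9"])  # {'record_source': 'link-load'}
```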

Invalid foreign keys (11)

Implicit deletes
Implicit deletes are marked automatically in the tracking satellite for every full load. Delta loads of attributes can be combined with full loads of business keys to check for implicit deletes in huge data sets.

Deletion of Orders (12)

Load Times
Load times are managed automatically and can be used for SCD type 2 output. If external timelines exist, they can be used as the leading timeline (for example, inscription time). Simply select, while modeling the interface, that you want to output data as an SCD type 2 object: a load-time and a load-time-end column are created automatically, a PIT table with the corresponding loads is generated, and the management of the PIT table is added to the master job.
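Deriving a load-time-end column from a sequence of load times works as follows in principle. The `9999-12-31` high date for the open interval is a common convention assumed here, not necessarily the tool's exact value.

```python
def scd2_intervals(load_times):
    """Derive (load_time, load_time_end) per version: each version is
    closed by the next version's load time; the current version stays
    open with a high date."""
    ordered = sorted(load_times)
    return [
        (start, ordered[i + 1] if i + 1 < len(ordered) else "9999-12-31")
        for i, start in enumerate(ordered)
    ]

print(scd2_intervals(["2023-01-01", "2023-03-15"]))
# [('2023-01-01', '2023-03-15'), ('2023-03-15', '9999-12-31')]
```

A PIT table materializes exactly such interval lookups so the SCD type 2 output does not have to compute them per query.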

The tool did provide the correct figures for this test.

You are completely free to include whatever business rules your business requires. Virtual or materialized. You choose.
• Virtual Business Rules
• Materialize Business Rules Output
• Run Stored Procedures or External Tools
• Design and Maintain your Data Model
• Supports Raw Vault & Business Vault
• Create Data Flows in Real Time
• Data Lake Option


Business Rules (15)

Datavault Builder derives all lineage information from the effective implementation.
From every module you can jump directly to the related lineage information.
You can also access all information via REST API and database views to include it in reporting tools like Power BI, Tableau, or Qlik, or in your central data dictionary.

Data Lineage (16)


There are different types of errors. The basic principle is that we try to heal errors if possible and inform the user if necessary. We differentiate and handle the following error types:

Error Handling (17)

For every (logical) source system a master job is created which contains all elements modeled on that source system. If the model is extended by modeling or deployment, the job automatically picks up all new elements.
This automatic job can be customized to omit certain loads, and manual jobs can be created that load specific elements; for example, if you want to load master data once a day but a transaction table every five minutes.
The orchestration tool parallelizes loads not only within layers (staging, vault) but also across layers.
Datavault Builder contains a built-in scheduler, but since everything can be controlled via REST calls, external schedulers like Cronacle, Control-M, UC4, or Airflow can be used to trigger the automatic master jobs.
Jobs can be started in full and delta mode, letting you choose the scope of your loads at execution time and preventing double implementation of the same source.

Datavault Builder supports an ITIL-compliant separation of environments and deployment paths. Use the simplified deployment to sync several environments pairwise, or use the fully Git-flow-based approach to achieve distributed development and CI/CD.

Deployment (19)

Automatically updated master jobs
Master jobs per source system are self-configuring and self-updating based on the data model; no configuration is necessary. The standard jobs can still be configured to omit certain loads, and manual jobs can be created based on business needs, such as reloading some data more often than the rest.

Built-in Scheduler
Run your jobs directly by scheduling full and delta jobs in Datavault Builder, avoiding double implementation for full and delta loads.

Integrate in your Enterprise
Integrate Datavault Builder with your enterprise schedulers like Control-M, UC4, Airflow, and others by using Datavault Builder's REST APIs.

Job Chaining
Define job dependencies directly in Datavault Builder.

Scheduling (20)

Many different relational and NoSQL databases are supported as sources. As the target database, where data is processed and persisted, the following are supported:

  • Snowflake (Snowflake Ready Certified)
  • Azure Synapse Analytics
  • Azure SQL Managed Instance
  • Azure SQL Database
  • SQL Server on Prem
  • Exasol
  • Oracle
  • Postgres
  • Google BigQuery (2023)

Other databases upon request

As the tool is provided as stateless Docker containers, the deployment method can be changed at any time without migration costs. For example, Datavault Builder can be moved from Azure to AWS by stopping it on Azure and starting it on AWS, without any migration effort. The same holds true for moving from on-premises to the cloud.

Installation Requirements (22)
