Supporting a wide range of data platform components across Azure and AWS. Optimize price-performance by swapping between storage and processing technologies with no recoding.
Moving data into your data platform has never been easier. Connect flexibly to any data source, including complex sources such as SAP, and leverage built-in standardization and best practices.
Guided development with full modelling flexibility for any data product need, e.g. marts, ABTs, and reverse ETL. Work no-code, low-code, or full-code without breaking end-to-end data lineage at column level.
Great flexibility in how you put your data platform into production. Deploy as native pipelines in Azure Data Factory to avoid lock-in, document any existing data factories, and analyze and predict costs.
Built-in best-practices and processes to improve quality, speed, and collaboration. Easily adjust your architecture at any time, use automation to free up time for collaboration, and have test automation and unit tests ready.
Letting your organization formally manage and gain better control over your data assets. Key elements include data traceability and transparency, compliance and data ownership, and integration of master data.
Always-updated self-service documentation for end users. It helps them understand the data: where it comes from, its transformations, dependencies, regulatory requirements, and data ownership.
A core functionality tracking the complete end-to-end dependencies between all assets (e.g. tables and columns) in the data platform, all the way from data sources to visualizations in Power BI or similar.
Xpert BI is a metadata-driven tool that automates the data extraction and ingestion process when building a data warehouse, data lake, or data platform. This setup can be done in just a few minutes and enables incredibly efficient data ingestion.
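Xpert BI's internals are not public, but "metadata-driven" generally means that a small metadata store describes each source object and the tool generates the extraction jobs from it. A minimal sketch of the pattern, with illustrative (not product) source definitions and function names:

```python
# Each metadata entry describes one source object to ingest.
# These entries are illustrative examples only.
sources = [
    {"system": "erp", "schema": "Sales", "table": "Orders"},
    {"system": "crm", "schema": "dbo", "table": "Contacts"},
]

def generate_extract_statements(metadata: list[dict]) -> list[str]:
    """Turn metadata entries into extraction statements. A real tool
    would also derive target tables, schedules, and load options."""
    return [f"SELECT * FROM {m['schema']}.{m['table']}" for m in metadata]

statements = generate_extract_statements(sources)
```

Because the extraction logic is generated rather than hand-written, adding a new source is a metadata change, not a coding task.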
The tool includes a best-practice architecture and methodology to ensure high-quality solutions. Data is stored in different layers, from raw to curated, and modelled to fit the organization's reporting and analytics needs.
Configurations include a range of filter options, incremental and delta loads, windowed loads, history tracking, and more. Process code in Xpert BI or deploy it as native pipelines in Azure Data Factory.
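The incremental-load option mentioned above commonly follows the watermark pattern: only rows changed since the previous load are extracted. A generic sketch, assuming an illustrative table and timestamp column (not Xpert BI configuration syntax):

```python
from datetime import datetime

def incremental_load_query(table: str, watermark_column: str,
                           last_watermark: datetime) -> str:
    """Build an incremental-extract query that fetches only rows
    modified since the previous load ran (watermark pattern)."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_column} > '{last_watermark.isoformat()}'"
    )

# Hypothetical example: last successful load was January 1, 2024.
query = incremental_load_query("Sales.Orders", "ModifiedDate",
                               datetime(2024, 1, 1))
```

After each successful run the watermark is advanced to the maximum value seen, so each load picks up where the last one stopped.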
Deploy to a wide range of storage architectures to implement your data platform. Xpert BI supports Azure SQL Managed Instance, Azure Data Lake Storage Gen2, and Azure Synapse Analytics, and lets you combine them as you like.
Complete data lineage on both table and column level from data source, through all data transformation layers, to your reports. This gives users and developers full traceability both from a top-down and bottom-up perspective.
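Column-level lineage like this can be pictured as a dependency graph: each column records the upstream columns it is derived from, and tracing bottom-up yields the original sources. A minimal sketch with hypothetical column names (the real lineage model in Xpert BI is not shown here):

```python
# Column-level lineage as an adjacency map: each column lists the
# upstream columns it is derived from. All names are illustrative.
lineage = {
    "report.revenue": ["curated.sales.amount"],
    "curated.sales.amount": ["raw.orders.qty", "raw.orders.unit_price"],
    "raw.orders.qty": [],
    "raw.orders.unit_price": [],
}

def upstream_sources(column: str) -> set[str]:
    """Walk the lineage graph upstream and return the source columns
    (those with no dependencies of their own) feeding a column."""
    deps = lineage.get(column, [])
    if not deps:
        return {column}
    sources: set[str] = set()
    for dep in deps:
        sources |= upstream_sources(dep)
    return sources
```

Running the same walk in the other direction gives impact analysis: which reports break if a source column changes.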
An intelligent processing engine where developers work at a high level of abstraction and configure update schedules at the fact-and-dimension data model level. Xpert BI finds all executable objects in the dependency graph and executes them in the most efficient order.
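Executing objects in dependency order is, at its core, a topological sort: staging tables load before the dimensions that read them, and dimensions before facts. A generic sketch using Python's standard library (the table names are illustrative, not Xpert BI internals):

```python
from graphlib import TopologicalSorter

# Object dependencies: each table maps to the objects it reads from.
deps = {
    "stg_customer": set(),
    "stg_sales": set(),
    "dim_customer": {"stg_customer"},
    "fact_sales": {"stg_sales", "dim_customer"},
}

# static_order() yields every object after all of its dependencies,
# giving a valid processing order for the whole model.
order = list(TopologicalSorter(deps).static_order())
```

A real engine would additionally run independent branches in parallel; the ordering constraint shown here is what makes that safe.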
Automation, standardization, and built-in documentation enable your developers to focus on value-adding activities implementing business requirements, rather than time-consuming maintenance tasks such as troubleshooting queries and execution plans.
With a powerful search engine and a web-based interface, Solution Catalog allows any user to search and find relevant documentation and data. It provides one place to find all documentation regarding the whole data platform.
The documentation and end-to-end data lineage are built automatically from active metadata, so they are always up to date and reflect the actual implementation. No more outdated Excel spreadsheets and Word documents.
A built-in test automation framework helps increase data quality and stability. It helps you proactively detect data quality issues and anomalies before they reach end users.
Built-in functionality to capture business documentation for data early in the development process. The documentation is available for all downstream use.
Xpert BI DataOps enables testing on all levels and supports a range of query languages. You can use SQL, DAX, and MDX, as well as constants, in data comparisons, and schedule these with your daily loads or migration loads.
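A data comparison of this kind boils down to reconciling two result sets, for example one from a SQL query against the source and one from a DAX query against the published model. A hedged sketch of the core check, with made-up rows and a hypothetical helper name:

```python
def compare_results(source_rows, target_rows, key_index=0):
    """Compare two query result sets keyed on one column and report
    keys missing from the target and keys whose rows differ; the
    essence of a load-reconciliation test."""
    source = {row[key_index]: row for row in source_rows}
    target = {row[key_index]: row for row in target_rows}
    missing = sorted(set(source) - set(target))
    changed = sorted(k for k in source.keys() & target.keys()
                     if source[k] != target[k])
    return missing, changed

# Illustrative rows: (order_id, amount) pairs from the two queries.
src = [(1, 100.0), (2, 250.0), (3, 75.0)]
tgt = [(1, 100.0), (2, 240.0)]
missing, changed = compare_results(src, tgt)
```

Scheduling such a comparison alongside the daily load turns it into a regression test: any drift between source and target surfaces before users see it.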