Attunity Makes It Faster, Easier To Deliver BI & Analytics from Data Warehouses

Attunity Ltd. aims to speed up analytics projects from data warehouses by removing a lot of the pain, time and expense that often goes with such preparation.  Attunity Compose, the latest in the company’s product line, was designed with automation as top-of-mind. IDN speaks with Attunity’s Lawrence Schwartz.

Tags: analytics, Attunity, BI, big data, Compose, data warehouse, ETL, Hadoop, NoSQL, Replicate, transformation, wizard

Lawrence Schwartz

"Attunity Compose aims to eliminate weeks or months of manual effort and coding that it often takes to design, build, and deploy a data warehouse."


Attunity Ltd. is looking to speed up analytics projects from data warehouses by removing a lot of the pain, time and expense that often goes with such preparation.


Attunity Compose was designed with automation as top-of-mind “to re-define the process for setting up data warehouses, populating them and spinning off data marts in an agile way,” Lawrence Schwartz, Attunity’s chief marketing officer, told IDN.


Under the covers, it uses a model-based, agile approach to support the complete end-to-end lifecycle of creating, populating and maintaining a data warehouse for analytics.  In this way, Attunity Compose aims "to eliminate weeks or months of manual effort and coding that it often takes to design, build, and deploy a data warehouse," Schwartz added. 


Attunity’s approach to lowering the barriers to readying data warehouses for modern analytics projects is attracting customer interest. “Prospects are coming to us because they’re estimating that the ETL development associated with building a data warehouse can consume as much as 60-80 percent of the time they need to complete their BI project. They are looking for an easier, more efficient way to get the job done,” he added.


Attunity delivers several new levels of automation across a range of data warehousing tasks, including:

  • Automated data transformation
  • Transparently managing surrogate keys (and natural-to-surrogate key mappings)
  • Handling of history types and slowly changing dimensions
  • Supporting and integrating historical data coming from operational sources
  • Eliminating the need to deal with the order in which the data warehouse is loaded
  • Handling updates based on partial records (updating only changed fields)
  • Asynchronously loading entities from several sources
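To make two of the items above concrete — surrogate-key management and slowly changing dimensions — here is a minimal, hypothetical sketch of the kind of logic such automation generates. All names and the data layout are invented for illustration; they are not Compose internals.

```python
# Hypothetical sketch: assign surrogate keys and keep history as a
# Type 2 slowly changing dimension (close the old row, open a new one).
from datetime import date

dimension = []   # rows: (surrogate_key, natural_key, city, valid_from, valid_to)
key_map = {}     # natural key -> current surrogate key
next_key = 1

def upsert_customer(natural_key, city, as_of):
    """Apply a source record; version the row when an attribute changes."""
    global next_key
    current = key_map.get(natural_key)
    if current is not None:
        row = dimension[current - 1]          # surrogate keys are sequential here
        if row[2] == city:
            return current                    # no change: keep current version
        dimension[current - 1] = row[:4] + (as_of,)   # close out the old version
    surrogate = next_key
    next_key += 1
    dimension.append((surrogate, natural_key, city, as_of, None))
    key_map[natural_key] = surrogate
    return surrogate

upsert_customer("C-100", "Boston", date(2016, 1, 1))
upsert_customer("C-100", "Chicago", date(2016, 2, 1))   # new version, new key
```

Analytics queries join facts on the surrogate key, so each fact lands against the dimension version that was current when it occurred — the bookkeeping the list above says Compose handles transparently.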


Further, to build a data mart, Attunity Compose provides a smart wizard that walks users through all the steps required, including: selecting fact tables; selecting dimension tables; defining transaction date(s); creating denormalized tables; and generating ETL (extract, transform and load) code to incrementally populate the data mart.
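The wizard’s final step — generated incremental-load ETL — might look something like the following sketch, which turns the user’s fact/dimension choices and transaction date into SQL. The table and column naming conventions are assumptions for the example, not the wizard’s actual output.

```python
# Illustrative generator: build incremental-load SQL from wizard choices.
def generate_incremental_load(fact, dims, txn_date_col, last_load):
    """Emit SQL that loads only rows newer than the last load."""
    joins = "\n".join(
        f"JOIN {d} ON {fact}.{d}_id = {d}.id" for d in dims
    )
    select_cols = ", ".join(d + ".name" for d in dims)
    return (
        f"INSERT INTO mart_{fact}\n"
        f"SELECT {fact}.*, {select_cols}\n"
        f"FROM {fact}\n{joins}\n"
        f"WHERE {fact}.{txn_date_col} > '{last_load}'"
    )

sql = generate_incremental_load("orders", ["customer", "product"],
                                "order_date", "2016-02-01")
```

Filtering on the transaction date is what makes the population incremental: each run picks up only rows added since the previous load.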


For operations and faster analytics, Attunity also provides more efficient and high-performance data ingestion from multiple data sources. This comes thanks to Attunity Compose’s ability to work with another company product, Attunity Replicate.


“Attunity Compose integrates with Attunity Replicate so that BI teams can acquire structured or unstructured data from any major database, data warehouse, Hadoop, NoSQL or cloud platform and deliver it to the data warehouse either through efficient bulk loading or real-time change data capture technology,” Schwartz said.
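Change data capture works by replaying a stream of change events from the source against the target, rather than re-copying whole tables. A minimal sketch of that apply logic follows; the event shape is invented for illustration and is not Replicate’s actual wire format.

```python
# Minimal CDC apply loop: replay insert/update/delete events on a target.
target = {}   # primary key -> row dict

def apply_change(event):
    """Apply one change event; updates carry only the changed fields."""
    op, key = event["op"], event["key"]
    if op == "insert":
        target[key] = event["row"]
    elif op == "update":
        target[key].update(event["row"])   # partial record update
    elif op == "delete":
        target.pop(key, None)

log = [
    {"op": "insert", "key": 1, "row": {"name": "Acme", "city": "Boston"}},
    {"op": "update", "key": 1, "row": {"city": "Chicago"}},
    {"op": "insert", "key": 2, "row": {"name": "Globex", "city": "Austin"}},
    {"op": "delete", "key": 2},
]
for event in log:
    apply_change(event)
```

Note the update event carries only the changed field — the same partial-record handling listed among Compose’s automation features above.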


The integrated Attunity Compose / Replicate combination also “will easily rebalance data between Hadoop Data Lakes and conventional data warehouses.  We also have Visibility software that defines what data should be moved from data warehouses to Hadoop for cost, utilization or performance reasons,” Schwartz added.


Given all the pain points across a data warehouse lifecycle – from set-up, data ingestion and even on-going operations – IDN asked Schwartz to walk us through how Attunity Compose cracks the code to offer enterprise IT a more agile and speedy solution.



“Typical ETL commands must be implemented with error-prone and cumbersome manual coding that ties up talented developers for long periods of time.  This is no longer tolerable,” Schwartz told IDN.  “BI teams must be able to deploy data models and start analytics projects quickly – then just as quickly tweak the model, business rules or data sources as they adapt to lessons learned.”


In addition, Attunity offers other software to improve query execution performance with data warehouses.


How Attunity Automation Delivers an Agile Data Warehouse for BI, Analytics

To illustrate the customer benefits of Attunity’s approach, Schwartz shared some end user results.

  • One manufacturing firm reported that Attunity Compose let them cut the time and effort to build a data mart from weeks to a single day.
  • A financial services company reported it can now ingest billions of rows/hour doing full load while replicating from Oracle Exadata into Greenplum with Attunity Replicate.
  • A credit services company reported Attunity lets them handle half a million records per second using change data capture (CDC) from large and highly active Oracle databases into a data warehouse. They also report peak processing can hit gigabytes per hour of log data.


An analyst watching the growing demand for faster ways to use data warehouses for analytics sees merit in Attunity’s approach. "Too many companies rely on complex, hand-coded approaches to building data warehouses. . . Developers often fear that automation tools will put them out of work, but in reality, these solutions simply enable focus on higher value projects like BI and analytics,” said Wayne Eckerson, founder of Eckerson Group, in a statement.


Attunity Compose works by automatically generating code to create data staging for a data warehouse and data marts. Staging tables, mirror images of the source tables inside the data warehouse, are automatically generated by Attunity Compose. In turn, these staging tables are used to collate and prepare data for loading into the data warehouse tables.


Once source metadata is discovered and mapped (and the model is generated), Attunity Compose can then build the data warehouse with a simple mouse click. DDL statements are auto-generated (based on normalized enterprise data warehouse principles). After the DDL is generated, the statements are executed.
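Model-driven DDL generation of this kind can be sketched simply: walk a model of entities and attributes and emit a CREATE TABLE statement per entity. The model format and type names below are assumptions for illustration, not Compose’s representation.

```python
# Sketch of model-driven DDL generation: entity model -> CREATE TABLE DDL.
model = {
    "customer": {"id": "INTEGER", "name": "VARCHAR(100)"},
    "orders":   {"id": "INTEGER", "customer_id": "INTEGER",
                 "total": "DECIMAL(10,2)"},
}

def generate_ddl(model):
    """Emit one CREATE TABLE statement per modeled entity."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return statements

ddl = generate_ddl(model)
```

Executing the emitted statements against the target database is the “simple mouse click” step described above.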


For on-going data warehouse operations, the existing model is constantly compared to the sources over the life of the warehouse. Physical adjustments to the data warehouse structure are automatically introduced as the model evolves, without manual ETL coding.
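Keeping the physical schema in step with an evolving model amounts to diffing the deployed schema against the updated model and emitting the needed changes. The sketch below handles the simplest case — new columns — and its structures are assumptions for illustration only.

```python
# Sketch of schema evolution: diff updated model vs. deployed schema,
# emit ALTER statements for columns that were added to the model.
deployed = {"customer": {"id": "INTEGER", "name": "VARCHAR(100)"}}
updated  = {"customer": {"id": "INTEGER", "name": "VARCHAR(100)",
                         "tier": "CHAR(1)"}}

def diff_to_alters(deployed, updated):
    """Return ALTER TABLE statements for columns present only in the model."""
    alters = []
    for table, columns in updated.items():
        existing = deployed.get(table, {})
        for name, dtype in columns.items():
            if name not in existing:
                alters.append(f"ALTER TABLE {table} ADD COLUMN {name} {dtype};")
    return alters

alters = diff_to_alters(deployed, updated)
```

A production tool must of course also handle renames, type changes, and drops; the diff-then-apply shape is the point here.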

The Attunity portfolio of big data management software promotes access, management, sharing and distribution of data across heterogeneous enterprise platforms and the cloud.  It includes data replication, data flow management, test data management, change data capture (CDC), data connectivity, enterprise file replication (EFR), managed file transfer (MFT), data warehouse automation, data usage analytics, and cloud data delivery.