"Thanks to Attunity Compose, our ability to make decisions faster and more effectively has increased at least tenfold."
Agile Data Warehousing & Data Lake Pipeline Automation
With Attunity Compose you can:
- Build a new data warehouse
- Migrate an existing data warehouse to a new platform
- Expand an existing data warehouse with new data feeds
- Generate data marts for lines of business
- Prototype, and perform ad hoc analytics and testing
Accelerate and simplify the loading and transformation of large-scale data lakes. Deliver data efficiently to any major Hadoop/Data Lake platform.
Accelerate and simplify data warehouse design, development, testing, deployment and updates
Agile Data Warehouse Automation
A fresh approach to data warehouse automation
Attunity Compose automates the design, implementation and updates of data warehouses and data marts. It minimizes manual, error-prone data warehouse design processes by automating data modeling, ETL generation and workflow.
Data architects can accelerate analytics projects, streamline processes and reduce risk. Compose flexibly supports either a model-driven methodology guided by business processes or a data-driven methodology based on reporting requirements, so it adapts easily to new requirements or model changes throughout the data warehouse environment.
- Reduce the time and cost of analytics projects on cloud platforms
- Quickly spin up, scale up, and iterate a data warehouse
- Dynamically adjust data sources and models
Automated creation of analytics-ready data
Create, load and transform enterprise data into Hadoop Hive
Compose for Hive fully automates the pipeline of BI-ready data into Hive, enabling you to automatically create both operational data stores and historical data stores. It also leverages the latest innovations in Hadoop, such as the ACID Merge SQL capability available today in Apache Hive (part of the Hortonworks 2.6 distribution), to automatically and efficiently process data insertions, updates and deletions.
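Conceptually, an ACID merge applies a batch of change records (inserts, updates and deletions) to a keyed target in a single pass; in Hive this is expressed as one MERGE statement, which Compose generates for you. Below is a minimal Python sketch of those merge semantics only, assuming a simple `(op, key, row)` change-record format; the function and data shapes are hypothetical illustrations, not Attunity's implementation.

```python
def apply_changes(target, changes):
    """Apply a CDC batch to a keyed store in one pass.

    target:  dict mapping key -> row
    changes: list of (op, key, row) tuples, op in {"insert", "update", "delete"}
    """
    for op, key, row in changes:
        if op in ("insert", "update"):
            target[key] = row        # upsert: matched rows updated, unmatched inserted
        elif op == "delete":
            target.pop(key, None)    # matched rows removed
    return target


store = {1: {"name": "alice"}}
cdc_batch = [
    ("update", 1, {"name": "alicia"}),
    ("insert", 2, {"name": "bob"}),
    ("delete", 1, None),
]
print(apply_changes(store, cdc_batch))  # → {2: {'name': 'bob'}}
```

The point of Hive's ACID MERGE is that this whole batch is resolved atomically against the target table, rather than as separate insert, update and delete jobs.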
With Attunity Compose for Hive, database administrators and data architects gain:
- Simple, universal and real-time data ingestion and landing
- Comprehensive automation
- Continuous, non-disruptive data store updates
- Transaction consistency
- Improved operational visibility
- Accelerate data warehousing projects
- Reduce risk and ensure consistency
- Improve business impact of analytics
- Reduce time, cost and development requirements
- Accelerate Data Lake operational readiness
- Reduce development time
- Reduce reliance on Hadoop skills
- Provide standard SQL access for data architects
Compose is integrated with Attunity's award-winning Replicate technology to provide high-speed data loading with change data capture (CDC) for real-time analytics. Through this integration, you can load data from an easy-to-use graphical interface without installing any agents on the source systems.
This integration also provides broad support for heterogeneous sources, including relational databases, mainframes and Hadoop.
"We cut implementation costs by 80%, generated ETL code in 1/20 the time normally required, and rolled out a new DW in 3 months rather than a year. We can now update monthly rather than twice annually."
"Using this solution, we have saved and will continue to save significant time and resources, enabling us to re-focus on higher-value initiatives that directly and positively affect our bottom line."
"We greatly appreciate the automation and agility it lends our development process, including version control with rollback capabilities. As a growing insurance company, we are managing incredible volumes of data. Being able to analyse it more quickly gives us a competitive edge, while the automation saves us months of costly development time and labour."
"Using Attunity, we were able to create our strategic analytical platform, insights analytics, which allows us to make important operational decisions that benefit our staff and students."
"Adding Compose to our EDW environment versus our prior efforts is like comparing apples to oranges. We went from taking 10-14 days to build a data mart down to just a single day."