By Robert Eve
March 12, 2012 01:45 PM EDT
Organizations and decision processes are changing.
Business intelligence has to adapt.
These dynamic, new reporting and analytical needs can be summarized in a single word: agility.
BI Today Is Too Slow
According to the TDWI Benchmark Report: Organizational and Performance Metrics for Business Intelligence Teams, on average it takes eight weeks to add a new data source and another seven to develop a new complex report.
- How can BI become more agile?
- Will agility come from new BI tools?
- Or will the answer come from new approaches to data integration?
In a recent white paper, "Data Virtualization for Business Intelligence Agility", Rick F. van der Lans, a leading authority on data virtualization and the author of the upcoming book "Data Virtualization in Business Intelligence Architectures: Revolutionizing Data Integration for Data Warehouses," states: "The biggest challenge facing the industry today is how to develop systems that have an agility level that matches the speed with which the business evolves.
If the industry fails to achieve this, current business intelligence systems will slowly become obsolete and will weaken the organization's decision-making strength. Agility is becoming a crucial property of each and every BI system, and data virtualization is key to achieving it."
Look to the Data, not to the BI Tools
Van der Lans advises IT organizations to think broadly about their BI agility challenge. "It's not simple to pinpoint why most of the current business intelligence systems are not that agile. It's not one aspect that makes them static. But undoubtedly one of the dominant reasons is the database-centric solution that forms the heart of so many business intelligence systems."
Long Chain of Data Stores Reduces Agility
The architectures of most business intelligence systems are based on a complex chain of data stores starting with production databases, data staging areas, a data warehouse, dependent data marts, and personal data stores. Simply maintaining this complexity is overwhelming IT today. In addition to van der Lans' description of this chain in the white paper, he also addresses the same subject in a video chalk talk.
According to van der Lans, "These classic systems have served business well for the last twenty years. However, considering the need for more agility, they have some disadvantages:
- Duplication of data
- Non-shared meta data specifications
- Limited flexibility
- Decrease of data quality
- Limited support for operational reporting
- Limited support for reporting on unstructured and external data"
From a different point of view, SOA World's Zettabytes of Data and Beyond describes the challenges of force-fitting development methods that were appropriate for earlier times when less data complexity was the norm.
In addition, the proliferation of fit-for-purpose data stores, including data warehouse appliances, Hadoop-based file systems, and a range of NoSQL data stores, is breaking the hegemony of the traditional data warehouse as the "best" solution to the enterprise-level data integration problem. The business and IT impact of these new approaches can be explored in the Virtualization Magazine article NoSQL and Data Virtualization - Soon to Be Best Friends.
Data Virtualization Products Provide Agility
Rick van der Lans' white paper discusses how data virtualization products can help to make business intelligence (BI) systems more agile. According to van der Lans, data virtualization products simplify and thus improve BI time-to-solution through capabilities such as:
- Unified data access
- Data store independence
- Centralized data integration
- Transformation and cleansing
- Consistent reporting results
- Data language translation
- Minimal data store interference
- Simplified data structures
- Efficient distributed data access
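To make ideas like "unified data access" and "centralized data integration" concrete, here is a minimal, hypothetical sketch (not any vendor's actual product) of what a virtual view does: it joins data from two heterogeneous sources at query time, rather than copying everything into a warehouse first. An in-memory SQLite table stands in for a production database, and a plain dictionary stands in for a document-style NoSQL store; all names are illustrative assumptions.

```python
import sqlite3

# Source 1: a relational "production database" (in-memory SQLite stands in for it).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, 101, 250.0), (2, 102, 75.5), (3, 101, 120.0)])

# Source 2: a document-style store (a plain dict stands in for a NoSQL source).
customers = {101: {"name": "Acme Corp", "region": "EMEA"},
             102: {"name": "Globex", "region": "APAC"}}

def customer_order_view():
    """Virtual view: federates both sources at query time. No data is
    duplicated into a separate store, so source changes are visible
    immediately - one reason chains of physical data stores reduce agility."""
    rows = db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")
    for customer_id, total in rows:
        doc = customers.get(customer_id, {})
        yield {"customer": doc.get("name", "unknown"),
               "region": doc.get("region", "unknown"),
               "total_spend": total}

for row in customer_order_view():
    print(row)
```

The key design point is that the view is a definition, not a copy: consumers query one logical interface while the integration logic stays centralized in a single place.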
Virtualization Magazine's How Data Virtualization Improves Business Agility - Part 2 addresses data virtualization product capabilities that fulfill business information needs far faster than traditional methods using a streamlined, iterative approach that evolves easily.
How Data Virtualization Improves Business Agility - Part 3 provides additional insight, with particular focus on how data virtualization products provide superior developer productivity, lower infrastructure costs and better optimization of existing data integration solutions.
Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility profiles ten large enterprises that are using data virtualization products to accelerate IT responsiveness to new business needs.
The financial benefits of these data virtualization capabilities are significant. And they can be applied flexibly to fund additional data integration activities and/or other business and IT projects.
Learn More about Data Virtualization
Rick van der Lans will headline the March 20 broadcast of The Briefing Room discussing "Stay Flexible: Five Ways to Harness Information Assets" at 4pm ET. The Briefing Room, hosted by The Bloor Group's Eric Kavanagh, is a live interactive forum which allows vendors to give a detailed technical presentation to a respected analyst who asks questions and provides industry insights. During the broadcast, van der Lans will explain how data virtualization can solve many data management problems before they occur. I will participate as well to discuss how Composite Software's data virtualization products provide the critical agility businesses are seeking.
Go here to register for the March 20 Briefing Room broadcast.
- Five Ways Data Virtualization Improves Data Warehousing
- Data Virtualization at Pfizer: A Case Study
- Data Virtualization Technology Advancements Deliver New Value
- Data Virtualization Adoption Propelled by Significant Business Benefits
- Why Bother to Abstract Your Data?
- Extend MDM with Data Virtualization
- Will Data Virtualization Work for Me?
- It’s Here! The First Book on Data Virtualization
- How to Evaluate a Data Virtualization Platform
- Roadmap for Data Virtualization