Rick van der Lans is data virtualization’s leading independent analyst. So when he writes a new white paper, any enterprise struggling to connect all of its data (which is pretty much every enterprise) would be wise to check it out.
Rick’s latest is Data Vault and Data Virtualization: Double Agility. In a nutshell, the paper addresses how enterprises can craftily combine the Data Vault approach to modeling enterprise data warehouses with the data virtualization approach for connecting and delivering data. The result is what Rick calls double agility, as each approach accelerates time to solution in complex data environments.
Data Vault Pros and Cons
Adding new data sources such as big data and cloud to an existing data warehouse is difficult. The Data Vault approach provides the extensibility required. This is the first agility.
Unfortunately, from a query and reporting point of view, developing reports directly against a Data Vault-based data warehouse results in complex SQL statements that almost always lead to poor reporting performance. The reason is that Data Vault models distribute data across a large number of tables.
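To see why, consider a minimal, hypothetical Data Vault sketch (the table and column names below are illustrative, not from Rick’s paper): business keys live in hubs, relationships in links, and descriptive attributes in satellites, so even a trivial report must stitch several tables back together.

```python
import sqlite3

# Hypothetical miniature Data Vault schema: hubs hold business keys,
# links hold relationships, satellites hold descriptive attributes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (customer_hk INTEGER PRIMARY KEY, customer_id TEXT);
CREATE TABLE sat_customer (customer_hk INTEGER, name TEXT, load_date TEXT);
CREATE TABLE hub_order    (order_hk INTEGER PRIMARY KEY, order_id TEXT);
CREATE TABLE sat_order    (order_hk INTEGER, amount REAL, load_date TEXT);
CREATE TABLE link_customer_order (customer_hk INTEGER, order_hk INTEGER);

INSERT INTO hub_customer VALUES (1, 'C001');
INSERT INTO sat_customer VALUES (1, 'Acme Corp', '2014-01-01');
INSERT INTO hub_order    VALUES (10, 'O900');
INSERT INTO sat_order    VALUES (10, 250.0, '2014-01-02');
INSERT INTO link_customer_order VALUES (1, 10);
""")

# Even a trivial "revenue per customer" report needs a five-table join;
# in a real Data Vault the join count grows with every hub and satellite.
rows = conn.execute("""
    SELECT sc.name, SUM(so.amount) AS revenue
    FROM hub_customer hc
    JOIN sat_customer sc       ON sc.customer_hk = hc.customer_hk
    JOIN link_customer_order l ON l.customer_hk  = hc.customer_hk
    JOIN hub_order ho          ON ho.order_hk    = l.order_hk
    JOIN sat_order so          ON so.order_hk    = ho.order_hk
    GROUP BY sc.name
""").fetchall()
print(rows)  # [('Acme Corp', 250.0)]
```

A report that would be a two-table join in a dimensional model becomes a five-table join here, and optimizers struggle as the table count climbs.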
Losing Agility Due to Data Mart Proliferation
To solve the performance problems with Data Vault, many enterprises have built physical data marts that reorganize the data for faster queries.
Unfortunately, valuable time must be spent designing, optimizing, loading, and managing all these data marts. And any new extension to the enterprise data warehouse must be re-implemented across the impacted marts.
Data Virtualization Returns the Agility
To avoid the data mart workload, yet retain agile warehouse extensibility, Rick has worked with Netherlands-based system integrator Centennium and Cisco to provide a better, double-agility alternative.
In this new solution, Cisco Data Virtualization, together with a Centennium-defined data modeling technique called SuperNova, replaces all the physical data marts. So no valuable time has to be spent designing, optimizing, loading, managing, and updating these derived data marts. Data warehouse extensibility is retained, and because the reporting models are virtual rather than physical, they are far easier to create and maintain.
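The core idea can be sketched in miniature with an ordinary database view (the schema below is hypothetical, and Cisco Data Virtualization with SuperNova operates across heterogeneous sources rather than a single SQLite file): instead of loading a physical mart table, a virtual mart is just a definition over the Data Vault joins, so extending it means editing the definition rather than rebuilding ETL jobs.

```python
import sqlite3

# A minimal sketch of a "virtual data mart": reports query a view
# that flattens the Data Vault joins, so nothing is physically loaded.
# (Hypothetical table and column names, for illustration only.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_product (product_hk INTEGER PRIMARY KEY, product_id TEXT);
CREATE TABLE sat_product (product_hk INTEGER, name TEXT, price REAL);

INSERT INTO hub_product VALUES (1, 'P100');
INSERT INTO sat_product VALUES (1, 'Router', 499.0);

-- The virtual mart: no loading, no refresh jobs, just a definition.
CREATE VIEW dm_product AS
SELECT hp.product_id, sp.name, sp.price
FROM hub_product hp
JOIN sat_product sp ON sp.product_hk = hp.product_hk;
""")

# Reports see a simple, flat structure and never touch the vault tables.
rows = conn.execute("SELECT name, price FROM dm_product").fetchall()
print(rows)  # [('Router', 499.0)]
```

When the warehouse gains a new satellite or source, only the view definition changes; no mart data has to be reloaded.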
Meet Rick van der Lans at Data Virtualization Day
To learn more about this innovative solution as well as data virtualization in general, come to Data Virtualization Day 2014 in New York City on October 1. Rick, along with the equally sharp Barry Devlin, will join me on stage for the Analyst Roundtable. I hope to see you there.
To learn more about Cisco Data Virtualization, check out our page.
Evolutionary biologists talk about features that suddenly seem to burst forth and enable stunning new capabilities for life forms in the natural world. Eyes. Legs. Flight. And right now service providers have at their fingertips many new operational features that can help them become more agile purveyors of better, faster, and cheaper services. It’s no exaggeration to look at Software-defined Networking (SDN), Network Functions Virtualization (NFV), and other new, evolving technology approaches as part of a groundbreaking, evolutionary leap forward.
Service providers are not alone in embracing solutions that provide greater service agility. A 2013 Gartner study that asked enterprises to list their primary driver for cloud services found that 66% of enterprises ranked agility and service velocity above cost savings and other factors.
Steve Jobs is arguably the most amazing innovator of our times. I recently read some of his thoughts on innovation. His statement “Innovation distinguishes between a leader and a follower” caused me to reflect upon my eight-year association with data virtualization and consider who in the IT analyst community have been the innovative leaders.
The roll call of top analysts doing innovative work continues with Noel Yuhanna of Forrester, who wrote the analyst community’s first research paper on data virtualization in January 2006, Information Fabric: Enterprise Data Virtualization.
Gartner’s Ted Friedman and Mark A. Beyer, and more recently Merv Adrian, Roxane Edjlali, Mei Selvage, Svetlana Sicular and Eric Thoo, have been both descriptive and prescriptive about the use of data virtualization as a data integration delivery method, a data service enabler and a key component in what Gartner calls the Logical Data Warehouse.
Further, myriad other analysts have made amazing contributions.
The learned trio of Dr. Barry Devlin, Dr. Robin Bloor, and Dr. Richard Hackathorn have pushed the art of the possible.
Analyst/practitioners such as Jill Dyche, Mike Ferguson, Rick Sherman, Steve Dine, Evan Levy, David Loshin and William McKnight have, via their hands-on client work, “kept data virtualization grounded on reality street,” to quote Mike Ferguson.
And let’s not forget the Massachusetts’ Waynes — Wayne Eckerson formerly of TDWI and Wayne Kernochan, author of the eponymous Thoughts From a Software IT Analyst blog. Their voices and insights have proven invaluable.
To quote Gene Roddenberry, “It isn’t all over; everything has not been invented; the human adventure is just beginning.” The same is true for data virtualization. So I look forward to more great insights from these innovators, as well as a new generation led by Puni Rajah of Canalys and Vernon Turner of IDC.
To see Rick van der Lans and Barry Devlin on stage and gain even more insights from the 2014 Data Virtualization Leadership Award winners, join us at Data Virtualization Day 2014 on October 1 in New York City.
Watch for a sneak peek of Data Virtualization Day 2014.
At F5 Agility 2014 Copenhagen this month, applications take center stage. The key focus area is Application Delivery Controllers (ADCs, as Gartner calls them) and how they are becoming increasingly important to modern IT, enabling scale, availability, orchestration, and provisioning.
Another key focus area is the deployment of applications and how joint technology solutions present a tremendously powerful option for F5’s customers. F5’s partners, Cisco key among them, are a large part of Agility. Cisco is at the event to demonstrate how its ACI technology integrates with F5 BIG-IP to improve manageability, strengthen security, and ensure faster, more successful deployments.
We have quite a few exciting things to showcase at F5 Agility from a Cisco ACI perspective, and in this blog I want to take you on a quick tour of the highlights. As a testament to the growing momentum of the joint Cisco ACI-F5 solution effort, we have demos as well as business and technical breakout sessions. The event features F5 CEO John McAdam’s keynote on Tuesday, June 17, where he will … discuss how applications are impacting the architecture of the data center and driving IT strategy and alignment to key business drivers.
Cisco Marketing Manager Ravi Balakrishnan is presenting business breakout session 4, titled “Unleash the power of Cisco ACI and F5 Synthesis for Accelerated Application deployments,” on Wednesday, June 18. This session gives a detailed overview of the benefits of our joint solution and the customer pain points it addresses, so do not miss it. I’d also encourage you to attend Paolo Pio and Nicolas Menant’s technical breakout session 3 on Wednesday, June 18, where they will walk you through the details of the joint ACI/VMDC solution.
We also have an exciting demo at the Cisco ACI kiosk, showing how the joint Cisco ACI and F5 BIG-IP solution works, with step-by-step illustrations of configuration, deployment, and execution. The demos run all day Tuesday and Wednesday in the Exhibit Hall. Stop by our kiosk for deep-dive architecture whiteboarding and brainstorming with Cisco subject matter experts on the integrated ACI-F5 solution. Cisco and F5 are also working together in several other solution areas, including F5 LTM-Cisco Nexus 7000 integration and F5 LTM-Sourcefire NGIPS integration, so come by our demo booth to learn more.
Written by Tom Davies, Technical Solutions Architect
The world can be a tough place for a service provider in today’s marketplace. Revenue per customer is declining and margins are shrinking. Over-the-top players are delivering services at an ever-increasing pace, and systems integrators are positioning themselves as end-to-end providers of complex services. It is a potent mix that is negatively affecting the top-line growth of many providers.
Having discussed these key business issues with numerous service providers, I cannot doubt their desire and will to offer new and innovative services on cutting-edge technology. Unfortunately, service providers have large legacy networks to consider, with a multitude of Element Management System, Network Management System, and Operational Support System stacks, which tend to be customised to manage their specific networks and the services that reside on them. These stacks can present a bottleneck to offering new services and adopting new, disruptive technologies, in terms of time to market, capital cost, and the operational expenditure to deploy and manage them.
Service providers find themselves with reduced top-line growth opportunity and stifled capability.