Cisco Blogs


Cisco Blog > Data Center and Cloud

New Research Identifies How to Accelerate Data Virtualization Adoption

The challenges of data management are getting exponentially harder. Ever-increasing data quantity, diversity and distribution are revolutionizing data management, opening the door for new solutions such as data virtualization.

Data virtualization fulfills a range of business demands for data and supports rapid iteration with fast response times, all while saving significant IT costs.

Radiant Advisors is a leading strategic research and advisory firm that helps transform today’s organizations into tomorrow’s data-driven industry leaders. Radiant recently teamed with Cisco to better understand the barriers faced by companies considering data virtualization, and to see how companies that have already adopted it overcame those barriers.

The Data Virtualization Adoption Dilemma

Data virtualization is an advanced technology, and nearly every major research and consulting firm provides architectural blueprints that include a logical business semantic layer.

Yet only some companies have adopted it, leaving data virtualization an often-missed opportunity for business and IT to keep pace with today’s volatile data landscape.

To achieve the benefits of data virtualization, companies need to take the leap.

Unique Research Formula

To better understand data virtualization adoption barriers, Radiant Advisors’ Research Director, Lindy Ryan, interviewed a broad community of pre-adopters to understand their perceptions and concerns. These top-of-mind issues were then posed to companies that have already adopted data virtualization, who anonymously shared the insights, best practices, and lessons they learned in overcoming adoption barriers.

The resulting research report, Overcoming Barriers to Data Virtualization Adoption, consolidates these findings into clear counsel intended to guide potential adopters past perceived barriers.

Read The Research

Download Overcoming Barriers to Data Virtualization Adoption here.

 

Learn More

To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt.


Analytics for an IoE World

I recently wrote about how Cisco is helping customers more effectively manage massive amounts of data, more types of data and unprecedented distribution of data. This will be one of the toughest challenges brought on by the Internet of Everything (IoE) and, with solutions such as Data Virtualization and Big Data Warehouse Expansion, Cisco is enabling our customers to meet head on the challenge of bringing all of this data together in ways that are meaningful to business users.

Once the business can access and view all of this data, however, the question becomes: now what? The next challenge is to extract insights from the data to make better business decisions. After all, more data is only good if you use it to make better decisions than you would have made otherwise.

The rules of customer and business relationships are constantly changing due to technological innovation and shifting consumption patterns. Analytics can reveal patterns in customer data that affect business processes and outcomes. Advanced analytics differs from reporting because it prescribes what to do, or predicts what is likely to happen, instead of just reporting what has already happened.

Utilizing the network to securely connect data throughout the IoE, whether in motion (streaming) or at rest (historical), is the future of advanced analytics. For retailers, it creates the opportunity to take intelligent actions that engage customers directly at the point of purchase and in real time. But it’s so much more than that. What can real-time analytics in retail tell us about how to serve customers more effectively? What can real-time analytics in manufacturing tell us about how to make the workplace safer? What can real-time analytics in healthcare tell us about how to better treat cancer patients?

When our customers can accurately predict outcomes by combining years of historical data with real-time information, they can drive better decisions…better outcomes.

 

Learn More

Interested in hearing how Cisco is paving the way to the future of analytics? Please join us for a webcast at 9 AM Pacific time on October 21st entitled ‘Unlock Your Competitive Edge with Cisco Big Data and Analytics Solutions.’ #UnlockBigData


To learn more about Data and Analytics, check out our page.

Join the Conversation

Follow us @CiscoDataVirt #UnlockBigData.


Aligning Solutions to Meet our Customers’ Data Challenges

In our previous big data blogs, my Cisco associates have focused on the topic of building the best infrastructure for long-term success with big data. I’d like to start a new chapter in the series, focusing on building the right data strategy and analytics solutions.

Today, people, process, data and things function together through a combination of machine-to-machine, person-to-machine and person-to-person connections. We call this the Internet of Everything (IoE). While the IoE is making us all smarter, it is also creating more data, more types of data and in more places.

This wealth of data comes with major challenges but also holds the potential for amazing opportunities. At Cisco, we’re all about helping our customers turn these challenges into opportunities. The first step begins with proper management of the massive amounts and types of data in multiple locations. From a solutions perspective, that first step is our agile data integration software, Cisco Data Virtualization. It abstracts the data users need from multiple different sources and brings it together to give users a unified, friendly view of the data.
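The “unified view” idea can be pictured as one query interface fronting several back-end sources. Below is a minimal, hypothetical sketch in Python; the `VirtualView` class, source names and record fields are illustrations of the concept, not Cisco Data Virtualization APIs:

```python
# Conceptual sketch of data virtualization: one query interface in
# front of several heterogeneous sources. All names here are
# hypothetical illustrations, not Cisco APIs.

class VirtualView:
    """Presents rows from many backing sources as one logical table."""

    def __init__(self, sources):
        self.sources = sources  # name -> callable returning iterable of dicts

    def query(self, predicate=lambda row: True):
        # Federate: pull matching rows from every source, tagging origin.
        for name, fetch in self.sources.items():
            for row in fetch():
                if predicate(row):
                    yield {**row, "_source": name}

# Two mock back ends standing in for a warehouse and a Hadoop store.
warehouse = lambda: [{"customer": "acme", "region": "east"}]
hadoop = lambda: [{"customer": "globex", "region": "west"}]

view = VirtualView({"edw": warehouse, "hadoop": hadoop})
rows = list(view.query())
```

The consumer queries `view` alone and never needs to know which physical system each row came from, which is the essence of the unified, friendly view described above.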


By leveraging this technology with additional solutions, our customers can access data across the IoE and use that data to respond quickly to change, make better decisions and gain a competitive advantage. Driven by the massive amounts of data in today’s IT environment, customers are facing huge expenses to add capacity to their existing enterprise data warehouses (EDW), the place where data is traditionally stored.

We help customers tackle the challenge of increasing enterprise data warehouse costs with Cisco Big Data Warehouse Expansion (BDWE). BDWE identifies infrequently used data and provides a methodology and tools to offload the data onto Hadoop, avoiding additional capacity costs and extending the life of the data warehouse.
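Conceptually, the identification step boils down to classifying data by access recency and flagging the cold portion for offload. Here is a hedged sketch of that triage; the 90-day threshold and record layout are assumptions for illustration, not BDWE’s actual methodology:

```python
from datetime import date, timedelta

# Illustrative triage: records not touched within the threshold are
# candidates to offload from the warehouse to Hadoop. The 90-day
# cutoff and record shape are assumptions, not Cisco's methodology.
OFFLOAD_AFTER = timedelta(days=90)

def partition_by_access(records, today):
    """Split records into (keep-in-EDW, offload-to-Hadoop) lists."""
    keep, offload = [], []
    for rec in records:
        dest = offload if today - rec["last_access"] > OFFLOAD_AFTER else keep
        dest.append(rec)
    return keep, offload

records = [
    {"id": 1, "last_access": date(2014, 9, 1)},   # recent: stays in EDW
    {"id": 2, "last_access": date(2013, 1, 15)},  # cold: offload candidate
]
keep, offload = partition_by_access(records, today=date(2014, 10, 1))
```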

I spoke with a customer recently who shared that one terabyte (TB) of data in an EDW costs $100,000 per year to maintain. The same amount of data for the same amount of time in Hadoop costs only $1,000 to maintain. This is a significant difference. By implementing an ongoing strategy to offload data from the primary system to Hadoop, our solution frees up resources to be utilized in more strategic ways. Additionally, we deploy Data Virtualization to act as a ‘virtual database’ that can access data regardless of whether it resides in the original warehouse or the new Hadoop data store. So not only does BDWE significantly lower costs, but the historical data remains easily accessible.
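Taking the customer’s per-terabyte figures at face value, the annual savings from offloading are easy to compute:

```python
# Back-of-the-envelope savings using the figures quoted above:
# $100,000 per TB per year in the EDW versus $1,000 per TB per year
# in Hadoop (a roughly 100x difference).
EDW_COST_PER_TB = 100_000
HADOOP_COST_PER_TB = 1_000

def annual_savings(offloaded_tb):
    """Yearly savings from moving `offloaded_tb` of cold data to Hadoop."""
    return offloaded_tb * (EDW_COST_PER_TB - HADOOP_COST_PER_TB)

savings = annual_savings(10)  # offloading 10 TB saves $990,000 per year
```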


Our customers gain the business insights and outcomes they seek with a complete suite of software, hardware and services solutions that access and analyze data, no matter where it is stored on the network. After all, the power of data is not just in the ability to access it but to use it to change behavior or the way you run your business.

Not only do we connect more people, processes, data, and things than any other company, we can also bring analytics to data wherever it is—no matter how remote—to turn information into insights almost instantly. More to come in my next blog about Cisco’s analytics portfolio and how it’s helping tackle the next major IoE challenge: extracting valuable insights from your data.

 

Learn More

To learn more about the benefits of Cisco analytics solutions and the power of our integrated infrastructure for big data, please join us for a webcast at 9 AM Pacific time on October 21st entitled ‘Unlock Your Competitive Edge with Cisco Big Data and Analytics Solutions.’ #UnlockBigData


To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt #UnlockBigData.


Innovation Distinguishes Between a Leader and a Follower

Steve Jobs is arguably the most amazing innovator of our times.  I recently read some of his thoughts on innovation. His statement “Innovation distinguishes between a leader and a follower,” caused me to reflect upon my eight-year association with data virtualization, and consider who in the IT analyst community have been the innovative leaders.

Since 2006, I have worked with over one hundred IT analysts to define and advance the data virtualization market.  I even teamed up with one, Judith Davis, to co-author the first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

Others such as Rick van der Lans, author of data virtualization’s second book, Data Virtualization for Business Intelligence Systems: Revolutionizing Data Integration for Data Warehouses and the seminal article, The Network is the Database, have contributed mightily to the market’s understanding of data virtualization’s capabilities, advantages and benefits.

The roll call of top analysts doing innovative work continues with Noel Yuhanna of Forrester, who wrote the analyst community’s first research paper on data virtualization in January 2006, Information Fabric: Enterprise Data Virtualization.

Gartner’s Ted Friedman and Mark A. Beyer, and more recently Merv Adrian, Roxane Edjlali, Mei Selvage, Svetlana Sicular and Eric Thoo, have been both descriptive and prescriptive about the use of data virtualization as a data integration delivery method, a data service enabler and a key component in what Gartner calls the Logical Data Warehouse.

Dave Wells, author of TDWI’s data virtualization course, Data Virtualization: Solving Complex Data Integration Challenges, helped bring data virtualization into the mainstream. So did Boulder BI Brain Trust members Claudia Imhoff, Colin White, John O’Brien, Ralph Hughes, John Myers and more, whom I recently wrote about in Rocky Mountains High On Data Virtualization.

Further, myriad other analysts have made amazing contributions.

  • The learned trio of Dr. Barry Devlin, Dr. Robin Bloor, and Dr. Richard Hackathorn have pushed the art of the possible.
  • While analyst / practitioners such as Jill Dyche, Mike Ferguson, Rick Sherman, Steve Dine, Evan Levy, David Loshin and William McKnight, via their hands-on client work, have “kept data virtualization grounded on reality street,” to quote Mike Ferguson.
  • And let’s not forget the Massachusetts’ Waynes — Wayne Eckerson formerly of TDWI and Wayne Kernochan, author of the eponymous Thoughts From a Software IT Analyst blog.  Their voices and insights have proven invaluable.

To quote Gene Roddenberry, “It isn’t all over; everything has not been invented; the human adventure is just beginning.” The same is true for data virtualization. So I look forward to more great insights from these innovators, as well as a new generation led by Puni Rajah of Canalys and Vernon Turner of IDC.

To see Rick van der Lans and Barry Devlin on stage and gain even more insights from the 2014 Data Virtualization Leadership Award winners, join us at Data Virtualization Day 2014 on October 1 in New York City.

Watch for a sneak peek of Data Virtualization Day 2014.

Learn More

To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt #DVDNYC


Active Archiving with Big Data

Historical data is now an essential tool for businesses as they struggle to meet increasingly stringent regulatory requirements, manage risk and perform predictive analytics that help improve business outcomes. While recent data is readily accessible in operational systems and some summarized historical data is available in the data warehouse, the traditional practice of archiving older, detail-level data on tape makes analysis of that data challenging, if not impossible.

Active Archiving Uses Hadoop Instead of Tape

What if the historical data on tape were loaded into a similarly low-cost, yet accessible, storage option such as Hadoop? And what if data virtualization were then applied to access and combine this data with the operational and data warehouse data, in essence intelligently partitioning data access across hot, warm and cold storage tiers? Would it work?
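One way to picture this intelligent partitioning is as a routing rule that sends each slice of history to the tier holding it. The tier boundaries below (30 days hot, one year warm) are illustrative assumptions, not any customer’s actual configuration:

```python
from datetime import date, timedelta

# Sketch of hot/warm/cold data routing: pick the storage tier that
# holds a given slice of history based on its age. The window sizes
# are illustrative assumptions for this example only.
HOT_WINDOW = timedelta(days=30)
WARM_WINDOW = timedelta(days=365)

def tier_for(record_date, today):
    """Return the storage tier expected to hold data from record_date."""
    age = today - record_date
    if age <= HOT_WINDOW:
        return "operational"      # hot: live systems
    if age <= WARM_WINDOW:
        return "data_warehouse"   # warm: summarized history
    return "hadoop"               # cold: detail-level archive

today = date(2014, 10, 1)
tiers = [tier_for(d, today)
         for d in (date(2014, 9, 20), date(2014, 3, 1), date(2010, 6, 1))]
```

A data virtualization layer applies this kind of rule behind a single query interface, so analysts never have to know which tier a given date range lives in.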

Yes, it would! In fact, it does every day at one of our largest global banking customers. Here’s how:

Adding Historical Data Reduces Risk

The bank uses complex analytics to measure risk exposure in its fixed income trading business by industry, region, credit rating and other parameters. To reduce risk while making more profitable credit and bond derivative trading decisions, the bank wanted to identify risk trends using five years of fixed income market data rather than the one month (400 million records) it currently stored online. This longer time frame would allow the bank to better evaluate trends and use that information to build a solid foundation for smarter, lower-risk trading decisions.

As a first step, the bank installed Hadoop and loaded five years of historical data that had previously been archived on tape. Next, it installed Cisco Data Virtualization to integrate the data sets, providing a common SQL access approach that made it easy for analysts to combine the data. Third, the analysts extended their risk management analytics to cover five years. Up and running in just a few months, the bank was able to use this long-term data to better manage fixed income trading risk.


To learn more about Cisco Data Virtualization, check out our Data Virtualization Video Portal.
