In this video, Cisco Distinguished IT Engineer Jon Woolwine and I discuss Cisco IT’s approach to Network Programmability and SDN, describing some SDN-related use case solutions currently in development.
As part of our ongoing Partner Voices blog series, we had the opportunity to hear from MCPc. During the past 11 years, MCPc has bet long on Cisco, using networking, switching, telepresence, and digital media tools within its own business and to support the businesses of its clients.
For example, since the beginning of 2013, MCPc has used Cisco Telepresence internally for more than 4,320 hours of cumulative communication. That is the equivalent of 180 full days of time. Most MCPc associates have Jabber on their mobile devices, and their local media is paying attention to the ways in which MCPc has implemented Cisco throughout the company. But MCPc does more than just make its own employees’ travel schedules easier – it has enabled clients to take advantage of Cisco’s full breadth of offerings.
Following part two of our Big Data in Security series on University of California, Berkeley’s AMPLab stack, I caught up with talented data scientists Michael Howe and Preetham Raghunanda to discuss their exciting graph analytics work.
Where did graph databases originate and what problems are they trying to solve?
Michael: Disparate data types have a lot of connections between them and not just the types of connections that have been well represented in relational databases. The actual graph database technology is fairly nascent, really becoming prominent in the last decade. It’s been driven by the cheaper costs of storage and computational capacity and especially the rise of Big Data.
There have been a number of players driving development in this market, specifically research communities and businesses like Google, Facebook, and Twitter. These organizations are looking at large volumes of data with lots of inter-related attributes from multiple sources. They need to be able to view their data in a much cleaner fashion so that the people analyzing it don’t need to have in-depth knowledge of the storage technology or every particular aspect of the data. There are a number of open source and proprietary graph database solutions to address these growing needs and the field continues to grow.
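The kind of multi-hop relationship query described above is exactly what graph databases make natural. As a minimal sketch (the entities, edges, and `related` helper below are hypothetical illustrations, not any particular product’s API), connected data can be modeled as an adjacency list and explored with a breadth-first traversal – a query that would require chained joins in a relational schema:

```python
from collections import deque

# Hypothetical graph of inter-related entities from multiple sources:
# edges are observed relationships between them.
edges = [
    ("host-a", "domain-x"),   # host-a contacted domain-x
    ("domain-x", "ip-1"),     # domain-x resolved to ip-1
    ("ip-1", "host-b"),       # ip-1 was also contacted by host-b
]

# Build an undirected adjacency list.
graph = {}
for src, dst in edges:
    graph.setdefault(src, set()).add(dst)
    graph.setdefault(dst, set()).add(src)

def related(start, max_hops):
    """Breadth-first search: every entity reachable within max_hops."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, hops + 1))
    return seen - {start}

print(sorted(related("host-a", 3)))
```

An analyst asking “what is within three hops of host-a?” never needs to know how the edges are stored – which is the cleaner view of the data the interview describes.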
What’s the problem with Big Data? You guessed right — it’s BIG.
Big Data empowers organizations to discern patterns that were once invisible, leading to breakthrough ideas and transformed business performance. But there is simply so much of it, and from such myriad sources — customers, competitors, mobile, social, web, transactional, operational, internal, external, structured, and unstructured — that, for many organizations, Big Data is overwhelming. The torrents of data will only increase as the Internet of Everything spreads its ever-expanding wave of connectivity, from 10 billion connected things today to 50 billion in 2020.
So, how can organizations learn to use all of that data?
The key lies not in simply having access to enormous data streams. Information must be filtered for crucial, actionable insights, and presented to the right people in a visualized, comprehensible form. Only then will Big Data transform business strategies and decisions. In effect, Big Data must be made small.
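“Making Big Data small” is, at its core, an aggregation step: collapsing a torrent of raw records into a compact summary a decision-maker can act on. A minimal sketch, using a made-up event stream (the sources and values below are illustrative, not real data):

```python
from collections import Counter

# Hypothetical raw event stream: (source, metric_value) pairs standing in
# for the many transactional and operational records described above.
events = [
    ("web", 120), ("mobile", 80), ("web", 95), ("social", 40),
    ("mobile", 60), ("web", 110),
]

# Reduce the stream to a small, presentable summary:
# event count and metric total per source.
counts, totals = Counter(), Counter()
for source, value in events:
    counts[source] += 1
    totals[source] += value

summary = {s: {"events": counts[s], "total": totals[s]} for s in counts}
print(summary)
```

The summary, not the raw stream, is what gets visualized and put in front of the right people.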
However, as McKinsey & Co. reported, many organizations don’t have enough data scientists, much less ones who understand the business well enough to draw conclusions. The trick is to get the scientists together with the experts who understand the business levers driving the organization. Put them in a room with the right tools, and watch the synergy fly.
But what sort of a room?
Competing with the virtual, e-commerce world is becoming increasingly challenging for real-world businesses. Traditional retailers have long envied the massive amounts of valuable data that online retailers have available to help them better understand customer behavior and implement winning marketing tactics. Online retailers know valuable information such as how frequently customers return, how long they spend on their sites, what the customers looked at but didn’t buy, and where they went before and after coming to the site. Businesses as diverse as hotels, banks, stadiums, airports, and large public venues are all looking for ways to get similar detailed data on customer activities in their facilities, so they can improve the customer experience and their bottom lines. The data and insights have not been available to bricks-and-mortar facilities, until now.
That situation is changing through the growing availability of Wi-Fi in business locations. Many retailers, hotels, and other businesses are increasingly offering Wi-Fi as a service that allows their customers to connect mobile devices to the Internet. Hidden in this valuable service is a vast amount of information and insight, which retailers and others can use to deliver tangible value to their bottom lines. Highly precise location information, device details, identification of returning customers, and sophisticated path analysis are just some of the customer data captured by Wi-Fi networks. Businesses are now realizing that this data and these capabilities offer new ways to improve the customer experience and support a range of market-leading monetization models.
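One of the simplest insights mentioned above – identifying returning customers – can be sketched from Wi-Fi sighting data. The log format, device identifiers, and dates below are entirely hypothetical (a real deployment would use anonymized identifiers supplied by the wireless infrastructure):

```python
from datetime import date

# Hypothetical Wi-Fi sighting log: (anonymized_device_id, visit_date).
sightings = [
    ("dev-1", date(2014, 1, 6)),
    ("dev-2", date(2014, 1, 6)),
    ("dev-1", date(2014, 1, 13)),
    ("dev-3", date(2014, 1, 13)),
    ("dev-1", date(2014, 1, 20)),
]

# Group distinct visit dates per device, then classify devices seen on
# more than one day as returning visitors.
visits = {}
for device, day in sightings:
    visits.setdefault(device, set()).add(day)

returning = {d for d, days in visits.items() if len(days) > 1}
new = set(visits) - returning
print(f"returning: {sorted(returning)}, new: {sorted(new)}")
```

The same grouped data also supports the richer analyses mentioned above, such as visit frequency and dwell time, by attaching timestamps and locations to each sighting.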