Never before has mankind had access to such an ever-widening range of personal communication options, giving us the ability to create, disseminate and consume information immediately. The frenetic pace at which devices join the Internet is unprecedented, and the constant growth in the amount of data traversing the Web is far from peaking. This whirlwind of data surrounding us will continue to expand as more devices push and pull content across the Internet faster and faster.
Ye Olde Story of Big Data
Before the Internet began its deluge of data, the world was overwhelmed by another data explosion when, in the mid-15th century, Johannes Gutenberg invented the printing press. In the 50 years that followed, Europeans printed more books than all of the manuscripts written in the previous 950 years, prompting the great Dutch humanist Desiderius Erasmus to ask, “Is there anywhere on earth exempt from these swarms of new books?”
People of the 16th century responded to the unbridled volume and variety of printing press output with waves of innovation. With the slow output of manuscripts behind them, readers and scholars developed strategies to manage the burgeoning content, including bibliographies to catalog all the books written, advances in note taking to summarize the information learned, encyclopedias to organize information by subject and public libraries to share the expanding content.
Big Data Redux
Today, we create more data in two days than all the data produced from the dawn of civilization until 2003. That’s 5000 years of data overrun every 48 hours. Erasmus’s question is still applicable today, with a slight twist: “Is there anywhere on earth exempt from these swarms of new [content]?” Additional devices connect to the Internet daily, while content grows exponentially, which leaves me wondering what will happen when the swarms of new content overrun 5000 years of data in an hour or less?
The 21st century is also responding to its unbridled volume and variety of content. However, the proliferation of devices adds a third dimension with a timely twist: velocity. Velocity derives from the Latin word velox, meaning swift or rapid. While volume and variety describe the size and shape of data, velocity describes the rate at which data moves, and data cannot move without infrastructure. The swiftness of infrastructure (clock speed, input/output, bandwidth and latency) and the ability to rapidly provision optimal resources (network, CPU, memory and storage) both directly impact the velocity of data. When data velocity increases, the value of information rises, which lifts business performance.
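The link between infrastructure and velocity can be made concrete with a back-of-the-envelope model: the time to move a payload is roughly its propagation latency plus its size divided by bandwidth. This is a deliberate simplification that ignores protocol overhead, congestion and parallelism, but it illustrates why both latency and bandwidth matter to data velocity:

```python
def transfer_time(size_gb: float, bandwidth_gbps: float, latency_s: float) -> float:
    """Rough time to move a payload: latency plus serialization delay.

    Simplified model -- ignores protocol overhead and congestion.
    """
    size_gbits = size_gb * 8  # convert gigabytes to gigabits
    return latency_s + size_gbits / bandwidth_gbps

# Doubling or quadrupling bandwidth shrinks the serialization component,
# raising the velocity of the same data set.
slow = transfer_time(10, 10, 0.05)  # 10 GB over a 10 Gbps link
fast = transfer_time(10, 40, 0.05)  # same 10 GB over a 40 Gbps link
```

Under this sketch, the 10 GB payload takes about 8.05 seconds on the 10 Gbps link and about 2.05 seconds on the 40 Gbps link: faster infrastructure translates directly into faster data, which is the point of the velocity dimension.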
Cisco UCS and Big Data
The Unified Computing System is designed so businesses can harness the power of velocity. UCS combines network and compute with the ability to assign resources rapidly. Tens of thousands of customers confirm the benefits derived from dynamic provisioning, reduced management time and efficient data center utilization. UCS extends swift performance with the addition of solid state memory, validated by 82+ World Record benchmarks. UCS unites network, compute and flash memory within a modular, scalable and extensible architecture.
UCS’s agility means workloads can move into service quickly. Its performance enables multiple workloads to consistently operate at high velocity. It shifts effort away from configuring and tuning infrastructure and towards new application deployments and feature enhancements. With UCS, businesses can address expected and unexpected demands with equal aplomb.
The printing press rapidly spread across Europe in the 16th century, and the flood of books reshaped European societies as they transformed in response to the outpouring of content. In the Internet of Everything, our devices (which serve as both printing press and books) spread data between people and autonomous devices immediately. We attempt to synthesize data in real time as the number of people and autonomous devices communicating increases globally.
Big Data Version One emerged 500 years ago to wrestle with data volume and variety. Today, Big Data Version Two grapples with data velocity (time) in addition to wrestling with volume and variety. Timely information rules when the Internet rewards the swift and penalizes the slow. Now is the moment to master velocity. What would your business be able to do with more time? Let us know in the comments.