“Software is Eating the World” is a quote attributed to Marc Andreessen and further explored by his business partner Ben Horowitz. Andreessen gives compelling reasons to support this claim. To some extent I have to agree with some of his reasons (though I am also a little biased as a software engineer). On the other hand, when I read this (and this is partly based on having worked on software in different domains), I wonder whether software is really that disruptive. If you look “under the hood” of software applications, you find that a lot of software is built on fundamental principles that are already 20-30 years old, yet they are still frequently used (and for good reasons). That does not mean there are no new advances in software, but old and proven technologies still play an important role (as we say in mathematics, it does not become old, it becomes classic).
So maybe “Software is Eating the World” is really due to advances in hardware? Would you have run modern enterprise applications in the cloud 20 years ago? Bandwidth alone would certainly have been a challenge. Was the iPhone a victory for software or hardware? Much of the iPhone GUI was not that revolutionary, in my opinion, but the combination of hardware and software made for a potent technology disruption.
Tags: analytics, API, SDN, sensors, Service Provider, software
This was the title of a panel that I moderated on November 19, 2013, in Washington, D.C., at the MPLS-SDN Isocore Conference.
The abstract for this conference was designed to be a bit provocative, specifically:
“Virtualization as a concept is not new. However, in the context of Software Defined Networking, the virtualization discussion has been focusing on overlay functions, e.g., networking. What about virtualization overlays and interworking with existing architectures? What are the implications for performance and management? Are we speaking the same language?
The panelists will have an opportunity to articulate the virtualization problem space for the industry, and the opportunity for the industry to address it.”
My panelists included the following individuals:
Tags: analytics, APIs, Big Data, SDN, Service Provider, services, software
Like many IT organizations, Cisco’s internal IT department is deploying Big Data solutions to mine the ever-increasing inflow of data from a wide range of sources – and thus gain competitive advantage and insights.
The typical environment includes an ecosystem of different tools and data sources that looks something like this (image courtesy of @TorstenVolk):
Source: EMA Research
Cisco IT realized that as the demand for analysis of this data increased, the demands on their infrastructure and Day 2 operations management would likely grow exponentially. So they knew that they needed an enterprise-grade workload automation solution that could manage processes involving Hadoop, MapR, Cloudera, Informatica, Teradata, SAP HANA, BusinessObjects, and Tableau, as well as other analytics applications, data feeds, and repositories.
Fortunately, we have a workload automation software solution – Cisco Tidal Enterprise Scheduler – that meets those requirements and more:
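The core idea behind cross-platform workload automation is running jobs in dependency order across heterogeneous systems. Here is a minimal sketch of that pattern; the job names and `run` placeholder are illustrative assumptions, not Tidal Enterprise Scheduler's actual API:

```python
from graphlib import TopologicalSorter

# Hypothetical job graph spanning heterogeneous tools; each key lists the
# jobs that must finish before it can start.
jobs = {
    "extract_feeds":   [],                     # pull raw data feeds
    "hadoop_etl":      ["extract_feeds"],      # e.g. a MapReduce cleansing job
    "load_warehouse":  ["hadoop_etl"],         # e.g. load into Teradata / SAP HANA
    "refresh_reports": ["load_warehouse"],     # e.g. refresh Tableau / BusinessObjects
}

def run(name: str) -> None:
    # Placeholder for a real connector call (REST API, CLI wrapper, agent, ...)
    print(f"running {name}")

# Execute jobs so every dependency completes before its dependents start.
for job in TopologicalSorter(jobs).static_order():
    run(job)
```

A real scheduler adds calendars, retries, SLA alerting, and per-tool connectors on top of this dependency-ordered core.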
Tags: Big Data, Cisco IT, Cisco Unified Management, inside cisco it, software, Tidal Enterprise Scheduler, unified management, workload automation
Big data and cloud are drastically changing today’s IT landscape. The proliferation of traditional and new data sources plus the movement of data to the cloud complicate a company’s ability to access all of its data assets. This creates an important need to complement traditional data warehousing by providing a real-time, consolidated logical view of data, better known as data virtualization.
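To make the concept concrete, here is a toy sketch of what a “consolidated logical view” means: a virtual view that answers queries by federating live sources at request time, instead of copying rows into a central warehouse first. The class and data are illustrative assumptions, not Composite's actual technology:

```python
# Two independent data sources, e.g. an on-premises database and a cloud store.
crm_orders = [{"customer": "acme", "amount": 120}]
cloud_orders = [{"customer": "acme", "amount": 80}]

class VirtualView:
    """Presents several sources as one queryable collection."""

    def __init__(self, *sources):
        self.sources = sources  # holds references only; no data is moved

    def query(self, predicate):
        # Fan the query out to every source and merge results on the fly.
        return [row for src in self.sources for row in src if predicate(row)]

orders = VirtualView(crm_orders, cloud_orders)
total = sum(r["amount"] for r in orders.query(lambda r: r["customer"] == "acme"))
print(total)  # 200
```

Real data virtualization platforms add query optimization, caching, and security on top of this federation idea, but the essential shift is the same: the data stays where it lives, and the view is logical.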
Today, Cisco is announcing its intent to acquire Composite Software, a market leader in data virtualization software and services. Composite’s technology connects and optimizes many types of data from across the network and makes it appear as if it’s in one place, allowing companies to make better business decisions. Together, Cisco and Composite will help to accelerate the shift from physical data integration to data virtualization.
For example, NYSE Euronext produces billions of data points per day through quotes, trades, orders, and receipts, housed in multiple locations. Composite addressed this with its data virtualization platform, which functions as a virtual data warehouse providing access to trades, orders, quotes, and other data for analysis, compliance, and reporting across 14 exchanges. With data virtualization’s flexible data delivery infrastructure, the organization increased business responsiveness, broadened its analytic insight, and lowered its costs.
Consistent with our model for Next Generation IT, Composite will expand Cisco’s portfolio of Smart Services and extend our next-generation services platform with software and hardware solutions. By connecting network knowledge (APIs) and programmability with Cisco’s industry-leading Unified Computing System, and adding Composite’s software and query optimization expertise, Cisco will be well positioned to provide highly differentiated capabilities to our customers.
In addition, this acquisition reinforces our commitment to support partner consumption models and assist our partners in broadening their services portfolios.
This acquisition builds on Cisco’s framework for a unified platform and our software services strategy with the recent acquisition of SolveDirect. Composite’s data virtualization solution, combined with SolveDirect’s process integration platform, will provide cross-domain data and workflow integration capabilities to enable real-time business insights and operations.
Tags: acquisition, data virtualization, Hilton Romanski, M&A, services, software
There’s no doubt that video is becoming more pervasive in business. It’s no wonder: humans are visually oriented. We’ve been reading people’s faces since we were newborns, so it’s natural for us to use visual cues as we build stronger relationships and better organizations.
As video makes deeper inroads in enterprises large and small, I keep hearing the concept of “good enough” video. So what does “good enough” really mean? Is there a specific number of pixels, or frame rates, or a certain standard that makes video “good enough”? How can you define “good enough” for your organization?
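For a sense of why pixel counts and frame rates alone don’t settle the question, a back-of-the-envelope calculation helps; the figures below assume uncompressed 24-bit color, and real bitrates after compression vary widely with the codec and content:

```python
def raw_mbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# 720p30 vs 1080p30, before any compression.
print(round(raw_mbps(1280, 720, 30)))    # 664 Mbps
print(round(raw_mbps(1920, 1080, 30)))   # 1493 Mbps
```

Doubling the pixel count roughly doubles the raw bandwidth cost, yet whether the extra pixels matter depends entirely on what the meeting is for — which is why “good enough” is an organizational question, not just a spec sheet.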
If you focus on the results your organization wants to achieve, your video collaboration strategy will fall into place.
Here are the most important considerations for defining a video collaboration strategy that is “good enough” for your organization’s goals.
Tags: Cisco, collaboration, good enough, hardware, pervasive, quality, software, TelePresence, video