Cloud computing is more mainstream today than ever before, yet IT leaders still have significant opportunities to innovate: leveraging cloud delivery options to capture new business opportunities and implement new IT models.
The Evolution of ITaaS: The Convergence of Two Roads
On one hand, traditional private cloud services within enterprise IT are maturing to different degrees of completeness depending on organizational needs. Virtualization, consolidation, and on-premises shared services are among the key drivers in the private cloud space.
On the other hand, public cloud services are evolving to include Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Platform as a Service (PaaS).
Today, these two tracks are intersecting to create demand for a hybrid cloud model. While the concept of the “Hybrid” cloud has developed mostly as a consequence of the availability of different cloud services, this same availability is also driving the evolution of IT as a Service.
What does this mean for business? It means that, fundamentally, IT is adopting supply chain management logic: deciding whether to make or buy a specific service based on organizational goals, market pressures, and available options.
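To make the supply chain analogy concrete, here is a minimal, purely illustrative sketch of how a make-vs-buy decision might be scored. The criteria and weights are hypothetical, not any Cisco or industry methodology; a real assessment would involve far more dimensions.

```python
from dataclasses import dataclass

@dataclass
class SourcingOption:
    name: str              # e.g. "make" (private cloud) or "buy" (public IaaS)
    cost: float            # normalized 0..1, lower is better
    control: float         # normalized 0..1, higher is better
    time_to_market: float  # normalized 0..1, higher is faster

def score(option: SourcingOption, weights: dict) -> float:
    """Weighted score: reward control and speed, penalize cost.
    Weights are hypothetical and would reflect organizational goals."""
    return (weights["control"] * option.control
            + weights["speed"] * option.time_to_market
            - weights["cost"] * option.cost)

# Hypothetical weighting: this organization values cost most heavily.
weights = {"cost": 0.4, "control": 0.3, "speed": 0.3}
make = SourcingOption("make (private cloud)", cost=0.7, control=0.9, time_to_market=0.4)
buy = SourcingOption("buy (public IaaS)", cost=0.3, control=0.5, time_to_market=0.9)

best = max([make, buy], key=lambda o: score(o, weights))
print(best.name)
```

With these particular (invented) inputs the "buy" option wins; shifting the weights toward control flips the outcome, which is exactly the point of treating sourcing as an ongoing strategy rather than a one-time choice.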
The Ongoing IT Sourcing Strategy: Make vs. Buy
Tags: Cisco, Cisco IT, cloud, Cloud Computing, cloud services, data center, data_center, Hybrid Cloud, ITaaS, orchestration, private cloud, Public Cloud
My 2014 predictions are finally complete. If Open Source equals collaboration and credibility, then 2013 was nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better:
- Big data’s biggest play will be in meatspace, not cyberspace. We produce and give away so much data that the greatest opportunity for analytics now lies in the real world.
- Privacy and security will become ever more important, particularly in Open Source rather than closed systems. Paradoxically, this is good news: Open Source shows us once again that transparency wins, and just as in biological systems, the most robust mechanisms survive with fewer secrets than we think.
- The rise of “fog” computing as a consequence of the Internet of Things (IoT) will, unfortunately, be driven by fashion for now (wearable computers). It will make us reconsider what we have done in giving up our data, and prompt us to reread #1 and #2 above with a different, more open mind. Again!
- Virtualization will enter its biggest year yet in networking. Just as the hypervisor rode Moore’s Law in server virtualization and found a neat application in #2 above, a different breed of projects such as OpenDaylight will emerge. But the drama here is greater: the network scales very differently than CPU and memory, making it a much harder problem. Networking vendors that embrace Open Source may therefore fare well.
- Those that didn’t quite “get” Open Source as the ultimate development model will rediscover it as Inner Source (ACM, April 1999), the only long-term viable development model. Or so they think, as the glamour of new-style Open Source projects (OpenStack, OpenDaylight, AllSeen) with big budgets, big marketing, and big drama may in fact prove too seductive. Only those that truly understand the two key things that make an Open Source project successful will endure.
- AI, recently morphed, will make a comeback: not just robotics, but something the AI of a generation ago did not anticipate, something called cognitive computing, perhaps indeed the third era in computing! The story of Watson going beyond obliterating Jeopardy contestants and looking to open up and find commercial applications is a truly remarkable thing to observe in our lifetime. This may in fact be a much nobler use of big data analytics (and other key Open Source projects) than #1 above. But can it exist without it?
- Finally, Gen Z developers will discover Open Source and embrace it just like their Millennial (Gen Y) predecessors. The level of sophistication and interaction rises, and projects ranging from Bitcoin to qCraft become intriguing, presenting a different kind of challenge. More importantly, the previous generation can now begin to relax, knowing the gap is closing and the ultimate development model is in good hands, and can begin to give back more than ever before. Ah, the beauty of Open Source…
Tags: ai, AllSeen, big data analytics, Cloud Computing, cognitive computing, cyberspace, Fog computing, hypervisor, Inner Source, internet of things, IoT, meatspace, NFV, Open, open source, opendaylight, OpenStack, privacy, qCraft, robotics, SDN, security, transparency, virtualization
The insurance industry is continuously looking for the simplest, most efficient way to provide consumers with the best service while reducing overall operating expenses. While insurance providers explore the right options for their business, one thing is certain: cloud-based environments are low-risk solutions that enable applications to increase business value. Cisco research shows that running desktop applications in the cloud is attractive because it reduces complexity and increases security.
Aside from the insurance industry, other financial services institutions struggle to find a business structure that provides the flexibility and market savings necessary to deliver excellent customer service. With the help of cloud computing and unified communications, however, these challenges are being overcome. Given recent successes and proven low-risk functionality, insurers are gradually adopting cloud solutions to help guide business operations and initiatives. In fact, Gartner predicts that the cloud system infrastructure market will grow by 47.8% through 2015.
Tags: Cisco, Cloud Computing, Financial Services, insurance, insurers, unified communications
The road in my picture below, the A82 that winds through Glencoe in Scotland, was used in one of the amazing car chase scenes in the James Bond movie “Skyfall.” It runs through sparsely inhabited territory, has plenty of ups, downs, bumps, and turns, and if you’re not careful it can be dangerous. I’ll draw an analogy here with the challenges of introducing new technologies, and in my case with adopting OpenStack in particular: without the right approach, without a carefully managed exploratory “pilot” investigation and subsequent roadmap planning, you may find that adopting OpenStack, or any other open source software solution for that matter, has its own share of ups, downs, bumps, and turns into the unknown.
Tags: Cisco Services, cloud, Cloud Computing, data center, data_center, OpenSource, OpenStack
In three short years, OpenStack has become a cloud management platform that is “Too Big to Fail” (according to Citi Research). Whether or not that is true, OpenStack is definitely gaining traction, making a profound impact not only as a viable cloud management option but also on the software economics of cloud solutions.
Cloud computing is rapidly transforming businesses and organizations by providing access to flexible, agile, and cost-effective IT infrastructure. These elastic capabilities help accelerate the delivery of infrastructure, applications, and services with the right quality of service (QoS) to increase revenue. Cisco’s approach—innovative and unified data center infrastructure that provides the underlying foundation for OpenStack technology—enables the creation of massively scalable infrastructure that delivers on the promise of the cloud.
Cisco Common Cloud Architecture, built on the Cisco Unified Computing System (UCS) with OpenStack, provides the foundation for flexible, elastic cloud solutions that enable speed and agility. Just as the saying goes that every skyscraper is built on a strong foundation of pillars, the OpenStack platform needs core capabilities from the underlying infrastructure: simplification, rapid provisioning, a self-service consumption model, and elastic resource allocation. Cisco UCS uniquely provides a policy-based resource management model, which simplifies operations by integrating compute, networking, and storage with the ability to scale and automate deployment.
This foundation addresses every stage of cloud deployment, whether for private or public cloud offerings. Some of the primary workloads targeted for OpenStack-based deployments are:
- Self-service development and test environments
- Massively scalable software-as-a-service (SaaS) solutions
- High-performance, scale-out storage
- Web server, multimedia, big data, and cluster-aware applications
- Applications with extensive computing power requirements and mixed I/O workloads
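As one concrete illustration of the self-service development and test use case above, an OpenStack Heat orchestration template (HOT) lets a user provision a server declaratively. The sketch below is minimal and illustrative; the image and flavor names are placeholders, not part of any Cisco reference configuration.

```yaml
heat_template_version: 2015-10-15

description: >
  Illustrative self-service dev/test server.
  Image and flavor names are placeholders.

parameters:
  key_name:
    type: string
    description: SSH keypair for access

resources:
  dev_server:
    type: OS::Nova::Server
    properties:
      image: ubuntu-20.04      # placeholder image name
      flavor: m1.small         # placeholder flavor
      key_name: { get_param: key_name }

outputs:
  server_ip:
    description: First address of the provisioned server
    value: { get_attr: [dev_server, first_address] }
```

A developer can launch and later delete such a stack on demand, which is exactly the elastic, self-service consumption model the underlying infrastructure has to support.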
To accelerate these cloud infrastructure deployments, Cisco has developed starter configurations focused on compute-intensive, mixed (heterogeneous), and storage-intensive workloads. The various server nodes are typically sized to include the OpenStack controller, compute, Ceph storage, Swift proxy, and Swift storage roles.
Cisco UCS Solution Accelerator Paks for Cloud Infrastructure Deployments
Scaling beyond 160 servers can be achieved by interconnecting multiple UCS domains using Nexus 3000/5000/6000/7000 Series switches, reaching thousands of servers and hundreds of petabytes of storage, all managed from a single pane of glass with UCS Central, whether in one data center or distributed globally, as shown in the figure.
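A back-of-the-envelope sketch of that scaling model, using the 160-servers-per-UCS-domain figure cited above; the per-server storage sizing here is hypothetical, chosen only to show how the numbers compose.

```python
SERVERS_PER_UCS_DOMAIN = 160  # per-domain limit cited above

def total_capacity(domains: int, tb_per_server: float):
    """Aggregate server count and raw storage across interconnected UCS domains.

    Returns (servers, raw_storage_pb). Uses the binary convention
    1 PB = 1024 TB; real sizing would also account for replication.
    """
    servers = domains * SERVERS_PER_UCS_DOMAIN
    raw_pb = servers * tb_per_server / 1024.0
    return servers, raw_pb

# e.g. 25 interconnected domains of storage-dense nodes at a
# hypothetical 48 TB of raw disk each:
servers, raw_pb = total_capacity(25, 48.0)
print(servers, raw_pb)
```

Even with these modest assumptions, the model lands in the "thousands of servers, hundreds of petabytes" range the post describes, before accounting for denser drives or more domains.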
Tags: Cisco UCS, cloud, Cloud Computing, data center, OpenStack, UCS, virtualization