In this series of articles I’ll articulate the challenges customers face in hybrid cloud adoption, the key hybrid cloud requirements, and ways to address them.
Organizations are trying to transform their business and innovate faster by getting on-demand access to resources as business needs dictate, but enterprise IT has not been able to keep up. This has led to a new challenge, “shadow IT”: employees bypassing IT and going directly to the public cloud for fast, easy access to resources. Shadow IT shows that business users want the flexibility of the cloud, while IT remains wary of the public cloud because of concerns about security and the loss of visibility and control.
Hybrid cloud lets organizations innovate faster through rapid, self-service provisioning of resources, with the choice to deploy workloads in the enterprise’s own data center or in the public cloud on a pay-as-you-go, scale-out basis. Beyond reining in shadow IT, hybrid clouds enable use cases such as dev/test, capacity augmentation, and disaster recovery. The trend toward hybrid cloud is growing because it offers the flexibility to respond quickly to business needs while reducing cost. According to Forrester, more than 70% of enterprises plan to complement their in-house server and storage resources with IaaS resources from public cloud providers for primary or peak workloads. This points to the fact that customers want hybrid clouds, not just private clouds or public IaaS clouds.
While we see the advantages of hybrid cloud, we don’t see large-scale customer adoption yet. The factors preventing it include:
– No easy way to deploy and manage on-premises and public cloud resources through a single console.
– Lack of security for workloads running in the public cloud, and insecure connectivity from the private to the public cloud.
– Slow, complex management processes, such as the need to re-architect applications when migrating workloads across the hybrid cloud.
Customers are also concerned about getting locked in to a particular vendor’s solution or a particular public cloud. One public cloud may be right for a certain class of applications today, but tomorrow another provider may offer better SLAs, cost, or application performance. Some fragmented solutions allow migrating workloads from a customer’s private cloud to a public cloud, but then there is no easy way for those workloads to move back to the enterprise or on to another public cloud. Customers have seen that such solutions deliver neither complete agility nor long-term cost benefits. As a result, they are wary of getting locked in to a particular public cloud, or to a solution that works only with a single hypervisor or a particular compute, network, or storage device.
Customers are looking for true hybrid cloud capabilities, which mean more than just running some applications on-premises and some in the public cloud. A “hybrid” cloud requires a functional extension of local resources into the public cloud, to the same degree that local resources are connected and integrated with each other. Let’s look at the key customer requirements of a true hybrid cloud:
- Self-service access: A true hybrid solution will provide a self-service portal for users and IT admins. It will let users seamlessly deploy applications either on-premises or in the public cloud from a unified console, and let IT admins manage workloads from a single pane of glass while enforcing the same security for workloads in a multi-tenant shared public cloud as in the private cloud.
- Workload portability: It will offer bi-directional migration of workloads between private and public clouds, independent of the underlying architecture.
- IT as a broker: It will enable IT to act as a broker on behalf of lines of business while minimizing risk, by enforcing identical application network and security policies regardless of where the workloads run.
- Open architecture: It will offer choice and flexibility to users, IT admins, and cloud service providers by being built on open APIs and an open architecture. Users gain flexibility in workload sourcing options without being locked in to a particular public cloud or vendor solution, and service providers can rapidly offer a hybrid cloud solution of their own.
We believe an open, easy-to-use approach is essential to delivering real hybrid cloud capabilities and to transforming the way IT services are delivered. With such an approach, IT will not have to live in the shadow of “shadow IT”; instead, it can act as a broker of cloud services for lines of business. As these requirements are addressed, we will increasingly see organizations taking a hybrid approach to cloud.
Tags: cloud, Cloud Computing, Hybrid Cloud, IT as a broker, secure hybrid cloud
Two weeks ago, I presented a webinar on Dynamic Fabric Automation (DFA) and ran over the allocated hour covering the content. Yesterday, as I followed up with a hands-on demo, I ran over time too. That illustrates how rich DFA is, and how much there is to say about it! Dynamic Fabric Automation is an environment for data center automation centered on the CPOM (Central Point of Management), a set of services provided with the new Data Center Network Manager (DCNM) release 7.0(1).
The services available on the CPOM provide the following:
- Power On Auto Provisioning (POAP)
- Inter-switch link connection verification
- A single console for configuration
- Network Auto-Config Profile provisioning
- Message processing for external orchestrators
- Automatic host provisioning
- Embedded management for network monitoring and data collection
All of these services are built on standard protocols and applications. For example, the POAP service uses DHCP, TFTP, and SCP/SFTP; by combining templates with a very intuitive, easy-to-use GUI, DCNM provides a simplified, systematic way of bringing up your data center fabric. Inter-switch link validation, or the cable consistency check, lets the operator verify the fabric connections against a predefined template and prevents unexpected connections from coming up.
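To make the POAP sequence concrete, here is a minimal simulation of the boot-time logic in Python. Everything here is hypothetical and illustrative: a real switch runs this flow in firmware, the server addresses and script name are invented, and the network calls are stubbed out.

```python
# Illustrative sketch of the POAP boot sequence (all names hypothetical).

def dhcp_discover():
    """Stand-in for the DHCP exchange: the lease carries the address of
    the script server and the name of the boot script to fetch."""
    return {"ip": "10.0.0.5", "tftp_server": "10.0.0.1",
            "boot_script": "poap.py"}

def tftp_fetch(server, filename):
    """Stand-in for a TFTP download of the provisioning script."""
    return f"config fetched from {server}/{filename}"

def poap_boot():
    lease = dhcp_discover()                    # 1. obtain IP + script location
    script = tftp_fetch(lease["tftp_server"],  # 2. download the boot script
                        lease["boot_script"])
    # 3. in a real POAP run, the script would then pull the software
    #    image and device configuration via SCP/SFTP and apply them.
    return script

print(poap_boot())
```

The point of the sketch is the order of operations: DHCP first, then the script fetch, then secure image/config transfer, with DCNM's templates deciding what each device receives.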
The Jabber process provides the single console for configuration, statistics, and troubleshooting. Using any XMPP client, an operator can “chat” with the fabric devices; this approach makes it possible to organize devices into chat groups that match their role, their location, or simply some administrative set. With XMPP, a single command can be sent to multiple devices securely.
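The fan-out pattern behind that “chat group” idea can be sketched in a few lines of Python. This is a toy model, not DCNM's API: the device names, the `handle` method, and the group structure are invented for illustration, and a real deployment would use a standard XMPP client against the fabric's Jabber service.

```python
# Toy model of sending one command to an XMPP chat group of fabric
# devices. All names here are hypothetical.

class FabricDevice:
    def __init__(self, name):
        self.name = name

    def handle(self, command):
        # A real switch would execute the CLI command and reply in chat;
        # here we just echo what it would report back.
        return f"{self.name}: executed '{command}'"

def send_to_group(devices, command):
    """Fan a single command out to every device in the chat group."""
    return [device.handle(command) for device in devices]

# Group devices by role, as the post describes (e.g. all leaf switches).
leaf_group = [FabricDevice("leaf-1"), FabricDevice("leaf-2")]
for reply in send_to_group(leaf_group, "show interface brief"):
    print(reply)
```

The design point is that the operator addresses the group, not each device, and collects per-device replies over one secure channel.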
The most important element of the CPOM is certainly the network profile provisioning. Read More »
Tags: #CLEUR, BGP, Cisco, Cisco Live Milan, Cloud Computing, datacenter, FabricPath, OpenStack, orchestration, VMware vCloud Director
Cloud computing is more mainstream today than ever before, but it’s important to note that there are still significant opportunities for IT leaders to innovate and leverage cloud delivery options to capture new business opportunities and implement new IT models.
The Evolution of ITaaS: The Convergence of Two Roads
On one hand, traditional private cloud services within customer IT organizations are being driven to different degrees of completeness depending on organizational needs. Virtualization, consolidation, and on-premises shared services are some of the drivers in the private cloud space.
On the other hand, public cloud services are evolving to include Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Platform as a Service (PaaS).
Today, these two tracks are intersecting to create demand for a hybrid cloud model. While the concept of the “Hybrid” cloud has developed mostly as a consequence of the availability of different cloud services, this same availability is also driving the evolution of IT as a Service.
What does this mean for business? It means that fundamentally, IT is adopting a supply chain management logic by deciding whether to make or buy a specific service based on a variety of organizational goals, market pressures, and available options.
The Ongoing IT Sourcing Strategy: Make vs. Buy
Read More »
Tags: Cisco, Cisco IT, cloud, Cloud Computing, cloud services, data center, data_center, Hybrid Cloud, ITaaS, orchestration, private cloud, Public Cloud
My 2014 predictions are finally complete. If Open Source equals collaboration or credibility, 2013 has been nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better:
- Big data’s biggest play will be in meatspace, not cyberspace. We produce and give away so much data that the great opportunity for analytics is in the real world.
- Privacy and security will become ever more important, particularly in Open Source rather than closed software. Paradoxically, this is actually good news: as Open Source shows us again, transparency wins, and just as in biological systems, the most robust mechanisms get by with fewer secrets than we think.
- The rise of “fog” computing as a consequence of the Internet of Things (IoT) will, unfortunately, be driven by fashion for now (wearable computers). It will make us think again about what we have done in giving up our data, and prompt us to re-read #1 and #2 above with a different, more open mind. Again!
- Virtualization will enter its biggest year yet in networking. Just as the hypervisor rode Moore’s Law in server virtualization and found a neat application in #2 above, a different breed of projects like OpenDaylight will emerge. But the drama is greater here, because the network scales very differently than CPU and memory; it is a much harder problem. Thus, networking vendors embracing Open Source may fare well.
- Those that didn’t quite “get” Open Source as the ultimate development model will rediscover it as Inner Source (ACM, April 1999), as the only long-term viable development model. Or so they think, as the glamor of new-style Open Source projects (OpenStack, OpenDaylight, AllSeen) with big budgets, big marketing, and big drama may in fact be too seductive. Only those that truly understand the two key things that make an Open Source project successful will endure.
- AI, recently morphed, will make a comeback: not just robotics, but something different that AI did not anticipate a generation ago, something now called cognitive computing, perhaps indeed the third era in computing! The story of Watson going beyond obliterating Jeopardy contestants to open up and find commercial applications is a truly remarkable thing to observe in our lifespan. This may in fact be a much more noble use of big data analytics (and other key Open Source projects) than #1 above. But can it exist without it?
- Finally, Gen Z developers will discover Open Source and embrace it just like their Millennial (Gen Y) predecessors. The level of sophistication and interaction rises, and projects ranging from Bitcoin to qCraft become intriguing, presenting a different kind of challenge. More importantly, the previous generation can now begin to relax, knowing the gap is closing and the ultimate development model is in good hands, and can begin to give back more than ever before. Ah, the beauty of Open Source…
Tags: ai, AllSeen, big data analytics, Cloud Computing, cognitive computing, cyberspace, Fog computing, hypervisor, Inner Source, internet of things, IoT, meatspace, NFV, Open, open source, opendaylight, OpenStack, privacy, qCraft, robotics, SDN, security, transparency, virtualization
The insurance industry is continuously looking for the simplest, most efficient way to provide consumers with the best service while reducing overall operating expenses. As insurance providers explore the right options for their business, one thing is certain: cloud-based environments are low-risk solutions that enable applications to increase business value. From Cisco research, we know that running desktop applications in the cloud can be attractive because it reduces complexity and increases security.
Aside from the insurance industry, other financial services institutions struggle to find a business structure that provides the flexibility and cost savings necessary for excellent customer service. With the help of cloud computing and unified communications, however, these challenges are being overcome. Due to recent success and proven low-risk functionality, insurers are gradually adopting cloud solutions to help guide business operations and initiatives. In fact, Gartner predicts that the cloud system infrastructure market will grow by 47.8% through 2015. Read More »
Tags: Cisco, Cloud Computing, Financial Services, insurance, insurers, unified communications