Two weeks ago, I presented a webinar on Dynamic Fabric Automation (DFA) and ran over the allocated hour. Yesterday, following up with a hands-on demo, I ran over time again. This illustrates how rich DFA is, and how much there is to say about it! Dynamic Fabric Automation is an environment for data center automation centered on the CPOM (Central Point of Management), a set of services provided with the new Data Center Network Manager (DCNM) release 7.0(1).
The services available on the CPOM provide the following:
- Power On Auto Provisioning (POAP)
- Inter-switch link connection verification
- A single console for configuration
- Network Auto-Config Profile provisioning
- Message processing for external orchestrators
- Automatic host provisioning
- Embedded management for network monitoring and data collection
All of these services are provided using standard protocols and applications. For example, the POAP service uses DHCP, TFTP, and SCP/SFTP; by combining templates with a very intuitive, easy-to-use GUI, DCNM provides a simplified and systematic way of bringing up your data center fabric. The inter-switch link validation, or cable consistency check, allows the operator to verify the fabric connections against a predefined template and prevents unexpected connections from coming up.
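Because POAP rides on standard DHCP, the booting switch simply receives the TFTP server address and a bootstrap script name via DHCP options 66 and 67. A minimal ISC dhcpd configuration sketch is shown below; the subnet, addresses, and script name are illustrative assumptions, not values DCNM prescribes.

```text
# Illustrative dhcpd.conf fragment for a POAP bootstrap
# (addresses and script name are assumptions for this sketch)
subnet 10.0.0.0 netmask 255.255.255.0 {
  range 10.0.0.100 10.0.0.200;
  option routers 10.0.0.1;
  # Option 66: where the switch fetches its bootstrap files
  option tftp-server-name "10.0.0.10";
  # Option 67: the POAP script the switch downloads and runs
  option bootfile-name "poap_script.py";
}
```

In a DCNM deployment these details are generated from templates rather than edited by hand, which is precisely the simplification the CPOM provides.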
The Jabber process provides a single console for configuration, statistics, and troubleshooting. Using any XMPP client, an operator can “chat” with the fabric devices; this approach makes it possible to organize devices into chat groups that match their role, their location, or simply some administrative set. With XMPP, a single command can be sent to multiple devices in a secure way.
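The fan-out behind that single console is easy to picture: one operator message to a role-based chat group expands into one command per member device. The sketch below models that grouping in plain Python; the device names, roles, and grouping convention are illustrative assumptions, not DCNM's actual naming scheme or XMPP wire protocol.

```python
# Sketch of the XMPP group-chat fan-out behind the DFA single console.
# Device names and roles below are hypothetical examples.

def build_rooms(devices):
    """Group (name, role) pairs into chat rooms keyed by role (e.g. leaf, spine)."""
    rooms = {}
    for name, role in devices:
        rooms.setdefault(role, []).append(name)
    return rooms

def broadcast(rooms, role, command):
    """Expand one operator message to a room into (device, command) pairs."""
    return [(member, command) for member in rooms.get(role, [])]

devices = [("leaf-101", "leaf"), ("leaf-102", "leaf"), ("spine-201", "spine")]
rooms = build_rooms(devices)
# One message to the "leaf" room reaches every leaf switch
print(broadcast(rooms, "leaf", "show interface brief"))
```

In the real system the transport is a secure XMPP session to the fabric, so the same one-to-many semantics come for free from standard chat-room mechanics.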
The most important element of the CPOM is certainly the network profile provisioning. Read More »
Tags: #CLEUR, BGP, Cisco, Cisco Live Milan, Cloud Computing, datacenter, FabricPath, OpenStack, orchestration, VMware vCloud Director
Cloud computing is more mainstream today than ever before, but it’s important to note that there are still significant opportunities for IT leaders to innovate and leverage cloud delivery options to capture new business opportunities and implement new IT models.
The Evolution of ITaaS: The Convergence of Two Roads
On one hand, traditional private cloud services within customer IT services are driving different degrees of completeness depending on organizational needs. Virtualization, consolidation and on-premise shared services are some of the drivers within the private cloud space.
On the other hand, public cloud services are evolving to include Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Platform as a Service (PaaS).
Today, these two tracks are intersecting to create demand for a hybrid cloud model. While the concept of the “Hybrid” cloud has developed mostly as a consequence of the availability of different cloud services, this same availability is also driving the evolution of IT as a Service.
What does this mean for business? It means that fundamentally, IT is adopting a supply chain management logic by deciding whether to make or buy a specific service based on a variety of organizational goals, market pressures, and available options.
The Ongoing IT Sourcing Strategy: Make vs. Buy
Read More »
Tags: Cisco, Cisco IT, cloud, Cloud Computing, cloud services, data center, data_center, Hybrid Cloud, ITaaS, orchestration, private cloud, Public Cloud
My 2014 predictions are finally complete. If Open Source equals collaboration or credibility, 2013 has been nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better:
- Big data’s biggest play will be in meatspace, not cyberspace. We produce and give away so much data that the great opportunity for analytics lies in the real world.
- Privacy and security will become ever more important, particularly in Open Source rather than closed systems. Paradoxically, this is good news: Open Source shows us once again that transparency wins, and just as in biological systems, the most robust mechanisms succeed with fewer secrets than we think.
- The rise of “fog” computing as a consequence of the Internet of Things (IoT) will unfortunately be driven by fashion for now (wearable computers). It will make us reconsider what we have done in giving up our data, and prompt us to revisit #1 and #2 above with a different, more open mind. Again!
- Virtualization will have its biggest year yet in networking. Just as the hypervisor rode Moore’s Law in server virtualization and found a neat application in #2 above, a different breed of projects like OpenDaylight will emerge. But the drama is greater here: the network scales very differently from CPU and memory, making it a much harder problem. Thus, networking vendors embracing Open Source may fare well.
- Those that didn’t quite “get” Open Source as the ultimate development model will re-discover it as Inner Source (ACM, April 1999), as the only long-term viable development model. Or so they think, as the glamor of new-style Open Source projects (OpenStack, OpenDaylight, AllSeen) with big budgets, big marketing, big drama, may in fact be too seductive. Only those that truly understand the two key things that make an Open Source project successful will endure.
- AI will make a comeback in a recently morphed form: not just robotics, but something the AI of a generation ago did not anticipate, something called cognitive computing, perhaps indeed the third era in computing! The story of Watson going beyond obliterating Jeopardy contestants, looking to open up and find commercial applications, is a truly remarkable thing to observe in our lifetime. This may in fact be a much more noble use of big data analytics (and other key Open Source projects) than #1 above. But can it exist without it?
- Finally, Gen Z developers will discover Open Source and embrace it just like their Millennial (Gen Y) predecessors. The level of sophistication and interaction rises, and projects ranging from Bitcoin to qCraft become intriguing, presenting a different kind of challenge. More importantly, the previous generation can now begin to relax knowing the gap is closing, the ultimate development model is in good hands, and it can begin to give back more than ever before. Ah, the beauty of Open Source…
Tags: ai, AllSeen, big data analytics, Cloud Computing, cognitive computing, cyberspace, Fog computing, hypervisor, Inner Source, internet of things, IoT, meatspace, NFV, Open, open source, opendaylight, OpenStack, privacy, qCraft, robotics, SDN, security, transparency, virtualization
The insurance industry is continuously looking for the simplest, most efficient way to provide consumers with the best service while reducing overall operating expenses. While insurance providers explore the right options for their business, one thing is certain: cloud-based environments are low-risk solutions that enable applications to increase business value. From Cisco research, we know that running desktop applications in the cloud can be attractive because it reduces complexity and increases security.
Aside from the insurance industry, other financial services institutions struggle to find a business structure that provides the flexibility and market savings necessary for excellent customer service. However, with the help of cloud computing and unified communications, these challenges are being overcome. Due to recent success and proven low-risk functionality, insurers are gradually adopting cloud solutions to help guide business operations and initiatives. In fact, Gartner predicts that the cloud system infrastructure market will grow by 47.8% through 2015. Read More »
Tags: Cisco, Cloud Computing, Financial Services, insurance, insurers, unified communications
The road in my picture below – the A82, which winds through Glencoe in Scotland – was used in one of the amazing car chase scenes in the James Bond movie “Skyfall.” It winds through sparsely inhabited territory, has lots of ups, downs, bumps and turns, and if you’re not careful it can be dangerous. I’ll draw the analogy here with the challenges of introducing new technologies: there can be ups, downs, bumps and turns into the unknown if you are not careful. In this case, I’ll use the analogy to illustrate the challenges of adopting OpenStack: without the right approach, without a carefully managed exploratory “pilot” investigation and subsequent roadmap planning, you may find that adopting OpenStack – or any other open source software solution, for that matter – has its share of ups, downs, bumps and turns into the unknown.
Read More »
Tags: Cisco Services, cloud, Cloud Computing, data center, data_center, OpenSource, OpenStack