
No one can deny that the internet is now integral to the way people work and relax. We spend hours streaming videos on services like Netflix and YouTube. And we consult Google on every possible question, from where to go on holiday to the latest news to the meaning of life.

But as online services become ever more deeply embedded into people’s lives, the importance of providing a flawless user experience grows too. There are lots of competing demands for people’s attention. And if a film starts buffering or search results take a few seconds to load, customers are going to close the window and move on.

Caching content to meet new demands

Unfortunately, growing customer demands and expectations are making these frustrating experiences more common – and we can understand why by looking at how video streaming works.

When a customer asks to stream a film, the request originates at a point on the network edge, such as a home or office. In a traditional network, it then travels across the core network to a centralised data centre to be processed. The data needed to stream the film is then sent back to the customer at the edge.

This model used to work just fine. Even though the data was travelling a long way, it went so fast that it didn’t matter. But those few milliseconds that it takes to travel to the data centre and back really add up when the network is responding to millions of customer requests at the same time. And as the demands on the network grow, so does the likelihood of delays and dissatisfied customers.
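To see why those milliseconds matter, here is a minimal back-of-the-envelope sketch in Python. The round-trip times and request rate are purely hypothetical numbers chosen for illustration, not measurements from any real network.

```python
# Hypothetical figures: how a few extra milliseconds per request add up when
# content is served from a distant central data centre rather than a nearby edge site.

EDGE_RTT_MS = 5                  # assumed round trip to a nearby edge site
CENTRAL_RTT_MS = 60              # assumed round trip across the core to a central data centre
REQUESTS_PER_SECOND = 1_000_000  # assumed peak load on the service

extra_ms_per_request = CENTRAL_RTT_MS - EDGE_RTT_MS
aggregate_extra_seconds = extra_ms_per_request * REQUESTS_PER_SECOND / 1000

print(f"Extra delay per request: {extra_ms_per_request} ms")
print(f"Aggregate extra waiting time added every second: {aggregate_extra_seconds:,.0f} seconds")
```

Even a small per-request penalty, multiplied by millions of simultaneous requests, turns into a very large amount of customer waiting time.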

To make sure they meet their customers’ expectations, search engines and video streaming companies need to find a way of delivering fast services at huge scale. One way of doing this is to keep the distance that data travels through the network to a minimum. That’s why they’re increasingly caching content in the outer network layers: when the information needed to supply the content is already at the edge, there’s no longer any need to send the request to the inner network and back.
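The sketch below shows the basic idea in Python. It is a simplified illustration, not any vendor’s implementation: content is served from a cache at the edge when it is already there, and only fetched from the central data centre on a miss.

```python
# A simplified sketch of edge caching: serve content locally when possible,
# and only go back to the central data centre on a cache miss.

from typing import Callable, Dict

class EdgeCache:
    def __init__(self, fetch_from_core: Callable[[str], bytes]):
        self._store: Dict[str, bytes] = {}        # content held at the edge site
        self._fetch_from_core = fetch_from_core   # slow path back to the data centre

    def get(self, content_id: str) -> bytes:
        if content_id in self._store:              # cache hit: no core round trip
            return self._store[content_id]
        data = self._fetch_from_core(content_id)   # cache miss: fetch once from the core
        self._store[content_id] = data             # keep it at the edge for next time
        return data

# Example: the first request travels to the core, later requests are served locally.
cache = EdgeCache(fetch_from_core=lambda cid: f"video bytes for {cid}".encode())
cache.get("film-42")   # miss: fetched from the central data centre
cache.get("film-42")   # hit: served from the edge
```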

Developing a dispersed network

But this trend creates a challenge for service providers, who are suddenly being asked to provide the functionality of a data centre in the outer parts of the network. The good news is that this is possible. The challenge is that it takes careful planning and investment in new technology.

Creating a dispersed network infrastructure means rebuilding the network in several stages – probably over a period of years. So it’s important that service providers choose a technology partner that can offer expert advice and support over the long term.

Businesses will need to think about how they can use technology like segment routing or Cisco’s Application Centric Infrastructure (ACI) to create an automated, integrated architecture that allows them to manage traffic efficiently across the network. And they’ll need to think about managing security from the very beginning of the process.
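As a rough intuition for why segment routing helps with traffic steering, here is a toy Python model. It is not Cisco ACI or a production segment-routing stack; the segment names are invented. The point is simply that the source encodes the path as an ordered list of segments, and each hop forwards towards the next one.

```python
# A toy model of the segment-routing idea: the controller chooses an ordered
# list of segment IDs, and the packet is forwarded along that list hop by hop.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Packet:
    payload: str
    segment_list: List[str]                       # ordered segment IDs chosen by the controller
    visited: List[str] = field(default_factory=list)

def forward(packet: Packet) -> Packet:
    """Walk the packet along its segment list, one hop per segment."""
    while packet.segment_list:
        next_segment = packet.segment_list.pop(0)  # active segment at the head of the list
        packet.visited.append(next_segment)        # "arrive" at that node or service
    return packet

# For example, a controller could steer a stream request through an edge cache first.
p = forward(Packet(payload="stream request",
                   segment_list=["edge-cache", "core-router", "origin"]))
print(p.visited)  # ['edge-cache', 'core-router', 'origin']
```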

CORD: a global movement

A key source of support for service providers building distributed networks is the CORD (Central Office Re-architected as a Datacenter) project. CORD, which is backed by a range of leading tech companies across the world, works on making it possible to provide the traditional functions of a data centre in the outer network layer.

Leading service providers are in no doubt that the dispersed networks supported by CORD are part of the future. Andre Fuetsch, the senior vice president of architecture and design at AT&T, said that his company “supports the goals and achievements that are embodied in CORD.”

“The work is pushing the boundaries of many technologies and architectures, as well as open source and open spec hardware,” said Fuetsch last year. “We are learning from the CORD experiments and trials and using this knowledge to refine AT&T’s Integrated Cloud.”

Another company on board with the project is the data centre provider Equinix. Its chief technology officer Ihab Tarazi said that the company “plans to collaborate with the CORD project and deploy the CORD software stack on Open Compute Project hardware.”

Businesses like these know that creating a more distributed network function structure will be essential if they want to support the requirements of streaming services and search engines, as well as 5G networking. It’s a real challenge. But service providers that make the effort to meet it now will reap the rewards in years to come.

Find out more about how you can meet the challenges of the future with Cisco’s data centre solutions.

 



Authors

Philippe Tubello

Manager, Systems Engineering

Global Service Provider - EMEAR