Developing Products and Solutions in the 21st Century

November 10, 2014 - 2 Comments

On November 3rd, 2014, at the SDN-MPLS (Software Defined Networking-Multiprotocol Label Switching) Conference in Washington, D.C., I moderated a stellar panel titled, “Developing Products and Services in the 21st Century.”

Quite a few of the attendees represented Service Providers, with a few from the Public Sector and vendor communities.

To frame the discussion, I proposed the following provocative abstract:

In the world of “real time”, the relationship between supplier and customer is blurring.  This world is now defined by so-called co-innovation where each party is equally accountable; the supplier more so when one frames the discussion around achieving business outcomes.

Many of these principles are described in the book Consumption Economics: The New Rules of Tech by J.B. Wood et al., where the linear process of defining requirements, creating complexity, and monetizing complexity is no longer valid.

Further, development is moving from agile operations to true development operations, requiring a very close customer-supplier relationship that, in sum, equates to a co-innovation partnership.

The development paradigm is changing as well, from a traditional “waterfall” approach that requires painstaking design and planning, to a more agile one.

Furthermore, many organizations are realizing that the real value of a development organization is its ability to iterate quickly on product development – the value now lies in how an organization works rather than necessarily in what it builds.

As the discussion moves from SDx, OpenX, NFV, Internet of Things and so on, can we abstract hypothetically on what the next waves of technologies will look like and their implications to portfolio development?

What is the process to scout for technologies sustainably?

How does one shape portfolios not only from a customer perspective but also from a supplier perspective?

What is the role of research?

This process has recently evolved away from a long-form one in which standards were developed, then factored into requirements for RFQ/RFP processes, which in turn led to product acquisition and deployment.

Products acquired through these processes typically remained deployed for many years.

How can we envision the development and consumption of services in the future?

There is no doubt that the notion of “real time” will disrupt processes that have been in place for the past several years. Think of Microsoft providing a new capability [feature] every three days, compared to every few hours at Facebook or Google.

Certainly, capturing a relentless customer perspective mapped to business models, and deconstructing the problem space, will be most pivotal.  Co-innovation, and the partnership it implies among customers, service providers, and vendors, is indeed our opportunity.

One last word from a Service Provider on the panel: move away from being infrastructure-centric in the context of cloud computing, and focus on capturing value – what is it YOU are selling, and to WHOM?

Tweet us @CiscoSP360 if you have any questions or comments.



  1. Dear Santanu,

    Thank you always for your invaluable input – I so appreciate it!
    The premise of the discussion is that moving from a very product-centric modality to one that redefines so-called services in real time is perhaps the industry opportunity. I was recently reminded by a colleague that laser focus on the customer experience is a MUST.
    NFV is table stakes…

  2. Another great write-up! Would you say that convergence of packet and circuit networks will become easier as SDN/NFV models become more adaptable to SPs? Smooth interaction between IP and Transport networks has raised a few question marks from an operations and management perspective – duplication of functionality, resources, CapEx/OpEx, etc., and the lack of a common way to use and manage them. The “bigger router is better” solution (e.g., Juniper T640 or TX8, or Cisco CRS-1) may not be acceptable for much longer in a “green” environment.

    I once watched a demo with optical wavelength switches and OpenFlow controllers (SC09) done at Stanford. Different ISPs with different business models, plus a “Private Line” customer, adopted a unified virtualization approach where they all came to the same FlowVisor under the Transport Service Provider (TSP) control. The OpenFlow protocol was used between the ISPs and the FlowVisor, and between the FlowVisor and the client network tiers. The client networks consisted of a single physical infrastructure of packet and circuit switches.

    The TSP was shown to virtualize the network with the FlowVisor while maintaining operator control via NMS/EMS:
    1) The FlowVisor managed “slices” of the TSP’s network for ISP customers, where { slice = bandwidth + control of part of the TSP’s switches }

    2) The NMS/EMS was used to manually provision circuits for Private Line customers
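    The slice definition above ({ slice = bandwidth + control of part of the TSP’s switches }) can be sketched in a few lines of Python. This is an illustrative toy only – the names (Slice, FlowVisorSketch, admit) are invented for this sketch and are not FlowVisor’s actual API; it just shows the admission rule that no two slices may control the same switch port:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Slice:
    """One ISP's slice of the TSP network: bandwidth plus the
    (switch, port) pairs whose control is delegated to that ISP."""
    name: str
    bandwidth_mbps: int
    ports: frozenset


class FlowVisorSketch:
    """Toy arbiter: admits a slice only if none of its ports are
    already delegated to a previously admitted slice."""

    def __init__(self):
        self.slices = []

    def admit(self, s):
        # Collect every port already delegated to an existing slice.
        taken = {p for existing in self.slices for p in existing.ports}
        if taken & s.ports:
            return False  # conflicting control: reject the new slice
        self.slices.append(s)
        return True


fv = FlowVisorSketch()
isp_a = Slice("ISP-A", 1000, frozenset({("sw1", 1), ("sw1", 2)}))
isp_b = Slice("ISP-B", 500, frozenset({("sw1", 2), ("sw2", 1)}))
print(fv.admit(isp_a))  # True: both ports were free
print(fv.admit(isp_b))  # False: ("sw1", 2) already belongs to ISP-A
```

    Real FlowVisor slices are carved out of the OpenFlow flowspace (matching on VLANs, IP ranges, etc.) rather than whole ports, but the non-overlap principle is the same.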

    Of course, as we all know what works in a lab or even in a contained environment, is a far cry from live production networks from more perspectives than we can likely count :-).

    Hence, IMHO, a great subject to address. SDN/NFV is almost here – at the POC stage and about to take off… about time the (I)SPs got in the room too!

    Gratitude and Respects,