Few things get me as fired up about this industry as being able to dive into some serious data about, well, data. By now, most of our readership is quite familiar with our Cisco Visual Networking Index study, or VNI. It comprises several separate efforts, with the VNI Forecast being the best known of the bunch and the one that is most often quoted in the media, used by regulators, and factored into architectural plans by our customers. The Forecast, as its name conveys, offers a forward-looking view. Using third-party subscription growth forecasts and advanced modeling techniques, we deliver an estimate of the amount of traffic that will be crossing global IP networks over the next half decade.
But what is real traffic looking like now, you ask?
For that, we have the Cisco VNI Usage study – unlike the Forecast, which uses third-party subscription growth inputs, VNI Usage assesses the activity taking place over networks every day. It's a compilation of anonymous data sourced from millions of broadband subscriber lines from over 20 service providers worldwide. While we don't know the names of the users or the specific content viewed, this unique, primary data does provide subscriber-level insights into a variety of factors, such as the type of applications used, when and how much bandwidth is consumed, and the context for the networking trends and challenges that many network operators are grappling with today. It's real data on data that's real interesting (and you can quote me on that). Over a series of posts this week, I'll dive into the details of the just-released Cisco VNI Usage study, which covers the research period of the third calendar quarter of 2010.
Here are some top takeaways:
- The average broadband connection generates 14.9 GB of Internet traffic per month, up from 11.4 GB per month last year when we ran this same study – this is an increase of 31 percent when averaged out across the global subscriber base.
- Peer-to-peer (P2P) file sharing is now 25 percent of global broadband traffic – last year it was 38 percent of total traffic. It's important to know that despite this significant drop in percentage, the overall amount of traffic generated by P2P in absolute terms is still growing – it's just growing more slowly than visual networking and other advanced applications such as online video. Which leads me to…
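As a quick sanity check on the headline figure above, the year-over-year growth rate follows directly from the two per-subscriber averages (14.9 GB this year versus 11.4 GB last year). This is a minimal sketch of that arithmetic; the function name is just illustrative, not part of the study:

```python
def yoy_growth_percent(current_gb: float, prior_gb: float) -> float:
    """Year-over-year growth of average monthly traffic, in percent."""
    return (current_gb - prior_gb) / prior_gb * 100

# Average broadband connection: 11.4 GB/month last year, 14.9 GB/month now.
growth = yoy_growth_percent(14.9, 11.4)
print(f"{growth:.1f}%")  # ≈ 30.7%, i.e. the ~31 percent cited in the study
```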
Tags: broadband subscriber, internet traffic, internet usage study, research, visual networking index, vni
Broadband is a term that has been around for years and is admittedly overused. What some refer to as "broadband" is really just a step up from narrowband… call it "slightly wider narrowband" if you will. As many long-time readers of SP360 know, at Cisco we've been big proponents of higher broadband standards and classifications, since consumers, businesses, and governments alike stand to benefit. It's not enough to just get marginally faster email when most of the rest of the world is focused on video and other advanced applications, which require not just bandwidth but intelligence as well.
While there are many studies that chart overall broadband penetration, or the percentage of a population that receives broadband (by whatever definition may be used), we realized a few years ago that there wasn't a study focused on broadband "quality" (i.e., what that broadband can actually do). To that end, Cisco sought out and agreed to sponsor an effort from the Saïd Business School of Oxford University and the University of Oviedo's Department of Applied Economics. The effort, called the Broadband Quality Study, or BQS, is now in its third year, with the latest results just released.
So what's the news this year compared to last year, or to 2008 when the study debuted? The BQS, which uses data from 40 million real-life broadband quality tests conducted in 72 countries around the globe between May and June of 2010, gives us many new insights and surprises every year. But I would say the real standout result this year is the speed at which countries have been able to become broadband leaders in just a few years. While we have suspected this trend in the past, only now, with three years of data to draw on, does the BQS really prove it:
Tags: broadband, broadband quality, Broadband Quality Study, economy, Oxford University, research, Saïd Business School
As we face the combined challenges and opportunities presented by globalization, technology acceleration, and demographic shifts, competition is increasing, and innovation is becoming more and more critical for companies and countries to succeed and stay ahead of the curve.
This past week at Educause, we announced the availability of our new Research and Administrative Computing solution based on Cisco’s world-class Data Center and Collaboration technologies. This solution can help administrative leaders and research center directors save time and money and improve performance by working better together. By improving research collaboration within the university, and between universities, companies, and governments, we can improve the ability to innovate.
Tags: administration, architecture, collaboration, communication, education, growth, higher, innovation, research
Universities have traditionally been measured on the quantity and quality of research funding and research outputs as one of the main factors of institutional success. It is one measure used to rank universities both nationally and internationally, and it is used more than any other measure to attract both students and faculty. But is the way research effectiveness is perceived about to change? The internet and web 2.0 have made collaboration easier and easier. Hours of library research have been reduced not only through sophisticated search engines, online journals, and e-books, but also through access to the blogs of leading academics, researchers, and research students, and to tools such as Twitter, Del.icio.us and wikis.
In published research papers, arguments are based on the evidence of research data, much of which is invisible or evidenced only through carefully selected quotations or the results of experiments. With the ability to store video and audio electronically as well as numerical and textual data, research publications will become increasingly multi-modal, and data sets will be made easily accessible so that research results can be opened to greater public scrutiny as well as re-use by others.
Perhaps, as collaboration is made easier and facilitated between and across institutions, sole authorship will become a thing of the past as academics work together on research questions, sharing data sets and interpreting them in new ways according to their research questions, their academic domain, and their cultural experience. Researchers will post their research data in multiple formats onto collaboration web sites so others can take them down, re-interpret them together, and add to the global pool of knowledge.
Peer review and refereed publications will be replaced by peer argument and the co-creation and co-development of theory, which means research council funding will need to be based on new criteria.
This scenario is perhaps not too futuristic as the tools that make it possible are available now. So what will academic expertise mean in a web 2.0 world and how will academic rigour be judged? What will the criteria be for acceptance as contribution to research, and how will universities be judged on their scholarly output? And finally, what will be counted as authoritative research evidence and who will it belong to?
Tags: collaboration, higher education, peer review, research, web 2.0