Cisco Blogs



Work-Life Innovation: Impact on the Individual

Networked technologies have made work and learning increasingly mobile and highly flexible. So much so that employees are now choosing work-location flexibility over a higher salary, and employers are providing workers with the tools to facilitate this. Cisco IBSG calls this “Smart Work.” Of course, the ability to make flexible working a viable option depends on a number of factors, including the availability of good broadband connectivity, employer trust, the nature of the work in which an employee is engaged, and suitable social software and video technologies that enable the employee to remain in a connected (albeit virtual) work environment.

Employees, too, have to develop a new form of self-discipline that involves maintaining a good work-life balance; rather than working longer hours, this entails spending much of their extra time with family, in the community, or furthering their own personal and professional development.


Cisco Open Source Conference 2012

We held our annual Cisco Open Source event this week, on May 1 in San Jose. I’m very impressed by the large turnout and the overwhelmingly positive feedback after the keynote and five tracks on Linux, SDN, Big Data, Emerging Technologies, and Community Development. It was wonderful to see Irving Wladawsky-Berger from IBM, Jim Zemlin from the Linux Foundation, and Simon Crosby from Bromium, and to take part in the great discussions that ensued. Next time we’ll have to open this event up to more than just one afternoon; there is simply so much open collaboration taking place.

My thanks to our track leads: Michael Hein, who helped me put together the Linux track; Jan Medved and Dave Ward on SDN; Mark Voelker and Ed Warnicke on Big Data; Fabio Maino and Flavio Bonomi on Emerging Technologies; and Peter Saint-Andre on Community Management and Tools. These folks have already left their mark on timeless and enduring open standards, and it’s amazing to see how good they are in open source! We’ll post the key takeaways in upcoming blog entries. For now, to all of you who came, contributed, and enjoyed this event: we salute you. Open at Cisco is a vibrant and growing community.


Towards an Industry Standard for Benchmarking Big Data Workloads

Industry standard benchmarks have played, and continue to play, a crucial role in the advancement of the computing industry. Demand for them has existed since buyers were first confronted with the choice of purchasing one system over another. Over the years, industry standard benchmarks have proven critical to both buyers and vendors: buyers use benchmark results when evaluating new systems in terms of performance, price/performance, and energy efficiency, while vendors use benchmarks to demonstrate the competitiveness of their products and to monitor release-to-release progress of products under development [1]. Historically, industry standard benchmarks have enabled healthy competition that results in product improvements and the evolution of brand-new technologies.
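To make those buyer-side comparisons concrete, here is a minimal Java sketch of how price/performance and energy efficiency might be derived from published throughput, price, and power figures. The system names and all numbers below are entirely hypothetical and are not actual benchmark results.

// Minimal sketch: comparing systems on price/performance and energy efficiency.
// All system names and figures are hypothetical, not published benchmark results.
public class BenchmarkComparison {

    static void report(String name, double throughput, double priceUsd, double avgWatts) {
        double pricePerf = priceUsd / throughput;      // dollars per unit of throughput (lower is better)
        double perfPerWatt = throughput / avgWatts;    // throughput delivered per watt (higher is better)
        System.out.printf("%s: %.4f $/unit, %.1f units/watt%n", name, pricePerf, perfPerWatt);
    }

    public static void main(String[] args) {
        report("System A", 1_200_000, 900_000, 4_500);   // less expensive configuration
        report("System B", 1_500_000, 1_300_000, 5_200); // faster, but at a higher price and power draw
    }
}

The point of such metrics is that raw throughput alone rarely decides a purchase; two systems can be ranked differently depending on whether performance, cost, or energy is weighted most heavily.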

Over the past quarter-century, industry standard bodies like the Transaction Processing Performance Council (TPC) and the Standard Performance Evaluation Corporation (SPEC) have developed several industry standards for performance benchmarking, which have been a significant driving force behind the development of faster, less expensive, and/or more energy efficient system configurations.

The world has been in the midst of an extraordinary information explosion over the past decade, punctuated by rapid growth in the use of the Internet and the number of connected devices worldwide. Today, we’re seeing a rate of change faster than at any point in history, and both enterprise application data and machine-generated data, known as Big Data, continue to grow exponentially, challenging industry experts and researchers to develop innovative new techniques to evaluate and benchmark hardware and software technologies and products.

I am co-chairing the Workshop on Big Data Benchmarking (WBDB 2012) [2], supported by the National Science Foundation (NSF.gov), with my distinguished colleagues Chaitanya Baru, Meikel Poess, Milind Bhandarkar, Tilmann Rabl, and others. This is a crucial initial step towards the development of an industry standard benchmark that provides objective measures of the effectiveness of hardware and software systems dealing with Big Data. Several industry experts and researchers have been invited to present and debate their visions for benchmarking big data platforms.

A report from this workshop will be presented at the just-announced 4th International Conference on Performance Evaluation and Benchmarking (TPCTC 2012) [3], organized by the TPC and collocated with the 38th International Conference on Very Large Data Bases (VLDB 2012), a premier forum for data management and database researchers, vendors, and users. With this conference, we encourage industry experts and researchers to submit ideas and methodologies in performance evaluation, measurement, and characterization in areas including, but not limited to: big data, cloud computing, business intelligence, energy and space efficiency, hardware and software innovations, and lessons learned in practice using TPC and other benchmark workloads [4].

Cisco has been an active member of the TPC since 2010 and of SPEC since 2009.

[1] R. Nambiar, N. Wakou, P. Thawley, A. Masland, M. Lanken, M. Majdalany, F. Carman: Shaping the Landscape of Industry Standard Benchmarks: Contributions of the Transaction Processing Performance Council, Springer, 2011
[2] Workshop on Big Data Benchmarking: http://clds.ucsd.edu/wbdb2012/
[3] TPC Press Release: http://finance.yahoo.com/news/transaction-processing-performance-council-announces-150000511.html
[4] TPCTC 2012 Call for Papers: http://www.tpc.org/tpctc2012/


Cisco UCS Ecosystem for Oracle: Extend Support to Big Data and Oracle NoSQL Database

Many Big Data-related innovations have been developed by Web 2.0 companies, resulting in a growing collection of open source technologies that are dramatically changing the culture of collaborative software development and the scale and economics of hardware infrastructure. These technologies enable cost-effective data storage, management, and analysis in ways that were not previously possible with traditional technologies such as relational database management systems.

NoSQL is one such technology. It has emerged as an increasingly important part of the big data trend for applications that demand large volumes of simple reads and updates against very large datasets (Hadoop is the other major innovation: a general-purpose processing framework designed to execute “read-only” queries and batch jobs against massive datasets). NoSQL is often characterized by what it is not, and definitions vary: it can mean “Not Only SQL” or simply “not a SQL-based relational database management system.” NoSQL databases form a broad class of non-relational database management systems that are evolving rapidly, and several solutions are emerging with highly variable feature sets and few standards.

While these technologies are attractive for the innovations they bring, not all products meet enterprise requirements. Many organizations require robust, commercially supported solutions that can be deployed rapidly and integrated into their existing enterprise application infrastructure.

To address these needs, Cisco and Oracle are the first vendors to collaborate on delivering enterprise-class NoSQL solutions. The combination of the Cisco Unified Computing System (UCS) and Oracle NoSQL Database makes exceptional performance, scalability, availability, and manageability possible. Together, they provide a platform for rapid deployment with predictable throughput and latency for the most demanding applications.
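To give a flavor of the “large volumes of simple reads and updates” workload described above, here is a minimal sketch against Oracle NoSQL Database’s classic key/value Java API (the oracle.kv package). The store name, helper host, and key path are hypothetical; a real deployment would point at a store running across multiple UCS nodes.

import oracle.kv.KVStore;
import oracle.kv.KVStoreConfig;
import oracle.kv.KVStoreFactory;
import oracle.kv.Key;
import oracle.kv.Value;
import oracle.kv.ValueVersion;

public class HelloOracleNoSql {
    public static void main(String[] args) {
        // Connect to a hypothetical store named "kvstore" via a helper host:port.
        KVStore store = KVStoreFactory.getStore(
                new KVStoreConfig("kvstore", "localhost:5000"));

        // Simple update: write a value under the key path /users/-/jdoe
        Key key = Key.createKey("users", "jdoe");
        store.put(key, Value.createValue("Jane Doe".getBytes()));

        // Simple read: fetch the value back by key.
        ValueVersion vv = store.get(key);
        if (vv != null) {
            System.out.println(new String(vv.getValue().getValue()));
        }

        store.close();
    }
}

Because each get and put addresses a single key, this kind of workload can be partitioned by key and scaled horizontally across nodes, which is exactly where predictable throughput and latency matter most.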



Innovation in Government? Collaborative Research is Critical to Fuel Breakthroughs

This week I had the opportunity to meet with a research group from the University of Tokyo visiting California to explore the role of technology in the intelligent cities of the future. I prepared for this meeting with a discussion with my colleague Dr. Norm Jacknis about his collaboration with government leaders and university researchers who are delving deeply into the impact of the Internet on government, politics, and society.

Three takeaways were clear from these conversations:

1. The critical importance of collaborative research across expertise domains, geographies, and the public and private sectors

2. The capability to harness the explosion of information, or big data deluge, being fueled by mobile devices connected to the intelligent network

3. An optimistic point of view about the potential for research applications (and I’m an optimist!)

Next month, Cisco is hosting a live webcast, Fueling Innovation: How Research is Really Done, with Dr. Martin Chalfie, 2008 Nobel Laureate in Chemistry (February 29, 2012, at 9:00 am Pacific Time / 12:00 pm Eastern Time).

This webcast will explore how the fruits of basic research are critical to fueling applications. Dr. Chalfie will give examples from his own research developing Green Fluorescent Protein (GFP) as a biological marker, as well as from the work of others, to demonstrate that basic research into fundamental problems in biology is important both for its own sake and for the development of a variety of new applications.

While research is typically focused on one industry, great discoveries generally provide value for multiple industries. 

Dr. Chalfie is a Professor of Biological Sciences and former chair of the Department of Biological Sciences at Columbia University. In 2008 he shared the Nobel Prize in Chemistry with Osamu Shimomura and Roger Tsien for his introduction of GFP as a biological marker.
