Hey data heads! O’Reilly Strata Conference kicks off tomorrow at the Santa Clara Convention Center. If you plan to go, you had better have tickets already, because the event is sold out.
If you are attending the three days of sessions, keynotes, receptions, and exhibits, please stop by the MapR booth for some face time with one of our big data partners, and plan to attend their sponsored panel discussion, which includes Cisco IT and other MapR customers and is facilitated by Mike Gualtieri of Forrester Research.
Best Practices for Hadoop In Production
“Mike Gualtieri, principal analyst at Forrester Research, Inc., will engage a panel of production Hadoop users – including Cisco IT, The Climate Corporation, The Rubicon Project, and Solutionary – to discuss the challenges and best practices for deploying Hadoop in production. Join us for an engaging conversation on tips and tricks in deploying Hadoop in production. 10:40am, Wednesday February 12 in Ballroom H” See more here.
Representing Cisco IT will be Distinguished Engineer Piyush Bhargava. Piyush is Chief Architect for Data Architecture & Innovation and is responsible for finding new ways of harnessing value from data and for designing architectures to support data processing, analytics, and data science needs.
If you can’t make it to Strata Conference, or are double-booked during this time slot, you can get the goods on this topic from a number of other sources.
- First, check out the webinar Cisco and MapR produced with Mike, which highlights the Cisco IT case study. Mike outlines the 7 Best Practices for Productionizing Hadoop, MapR follows with their best practices for data tables, and Cisco IT covers their big data use case. If you want to skip to the Cisco IT use case, it starts at 34:00.
- Then, take a look at my blog post on the Cisco IT big data platform success story. Cisco IT has standardized on Cisco UCS servers and fabric interconnects for its big data infrastructure and uses MapR for data tables.
For operational workloads, Cisco IT has standardized company-wide on Cisco Tidal Enterprise Scheduler with its unique set of adapters that include API integrations to MapR, Hive and Sqoop for big data workload processing.
- Next, spend some time reading the Unleashing IT Big Data Edition. This special edition of Unleashing IT is all about big data and highlights our joint Cisco/MapR customer Solutionary, who, incidentally, will join Cisco IT on the Strata Conference panel discussion on Wednesday. This e-zine is packed with big data thought leadership, best practices for infrastructure and operations, and customer success stories, and it includes our partners Intel, MapR, and others.
- Finally, peruse the Cisco big data website for all the latest on our solutions, partners, services and resources to help you make the best informed decision on moving your big data platform from development to production.
Tags: Big Data, Cisco IT, Cisco UCS, Cisco Unified Management, inside cisco it, software, Strata Conference, Tidal Enterprise Scheduler, unified management, workload automation
Editor’s Note: This is the second of a four-part deep dive series into High Density Experience (HDX), Cisco’s latest solution suite designed for high density environments and next-generation wireless technologies. For more on Cisco HDX, visit www.cisco.com/go/80211ac. Read part 1 here. Read part 2 here.
The 802.11ac wireless networking standard is the most recent introduced by the IEEE (now ratified), and it is rapidly becoming an accepted, reliable industry standard. The good news is that client and vendor adoption of 802.11ac is growing at a much faster pace than when 802.11n was introduced back in 2009. Mobile devices and laptops with embedded 802.11ac Wi-Fi chipsets are entering the market at an accelerating rate. Unlike in the past, laptop, smartphone, and tablet manufacturers now acknowledge that staying up to date with the latest Wi-Fi standards is as important to bandwidth-hungry users as a better camera or a higher-resolution display.
With the launch of the new 802.11ac AP 3700, Cisco introduces Cisco HDX (High Density Experience) Technology. Cisco HDX is a suite of solutions aimed at augmenting the higher performance, greater speed, and better client connectivity that the 802.11ac standard delivers today.
ClientLink 3.0 is an integral part of Cisco HDX technology, designed to resolve the complexities that come along with the BYOD trend driving the proliferation of 802.11ac-capable devices.
So what is ClientLink 3.0 technology and how does it work?
ClientLink 3.0 is a Cisco patented 802.11ac/n/a/g beamforming technology Read More »
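The excerpt above cuts off before the mechanics, but the core idea behind transmit beamforming can be sketched in a few lines. The toy below is illustrative only (a generic maximum-ratio-transmission-style sketch, not Cisco's patented ClientLink algorithm): the access point weights each antenna's copy of the signal by the complex conjugate of that antenna's channel estimate, so the copies arrive in phase at the client and add coherently.

```python
import cmath
import math
import random

def beamform_weights(channel):
    """Conjugate the channel estimates, normalized to unit total transmit power."""
    norm = math.sqrt(sum(abs(h) ** 2 for h in channel))
    return [h.conjugate() / norm for h in channel]

# A hypothetical 4-antenna channel: equal path magnitudes, random phases.
random.seed(1)
channel = [cmath.exp(1j * random.uniform(0, 2 * math.pi)) for _ in range(4)]

weights = beamform_weights(channel)

# With beamforming, the per-antenna copies add in phase at the receiver:
# |sum(h_i * w_i)| = sqrt(sum |h_i|^2) = 2.0 for four unit-magnitude paths.
coherent = abs(sum(h * w for h, w in zip(channel, weights)))

# Without beamforming (same total power split evenly, no phase alignment),
# the copies combine with random phases and can partially cancel.
naive = abs(sum(h * (1 / math.sqrt(4)) for h in channel))

print(round(coherent, 3))  # 2.0 — the coherent-combining ceiling
print(naive <= coherent)   # True — unaligned phases never beat alignment
```

The gain shown here is exactly the effect the post describes: a stronger signal at the client for the same transmit power, which translates into a better SNR and higher usable data rates at the cell edge.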
Tags: 802.11, access point, antenna, AP, beamforming, cell size, Cisco, client, client connectivity, ClientLink, device, downlink, hardware, HD, HDX, high density, IEEE, Industry Standard, LAN, mobile, mobility, network, rf, smartphone, software, solution, tablet, technology, wi-fi, wifi, wireless, wlan
“Software is Eating the World” is a quote attributed to Marc Andreessen and further explored by his business partner Ben Horowitz. Marc Andreessen gives compelling reasons to validate this quote. To some extent I have to agree with some of his reasons (but I am also a little biased as a software engineer). On the other hand, when I read this (and this is partly based on working in different domains on software), I wonder if software is really that disruptive. If you look “under the hood” of software applications, you find that a lot of software is based on fundamental principles that are already 20-30 years old, yet they are still frequently used (and for good reasons). That does not mean there are no new advances in software; however, old and proven technologies still play an important role (as we say in mathematics, it does not become old, it becomes classic).
So maybe “Software is Eating the World” is really due to advances in hardware? Could you have run modern enterprise applications in the cloud 20 years ago? One of the challenges would certainly have been bandwidth. Was the iPhone a victory for software or hardware? Much of the iPhone GUI was not that revolutionary, in my opinion, but the combination of hardware and software made for a potent technology disruption.
Read More »
Tags: analytics, API, SDN, sensors, Service Provider, software
This was the title of a November 19, 2013 panel that I moderated in Washington, D.C. at the MPLS-SDN Isocore Conference.
The abstract for this panel was designed to be a bit provocative, specifically:
“Virtualization as a concept is not new. However, in the context of Software Defined Networking, the virtualization discussion has been focused on overlay functions, e.g., networking. What about virtualization overlays and interworking with existing architectures? What are the implications for performance and management? Are we speaking the same language?
The panelists will have an opportunity to articulate the virtualization problem space for the industry, and the opportunity for the industry to address it.”
My panelists included the following individuals: Read More »
Tags: analytics, APIs, Big Data, SDN, Service Provider, services, software