Cisco Blogs

Enable Automated Big Data Workloads with Cisco Tidal Enterprise Scheduler


October 2, 2014

In our previous big data blogs, a number of my Cisco associates have talked about the right infrastructure, sizing, integrated infrastructure management, and provisioning and orchestration for your clusters. But to gain the benefits of pervasive big data use, you’ll need to accelerate your big data deployments and make a seamless pivot from “back of the data center” science experiment to standard data center operational processes, speeding delivery of the value of these new analytics workloads.

If you are using a “free” (hint: nothing’s free) or open source workload scheduler, or even a solution that can manage day-to-day batch jobs, you may run into problems right off the bat. Limitations can surface in dependency management, calendaring, error recovery, role-based access control, and SLA management.

And really, this is just the start of your needs for full-scale, enterprise-grade workload automation in big data environments! As the number of your mission-critical big data workloads grows, predictable execution and performance become essential.

Lucky for you, Cisco has exactly what you need! Cisco Tidal Enterprise Scheduler (TES) helps organizations operationalize Hadoop solutions by giving you enterprise-grade workload automation that covers:

  1. Dependency Management: With its ability to design business processes that establish complex dependencies across data repositories and applications, Tidal Enterprise Scheduler overcomes the shortcomings of free or open source schedulers that are usually limited to handling scripts and don’t extend to other common applications within the data center, such as ETL, DBMS, EDW, BI, etc.
  2. Enterprise Calendaring: TES provides robust, enterprise-level calendaring capabilities with a number of built-in and extensible options.
  3. Error Recovery: Robust policies provide the ability to detect failures, send notifications, and retry a job or an entire workflow.
  4. SLA Management: TES enables SLA support that ensures proper tracking and notifications of key workflows and can predict completion time and bottlenecks.
  5. Role-based Access Control: While some schedulers can’t create separate groups for scheduling versus monitoring users, TES provides a simple way to manage users, permissions, and access.
  6. Scale: While native Hadoop schedulers are designed to handle only Hadoop-related jobs and not built for enterprise scale, TES offers a variety of adapters and agents to enable customers to build end-to-end workflows that cut across traditional applications and the upcoming Hadoop-based applications.
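To make the dependency-management and error-recovery ideas above concrete, here is a minimal Python sketch of a workflow runner that executes jobs in dependency order and retries failures. This is illustrative only and is not the TES API; the job names and retry policy are hypothetical.

```python
# Minimal sketch of dependency-ordered execution with retries.
# Illustrative only -- not the Cisco TES API.

def run_workflow(jobs, deps, max_retries=2):
    """jobs: {name: callable}; deps: {name: [prerequisite names]}.
    Runs each job after its prerequisites, retrying on failure.
    Returns the order in which jobs completed."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for pre in deps.get(name, []):   # dependency management
            run(pre)
        for attempt in range(max_retries + 1):
            try:
                jobs[name]()             # execute the job
                break
            except Exception:
                if attempt == max_retries:
                    raise                # retries exhausted: surface the error
        done.add(name)
        order.append(name)

    for name in jobs:
        run(name)
    return order
```

For example, `run_workflow({"load": load, "extract": extract}, {"load": ["extract"]})` always runs the extract step before the load step, regardless of the order the jobs are listed in. An enterprise scheduler layers calendaring, notifications, RBAC, and SLA tracking on top of this core loop.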

In addition to these benefits, your organization will also obtain the following advantages:

  • Efficient Development: Developers and data scientists can focus on their core task of analytics without having to worry about integration or monitoring nuances, such as writing watch-dog scripts or throwing bodies at a complex workload error correction.
  • Consistent Operations Support: With deep API integrations including BI, ERP and standard RDBMS solutions, data center admins can continue to leverage the same applications they use today without having to retrain for Big Data workloads.

Today, apart from the wide variety of adapters and agents that we offer, TES supports the following Hadoop-specific workflows, making your job(s), forgive the pun, that much easier:

  • Data Movement: FTP-like operations to transfer files into and out of Hadoop
  • Sqoop: For moving data between Hadoop and a DBMS
  • MapReduce: For analyzing the data stored in Hadoop
  • Hive: For running SQL-like queries on the Hadoop data store
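The four workflow types above map to standard Hadoop-ecosystem command lines that a scheduler orchestrates. As an illustration, here are Python helpers that build the equivalent CLI invocations; the JDBC URL, table, paths, and query below are hypothetical examples, not values from TES.

```python
# Illustrative argv builders for the Hadoop steps listed above.
# All connection strings, tables, and paths are hypothetical.

def hdfs_put(local_path, hdfs_dir):
    # Data movement: copy a local file into HDFS
    return ["hadoop", "fs", "-put", local_path, hdfs_dir]

def sqoop_import(jdbc_url, table, target_dir):
    # Sqoop: pull a DBMS table into HDFS
    return ["sqoop", "import", "--connect", jdbc_url,
            "--table", table, "--target-dir", target_dir]

def mapreduce_job(jar, main_class, *args):
    # MapReduce: submit an analysis job packaged as a jar
    return ["hadoop", "jar", jar, main_class, *args]

def hive_query(sql):
    # Hive: run a SQL-like query over data in Hadoop
    return ["hive", "-e", sql]
```

Each list can be handed to `subprocess.run` on a Hadoop client node; in TES, these same steps would be defined as jobs in a workflow, with dependencies, calendars, and error-recovery policies attached, rather than invoked by hand.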

You can read more about Cisco IT’s use of Cisco Tidal Enterprise Scheduler for their Big Data infrastructure here: Cisco IT Automates Workloads for Big Data Analytics Environments from Cisco Data Center. And for more on how Cisco IT uses Cisco Unified Computing System (UCS) as the platform for Big Data, you can refer to this case study: How Cisco IT Built Big Data Platform to Transform Data Management.

But better yet, why not give yourself the opportunity to hear Cisco’s entire big data infrastructure and solution story? Please join us for a webcast at 9 AM Pacific time on October 21st entitled ‘Unlock Your Competitive Edge with Cisco Big Data and Analytics Solutions.’ #UnlockBigData

Register now

