Wealth management firms are spending billions on IT to differentiate in the marketplace. Yet the question remains: can “Big Data” have a material impact on the business? Can it deliver business outcomes by reducing risk, increasing assets under management, and driving profitability, client satisfaction, products per client, and client and financial advisor retention, all while improving the cost/income ratio and return on equity?
These questions are being discussed in boardrooms across the financial industry, and they are the topics I will cover in this blog series.
To answer these questions, we need to put the wealth client at the center and understand clients’ changing needs and expectations around how they want to be served by the firm. We need to examine external factors such as the impact of game-changing consumer technology and unprecedented client access to information, and understand how new market entrants are challenging the traditional financial advisor value proposition and business model as a new round of robo-advisors hits the market.
Until recently, banks enjoyed an account-centric, transactional business model. What is changing is the onset of unstructured social interaction data as smart mobile devices and mobile broadband Internet usage reach high penetration levels. Device proliferation is making “data exhaust” available from mobile phones, tablets, automobiles, video cameras, and from sensors in buildings, streets, consumer wearables, and footfall traffic counters. Correlating such data to better attract, retain, and serve clients can create market advantage.
The “Big” in Big Data comes from the fact that worldwide data volume is doubling every two years, with unprecedented volume, variety, and velocity. Ninety percent of all the data in the history of the world was created in the last two years (SINTEF)! The concept of Big Data is about correlating and analyzing transaction data, social interaction data, and machine/sensor data in a way that turns data into knowledge, knowledge into insights, and insights into actions in real time.
So what does this all mean for wealth managers?
As a wealth manager, what impact would it have on your business if you were able to increase the understanding of your client exponentially? Actions derived from data are informed by highly personalized needs predictions that can arm wealth managers with deep insights about their clients, increase their relevance in every interaction, and directly contribute to business outcomes. Big Data can help wealth managers transform the client value proposition and re-imagine the client experience.
The new vision for financial services is that a firm must be present in the financial lives of its clients, any time, any place, on any device, and across any channel.
The firm can no longer wait for the client to come to it. It must be proactive in delivering highly relevant, value-added services in real time and anticipate client needs. The firm should aspire to create a “market of one” experience for each wealth client, understand the needs of and the hierarchy within the household, and move from an account-centric to a client-centric go-to-market approach.
When it comes to Big Data in wealth management, start with the foundation: put the client at the center and define business outcomes. Focus on building capabilities around what is possible while re-imagining the client experience.
Wealth management firms can take concrete steps, in the form of measurable, business-outcome-based projects, to significantly enhance the client experience. These include:
- Define a roadmap for wealth client data analytics maturity. This will identify gaps that can be addressed, resulting in more relevant advisor-client interactions.
- Establish a wealth client listening system across all channels. Early detection of client behaviors can lead to the identification of issues and sales opportunities.
- Create a real-time single view of wealth client data with data virtualization. Substantial savings can be had by leaving disparate data in place while providing managers with a single view.
- Establish an analytics-driven financial advisor collaboration platform. This helps create market differentiation by maximizing advisor productivity and enabling daily sharing of best practices.
- Deploy mobile virtual advisor video capability and establish branch analytics. This improves the client experience and gives advisors more minutes per day with clients, increasing cross-selling opportunities.
- Empower advisors with real-time client insights to drive business outcomes. This helps the advisor respond to client life events with much greater granularity and speed.
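To make the “listening system” and “real-time insights” steps above concrete, here is a minimal sketch of the kind of rule such a system might run. The event fields, sentiment scores, and thresholds are all hypothetical illustrations, not a description of any firm’s actual platform:

```python
from collections import defaultdict

# Hypothetical cross-channel interaction events; negative sentiment scores
# indicate an unhappy client interaction.
events = [
    {"client": "C100", "channel": "branch", "sentiment": 0.4},
    {"client": "C100", "channel": "mobile", "sentiment": -0.6},
    {"client": "C100", "channel": "call",   "sentiment": -0.8},
    {"client": "C200", "channel": "web",    "sentiment": 0.7},
]

def listen(events, window=2, threshold=-0.5):
    """Flag clients with `window` consecutive strongly negative interactions."""
    streak, alerts = defaultdict(int), []
    for e in events:  # a real system would consume a live stream, not a list
        streak[e["client"]] = streak[e["client"]] + 1 if e["sentiment"] <= threshold else 0
        if streak[e["client"]] == window:
            alerts.append(e["client"])  # surface to the advisor in real time
    return alerts

print(listen(events))  # → ['C100']
```

The point is not the toy rule itself but the pattern: detection logic runs continuously across channels, and the output is routed to the advisor while the client relationship can still be saved.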
The choices that wealth management firms make around data analytics in the next two years will determine their position in the marketplace. Can Big Data help wealth managers? With a client-centric, business-outcomes-driven solutions approach, the answer is a resounding yes!
I will discuss each of the above steps in more detail in my next blog. As always, I welcome your suggestions, stories, and feedback!
Tags: analytics, Big Data, financial advisor, Financial Services, video, wealth management
Cisco has been working closely with Hortonworks to deliver turnkey Big Data solutions that expedite time to market for our joint Big Data Hadoop customers. Cisco’s industry-leading UCS Integrated Infrastructure for Big Data is designed to deliver performance at scale for a wide variety of Big Data workloads. We are working with Hortonworks to integrate Cisco UCS Director Express for Big Data with Apache Ambari to provide a fully automated solution for deploying and managing Hadoop hardware, networking, and the Hortonworks Data Platform. It is built on the solid foundation of the highly successful UCS management platform and the award-winning UCS Director orchestration engine.
Today, we are excited that Cisco UCS is HDP certified and Operations Ready. The integration with Apache Ambari now allows customers to deploy and manage their Hadoop clusters in a reliable and consistent manner. The Operations Ready designation is a new certification introduced by Hortonworks to provide additional assurance that the tool has been integrated with the Apache Ambari APIs. Cisco UCS with Hortonworks delivers a fully validated solution and reduces the complexity of managing Hadoop clusters. Cisco is committed to bringing industry-leading solutions for Big Data to market, in partnership with Hortonworks and other ecosystem partners.
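For readers curious what “integrated with the Apache Ambari APIs” looks like in practice, here is a minimal sketch of the kind of REST call Ambari exposes for managing a Hadoop service. The host, cluster name, and credentials are hypothetical placeholders, and this illustrates Ambari’s public REST interface, not Cisco’s integration code:

```python
import base64
import urllib.request

AMBARI = "http://ambari.example.com:8080"  # hypothetical Ambari server
AUTH = base64.b64encode(b"admin:admin").decode()  # Ambari's default credentials

def service_request(cluster, service, state):
    """Build an Ambari REST call that moves a service to a target state
    (e.g. "STARTED" to start it, "INSTALLED" to stop it)."""
    body = ('{"RequestInfo": {"context": "Set %s to %s"}, '
            '"Body": {"ServiceInfo": {"state": "%s"}}}' % (service, state, state))
    return urllib.request.Request(
        f"{AMBARI}/api/v1/clusters/{cluster}/services/{service}",
        data=body.encode(),
        method="PUT",
        headers={"X-Requested-By": "ambari",  # required by Ambari's CSRF guard
                 "Authorization": f"Basic {AUTH}"})

req = service_request("hdp_cluster", "HDFS", "INSTALLED")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req) would submit the call against a live cluster
```

Because every lifecycle operation is exposed this way, a management tool like UCS Director Express can script cluster deployment and operations end to end instead of driving the Ambari web UI by hand.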
Tags: Big Data, Cisco UCS Director, UCS Director Express for Big Data, ucs integrated infrastructure, ucsbigdata, unlock big data, UnlockBigData
Ten large oil refineries produce about 10 terabytes of data each day, which equates to the entire printed collection of the U.S. Library of Congress.
One modernized city the size of Singapore can generate about 2.5 petabytes of data every day, which translates to all U.S. academic research libraries combined.
And with more than 14 billion data-transmitting devices connected to the Internet today, growing to 50 billion by 2020, it is little wonder that most of us are overwhelmed by this mind-boggling explosion of data.
Turning this flood of raw data into useful information, and even wisdom, for better business decisions and quality-of-life experiences is what the Internet of Everything (IoE) is all about. This is a daunting task. According to IDC Research, just 0.5 percent of all data is used or analyzed, and online data volumes are doubling every two years from a combination of mobile devices, video, sensors, M2M, social media, applications, and much more.
Connected Analytics Portfolio
Last Thursday, however, Cisco unveiled our Connected Analytics portfolio for the Internet of Everything, a unique approach that includes software packages to bring analytics to the data, regardless of its location or whether it is in motion or at rest. This new generation of analytics tools for IoE can convert more and more data into valuable intelligence, from the Intercloud to the data center to the network’s edge.
Tags: analytics, Big Data, Cisco, Fog, Internet of Everything, internet of things, IoE, IoT, Process Improvement, Wim Elfrink
In this episode of Engineers Unplugged, Tony Harvey (@tonyknowspower) and Craig Sullivan (@craigsullivan70) discuss the role of storage in SAP HANA. How does big data impact you? Watch and learn.
Thinking about unicorns.
This is Engineers Unplugged, where technologists talk to each other the way they know best, with a whiteboard. The rules are simple:
- Episodes will publish weekly (or as close to it as we can manage)
- Subscribe to the podcast here: engineersunplugged.com
- Follow the #engineersunplugged conversation on Twitter
- Submit ideas for episodes or volunteer to appear by Tweeting to @CommsNinja
- Practice drawing unicorns
Join the behind the scenes by liking Engineers Unplugged on Facebook.
Tags: Big Data, NetApp, SAP HANA, Storage, UCS
Want to get the most out of your big data? Build an enterprise data hub (EDH).
Big data is rapidly getting bigger. That in itself isn’t a problem. The issue is what Gartner analyst Doug Laney describes as the three Vs of Big Data: volume, velocity, and variety.
Volume refers to the ever-growing amount of data being collected. Velocity is the speed at which the data is being produced and moved through the enterprise information systems. Variety refers to the fact that we’re gathering information from multiple data sources such as sensors, enterprise resource planning (ERP) systems, e-commerce transactions, log files, supply chain info, social media feeds, and the list goes on.
Data warehouses weren’t made to handle this fast-flowing stream of wildly dissimilar data. Using them for this purpose has led to resource-draining, sluggish response times as workers attempt to perform numerous extract, load, and transform (ELT) operations to make stored data accessible and usable for the task at hand.
Constructing Your Hub
An EDH addresses this problem. It serves as a central platform that enables organizations to collect structured, unstructured, and semi-structured data from slews of sources, process it quickly, and make it available throughout the enterprise.
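As a toy illustration of what “collecting structured, unstructured, and semi-structured data from slews of sources” means in practice, the sketch below normalizes three hypothetical source shapes into one common record layout. Real hubs do this at scale with platforms like Hadoop; the field names and sources here are invented for illustration:

```python
import csv
import io
import json

# Hypothetical samples of the three data shapes an EDH ingests side by side.
structured = "client_id,balance\nC100,250000\n"                      # CSV from an ERP export
semi_structured = '{"client_id": "C100", "channel": "mobile"}'       # JSON event feed
unstructured = "2015-03-02 INFO client C100 viewed retirement page"  # raw log line

def to_record(source, payload):
    """Normalize each source into one common record layout for the hub."""
    if source == "csv":
        row = next(csv.DictReader(io.StringIO(payload)))
        return {"client_id": row["client_id"], "source": "erp", "data": row}
    if source == "json":
        event = json.loads(payload)
        return {"client_id": event["client_id"], "source": "events", "data": event}
    # Fall through: keep the raw text and tag a client id if one is visible.
    cid = next((tok for tok in payload.split() if tok.startswith("C")), None)
    return {"client_id": cid, "source": "logs", "data": {"raw": payload}}

hub = [to_record(s, p) for s, p in
       [("csv", structured), ("json", semi_structured), ("log", unstructured)]]
print(sorted({r["client_id"] for r in hub}))  # → ['C100']
```

Once every source keys on a common identifier like the client ID, the hub can serve the data enterprise-wide without each consumer re-parsing the original formats.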
Building an EDH begins with selecting the right technology in three key areas: infrastructure, a foundational system to drive EDH applications, and the data integration platform. Obviously, you want to choose solutions that fit your needs today and allow for future growth. You’ll also want to ensure they are tested and validated to work well together and with your existing technology ecosystem. In this post, we’ll focus on selecting the right hardware.
The Infrastructure Component
Big data deployments must be able to handle continued growth, from both a data and user load perspective. Therefore, the underlying hardware must be architected to run efficiently as a scalable cluster. Important features such as the integration of compute and network, unified management, and fast provisioning all contribute to an elastic, cloud-like infrastructure that’s required for big data workloads. No longer is it satisfactory to stand up independent new applications that result in new silos. Instead, you should plan for a common and consistent architecture to meet all of your workload requirements.
Big data workloads represent a relatively new model for most data centers, but that doesn’t mean best practices must change. Handling a big data workload should be viewed through the same lens as deployments of traditional enterprise applications. As always, you want to standardize on reference architectures, optimize your spending, provision new servers quickly and consistently, and meet the performance requirements of your end users.
Cisco Unified Computing System to Run Your EDH
The Cisco Unified Computing System™ (Cisco UCS®) Integrated Infrastructure for Big Data delivers a highly scalable platform that is proven for enterprise applications like Oracle, SAP, and Microsoft. It also brings the same required enterprise-class capabilities (performance, advanced monitoring, simplified management, and QoS guarantees) to big data workloads. With lower switch and cabling infrastructure costs, lower power consumption, and lower cooling requirements, you can realize a 30 percent reduction in total cost of ownership. In addition, with its service profiles, you get fast and consistent time to value by leveraging provisioning templates to instantly set up a new cluster or add many new nodes to an existing cluster.
And when deploying an EDH, the MapR Distribution including Apache™ Hadoop® is especially well suited to take advantage of the compute and I/O bandwidth of Cisco UCS. Cisco and MapR have been working together for the past two years and have developed Cisco Validated Design guides to provide customers the most value for their IT expenditures.
Cisco UCS for Big Data comes in optimized power/performance-based configurations, all of which are tested with the leading big data software distributions. You can customize these configurations further or use the system as is. Using one of Cisco UCS for Big Data’s pre-configured options goes a long way toward ensuring a stress-free deployment. All Cisco UCS solutions also provide a single point of control for managing all computing, networking, and storage resources, for any fine-tuning you may do before deployment or as your hub evolves in the future.
I encourage you to check out the latest Gartner video to hear Satinder Sethi, our VP of Data Center Solutions Engineering and UCS Product Management, share his perspective on how powering your infrastructure is an important component of building an enterprise data hub.
In addition, you can read the MapR Blog, Building an Enterprise Data Hub, Choosing the Foundational Software.
Let me know if you have any comments or questions here, or reach me on Twitter at @CicconeScott.
Tags: Big Data, blade server, blades servers, C240 M3 Rack Server, Cisco UCS, Cisco Unified Computing System, Cisco Unified Data Center, Cisco Unified Fabric, Enterprise Data Hub, Gartner, Hadoop, MapR, rack server, UCS Central, UCS service profiles