How important is privacy in building and maintaining trust with customers? While more than 160 countries have omnibus privacy laws today, business leaders recognize that privacy is more than a compliance exercise – it has become a business imperative that is inextricably tied to customer trust.

As part of Cisco’s recognition of Data Privacy Day on January 28, today we released the Cisco 2024 Data Privacy Benchmark Study, our seventh annual review of key privacy issues and their impact on business. Drawing on responses from more than 2,600 organizations in 12 geographies, the findings highlight the strong connection between privacy and trust, the positive impact of privacy regulation, the attractive economic returns from privacy investment, and some of the key challenges facing organizations over their use of Artificial Intelligence (AI) and generative AI.

Privacy Enables Customer Trust 

Privacy is critical to the buying process today. Ninety-four percent of respondents said their customers would not buy from them if they did not adequately protect data. Customers are also looking for demonstrable evidence that an organization can be trusted when it comes to privacy: 98% of respondents said external privacy certifications – like ISO 27701 and the APEC Cross-Border Privacy Rules – are important factors in their buying decisions, and 97% recognize they have a responsibility to use data ethically. Notably, each of these metrics is at the highest level we’ve seen in our privacy research.

Strong Support for Privacy Laws 

Privacy laws put additional costs and requirements on organizations, including the need to catalog and classify data, implement controls, and respond to user requests. Despite these requirements, organizations continue to overwhelmingly support privacy laws. Eighty percent of respondents said privacy laws have had a positive impact on them, and only 6% said the impact has been negative.

Why would organizations be so positive about regulations that add cost and effort? Because they recognize the need to reassure their customers about how their data is being used. For years, consumer respondents in our research have indicated they want governments to play the leading role when it comes to protecting data. Strong privacy regulation boosts consumer confidence and trust in the organizations that are handling their data.

The Economics of Privacy 

Privacy has continued to provide attractive financial returns for organizations around the world. In this year’s survey, 95% indicated that privacy’s benefits exceed its costs. Additionally, 70% to 80% recognize the specific benefits that come from business drivers such as greater loyalty and trust, mitigating losses from data breaches, achieving operational efficiency, and enabling agility and innovation.

While privacy budgets have remained roughly flat on average for 2023 at $2.7 million, larger organizations have seen increases in the 7-8% range since last year. The average return on privacy investment was 1.6 times, which means the average organization estimates $160 of benefit for each $100 of privacy investment. Thirty percent of organizations are getting returns of at least 2 times their privacy investment.

Slow Progress on AI and Transparency 

We’ve learned that consumers are concerned about how organizations use AI with their data, and 60% have already lost trust in organizations over their AI practices. We asked about this in last year’s Benchmark Study, and 92% of respondents said their organizations needed to do more to reassure customers that their data was being used only for intended and legitimate purposes in AI. We asked the same question this year, and the figure had dropped only one point, to 91%, indicating that little progress has been made.

When asked what they are doing to build confidence in their AI use, organizations cited several initiatives. Fifty percent are ensuring a human is involved in the process, 50% are trying to be more transparent about the AI applications, and 49% have instituted an AI ethics management program. It often takes time to put in place the necessary elements, as highlighted in Cisco’s recently published AI Readiness Index, which showed that only 14% of organizations were fully ready to integrate AI into their businesses.

Concerns with Generative AI 

Generative AI applications use machine learning to quickly create new content, including text, images, and code. Seventy-nine percent of respondents said they are already getting significant value from generative AI. But this new technology brings risks and challenges, including the protection of personal or confidential data entered into these tools.

Over two-thirds of respondents indicated they were concerned about the risk that data entered into these tools could be shared with competitors or with the public. Nonetheless, many have entered information that could be problematic, including non-public information about their company (48%). Fortunately, many organizations are starting to put controls in place on the tools used or the data entered. We will continue to track this novel technology as it evolves.

How to Get More Information 

To learn more, including our recommendations for organizations, check out the Cisco 2024 Data Privacy Benchmark Study, Infographic, and our Principles for Responsible AI.

Also, see the Cisco 2023 Purpose Report (Power section) and the ESG Reporting Hub (Integrity and Trust) to learn how trustworthiness, transparency, and accountability are central to Cisco’s approach to security, privacy, and trust.

All this and more can be found on the Cisco Trust Center.


We’d love to hear what you think. Ask a Question, Comment Below, and Stay Connected with Cisco Security on social!


Author

Robert Waitman

Director, Privacy Center of Excellence