Dive Brief:
- Tech professionals say data privacy is at the top of their list of ethical concerns surrounding the deployment of generative AI in the enterprise, according to a Deloitte report released Monday. The firm surveyed 1,848 business and technology professionals.
- According to the report, nearly three-quarters of professionals ranked data privacy among their top three ethical concerns when using the technology.
- Two in five respondents named data privacy as their top concern this year, nearly double the one in four who cited it in Deloitte’s 2023 survey.
Dive Insight:
Technology leaders are addressing their organizations’ infrastructure and talent needs while helping to guide generative AI adoption. Ethical concerns should be on that checklist, too.
“GenAI eliminates the ‘expertise barrier’: more people can make the most of data, with less technical knowledge required,” said Sachin Kulkarni, managing director of risk and brand protection at Deloitte LLP, in the report. “While this is a benefit, the potential for data leaks may also increase as a result.”
Professionals are also concerned about generative AI’s impact on transparency, data provenance, intellectual property and hallucinations. Job cuts, though often cited as a major concern, were flagged by only 16% of respondents.
Among emerging technology categories, business and IT professionals have identified cognitive technologies – a category that includes, among others, large language models, machine learning, neural networks and generative AI – as posing the most serious ethical risks.
The category outranked digital reality, autonomous vehicles and robotics, among other technology verticals, in perceived ethical risk. However, respondents also ranked cognitive technologies as the most likely to contribute to social good.
Due to their reliance on data, a majority of executives are concerned that generative AI tools can increase cybersecurity risk by expanding the attack surface, a Flexential survey published earlier this month found.