You may not know it, but the world of cybersecurity is about to experience its Super Bowl. More than 40,000 people from more than 130 countries will travel to San Francisco the week of May 6 for the 33rd annual RSA Conference. This will be my 16th year as chairman of the RSA Conference, and there is an intensity and urgency leading up to this year’s event that I have never seen before. To understand why, my team analyzed thousands of speaker submissions from the world’s cyber defenders. Three themes stood out: artificial intelligence, information manipulation, and burnout.
New AI technologies bring new risks
As AI’s footprint expands across business and society (nearly one in five speaker submissions focused on it this year), every industry is trying to figure out how to harness the power of AI-based systems. At the same time, security professionals are discovering new risks. One of these risks is that these systems could leak company and user data. Another concern is accuracy. Systems built on large language models (LLMs) are probabilistic: you can ask the same question multiple times and get slightly, or significantly, different answers each time. That may be fine for generating a short story, but what if your new AI-powered customer service chatbot occasionally gives customers wildly inaccurate or fictitious information?
When it comes to cybersecurity, we try to manage risks with compensating controls: technologies and processes that limit or mitigate them. The challenge is that many of these AI technologies are new, and the appropriate compensating controls for these emerging risks are only just being put in place. There is also uncertainty about how AI will be regulated. Several countries have recently issued AI guidelines or regulations, the most prominent examples being the European Union’s AI Act and the White House Executive Order on the safe, secure, and trustworthy development and use of artificial intelligence. A change in future regulation (a restriction prohibiting AI systems from reasoning about a customer’s emotional state, for example) could force companies to retire AI-based customer-support chatbots.
The crisis of information manipulation
Just a few years ago, creating a deepfake took both technical acumen and intent; now all you need is the intent. From a societal point of view, cybersecurity experts fear that the next US presidential election will bring a tidal wave of deepfakes designed to influence public opinion. From a business perspective, deepfakes have sharpened cybercriminals’ ability to commit fraud. In one recent example, a finance employee at a large multinational company in Hong Kong joined a video conference with a group of colleagues and, during that meeting, was asked to transfer $25 million of company funds as part of a transaction. Unfortunately, those trusted colleagues were actually deepfakes: synthetic representations of real employees controlled by a fraudster.
Problems with information manipulation go far beyond doctored video and audio. A recent and insidious example comes from the software world, where malicious actors managed to plant a backdoor in a widely used utility called XZ Utils. Had this software implant not been discovered by a software developer at Microsoft, tens of thousands of businesses could have been compromised.
Burnout is increasing once again
In addition to the challenges posed by AI and information manipulation, the cybersecurity community has weathered a wave of high-profile ransomware attacks, like the one that shut down MGM Resorts properties at the end of last year. We reviewed more than 10,000 speaker submissions from the past five years, and the topic of burnout has spiked twice. The first spike came in 2021, when COVID surged and cyber workers had to adapt quickly to secure a fully remote workforce. Burnout then returned to normal levels in 2022 and 2023, but rose again in 2024. It is not just the recent wave of attacks taking a toll on cybersecurity professionals; there is growing concern that chief information security officers (CISOs) may be held personally liable for company breaches. Two cases in particular have raised the specter of such liability, and there is new pressure on businesses to quickly report the details of a compromise.
The power of community
Go through your day and think about all the touchpoints you have with technology. Your car is a computer, your bank is an app on your phone: technology is everywhere, which means hackers are everywhere too. I’ve spent my entire career in cybersecurity, from writing some of the first books on software vulnerability detection, to teaching computer security at Columbia University, to serving as CTO at Symantec. What most people don’t appreciate about cybersecurity professionals is that we are part of a mission-driven community. Attackers often work in near isolation; cyber pros collaborate. The elite of the global cybersecurity community are set to gather at the RSA Conference, but it’s more than just a gathering. It is the summoning of a community.
Hugh Thompson, Ph.D., is the executive chairman of the RSA Conference.
The opinions expressed in commentary pieces on Fortune.com are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.