Ghazi Ben Amor, Vice President of Corporate Development at Zama, offers advice to policymakers and regulators on adaptable cybersecurity and how to develop nuanced and effective privacy policies that can evolve alongside technological advances
With the cost of cybercrime expected to reach $13.82 trillion by 2028, and likely to climb further as cybercriminals gain access to increasingly sophisticated AI, public trust in technology is naturally declining.
In response, the UK government outlined new regulatory measures to improve the cybersecurity of AI models and software at its flagship CYBERUK cybersecurity event in May.
The measures comprise two comprehensive codes of practice, created with industry experts, under which developers are required to design and build their AI products to be resilient to unauthorized access, tampering, and damage. Doing so will make these products more secure and reliable, helping to build and maintain trust among users and stakeholders across the industries that rely on AI technology.
While these new measures are, in my view, certainly a step in the right direction, I have concerns about the future adaptability and effectiveness of regulatory frameworks; concerns that are also shared by the developer community.
A recent survey of developers found that 72% believe privacy regulations are not designed for the future, 56% are concerned that dynamic regulatory structures could pose new threats, and 30% believe regulators lack the skills to fully understand the technology they are supposed to oversee.
One of my main concerns is the security risk associated with training AI systems that require large datasets, often containing sensitive personal information. Inconsistent or constantly changing regulations could create vulnerabilities in this area, increasing the risk of data breaches or misuse. And as regulations evolve, ensuring the security and privacy of personal information used in AI training will only become more difficult – a situation that could prove highly problematic for both individuals and organizations.
What should be considered when creating regulatory frameworks?
So what can we do to future-proof regulations and proactively protect our digital infrastructure to avoid having to constantly play catch-up with cybercriminals?
Improving knowledge of PETs
Nearly a third of developers believe that regulators are not sufficiently informed and do not have the skills to understand the technology they are responsible for regulating. So, improving knowledge of privacy-enhancing technologies (PETs) is a good place to start. Understanding the strengths and limitations of each PET allows for a flexible, tailored approach rather than a one-size-fits-all policy. Here are some of the main ones, with a short illustrative code sketch for each after the list; there are many more available or in development, and the key is to keep pace as they evolve:
- Authentication technologies:
- Multi-factor authentication (MFA) adds an extra layer of security and is widely used in online banking and enterprise software. Biometric authentication, which uses unique physical characteristics such as fingerprints or facial recognition, is another advanced method. Looking ahead, standards such as FIDO (Fast Identity Online) for passwordless authentication and OpenID Connect for federated identity promise enhanced security and simpler sign-in across platforms.
- End-to-end encryption (E2EE):
- This technology ensures that data is encrypted from sender to recipient, preventing unauthorized access. However, implementing E2EE can be complex and resource-intensive, requiring significant computing power and sophisticated key management. Additionally, because E2EE prevents service providers from accessing the data, it can complicate data recovery and legal compliance.
- Fully Homomorphic Encryption (FHE):
- FHE enables data processing without decryption, effectively combining AI and data security. The technology is evolving rapidly and many use cases are now possible. For example, financial institutions can now use FHE to train fraud detection models without exposing personal data, and healthcare providers can perform predictive diagnostics without compromising patient privacy.
- Multiparty Computation (MPC):
- Complementing FHE, MPC allows a quorum of entities to decrypt data collaboratively, ensuring only authorized access. Each entity holds only part of the decryption key, preventing unilateral access to the data. Clear data remains accessible only to the end user.
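To ground the authentication point, here is a minimal sketch of one common second factor, a time-based one-time password (TOTP), using the open-source pyotp library; the account name and issuer shown are purely illustrative.

```python
# Minimal TOTP second-factor sketch using pyotp (pip install pyotp).
# Illustrative only: real MFA pairs this with a password and, increasingly,
# FIDO/WebAuthn hardware-backed credentials.
import pyotp

# A per-user secret, provisioned once (usually via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# URI the user scans into an authenticator app (names are hypothetical).
print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleBank"))

# At login, the server checks the six-digit code the user submits.
submitted_code = totp.now()            # generated here; normally typed by the user
print("Code accepted:", totp.verify(submitted_code))
```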
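For end-to-end encryption, the sketch below uses the open-source PyNaCl library (libsodium bindings) to show the core property: only the two endpoints hold private keys, so a service relaying the ciphertext cannot read it. Key verification, rotation, and storage, which are the hard parts in practice, are omitted.

```python
# Minimal E2EE sketch using PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; only public keys are exchanged.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"patient record 42")

# Only the recipient, holding the matching private key, can decrypt;
# any intermediary sees only ciphertext.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
print(plaintext)  # b'patient record 42'
```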
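To illustrate FHE, here is a minimal sketch using Zama's open-source concrete-python library; the threshold rule stands in for something like a fraud-scoring check and is purely illustrative.

```python
# Minimal FHE sketch using concrete-python (pip install concrete-python).
from concrete import fhe

@fhe.compiler({"amount": "encrypted"})
def risk_flag(amount):
    # Toy stand-in for a fraud rule: flag amounts above a threshold.
    return amount > 100

# Compile the circuit from representative cleartext examples of the input.
circuit = risk_flag.compile(range(0, 200))

# The server evaluates the rule on encrypted data and never sees the amount.
print(circuit.encrypt_run_decrypt(150))  # 1: flagged, computed under encryption
```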
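Finally, the MPC idea of splitting a decryption key so that no single party can act alone can be shown with a toy XOR-based secret-sharing sketch in plain Python. This is not a production MPC protocol (real deployments use threshold schemes such as Shamir's secret sharing); it only demonstrates the quorum principle.

```python
# Toy secret-sharing sketch: all shares are required to rebuild the key.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parties: int) -> list[bytes]:
    # n-1 random shares, plus one chosen so that XOR-ing all shares yields the key.
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def combine(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)

key = secrets.token_bytes(16)      # e.g. a data-encryption key
shares = split_key(key, 3)         # one share per custodian

print(combine(shares) == key)      # True: the full quorum recovers the key
print(combine(shares[:2]) == key)  # False: a minority of parties learns nothing useful
```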
Promoting continuous learning
In addition to a thorough understanding of PETs, regulators should invest in broader, ongoing training and skills development as a priority to stay up to date with new developments and threats. This is the only way to keep pace with a world that is changing so rapidly. In areas such as AI and cybersecurity, rapid innovation can quickly make existing knowledge obsolete. That is why I believe that, where possible, staff should attend industry events and conferences to keep up with the latest developments and trends. These events also allow staff to network with experts, fostering connections that can lead to valuable insights and collaborations.
Collaborate with technology creators
I don’t think policymakers should be solely responsible for developing nuanced and effective privacy policies that keep pace with technological advances. By working directly with the developers and creators of new technologies, policymakers can ensure that products are designed with existing frameworks in mind, rather than waiting for new regulations to adapt to each technological advance. It is equally important that policies are designed to be conducive to innovation, and partnering with technology companies is also a great way to share knowledge. Regulators willing to invite representatives from technology companies to host seminars or internal demos will benefit immensely from the expertise and practical advice of those at the forefront of the field.
Adopt a dynamic and adaptive regulatory framework
Technology doesn’t stand still, which means neither can policy. By designing regulations that are flexible and able to evolve with technological advances, we can better respond to the constant emergence of new cybersecurity and data privacy challenges. In practice, this can mean regular policy reviews and updates through dedicated committees; working with stakeholders from the technology sector, academia, cybersecurity, and end users to ensure regulations are comprehensive and adaptable; and feedback mechanisms that allow organizations and individuals to report their challenges and successes.
As increasingly complex systems such as AI, IoT, and advanced data analytics become an integral part of everyday life, now is the time to develop more robust and forward-thinking privacy policies that are not only comprehensive for today, but also resilient to evolving and future cyber threats.