The Criminal Division of the United States Department of Justice (DOJ) recently updated its Evaluation of Corporate Compliance Programs (ECCP) policy document, which prosecutors rely on to evaluate the effectiveness of corporate compliance programs when determining whether to file charges, impose monetary penalties, or require certain ongoing compliance obligations, such as under a corporate integrity agreement. Notably, the September 23, 2024 revisions to the ECCP require prosecutors to evaluate whether a company’s compliance program includes safeguards to ensure that its deployment of new technologies, including artificial intelligence (AI), will not result in “deliberate or reckless misuse” that violates criminal laws or the company’s code of conduct.
Background
The Justice Manual (formerly known as the United States Attorneys’ Manual) describes the factors that prosecutors should consider when exercising prosecutorial discretion with respect to a business organization under criminal investigation. A key factor is the effectiveness of the company’s compliance program. The ECCP, for its part, defines the criteria that prosecutors use to evaluate whether a program is effective. The ECCP is important both as a reference document for designing and implementing corporate compliance programs and as a guidance document for understanding how the DOJ makes charging decisions. At a minimum, companies in the healthcare sector should ensure that their compliance programs meet the criteria set out in the ECCP.
Emerging Technology/AI Risk Assessment
In March 2024, Deputy Attorney General Lisa Monaco directed the DOJ Criminal Division to incorporate the assessment of risks presented by new technologies, including AI, into the ECCP. As a result, the ECCP now includes criteria that prosecutors use to assess whether a company has adequate controls in place to mitigate the risks associated with its use of AI. Federal prosecutors are directed to ask:
- Does the company have a process for identifying and managing emerging internal and external risks that could potentially impact the company’s ability to comply with the law, including risks related to the use of new technologies?
- How does the company assess the potential impact of new technologies such as AI on its ability to comply with criminal laws?
- Is risk management related to the use of AI and other new technologies integrated into broader enterprise risk management (ERM) strategies?
- What is the company’s approach to governance regarding the use of new technologies such as AI in its business operations and compliance program?
- How does the company limit potential negative or unintended consequences resulting from the use of technologies, both in its business operations and in its compliance program?
- How does the company mitigate the potential for deliberate or reckless misuse of technologies, including by company insiders?
- To the extent the company uses AI and similar technologies in its business or as part of its compliance program, are controls in place to monitor and ensure the technology’s reliability and its use in accordance with applicable law and the company’s code of conduct?
- Are there controls to ensure that the technology is used only for its intended purposes?
- What baseline of human decision-making is used to assess AI? How is accountability for the use of AI monitored and enforced?
- How does the company train its employees to use emerging technologies like AI?1
The above criteria collectively demonstrate that a company that uses AI at the core of its business operations must have specific policies and procedures in its compliance program to ensure that its AI software is deployed appropriately and continues to function properly. These would include policies and procedures for auditing AI performance, training employees on the appropriate use of AI, and detecting misuse of AI or any other new technology.
Resources and Access to Data
The September 2024 revisions to the ECCP also include new criteria relating to a company’s use of data analytics in its compliance program. For example, the updated ECCP requires prosecutors to evaluate whether compliance personnel have access to the relevant data systems from which they can adequately monitor the effectiveness of the company’s compliance program. Additional questions from the updated ECCP include:
- Is the company appropriately leveraging data analytics tools to drive efficiencies in compliance operations and measure the effectiveness of compliance program components?
- How does the company manage the quality of its data sources?
- How does the company measure the accuracy, precision, or recall of the data analysis models it uses?2
The new criteria reflect an expectation on the part of the DOJ that companies use data tools as part of their compliance efforts and that they provide their compliance staff with access to all data necessary to enable staff to implement the compliance program.
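For reference, the accuracy, precision, and recall mentioned in the questions above are standard evaluation metrics for classification models. The following is a minimal illustrative sketch (not drawn from the ECCP, and using made-up example data) of how a compliance team might compute them for a hypothetical model that flags transactions as risky:

```python
# Illustrative sketch only. A hypothetical compliance-alert model labels
# transactions as risky (1) or benign (0); we compare its predictions
# against audited ground-truth labels.

def confusion_counts(actual, predicted):
    """Count true positives, false positives, false negatives, true negatives."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    return tp, fp, fn, tn

def metrics(actual, predicted):
    tp, fp, fn, tn = confusion_counts(actual, predicted)
    accuracy = (tp + tn) / len(actual)               # share of all calls that were correct
    precision = tp / (tp + fp) if tp + fp else 0.0   # share of flagged items that were truly risky
    recall = tp / (tp + fn) if tp + fn else 0.0      # share of truly risky items that were flagged
    return accuracy, precision, recall

# Hypothetical audit sample: 1 = risky, 0 = benign
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
acc, prec, rec = metrics(actual, predicted)
print(f"accuracy={acc:.2f} precision={prec:.2f} recall={rec:.2f}")
```

A program that can produce figures like these for its monitoring models is better positioned to answer the DOJ’s question about how the company measures the performance of its data analysis tools.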
Takeaways
As Monaco has said, “fraud using AI remains fraud.” The revised ECCP confirms that the DOJ takes seriously the potential risks associated with the misuse of AI and will review companies’ compliance programs to ensure those risks are addressed and mitigated. Additionally, the revised ECCP clarifies that the DOJ expects 1) compliance personnel to have access to the relevant data sources necessary to carry out their responsibilities under the compliance program and 2) companies to dedicate sufficient resources, including data analytics tools, to their compliance efforts.
Notes:
1 ECCP, pp. 3-4.
2 ECCP, p. 13.