Generative AI is advancing at an exponential rate. Since the release of GPT-4 less than two years ago, there has been an explosion of GenAI tools designed for legal professionals. The rapid proliferation of software incorporating this technology has been accompanied by growing concerns about its ethical and secure implementation.
Ethics boards across the country have taken steps to offer guidance to lawyers looking to adopt GenAI in their firms. Most recently, the American Bar Association weighed in with Formal Opinion 512 in late July.
In its opinion, the ABA Standing Committee on Ethics and Professional Responsibility recognized the significant productivity gains that GenAI can offer legal professionals, explaining that “GenAI tools offer lawyers the opportunity to increase the efficiency and quality of their legal services to clients… Lawyers must, however, recognize the inherent risks.”
Importantly, the committee also cautioned that when using these tools, lawyers “should not abdicate their responsibilities by relying solely on a GAI tool to perform tasks that require the exercise of professional judgment.” In other words, while GenAI can significantly increase efficiency, lawyers should not rely on it to the detriment of their personal judgment.
The committee then addressed the key ethical issues that GenAI tools pose in lawyers' work processes. It first focused on technological competence. According to the committee, lawyers must keep abreast of developments in GenAI technologies and maintain a reasonable understanding of the benefits, risks, and limitations of this technology.
Confidentiality obligations were also discussed, with the committee emphasizing the need to ensure that GenAI does not inadvertently expose client data and that systems are not permitted to train on confidential data. Notably, the committee stated that lawyers must obtain informed consent from the client before using these tools in ways that could affect client confidentiality, especially when using consumer-facing tools that train on captured data.
The committee also provided guidance on oversight requirements, stating that lawyers in leadership positions must ensure compliance with GenAI policies established by their firm. The duty of oversight includes implementing policies, training staff, and overseeing the use of AI to prevent ethical violations.
The committee stressed the importance of reviewing all GenAI outputs to ensure their accuracy: these duties also require that lawyers, before submitting materials to a court, review those outputs, including analyses and citations to authority, and correct errors, including inaccuracies of law and fact, failure to include controlling legal authority, and misleading arguments.
Finally, the committee provided an overview of the ethics of legal fees charged when using GenAI to handle client matters. The committee explained that lawyers can charge fees that include time spent reviewing AI-generated results, but cannot charge clients for time spent learning how to use GenAI software.
Importantly, lawyers are prohibited from billing clients for the hours they would have spent on a matter absent the efficiencies gained from GenAI tools. In other words, clients may be billed only for work actually performed, not for time saved by GenAI.
Each new ethics opinion, such as ABA Formal Opinion 512, offers much-needed guidance that enables lawyers to thoughtfully and responsibly integrate AI tools into their firms. By addressing emerging concerns and providing clear standards, these opinions reduce uncertainty and pave the way for forward-thinking lawyers to confidently embrace GenAI.
While the ABA’s opinion is advisory only, it represents a positive trend of responsive guidance that provides the legal profession with the information needed to ethically innovate and embrace emerging technologies in the ever-evolving age of AI.
Nicole Black is a lawyer, author, journalist and the lead legal information strategist at MyCase, LawPay, CASEpeer and Docketwise, the AffiniPay companies. You can contact her at: (protected email).