As generative AI (GAI) technology proliferates and legal software companies focus on integrating this new functionality into their platforms, ethics boards across the country are recognizing and responding to the challenges lawyers face when implementing it. GAI vendors promise streamlined workflows and increased efficiency, but with these benefits come concerns about ensuring the accuracy of results, providing adequate oversight, maintaining confidentiality, and complying with billing rules.
Because of the many obstacles encountered when using GAI in its current state, several jurisdictions have issued ethics guidance in recent months, which I have discussed in previous columns. The State Bar of California was the first to issue guidelines, in November 2023. It was followed by Florida, which issued Ethics Opinion 24-1 on January 19, and by the New Jersey Supreme Court Committee on Artificial Intelligence and the Courts, whose Preliminary Guidelines were issued on January 24.
On January 19, the North Carolina State Bar Council entered the fray with proposed guidance, 2023 Formal Ethics Opinion 4 (online: https://www.ncbar.gov/for-lawyers/ethics/proposed-opinions/), which is open for comment until March 30.
This draft opinion provides an in-depth analysis of the many issues lawyers encounter when using GAI tools, along with clear, common-sense advice on the ins and outs of adopting GAI in an ethically compliant manner.
Many of the Council’s conclusions mirror those reached by other ethics committees. For example, the Council concluded that lawyers may use GAI, but that the duty of competence means they are responsible for “reviewing, evaluating, and ultimately relying on the work produced by someone – or something – other than the lawyer,” which includes GAI output. Additionally, the duty of technological competence requires that lawyers become familiar with GAI so that they can “exercise independent and responsible professional judgment in determining how (or whether)” use of GAI is appropriate.
The Council believes that lawyers should carefully vet GAI providers to ensure that confidential client information is protected, just as they are required to do when “providing confidential information to third-party software (practice management, cloud storage, etc.).” The Council also warned that when lawyers use consumer GAI software, they should avoid “entering client-specific information into publicly available AI resources” to prevent confidential data from being used to train the AI system.
Importantly, the Committee clarified that when attorneys use GAI to draft pleadings and adopt the output as their own, signing the pleading certifies their “good faith belief as to the factual and legal assertions contained therein,” a requirement that necessarily applies to all pleadings filed in court, regardless of their original source.
Client consent was also addressed. The Committee determined that when GAI is used for ordinary tasks such as “conducting legal research or managing generic cases/practices,” client consent is not required, whereas advance consent would be required for any substantive task that “is akin to outsourcing legal work to a non-lawyer.”
Finally, the Committee addressed billing issues, clarifying that for hourly billing, attorneys may only bill clients for time actually spent using GAI and cannot bill for time saved through use of the tool. However, the Committee suggested that, given the potential reduction in billable hours that can result from the use of GAI, lawyers might consider moving to flat-fee billing “for document drafting – even when they use AI to assist with writing … provided that the flat fee charged is not clearly excessive and the client consents to the billing structure.”
As for passing along the cost of using an AI tool, this is permitted only when the fees charged reflect “the actual expenses incurred when using the AI in the course of providing legal services to a client, provided that the expenses charged are accurate, not clearly excessive, and the client consents to the charges, preferably in writing.” In contrast, charging clients an overhead or administrative fee to cover the cost of AI tools built into software routinely used by the firm is not acceptable.
North Carolina’s addition to the growing body of AI ethics guidance for lawyers highlights the careful balance required to realize the benefits of AI while adhering to ethical standards. The opinion’s conclusions align with those of other jurisdictions and emphasize that the fundamental principles of legal ethics remain unchanged even as technology advances at a rapid pace. As AI continues to be integrated into various aspects of legal work, guidance like North Carolina’s becomes invaluable to attorneys striving to maintain the highest standards of professionalism in the digital age.
Nicole Black is a Rochester, New York attorney, author, journalist, and head of SMB and external training at MyCase and LawPay, AffiniPay companies. She is the nationally recognized author of “Cloud Computing for Lawyers” (2012) and co-author of “Social Media for Lawyers: The Next Frontier” (2010), both published by the American Bar Association. She is also co-author of “Criminal Law in New York,” a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences on the intersection of law and emerging technologies. She is an ABA Legal Rebel and featured on the Fastcase 50 and ABA LTRC Women in Legal Tech lists. She can be contacted at (email protected).