Generative AI and the Duty of Competence Conundrum

With its promise to generate written work product in a fraction of the time it takes for a seasoned attorney to do so, generative artificial intelligence (GAI) is becoming a regular part of legal work at large law firms. Its popularity has grown so much, in fact, that in 2024, the New York State Bar Association (NYSBA) and American Bar Association (ABA) each published reports providing ethical guidelines for its use.
Though the reports are enormously helpful, there remains an open question about the extent to which our Duty of Competence requires an attorney to review the work product of GAI. In this blog post, we discuss two possible interpretations of the rules and argue that they require a more comprehensive reading of our Duty of Competence than has heretofore been recommended. This interpretation has implications for claims about the productivity impact of GAI on legal work.
A Primer on GAI
GAI, in its present form, is essentially a very advanced form of text prediction: it continuously predicts the next text based on an initial user input. In a previous blog post, we covered how large language models (LLMs) such as ChatGPT generate text. The prediction is driven by statistical "weighting" information encoded in the LLM during its prior training.
It is important to distinguish GAI's text-prediction procedure from the process of logical reasoning. GAI outputs text based on the statistical relationship between the existing text and the next output. So, for instance, if the existing text is "two plus two equals," the GAI is likely to predict that the next text will be "four." It generates the text "four" because the LLM and its prior training predict that continuation (likely because the training data contains numerous examples where the next text was "four"). It does not, however, generate the text "four" because the LLM in any way understands the concept of "two," the process of addition, or the concept of "four" and deduces the answer from the meaning of these concepts; only the statistical relationships between these words are in play in its processing.
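To make the distinction concrete, the short sketch below shows, in toy form, what purely statistical next-text prediction looks like. The lookup table, the probabilities, and the function name predict_next_token are invented for illustration; a real LLM computes its weights over billions of parameters rather than a hand-written table, but the principle is the same: the output is the statistically likeliest continuation, not the product of reasoning about what the words mean.

```python
# Toy illustration (not an actual LLM): next-text prediction as a lookup of
# "learned" statistical weights. The context string and probabilities below
# are invented for illustration only.

toy_weights = {
    "two plus two equals": {"four": 0.92, "twenty-two": 0.05, "five": 0.03},
}

def predict_next_token(context: str) -> str:
    """Return the statistically likeliest continuation of the context."""
    distribution = toy_weights.get(context, {})
    if not distribution:
        return "<unknown>"
    # Greedy choice: pick the continuation with the highest learned weight.
    # Nothing here "understands" addition; it is purely a statistical lookup.
    return max(distribution, key=distribution.get)

print(predict_next_token("two plus two equals"))  # prints "four"
```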
The Duty of Competence
In general, Rule 1.1 (Duty of Competence) states that "a lawyer shall provide competent representation to a client. Competent representation requires the legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation." In explaining the Duty of Competence with respect to GAI, the NYSBA primarily focuses on the duty that an attorney has to gain competence with GAI. The NYSBA notes that under Comment 8 to RPC Rule 1.1, the Duty of Competence involves "keeping abreast of 'the benefits and risks associated with technology the lawyer uses to provide services to client.'" See Report and Recommendations of the New York State Bar Association Task Force on Artificial Intelligence, pp. 30, 55 (hereinafter, NYSBA Report). In its chart of AI & Generative AI Guidelines, the NYSBA Report focuses solely on understanding the "benefits, risks and ethical implications associated with" technology. NYSBA Report p. 58. Given this, the NYSBA suggests that attorneys generally need to obtain "education, training and proficiency" with respect to GAI in order to comply with Rule 1.1.
The NYSBA additionally notes that lawyers are required to resist "techno-solutionism" – the belief that every social, political and access problem has a solution grounded in the development of new technology. Highlighting the Avianca case as a prime example (a case in which an attorney submitted materials to the court without cite-checking the cases cited in them), the NYSBA suggests that "attorneys cannot rely on technology without verification" and notes that an attorney is further subject to Rule 5.3, which imposes a supervisory obligation over nonlawyers. Here, GAI is treated as the relevant nonlawyer.
The ABA takes a similar tack. It first emphasizes the duty that an attorney has to gain competence with GAI. ABA Formal Opinion 512 p. 3 ("…lawyers should either acquire a reasonable understanding of the benefits and risks of the GAI tools that they employ in their practices or draw on the expertise of others who can provide guidance about the relevant GAI tool's capabilities and limitations") (hereinafter, ABA Report). The ABA additionally notes that "lawyers must recognize inherent risks" with GAI because GAI tools "may combine otherwise accurate information in unexpected ways to yield false or inaccurate results … [and] hallucinations, providing ostensibly plausible responses that have no basis in fact or reality." Id. Given this, the ABA observes that "a lawyer's reliance on … a GAI tool's output – without an appropriate degree of independent verification or review of its output – could violate the duty to provide competent representation as required by Model Rule 1.1." Id.
The Duty of Competence in a Legal Research Setting
Setting aside the NYSBA's and ABA's focus on the Duty of Competence as specifically a duty to become competent in GAI, there are questions regarding how an attorney must implement his or her "supervisory" duties over GAI under either Rule 1.1 or Rule 5.3. Especially as GAI is being touted both as a tool for decreasing an attorney's workload and as a tool for legal research, there is a need for guidance on how an attorney balances productivity against competence in a legal work setting.
The ABA Report does look into this issue, albeit briefly, likening the balancing act to a sliding scale and illustrating its application with the use of GAI to summarize a voluminous set of contracts. ABA Report p. 4. Here, the ABA Report suggests that if the attorney checks the GAI's summaries of a smaller subset of contracts and finds them to be correct, then the attorney may be able to trust the GAI's summaries for the larger set. Id. This example, however, is less illuminating when examining the attorney's duties with respect to legal research and drafting – the type of work now being touted by Westlaw and Lexis and highlighted specifically in the Avianca case. This might lead one to wonder: What is an "appropriate degree of independent verification or review" of the outputs of GAI with respect to legal research and drafting? Neither the NYSBA nor the ABA clearly explains what this amounts to.
Below are two viewpoints on the requisite independent verification or review that go beyond learning to use GAI.
Duty to Verify vs. Duty of Expertise
We present two proposals for the duty that an attorney has with respect to legal research and drafting, which we call the Duty to Verify and the Duty of Expertise. The Duty to Verify refers to a general duty to verify or confirm that case citations in GAI work product are correct – that they both refer to real cases and support the propositions for which they are cited. The Duty to Verify is hinted at in both the ABA and NYSBA Reports. See, e.g., NYSBA Report p. 29 (noting that Avianca demonstrates that "attorneys cannot rely on technology without verification"); id. at p. 37 (noting "attorneys must identify, acknowledge and correct mistakes made or represented to the court" for work submitted to the court and third parties); id. at p. 51 (noting judge-instituted policies that language drafted by generative AI "be checked for accuracy … by a human being," or be certified "that each and every citation to the law or the record in the paper, has been verified as accurate"); see also ABA Report p. 3 (noting "a lawyer's reliance on … a GAI tool's output – without an appropriate degree of independent verification or review of its output – could violate the duty to provide competent representation"); id. at p. 4 (providing an example of a lawyer "test[ing] the accuracy of the tool on a smaller subset of documents … and finding the summaries accurate").
The Duty of Expertise refers to a general duty to be knowledgeable in the area of law to be researched. It goes beyond merely checking work for accuracy (such as cite checking) to ensuring that the work is evaluated from the perspective of an expert. For example, though the Duty to Verify might demand that an attorney check the cases that have been cited in a document for accuracy, the Duty of Expertise might demand that an attorney substantively understand the statutory law and case law in the relevant area of law covered by the GAI work product – including opposing case law and primary case law underlying the legal arguments made in the GAI work product. The Duty of Expertise is also hinted at in both the ABA and NYSBA Reports. See, e.g., NYSBA Report p. 60 (noting in its AI & Generative AI Guidelines that an attorney "must ensure that the work produced by the Tools is … complete…"); ABA Report p. 4 (noting that GAI "cannot replace the judgment and experience necessary for lawyers to competently advise clients about their legal matters or to craft the legal documents or arguments required to carry out representations."); id. ("lawyers may not leave it to GAI tools alone to … perform … functions that require a lawyer's personal judgment or participation. Competent representation presupposes that lawyers will exercise the requisite level of skill and judgment regarding all legal work.").
Indications That the Duty to Verify Is Not Enough
Though neither the ABA nor the NYSBA report provides a clear statement that there is a Duty of Expertise, there are indications that, at the very least, the Duty to Verify is insufficient to discharge one's ethical obligations.
Something beyond the Duty to Verify is suggested by Rule 3.3 ("Candor to the Court"), which states that "(a) a lawyer shall not knowingly (i) make a false statement of fact or law to a tribunal or fail to correct a false statement of material fact or law; [or] (ii) fail to disclose to the tribunal legal authority in the controlling jurisdiction known to the lawyer to be directly adverse to the position of the client and not disclosed by opposing counsel." Though verification can result in correcting a false statement of law or a nonexistent citation, it cannot determine whether adverse legal authority has been omitted from work submitted to the court. Of course, though an attorney is required under Rule 3.3 to disclose only directly adverse legal authority that is known to him or her, even this minimal requirement goes beyond the Duty to Verify.
The duty of competence is also explored by the ABA Ethics Committee in its Opinion No. 1442, which outlines the duty of competence in the defense context as involving the "reasonably competent assistance of an attorney acting as his diligent conscientious advocate," which in turn requires performing "adequate legal research" sufficient to determine what defenses can be raised by the defendant. Merely verifying the case law cited in AI-generated legal work product, however, isn't enough to determine whether available defenses have been adequately explored, indicating that the Duty to Verify is not sufficient.
Competence is also explored in the NYSBA Ethics Committee Formal Opinion 2018-3 ("Ethical Implications of Plagiarism in Court Filings"), which discusses the issue of extensive copying – an act with ethical implications similar to those of using GAI. According to the committee, if a lawyer copies extensively and inaptly from a treatise or prior judicial opinion, the brief may give the court little guidance on how the law applies to the facts of the case and may leave the adversary's arguments unrebutted. The resulting brief may advocate ineffectively for the client and may therefore violate Rule 1.1(a). See Shatz & McGrath, "Beg, Borrow, Steal" ("[C]ourts are especially concerned not so much with the mere copying of someone else's work, but rather the act of copying in lieu of customizing a brief to the issues and circumstances of the case.") and Joy & McMunigal, "The Problems of Plagiarism as an Ethics Offense", Criminal Justice, Volume 26, Number 2, Summer 2011 (urging courts to focus on competence and diligence in reviewing whether particular instances of copying without attribution rise to the level of ethics offenses) (both articles cited in the NYSBA opinion). Of course, in the case of treatises and prior judicial opinions, the veracity of the citations contained therein is not the issue. The committee argues that the duties of the attorney go beyond this – the attorney must tailor the work to the facts and circumstances of the case. This suggests that in the case of GAI, too, there are duties that go beyond the Duty to Verify.
As noted earlier, the ABA explicitly recommends a sliding-scale approach for determining the level of oversight required for the use of GAI. Current GAI tools for legal research, however, have been shown to hallucinate in a significant percentage of instances. See, e.g., Magesh, Surani, Dahl, Suzgun, Manning & Ho, "Hallucination-Free? Assessing the Reliability of Leading AI Legal Research Tools" (examining retrieval-augmented GAI products by Westlaw and Lexis and finding hallucinations in at least one out of six benchmarking queries). Though such studies don't clearly indicate the extent of one's duties with respect to legal research, given the frequency of hallucinations and the different types of hallucinations that GAI may exhibit, they do indicate that something more than the Duty to Verify is required.
Indications That the Duty of Expertise Is Required
Beyond this, there are indications in Rule 1.1 that something like the Duty of Expertise is required. Consider that the Comments to Rule 1.1 ("Competence") note generally that a lawyer may either have or obtain "the requisite knowledge and skill in a particular matter" or substitute for that knowledge or skill by "associat[ing] or consult[ing] with a lawyer of established competence in the field in question." (Comments 1, 2).
However, on the assumption that GAI is, at best, to be treated as a nonlawyer – an assumption that the NYSBA Report explicitly adopts – an attorney using GAI to draft legal documents cannot treat GAI as a substitute for his or her own knowledge or skill. He or she must either already have or first obtain that requisite knowledge (barring, that is, associating or consulting with another lawyer of established competence). This suggests that any substantive work product from GAI must be evaluated from the perspective of an attorney who has already gained the relevant expertise.
This conclusion, if correct, would seem to call into question the extent to which GAI serves as a productivity-enhancing tool, at least as it pertains to legal research work product where an attorney isn't already an expert in the relevant law. If, after generating a legal research memorandum using GAI, an attorney must not only check all citations for hallucinations but also independently gain expertise in the relevant subject matter and then evaluate the work product from that perspective of expertise, how much time is GAI saving in the end? See also "AI on Trial: Legal Models Hallucinate in 1 out of 6 (or More) Benchmarking Queries" (observing that the task of verifying citations in retrieval-augmented GAI may cut into the purported efficiency gains of GAI, given the frequency of hallucinations).
At the very least, studies that purport to examine the productivity benefits of GAI should account for the time an attorney must spend complying with legal ethics requirements, including the Duty of Competence, when assessing those benefits.