It is yet to be determined how ChatGPT and other generative artificial intelligence (AI) applications may transform the legal profession and the delivery of legal services. However, law schools and their faculty are deciding now whether to permit students to use these AI tools in their law school classes and how best to train students to utilize them to meet the demands of a profession in flux.

AI is already an integral part of the legal profession, whether through e-discovery, contract review, legal research, or client billing applications. While these traditional AI tools often use historical patterns to detect and categorize data sets, formulating insights and predicting future outcomes, generative AI learns patterns from data to create new content.

When responding to a prompt or question, generative AI tools such as ChatGPT (OpenAI’s chatbot which was launched in November 2022) can perform research and analysis, summarize documents, and generate new content (including outlines, essays, and contracts).

Given the relative newness of these generative AI applications, many law schools’ academic policies and honor codes do not specifically reference them. Rather, individual faculty have the discretion to determine whether students may use AI tools for assignments and exams in their respective classes.

Gaining a ‘functional understanding’ of AI

University of Illinois College of Law Professor Patrick J. Keenan, who recently led a faculty session on “Generative AI & Law Teaching,” said he is finalizing a fall 2023 class policy regarding generative AI use in his law school classroom.

He said he plans to “permit students to use it to help generate ideas, refine ideas, check grammar, and similar activities” but will prohibit them from “using it to draft an entire assignment or write entire paragraphs, sentences, or papers for class.”

Keenan said he will also “require students to include a short statement in any written assignment describing how they used generative AI in the assignment,” should they choose to utilize it.

Daniel W. Linna Jr., Director of Law and Technology Initiatives and a Senior Lecturer at Northwestern Pritzker School of Law and McCormick School of Engineering, said that, in his courses, he has “encouraged students to use generative AI tools with attribution and inclusion of the prompts used.”

In his “Law of AI and Robotics” courses, Linna said he required students to sign up for a ChatGPT account, experiment with ChatGPT, “post[] examples of ChatGPT working both well and poorly for certain tasks,” and discuss those examples.

Linna said his goal is to “help students develop a functional understanding of artificial intelligence tools, including ChatGPT,” because “[l]awyers need to have a functional understanding of AI so that they can understand what AI systems can and cannot do” and “ask good questions of those developing the AI systems.”

Challenges when law students use AI

Like Linna, Keenan believes it is important for law students to understand both the capabilities and limitations of these AI tools, as “students will benefit from learning what generative AI is good for and where it is not helpful and can even be dangerous.”

One potential danger Keenan identified is law students’ relative inability to differentiate correct answers from incorrect or incomplete answers due to their lack of legal experience.

“Generative AI chatbots (like ChatGPT and Bard) often give wrong, misleading, or grossly incomplete answers to legal questions,” Keenan explained. “When I get a wrong answer on an issue of criminal law, I recognize it as incorrect and tell the chatbot that it is wrong. But students who are just learning the material might be misled and make a critical error.”

Keenan is also concerned that overreliance on generative AI tools may diminish lawyers’ creativity, as the text or material generated is “based entirely on other people’s ideas and prior writings,” and may undermine lawyers’ ability to accurately assess clients’ legal issues.

“Legal issues don’t come to lawyers with labels attached,” Keenan said. “It is our job to figure out what kind of problem the client is describing. If you rely on a chatbot to do the hard work, your ability to diagnose problems will weaken and you won’t be an effective advocate for your client.”

AI may help ‘level the playing field’ in law school

Despite generative AI’s limitations, Linna and Keenan believe law students can reap benefits—both in the classroom and in their careers—from learning how to navigate and deploy these tools.

Keenan believes that generative AI can improve the law school experience and “help level the playing field” for first-generation law students and those students who may not have had the same educational preparatory experiences and opportunities as other students.

Keenan said that “some colleagues here at the University [of Illinois at Urbana-Champaign] are partnering with our excellent computer scientists to create chatbot teaching assistants who can help students navigate the course website and syllabus.”

According to Keenan, these chatbot assistants will “be able to answer basic questions that students often struggle with.” Keenan believes this support “can be a big help to students who are first-generation college or law school students and might be reluctant to approach the professor.”

Linna predicts that generative AI will be particularly useful in equipping law students to contribute meaningfully during the first few years of their legal careers.

According to Linna, “legal educators have a tremendous opportunity to demonstrate to the legal industry how AI tools can be designed, developed, and deployed to help the students and attorneys using them to learn, create higher quality work product more efficiently, and improve the delivery of legal services.”

Should ‘deskilling’ be a concern?

Responding to law firms’ concerns that “generative AI tools will result in ‘deskilling,’ such that junior attorneys won’t learn the skills that they need to be effective,” Linna said, “we can design AI systems so that junior attorneys are empowered and trained better than ever.”

Legal educators can accomplish this, Linna said, by creating assessments “that cannot be completed by simply submitting a prompt to ChatGPT” but rather “require students to first use generative AI and then use their skills to iterate and improve what a system provides to them.”

Linna said such preparation is essential in law school because “AI is transforming law, and AI in the future will replace many of the tasks that lawyers perform today.”

AI and ‘democratizing access to justice’

Finally, because generative AI will enable many legal tasks to be performed more efficiently and cost-effectively, Keenan and Linna are optimistic that these tools can deliver societal benefits, democratizing access to justice and expanding the delivery of legal services.

“There is a huge deficit in civil legal services for vulnerable communities,” Keenan explained. “I think that generative AI tools, if used properly and with close guidance by lawyers, could be part of the solution to this. The possibilities are incredible and there is great potential to help the most vulnerable people find solutions to their legal problems.”

“Law students and lawyers should be excited,” Linna said, “about how they can use generative AI to carry out the mission of law and achieve the vision of everyone having access to legal services and justice.”


The post Where Two Illinois Law School Professors Stand on AI in Their Classrooms appeared first on 2Civility.