this post was submitted on 20 Oct 2024
68 points (95.9% liked)
Futurology
you are viewing a single comment's thread
And I believe them 100%.
The legal profession revolves around logic. Legal arguments are built on the rules of logic, so a fine-tuned model would absolutely demolish any human opponent here.
Take chess, for example: there's a predefined set of rules, and chess was one of the first games where ML models beat humans.
Sure, the rules of law are orders of magnitude more complex than the rules of chess, but the underlying principle is the same.
The legal system does not revolve around logic, and even if it did: LLMs can't reason, so they'd be useless anyway.
Law = rules. Action A is legal; action B isn't. Doing X + Y + Z constitutes action A, and so on. Legal arguments have to abide by the rules of logic. Law is one of the most logic-heavy fields out there.
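The "X + Y + Z constitutes action A" framing above can be sketched as a toy rule system. This is a minimal illustration of the commenter's point, not a model of any real legal code; all the predicate and action names are made up:

```python
# Toy sketch of "law = rules": an action is made out only when all of
# its constituent elements are present, and each action has a legality flag.
# Names (X, Y, Z, A, B) are illustrative placeholders, not real legal categories.

def constitutes_action_a(x: bool, y: bool, z: bool) -> bool:
    """Action A obtains only when elements X, Y, and Z are all present."""
    return x and y and z

# Which actions are legal under this toy system: A isn't, B is.
LEGALITY = {"A": False, "B": True}

def is_legal(action: str) -> bool:
    return LEGALITY[action]

# A party who did X and Y but not Z has not committed action A:
print(constitutes_action_a(True, True, False))  # False
# All three elements together do constitute A, which is illegal here:
print(constitutes_action_a(True, True, True) and not is_legal("A"))  # True
```

Of course, the counter-argument in the thread is precisely that real legal reasoning doesn't reduce to boolean rules like this.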
As for LLMs not being able to reason: that's very debatable. Whether they "reason" depends on your definition of reasoning, but debating definitions here is useless, because the end result speaks for itself.
If LLMs can pass certification exams for lawyers, it means one of two things:
Ok. What test would an LLM need to pass to convince you that it is capable of being a lawyer?
The short answer is being admitted to the bar; we already trust it to certify humans.
If for some reason I were the arbiter, I would ask for a convincing record of doing actual legal work, vetted by existing lawyers. The legal profession already has a well-defined model of how non-lawyers can contribute to the work, so there is no need for a quantum leap straight to being a lawyer.
Draw a pentagon