Hacker News

A model that's trained on legal decisions can still be used to explore these questions, though. The model may end up being uncertain about which way a case will go, or, even more strikingly, it may be confident about the outcome of a case that is then decided differently, and you can try to figure out what's going on with such cases.


But what value does that have? The difference between an armchair lawyer and an actual lawyer is knowing when something is legal/illegal but unlikely to be seen that way in court or brought to a favorable verdict. It's knowing which cases you can actually win, how much it'll cost, and why.

Most of that is not within the scope of what an LLM could be trained on, or even what an LLM would be good at. What you'd be training in that case is an opinion columnist or Twitter poster, not an actual lawyer.


The point is not to replace all of the lawyers or programmers, but that we will no longer need so many of them, since much of their expertise is becoming a commodity. This is a fact, and there have been many, many examples of it.

My friend, who has no training in SQL, nor in computer science at all, is now suddenly able to crunch through complex SQL queries because of the help he gets from LLMs. He, or more specifically his company, no longer needs to hire an external SQL expert, since he can manage it himself. He will probably not write perfect SQL, but it's going to be more than good enough, and that's all that matters.

The same thing happened, at a much smaller scale, with Google Translate. Ten years ago we weren't able to read foreign-language content. Today? It's not even a click away, because Chrome translates it for you automatically; reading any website we wish has become a commodity.

So history has already shown us that "real translators" and "real SQL experts" and "real XY experts" have been replaced by their "armchair" alternatives.


But that ignores the fact that the stakes in law are high enough that you often cannot afford to be wrong.

30 years ago, the alternative to Google Translate was buying a translation dictionary or hiring a professional, neither of which is something you'd do for content you didn't care much about. Yes, I can look at a site/article in a language I don't speak, get it translated, and generally get the idea of what it's saying. If I'm just trying to read a restaurant menu in another language, I'm probably fine. I probably wouldn't trust it if I had serious food allergies, or was trying to work out what I could legally take through customs. If you're having a business meeting about something, you're probably still hiring a real human translator.

Yes, this stuff has become a commodity, but that just broadens who can use it, assuming they can afford for it to be wrong, and for them to have no recourse when it is. Google Translate won't pay your hospital bills if you relied on it to confirm there were no allergens in your food and it mistranslated the menu. ChatGPT won't work the overtime to fix the DB if it gives you a SQL command that accidentally truncates the entire dev environment.
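That last risk is exactly why teams that do run model-generated SQL tend to wrap it in a transaction and roll back by default. A minimal sketch of that idea, using SQLite and a hypothetical `run_generated_sql` helper (both the helper name and the sample table are assumptions for illustration, not anything from the thread):

```python
import sqlite3

def run_generated_sql(conn, sql, commit=False):
    """Execute model-generated SQL inside a transaction.

    The statement is rolled back unless a human has reviewed it and
    explicitly opted in with commit=True, so a destructive statement
    from the model can't wipe data on a dry run.
    """
    cur = conn.cursor()
    try:
        cur.execute(sql)
        rows = cur.fetchall()
        if commit:
            conn.commit()
        else:
            conn.rollback()
        return rows
    except sqlite3.Error:
        conn.rollback()
        raise

# Illustrative fixture: a tiny in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
conn.commit()

# Suppose the model emitted a destructive statement: by default it is
# rolled back, and the table survives the dry run.
run_generated_sql(conn, "DELETE FROM orders")
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)
```

This doesn't make the generated SQL correct; it just caps the blast radius while a human decides whether to pass `commit=True`.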

Almost everything around law in most countries has no "casual usage" where you can afford to be wrong. Even the most casual matter you might see a lawyer about, such as setting up a will, is still something where, if you try to do it yourself, you can create a huge legal mess. I've known friends whose relatives "did their own research" and wrote their own wills, and when they died, most of the estate's value was consumed by legal issues trying to resolve it.

As I said before: a legal LLM may be fine for writing opinion pieces or informing arguments on the internet, but messing up even basic points of law can be insanely costly if it ends up mattering, and most people won't know what will end up mattering. Lawyers bill hundreds an hour, and bailing you out of an LLM-deluded mess of your own making could easily take tens of hours.


The stakes of deploying buggy code into data-center production can easily run to millions of dollars, and yet one of the primary uses of LLMs today is precisely software engineering. Accountability exists in every domain, so that argument doesn't make law any different from anything else. You will still have an actual human signing off on the legal interpretation or the code pull request. We just won't need 10 people for that job, but 1. And at this point I believe that is inevitable.


Legal reasoning involves applying facts to the law, and it requires knowledge of the world. The expertise of a professional lies in picking the right/winning path based on their study of the law, the facts, and their real-world training. The money is in codifying that to teach models to do the same.



