Surprised Air Canada’s lawyers had the bravado to make claims like this. So glad they lost, I hope this becomes precedent for anything similar.
I could see this simply resulting in every chatbot having a disclaimer that it might be spitting straight bullshit and you should not use it for legal advice.
At this point, I do consider this a positive outcome, too, because it’s not always made obvious whether you’re talking with something intelligent or just a text generator.
But yeah, I would still prefer it if companies simply had to provide intelligent support. This race to the bottom isn’t helping humanity.
That won’t hold up, though.
I don’t know if small claims create precedent in the same way that a normal lawsuit would.
I can only speak for the UK, but since small claims are heard in the lowest civil court here, their decisions are not binding on any other court (including other small claims courts). They are, however, considered “persuasive”, so a judge should be aware of them and take them into consideration.
Yeah, I mean, at the very least it’s a solid argument. Any judge who’s given a similar case and doesn’t look up whether someone else has already dealt with such a case is just doing a disservice to themselves…