

I don't think it's a rejection of psychologists. There are simply too few of them - if AI could work at a similar level, lots of people who currently have no access to psychological help would suddenly have it.


It would have to be proven that, even in the worst case, the AI acts no worse than the average psychologist. We are far, far, far away from that.
I’d have expected it to be easier than shopping for hair products!
Door-Kun!


My friend, please run to the next available doctor
But at least it’s UE5™ brown 😎


If the story is true, I’m guessing this was mistranslated & it could have cost them millions if the cats damaged the equipment. If it really cost them anything, they’d have mentioned what happened.


skong release when??


I haven’t had any issues with my 8BitDo controller on Proton 10.x.


Thanks, now I’m up -44.49£ 😎 (assuming 50% card to cash conversion & 1.33 USD : 1 GBP)


no because this is literally in development, this isn’t some 60 year old mature tech
Of course, you don’t have research supporting your position because it’s still in development. So obviously we can just ignore all the papers released over the last decade+ which show the opposite of what you’re claiming - convenient!
there is already accessible academic research on LLM issues, the major concern being hallucinations, to the point where the word bailout is starting to make the rounds in the us from these very companies
the argument is whether or not you believe this is inherent or fixable and a big focus is on the training
anyone listening to any ai company right now is a damn fool with the obvious circular vendor bullshit going on
Yeah, as I expected - you literally don’t understand what this conversation is even about. Since you have a bone to pick with the industry, you make up random claims that you think make the industry look bad. But what you don’t understand is: you’re just making a fool of yourself by making subjective claims around topics you simply don’t understand. Critique the AI industry for the greedy, useless shit they’re doing and creating, not by making up wrong “facts” and ignoring all evidence against them.
And just to save us both time, I'll try to list the positions you seem to think I hold, which I don't:
If you choose to reply again and think I’m lying about not holding these positions, re-read the conversation until you understand it.


Ah yes, and you can’t show us that research because it goes to another school? And all companies that train LLMs are simply too stupid to realize this fact? Their research showing the opposite (which has been replicated dozens of times over) was just a fluke?
I see where the confusion is.
“Brat” can also refer to adults.


Yeah, that’s what I guessed. Try to look into the research first before making such grandiose claims.


No, it doesn’t. Unless you can show me a paper detailing that literally any amount of synthetic data increases hallucinations, I’ll assume you simply don’t understand what you’re talking about.


I mean - yeah, it is? This is a well-researched part of the data pipelines for any big model. Some companies even got into trouble because their models identified as other models, whose outputs they were trained on.
It seems you have a specific bone to pick that you attribute to such training, but it’s just such a weird approach to deny pretty broadly understood results…


First: that’s wrong, every big LLM uses some data cleaned/synthesized by previous LLMs. You can’t solely train on such data without degradation, but that’s not the claim.
Second: AI providers very explicitly use user data for training, both prompts and response feedback. There’s a reason businesses pay extra to NOT have their data used for training.


It can be, spinning rust has pretty bad throughput.
Isn’t waypipe meant to replace X over SSH?
Like those nuclear mantis shrimp detonations? That makes a lot of sense