- cross-posted to:
- sci
- technology@lemmy.world
- nottheonion@lemmy.world
It’s not unclear at all. Journals are not bastions of science any longer. They are a money making scheme for already wealthy assholes. Actual scientists and researchers have been bemoaning how shitty journals are for decades.
I believe that’s called vanity publishing
I have published in peer-reviewed journals and done a few reviews myself. It is not a perfect system. There are generally only a few reviewers, typically others who have published in the journal in that area. The goal is not to check the work deeply, and the tilt is toward allowing and trusting the authors. The other thing that shocked me: the authors generally pay quite a lot of money per page to publish articles. Also, not every journal is the same. Some are hard to publish in and others easy. Some are nonprofit and others are profit-making entities. The rush to publication and the publish-or-perish situation in science create their own issues too.
This is not so much a failure of science, in that it was discovered pretty quickly and presumably a retraction has been made. It is comical.
I would add that the patent system has similar issues. It is far from perfect too. Invalidating a patent is a lot more time-consuming and costly, and a lot less funny.
The goal is not to check the work deeply, and the tilt is toward allowing and trusting the authors.
Those sound like the wrong fucking goals, then. I push for detailed code reviews all the time and encourage my peers to ask questions.
Journal articles are the start of a discussion, not the end. They get an idea out there for others to consider, test, and extend or dispute.
Keep in mind reviewers are generally unpaid as well.
Code reviews do not test the correctness of code either, or whether the code is bug-free. They also tend to assume the good will of the participants. They have similar issues.
Further, most of the time it’s simply infeasible to test the data in depth. We’re all humans with busy schedules, and it is, unfortunately, not trivial to replicate experiments. If a reviewer feels more data is needed to support a claim, they can ask for a follow-up test or experiment, but it has to be within reason.
Why should the enshittification stop at the internet? Let’s enshittify the whole world for good measure.
I don’t like this either, but that’s not what enshittification means.
Yep, quoted from the wiki:
Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die. I call this enshittification, and it is a seemingly inevitable consequence arising from the combination of the ease of changing how a platform allocates value, combined with the nature of a “two sided market”, where a platform sits between buyers and sellers, holding each hostage to the other, raking off an ever-larger share of the value that passes between them.
deleted by creator
Sorry, I was quoting it for people that don’t click.
I hate the term and the fact it became widespread. Unfortunately, mass adoption also means it will mutate and evolution will follow its course.
That image haunts me. I can only assume this paper got through peer review because the reviewers saw that image and went mad, like all those who have witnessed eldritch monstrosities.
It’s not AI; it’s a window into the diseased minds of poor bastards who experimented with some Lovecraftian horror from outer space.
Yeah, I like that plot twist better.
🤖 I’m a bot that provides automatic summaries for articles:
Appall and scorn ripped through scientists’ social media networks Thursday as several egregiously bad AI-generated figures circulated from a peer-reviewed article recently published in a reputable journal.
But looking closer only reveals more flaws, including the labels “dissilced,” “Stemm cells,” “iollotte sserotgomar,” and “dck.”
Many researchers expressed surprise and dismay that such a blatantly bad AI-generated image could pass through the peer-review system and whatever internal processing is in place at the journal.
One scientific integrity expert questioned whether it provided an overly complicated explanation of “how to make a donut with colorful sprinkles.”
The image is supposed to provide visual representations of how the signaling pathway from Figure 2 regulates the biological properties of spermatogonial stem cells.
As such, research journals have recently set new authorship guidelines for AI-generated text to try to address the problem.
Saved 72% of original text.