paywall bypass: https://archive.is/whVMI

the study the article is about: https://www.thelancet.com/journals/langas/article/PIIS2468-1253(25)00133-5/abstract

article text:

AI Eroded Doctors’ Ability to Spot Cancer Within Months in Study

By Harry Black

August 12, 2025 at 10:30 PM UTC

Artificial intelligence, touted for its potential to transform medicine, led to some doctors losing skills after just a few months in a new study.

AI helped health professionals to better detect pre-cancerous growths in the colon, but when the assistance was removed, their ability to find tumors dropped by about 20% compared with rates before the tool was ever introduced, according to findings published Wednesday.

Health-care systems around the world are embracing AI with a view to boosting patient outcomes and productivity. Just this year, the UK government announced £11 million ($14.8 million) in funding for a new trial to test how AI can help catch breast cancer earlier.

The AI in the study probably prompted doctors to become over-reliant on its recommendations, “leading to clinicians becoming less motivated, less focused, and less responsible when making cognitive decisions without AI assistance,” the scientists said in the paper.

They surveyed four endoscopy centers in Poland and compared detection success rates three months before AI implementation and three months after. Some colonoscopies were performed with AI and some without, at random. The results were published in the journal The Lancet Gastroenterology & Hepatology.

Yuichi Mori, a researcher at the University of Oslo and one of the scientists involved, predicted that the effects of de-skilling will “probably be higher” as AI becomes more powerful.

What’s more, the 19 doctors in the study were highly experienced, having performed more than 2,000 colonoscopies each. The effect on trainees or novices might be starker, said Omer Ahmad, a consultant gastroenterologist at University College Hospital London.

“Although AI continues to offer great promise to enhance clinical outcomes, we must also safeguard against the quiet erosion of fundamental skills required for high-quality endoscopy,” Ahmad, who wasn’t involved in the research, wrote in a comment published alongside the article.

A study conducted by MIT this year raised similar concerns after finding that using OpenAI’s ChatGPT to write essays led to less brain engagement and cognitive activity.

  • RogueBanana@piefed.zip · 21 hours ago

    Also very apparent in IT. Juniors blindly generating garbage and coming to me when the shit they blindly create doesn’t work. Got to drill them with questions to make them actually learn something. Concerning that the same is happening in medical even for the experts.

    • mindbleach@sh.itjust.works · 20 hours ago

      It sounds like this is about when they stopped using AI.

      If they do better with it than without it, why optimize how good they are without it? Like, I know how to do math, by hand. But I also own a calculator. If the speed and accuracy of my multiplication is life-and-death for worried families, maybe I should use the calculator.

      • RogueBanana@piefed.zip · 14 hours ago (edited)

        No, this is about me trying to fix their buggy AI code when they have no idea how it works or why it isn’t working. If you can do your work completely on your own without issues, then whatever, but if you are breaking stuff and come to me asking for help because you don’t know how your own code works, then that’s a massive problem. I don’t mind teaching people, I actually enjoy it, but only when you are putting in effort to learn instead of copy-pasting code from Copilot.

        • mindbleach@sh.itjust.works · 8 hours ago

          Okay cool, that’s not what’s happening here.

          These aren’t “vibe doctors.” They’re trained oncologists and radiologists. They have the skill to do this without the new tool, but if they don’t practice it, that skill gets worse. Surprise.

          For comparison: can you code without a compiler? Are you practiced? It used to be fundamental. There must be e-mails lamenting that students rely on this newfangled high-level language called C. Those kids’ programs were surely slower… and ten times easier to write and debug. At some point, relying on a technology becomes much smarter than demonstrating you don’t need it.

          If doctors using this tool detect cancer more reliably, they’re better doctors. You would not pick someone old-fashioned to feel around and reckon about your lump, even if they were the best in the world at discerning tumors by feel. You’d get an MRI. And you’d want it looked at by whatever process has the best detection rates. Human eyeballs might be in second place.

          • RogueBanana@piefed.zip · 6 hours ago

            I never implied they are vibe doctors? It’s just a comment on my annoying experience, don’t read too much into it.

            • mindbleach@sh.itjust.works · 5 hours ago

              “Concerning that the same is happening in medical even for the experts.”

              It isn’t.

              Glad we cleared that up?

      • Baggie@lemmy.zip · 20 hours ago

        If you use a calculator, and it gives you back a number that can’t possibly be right, you know there’s an error somewhere along the line.

        If you’ve never done multiplication before, you won’t have that innate sense of what looks right or wrong.

          • Baggie@lemmy.zip · 17 hours ago

            It’s an analogy. It’s referring to the original comment where people don’t have the skills to recognise how or why something doesn’t work. The core problem is without that fundamental understanding of what you’re trying to do, you don’t know why something doesn’t work.

            • mindbleach@sh.itjust.works · 9 hours ago

              No shit, it’s my analogy. And I made clear that the underlying skill still exists.

              These doctors can still spot cancer. They’re just rusty at eyeballing it, after several months using a tool that’s better than their eyeballs.

              X-rays probably made doctors worse at detecting tumors by feeling around for lumps. Do you want them to fixate on that skill in particular? Or would you prefer medical care that uses modern technology?

              • Baggie@lemmy.zip · 7 hours ago

                Well you need to work on your communication skills as much as you do on your tone then.

                You are clearly more focused on being argumentative and obtuse than on engaging with the argument: skills need to be developed before you hand all the work to a machine that automates the process, because errors can and will occur.

                Enjoy spending all your life entering every discussion predisposed to anger and argument, I’ve got better things to do with my time.

                • mindbleach@sh.itjust.works · 7 hours ago (edited)

                  Tone policing, followed by essentialist insults. Zero self-awareness.

                  Meanwhile, I’ve repeatedly pointed out: these doctors have the skills. The machine only helps. You can’t or won’t engage with that.

      • subignition@piefed.social · 10 hours ago

        Because “AI” tools are unsustainable, and it would be better not to have destroyed your actual skill when the bubble eventually pops.

        • mindbleach@sh.itjust.works · 9 hours ago

          This is not that kind of AI. It’s not an LLM trained on WebMD. You cannot reason about this domain-specific medical tool based on your experience with ChatGPT.