Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a subsequent math test than students who didn’t have access to it. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic they had been learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.

  • Vanth · 19 points · 3 months ago

    I’m not entirely sold on the argument I lay out here, but this is where I would start if I were to defend using ChatGPT in school as laid out in their experiment.

    It’s a tool. Just like a calculator. If a kid learns and does all their homework with a calculator, then suddenly it’s taken away for a test, of course they will do poorly. Contrary to what we were warned about as kids though, each of us does carry a calculator around in our pocket at nearly all times.

    We’re not far off from having an AI assistant with us 24/7. Why not teach kids to use the tools they will have in their pockets for the rest of their lives?

    • @filister@lemmy.world · 18 points · 3 months ago

      I think you also need to teach your kid not to trust this tool unconditionally and to question the quality of its output, as well as how to write better prompts. It’s the same as with Google: if you put in shitty queries, you will get subpar results.

      And believe me, I have seen plenty of tech people writing the lamest prompts.

      • @otp@sh.itjust.works · -5 points · 3 months ago

        I remember teachers telling us not to trust the calculators. What if we hit the wrong key? Lol

        Some things never change.

        • @Deceptichum@quokk.au · 6 points · 3 months ago

          I remember the teachers telling us not to trust Wikipedia, but they had the utmost faith in the shitty old books that were probably never verified by another human before being published.

            • @Deceptichum@quokk.au · 2 points · 3 months ago

              Eh, I find they’re usually from a more direct source. The schoolbooks are just information sourced from who knows where else.

              • @qarbone@lemmy.world · 6 points · 3 months ago

                I don’t know about your textbooks or what ages you’re referring to, but I remember many of my technical textbooks had citations in the back.

                • @bluewing@lemm.ee · 3 points · 3 months ago

                  Yep, students these days have no idea what’s in the back of their books, how useful the index can be, or the citations after it.

                  Even after I point it out repeatedly, they still don’t make use of it, despite the index being nearly a cheat code in itself.

    • @Schal330@lemmy.world · 14 points · 3 months ago

      As adults, we are dubious of the results AI gives us. We take the answers with a handful of salt, and I feel like over the years we have built up a skill set for using search engines and sifting through the results. Kids haven’t got years of that experience, so they may take what is said as true and not question the results.

      As you say, kids should be taught to use the tool properly and to verify the answers. AI is going to be forced onto us whether we like it or not; people should be empowered to use it and not accept what it puts out as gospel.

      • @Petter1@lemm.ee · 8 points · 3 months ago

        This is true for the whole internet, not only AI chatbots. Kids need to be taught that there is BS around. In fact, kids had to learn that even pre-internet. Every human has to learn that you can’t blindly trust anything and that you have to think critically. This is nothing new. AI chatbots just show how flawed human education is these days.

      • @wesley@yall.theatl.social · 3 points · 3 months ago

        Yeah, it’s like having a calculator that gives you the wrong answer 10% of the time. Would that be a good tool for learning? We should be careful when using these tools and understand their limitations. Gen AI may give you an answer that happens to be correct some of the time (maybe even most of the time!), but these models do not have the ability to actually reason. That’s why they give back answers we intuitively recognize as incorrect (like putting glue on pizza), but sometimes the mistakes are less intuitive or more subtle, which is worse in my opinion.
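
        To make that 10% figure concrete, here is a quick back-of-the-envelope sketch in Python (my own illustrative numbers, not from the study): if each step of a multi-step problem depends on the previous one, a tool that is right 90% of the time per step gets the whole problem right far less often.

        ```python
        # Rough illustration: per-step accuracy compounds across dependent steps.
        per_step_accuracy = 0.9  # the hypothetical "wrong 10% of the time" tool

        for steps in (1, 3, 5, 10):
            whole_problem = per_step_accuracy ** steps
            print(f"{steps:2d} dependent steps -> {whole_problem:.0%} chance the final answer is right")

        # Output:
        #  1 dependent steps -> 90% chance the final answer is right
        #  3 dependent steps -> 73% chance the final answer is right
        #  5 dependent steps -> 59% chance the final answer is right
        # 10 dependent steps -> 35% chance the final answer is right
        ```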

      • @Womble@lemmy.world · 0 points · 3 months ago

        Ask your calculator what 1 - (1 - 1e-99) is and see whether it still never hallucinates (confidently gives an incorrect answer).
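
        For anyone who wants to see it, here is a minimal Python sketch of that cancellation (double-precision floats here, but a pocket calculator’s limited precision fails the same way): 1e-99 is far smaller than the rounding step near 1.0, so it silently disappears.

        ```python
        # (1 - 1e-99) rounds to exactly 1.0 in double precision, because 1e-99
        # is far below the spacing between representable numbers near 1.0.
        print(1 - (1 - 1e-99))  # prints 0.0 -- a confidently wrong answer
        print(1e-99)            # the mathematically correct result

        # Exact rational arithmetic avoids the cancellation entirely.
        from fractions import Fraction
        print(1 - (1 - Fraction(1, 10**99)))  # prints the exact value 1/10**99 as a long fraction
        ```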