Don’t get me wrong, there are problems with it, both in the process modern AI uses and in the sources it draws from. But as of right now, AI is just a tool, like Auto-Tune or Photoshop.

Even though it will change the media formats it is attached to, it will not supplant them within the next 5 to 10 years; it will simply transform them.

  • MeatsOfRage@lemmy.world · 4 months ago

    As someone who daily drives ChatGPT for a lot of stuff, I agree. I heard someone put it this way: “AI is perfect for stuff that is hard to find but easy to verify.”

    The other day I took a photo of my liquor cabinet and told it to make a cocktail recipe with ingredients on hand. Or if I encounter an error on my PC I’ll just describe the problem. Or for movie recommendations when I have a very specific set of conditions. Or trying to remember a show from my childhood with only a vague set of memories. The list goes on.

    Particularly for anything coding-related. If I’m trying to learn something, I learn best when I can just see an example of the thing in action, since documentation is not always great. Or if I’m doing data manipulation and I have the input and the output and just need the function to convert one to the other; I recently saved a whole afternoon of effort with that one. Or for spec tests, I’ll just drop my whole code file in and ask it for full coverage.
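    For anyone curious, a minimal sketch of that input/output trick could look something like this (assuming the openai Python package; the sample data, prompt wording, and model name are purely illustrative, not anything from the comment above):

        # Sketch: hand the model sample input and output, ask it to write the converter.
        from openai import OpenAI

        client = OpenAI()  # expects OPENAI_API_KEY in the environment

        sample_input = '[{"name": "Ada", "dob": "1815-12-10"}]'
        sample_output = '[{"full_name": "Ada", "birth_year": 1815}]'

        prompt = (
            "Write a Python function convert(records) that turns data shaped like this input:\n"
            f"{sample_input}\n"
            "into this output:\n"
            f"{sample_output}\n"
            "Return only the function definition."
        )

        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        print(response.choices[0].message.content)  # review and test before using

    The point is the pattern, not the exact code: the example data does the specifying, and verifying the returned function against your real data is the easy part.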

    These are all things that traditional search engines are poor at or outright incapable of. I’d have a hard time going back if they just turned all this off tomorrow.

    I think there’s a lack of education around how to use AI, which is actually a problem. Like, you shouldn’t be using it to identify whether a mushroom is safe to eat. You really shouldn’t be using it for anything food- or health-related, for that matter. And you should ask it for its sources when you’re unsure of its answers.

      • illi@lemm.ee · 4 months ago

        Yep. The few times I used ChatGPT were when I wanted to look something up but didn’t know what keywords to use, and it was easier to just describe it. Or when I just didn’t feel like digging through links and wanted an answer to a not-so-important question.

  • TurboWafflz@lemmy.world · 4 months ago

    I just wish companies would stop cramming it into every product and making it the selling point for everything because they can’t think of anything else to do. It’s kind of fun to play with sometimes, but it just isn’t useful at all for most of the tasks companies seem to want us to use it for. I’m hoping it will be mostly forgotten outside of niche uses in a few years, the same way we forgot about 3D TVs or NFTs as soon as the next tech buzzword got invented.

  • UpperBroccoli@lemmy.blahaj.zone · 4 months ago

    It will transform mediocre content into crap, and then it will regurgitate that crap into worse and worse crap. All of that at the expense of untold terawatt-hours of energy. AI will not take over the world, but it sure will help destroy it anyway.

    • alvvayson@lemmy.dbzer0.com · 4 months ago

      I agree with this take.

      AI will definitely make some white collar jobs way more productive, and thus change the nature of that work and reduce the number of people employed in those jobs.

      A good example is translation, where translators are now mostly reviewing translated texts instead of translating from scratch.

      This means that the ability to read fast and take on the role of editor is what matters in the translator jobs that remain.

  • SkyNTP@lemmy.ml · 4 months ago

    The problem, as usual, is not the tech; it’s the people, who have been led to believe it’s AGI, who equate forming syntactically correct sentences with intelligence, and who assume that is enough to perform most white-collar tasks.

  • DemocratPostingSucks@lemm.ee · 4 months ago

    To offer a counter-narrative: we don’t really know where AI will plateau. What will a 1-quadrillion-parameter language model look like? And will it make my degree worthless by comparison, lol?

    • Scrubbles@poptalk.scrubbles.tech · 4 months ago

      Even when it does, we will still have hallucination. It will still be impossible to guarantee a correct answer. More accurate models will lessen the problem, but fixing it requires heavy rethinking of the models themselves; it won’t just go away.

  • disguy_ovahea@lemmy.world · 4 months ago

    The dangers of AI should be determined by application, not capability. It’s a tool, like any other. You can use a hammer to build a house or cave in a skull.

  • 9point6@lemmy.world · 4 months ago

    There are ethical problems with how many of the models have been created and with some of what they’re being used for.

    But I generally agree, it’s pretty much the same thing we see with every technological innovation—something big changes and a load of things get disrupted, a group of people then get angry about said innovation, eventually those people dwindle and the innovation gets absorbed into the general public’s idea of what modern life consists of.

    I can’t think of any big innovation over the past few decades that hasn’t really followed a similar trajectory.

    • bizarroland@fedia.io (OP) · 4 months ago

      True, but I can’t think of anything that capitalism hasn’t fucked up one way or another.

      I would be against making AI responsible for what capitalism has done.

      • db0@lemmy.dbzer0.com · 4 months ago

        Neither am I. I’m just pointing out that capitalism has ruined AI, like it ruined everything from the loom to the printing press to the internet.