• tburkhol@lemmy.world · 2 days ago

    If he knew what he was doing, would he need to be vibe coding? The target audience is exactly the people most susceptible to collateral damage.

    • INeedMana@piefed.zip · 2 days ago

      I’ll probably get eaten alive here, but here goes: I do use LLMs when coding. But they should NEVER be used in unknown waters. Quickly generating the 50 lines of boilerplate so I can fill out the important 12 - sure. Seeing how some nested construct is written in a syntax I’ve forgotten - yes. Getting an example so I know where to start searching the documentation - OK. But “I asked it to do X, I don’t understand what it spewed out, let’s roll”? Hell no, that’s a ticking bomb with a very short fuse. Unfortunately, marketing has pushed LLMs as something you can trust. I already feel I’m being treated like a zealot dev afraid for his job when I warn people around me not to trust an LLM’s output any more than a search engine result.
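      To make that first point concrete, here is a minimal, hypothetical Python sketch (the script and every name in it are invented for illustration, not taken from the comment): the argparse/logging scaffold is the kind of boilerplate an LLM can reasonably draft, while the short summarise() body is the dozen lines that actually matter and that you write and review yourself.

      ```python
      # Hypothetical example: LLM-draftable boilerplate around hand-written core logic.
      import argparse
      import csv
      import logging
      import sys


      def build_parser() -> argparse.ArgumentParser:
          # Boilerplate an LLM could scaffold: flags, defaults, help text.
          parser = argparse.ArgumentParser(description="Summarise a CSV column.")
          parser.add_argument("path", help="input CSV file")
          parser.add_argument("--column", default="value", help="column to sum")
          parser.add_argument("--verbose", action="store_true", help="enable debug logging")
          return parser


      def summarise(path: str, column: str) -> float:
          # The "important 12 lines": written and reviewed by hand, not trusted blindly.
          total = 0.0
          with open(path, newline="") as handle:
              for row in csv.DictReader(handle):
                  total += float(row[column])
          return total


      def main() -> int:
          args = build_parser().parse_args()
          logging.basicConfig(level=logging.DEBUG if args.verbose else logging.INFO)
          logging.info("sum of %s: %s", args.column, summarise(args.path, args.column))
          return 0


      if __name__ == "__main__":
          sys.exit(main())
      ```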

    • moody@lemmings.world · 2 days ago

      I have a couple dev friends who were told by management that they need to be using AI, and they hate it.

      • MrMcGasion@lemmy.world · 2 days ago

        I know not everyone is in a position where they can just ignore management, and maybe I’ve just been in more blue-collar jobs where ignoring management is normalized. But unless they’re literally looking over your shoulder, what they don’t know won’t hurt them. It’s not like AI is more efficient once you count the extra debugging time.

      • ZDL@lazysoci.al · 1 day ago

        Ask the AI to forge a log of them using the AI, so they can do their work the proper way and still have a log “proving” they used it?