• sacredbirdman

      Nope, they can use your NPU, GPU, or CPU, whatever you have… the performance will vary quite a bit though. Also, the larger the model, the more memory it needs to run well.
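
A rough sketch of the kind of device fallback being described, using PyTorch purely as an example framework (the actual app could use any runtime; NPUs are usually reached through vendor runtimes such as ONNX Runtime or Core ML rather than plain PyTorch, so they're left out here):

```python
# Sketch: pick the fastest available backend, falling back to CPU.
import torch

def pick_device() -> torch.device:
    # Prefer a CUDA GPU, then Apple's Metal backend, then plain CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
print(f"Running inference on: {device}")
```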

    • @elliot_crane@lemmy.world

      With it being local it’s probably a small and limited model. I took a couple of courses on machine learning years ago (before it got rebranded as “AI”), and you’d be surprised at how well a basic image recognition model can run on the lowest-spec MacBook from 2012.
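
For a sense of scale, a minimal sketch of the kind of lightweight image-recognition model being described, using torchvision's pretrained MobileNetV2 (roughly 3.5M parameters) as a stand-in; "photo.jpg" is a placeholder path:

```python
# Small pretrained image classifier running on CPU only, the kind of model
# that is fine even on old laptop hardware.
import torch
from PIL import Image
from torchvision import models, transforms

weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights).eval()  # ~3.5M parameters

preprocess = weights.transforms()                    # resize/crop/normalize preset
image = preprocess(Image.open("photo.jpg")).unsqueeze(0)  # add batch dimension

with torch.no_grad():                                # inference only, no gradients
    logits = model(image)

label = weights.meta["categories"][logits.argmax().item()]
print(label)
```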

      • ferret

        Tbh the fact that LLMs invert the usual intuition, taking orders of magnitude more memory than computer vision models, can throw off hardware estimates for people who aren’t familiar with them.
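
Putting rough numbers on that, a weights-only, back-of-envelope comparison (activations, KV cache, and runtime overhead ignored; parameter counts are round illustrative figures):

```python
# Back-of-envelope memory needed just to hold the weights.
def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 1024**3

print(weight_memory_gb(3.5e6, 4))   # MobileNetV2-class vision model, fp32: ~0.013 GB
print(weight_memory_gb(7e9, 2))     # 7B-parameter LLM, fp16: ~13 GB
print(weight_memory_gb(7e9, 0.5))   # same LLM at 4-bit quantization: ~3.3 GB
```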

    • lemmyvore

      You only need lots of processing power to train the models. Running the models can be done on regular hardware.
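
A toy PyTorch illustration of that gap (not anything any real product uses): inference is a single forward pass with no gradients, while a training step also has to hold gradients and optimizer state, and is repeated over an entire dataset:

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
x = torch.randn(1, 128)

# Inference: one cheap forward pass, runs fine on a CPU.
model.eval()
with torch.no_grad():
    prediction = model(x)

# Training step: extra memory for gradients plus Adam's per-parameter state,
# and in practice this loops over millions of examples instead of one.
model.train()
optimizer = torch.optim.Adam(model.parameters())
loss = nn.functional.cross_entropy(model(x), torch.tensor([3]))
loss.backward()
optimizer.step()
```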