• PeriodicallyPedantic@lemmy.ca · 8 hours ago

      That means that functionally the LLM has access to your location.

      The tool has to run on your device to access the location, and apps can’t (and generally don’t) call each other in the background. That means the chat app itself has access to your location, which means the LLM can request it via the tool, or the app can simply send that information back to home base whenever it wants.
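      The flow described above can be sketched roughly like this. All names here are hypothetical; this is a toy stand-in for a real tool-calling API, not any actual chat app's code:

```python
# Toy sketch of on-device tool calling (hypothetical names).
# The point: the LLM never reads the GPS itself -- the app does,
# and relays the result back into the conversation.

def get_device_location():
    """Stand-in for the OS location API the chat app is permitted to call."""
    return {"lat": 48.8566, "lon": 2.3522}  # placeholder coordinates

def handle_tool_call(tool_name):
    """The chat app runs on-device, so it fulfils the model's tool requests."""
    if tool_name == "get_location":
        # The app reads the location and returns it as the tool result,
        # which is then sent to the model (and to the vendor's servers).
        return get_device_location()
    raise ValueError(f"unknown tool: {tool_name}")

# The model asks, the app answers -- so "the LLM has no location access"
# only holds until the model issues this request.
location = handle_tool_call("get_location")
```

      In other words, the permission boundary sits between the OS and the app, not between the app and the model.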

    • yermaw@sh.itjust.works · 7 hours ago

      It’s like saying I don’t have access to your address when the big book of everybody’s addresses, including yours, is on the desk in front of me, and I can look at it whenever required.

    • bluesheep@sh.itjust.works · 10 hours ago

      I also know that iOS allows an approximate location to be sent to apps, which may be the case here.

      Which doesn’t take away from the creep factor, let me set that straight.

      • prettybunnys@sh.itjust.works · 9 hours ago

        I think that’s still a permission: by default it’s “general area”, but you can also allow more fine-grained location data.
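        A crude way to picture the difference between “general area” and precise location is coarsening the coordinates the app receives. This rounding is purely an illustration, not Apple’s actual reduced-accuracy algorithm:

```python
def approximate_location(lat, lon, decimals=1):
    # Rounding to one decimal place puts you on roughly a ~10 km grid --
    # an illustration of "general area", not how iOS actually fuzzes it.
    return round(lat, decimals), round(lon, decimals)

def precise_location(lat, lon):
    # With fine-grained permission, the app gets the full coordinates.
    return lat, lon

coarse = approximate_location(48.8566, 2.3522)   # city-level
fine = precise_location(48.8566, 2.3522)         # street-level
```

        Either way the app (and anything it forwards data to) still learns where you are, just at different resolutions.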

      • oortjunk@sh.itjust.works · 10 hours ago

        It very very much does if you understand how that sausage is made.

        To the untrained eye though, I feel that.

        • MotoAsh@piefed.social · edited · 7 hours ago

          Does it if you know, though…?

          IMO, even involving location and private data in the digital ecosystem that includes a centralized LLM is a very unwise thing to do.

          We both know that LLMs can and will spit out ANYTHING in their training data regardless of how many roadblocks are put up and protective instructions given.

          While they’re not necessarily feeding outright personal info (of the general public, anyway) into their LLMs’ models, we should also both know how slovenly greedy these cunt corpos are. It’ll only be a matter of time before they’re feeding in everything they clearly already have.

          At that point, it won’t just be creep factor, but a legitimate doxxing problem.

        • MadameBisaster@lemmy.blahaj.zone · 10 hours ago

          Yeah, and it means that it can call on the location too, so while it doesn’t have direct access, it has indirect access. Whether that’s a problem, everyone has to decide for themselves.