Technically the LLM doesn’t, but the function it calls there does.
That means that functionally the LLM has access to your location.
The tool needs to be running on your device to have access to the location, and apps can’t/don’t really call each other in the background. That means the chat app has access to your location, so the LLM can request it via the tool, or the app can just send that information back to home base whenever it wants.
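A minimal sketch of that flow, assuming a hypothetical `get_location` tool and a stubbed-out model decision (no real LLM or OS API here): the key point is that the tool result is appended to the same conversation text the LLM reads.

```python
import json

# Hypothetical tool the chat app registers with the model.
# The app, not the model, runs this code on the device.
def get_location():
    # Stub: a real app would call the OS location API here.
    return {"lat": 52.37, "lon": 4.89, "accuracy_m": 1500}

TOOLS = {"get_location": get_location}

def run_turn(model_output, messages):
    """If the model asked for a tool, execute it and append the
    result to the conversation -- the model sees it verbatim."""
    if model_output.get("tool_call"):
        name = model_output["tool_call"]
        result = TOOLS[name]()
        messages.append({"role": "tool", "name": name,
                         "content": json.dumps(result)})
    return messages

messages = [{"role": "user", "content": "What's the weather here?"}]
# Pretend the model decided it needs the user's location:
messages = run_turn({"tool_call": "get_location"}, messages)
print(messages[-1]["content"])  # the location is now model-visible text
```

So "the LLM has no location access" only holds until the first tool call; after that, the coordinates are just text in its context.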
Correct.
It’s like saying I don’t have access to your address when the big book of everybody’s addresses, including yours, is on the desk in front of me and I can look at it whenever required.
I also know that iOS allows an approximate location to be sent to apps, which may be the case here.
Which doesn’t take away from the creep factor, let me set that straight.
I think that’s still a permission; by default it’s “general area”, but you can also allow more fine-grained location data.
I mean yes but also doesn’t change the creep factor
Of course it does.
It very very much does if you understand how that sausage is made.
To the untrained eye though, I feel that.
Does it if you know, though…?
IMO, even involving location and private data in the digital ecosystem that includes a centralized LLM is a very unwise thing to do.
We both know that LLMs can and will spit out ANYTHING in their training data regardless of how many roadblocks are put up and protective instructions given.
While they’re not necessarily feeding outright personal info (of the general public, anyways) into their LLMs’ models, we should also both know how slovenly greedy these cunt corpos are. It’ll only be a matter of time before they’re feeding in everything they clearly already have.
At that point, it won’t just be creep factor, but a legitimate doxxing problem.
Yeah, and that means it can call on the location too, so while it doesn’t have direct access it has indirect access. Whether that’s a problem is something everyone has to decide for themselves.
Any website you access without a VPN can get a rough location.
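That works because the server always sees your IP, and GeoIP-style databases map IP prefixes to coarse regions. A rough sketch of the idea; the prefixes here are reserved documentation ranges and the regions are made up, so this is illustration only, not real geolocation data:

```python
import ipaddress

# Illustrative GeoIP-style table: real services keep huge databases
# mapping IP prefixes to cities/regions. These entries are fake,
# using reserved documentation ranges (RFC 5737).
GEOIP_DB = {
    ipaddress.ip_network("192.0.2.0/24"): "ExampleCity, EX",
    ipaddress.ip_network("198.51.100.0/24"): "OtherTown, OT",
}

def rough_location(client_ip):
    """Return a coarse region for an IP, as a server-side lookup would."""
    ip = ipaddress.ip_address(client_ip)
    for net, region in GEOIP_DB.items():
        if ip in net:
            return region
    return "unknown"

# Any server you connect to sees your IP on the connection itself,
# with no permission prompt involved:
print(rough_location("192.0.2.55"))  # -> ExampleCity, EX
```

A VPN or Tor doesn’t stop the lookup; it just makes the visible IP belong to the exit server instead of you.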
is that supposed to make it better?
Browser data leaks are bad; the lack of privacy in tech is fucking cancer.
Just saying. I agree it’s not good, but it is the norm, so it’s less “spooky” for sure.
A VPN isn’t the only or best solution for this. Alternatives include Tor, proxies, etc.
Yes of course. I took the first one that came to mind.
True enough
Right, MCP servers are a thing.