But what if it’s just a copy of that relation R? Is it the same mind in a new body? Or a new body that’s so utterly similar that it will merely continue on believing it’s the same person?
I don’t like that philosophy, because it pretends the mind exists outside of the brain.
Does a difference that's literally unobservable matter? Yes, absolutely, to the person stepping into the machine! Sure, everyone else doesn't have to care, but the person being teleported certainly should, so that philosophy does less than nothing to answer the question.
He's basically saying, "I don't care if you die, as long as I can't tell that the new guy is different." Saying "I don't care" is not an answer.
Does it, though, if you know…?
IMO, letting your location and private data anywhere near a digital ecosystem that includes a centralized LLM is a very unwise thing to do.
We both know that LLMs can and will spit out ANYTHING in their training data, regardless of how many roadblocks are put up and protective instructions given.
While they're not necessarily feeding outright personal info (of the general public, anyway) into their LLMs, we should also both know how shamelessly greedy these corpos are. It'll only be a matter of time before they're feeding everything in.
At that point, it won't just be a creep factor; it'll be a legitimate doxxing problem.