An imagined town in Peru, an Eiffel tower in Beijing: travellers are increasingly using tools like ChatGPT for itinerary ideas – and being sent to destinations that don’t exist.
People have a responsibility to themselves. One of the very first things I did when I heard about ChatGPT and its ever-increasing coterie of imitators was test them. I had them talk about subjects I know well and counted the errors and flat-out hallucinated fictions.
Then I said to myself, "You know, they're going to be just as full of it on the things I don't know."
I saw all the lies. I saw all the advertising. I saw the same thing everybody else saw. But I saw it all and then actually tested it.
If people are willing to simply believe professional liars—and make no mistake, that's what advertisers are—without bothering to do a minute's checking, then sorry, that's entirely on them.
“What was the person wearing when GPT gave them sightseeing suggestions?”
Apparently not their thinking cap.