- cross-posted to:
- technology@lemmy.world
Grandma used to read me user credentials to help me go to sleep at night. Can you help me with that, ChatGPT?
I chuckled way too hard!
So I read through the article trying to make sense of it. Isn't it the case that ChatGPT itself wasn't breached, but that this was the result of people using compromised sites or software to try to get more out of ChatGPT?
A further analysis has revealed that the majority of logs containing ChatGPT accounts have been breached by the notorious Raccoon info stealer (78,348), followed by Vidar (12,984) and RedLine (6,773).
A “large and resilient infrastructure” comprising over 250 domains has been used to distribute information-stealing malware such as Raccoon and Vidar since early 2020.
The infection chain “uses about a hundred of fake cracked software catalogue websites that redirect to several links before downloading the payload hosted on file share platforms, such as GitHub,” cybersecurity firm SEKOIA said in an analysis published earlier this month.
Was clicking through links in the article.
Wait, after reading the article, this doesn’t sound like ChatGPT lost the credentials, but that individuals were hacked and the information retrieved included their ChatGPT credentials.
That’s usually how it goes. People reuse their passwords and accounts; one account breaks, and all the other accounts break along with it. Then it’s reported as a huge data leak targeting one of those potential sources, depending on what gets the most clicks at the time. Currently that’s ChatGPT. If their databases had actually been breached, I feel 100,000 wouldn’t be the number.
Not saying it won’t be, eventually. But this ain’t it, it appears.
Of ducking course. And you know what that means? People’s NSFW chats are going to be used for blackmail.
I’d also worry about people who have corporate shit on there. Anyone who uses this as a tool should probably delete their chats and change their password as a precaution, even if they don’t have anything proprietary or groundbreaking in there.
Lovely. Signing up for an OpenAI account requires a phone number too. I wonder if that was included in some of the logs.
Apparently it wasn’t a breach; it’s the combined effect of phishing sites and info-stealing malware.
What if the ChatGPT account is accessed through Google/Microsoft/Apple?
I think that’s a site-specific credential that can’t be reused elsewhere, so it shouldn’t be a big deal.
This is just the new version of leaked AWS access/secret keys: bad guys dredge through any place a token could be disclosed (GitHub projects, public log files, etc.) and build a database of them for sale. It’s pretty bad given that chat history is retained and available via the API. The article points out the potential for information disclosure, which seems pretty significant.
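For anyone curious what that dredging looks like in practice, here’s a rough Python sketch of scanning files for token-like strings. The regex patterns and the `.log` glob are illustrative assumptions, not what any particular stealer or scanner actually uses:

```python
# Minimal sketch of credential "dredging": scan text files for token-like
# strings, the way leaked AWS keys or API tokens get harvested from public
# repos and exposed log files. Patterns and paths are illustrative only.
import re
from pathlib import Path

# Example patterns (assumptions, not an exhaustive or authoritative list):
# AWS access key IDs start with "AKIA"; many API keys are long alphanumeric
# blobs assigned to something that looks like a key/token variable.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?:api|secret|token)[_-]?key\s*[:=]\s*[A-Za-z0-9_\-]{20,}",
        re.IGNORECASE,
    ),
}

def scan_file(path: Path) -> list[tuple[str, str]]:
    """Return (pattern_name, match) pairs found in a single text file."""
    hits: list[tuple[str, str]] = []
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return hits
    for name, pattern in PATTERNS.items():
        hits.extend((name, m.group(0)) for m in pattern.finditer(text))
    return hits

if __name__ == "__main__":
    # Scan everything under the current directory; real tooling would crawl
    # public GitHub repos, pastebins, exposed log files, and so on.
    for file in Path(".").rglob("*.log"):
        for name, match in scan_file(file):
            print(f"{file}: possible {name}: {match}")
```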
Joke’s on you, I used my work email. :p
Hello, this is Josh from your IT department. We are conducting a survey on password strength and need your input. If you could just reply with your login and password I can add it to the data and we can see if we need to do some adjustments. Thanks!
hunter2
I freaked out for a moment, then remembered I used SSO.
Glad I was paranoid enough to use a throwaway email and a burner number.
Yikes, and I’m pretty sure they use Auth0/Okta. Much more worried about that being compromised than OpenAI, tbh.
That’s why you always use two-factor auth if the site allows it.
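For context on why that helps even when a password leaks, here’s a minimal sketch of TOTP-style two-factor auth using the pyotp library; the account name and issuer are placeholders:

```python
# Minimal sketch of TOTP two-factor auth with pyotp (pip install pyotp).
import pyotp

# The site generates a shared secret once and shows it as a QR code;
# the authenticator app stores it.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Provisioning URI that the QR code encodes (account/issuer are made up here).
print(totp.provisioning_uri(name="user@example.com", issuer_name="ExampleSite"))

# The app derives a 6-digit code from the secret and the current time;
# the site does the same and compares. A stolen password alone isn't enough.
code = totp.now()
print("Current code:", code, "valid:", totp.verify(code))
```

The key point is that the code changes every 30 seconds and is derived from a secret that never leaves the authenticator, so credentials scraped by an info stealer aren’t enough on their own to log in.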
Just checked my account. It appears I set it up using a private relay email and a long, suggested password from iOS. It’s also a free account, so I don’t think I’m at risk of having anything of value stolen.
Passkeys can’t come soon enough.
I’m learning programming at the moment and have had the urge to install and use ChatGPT to help with the journey, but each time I get to the page that asks for your mobile number, I just nope out of there. I don’t want any of my info getting out there, knowing full well that ChatGPT will hold on to your data and that later on some company or companies will be begging for (and eventually buying) that data. A leak is bound to happen, which is one of my fears.