Stolen ChatGPT premium accounts for sale on the dark web


The trade in stolen ChatGPT account credentials, particularly those for premium accounts, has been on the rise on the dark web since March, allowing cybercriminals to bypass OpenAI's geofencing restrictions and gain unrestricted access to ChatGPT, according to a Check Point Research report.

"Over the past month, CPR (Check Point Research) has observed an increase in conversations on underground forums related to the leak or sale of compromised ChatGPT premium accounts," Check Point said in a blog post. "Most of these stolen accounts are sold, but some of the actors also share stolen ChatGPT premium accounts for free, to advertise their own services or tools to steal the accounts."

Various criminal activities around ChatGPT

Researchers have observed various types of ChatGPT-related discussions and exchanges on the dark web over the past month.

The latest dark web activity around ChatGPT includes the leaking and free posting of ChatGPT account credentials, as well as the trading of stolen ChatGPT premium accounts.

Cybercriminals are also exchanging account-checking and brute-force tools for ChatGPT. These tools run large lists of email addresses and passwords against the service, attempting to guess valid combinations and gain access to existing accounts.

Also on offer is ChatGPT Account as a Service, a dedicated service that opens premium ChatGPT accounts on demand, most likely paid for with stolen payment cards, Check Point said in its blog post.

SilverBullet configuration for sale

Cybercriminals are also offering a configuration file for SilverBullet that automatically checks sets of credentials against the OpenAI platform, Check Point said.

SilverBullet is a web testing suite that lets users send requests to a target web application. It is also used by cybercriminals to carry out credential stuffing and account-checking attacks against various websites in order to steal accounts on online platforms.

In the case of ChatGPT, the researchers said, this lets attackers hijack accounts at scale. The process is fully automated and can run between 50 and 200 checks per minute. It also supports the use of proxies, which in many cases allows attackers to bypass the protections websites deploy against such attacks.

“Another cybercriminal, who focuses solely on abuse of and fraud against ChatGPT products, even named himself 'gpt4'. In his threads, he offers for sale not only ChatGPT accounts but also a configuration for another automated tool that checks the validity of credentials,” Check Point said.

Lifetime upgrade to ChatGPT Plus

An English-speaking cybercriminal began advertising a lifetime ChatGPT Plus account service with a 100% satisfaction guarantee on March 20, Check Point said.

A lifetime upgrade to ChatGPT Plus for a regular account, opened with the email address provided by the buyer, costs $59.99, while OpenAI's legitimate price for the service is $20 per month.

“However, to save costs, this underground service also offers the option of sharing access to the ChatGPT account with another cybercriminal for $24.99, for a lifetime,” Check Point said.

What can be done with stolen ChatGPT account credentials?

There is high demand for stolen ChatGPT premium account credentials because they help cybercriminals bypass OpenAI's geofencing restrictions, which block use of the service in certain regions, including Iran, Russia, and China.

However, by using the ChatGPT API, cybercriminals can bypass these restrictions and also make use of premium accounts, Check Point said.

Another potential use for cybercriminals is obtaining personal information, since ChatGPT accounts store the account owner's recent queries.

“So when cybercriminals steal existing accounts, they gain access to the queries made by the original account owner. This can include personal information, details about corporate products and processes, and more,” Check Point said in the blog post.

In March, Microsoft-backed OpenAI revealed that a bug in an open source Redis client library had caused a ChatGPT outage and data leak, in which users could see other users' personal information and chat queries.

Chat queries and personal information such as subscriber names, email addresses, payment addresses, and partial credit card details belonging to about 1.2% of ChatGPT Plus subscribers were exposed, the company acknowledged.

Privacy issues in ChatGPT

There have been several privacy and security issues surrounding ChatGPT in recent months. Italy's data privacy regulator has already banned ChatGPT for alleged privacy violations related to the chatbot's collection and storage of personal data. Authorities have said they will lift the temporary ban on ChatGPT if OpenAI meets a set of data protection requirements by April 30.

The German data protection commissioner has also warned that ChatGPT could face a potential block in Germany due to data security issues.

Meanwhile, earlier this week, OpenAI announced a bug bounty program inviting the global community of security researchers, ethical hackers, and technology enthusiasts to help the company identify and fix vulnerabilities in its generative artificial intelligence systems.

OpenAI will pay out cash rewards ranging from $200 for low-severity findings to $20,000 for exceptional discoveries.

Copyright © 2023 IDG Communications, Inc.