Meta recently fixed a serious security flaw in its AI chatbot system that could have exposed users’ private conversations. The vulnerability was discovered by Sandeep Hodkasia, founder of AppSecure, who reported the issue to Meta in December 2024. The company patched the bug in January 2025 and rewarded the researcher with a $10,000 (₹8.5 lakh) bounty.
What Was the Issue?
According to a report by TechCrunch, the problem lay in how Meta AI handled user prompts on its servers. Each time a logged-in user edited a prompt to regenerate text or an image, the system assigned that prompt and its AI-generated response a unique ID.
Hodkasia discovered these IDs by analyzing the network traffic in his browser while editing a prompt. By simply changing the ID in a request, it was possible to view someone else's conversation with Meta AI, including their prompts and the AI's responses. He said these IDs were "easily guessable," making the flaw a major security risk.
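The report does not detail Meta's ID format, but "easily guessable" suggests something like a sequential counter. As a generic illustration (not Meta's actual scheme), compare enumerable sequential IDs with random UUIDs in Python:

```python
import itertools
import uuid

# Sequential numeric IDs: an attacker who sees their own ID can
# enumerate neighbouring records just by incrementing it.
ids = itertools.count(424242)
print(next(ids), next(ids))  # 424242 424243

# Random 128-bit UUIDs: blind enumeration is infeasible.
print(uuid.uuid4())
```

Unguessable IDs alone are not a fix, though; the server still has to check who owns each record, as the sketch below shows.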
The flaw existed because Meta's servers did not check whether the user requesting a given ID was actually authorized to see it. In the wrong hands, this weakness could have exposed a huge amount of sensitive data, including personal queries like medical advice, legal questions, or even confessions.
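This class of bug is commonly called an insecure direct object reference (IDOR): the server trusts a client-supplied ID without verifying who is asking. Here is a minimal Flask sketch of the missing check, with entirely hypothetical endpoint, storage, and user-lookup details (Meta's actual implementation is not public):

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical in-memory store; Meta's real storage layer is unknown.
CONVERSATIONS = {
    1001: {"owner": "alice", "prompt": "draw a cat", "response": "<image>"},
    1002: {"owner": "bob", "prompt": "a legal question", "response": "..."},
}

def current_user():
    # Placeholder: a real service derives this from the session or token.
    return "alice"

@app.route("/conversations/<int:conv_id>")
def get_conversation(conv_id):
    conv = CONVERSATIONS.get(conv_id)
    if conv is None:
        abort(404)
    # The check the vulnerable server reportedly skipped: return the
    # record only if the requester owns it. Without this, changing
    # 1001 to 1002 in the URL hands one user another user's chat.
    if conv["owner"] != current_user():
        abort(403)
    return jsonify(conv)
```

Omitting the owner check reproduces the reported behaviour: any logged-in user who edits the ID in the request receives someone else's prompt and response.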
Meta’s Response
Meta spokesperson Ryan Daniels confirmed that the issue was fixed in January 2025 and said the company found no evidence that attackers had abused the flaw. The patch was deployed shortly after Hodkasia's report, along with additional security measures.
Why It Matters
This discovery came just weeks after a separate incident in which the Meta AI app's Discover feed publicly surfaced conversations users believed were private, raising broader concerns about privacy on AI platforms. Meta has since added warnings to discourage users from sharing personal details in AI chats.
The bug highlights the importance of basic server-side safeguards such as authorization checks, especially as companies race to integrate AI tools into everyday apps. Without them, flaws like this one could lead to massive privacy breaches.