September 17, 2024 at 04:24PM
Google Cloud’s Document AI service has a vulnerability that attackers could exploit to access and steal sensitive data from Cloud Storage buckets. Although the flaw was reported, Google has yet to fully address it, leaving the attack vector open. The nature of the vulnerability, and the back-and-forth with Google over a bounty, are detailed by Vectra AI security researcher Kat Traxler.
Principal security researcher Kat Traxler reported the flaw to Google in early April, and it took several months of back-and-forth before the company acknowledged the issue; in September she was finally awarded a $3133.70 bug bounty for the disclosure. The vulnerability stems from overly permissive settings in Document AI's batch-processing mode: batch jobs run under the service's own service agent, whose permissions are broad enough to access any Cloud Storage bucket within the same project, making it possible to exfiltrate sensitive data. Even so, the underlying attack vector reportedly remains open, and Google did not immediately respond to inquiries from The Register about the issue.
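For context on the batch-processing flow at the center of the finding, here is a minimal sketch of a batch request using the standard google-cloud-documentai Python client; the project, processor ID, and bucket names are placeholders, not values from the report. Note that the caller only names gs:// URIs, while the actual reads and writes against Cloud Storage are carried out by the Document AI service agent, the identity whose broad, project-wide permissions Traxler flagged.

```python
# Minimal sketch of a Document AI batch-processing request (placeholder
# project, processor, and bucket names; not taken from the report).
from google.cloud import documentai_v1 as documentai

client = documentai.DocumentProcessorServiceClient()

# Input documents to process, read from a Cloud Storage bucket.
input_config = documentai.BatchDocumentsInputConfig(
    gcs_documents=documentai.GcsDocuments(
        documents=[
            documentai.GcsDocument(
                gcs_uri="gs://example-source-bucket/invoice.pdf",
                mime_type="application/pdf",
            )
        ]
    )
)

# Results are written to a bucket and prefix chosen by the caller.
output_config = documentai.DocumentOutputConfig(
    gcs_output_config=documentai.DocumentOutputConfig.GcsOutputConfig(
        gcs_uri="gs://example-destination-bucket/results/"
    )
)

request = documentai.BatchProcessRequest(
    name="projects/example-project/locations/us/processors/example-processor-id",
    input_documents=input_config,
    document_output_config=output_config,
)

# The read of the input bucket and the write of the output are performed by
# the Document AI service agent rather than by the caller's own identity --
# the broad, project-wide scope of that agent is what the researcher flagged.
operation = client.batch_process_documents(request=request)
operation.result(timeout=600)
```

The practical consequence described in the research is that a caller who is allowed to invoke Document AI, but not to read a given bucket directly, can still have that bucket's contents processed and delivered to a destination they control, which is why the service agent's project-wide reach matters.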