- A former Meta engineer in London allegedly downloaded approximately 30,000 private Facebook images using a custom script to bypass security systems
- Meta discovered the breach over a year ago, terminated the employee, and referred the case to the Metropolitan Police
- The ICO is aware of the incident and data protection experts say Meta could face regulatory consequences if found to lack adequate safeguards
On April 7, 2026, The Guardian reported that a former Meta employee in London is under criminal investigation for allegedly downloading approximately 30,000 private Facebook images.
According to court papers cited by The Guardian, the engineer is accused of designing a program specifically intended to circumvent Meta’s internal detection systems, allowing him to access and download private user images at scale. The case is being handled by the Metropolitan Police’s cybercrime unit, and the former employee has been released on police bail, with conditions requiring him to report back in May 2026 and to disclose any foreign travel plans to police.
Sky News reported that upon discovery, Meta terminated the employee’s contract, notified affected users, and referred the matter to UK law enforcement. The company also said it enhanced its internal security systems following the incident. The case has drawn renewed attention to the access that platform employees can theoretically wield over user data — and to the question of whether the safeguards in place are sufficient to catch someone who goes looking for it deliberately.
This is not the first time Meta has faced scrutiny over how it handles user data in recent months. As we reported earlier this week, Meta has been navigating a series of internal AI-related security incidents — including an episode where an AI agent’s instructions inadvertently caused sensitive internal data to be exposed to employees. The company has also been dealing with fallout from its AI smart glasses program, which we covered in March, as privacy concerns around the hardware surfaced in multiple markets.
The Script That Bypassed Meta’s Internal Security
The detail that makes this case stand out is the method. According to court papers reviewed by the Press Association, the former Meta engineer did not simply access the images through legitimate channels and forget to log out. He allegedly created a dedicated script designed to route around Meta’s internal detection systems — meaning he thought about this in advance and took active steps to avoid being caught. That level of premeditation typically transforms an access incident into a criminal matter rather than an HR one. ITV News noted that two magistrates recently agreed to vary the man’s police bail conditions, suggesting the case is progressing through the court system.
Jon Baines, a senior data protection specialist at law firm Mishcon de Reya, told The Guardian that when an employee accesses personal data without employer authorization, there is potential for offences under both data protection and computer misuse laws. The key question for Meta’s regulatory exposure is whether the company had appropriate technical and organizational measures in place to prevent or detect unauthorized access.
If it did, the company itself is unlikely to face liability — but if the Information Commissioner’s Office determines those safeguards were insufficient, Meta could be exposed to significant fines or legal claims from affected users.
The Information Commissioner’s Office confirmed it is aware of the incident. An ICO spokesperson said social media users should be able to trust that their personal information is handled responsibly. The regulator’s involvement is notable because it raises the possibility of a formal investigation into Meta’s data protection practices separate from the criminal case against the individual employee.
What This Tells Us About Platform Employee Access
The broader context here is uncomfortable for anyone who has ever assumed their social media data is locked down. Platforms like Meta employ thousands of engineers, content moderators, and support staff who, by necessity, have varying levels of access to user accounts and content. The systems that govern that access — audit logs, anomaly detection, role-based permissions — are only as good as their implementation. Someone with the right knowledge and enough motivation can, in theory, go looking.
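To make the "only as good as their implementation" point concrete, here is a toy sketch of the kind of volume-based anomaly check an audit pipeline might run over employee access logs. This is a hypothetical illustration, not Meta's actual tooling; the log format, names, and threshold are all invented for the example.

```python
from collections import Counter
from statistics import median

def flag_anomalous_access(audit_log, multiplier=10):
    """Flag employees whose access volume is far above their peers'.

    audit_log: list of (employee_id, resource_id) tuples, one per access.
    Flags any employee whose access count exceeds `multiplier` times the
    median count across all employees in the log window.
    """
    counts = Counter(emp for emp, _ in audit_log)
    typical = median(counts.values())
    return [emp for emp, n in counts.items() if n > multiplier * typical]

# Hypothetical log window: most engineers touch a handful of records,
# one account touches thousands.
log = (
    [("eng_a", i) for i in range(5)]
    + [("eng_b", i) for i in range(8)]
    + [("eng_c", i) for i in range(4)]
    + [("eng_x", i) for i in range(5000)]
)
print(flag_anomalous_access(log))  # → ['eng_x']
```

A check this crude is trivially evaded by throttling downloads below the threshold, which is exactly why a purpose-built script designed around known detection logic, as alleged here, is so hard to catch with volume heuristics alone.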
This case is a reminder that data security is not just a technical problem; it is also a human resources and cultural one. Meta’s own statement acknowledged the breach and said protecting user data was a top priority. The company added that it cooperated with the ongoing investigation. What it did not address directly is why an employee with access to that many user images was able to operate undetected for as long as he was.
For Meta, the immediate priority is the criminal proceedings and any regulatory inquiry the ICO may open. For the 30,000 affected users, the practical questions of what was done with the images, and whether they were shared or sold, remain unanswered.

