Meta's Smart Glasses Privacy Disaster: What You Actually Need to Know

There’s a moment when you realize the device you thought was just sitting on your nightstand might be recording you without your knowledge. That’s essentially what happened to Ray-Ban Meta smart glasses owners after a Swedish investigative report dropped in February, and honestly, it’s the kind of privacy violation that feels almost too brazen to be real.

But it is real. Contractors working for a Kenya-based company called Sama have apparently been watching intimate footage captured by these glasses. We’re talking about people having sex, using the bathroom, changing clothes. All of it ending up in Meta’s databases, reviewed by workers who probably didn’t sign up to be voyeurs.

The Report That Changed Everything

Swedish newspapers Svenska Dagbladet and Göteborgs-Posten, along with freelance journalist Naipanoi Lepapa, interviewed over 30 Sama employees who work on data annotation for Meta’s AI systems. The reporters didn’t get direct access to the footage itself, but the interviews were damning enough.

One anonymous employee described watching “a video where a man puts the glasses on the bedside table and leaves the room. Shortly afterwards, his wife comes in and changes her clothes.” Another mentioned seeing users’ partners come out of the bathroom naked.

The frustration from these workers is palpable. One told the reporters: “You understand that it is someone’s private life you are looking at, but at the same time you are just expected to carry out the work.” That’s not just uncomfortable. That’s the sound of someone realizing they’re complicit in a massive breach of trust.

What Meta Actually Says About This

When the BBC asked Meta for comment, the company confirmed it “sometimes” shares content with contractors to improve its AI chatbot. They said this data is “first filtered to protect people’s privacy,” pointing to face blurring as an example. Which, you know, doesn’t exactly erase the problem when people are recording their bank cards and intimate moments without even realizing it.

Meta’s privacy policy for smart glasses does technically disclose that photos and videos get sent to Meta servers when cloud processing is enabled. It also mentions that human reviewers might check your interactions with the Meta AI. But buried in a privacy policy is not the same as being upfront with users about what’s actually happening.

Last August, the company made Meta AI with camera use enabled by default, and it stays on until users manually disable it. Meta claimed at the time that photos and videos aren’t used for training, but that doesn’t mean they’re not being watched by human eyes.

The Red Light Problem

Here’s where it gets even worse. The Ray-Ban Meta glasses flash a red light when recording, which sounds like a reasonable safety measure. Except lots of people don’t notice it. Or they misinterpret what it means. Some users probably have no idea they’re recording at all.

Sama employees noted that people seem to be inadvertently capturing incredibly sensitive material. Bank card numbers. Passwords. Private moments. The kind of data that shouldn’t be anywhere near a corporate database, let alone viewable by contract workers scattered across the globe.

“We see everything, from living rooms to naked bodies. Meta has that type of content in its databases. People can record themselves in the wrong way and not even know what they are recording,” one anonymous employee said.

A class-action lawsuit was filed against Meta and Luxottica (Ray-Ban’s parent company) right after the Swedish report gained traction. The lawsuit takes direct aim at Meta’s marketing slogan for the glasses: “designed for privacy, controlled by you.”

The legal argument is simple and devastating. No reasonable person would interpret that slogan to mean “your intimate home videos will be viewed and catalogued by human workers in Kenya.” Meta made privacy the centerpiece of its entire marketing campaign while deliberately obscuring how the product actually works. That’s not just misleading. It’s textbook consumer deception.

The lawsuit is seeking damages, punitive penalties, and an injunction forcing Meta to change its business practices. Whether it succeeds or not, it’s a clear signal that consumers are done accepting this kind of bait-and-switch on technology privacy.

What This Means for the Broader Business of Smart Glasses

This isn’t just about Ray-Ban Meta anymore. It’s about the entire premise of wearable devices that can constantly record your environment. As Meta reportedly plans to add facial recognition to these glasses as soon as this year, the privacy implications multiply exponentially.

The UK’s Information Commissioner’s Office has already written to Meta about the report. That’s regulatory attention, which tends to matter more than bad press when it comes to actually changing corporate behavior.

Sama, for its part, issued a statement about its GDPR and CCPA compliance, secure facilities, background checks, and worker benefits. All of which is well and good, but none of it addresses the fundamental problem: intimate footage exists in Meta’s systems at all.

You can delete your recordings in the Meta AI app, sure. But how many users know that? How many people even read through the labyrinthine privacy policies that technically disclose all of this?

The real question isn’t whether Meta violated anyone’s trust. They clearly did. The real question is whether anything actually changes because of it, or whether we just accept this as the new normal.

Written by

Adam Makins
