{"id":2525,"date":"2026-03-27T22:18:11","date_gmt":"2026-03-27T22:18:11","guid":{"rendered":"https:\/\/stock999.top\/?p=2525"},"modified":"2026-03-27T22:18:11","modified_gmt":"2026-03-27T22:18:11","slug":"meta-promised-it-wouldnt-spy-on-you-with-its-ai-smart-glasses-a-lawsuit-says-humans-are-watching-you-actually","status":"publish","type":"post","link":"https:\/\/stock999.top\/?p=2525","title":{"rendered":"Meta promised it wouldn&#8217;t spy on you with its AI smart glasses. A lawsuit says humans are watching you, actually"},"content":{"rendered":"<p><img src=\"https:\/\/fortune.com\/img-assets\/wp-content\/uploads\/2026\/03\/GettyImages-2173579322_694e93-e1774556179469.jpg?w=2048\" \/><\/p>\n<p>When Meta opened its Ray-Ban smart glasses up for pre-order, it made one thing clear: your privacy would be secure. \u201cRay-Ban Meta smart glasses are built with privacy at their core,\u201d read a statement released in September 2023. The marketing was unambiguous about your privacy, and the glasses have since turned up everywhere: you might have seen people wearing them around town, in a Super Bowl ad, or even in a court proceeding about child safety on Meta\u2019s own platforms. ICE agents were even reportedly wearing them in the field.<\/p>\n<p>What you might not have seen is, well, yourself caught in the crosshairs of the glasses\u2019 camera. Now, a new investigation\u2014and a federal lawsuit that quickly followed\u2014alleges the company is even less transparent than those thick lenses, claiming it quietly routes users\u2019 footage not just to its AI models but to human workers overseas. These workers have seen everything from people undressing to sensitive financial documents, all supplied by users who opted into data sharing for AI training purposes.<\/p>\n<p>\u201cIn some videos you can see someone going to the toilet, or getting undressed. 
I don\u2019t think they know, because if they knew they wouldn\u2019t be recording,\u201d said one worker, describing the videos he saw from the glasses.<\/p>\n<p>In late February, Swedish publications Svenska Dagbladet and G\u00f6teborgs-Posten published an investigation into Meta\u2019s AI training pipeline, finding that Meta contractors in Kenya help train the artificial intelligence powering the glasses (a lineup comprising the Ray-Ban Meta Wayfarer (Gen 2), the Ray-Ban Display, and the Oakley Meta HSTN models). What they saw was startling.<\/p>\n<p>\u201cWe see everything, from living rooms to naked bodies,\u201d the workers were quoted as saying in the investigation. \u201cMeta has that type of content in its databases.\u201d<\/p>\n<p>Any user who opts into sharing data for AI training purposes effectively allows any part of their life to be recorded and then reviewed, either by the AI the footage is supposed to train or by the humans behind it. That includes footage of people in bathrooms, undressing, watching porn, and, in at least one documented case, a pair of glasses left on a bedside table that captured a partner who had never consented to being recorded.<\/p>\n<p>Meta\u2019s subcontractors\u2014data annotators teaching the AI to interpret images by manually labeling content\u2014also reported viewing users\u2019 credit card numbers and financial documents. At the time of the investigation\u2019s release, Meta responded through a spokesperson, saying: \u201cWhen people share content with Meta AI, like other companies we sometimes use contractors to review this data to improve people\u2019s experience with the glasses, as stated in our privacy policy. This data is first filtered to protect people\u2019s privacy.\u201d<\/p>\n<p>A class action begins<\/p>\n<p>The report triggered legal action. On March 4, plaintiffs Gina Bartone and Mateo Canu filed a class action lawsuit against Meta Platforms Inc. 
(and glasses maker Luxottica of America) accusing the company of violating federal and state laws by failing to disclose that videos captured by the glasses are transmitted to its servers and then to the Kenyan subcontractor for manual labeling. Referencing new privacy bills and regulations prompted by the rise of AI and the surveillance economy, the suit states that \u201cMeta knows this\u201d about the public\u2019s growing concern for its privacy and safety, and that \u201cagainst this backdrop,\u201d Meta released the glasses with a \u201creassuring promise: the Glasses were \u2018designed for privacy, controlled by you.&#8217;\u201d<\/p>\n<p>Brian Hall, a privacy and AI attorney at Stubbs Alderton &amp; Markiles, says the revelations were as predictable as they were alarming. \u201cThat\u2019s horrifying. It\u2019s kind of exactly what we all imagined would happen,\u201d Hall told Fortune. \u201cI\u2019m old enough to remember 10 or 12 years ago when Google had their glasses, and that was a concern about people going into restrooms with them on. We\u2019re kind of right back there now.\u201d<\/p>\n<p>(When Google unveiled its prototype Google Glass in 2013, it ignited a fierce public backlash over surveillance, consent, and the death of anonymity. Bars, restaurants, casinos, and strip clubs banned the device outright, and wearers were mockingly dubbed \u201cGlassholes\u201d).<\/p>\n<p>Hall says the legal liability remains murky, partly because Meta\u2019s own Terms of Service state that data annotators \u201cwill review your interaction with AI, including the content of your conversations with or messages to AI,\u201d and specify this review \u201ccan be automated or manual.\u201d \u201cIf we went and did a close reading of their privacy policy, there\u2019s not going to be anything explicitly that says they don\u2019t do that,\u201d Hall said. 
\u201cIn terms of their legal liability, I don\u2019t know, but it\u2019s certainly a PR liability. This is some of the most sensitive information and imagery that there is out there.\u201d<\/p>\n<p>Hall says his biggest concern isn\u2019t actually the glasses wearers themselves; it\u2019s everyone else caught in the frame. \u201cThe bystanders, the people who are being filmed and identified, they\u2019re the ones that are at risk,\u201d he said. \u201cSadly, our privacy laws are not designed to protect those people. They\u2019re designed to protect the people who are wearing the glasses and their ability to manage their own data.\u201d<\/p>\n<p>In reference to reports of a man using the glasses in a U.K. court to help \u201ccoach\u201d him through testimony, Hall said the risk compounds significantly as Meta reportedly considers adding facial recognition to the glasses. \u201cIt really is moving from a world where today you might be able to see somebody on the street, in a courtroom, in a bar, and you might be able to do some investigation on Facebook and Instagram and find them. But this is instant. It\u2019s automatic, zero effort. You could be sitting in a courtroom identifying witnesses.\u201d<\/p>\n<p>Hall says existing law is simply not built for what Meta\u2019s glasses make possible. \u201cI don\u2019t know that the existing laws are really sufficient to protect us from the risks of the kind of things that Meta and other social media companies are doing right now,\u201d he said. \u201cIt\u2019s sort of getting shoehorned into the privacy laws, but those are rarely enforced as it is, and this is completely upending the whole framework that those were built upon.\u201d<\/p>\n<p>\u201cI\u2019m not seeing that people are meaningfully addressing it in any way,\u201d he said, adding that current regulations are piecemeal and fail to address privacy concerns entirely. 
Once privacy is addressed, he said, \u201ceverything else is just kind of window dressing.\u201d<\/p>\n<p>Meta did not respond to requests for comment.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When Meta opened its Ray-Ban smart glasses up for pre-order, it made one thing clear&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[245],"tags":[3296,881,5905,2571,713,813,4203,5906,214,3295,5907,5908,1427],"_links":{"self":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/2525"}],"collection":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2525"}],"version-history":[{"count":0,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/2525\/revisions"}],"wp:attachment":[{"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2525"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2525"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2525"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}