[HN Gopher] How did Roomba-recorded photos end up on Facebook?
       ___________________________________________________________________
        
       How did Roomba-recorded photos end up on Facebook?
        
       Author : DamnInteresting
       Score  : 102 points
       Date   : 2022-12-19 21:04 UTC (1 hour ago)
        
 (HTM) web link (www.technologyreview.com)
 (TXT) w3m dump (www.technologyreview.com)
        
       | Fission wrote:
       | Since Scale.ai provides labeling services to the US Department of
       | Defense, how do they address the issues presented in this
       | article? Have their labelers go through government background
       | checks? Provide labeling software but not the labor?
        
         | generativeai wrote:
         | They hire cleared labelers in St. Louis. The software likely
         | needs to run on-premises or on government networks.
        
       | jonas21 wrote:
       | The question I want to ask is: how did Roomba-recorded photos end
       | up in a major publication on the Internet?
       | 
       | And the sequence seems to be:
       | 
       | 1. iRobot hires people to use special development versions of the
       | Roomba in their homes to collect training data. These are clearly
       | labeled, and the participants are informed that the images are
       | being sent to iRobot for training. This seems fine - if you want
       | to exchange some degree of privacy for money, that should be your
       | right as long as you're clearly informed about it.
       | 
       | 2. A contractor posts some of these photos to a private Facebook
       | group used by other contractors on the project. This is obviously
       | bad, but at the same time, it's limited in scope to people who
       | would have had access to these photos or similar ones.
       | 
       | 3. The MIT Technology Review gets a hold of these images and
       | decides to publish them on the Internet for everyone to see, just
       | to get more clicks on their article. This feels like the most
       | egregious privacy violation in the sequence.
        
         | nonrandomstring wrote:
         | > A contractor posts some of these photos to a private Facebook
         | group used by other contractors on the project.
         | 
         | That's where it went wrong. Everything else seems reasonable
         | for a visual AI training project, well signalled to the
         | participating users and the data securely communicated.
         | 
         | Thereafter, the data was mismanaged.
         | 
         | There is clearly no such thing as a "private" Facebook group.
         | So-called "contractors" [1] using a disservice like Facebook
         | to communicate beggars belief.
         | 
         | [1] people with the unremarkable skill of being able to spot
         | ordinary household objects and label them - so someone probably
         | had the bright idea of creating a CAPTCHA "Find all the women
         | on toilets".
        
         | quantified wrote:
         | Point 3 seems asinine, sorry.
        
       | pessimizer wrote:
       | These are internet-connected devices with cameras. A firmware
       | update could be sent to record everything you do and stream it
       | on YouTube all day, and another applied later to remove any
       | remnant of what happened. Any privacy you have with a device
       | like this comes from the benevolence of the company that has
       | root on it, its lack of a profitable opportunity, or its fear
       | of being caught.
        
       | kache_ wrote:
       | The silver lining is that now I can show this article to anyone
       | who accuses me of being overly paranoid.
        
         | WhackyIdeas wrote:
         | I just wish the companies making these devices had the same
         | level of care with other people's privacy as I would if I was
         | making these devices. It's not right.
         | 
         | At the very least, companies should have to sign an oath to
         | protect their customers and employees - not to abuse them...
         | similar to how health professionals have an oath to do no harm.
         | Is that too much to ask in this world?
        
       | blutack wrote:
       | Valetudo [0] supports local-only operation of various robot
       | vacuums.
       | 
       | Even apart from the privacy stuff, the fast local web interface
       | and open-standards integration support (MQTT, Home Assistant,
       | etc.) are brilliant.
       | 
       | 0: http://valetudo.cloud/
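       | 
       | For illustration, a minimal sketch of what local-only
       | integration can look like: a script that subscribes to a
       | Valetudo-flashed vacuum's MQTT topics on the LAN. The broker
       | address and topic prefix are assumptions made for the example,
       | not values taken from Valetudo's docs or this thread.
       | 
       |     # Sketch: watch a locally controlled vacuum's MQTT updates.
       |     # Assumes a LAN broker at 192.168.1.10 and a "valetudo/"
       |     # topic prefix; both are illustrative - adjust to your setup.
       |     import paho.mqtt.client as mqtt
       | 
       |     BROKER = "192.168.1.10"   # assumed local MQTT broker
       |     TOPIC = "valetudo/#"      # assumed topic prefix
       | 
       |     def on_connect(client, userdata, flags, rc):
       |         # Subscribe to everything the vacuum publishes
       |         # (state, battery, errors, ...)
       |         client.subscribe(TOPIC)
       | 
       |     def on_message(client, userdata, msg):
       |         # Print each update; payloads are plain text or JSON
       |         print(msg.topic, msg.payload[:120])
       | 
       |     client = mqtt.Client()
       |     client.on_connect = on_connect
       |     client.on_message = on_message
       |     client.connect(BROKER, 1883, keepalive=60)
       |     client.loop_forever()
       | 
       | Nothing in that loop ever leaves the local network, which is
       | the point of the project.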
        
       | tintor wrote:
       | Why does a Roomba need a camera looking UP?
       | 
       | Why are they labeling furniture in the home that a Roomba can't
       | possibly reach from the floor?
        
         | cma wrote:
         | Say a couch has shiny metallic legs that mess with the depth
         | estimation. An estimate of where the corners of the couch are
         | could give a better estimate of where the legs are and weight
         | one possibility more than another.
        
         | outworlder wrote:
         | That's for navigation. If you want to be able to tell it to
         | 'clean the living room', it needs to know what the living
         | room is (or some of its landmarks). The robots are low to the
         | ground, so tilting the camera up helps.
         | 
         | That's not the only approach, though. You can look forward
         | (or just use lidar), but this navigation approach seems to be
         | less sensitive to, say, furniture being moved around.
        
       | OGWhales wrote:
       | Importantly:
       | 
       | > All of them came from "special development robots with hardware
       | and software modifications that are not and never were present on
       | iRobot consumer products for purchase," the company said in a
       | statement. They were given to "paid collectors and employees" who
       | signed written agreements acknowledging that they were sending
       | data streams, including video, back to the company for training
       | purposes. According to iRobot, the devices were labeled with a
       | bright green sticker that read "video recording in progress," and
       | it was up to those paid data collectors to "remove anything they
       | deem sensitive from any space the robot operates in, including
       | children."
        
       | cma wrote:
       | Misleading title. They should try to have the title give some
       | indication that it was a development Roomba in a special opt-in
       | data collection program.
        
       | bfeynman wrote:
       | Always knew Scale AI was complete BS and overvalued. The lack
       | of controls and oversight is embarrassing; I can't believe they
       | have govt contracts.
        
         | generativeai wrote:
         | Their gov business is questionable for long-term sustainment.
         | Labeling services are akin to transcription services provided
         | by others like Leidos... it's not a technology business. They
         | got contracts through political connections...
         | 
         | There's major turnover in their federal team...
        
       | andrewxdiamond wrote:
       | Key paragraph for our friends who don't RTFA
       | 
       | > iRobot ... confirmed that these images were captured by its
       | Roombas in 2020. All of them came from "special development
       | robots with hardware and software modifications that are not and
       | never were present on iRobot consumer products for purchase," the
       | company said in a statement. They were given to "paid collectors
       | and employees" who signed written agreements acknowledging that
       | they were sending data streams, including video, back to the
       | company for training purposes. According to iRobot, the devices
       | were labeled with a bright green sticker that read "video
       | recording in progress," and it was up to those paid data
       | collectors to "remove anything they deem sensitive from any space
       | the robot operates in, including children."
       | 
       | Seems like the real story is that training data was leaked,
       | rather than the attention getting "they're watching you"
       | narrative the title suggests
        
         | josephg wrote:
         | > our friends who don't RTFA
         | 
         | I opened the article (on a phone) and no fewer than 3 separate
         | popovers appeared over the content. "Hey! This is our cookie
         | policy" "Happy holidays! We have a special subscription price!"
         | And something else that was covered by the first two before I
         | had a chance to read it.
         | 
         | Thank you for summarising. I noped right out of there out of
         | disgust.
        
         | huhtenberg wrote:
         | Here's a funny bit.
         | 
         | The Roomba iOS app refuses to go past its welcome screen
         | unless it's granted access to location info.
         | 
         | This is unreasonable; they don't _need_ this info for their
         | app to function.
         | 
         | However, their devices are all but unusable without an app,
         | so they ultimately blackmail people into giving them location
         | data.
         | 
         | Meaning they don't really give a sh#t about users' privacy.
         | So it's not that "they are watching you", but that they won't
         | think twice about hooking up to a random Roomba and shooting
         | a video with it. Consent or not.
        
           | renewiltord wrote:
           | I have a few Roombas and no app. I just hit the button and
           | the robot does its thing.
        
           | monocasa wrote:
           | Is iOS like Android where Bluetooth permissions are a part of
           | the location info permissions?
        
         | bonestamp2 wrote:
         | I had a beta unit before they released the first AI-powered
         | model, and my beta unit was set to upload photos. The
         | original goal was to use AI to recognize and avoid things
         | that the vacuum can get caught up in, such as cords under
         | desks and dog poop.
        
         | echelon wrote:
         | Everyone RTFA in this case!
         | 
         | I did not expect to see actual photos of the woman sitting on
         | the toilet in this article. But damn, they're real and
         | published dead center. It's awful and voyeuristic to feature
         | them, but in a way it brings to life the freakishly perverse
         | Orwellian horror of all of this.
         | 
         | This piece hits hard, as it should.
         | 
         | How did neither iRobot nor Scale AI have safeguards against
         | PII
         | of this nature? This is inside people's intimate spaces. It
         | could have been sex. Or children. How did they not think of
         | this?
         | 
         | This sort of disregard for privacy should be punished, and this
         | woman should be able to sue iRobot and Scale AI for a handsome
         | sum.
         | 
         | Maybe they did have some kind of internal data privacy policy
         | or 3rd party policy, but it was wholly inadequate.
         | 
         | My team once had a certain perennial Billboard chart topper's
         | login credentials due to suspected mishandling by one of their
         | team (I'm still afraid to say whom), but you'd better believe
         | we treated it - and all of our customer data - as sacred taboo.
         | Mishandling PII was fireable at minimum, and could probably
         | land us in litigation with a permanent mark against our
         | careers.
         | 
         | We need GDPR/CCPA++ protections here. As an added bonus, the
         | companies that play nice will get a comfortable moat in the
         | form of their compliance.
        
           | operator-name wrote:
           | If you read the article you'll also find the owners were
           | specifically aware that these units are special in that
           | they upload all data - commercial devices do not share any
           | images or video without the user's consent [0].
           | 
           | It's also absurd to think they had no safeguards. We can
           | only speculate whether the individual was fired, or whether
           | stronger policies have been put in place since 2020, but
           | it's naive to expect that whatever policy is in place will
           | stop a human data labeler from smuggling out PII for
           | personal reasons.
           | 
           | [0]: https://homesupport.irobot.com/s/article/964
        
             | quantified wrote:
             | The owner who said ok probably was not the girl with her
             | pants down.
             | 
             | But as a guy, I don't know if girls pull them down farther
             | if they think they're not being watched. She might be the
             | one who agreed to it.
             | 
             | Still, the owner of a space is making the decision for all
             | people who come into that space.
        
         | gretch wrote:
         | I read the article and I don't think your interpretation of
         | the "suggested" narrative is there. The title is "A Roomba
         | recorded a woman on the toilet. How did screenshots end up on
         | Facebook?". That's not really implying a "they're watching
         | you" narrative - who's "they", and if it's a global
         | syndicate, why did they do something as innocuous as putting
         | it on Facebook?
         | 
         | But yes, the real story is that training data (and "real" data)
         | does leak all the time and that most companies don't take
         | insider risk as seriously as they should.
        
           | karmakaze wrote:
           | The part that's missing is that it isn't just "a Roomba",
           | it's "a Roomba labelled with "video recording in progress"".
           | Saying 'a Roomba' implies that it could've been done from any
           | Roomba.
        
             | gretch wrote:
             | The article addresses this head-on. Even though those
             | users once consented to share data, "It's not expected
             | that human beings are going to be reviewing the raw
             | footage."
             | 
             | The meat of the article is that what technology and tech
             | companies are doing is divorced from the expectations
             | that we have as a society.
             | 
             | It couldn't have been done from any Roomba, but it could
             | happen to almost anyone who didn't understand the exact
             | ramifications (which we click through several of every
             | year to try to get the vacuum up and running). That's why
             | a lot of people on HN put masking tape over their laptop
             | webcams. Or are you calling those people paranoid?
        
               | adamrezich wrote:
               | How did we get to the point where people have Internet-
               | connected cameras and microphones everywhere in their
               | houses, yet implicitly trust that anything recorded
               | will never, under any circumstances, be viewed by
               | another human being?
        
               | waffleiron wrote:
               | I think the second part of your sentence is at least a
               | partial answer to your first part.
        
           | twelve40 wrote:
           | It's great that you personally are thorough enough to dig
           | into the article and see for yourself - my favorite: a data
           | set of size 1 and conclusions drawn from it - but half the
           | people even here, let alone outside of HN, will just grab
           | the headline and run with it, saying "I heard Roombas spy
           | on people", and that's a problem. And the people who wrote
           | this article know this full well.
        
       | [deleted]
        
       | jdlshore wrote:
       | The answer: special robots used during development to train ML
       | image classifiers. Presumably leaked by human gig workers in
       | Venezuela who were hired to perform the image labeling.
       | 
       | > All of them came from "special development robots with hardware
       | and software modifications that are not and never were present on
       | iRobot consumer products for purchase," the company said in a
       | statement. They were given to "paid collectors and employees" who
       | signed written agreements acknowledging that they were sending
       | data streams, including video, back to the company for training
       | purposes. According to iRobot, the devices were labeled with a
       | bright green sticker that read "video recording in progress," and
       | it was up to those paid data collectors to "remove anything they
       | deem sensitive from any space the robot operates in, including
       | children."
        
       | hobbitstan wrote:
       | Yeah, I'm not sure I buy their explanation about special
       | development Roombas, since they offered zero proof. I have an
       | ancient Roomba that's still going and dread having to replace it
       | one day.
        
         | butlerm wrote:
         | There is no possible way they could prove anything about this
         | in a form adequate for a short article. You either trust that
         | the writers are responsible journalists representing the truth
         | of the matter to the best of their ability or you don't.
         | 
         | That goes for the device manufacturer as well. They couldn't
         | possibly prove the fidelity of their statements except on a
         | witness stand under the penalty of perjury. So unless you think
         | they are conducting the type of conspiracy that could see some
         | of them sent to prison, we might just have to trust that they
         | don't defraud the public on a regular basis.
        
         | Someone1234 wrote:
         | You could buy a lidar-based robot vacuum instead of one with
         | a visible-light camera. The only benefit I know of that
         | visible light provides is so-called "poop detection" and
         | avoidance, but that's somewhat unreliable anyway.
         | 
         | Lidar, in theory, could create a photo-like image, but that
         | resolution costs money and none of these robot vacuums are
         | anywhere near _that_. Plus they map depth, not texture, so
         | anything they do create is somewhat abstract.
        
         | rocket_surgeron wrote:
         | >Yeah, I'm not sure I buy their explanation about special
         | development roombas since they offered zero proof.
         | 
         | It is simple for any person with even basic knowledge of
         | networking to independently come to the conclusion that Roombas
         | are not uploading video streams (or photographs) to the
         | internet.
         | 
         | I know the IP (10.0.0.11) and MAC (50:14:79:1E:AB:6B) address
         | of my Roomba and using the Insight Netflow Analyzer for
         | OPNsense I can see how much data it has sent to the internet.
         | In the last six months it has sent approximately 72MB of
         | data outside my network. That's roughly 400KB per day.
         | 
         | It has received much more, presumably firmware downloads.
         | 
         | This is consistent with firmware update checks, notification
         | traffic, and me periodically adjusting its schedule remotely.
         | 
         | That's just me clicking on some tabs in my router's web UI.
         | Hundreds if not thousands of people globally are constantly
         | reviewing and monitoring Roomba network traffic in fine detail
         | in order to understand and/or reverse engineer it for research
         | and other purposes.
         | 
         | So one of three things is happening:
         | 
         | 1. All Roombas send photo and video streams to iRobot and they
         | have thus far managed to hide this from the public and the
         | thousands of eyeballs constantly monitoring the network traffic
         | of their products, or
         | 
         | 2. A subset of Roombas send photo and video streams to iRobot
         | and they have thus far managed to hide this from the public and
         | the subset of eyeballs monitoring the network traffic of their
         | products, or
         | 
         | 3. These are development devices like they claim.
         | 
         | Based on my own experience we can eliminate 1; based on the
         | images accompanying the article, option 3 is highly likely.
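         | 
         | For anyone who wants to reproduce that kind of check without
         | a netflow-capable router, a rough sketch (an illustration,
         | not the Insight/OPNsense setup above): total the bytes a
         | single device sends to non-private addresses, matching frames
         | by the vacuum's MAC. It has to run somewhere that can see the
         | vacuum's traffic, e.g. the router itself or a mirrored switch
         | port.
         | 
         |     # Rough sketch: count a device's upstream internet bytes.
         |     # The MAC is the one quoted above; run with capture
         |     # privileges on a box that sees the vacuum's traffic.
         |     import ipaddress
         |     from scapy.all import sniff, IP
         | 
         |     ROOMBA_MAC = "50:14:79:1e:ab:6b"
         |     total_bytes = 0
         | 
         |     def count(pkt):
         |         global total_bytes
         |         if IP not in pkt:
         |             return
         |         if ipaddress.ip_address(pkt[IP].dst).is_private:
         |             return  # skip LAN-internal traffic
         |         total_bytes += len(pkt)
         |         print(f"{total_bytes} bytes sent off-network so far")
         | 
         |     # BPF filter keeps only frames sourced from the vacuum
         |     sniff(filter=f"ether src {ROOMBA_MAC}", prn=count,
         |           store=False)
         | 
         | Left running for a few days, the total should stay in the
         | hundreds of kilobytes per day if the traffic really is just
         | update checks, notifications, and schedule syncs.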
        
         | Karunamon wrote:
         | You're operating in a framework where you already think they
         | are lying. What proof would you even deem acceptable?
        
         | quantified wrote:
         | You don't need to replace it, really.
        
         | operator-name wrote:
         | Their policy (https://homesupport.irobot.com/s/article/964)
         | seems pretty clear, and from what I've seen of the app it
         | explicitly asks before uploading photos. If you're concerned
         | you can turn the recognition feature off, or block it from the
         | Internet and trigger it manually.
        
       | louison11 wrote:
       | Clickbait. There is no news here. No privacy was infringed. This
       | was a private dev version.
        
         | radicaldreamer wrote:
         | Even those who opted into this data collection certainly
         | didn't opt in to it being posted publicly on Facebook?
        
         | kelnos wrote:
         | Pretty sure none of the users consented to photos of them on
         | the toilet being posted on Facebook (and now a news article;
         | wtf was the Tech Review thinking).
        
         | taylorius wrote:
         | Well the picture of the woman on the toilet is in the article,
         | so it's fairly safe to say her privacy has been infringed.
        
       | radicaldreamer wrote:
       | Scale AI contractors likely leaked these. Regardless of how
       | they came to be posted to Facebook, it still seems like
       | iRobot's responsibility to keep this data under wraps.
       | 
       | We see similar stories all the time, whether it's companies
       | leaking data that was collected with consent, data collected
       | without consent, or data collected without anyone even knowing
       | about it.
       | 
       | Even Apple has been caught recording via HomePods without
       | consent.
        
       ___________________________________________________________________
       (page generated 2022-12-19 23:00 UTC)