[HN Gopher] Every reported mistaken arrest using facial recognit...
       ___________________________________________________________________
        
       Every reported mistaken arrest using facial recognition, person has
       been Black
        
       Author : thunderbong
       Score  : 85 points
       Date   : 2023-08-06 21:40 UTC (1 hours ago)
        
 (HTM) web link (www.businessinsider.com)
 (TXT) w3m dump (www.businessinsider.com)
        
       | 10g1k wrote:
       | Light = contours = facial recognition...
        
       | cltby wrote:
       | > "We believe this results from factors that include the lack of
       | Black faces in the algorithms' training data sets..." the
       | researchers wrote in an op-ed for Scientific American.
       | 
       | > The research also demonstrated that Black people are
       | overrepresented in databases of mugshots.
       | 
       | The sort of clear-headed thinking that makes the AI bias field as
       | respected as it is.
        
         | lozenge wrote:
          | It makes sense to me? The algorithm specialises in
          | distinguishing between the faces in its training set. It
          | works by dimensionality reduction. If there aren't many
          | black faces in that set, it may dedicate only a few of its
          | dimensions to "distinguishing black face features".
          | 
          | Then if you give it a task that only contains black faces,
          | most of those dimensions go unused.
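          | 
          | A toy sketch of that intuition (synthetic vectors standing
          | in for face embeddings, scikit-learn PCA, made-up sizes):
          | fit the projection on a 95/5 imbalanced set and the
          | underrepresented group is reconstructed noticeably worse.
          | 
          |     import numpy as np
          |     from sklearn.decomposition import PCA
          | 
          |     rng = np.random.default_rng(0)
          |     dim = 100
          | 
          |     # each group varies along its own set of directions
          |     basis_a = rng.normal(size=(10, dim))
          |     basis_b = rng.normal(size=(10, dim))
          | 
          |     def make(basis, n):
          |         return rng.normal(size=(n, 10)) @ basis
          | 
          |     # training set: 95% group A, 5% group B
          |     train = np.vstack([make(basis_a, 950),
          |                        make(basis_b, 50)])
          |     pca = PCA(n_components=10).fit(train)
          | 
          |     def recon_error(x):
          |         back = pca.inverse_transform(pca.transform(x))
          |         return float(np.mean((x - back) ** 2))
          | 
          |     # group B's directions are barely captured by the
          |     # learned components, so its error is much larger
          |     print("A:", recon_error(make(basis_a, 200)))
          |     print("B:", recon_error(make(basis_b, 200)))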
        
           | cltby wrote:
           | Are black faces overrepresented or underrepresented?
           | According to AI researchers, we're faced with Schrodinger's
           | Mugshot--there's simultaneously too many and too few!
        
             | BoorishBears wrote:
              | There are two different datasets, but your conviction
              | that there's some political undertone to this is
              | leaving you unable to process basic logic.
        
         | zerocrates wrote:
          | The actual quote that the article's mention refers to:
         | 
         | "Using diverse training sets can help reduce bias in FRT
         | performance. Algorithms learn to compare images by training
         | with a set of photos. Disproportionate representation of white
         | males in training images produces skewed algorithms because
         | Black people are overrepresented in mugshot databases and other
         | image repositories commonly used by law enforcement.
         | Consequently AI is more likely to mark Black faces as criminal,
         | leading to the targeting and arresting of innocent Black
         | people."
         | 
         | So they're saying that simultaneously the _training_ set has
         | too few black faces and the set being _compared_ against has
         | too many.
        
         | sonofhans wrote:
         | Yeah, that's terrifying. It's the same reasoning as, "Of course
         | more Black people are convicted of crimes -- we arrest more of
         | them!"
        
         | marsven_422 wrote:
         | Maybe they do more crimes
        
       | [deleted]
        
       | rootusrootus wrote:
       | Is it plausible that cameras have more trouble with dark skin
       | because of reduced contrast of facial features? Or is that
       | already studied and discarded, and this is just a training set
       | problem?
        
         | jxf wrote:
         | That's very plausible. But then that means there is an inherent
         | bias against people with darker skin. So the more interesting
         | question is: why wasn't that bias anticipated and caught, and
         | if it was known, why was it allowed to go to production where
         | it affected the lives and liberty of one group of people
         | disproportionately?
        
           | vuln wrote:
            | Because the people selling the software reported the
            | false positive rate as x per y but just "accidentally"
            | forgot to report that ALL of the false positives are
            | from one single group. I would love to be in a sales
            | meeting between a company that develops this type of
            | software and the police who buy it.
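            | 
            | A minimal sketch of how that works (hypothetical match
            | records, made-up rates): the aggregate false positive
            | rate looks small while one group absorbs all of it.
            | 
            |     # (group, was_false_positive) -- illustrative only
            |     matches = ([("A", False)] * 900
            |                + [("B", False)] * 80
            |                + [("B", True)] * 20)
            | 
            |     overall = sum(fp for _, fp in matches) / len(matches)
            |     print(f"overall: {overall:.1%}")   # 2.0%
            | 
            |     by_group = {}
            |     for group, fp in matches:
            |         fps, total = by_group.get(group, (0, 0))
            |         by_group[group] = (fps + fp, total + 1)
            | 
            |     for group, (fps, total) in by_group.items():
            |         # A: 0.0%, B: 20.0%
            |         print(f"group {group}: {fps / total:.1%}")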
        
           | jurimasa wrote:
           | That's easy: racism.
        
         | ruined wrote:
         | it is definitely part of the problem. historically, film was
         | designed for optimal rendering of light skin, with little
         | consideration given to darker skin. that trend has continued
         | with digital sensors and modern image processing. any
         | photographer or videographer that regularly images black skin
         | is aware of this.
         | 
         | https://www.npr.org/sections/codeswitch/2014/04/16/303721251...
         | 
         | yes, it is also a training set problem. training on real-world
         | data will reproduce racial bias, because racial bias exists in
         | the world. racism in, racism out.
         | 
         | but the more direct cause of this kind of thing is the racism
         | of the police using these tools, the states deploying these
         | tools, and the private enterprises making these tools.
        
       | NotSuspicious wrote:
       | > All six known reports of false arrests due to facial
       | recognition technology were made by Black people.
       | 
        | It's fascinating how the sample size considered sufficient
        | differs depending on who wants what policy to be enacted.
        | What are the false arrest rates due to people just looking
        | at a picture and thinking "That's the one"?
        
       | Animats wrote:
       | There's work going on to look a little further into the infrared
       | to fix this.[1] See the image set "Sample database images in all
       | three spectra". High dynamic range images help, a lot.
       | 
       | [1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10320199/
        
       | bandrami wrote:
       | We can't even get automatic hand dryers to work for nonwhite
       | people so I think usable face recognition is still a ways off
        
         | tasogare wrote:
          | Given that automatic hand dryers are ubiquitous in Japan,
          | they certainly work for a large share of nonwhite people.
        
       | henearkr wrote:
        | The facial ID standards should move towards adding a 3D scan
        | of the face (especially now that biometrics are digitized and
        | stored on chips in ID cards and passports). That would solve
        | this kind of problem.
        | 
        | Neural networks could then just add that to their training
        | and be accurate again, and surveillance systems could use a
        | complementary lidar.
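        | 
        | One hedged sketch of what "adding a 3D scan" could look like
        | in a matcher: fuse a 2D embedding with depth-derived features
        | before comparing. The shapes, threshold, and cosine matcher
        | here are illustrative assumptions, not part of any ID
        | standard.
        | 
        |     import numpy as np
        | 
        |     def fuse(embed_2d, depth_feats):
        |         # normalize each modality, then concatenate
        |         e = embed_2d / np.linalg.norm(embed_2d)
        |         d = depth_feats / np.linalg.norm(depth_feats)
        |         return np.concatenate([e, d])
        | 
        |     def same_person(a, b, threshold=0.7):
        |         # cosine similarity; the threshold is made up
        |         sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        |         return sim > threshold
        | 
        |     # hypothetical enrolled template vs. live capture
        |     enrolled = fuse(np.random.rand(512), np.random.rand(128))
        |     probe = fuse(np.random.rand(512), np.random.rand(128))
        |     print(same_person(enrolled, probe))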
        
       | jacknobody wrote:
       | In northern Cape York (Australia) I have several friends who
       | cannot be successfully photographed because their skin is so dark
       | that cameras are unable to resolve their facial features. I don't
       | expect facial recognition to be much use on Cape York.
        
         | staticautomatic wrote:
         | Pretty sure wedding photographers solved this problem a long
         | time ago, though I hope none of them let the facial recognition
         | vampires in on the mysterious secrets of basic lighting and
         | exposure.
        
           | quicklime wrote:
           | If they paid as much money for each photo in the training
           | data as people do for wedding photos, I'm sure the problem
           | would go away.
        
           | Tozen wrote:
            | Exactly. This issue is very solvable and people have
            | been using various solutions for many years. The hurdle
            | is not technology, it's other factors.
        
         | Tozen wrote:
          | There is such a thing as lighting, using brighter lights,
          | and processing images[1][2][3]. From a technological and
          | photography point of view, these types of issues can be
          | resolved. Often the issue is a refusal to create a
          | different setup for darker subjects, or stereotypes and
          | bias that keep obvious solutions from being found or used.
         | 
         | Furthermore, because people have dark skin, doesn't mean they
         | have the exact same bone structure, facial proportions, or
         | symmetry.
         | 
         | [1]: https://creativecloud.adobe.com/discover/article/10-tips-
         | for...
         | 
         | [2]: https://organicheadshots.com/how-to-photograph-people-of-
         | col...
         | 
         | [3]: https://calgaryjournal.ca/2021/02/28/time-for-a-new-lens-
         | the...
        
         | deepspace wrote:
         | > cannot be successfully photographed
         | 
         | I highly, highly doubt that is the case.
         | 
         | Maybe it is hard to take a good photo with a phone in point and
         | shoot mode. But any professional photographer and most
         | experienced amateurs can absolutely take a good photo of them.
         | 
         | There are only two provisos: they have to use a manually
         | adjustable camera, and they have to be able to control the
         | lighting (even if only by asking the subjects to move around).
        
         | anaganisk wrote:
          | What if the AI is designed to flag them as potential bad
          | actors?
        
           | ipaddr wrote:
           | Flag the person who cannot be photographed? What are we
           | flagging again?
        
             | ithkuil wrote:
              | Clearly all such people that cannot be photographed are
              | the same person
        
       | [deleted]
        
       | tedunangst wrote:
       | Would be interesting to see when false matches are double checked
       | by police and when police simply proceed to the arrest.
        
       | cratermoon wrote:
       | As far back as 1985 Brian Winston wrote[1], "It is one such
       | expression - color film that more readily photographs Caucasians
       | than other human types - that is our concern in this piece."
       | Richard Dyer wrote in 1997, "The aesthetic technology of
       | photography, as it has been invented, refined, and elaborated,
       | and the dominant uses of that technology, as they have become
       | fixed and naturalised, assume and privilege the white
       | subject."[2].
       | 
        | Key findings on bias in facial recognition come from a 2018
        | paper in _Proceedings of Machine Learning Research_ [3],
        | which found,
       | "We evaluate 3 commercial gender classification systems using our
       | dataset and show that darker-skinned females are the most
       | misclassified group (with error rates of up to 34.7%). The
       | maximum error rate for lighter-skinned males is 0.8%. The
       | substantial disparities in the accuracy of classifying darker
       | females, lighter females, darker males, and lighter males in
       | gender classification systems require urgent attention if
       | commercial companies are to build genuinely fair, transparent and
       | accountable facial analysis algorithms."
       | 
       | 1 "A Whole Technology of Dyeing: A Note on Ideology and the
       | Apparatus of the Chromatic Moving Image"
       | <https://www.jstor.org/stable/20025012>
       | 
       | 2 Dyer, Richard. 1997. White : Essays on Race and Culture.
       | London: Routledge.
       | 
       | 3 "Gender Shades: Intersectional Accuracy Disparities in
       | Commercial Gender Classification"
       | <https://proceedings.mlr.press/v81/buolamwini18a.html>
        
         | marsven_422 wrote:
         | [dead]
        
       | RugnirViking wrote:
       | While I have no doubt of the potential for facial recongition
       | systems to have NUMEROUS inbuilt biases, including even IF every
       | actor involved in their creation and deployment held no ill
       | intent. There have only been six such cases apparently. It seems
       | a bit premature for this kind of hedline.
       | 
       | That also being said, this kind of software is just bad in
       | general, even outside of the biases thing.
       | 
       | One more thing. A quote from the article: "Detroit's police chief
       | said their facial recognition technology, when used alone, fails
       | 96% of the time, Insider previously reported." - What?? That's
       | insane. Surely those numbers can't be right? the source on that
       | he goes into more detail with similar specific numbers. I really
       | hope it doesn't only exist to give more ammunition in court, and
       | they don't care whether it works at all...
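        | 
        | A back-of-envelope way to see how such a number is even
        | possible (all figures below are illustrative assumptions, not
        | anything reported about Detroit's system): with a large
        | gallery and a low chance the true subject is enrolled at all,
        | even a tiny per-comparison false match rate means most
        | returned candidates are wrong.
        | 
        |     false_match_rate = 0.001  # per comparison (assumed)
        |     gallery_size = 10_000     # mugshots per search (assumed)
        |     p_enrolled = 0.2          # subject in gallery (assumed)
        | 
        |     false_hits = false_match_rate * gallery_size   # ~10
        |     true_hits = p_enrolled * 1.0  # optimistic: found if there
        |     precision = true_hits / (true_hits + false_hits)
        | 
        |     # roughly 2% of candidates are right, i.e. ~98% "failure"
        |     print(f"correct share of candidates: {precision:.1%}")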
        
         | whaleofatw2022 wrote:
         | AI is the new Polygraph...
         | 
         | Except now it can also be used by academia, employers, and
         | more!
        
       ___________________________________________________________________
       (page generated 2023-08-06 23:00 UTC)