[HN Gopher] King County, WA is first in the country to ban facial recognition software
       ___________________________________________________________________
        
       King County, WA is first in the country to ban facial recognition
       software
        
       Author : sharkweek
       Score  : 159 points
       Date   : 2021-06-02 21:21 UTC (1 hour ago)
        
 (HTM) web link (komonews.com)
 (TXT) w3m dump (komonews.com)
        
       | jvolkman wrote:
       | For those unaware of WA state counties: this includes Seattle and
       | its surrounding metropolitan area.
        
         | geephroh wrote:
          | But unfortunately, the ban only applies to King County
          | agencies, e.g., the KC Sheriff's Department. It does not
          | apply to the numerous independent city jurisdictions within
          | KC, such as Bellevue, Redmond, Kent, Auburn, Snoqualmie,
          | etc., which may have their own law enforcement entities,
          | including the Seattle Police Department. And while Seattle
          | does have more restrictive
         | oversight of facial recognition tech through the Seattle
         | Surveillance Ordinance[1] passed in 2018, there are several
         | local governments located inside the boundaries of KC that have
         | experimented with or have indicated they may implement the
         | tech.
         | 
         | TL;DR, those of us who have been working this issue for several
         | years are gratified with the unanimous(!) vote of the county
         | council, but realize there's still a lot of work to do both at
         | the local and state (and national) levels.
         | 
         | [1]
         | https://www.seattle.gov/tech/initiatives/privacy/surveillanc...
        
       | cactus2093 wrote:
       | Genuine question - why not ban cameras altogether? Or ban the use
       | of computers in police stations, make them write up all their
        | reports by hand. I truly don't understand why there would be a
        | line at facial recognition; a ban is just a law against making
        | a process more efficient.
       | 
       | It's clear to me that there should be a line to prevent fully
       | convicting someone of a crime without any humans in the loop at
       | all. Don't replace the jury or public defenders with robots. But
       | facial recognition could just be a way to look through a large
       | amount of footage to find relevant clips that would then be
       | reviewed in more detail by humans. Without facial recognition,
       | you just pay cops overtime to look through footage themselves. It
       | doesn't necessarily affect the outcome of the process at all.
       | 
        | I think facial recognition just needs some help with its
        | image; it's just a tool that would save tax dollars. Or a
        | means to Defund The Police, if that's your thing.
        
         | paxys wrote:
         | Everything else that you described - computers, digital records
         | - has simple algorithms understandable by the average police
         | officer or citizen. You type a document, it gets saved. You can
         | do a full text search on it. You can pull up an old case and
         | look at photos. You can quickly cross reference. All these
         | tasks could be done step by step by a person and it would just
         | take more time.
         | 
          | When it comes to facial recognition, or AI in general, could
          | anyone really tell you why the computer decided to flag your
          | face and why another similar one was rejected? Would you
          | accept a search warrant when the judgement wasn't based on
          | any human's perception but on something that came out of a
          | black box? Who makes sure that the data sets used to train
          | the models were 100% free of bias?
        
           | zdragnar wrote:
           | Since when do search warrants get automatically approved by a
           | black box? It is on the judge to approve, not a machine.
           | 
           | If that's not enough, make a human sign off that they have
           | confirmed or at least agree with the black box's conclusion,
           | and put the department on the hook if it was obviously wrong.
           | 
            | > Who makes sure that the data sets used to train the
            | models were 100% free of bias?
           | 
           | The box is comparing a source image to an image with a name
           | attached to it. Like you said, no different than a human
           | would do, with all of their own biases in play. We aren't
           | talking minority report here, so there's no reason to think
           | this is a hurdle that would be difficult to overcome.
        
             | 542458 wrote:
             | I mean, FISA warrants had (have?) a 99+% approval rate
             | despite many applications having serious deficiencies.
             | 
             | Utah's e-warrant program has a 98% approval rate. Some
             | warrants are approved in less than 30 seconds.
             | 
             | Warrants across the US are frequently approved on evidence
             | as shaky as a single anonymous tip.
        
         | an_opabinia wrote:
         | > Or a means to Defund The Police, if that's your thing.
         | 
          | Are there many sincere researchers studying flaws in facial
          | recognition advocating its unequivocal ban forever? Joy
          | Buolamwini:
         | 
         | > At a minimum, Congress should require all Federal agencies
         | and organizations using Federal funding to disclose current use
         | of face-based technologies. We cannot afford to operate in the
         | dark.
         | https://www.govinfo.gov/content/pkg/CHRG-116hhrg36663/pdf/CH...
         | 
         | Timnit Gebru in the NYTimes:
         | 
         | > Do you see a way to use facial recognition for law
         | enforcement and security responsibly?
         | 
         | > It should be banned at the moment. I don't know about the
         | future. https://archive.is/JqiqP
         | 
          | Are the flaws the Algorithmic Justice League is finding
          | real? Someone has definitely been wrongly accused by a
          | facial recognition error
          | (https://www.nytimes.com/2020/06/24/technology/facial-
          | recogni...).
         | 
          | Is there certainly an _impression_ that activists want a
          | forever ban? Yes, Joy Buolamwini and Timnit Gebru are
          | frequently represented as advocating against even "perfect"
          | facial recognition.
         | 
         | It's true, lawyers tend to advocate for a forever ban. I don't
         | think these researchers advocate for a forever ban. If you read
         | their raw words, they are much more intellectually honest than
         | the press allows.
         | 
         | Is your line of thinking co-opted by people acting in bad
         | faith? Certainly. You would feel very differently about the
         | deployment of the system if you were so much more likely to be
         | falsely accused by it due to the color of your skin.
         | 
         | Every intellectually honest person's opinion of the justice
         | system eventually matures. You'll come around to delegating
         | authority to researchers over lawyers and police.
        
         | microdrum wrote:
         | This is the right take. Can't ban math. Facial rec is here. If
         | you don't like it, win a public policy debate about making it
         | evidentially weak in front of a judge. Banning facial rec is
         | like saying "you can have security cameras and iPhones, but
         | only human eyeballs can look through them, not computers!"
         | Arbitrary, and doomed to fail.
        
           | diamond_hands wrote:
           | The math is incorrect with Black people's faces more than
           | White people's faces.
           | 
           | "Can't ban math", that's like saying "can't ban words". Yes,
           | but you can ban a combination of words in a location, such a
           | "There's a fire!" in a crowded theater. You can ban a
           | combination of math in a police station that leads to people
           | going to jail.
        
             | tick_tock_tick wrote:
             | > The math is incorrect with Black people's faces more than
             | White people's faces.
             | 
              | That's a limiting aspect of the physical world: less
              | light back = less detail. Flagging footage for manual
              | review doesn't need to be bias-free, just the end
              | component that actually affects the person in the video.
              | 
              | Yelling fire in a crowded theater is legal.....
             | 
             | https://www.theatlantic.com/national/archive/2012/11/its-
             | tim...
        
             | bsenftner wrote:
              | The issue here is a complete failure on the part of the
              | FR industry to impress the importance of human oversight,
              | and then to validate that the human oversight does not
              | suffer from racial blindness. It is pointless if the
              | operators of an FR system cannot distinguish between
              | similar-age siblings or cousins of the ethnicity they are
              | seeking via FR. This is far too often the case, and those
              | police officers operating an FR system could simply be
              | replaced by same-ethnicity operators to receive
              | significantly less bias.
        
         | abeppu wrote:
         | I think the tough part is you can't really ban failing to think
         | in the presence of an algorithm. People see an algorithm
         | produce a result and often assign far too much confidence to
         | it.
         | 
         | "The satnav directions say to turn onto this pier."
         | 
         | "The model says this house is worth 200k less than the ones in
         | the whiter neighborhood a quarter mile away."
         | 
         | "The system recommends the patient have 40% allocated home
         | health care hours per week going forward."
         | 
         | "The algorithm says this grainy footage of someone far away
         | from the camera in the dark and the rain is X."
         | 
         | If you put humans in the loop informed by an algorithm making
         | judgements, the human will often defer to the algorithm. Does
         | the algorithm give an output that indicates its uncertainty?
         | Based on what model? How equipped is the human to consider
         | that?
        
         | matz1 wrote:
          | Right, this is like banning knives because they can be used
          | to kill people. I would much prefer they try to fix whatever
          | it is that causes misery with the use of facial recognition,
          | instead of banning it altogether.
        
         | _jal wrote:
         | I think there are very good reasons to keep the government out
         | of the FR business, at least for now.
         | 
         | Facial recognition is the enabling technology for automated
         | real-world repression. Maybe it is true that we can find a way
          | to tame it for efficiency gains without destroying open
         | society. But right now it looks like a dangerous one-way
         | ratchet.
        
         | FridayoLeary wrote:
         | At least ban speed cameras! I think we can all agree on that.
         | Also, all those cameras that detect when you make illegal turns
         | and send you nasty fines through the post.
        
         | _fullpint wrote:
          | The lack of understanding in this post of the downfalls of
          | facial recognition and its conflicts with the justice system
          | is every reason why it shouldn't be near the justice system.
        
         | [deleted]
        
         | kderbyma wrote:
          | it's about security and convenience.....it's a tradeoff, so
          | it's impossible to achieve both perfectly.
          | 
          | typically you find the balance....right now you are
          | suggesting that the improvement is a convenient way of
          | increasing security, but the reality is that security in one
          | sense can lead to loss of it in another.
          | 
          | example: assuming all things are Good and there is no
          | possibility for corruption, then it would be purely
          | beneficial....but since those aren't controls, and not
          | realistic either, we cannot sacrifice security for
          | convenience in this case.....if that makes sense.
        
         | NegativeLatency wrote:
         | Laws can always be changed, seems sensible to me to prevent
         | this stuff being rolled out when there are so many unresolved
         | problems and abuses with it.
         | 
         | I generally don't want my government to "move fast and break
         | things"
        
         | bsenftner wrote:
          | There is a subtle misdirection when some jurisdiction "bans
          | government use of facial recognition" - the result, whether
          | incidental or by design, is that private agencies are created
          | that law enforcement then contracts with, under less
          | oversight than before.
        
         | kenjackson wrote:
          | I agree. It's hard to understand how this is different from
          | using a computer to detect, for example, credit card fraud.
          | Both pore through tons of private data (typically using
          | algorithms that don't make sense to a layperson -- or maybe
          | even to experts, since they likely use some DNNs) to
          | determine if something problematic may have occurred. Both
          | technologies can be used for nefarious purposes, but both can
          | have safeguards that minimize the likelihood and impact of
          | these purposes.
         | 
         | I'm way more scared of guns than I am of facial recognition and
         | as a country we will never ban guns.
        
         | Arainach wrote:
         | >it's just a tool that would save tax dollars
         | 
         | No, nothing is "just a tool". Surveillance technology enables
         | all sorts of new attacks. Automation makes things feasible.
         | Cheap storage that allows weeks or months of security footage
         | to be preserved changes what's possible. License plate scanners
         | are "just" a technology that could be done with a bunch of cops
         | on corners with notepads, but in aggregate they're a tracking
         | of where everyone is and has been that could never be done
         | without the technology. Facial recognition is very similar.
        
           | matz1 wrote:
           | > Surveillance technology enables all sorts of new attacks
           | 
            | Sure, but like any other tool it also enables all sorts of
            | new benefits.
            | 
            | The preferable action would be to take advantage of the
            | benefits while also trying to fix whatever it is that
            | causes problems with the tool, instead of simply banning
            | it.
        
             | wizzwizz4 wrote:
             | That's what they're doing.
             | 
             | The _problem_ is automated identification of faces. The
             | solution is banning that.
        
               | matz1 wrote:
               | >The problem is automated identification of faces
               | 
                | What's the problem with this?
        
               | lewdev wrote:
               | I don't think that's the problem. The problem is that the
               | justice system believes that it's enough to convict
               | someone of a crime.
        
             | dabbledash wrote:
             | The problem is that constant surveillance of people makes
             | exerting power over them trivial.
        
               | matz1 wrote:
                | So then that is the actual problem; try to fix that
                | problem instead of banning it. Allowing people to own
                | guns also makes it trivially easy to kill people. So
                | ban guns?
        
               | dabbledash wrote:
               | If there's a way to make it impossible for the government
               | to use facial recognition to monitor people other than
               | legally banning them doing so, it's hard to imagine it.
        
               | matz1 wrote:
                | Why would I want to make it impossible for the
                | government to monitor people?
               | 
               | government monitoring people by itself is not a problem,
               | government monitoring people _to harm_ people is the
               | problem.
               | 
                | What needs to be fixed is the usage of the tools to
                | harm people, not the tool itself.
        
               | dabbledash wrote:
               | The only thing that keeps them from harming you is your
               | power relative to them. Handing them more power only
               | increases the probability that they will abuse what
               | powers they do have.
               | 
               | I thought the election of Trump would disabuse people of
               | this idea that it's smart to hand more and more power to
               | central authorities. But at this point I think it's
               | hopeless. People just want the state to be Daddy, and
               | assume or hope that its power will always be wielded by
               | people they like against people they don't.
        
         | tyingq wrote:
          | > I think facial recognition just needs some help with its
          | image
         | 
         | They are making their own bed in many cases. Here's one of
         | several interesting moves by Clear View AI, for example:
         | https://twitter.com/RMac18/status/1220876741515702272
        
         | didibus wrote:
          | I think it's a matter of accuracy and bias. If the tech
          | keeps making false accusations, especially against the same
          | kind of people over and over, then a ban might be in order.
         | 
         | You could argue that humans make similar mistakes and bias, but
         | the scale is always smaller when humans are involved, just
         | because of how slow they are.
         | 
          | Say a model is wrong 37% of the time, and so are humans, but
          | the model runs 1000 times a minute, whereas humans perform
          | the task 10 times a minute. That means the model makes 370
          | false accusations a minute, whereas a human makes only about
          | 4 a minute.
         | 
         | In effect, the number of people you bothered falsely accusing
         | is still less when done manually, simply because it can't
         | scale.
         | 
         | Lastly, there's an argument that the police shouldn't be too
         | efficient, because some crimes might be better off existing in
         | a grey area.
         | 
          | People do fear police supremacy and police states, the idea
          | that they can find you for jaywalking or jumping a fence as
          | a dare, etc. Or that the state could become authoritarian,
          | but you'd have no way to fight back as rebels due to the
          | advanced surveillance tech the police would have, etc. I'm
          | not saying those fears are justified, but I think it plays
          | into it: people are fine saying that only the highest-risk
          | cases are worth pursuing, but if a tech comes along that lets
          | police pursue all cases, it might start to feel abusive to
          | people.
         | 
          | P.S.: I'm neither against nor in support of a ban here; I'm
          | still making up my mind, just relaying some of the reasons I
          | know of for being for the ban.
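The scale arithmetic in the comment above can be sketched directly. All the numbers (37% error rate, 1000 vs. 10 decisions per minute) are the commenter's hypothetical figures, not real data:

```python
# A back-of-the-envelope sketch of the scale argument above.
# All figures are the hypothetical numbers from the comment.

def false_accusations_per_minute(error_rate: float,
                                 decisions_per_minute: float) -> float:
    """Expected false accusations/min = error rate x throughput."""
    return error_rate * decisions_per_minute

ERROR_RATE = 0.37   # assumed identical for the model and for humans
MODEL_RATE = 1000   # model decisions per minute
HUMAN_RATE = 10     # human decisions per minute

model_fp = false_accusations_per_minute(ERROR_RATE, MODEL_RATE)   # ~370
human_fp = false_accusations_per_minute(ERROR_RATE, HUMAN_RATE)   # ~3.7

# Same error rate, ~100x the false accusations, purely from throughput.
print(f"model: {model_fp:.0f}/min, human: {human_fp:.1f}/min")
```

The point of the sketch is that with identical per-decision accuracy, the harm scales linearly with throughput, which is the commenter's argument for why automation changes the picture.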
        
           | kenjackson wrote:
            | > I think it's a matter of accuracy and bias. If the tech
            | keeps making false accusations, especially against the same
            | kind of people over and over, then a ban might be in order.
           | 
            | It seems like we should then ban eyewitness testimony. That
            | has tons of accuracy issues and bias.
        
         | mandelbrotwurst wrote:
          | Would you make a similar argument that allowing the police
          | to track the location of all citizens 24/7 is just making
          | the process of having officers follow people around more
          | efficient?
        
         | anigbrowl wrote:
         | Sure...just give the public the same access that police have.
         | Then if dirty cops break the law or mislead the public,
         | everyone will know.
        
         | throwawaysea wrote:
         | I agree with you - this is just forcing policing to be more
         | expensive and inefficient. It would be absurd to ban a police
          | department from using Microsoft Office, for example, and
          | instead force them to track data in physical spreadsheets.
          | Banning
         | facial recognition is equivalent to forcing a police department
         | to hire enough staff to monitor county-wide feeds and match
         | feeds against a long list of criminals they're looking out for.
         | With humans in the loop and requirements for when facial
         | recognition can be used, I feel like there isn't a "real
         | problem" here. But when we look at quotes from this article,
         | for example the person quoted from Latino Advocacy who is
         | concerned about ICE enforcing the law against illegal
         | immigrants, it's clear that the motivation to ban facial
         | recognition is really more political in nature - it's about
         | letting some people get away with breaking the law more so than
         | fundamental concerns with facial recognition.
        
       | ericls wrote:
       | Nice! Ban recommendation algorithm next!
        
       | throwaway1959 wrote:
       | My instinct for self-preservation tells me that this is not a
       | good thing. I understand the need for privacy, but what happens
       | if somebody puts a gun (or a knife) to your face? I think that
       | the need for privacy could be solved through the legislation: we
       | can have very severe restrictions on who could look at this data
       | and why. Also, we can have severe restrictions on the
        | admissibility of such data in court. Unfortunately, I have not
        | seen any credible efforts from politicians, right or left, to
        | introduce privacy protections against surveillance abuses.
        
         | cryptoz wrote:
          | The article and discussion are not about privacy. The people
          | against facial recognition are against it, at least in this
          | case, because it is racist - or at least, it produces racist
          | outcomes.
         | 
         | Removing _bias_ from facial recognition is the problem you
         | would have to solve to appease the concerns right now, not
         | privacy.
         | 
          | When innocent minorities are getting locked up because the
          | software was trained on poor data, the outcome of using the
          | software is a racism-entrenched legal and justice system.
         | 
         | Which is why people are fighting against it.
        
           | lawnchair_larry wrote:
           | Someone should let these people know that nobody gets put in
           | jail based on the facial recognition's decision, so their
           | "concerns" are impossible. Not only that, but if anything,
           | it's less likely to find darker skin tones at all, so it will
           | favor minorities, not hurt them.
           | 
           | It's a shortcut for manually digging through databases to
           | identify people. Any identification is followed up with
           | investigation, just as it would be if a human matched it. No
           | decision is made by the machine.
           | 
           | So, no, it's not racist at all.
        
             | loeg wrote:
             | > Someone should let these people know that nobody gets put
             | in jail based on the facial recognition's decision, so
             | their "concerns" are impossible. Not only that, but if
             | anything, it's less likely to find darker skin tones at
             | all, so it will favor minorities, not hurt them.
             | 
             | The article directly contradicts both of your claims:
             | 
             | > "Facial recognition technology is still incredibly biased
             | and absolutely creates harm in the real world," said
             | Jennifer Lee with ACLU Washington. "We now know of three
             | _black men_ who have been wrongfully arrested _and jailed_
             | due to biased facial recognition technology.
        
             | IncRnd wrote:
              | I don't think you read the article. It contains examples
              | to support claims that are the opposite of yours, which
              | have no supporting evidence.
        
         | tkzed49 wrote:
         | > what happens if somebody puts a gun (or a knife) to your
         | face?
         | 
         | Nothing. Either they mug you and leave or you get injured (or
         | they didn't see the cop standing behind them.) Facial
         | recognition is not going to solve that problem.
         | 
         | I'm not informed on the issue, but I'd imagine that preventing
         | them from buying the technology is easier than tightly
         | controlling its use.
        
         | diamond_hands wrote:
          | We have survived as a society for a long time without the
          | need for this.
         | 
         | You could say the same thing about the 1st, 4th, and 5th
         | amendments. "what about the children"
        
           | throwaway1959 wrote:
            | You may be right. Facial recognition does seem to implicate
            | the 4th Amendment, at least. But then, what if it happens in
            | a public place? I don't know the answer to that one. I fear
            | that in the age of social media and Antifa, the protections
            | that we had before are no longer enough. Now we have
            | additional actors on the streets who may turn to physical
            | violence on a dime. I feel that the streets should be free
            | from physical violence.
        
             | geephroh wrote:
             | Well, there's another amendment to the US constitution that
             | is a substantial contributor to the level and severity of
             | physical violence on our streets. But we won't talk about
             | that...
        
               | akomtu wrote:
               | You mean the rampant hate speech and misinformation
               | enabled by 1A?
        
               | tick_tock_tick wrote:
                | Ok, I'm at a loss: what amendment don't we talk about
                | that leads to violence on the streets? Are you just
                | trying to be cute and talking about the 13th?
        
       | 1cvmask wrote:
        | Another clickbait article. Beware. Facial recognition was
        | already banned by SF in the US.
        | 
        | But it's an evolving technology. And susceptible to
        | manipulation as well. Watch the avant-garde comedy by the
        | creators of South Park:
       | 
       | https://www.youtube.com/watch?v=9WfZuNceFDM
        
         | jjulius wrote:
          | I mentioned it in another response in this thread, but while
          | it's technically correct that SF is both a city and a county,
          | making it the first _actual_ county to ban the tech, it's
          | important to note that this ban covers a much wider area: a
          | population of ~2.2 million compared to ~880,000, and 39
          | towns/cities compared to 1.
         | 
         | With that in mind I don't have much of an issue with the use of
         | the word "county", considering there aren't a whole ton of
         | city-counties relative to the total number of counties in the
         | US.
        
       | benatkin wrote:
       | This might be a good thing for proponents of surveillance. They
       | can wait until some really bad things happen and people beg for
       | facial recognition.
        
       | WalterBright wrote:
       | At last, KC did something right. Hooray! There's nowhere near
       | enough unsolved violent crime to justify the surveillance state.
       | And yes, I have been the victim of an unsolved violent robbery.
       | 
       | P.S. who cares if KC is first or not. What matters is it got
       | done.
        
         | kyleblarson wrote:
          | There may not be enough unsolved violent crime to justify
          | facial recognition, but one thing there is too much of in
          | Seattle, Portland, SF, LA, and NYC is unprosecuted violent
          | crime.
        
           | zorpner wrote:
           | If we forced the cops to wear their badge numbers, we'd know
           | who was committing the violence and they could be prosecuted
           | for it.
        
             | WalterBright wrote:
             | If someone is beating me, I'm not likely to be able to
             | focus on his number and memorize it. When a thug put his
             | gun in my face to rob me, I later could describe the gun in
             | great detail, but not his face.
        
               | madhadron wrote:
                | But with all the smartphones around, someone can
                | hopefully get the officer's badge number while they're
                | beating you.
        
           | femiagbabiaka wrote:
           | Do you have any sources I can read through on this? I'm very
           | interested to hear that this is the case.
        
           | [deleted]
        
         | sharken wrote:
          | It's great to see the US leading the way on this. I hope
          | that Europe takes notice.
        
           | amelius wrote:
           | https://www.bloomberg.com/news/articles/2021-04-21/facial-
           | re...
        
           | vecinu wrote:
            | Specifically _on this_, yes, but keep in mind the US is far
            | behind Europe in terms of civil rights / privacy and other
            | protections for citizens (see GDPR, for example).
            | 
            | I just learned about this "traffic light labelling" going on
            | in the EU and was blown away that this was implemented 4
            | YEARS ago. [1] I'm hoping the US catches up to Europe on
            | everything else; we're far behind.
           | 
           | [1] https://www.euractiv.com/section/agriculture-
           | food/news/traff...
        
       | monocasa wrote:
       | How's gait identification doing these days?
        
       | buildbot wrote:
       | This is great! As a data scientist though, we should go farther
       | and ban using ML anywhere in policing or the justice system. It
       | just has no place in a system that's supposed to presume
       | innocence.
        
       | 1vuio0pswjnm7 wrote:
        | The backstory seems to be that WA state had succumbed to
        | lobbying from Microsoft and had passed a law allowing facial
        | recognition, with limits, in 2020 [1][2][3]. This is an
        | ordinance that only applies to one county. Statewide, it
        | appears the rules are more lax.
        | 
        | Note that other states had already limited the use of facial
        | recognition by law enforcement before California or Washington,
        | e.g., NH in 2013 and OR in 2017 [4][5].
       | 
        | [1] https://www.wired.com/story/microsoft-wants-rules-facial-rec...
        | 
        | [2] https://www.wsj.com/articles/washington-state-oks-facial-rec...
        | 
        | [3] https://housedemocrats.wa.gov/blog/2020/03/12/house-and-sena...
        | 
        | [4] http://www.gencourt.state.nh.us/rsa/html/VII/105-D/105-D-2.h...
        | 
        | [5] https://law.justia.com/codes/oregon/2017/volume-04/chapter-1...
       | 
       | I would bet Microsoft is lobbying in many states regarding facial
       | recognition and privacy laws to try to get the laws they want.
       | The news will report that they are proponents of passing laws
       | governing these issues, but the point is that they want to shape
       | how the laws are written to benefit Microsoft. I can find
       | citations if anyone doubts this is happening.
        
       | Bostonian wrote:
       | Suppose looters ransack a store when it is closed and are caught
       | on video. Why wouldn't you want facial recognition software to
       | identify them? Do you have a right to privacy when you break into
       | a building?
       | 
        | You can support some uses of facial recognition without wanting
        | it to be used to, say, ticket people for jaywalking.
        
         | TMWNN wrote:
          | >You can support some uses of facial recognition without
          | wanting it to be used to, say, ticket people for jaywalking.
         | 
         | Agreed. As xanaxagoras said, this is a political favor to
         | Seattle Antifa.
         | 
         | I presume that any prosecution using facial recognition
         | software is also going to have human beings verifying that
         | video actually matches the face of the accused. In other words,
         | facial recognition software is going to be used as an automated
         | first-pass filter.
        
         | didibus wrote:
          | > Do you have a right to privacy when you break into a building?
         | 
         | To find the person who appears on video footage filmed in the
         | building, you need to spy on everyone and then match the face
         | from the footage against the face of everyone else walking
         | about. All of those other people did not break into the
         | building, yet for this to work, their privacy is expunged by
         | having their movement and faces filmed, tracked and catalogued
         | so that the FR software can cross-match them.
         | 
         | I think there is a privacy argument here. If I didn't commit
         | any crime, maybe it shouldn't be that my face gets recorded in
         | some database alongside my latest location.
        
         | function_seven wrote:
          | > _You can support some uses of facial recognition without
          | wanting it to be used to, say, ticket people for jaywalking._
         | 
         | That's a huge assumption that many of us (opponents to
         | widespread surveillance tech) just don't agree to. I don't
         | think it's possible to hand government this kind of capability,
         | then limit it to a specific set of uses. It always expands in
         | scope, covering more and more use cases until the folks over at
         | Vision Zero[0] make an impassioned plea, "Suppose you could
         | prevent 10 pedestrian deaths a year by enforcing our jaywalking
         | laws better? Why wouldn't you want facial recognition software
         | to protect them?"
         | 
         | Or maybe the bias-free policing people[1] put forward their
         | argument that removing human cops from jaywalking enforcement
         | will increase Equity. It would be a decent proposition! And you
         | end up with every minor thing being automatically caught and
         | ticketed.
         | 
         | That would suck.
         | 
         | "Slippery slope" may be a fallacy in formal logic, but it's
         | damn useful in resisting the march into a dystopian future. Nip
         | this shit in the bud. Make it require some effort to enforce
         | the law.
         | 
         | [0] http://www.seattle.gov/transportation/projects-and-
         | programs/...
         | 
         | [1] https://council.seattle.gov/2017/07/20/council-president-
         | har...
        
           | TeMPOraL wrote:
            | > _"Slippery slope" may be a fallacy in formal logic_
           | 
           | It's not if the slope is, in fact, slippery. At least around
           | here, one fallacy more common than slippery slope is the
           | _slippery slope fallacy fallacy_ - calling out real slippery
           | slopes as fallacious.
        
         | rychco wrote:
         | Yes, everyone has the right to privacy from facial recognition
         | software, including criminals.
        
         | luke2m wrote:
         | Why would I?
        
         | [deleted]
        
         | swiley wrote:
         | >Suppose looters come to ransack a store, don't you want to be
         | able to wave them off with a shotgun?
         | 
         | The potential for abuse is too high.
        
         | greyface- wrote:
         | This is a ban on use of facial recognition by the government.
         | It does not limit store owners in any way.
        
         | verall wrote:
         | I wouldn't want facial recognition software to identify them
         | because I can't understand its failure cases. If it is allowed
         | in court as evidence, prosecutors will talk up the recognition
         | as "state-of-the-art" and juries will be influenced by its
         | opinions.
        
           | ggreer wrote:
           | Do you also apply this reasoning to fingerprinting, DNA
           | analysis, tire prints, and ballistics comparisons? Like most
           | people, I don't understand all of the failure modes involved
           | with those technologies, but they do seem to be helpful tools
           | for bringing violent criminals to justice.
           | 
           | I also think eyewitness testimony has many failure modes. If
           | anything, it's probably less accurate than current facial
           | recognition tech and biased in ways that are harder to
           | determine. Yet I wouldn't want to ban all use of eyewitness
           | testimony.
           | 
           | Banning facial recognition seems like overkill. Instead, it
           | makes more sense to restrict it so that we can get the good
           | parts (catching violent criminals) without the bad parts
           | (oppressive surveillance state). Instead of banning all
           | fingerprinting and DNA analysis, we have rules for how police
           | can use them. Why not have similar rules for facial
           | recognition?
        
         | rantwasp wrote:
         | did you read the article?
         | 
         | also, if you are part of a minority that is frequently
         | misidentified by this "tech" and you end up being harassed by
         | this pos tech, do you still want it used?
        
         | zardo wrote:
         | > Why wouldn't you want facial recognition software to identify
         | them?
         | 
         | Why would I want it?
        
           | mywittyname wrote:
           | Facial recognition isn't very accurate. Based on some of the
           | research I've seen, it's borderline worthless under many
           | circumstances.
           | 
           | I wouldn't want to use it in such a situation because of the
           | likelihood that the person who committed a crime is going to
           | go free while an innocent person might take the fall for it.
        
             | WalterBright wrote:
             | I would like it even _less_ if it was 100% accurate.
             | 
             | Do you really want every move you make logged? It's an
             | incredibly powerful tool to use for oppression. If someone
             | in the government doesn't like you, all they have to do is
             | watch you for a few days. You'll be sure to commit a crime
             | sooner or later, such as jaywalking, or maybe you looked at
             | someone in a suspicious manner, and then they prosecute you
             | to the max.
        
         | anigbrowl wrote:
         | Because I don't want to live in an overbearing police state. I
         | find it weird that you would pick the example of looting rather
         | than some sort of very serious (and irreversible/uninsurable)
         | crime like murder. You are surely aware that technological
         | capabilities lead to feature creep, just as you are surely
         | aware that police departments all over the country now operate
         | military-grade armored cars to little apparent public benefit.
         | 
         | Edit: just to expand on this, here's a press conference from
         | earlier today (sorry about the poor sound):
         | https://twitter.com/DarwinBondGraha/status/14001715920642416...
         | 
         | Here, Oakland, California's Chief of Police admits that police
         | claims about molotov cocktails and crowd demographics that were
         | used to justify the deployment of tear gas and rubber bullets
         | against protesters a year ago actually had no basis in fact.
         | The Chief explained that he received information to that effect
         | via the radio, and then went out and repeated it to the public
         | (via the media) without making any effort to vet its accuracy.
          | (For clarity, he has only been police _chief_ since February;
          | at the time last year, he was acting as a spokesperson for
          | the department). It's arguable that the decision to deploy
         | tear gas escalated the protest into a full-fledged riot; even
         | if you don't think so, it certainly misled the public about the
         | behavior and composition of the protesters, inevitably
         | impacting policy debates and so on.
         | 
         | I feel this is a good example of why the police cannot be
          | trusted with facial recognition tools; false claims can be used
          | to designate large numbers of people as criminal suspects,
         | and that suspect status can then be leveraged to intrude upon
         | their rights. Were California's interim prohibition on facial
         | recognition not in place, chances are that it would already
         | have been deployed to identify large numbers of legitimate
         | protesters on the basis of initial false allegations (ie, lies)
         | made by individual police officers. 33 officers have since been
         | disciplined and no doubt civil litigation will delve deeper
         | into this, but at present the police officers who were
         | disciplined do _not_ have to be identified, despite the fact
         | that they were quite happy to lie in order to violate the
         | rights of that same public.
         | 
         | https://www.ktvu.com/news/oakland-police-chief-apologizes-is...
        
       | anigbrowl wrote:
        | It's not the first county in the country; as noted in the article,
       | San Francisco (which is both a city and a county) instituted such
       | a ban in 2019. California has a statewide ban in place already.
       | It's good news but needlessly and inaccurately sensationalized.
        
         | cryptoz wrote:
         | The linked source is a Sinclair News outlet. They are probably
         | the worst media conglomerate in America honestly. Their news is
         | constant fear and I view anything by them as poisonous
         | information because it is just as likely meant to mislead as it
         | is to inform. Sure, KOMO publishes some real news. They also
         | publish lies and wildly misleading stuff.
         | 
         | https://www.youtube.com/watch?v=_fHfgU8oMSo
         | 
         | https://www.youtube.com/watch?v=GvtNyOzGogc
        
           | anigbrowl wrote:
           | Did not realize that it's a Sinclair affiliate, worth bearing
           | in mind.
        
         | loeg wrote:
         | There's a quote in the middle of the article which was probably
         | inaccurately summarized for the headline:
         | 
         | > We applaud King County for being the first _multi-city
         | county_ in the nation to pass such necessary measures
        
         | jjulius wrote:
         | Eh, I feel like that's being too picky. First, California's
         | "ban" is only a three-year moratorium on FRT in police body
         | cameras, whereas King County's is on "the use of [FRT] software
         | by county government agencies and administrative offices."
         | 
         | Second, and where things get a bit more "technical" is that SF
         | is both a city and a county, yes, but it's only one city in
         | that county. There are 881,549 people in SF county compared to
         | 2,252,782 people in King County[0]. According to each county's
         | Wikipedia page, SF county is 231.91 square miles to King
         | County's 2,307. King County has 39 cities and towns[1] to SF
         | county's 1.
         | 
         | So while yes, you're _technically_ correct, I still think that
          | the headline is accurate as-is. Most of the country's counties
          | are similar to King County (e.g., not a city-county), and it's
         | important to distinguish the fact that this ban covers a
         | tremendously wide area and numerous municipalities.
         | 
         | [0]
         | https://www.census.gov/quickfacts/fact/table/kingcountywashi...
         | 
         | [1] https://kingcounty.gov/depts/health/codes/cities.aspx
        
       | stakkur wrote:
       | _" The legislation bans the use of the software by county
       | government agencies and administrative offices, including by the
       | King County Sheriff's Office."_
       | 
       | So anybody else can use it anywhere, including third-party
       | contractors performing work for any of the above parties.
       | 
       | Also, strangely, I see no mention of hardware or other non-
       | software facial recognition technology.
        
         | TechnoTimeStop wrote:
          | The ridiculousness that is King County at this point borders
          | on conspiracy. Most of it boils down to "insert skin color"
          | bad, all others good.
         | 
         | Our subreddits are cancer and have been known for a while to be
         | the highest manipulated forums for astroturfing on reddit.
         | 
         | Can anyone help? Can we help ourselves?
        
       | avanti wrote:
        | There are the conservatives and the progressives. The
        | conservatives hold up their 1984 bible and say the end of the
        | world is coming. The progressives just go forward as if
        | everything is fine.
        
       | tediousdemise wrote:
       | How will this facial recognition tech ban be enforced? The most
       | common users of the tech are the ones who will be enforcing the
       | ban. Are we honestly expecting the police department to self-
       | regulate?
        
       | [deleted]
        
         | Layke1123 wrote:
          | Shut...up. There is no such thing as antifa, and even if there
          | were, being anti-fascist IS the correct call.
        
       | throwawaysea wrote:
       | Considering how rampant property crime like car break-ins or
       | catalytic converter thefts or shoplifting are in King County,
       | this seems like a really bad decision. I would definitely like to
       | see criminals identified, located, arrested, and sentenced. Not
       | to mention, we just had a whole year of regular rioting in
       | Seattle, with fiascos like CHAZ and daily illegal blockades of
       | highways. This technology makes it much more likely that the
       | police department can actually enforce the law as it exists on
       | the books by identifying and locating these criminals. It also
       | makes it much more likely that home surveillance footage from
       | law-abiding residents can be put to use.
       | 
       | I do not think this technology is intrusive or invasive as the
       | quoted council member claimed. Recording in public spaces is
       | completely legal to begin with. And we can always limit the use
       | of facial recognition by police departments to cases with either
       | a warrant or probable cause, to prevent completely uncontrolled
       | surveillance. The allegations of racial biases are also not
       | meaningful. In practice, false positives in machine vision
       | algorithms are contained by maintaining a human in the loop to
       | verify matches. It is trivial to use this technology in a way
        | that matches human levels of accuracy with this layer of safety
       | in place.
       | 
       | Banning this kind of technology outright feels like a fear-driven
       | decision by luddites. That's a charitable take - a more direct
       | perspective is that the politicians and interest groups
       | supporting this ban are looking to protect their favored groups,
       | which very frankly seems to include criminals.
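
The human-in-the-loop arrangement described above can be sketched as follows. This is a hypothetical illustration, not any vendor's actual pipeline: the embeddings, names, and the 0.8 similarity threshold are all invented for the example. The point is that the software only surfaces *candidates* above a threshold, and each candidate is queued for a human reviewer rather than treated as an identification.

```python
# Hypothetical human-in-the-loop face-matching sketch. All names,
# vectors, and the threshold are made up for illustration.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

def candidate_matches(probe, gallery, threshold=0.8):
    """Return gallery entries similar enough to warrant human review.

    Nothing below the threshold is shown at all, and nothing above it
    counts as an identification until a reviewer confirms it.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery]
    return sorted(
        [(name, s) for name, s in scored if s >= threshold],
        key=lambda t: t[1],
        reverse=True,
    )

# Toy example: one probe vector against a three-entry gallery.
gallery = [("person_a", [0.9, 0.1, 0.4]),
           ("person_b", [0.1, 0.9, 0.2]),
           ("person_c", [0.8, 0.2, 0.5])]
probe = [0.85, 0.15, 0.45]

for name, score in candidate_matches(probe, gallery):
    print(f"{name}: {score:.3f} -> queue for human verification")
```

Here person_b falls well below the threshold and is never surfaced; the other two entries come back as ranked candidates for a reviewer to accept or reject. Whether this layer of safety is sufficient in practice is exactly what the comments above dispute.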
        
       | FridayoLeary wrote:
       | What's the point when they can buy the information from private
       | companies?
        
         | gnopgnip wrote:
         | The CA ban includes the police using facial recognition
         | services from private companies as well
        
           | geephroh wrote:
           | And the same applies to KC governmental authorities as well.
        
       | theknocker wrote:
       | Wow what heroes.
       | 
       | Hey, quick question: Did they also ban Stratfor's gait tracking
       | technology that they piloted for years?
        
       | dalbasal wrote:
       | Why does this apply only to policing? Is it a matter of
       | authority/jurisdiction?
        
         | jjulius wrote:
         | It does not apply only to policing. From the article:
         | 
         | >The legislation bans the use of the software by county
         | government agencies and administrative offices...
        
       ___________________________________________________________________
       (page generated 2021-06-02 23:00 UTC)