[HN Gopher] How the Police Use AI to Track and Identify You
       ___________________________________________________________________
        
       How the Police Use AI to Track and Identify You
        
       Author : kungfudoi
       Score  : 109 points
       Date   : 2020-10-03 18:25 UTC (4 hours ago)
        
 (HTM) web link (thegradient.pub)
 (TXT) w3m dump (thegradient.pub)
        
       | heavyset_go wrote:
       | If you want to see the horrors that result from law enforcement's
       | use of AI and predictive policing, look no further than here[1].
       | 
        | It's several videos, taken from police body cameras, of
        | officers harassing, assaulting and abducting people because
        | they showed up in their system as being related to, or
        | knowing, people who are suspected of crimes but not
        | convicted.
       | 
       | The videos are from a story that made the front page on HN about
       | a month ago[2], if you want more details.
       | 
       | [1]
       | https://projects.tampabay.com/projects/2020/investigations/p...
       | 
       | [2] https://news.ycombinator.com/item?id=24363871
        
         | srtjstjsj wrote:
          | Is there any "AI" in the Tampa issue?
         | 
         | Or "predictive"?
         | 
         | The Tampa program is vindictive policing to harass undesirables
         | into leaving town.
        
         | maze-le wrote:
          | This is incredibly disturbing. I grew up with family
          | members telling me horror stories about state abuse like
          | this, from more than 30 years past, in a country that
          | doesn't exist anymore: the German Democratic Republic. If
          | this isn't classic Stasi Zersetzung ("decomposition", their
          | harassment-by-attrition tactic), I don't know what is.
          | Aren't there any laws against abuse like this? How the hell
          | is this legal?
        
           | kyuudou wrote:
            | I'm glad you've pointed this out. Comparisons to the GDR
            | and the Stasi often get ridiculed as excessive and
            | hyperbolic, when in fact what's currently implemented
            | would've been the Stasi's dream.
           | 
            | If anyone needs a good weekend movie recommendation, _Das
            | Leben der Anderen_ [1] is excellent and quite relevant.
           | 
           | 1: https://en.wikipedia.org/wiki/The_Lives_of_Others
        
       | geraldkleber wrote:
       | This is why we built Trix (trix.co).
       | 
       | It's a consumer-facing photo editing app that uses adversarial AI
       | to manipulate your photos in such a way that companies like
       | Clearview AI can't train facial recognition algorithms off your
       | data. You can download the Android or iOS version at trix.co -
       | we're in beta right now.
       | 
       | Would love to get anyone's feedback!
        
         | searchableguy wrote:
          | What if this works to increase bias against certain groups of
         | people because the facial recognition software isn't trained on
         | them?
         | 
         | I am not sure technology can solve this completely.
        
           | geraldkleber wrote:
            | I very much appreciate that question, and having been in
            | this space for a few years now, it's certainly one that's
            | relevant to the tech as a whole, but less so to our app.
            | 
            | Our technology is simply indexed to a public dataset of
            | 30k individuals, and when our deep learning model
            | scrambles the key points on your photos to confuse the
            | Clearviews of the world, it does so in a random manner.
            | The model truly is a black box in that way.
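            | 
            | A minimal sketch of the general idea (not our production
            | pipeline; the embedding model and decoy identity below
            | are placeholders):
            | 
            |   import torch
            | 
            |   def cloak(image, embed_model, decoy_emb,
            |             eps=0.03, steps=50, lr=0.005):
            |       # image: float tensor in [0, 1], shape (1, 3, H, W)
            |       # embed_model: any pretrained face-embedding net
            |       # decoy_emb: embedding of a random identity from a
            |       #            public reference dataset
            |       delta = torch.zeros_like(image, requires_grad=True)
            |       opt = torch.optim.Adam([delta], lr=lr)
            |       for _ in range(steps):
            |           x = torch.clamp(image + delta, 0, 1)
            |           # pull the cloaked embedding toward the decoy
            |           loss = torch.nn.functional.mse_loss(
            |               embed_model(x), decoy_emb)
            |           opt.zero_grad()
            |           loss.backward()
            |           opt.step()
            |           # keep the perturbation visually small
            |           delta.data.clamp_(-eps, eps)
            |       return torch.clamp(image + delta, 0, 1).detach()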
        
         | chance_state wrote:
         | Do you have any data on how effective this technique is?
        
           | geraldkleber wrote:
           | Not yet - much of this research came out of U Chicago this
           | summer and we just launched our beta. You can download at
           | trix.co (iOS and Android).
        
             | srtjstjsj wrote:
             | So you're Theranosing it?
        
               | geraldkleber wrote:
               | Nope - you should check out the research!
               | 
               | https://sandlab.cs.uchicago.edu/fawkes/
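                | 
                | If you want to try the technique yourself, they also
                | publish a pip package with a CLI (flags as I recall
                | them from the README, so double-check there):
                | 
                |   # install the reference implementation
                |   pip install fawkes
                | 
                |   # cloak every photo in a directory; higher modes
                |   # trade visible artifacts for stronger protection
                |   fawkes -d ./my_photos --mode low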
        
         | polishdude20 wrote:
          | Hmm, trying to download it on Android: I click the link
          | and it starts to take me to Google Play, but it just loads
          | forever and never gets to the download page.
        
           | geraldkleber wrote:
           | Hmm, lemme do some bug squashing. Will reply when it's dealt
           | with!
        
         | _tulpa wrote:
         | I really don't think you can solve technology with more
         | technology...
        
           | geraldkleber wrote:
           | In this instance, we're well convinced we can.
        
         | godelski wrote:
         | This is cool, do you have a technical blog post?
        
           | geraldkleber wrote:
            | No technical blog post yet, but we have some FAQs up
            | that could help out!
           | 
           | https://www.trix.co/faq
        
         | samename wrote:
         | >"We require that you sign up with a valid US phone number to
         | verify your identity as a human for security purposes."
         | 
         | How does collecting personal data improve security? Why is
         | identity verification needed when the point of using the
         | service is to avoid automated identification in the first
         | place?
        
           | bigbubba wrote:
            | Such a database would be a tasty treat for any unsavory
            | company looking to purchase it.
        
             | geraldkleber wrote:
              | This is a measure designed to help us prevent facial
              | recognition companies from gaining programmatic access
              | to our API in order to test against it. In addition,
              | because we are a new startup with limited compute
              | resources, such testing could also harm our throughput
              | for actual users.
              | 
              | Unfortunately, there are far better sources of simple
              | name and phone number data, such as whitepages.com or
              | any other CNAM service, so we doubt we would be a
              | target of an attack for this data, or that this data
              | would be sellable even if we were bad actors. Our
              | perspective is that requiring phone verification
              | allows us to provide a better level of service to
              | customers.
        
         | srtjstjsj wrote:
         | Do you have any evidence that your system works?
         | 
         | "Adversarial AI" doesn't really work against systems like
         | Clearview that aren't using "AI" in the first place.
        
           | geraldkleber wrote:
           | You should check out the research that was done on this. The
           | technology works - we're simply the first to really
           | productize it in this manner.
           | 
           | https://sandlab.cs.uchicago.edu/fawkes/
        
         | vaccinator wrote:
         | Sounds like a fight that can never be won
        
           | geraldkleber wrote:
           | We don't think that's the case!
        
         | Jon_Lowtek wrote:
          | Someone should check them for Facebook scraping, because
          | their ToS and privacy policy are worded in a way that
          | implies that:
         | 
         | > _" By granting Trix access to any Third-Party Accounts, you
         | understand that Trix may access, .. any information .. that you
         | have provided to and stored in such Third-Party Account ("SNS
         | Content") ... all SNS Content shall be considered to be your
         | User Content"_
         | 
         | Some great advice from their own privacy policy:
         | 
         | > _" You should always review, and if necessary, adjust your
         | privacy settings on third-party websites and services before
         | linking or connecting them to Trix's websites or Service."_
         | 
         | Statements like this are red flags:
         | 
         | > _" We may collect metadata associated with User Content.
         | Metadata typically consists of how, when, where and by whom a
         | piece of User Content was collected and how that content has
         | been formatted."_
         | 
         | > _" Trix may transfer information that we collect about you,
         | including personal information, to affiliated entities, or to
         | other third parties "_
         | 
         | Basically they reserve the right to do as they please with your
         | data and all data they can access through services you link
         | with their service.
        
       | ape4 wrote:
        | And why do they scan the demonstrations asking for police
        | accountability? Right away, that sounds like a conflict of
        | interest. You never hear about the AI scanning faces at
        | places where actual criminals hang out, like maybe a pawn
        | shop (possibly selling stolen goods).
        
         | tal8d wrote:
         | Because the "demonstrations" have devolved into riots where
         | "asking" means firebombing. Combine that with local prosecutors
         | refusing to pursue charges, and you've got a situation where
         | the police a highly motivated to identify everybody in black
         | bloc chic.
        
       | blantonl wrote:
        | It won't be long before we see more and more escalating
        | instances of the population using traditional electronic and
        | physical warfare techniques against law enforcement in these
        | "battles." I predict we'll see someone in the general public
        | shoot down a collection drone, or start to actively target
        | the physical collection nodes installed on poles, posts,
        | buildings, etc.
        | 
        | There have already been numerous documented cases of
        | electronic warfare deployed against law enforcement during
        | protests: the jamming of their public safety 2-way radio
        | systems.
        | 
        | Whenever there is an escalation of techniques against the
        | "enemy," the enemy fights back. There will be interesting
        | developments in this space over the next decade. Law
        | enforcement already needs advanced direction-finding
        | equipment, which is traditionally a military capability.
        
       | andreyk wrote:
        | I don't think it's cited in this piece, but there is also
        | this recent piece of news: "Controversial facial-recognition
        | software used 30,000 times by LAPD in last decade, records
        | show"
        | 
        | https://www.latimes.com/california/story/2020-09-21/lapd-con...
        | 
        | I guess it's reasonable for police to use this technology,
        | but not without more transparency and accountability... This
        | year it feels like that's becoming very clear.
        
         | moate wrote:
         | >>I guess it's reasonable for police to use this technology...
         | 
          | Depends on your interpretation of the 4th Amendment. I
          | personally feel it's extremely unreasonable.
        
           | monocasa wrote:
            | You're not the only one. If you read the opinions in
            | United States v. Jones, the Supreme Court draws a pretty
            | big distinction between the kinds of surveillance a
            | police officer could reasonably do on their own and the
            | scaling up that technology allows, and how the sheer
            | amount of information being acquired might push
            | otherwise-allowed warrantless surveillance into requiring
            | a warrant. And that's even for data that could be
            | gathered by watching from public streets, if there were
            | enough officers to watch everyone.
        
       | stanrivers wrote:
       | China perfected this all for us and now the U.S. government can
       | piggyback off of their practices... great.
       | 
       | Serious note - the below is such a scary concept. There is a line
       | somewhere and this feels like you are starting to cross it...
       | both place-based and person-based predictive policing sound ripe
       | for profiling / instigating actions that police can use as a
       | reason to arrest an otherwise harmless person.
       | 
       | "Predictive policing programs are another illustrative example
       | showing how data, surveillance technology, and a system of
       | automated policing work together to spy on, search, and,
       | ultimately, control Americans who have not committed or been
       | convicted of a crime.
       | 
       | Predictive policing is premised on the idea that historical data
       | of crime, demographics, socioeconomics, and geography can be used
       | to forecast future incidents. Knowing where crime is likely to
       | occur again, police try to intervene beforehand and prevent it.
       | 
       | Broadly there are two kinds of "heat maps" produced by predictive
       | policing models: place-based, which uses less data to try to
       | avoid systemic pitfalls of relying on crime and demographic data
       | and surges police into specific areas, and person-based, which
       | tracks and creates a list of "high-risk" individuals by combining
       | a person's criminal history with an analysis of their social
       | network. "
        
         | jancsika wrote:
         | > person-based, which tracks and creates a list of "high-risk"
         | individuals by combining a person's criminal history with an
         | analysis of their social network.
         | 
         | There is no way the software actually achieves this. If it did
         | we would have already heard about a well-off target identified
         | by the software, a target with the means to mount a loud
         | defense against such tracking.
         | 
          | And it cannot be that the software accurately returns such
          | results but LE simply ignores them if the target seems too
          | powerful. In that case we would have already heard from a
          | leaker about a murder by a perpetrator that such a system
          | correctly identified but LE neglected to follow up on.
          | (Think of the value to the company of such evidence!)
         | 
          | No, I'm guessing this system gives LE cover for the same
          | flawed systems they've historically used to target and
          | harass people who don't have the means to loudly mount a
          | defense. Just take the same process that a court has ruled
          | unconstitutional, use its data as input to your ML
          | algorithm, and voila! You've probably got at least one
          | more decade before society figures out you're just
          | appending the words "through a sophisticated AI system" to
          | your outlawed policing.
        
           | [deleted]
        
           | heavyset_go wrote:
            | > _I'm guessing this system gives LE cover for the same
            | flawed systems they've historically used to target and
            | harass people who don't have the means to loudly mount a
            | defense._
           | 
           | This is exactly what happens. If we look at the evidence[1],
           | law enforcement preys upon the poor and minorities who are
           | not even committing crimes, but their predictive policing
           | systems target them for harassment anyway.
           | 
            | [1] https://projects.tampabay.com/projects/2020/investigations/p...
        
           | monocasa wrote:
            | Why would we have heard about any of the failures? Part
            | of the 'blue wall' is intended to hide policing failures
            | from the general populace. Particularly since:
            | 
            | * in the case of targeting a well-off person who they
            | decide to further investigate, they can simply use it as
            | a reason to look harder at someone, use parallel
            | construction to build the case that the defense sees, and
            | simply drop any cases where the defense happens to get
            | too close. This is what they do with stingrays.
            | 
            | * in the case of targeting a well-off person who they
            | decide not to investigate, they can simply tell
            | themselves that the person is a paragon of society and
            | it must be one of the failures of the technology. Like
            | when a boy escaped Jeffrey Dahmer's house, heavily
            | drugged, running down the street, bleeding out of his
            | anus, begging anyone who would listen not to let Jeffrey
            | take him back; the police got involved and gave him back,
            | since Jeffrey was seen as such a paragon of the town.
            | That cop later became head of the police union.
        
           | moate wrote:
           | >>There is no way the software actually achieves this. If it
           | did we would have already heard about a well-off target
           | identified by the software, a target with the means to mount
           | a loud defense against such tracking.
           | 
          | I mean, it's more likely that the system is going to
          | inform you about likely targets for certain crime types
          | (less murder, which is frequently a one-off, and more drug
          | dealing, which repeats). It's going to help create lists
          | of suspects to pursue on a matter. If one of them is also
          | a senator's son or a CEO, you'd likely go in softer than
          | you would on Joe Poorperson.
        
       | Jon_Lowtek wrote:
       | > _Last month, it was revealed the Department of Homeland
       | Security authorized the domestic surveillance of protestors and
       | journalists, training a system usually reserved for hunting
       | terrorists overseas with drones on American citizens exercising
       | their First Amendment rights._
       | 
       | wait wat
       | 
        | edit: the article's linked source doesn't actually say that,
        | but instead talks about "Baseball card" dossiers.
        
         | Pfhreak wrote:
         | This is where policing in America has been headed for at least
         | a decade now. Everything is so polarized that people who would
         | normally be opposed to this view those being tracked as their
         | enemies, and thus fair targets. (For example, I'd imagine that
         | right wing or libertarian types would normally consider
         | surveillance to be a real bad thing. But if it gets deployed
         | against George Floyd protestors... Ehhh... nbd)
         | 
         | When you hear politicians talking about being "tough on crime"
         | or "law and order", this is exactly the program they are
         | pushing for.
        
           | srtjstjsj wrote:
           | Please don't conflate conservative with libertarian. They are
           | partially similar in economic policy but opposites in social
           | policy.
        
       | WarOnPrivacy wrote:
        | With rare exceptions, police have strongly addictive
        | personalities. There's no bit of power over the public that
        | cops can use in a reliably responsible way.
        | 
        | Surveillance and control are drugs, ones that they are not
        | capable of putting down.
        
         | tal8d wrote:
          | That is true of almost everybody, not just police. Petty
          | tyrants are a dime a dozen; they just lack the ability to
          | legally kidnap you. People are bad, so we need a
          | government, comprised of people, who are bad, so we... I
          | already hear the cries of "But who will build the roads?!"
        
           | srtjstjsj wrote:
           | So we come together as a democratic society to hold each
           | other accountable.
        
       | Pfhreak wrote:
        | I'm not going to say ACAB, but many who do believe that no
        | matter how nice an individual a cop might be, they are still
        | working within an organization that pushes for
        | privacy-shredding technologies like this.
       | 
       | You can be a sweet, community oriented beat cop out there doing
       | your best, but behind you stands surveillance, militarization of
       | police, and the tragedy of prisons in America.
       | 
       | Maybe this isn't a generalization that's fair -- should/do we
       | fault all FAANG employees because their parent companies are
       | doing unethical things?
       | 
       | I don't know the answer, but articles like this definitely make
       | me think a whole lot more about whether individual police can be
       | ethical while operating in the way the departments do today.
        
         | ssklash wrote:
         | I don't think you can separate the individual from the
         | organization when the organization either fundamentally or in
          | practice is engaged in negative/harmful activities.
          | Policing is broken in the US, and though its original
          | intention was good, its daily practice today is
          | systemically flawed and no longer has that original intent
          | as its goal.
         | 
         | The same applies to companies like Facebook and Google. They
         | fundamentally exist to extract data from their users. They
         | never had another legitimate mission that was corrupted over
         | time, unlike the police. Apple, Netflix, Amazon at least have
         | services that do not revolve around taking advantage of their
         | users and manipulating their behavior.
         | 
          | Organizations are made up of individuals at the end of the
          | day. If you are a cop, or work at companies like Palantir,
          | Google, or Facebook, you are helping prop up organizations
          | that actively do harm to many people. You may not be as
          | responsible as the cops who murder or the decision-makers
          | at the top, but their guilt would not be possible without
          | you. Please think long and hard about where you work and
          | why you work there.
        
           | srtjstjsj wrote:
           | Are you honestly saying that FB and Google don't provide
           | anything that users value?
        
       | Animats wrote:
       | Can this be used against employers to detect wage theft? "Time
       | shaving" should be easy to detect. If cell phone tracking, W-2
       | payments, and employer time records are in substantial
       | disagreement, wage theft has been found.
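        | 
        | A back-of-the-envelope sketch of the cross-check (all inputs
        | hypothetical; real enforcement would need far more care with
        | location error and off-site work):
        | 
        |   def flag_time_shaving(phone_hrs, timesheet_hrs, paid_hrs,
        |                         tolerance=0.5):
        |       # each argument: hours per week, inferred from phone
        |       # pings at the worksite, the employer's records, and
        |       # pay stubs / W-2 figures respectively
        |       flags = []
        |       weeks = zip(phone_hrs, timesheet_hrs, paid_hrs)
        |       for week, (ph, ts, pd) in enumerate(weeks):
        |           gap = ph - max(ts, pd)
        |           if gap > tolerance:  # worked more than recorded
        |               flags.append((week, gap))
        |       return flags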
        
         | [deleted]
        
         | klenwell wrote:
         | I initially read this the other way: used _by_ employers to
         | detect wage theft.
         | 
         | Supervisor Drone: You are allotted two government-mandated
         | 15-minute breaks a day, but our third-party right-to-work
         | service detected you away from your work sector for an average
         | of 16.8 minutes per break period over your past 3 shifts. You
         | are hereby terminated and liable for service fees you agreed to
         | as part of your original employment agreement.
        
         | moate wrote:
         | By whom? The government has shown time and time again they
         | aren't interested in policing attacks against labor (see: the
         | toothlessness of the NLRB, the gig economy, weakening of
         | unions/organizing, etc) so it wouldn't be them.
         | 
         | The companies aren't going to do it because LOL wut?
         | 
          | So how are you going to get labor to do that? And if they
          | do, who do they take it to (because, again, the NLRB is
          | underfunded and relatively weak)?
        
           | Animats wrote:
           | We'll have to see how pro-labor the next administration is.
           | 
           | Labor law enforcement could be run as a profit center, like
           | drug enforcement, if the fines were higher.
        
       | aaron695 wrote:
       | I know HN isn't very technical so this might be hard to
       | understand, but this isn't AI.
       | 
       | It's just computer power.
       | 
       | All this was possible 40 years ago but it cost orders of
       | magnitude more.
        
         | notsuoh wrote:
          | It's true that AI is a bit of an overloaded term these
          | days, but in the modern usage of AI you have all types of
          | machine learning and neural networks, which this certainly
          | is. The AI you're thinking of is usually referred to as
          | AGI. But all that aside, in the end your comment is still
          | a little off the mark, because what is any flavor of AI
          | but computer power?
        
       ___________________________________________________________________
       (page generated 2020-10-03 23:00 UTC)