Algorithms Must Be Transparent [1]

Date: 2024-06-25

We live in a world controlled by algorithms. What kind of job you get, whether you can get government assistance, and whether or not you go to jail can all be decisions controlled or influenced by algorithms. And almost all of those algorithms are created and controlled by companies that will not let the public see how they operate. Two recent stories about programs meant to solve or deter crimes show how damaging that control can be.

The first story is about a program called Cybercheck. It has been used all over the country to allegedly place suspects at crime scenes, using machine learning (which is what we called it before Sam Altman figured out he could make more money by calling it AI) to parse the public internet and various other data sources. Unfortunately, there is little independent evidence that it works, and the owner of the firm that created it apparently has a long history of lying.

In one example, the system claimed to place two men at the scene of a crime based on one of their phones allegedly trying to connect to a security camera's WiFi around the time of the crime. Except there is no security camera footage in the report of the crime, nor is there any mention of a security camera in any of the police reports. The Cybercheck report did not make clear how the camera's IP address was discovered or verified.

The company claims that the system is over 98% accurate but offers no evidence to back that claim, or even an explanation of how that figure is calculated (a sketch below shows why an accuracy number with no context can be nearly meaningless). The firm's owner claims that his methodology has been peer reviewed but can produce no reliable evidence of that either. And when defense attorneys ask for the data used to convict their clients, Cybercheck claims it does not retain that data. And, of course, the company refuses to let defense attorneys see its algorithm, since it is "proprietary." People have been convicted in part because of this system despite no one knowing how it works. Or, indeed, if it works.

This is not limited to expert witnesses at trial; the problem infects how we dispatch police as well. For several years, NYC has used a program called ShotSpotter. ShotSpotter is supposed to, well, spot shots. It uses microphones placed all over the city to detect sounds and then determines, using machine learning (can't wait for them to claim it's AI to boost their stock price), whether or not the sound was a gunshot.

It doesn't work. At all. At its best, it is successful only 20% of the time. It failed to identify over 200 actual gunshots, and the NYPD estimates that it lost over 400 man-hours to investigating ShotSpotter false alarms. And, of course, ShotSpotter's algorithms are proprietary.

The word algorithm sounds mysterious and high-tech, but it's not. It just means a process by which you reach a conclusion or complete a task. Recipes are algorithms (a trivial sketch of one in code follows below). But companies like the ones that created Cybercheck and ShotSpotter insist that algorithms must remain proprietary, or that outsiders cannot understand their complex recipes. They do this to protect their business, but not in the way they claim. A proprietary algorithm merely means that the courts and the public have to take the company's word for how it works. That is obviously unacceptable when the algorithms have an impact on the public.
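To make that concrete, here is a deliberately trivial sketch, mine and not anything from either company, of an "algorithm" in Python. Any recipe written down as a fixed sequence of steps qualifies:

```python
# An algorithm is just a defined procedure: inputs go in, steps are applied,
# a result comes out. This toy recipe is exactly that. Nothing mysterious.
def brew_tea(water_ml: int, steep_minutes: int = 4) -> list[str]:
    """Return the steps for brewing a pot of tea."""
    if water_ml < 200:
        raise ValueError("not enough water for a pot")
    return [
        f"boil {water_ml} ml of water",
        f"steep the leaves for {steep_minutes} minutes",
        "pour and serve",
    ]

for step in brew_tea(500):
    print(step)
```

There is nothing inherently un-reviewable about a procedure like this, however long or complicated it gets. The only thing stopping review is the company's refusal.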
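And here is why an unexplained "98% accurate" should impress no one. The numbers below are invented purely for illustration, since Cybercheck releases no real ones, but the arithmetic, the classic base-rate problem, is not: when true matches are rare, even a highly "accurate" matcher produces mostly false hits.

```python
# Invented numbers for illustration only; Cybercheck does not release its data.
population = 100_000      # candidate device profiles scanned
true_matches = 10         # profiles that actually belong to the suspect
accuracy = 0.98           # the kind of headline figure the company quotes

# A matcher that is right 98% of the time is still wrong on 2% of everything
# it sees, and almost everything it scans is NOT a true match.
false_positives = (population - true_matches) * (1 - accuracy)  # ~2,000 bogus hits
true_positives = true_matches * accuracy                        # ~10 real hits

precision = true_positives / (true_positives + false_positives)
print(f"Chance a reported match is real: {precision:.1%}")      # roughly 0.5%
```

Whether Cybercheck's matches behave anything like this is exactly the question no one can answer, because no one outside the company is allowed to check.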
If ShotSpotter had been forced to undergo a review, then maybe NYC and other cities would not have wasted so much money and police time on its terrible, terrible results. (Or maybe not. The Adams administration in NYC, because Adams is a cop and the herbiest herb that ever did herb, has publicly defended the system and appears ready to re-up the contract. Other cities, thankfully, are not run by morons and have stopped, or plan to stop, using a gunshot spotter that cannot spot gunshots.) If the Cybercheck algorithm were available to courts and defense experts, we would not have to wonder whether it is putting people in jail unjustly for cash.

I am completely unsympathetic to the argument that a company has to keep its secrets to keep its business. That is entirely unacceptable when we are dealing with people's lives. Any algorithm that touches on hiring, police work, government benefit decisions, and the like must be either completely open or available to review by experts. Anything less puts the needs of executives over the harm done to others, and I don't find that a compelling trade. Secret recipes sometimes result in deadly food. The people affected by those recipes, those algorithms, deserve a government that demands the forces that affect their lives be reviewable and understandable. Anything less is a direct attack on the well-being of the people subject to those recipes.

[1] https://www.dailykos.com/stories/2024/6/25/2248247/-Algorithms-Must-Be-Transparent?pm_campaign=front_page&pm_source=more_community&pm_medium=web