[HN Gopher] When the algorithm is your boss
       ___________________________________________________________________
        
       When the algorithm is your boss
        
       Author : jrepinc
       Score  : 50 points
       Date   : 2022-01-30 19:29 UTC (3 hours ago)
        
 (HTM) web link (tribunemag.co.uk)
 (TXT) w3m dump (tribunemag.co.uk)
        
       | RcouF1uZ4gsC wrote:
       | This phrasing is already conceding half the battle.
       | 
       | A computer or "algorithm" doesn't decide goals and values on its
       | own.
       | 
        | The computer executes an algorithm that efficiently implements
        | the values and decisions of the people in charge of developing
        | and deploying the software (and their bosses).
       | 
       | Businesses have done a great job of laundering controversial and
       | sometimes illegal decisions through "algorithms" as though they
       | were some kind of independent entity.
        
         | Buttons840 wrote:
         | The "Uno Reverse Card" for this situation is to ask why we need
         | executives and management when the algorithm does everything.
          | The answer is, as you say, that the algorithm isn't actually
          | doing anything on its own, just what its owners would have
          | done themselves.
        
           | jt2190 wrote:
           | I agree. The interests of managers are often in direct
           | conflict with the interests of owners, as the managers try to
           | carve out a larger and larger piece of the business for
           | themselves. These algorithms will reduce the need for
           | managers, which is what owners want.
        
           | ahelwer wrote:
           | I don't think that's quite true. It's like the difference
           | between surveilling someone by tailing them and surveilling
           | them with an automated network of cameras equipped with
           | facial recognition. Scale matters. Much work (especially work
           | we call unskilled) is rendered tolerable by the small human
           | moments you have with your coworkers, a conversation out back
            | when taking out the trash or whatever. Management labels any
            | such thing "time theft" or some similarly dystopian term and
            | would like to eradicate it, but they cannot be everywhere at
            | once. An algorithm _can_ be everywhere at once. So the
            | conversation "maybe people should not actually have to be
            | doing productive things the entire time they are being paid"
            | never really had to be had before, because people would just
            | do things to make their jobs tolerable. But now that such
            | simple pleasures really can be eradicated, the conversation
            | does have to be had.
           | 
           | There are other domains where analogous conversations should
           | be had. Many people agree with their laws in the general and
           | even specific senses, but almost nobody would want to live in
           | a society where all the laws are enforced perfectly 100% of
           | the time. Occasional lawbreaking is necessary for society to
           | function. Anybody who objects to this should really think
           | through what they're wishing for.
        
             | lupire wrote:
             | That's a different topic -- how much power one person
             | should be allowed to have. It's still the person's power.
        
         | zwkrt wrote:
          | I mostly agree, but I think that saying "the algorithm is my
          | boss" really does help the discussion along. I had a friend
          | ("S") who, having fallen on hard times, signed up to deliver
          | through Uber Eats. Their entire tenure included no human
          | interaction. S worked for about three weeks, got some bad
          | reviews for delivering food cold, and was "fired", all without
          | ever talking to anyone or seeing anyone's face. They would
          | ironically (but truthfully!) claim that their phone was the
          | worst boss they ever had.
         | 
          | It's true that somewhere behind the scenes there is a set of
          | people making the algorithm and a set of people implementing
          | it, but for the end user/worker there is no human, no realistic
          | arbitration, no negotiation, no favors, no coworkers, no
          | smiles, no understanding, and no recourse. It reminds me of
          | yesterday's "Did I just lose $500k?" post. When S got fired, it
          | may as well have been mandated by god.
         | 
          | ADDENDUM: I don't want to toot our own horn too much, but
          | remember that as a member of this forum you are probably an
          | order of magnitude better at navigating complex institutions,
          | understanding business objectives, understanding the "intent"
          | of a piece of algorithmic technology, and advocating for
          | yourself than the average person who picks up gig work. To
          | many people it is literally magic, and the closer your boss is
          | to being magical, the closer you are to being a slave to them.
          | And I don't think I am being melodramatic with my choice of
          | words.
        
           | lupire wrote:
            | The phone wasn't their boss. Uber's billionaire management
            | was their boss. The phone was the scapegoat.
        
           | rambambram wrote:
            | > ... remember that as a member of this forum you are
            | probably an order of magnitude better at navigating complex
            | institutions, understanding business objectives,
            | understanding the "intent" of a piece of algorithmic
            | technology, and advocating for yourself than the average
            | person who picks up gig work. To many people it is literally
            | magic, and the closer your boss is to being magical, the
            | closer you are to being a slave to them.
           | 
           | So true. Not everybody is going to see through the smoke and
           | mirrors.
        
       | Animats wrote:
       | Machines should think. People should work.
       | 
       | This is so much more cost-effective than slavery.
        
       | jpalomaki wrote:
       | On the other hand, with algorithms you can actually study how the
       | decisions are made. You can investigate what was given as input
       | and how it affected the results.
       | 
       | With complicated algorithms it may not be obvious, but then you
       | can try to poke the black box with thousands of sample cases to
       | see how it reacts.
       | 
       | Compare this with humans, who often don't even know themselves
       | what their decisions are really based on.
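        | 
        | A minimal sketch of that kind of probing, assuming the system is
        | reachable as an opaque score() callable (the profile fields and
        | the probe itself are made up for illustration):
        | 
        |   import random
        | 
        |   def probe(score, attribute, values, n=1000):
        |       # Treat score() as a pure black box: vary one attribute,
        |       # randomize the rest, and observe only the outputs.
        |       totals = {v: 0.0 for v in values}
        |       for _ in range(n):
        |           profile = {"rating": random.uniform(3.0, 5.0),
        |                      "deliveries": random.randint(10, 500)}
        |           for v in values:
        |               profile[attribute] = v
        |               totals[v] += score(dict(profile))
        |       # A large gap between the averages suggests this one
        |       # attribute alone drives the decision.
        |       return {v: total / n for v, total in totals.items()}
        | 
        | Calling e.g. probe(score, "city", ["A", "B"]) would show whether
        | the average output shifts with that single attribute.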
        
         | dehrmann wrote:
         | Yeah, it's easier to investigate a racist algorithm than a
         | racist manager. The article makes the following point, but I'm
         | not sure if it's necessarily worse than the status quo:
         | 
         | > All the while, the potential for discrimination based on
         | race, gender, or disability, especially in hiring and firing
         | software, remains obscured behind the black box of AI.
        
         | advisedwang wrote:
         | Who is the "you" here? Workers certainly do not get to probe
         | the internals of the algorithm that fired them. For liability
         | reasons I'm sure human resource management systems are not
         | exposed to researchers. Maybe a court could but in practice I
         | don't think that's happening and the bar to get that kind of
         | review is incredibly high.
         | 
         | Only the company itself is actually in practice going to be
         | testing their system but they only do so with a view to their
         | own interests.
        
         | giantg2 wrote:
         | "On the other hand, with algorithms you can actually study how
         | the decisions are made. You can investigate what was given as
         | input and how it affected the results."
         | 
          | You'd have to have access to it, which is highly unlikely. And
          | if you're testing it as a black box, could it detect that it's
          | being tested and change its behavior, the way VW's engine
          | software detected emissions tests?
        
         | the_snooze wrote:
          | Machines don't have shame, reputation, values, or empathy.
          | There's little hope of self-correction. An algorithm will
          | never ask, "Are we the baddies?"
        
       | verisimi wrote:
       | This is the planned future for us all, unfortunately. Technocracy
       | demands that scientific experts evaluate our reality and then
       | determine what we are and are not allowed to do.
       | 
       | Too much carbon/water/energy use = restraints on your usage. (No
       | restraints if you can pay though.) This is what smart meters,
       | smart phones, smart cities are going to facilitate. (Smart = Spy)
       | 
        | If I agreed with their evaluations and were able to choose to
        | opt in, I might even consider this. But this is not the plan.
        | What will really happen is that the political and billionaire
        | classes will decide that it is right for them to assume the role
        | of determining the technocratic goals that the AI-driven system
        | should achieve.
       | 
       | We will find that we have a tyrant computer to engage with. There
       | won't be a friendly face to help you by bending the rules in some
       | way. It will be like when you ring up your local governance
       | office to sort something out... but worse. We are not going to be
       | stepping into techno-utopia, that's for sure.
       | 
       | A bio-medical-security-id seems to be a required first step.
        
         | Gys wrote:
         | > What will really happen is that the political and billionaire
         | classes will find that it is right for them to assume the role
         | of determining the technocratic goals
         | 
         | 'They' have power and money because 'we' give it to them. Not
         | because they take it from us.
        
           | verisimi wrote:
            | Yes. Partly we give it to them because they do not present
            | their case in a straightforward way. I think there is a lot
            | of social engineering going on that just happens to align
            | with and advance the pre-planned agenda. So, imo, we give
            | our money and power over because we are deceived.
        
       | zackmorris wrote:
       | This was supposed to be a warning, not an instruction manual, but
       | here it is again:
       | 
       | https://marshallbrain.com/manna
        
       | not2b wrote:
        | Note that this is a British article; where it uses the word
        | "liberal", Americans should read it as somewhere between
        | "libertarian" and "neoliberal".
        
       | rambambram wrote:
        | Triangulation (the narcissistic kind) at its best. Make a
        | machine - or a software function in this case - the messenger.
        | As an evil c#nt you get a) plausible deniability, b) smoke and
        | mirrors, c) gaslighting, and d) a scapegoat, all in one. Oh, and
        | your algorithms become your 'flying monkeys' as well.
       | 
        | For personal reasons I had to dive into NPD, BPD, and
        | psychopathy (all cluster B personality disorders) over the last
        | few years. Once I finally understood that sh1t, a whole lot of
        | other stuff made sense. What big tech is doing here is using
        | these same principles on a worldwide scale. I'm not saying this
        | is new, because the Romans with their 'divide and conquer' did
        | basically the same.
        | 
        | The ways to counteract this as an individual are also the same
        | as those found in the psychology books: go 'no contact', or at
        | least 'gray rock'. If you are capable, that is. There are going
        | to be a lot of casualties, and I feel for the workers there (or
        | should I say 'victims').
        
         | heisenbit wrote:
          | Yup. Impulsive decision-making, lack of empathy, inability to
          | take different points of view, and black-and-white thinking.
        
           | rambambram wrote:
            | Lack of empathy? Definitely. Impulsive decision-making? I'm
            | not so sure.
            | 
            | What you describe sounds pretty BPD, especially the
            | black/white thinking (although I think hot/cold is clearer
            | wording). I was thinking along more sinister lines here.
            | 
            | And indeed, zero regard for different points of view. Which
            | is perfectly logical once one understands it: if you are -
            | in your own mind - the god of the universe and beyond, every
            | other view is by definition beneath you.
        
       ___________________________________________________________________
       (page generated 2022-01-30 23:00 UTC)