[HN Gopher] Media companies that also sell personal data
       ___________________________________________________________________
        
       Media companies that also sell personal data
        
       Author : kjhughes
       Score  : 99 points
       Date   : 2022-11-12 14:47 UTC (8 hours ago)
        
 (HTM) web link (www.wired.com)
 (TXT) w3m dump (www.wired.com)
        
       | seydor wrote:
        | Based on their academic services, I wouldn't be too worried
       | because those services are crap. Their academic analytics stuff
       | barely passes the test and is mostly used because nobody wants to
       | make a competitor. Most academic metadata is openly available to
       | anyone who wants to use it, but people know that academia is more
        | about networking, PR, politics, etc. rather than the exactness of
        | academic analytics. Then Elsevier's own websites were pretty crap
        | for the (pre-SciHub) times I used them, sometimes being offline,
        | other times with stupid web design, etc. They don't have a
       | technical moat other than being so entrenched in academic
       | politics.
       | 
       | I assume something similar is going on in the other spaces. They
       | live in a legacy world that will be replaced with AI overnight.
        
       | chiefalchemist wrote:
        | I'm not so sure "quiet" captures the moment. The invasion has been
       | effective due to pointless distraction after pointless
       | distraction - and algorithms tuned to surface those distractions.
       | A monopoly of the collective mind.
       | 
        | Few can hear this coming, not because it's quiet but because of
        | the excessive volume from excessive noise.
        
       | c7b wrote:
       | Oh wow. Elsevier alone has much more grip on global academia than
       | any one company should have (what the article didn't mention,
       | they also own Overleaf, so they have access to a non-negligible
       | fraction of all STEM research before it is available to anyone
       | but the authors). But I didn't know that it is part of an even
        | larger conglomerate (LexisNexis alone would make for a
       | formidable Elsevier competitor).
        
         | elashri wrote:
          | I couldn't find this link between Elsevier and Overleaf. What I
          | did find is that Overleaf is owned by Digital Science UK, which
          | is owned by the Holtzbrinck Publishing Group. The latter owns a
          | lot of publishing companies and groups. They own Nature and
          | more than a 50% share in Springer. Am I missing something?
        
           | c7b wrote:
           | Hmm, I remember hearing that they purchased Overleaf several
           | years ago - maybe I misremembered, or things have changed
           | again since then.
        
       | caconym_ wrote:
       | "Big Tech" is a distraction. They barely even "sell" data;
       | rather, they use it to power platforms where advertisers can
       | target users without learning who they are. At such a large scale
       | there are obviously exceptions (see: Cambridge Analytica), but
       | directly selling user data is generally not how they make their
       | money.
       | 
       | The abuses of the companies you never hear about are orders of
       | magnitude worse. They are plugged into every institution of
       | society you interact with in your daily life, so you essentially
       | can't opt out, and their primary business model is siphoning off
       | sensitive de-anonymized personal data and selling it to anybody
       | willing to pay. It ought to be criminal, and the fact that it
       | isn't just goes to show how captured and neutered our government
       | has become.
       | 
       | I assume our legislators' PR focus on "Big Tech" wrt. data
       | privacy is by design. If the American people really understood
       | the true shape of things, there'd be blood in the streets.
       | 
       | (I'm not saying "Big Tech" doesn't cause substantial harm, but in
       | my view it's harm of a different sort.)
        
         | [deleted]
        
         | drdec wrote:
          | > The abuses of the companies you never hear about are orders
          | of magnitude worse.
          | 
          | > They are plugged into every institution of society you
          | interact with in your daily life, so you essentially can't opt
          | out, and their primary business model is siphoning off
          | sensitive de-anonymized personal data and selling it to anybody
          | willing to pay.
         | 
         | This would be much more impactful if you named one of these
         | companies and gave some concrete examples of what they are
         | doing.
        
           | caconym_ wrote:
            | For a start, you should probably read OP's article. I know
            | HN commenters often don't read the article before commenting,
           | and I'm sometimes guilty of this myself, but in this case the
           | linked article offers fairly important context for my comment
           | in the sense that it explicitly names at least one such
           | company and gives concrete examples of what they are doing
           | with your data.
           | 
           | For another salient example, check out this article^[1] about
           | cell carriers selling their customers' real-time location
           | data without consent.
           | 
           | ^[1] https://arstechnica.com/tech-policy/2020/02/fcc-issues-
           | wrist...
        
         | trap_goes_hot wrote:
         | Very much like a soldier refusing to execute immoral orders, we
         | need the smartest engineers working at these companies to stop
          | implementing such features. As it stands, engineers working in
          | big tech are often celebrated on platforms like HN, especially
          | those at companies that are just creating spying machines and
          | addictive time-wasters.
          | 
          | We outlaw addictive chemical substances, and maybe such tech
          | should also be looked at in a similar light.
         | 
         | At our company, the use of cell phones is not permitted inside
         | certain manufacturing zones. It's a bit sad to see some of the
         | younger employees being very restless when they can't check
         | their phone every 30 seconds.
        
           | ajb wrote:
           | Once you have more than 100 people - and there are obviously
           | more than 100 "smartest engineers" - solutions of the form
           | "everyone should be ethical enough not to do this, without
           | further incentives" are impractical.
        
           | Jensson wrote:
           | > We outlaw addictive chemical substances, and maybe such
           | tech should also be looked at in similar light.
           | 
           | Yeah, so the solution is not to convince engineers, but to
           | convince politicians to get laws like GDPR. Trying to
           | convince workers or companies to stop earning money has never
           | worked well.
        
             | [deleted]
        
         | webmobdev wrote:
          | No, BigTech are the biggest data-miners in the industry, as
          | they have _daily_, regular and "unrestricted" access to our
          | lives through the devices we use every day. Thanks to our
          | mobile phones, computers and the internet, they even have
          | insights and personal data from all the other services we use.
          | BigTech also purchases and partners with data brokers. And all
          | of them do sell their data to the _government_ (as the new
          | government data center in Utah highlights, the PRISM program is
          | thriving). BigTech is the most successful privatisation of
          | intelligence gathering in the history of our world, and I'd say
          | even a monument to capitalism.
         | 
          | While a triumph for capitalism, this is unfortunately also
          | really bad for us ordinary citizens in a democracy, as it
          | upsets the balance of power between the rich and our elected
          | representatives. The last time this happened and a government
          | fought to correct this balance (the British vs the East India
          | Company), the British Empire itself crumbled. If left
          | unchecked, I fear history will repeat itself with our current
          | and sole superpower too.
        
       | nonrandomstring wrote:
        | I think the main harm identified here is summed up in the line:
       | 
       | "The truth is pay-walled, but lies are for free"
       | 
        | It's not _what_ information is or isn't collected. It can be
       | valuable, and great social good can come out of so-called "big
       | data". And we can, as a society, sensibly decide what is
       | allowable in terms of prediction and prejudice (which are
       | essentially the same thing in this context).
       | 
       | The problem is utility asymmetry. Having a few for-profit
       | corporations own and trade our data is a societal catastrophe in
       | the making, and can only tend toward fascism.
        
         | zeta0134 wrote:
         | It's somewhat ironic that I cannot view the full article; it's
         | paywalled, and I am not a Wired subscriber.
        
           | nonrandomstring wrote:
            | Using wget while pretending to be something else will
            | probably see you right. A text-based client without
            | JavaScript pulls Wired articles nicely.
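            | 
            | A minimal sketch of that idea in Python, assuming the same
            | trick works for you; the User-Agent string and article URL
            | below are only illustrative placeholders, not anything the
            | parent comment specifies:
            | 
            |     import urllib.request
            | 
            |     # Fetch the page while identifying as a text-mode client
            |     # (e.g. w3m); some sites serve the full article to
            |     # clients that don't execute JavaScript.
            |     url = "https://www.wired.com/story/example-article/"
            |     req = urllib.request.Request(
            |         url, headers={"User-Agent": "w3m/0.5.3"})
            |     with urllib.request.urlopen(req) as resp:
            |         html = resp.read().decode("utf-8", errors="replace")
            |     print(html[:500])  # peek at the start of the response
            | 
            | wget's own --user-agent option does the equivalent from the
            | command line.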
        
           | nouryqt wrote:
           | https://archive.ph/pQQV1
        
         | hammock wrote:
          | The truth has always been paywalled. Good information comes at
          | a cost; information asymmetry is power. What is different now
          | is that information of all types is being shadowbanned.
        
           | nonrandomstring wrote:
           | > information of all types is being shadowbanned
           | 
           | That sounds interesting, please say more.
        
             | gadflyinyoureye wrote:
              | If we did, that too would be shadowbanned. For example,
              | look at how people were pointing to early studies that the
              | mRNA jabs caused heart issues in males 40 and under about
              | 1.5 years ago. Raising such concerns was enough to get
              | accounts here shadowbanned even when they posted links to
              | papers.
        
               | switchbak wrote:
               | The interesting thing is the truth there was not behind a
               | paywall. It was in the anarchic corners of the internet
               | that allowed such heresy. What was behind the paywalls
                | was pure state-serving propaganda.
               | 
               | The last few years have been a real lesson to those who
               | are paying attention. Unfortunately I've been really
               | disappointed with how few of my friends have kept an open
               | mind despite the poisoned information landscape.
        
               | bombcar wrote:
                | The only reason the Internet was a "bastion of truth" for
                | a while was that the whole thing was outside the walls.
                | 
                | The walls have moved and the truth has been confirmed:
                | everything that is outside the walls is heresy; don't
                | even _think_ of going to look at it, as it is bad.
        
         | DenisM wrote:
         | > The problem is utility asymmetry. Having a few for-profit
         | corporations own and trade our data is a societal catastrophe
         | in the making, and can only tend toward fascism.
         | 
          | It occurred to me that a few centuries ago every knight had his
          | own little army that was loyal to him and was lent out to the
          | king in times of need. The idea that the army would belong to
          | the people was alien at the time; today, however, few question
          | the wisdom of it (even dictators pay lip service to it).
         | 
         | Could it be that the idea of big information belonging to the
         | people will one day also seem entirely commonplace?
        
         | lifeisstillgood wrote:
         | I read recently about the concept of "civic databases" -
          | someone pointed out the absurdity of the world's scientific
         | papers being "catalogued by one post grad on the run, but used
         | by everyone".
         | 
         | Yeah. But even if you gave me access to every data set
         | worldwide I still would have trouble making head or tail of it.
          | But it's a start.
        
           | turtleyacht wrote:
           | It stuck with me too. Who's going to write the stored procs
           | and views? Unless it's all going to be tables only.
           | 
           | "Software Diffusion, infer relationships about these tables
           | and create an explorer for me."
           | 
           | jancsika's comment [1] in a thread on _The Cypherpunk
           | Manifesto_ :
           | 
           | > There should be a sustainable solution to bootstrapping
           | civic databases to archive and make available/discoverable
           | all the shits citizens care about without waiting 70+ years
           | for it to enter the public domain.
           | 
           | > It's absurd as it is now. We've got a scientific database
           | duct-taped together by a fucking grad student in hiding, and
           | AFAICT nearly every researcher uses it.
           | 
           | [1] https://news.ycombinator.com/item?id=33555419
        
             | lifeisstillgood wrote:
             | Yes that one.
             | 
             | Glad it reverberated a little across this corner of the
             | internet.
             | 
             | It's a great term - conveys what we mean. Maybe "Civic
             | DataSets"
        
         | tootie wrote:
         | Counterpoint: Public media. While I don't think there's any
         | statutory obligation, you will likely never find paywalls or
         | brokered data. And the content is very reliable.
        
         | andsoitis wrote:
         | > "The truth is pay-walled, but lies are for free"
         | 
         | Verified, researched information is more costly to build than
         | lies, which anyone can just make up out of thin air.
        
           | nonrandomstring wrote:
           | It's not so simple. Good lies, the kind of disinformation
           | that swings elections, wins wars, and controls populations,
           | can be extraordinarily difficult to build. Think Operation
           | Mincemeat. The backstories, legends, covers and dissemination
           | costs can rival research in pursuit of truth. And by the same
           | token, deep truths are sometimes there in the open for anyone
            | with eyes and the wits to observe and write it up.
           | 
           | What I think you're talking about is "bullshit", which is
           | really a kind of ephemeral non-data, like spam, advertising
           | and opinions.
           | 
           | Flooding the public sphere with that, especially now using
           | ever more plausible generative "AI" tools, devalues public
           | truth and makes private intelligence stores seem more
           | attractive and valuable.
           | 
           | In the limit this can even become a justification for
           | cloistered data, as the Church understood so well with Latin
           | scriptures. We'll be told "The people can't be _trusted_ with
            | the truth."
        
           | factsarelolz wrote:
            | But the vast majority of research is publicly funded (by
            | taxpayers). So if it's paid for by said individuals, should
            | they have to pay again to see the results?
        
         | lazyeye wrote:
         | This is not remotely true. Lies can be very much pay-walled
         | too.
        
           | bombcar wrote:
           | _facts_ can be found everywhere, for free or for pay.
           | 
            | But what is harder to find is what people really want, which
            | is to be told how to _feel_ about something, and that's more
            | and more behind a paywall.
            | 
            | (Which can still be completely _wrong_, mind you!)
        
             | lazyeye wrote:
             | Being told how to feel is more behind a paywall? This is
             | also not remotely true.
        
         | kornhole wrote:
         | Because it is unlikely this data will ever become a common good
         | available to all, we can use creative ways to poison it. For
          | example, I buy things for myself under the kids' names and have
          | my partner use my card to buy things for herself. I only have
          | one real-ID social media account, which is intentionally
          | enigmatic and stale. Of course my networks obfuscate a lot.
          | Unfortunately
         | this is a systemic problem, and the one percent like me are a
         | drop in the bucket.
        
       ___________________________________________________________________
       (page generated 2022-11-12 23:00 UTC)