[HN Gopher] Europe's big tech bill is coming to fruition
       ___________________________________________________________________
        
       Europe's big tech bill is coming to fruition
        
       Author : DamnInteresting
       Score  : 59 points
       Date   : 2023-03-06 20:40 UTC (2 hours ago)
        
 (HTM) web link (www.technologyreview.com)
 (TXT) w3m dump (www.technologyreview.com)
        
       | adamsb6 wrote:
       | This article is not at all skeptical of this regulation.
       | 
       | I'm especially skeptical about legislation coming from the body
       | responsible for making me click dozens of "Accept Cookies"
       | buttons every day.
        
         | timtom39 wrote:
          | The EasyList Cookie filter list. uBlock Origin has it built
          | in. Firefox mobile can run it.
        
         | andrewmutz wrote:
         | I completely agree. The GDPR had the best of intentions, but
         | did not materially improve data privacy online. Meanwhile it
         | has absolutely made the UX of the web worse.
         | 
          | I know what the responses will be: "it's a great law with
          | poor enforcement". Perhaps that's true, but if so, what
          | makes us think additional EU tech regulations will be any
          | better enforced?
        
         | Jochim wrote:
         | The website choosing to sell your browsing data is the reason
         | you have to click those buttons.
        
       | rom-antics wrote:
       | Well that's a loaded title. (EDIT: The article title, not the
       | changed HN title)
       | 
       | Another take:
       | 
       | https://www.eff.org/deeplinks/2022/02/enforcement-overreach-...
       | 
       | Read the section on Trusted Flaggers to find out what that word
       | "safer" really means.
        
       | colpabar wrote:
       | https://archive.ph/U5Vwa
        
       | arbuge wrote:
       | I am continually reminded of PG's joke on Twitter a few years
       | ago:
       | 
       | https://twitter.com/paulg/status/1231699385525903360?lang=en
        
       | jawns wrote:
       | I'm curious about the outlawing of shadow banning.
       | 
       | As a former content moderator, I found shadow banning to be
       | remarkably effective for our most pernicious actors, who would
       | otherwise quickly realize that their account is banned and create
       | multiple new ones.
        
         | sacrosancty wrote:
         | [dead]
        
         | aaron695 wrote:
         | [dead]
        
         | Gigachad wrote:
         | These days the most effective method is phone number
         | verification. It's possible but significantly harder to get
         | around this.
        
       | seydor wrote:
        | Requiring "trusted flaggers" should clash with freedom of the
        | press, so I don't see this passing through parliaments.
        | 
        | Transparency of algorithms ranges from unenforceable to
        | irrelevant.
        | 
        | Things like "no personalized recommendations" are dead in the
        | water - things like AI chat don't personalize, they just use
        | the chat history, which can be stored locally.
        | 
        | Mandatory data sharing breaks international trade agreements.
        | 
        | A lot of the other stuff is standard practice that every
        | website already follows:
       | 
       | https://commission.europa.eu/strategy-and-policy/priorities-...
       | 
       | https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A...
       | 
        | There is stuff in the regulation that is simply laughably
        | worded:
       | 
       | >Providers of online platforms shall not design, organise or
       | operate their online interfaces in a way that deceives or
       | manipulates the recipients of their service or in a way that
       | otherwise materially distorts or impairs the ability of the
       | recipients of their service to make free and informed decisions.
        
         | pjc50 wrote:
         | The "trusted flaggers":
         | https://www.lexology.com/library/detail.aspx?g=0045b1bf-165b...
         | 
         | It's not that different from _existing_ ad-hoc systems for
         | dealing with CSAM, like the IWF in the UK. And it will be
         | within the exemptions of Article 10 ECHR. The only country that
         | might object is Germany.
        
           | seydor wrote:
           | But they allow a government to directly remove content.
           | 
           | This is potentially evil, no matter how trusted a state
           | (thinks) it is.
           | 
            | > The status of 'trusted flagger' under this Regulation
            | shall be awarded, upon application by any entity, by the
            | Digital Services Coordinator of the Member State in which
            | the applicant is established, to an applicant that has
            | demonstrated that it meets all of the following
            | conditions:
            | 
            | > (a) it has particular expertise and competence for the
            | purposes of detecting, identifying and notifying illegal
            | content;
            | 
            | > (b) it is independent from any provider of online
            | platforms;
            | 
            | > (c) it carries out its activities for the purposes of
            | submitting notices diligently, accurately and
            | objectively.
        
             | Jensson wrote:
             | You do know that the government can already remove whatever
             | they want? Why is this evil at all compared to what we
             | already have? All it does is give us more due process and
             | clearer rules than before when it was up to private actors
             | to do it.
             | 
             | For example, do you think it is more evil that Google can
             | ban you for any reason without telling you why, or the
             | government being able to ban you while stating exactly why
             | and you having legal rights? I think the first is worse
             | than the second, so moving from the first to the second is
             | a positive development.
        
               | seydor wrote:
               | Which government does that?
               | 
                | They can sue and arrest people who publish, not play
                | big brother.
        
               | Jensson wrote:
               | If they tell Google that you have illegal content then
               | Google will remove it, that is how it works today
               | everywhere. What kind of world do you think we live in?
        
               | seydor wrote:
                | Which is crucial: Google has the chance to challenge
                | the request in court, where it belongs.
                | 
                | Removing content via government insiders has only
                | been done in authoritarian countries.
        
               | Jensson wrote:
               | Google can challenge the content flagging as well. These
               | trusted flaggers can only flag content for Google to
                | review; they can't remove it themselves, as far as I
                | understand it. So it is exactly the same as before,
                | just that it is formalized.
               | 
               | Edit: Think the Twitter files, with how the American
               | government flags content on Twitter and then Twitter bans
               | it. USA already has that trusted flagging system, but
               | under the hood so you don't see it. I don't see why
               | moving that to the open would be any worse.
        
               | seydor wrote:
               | > Think the Twitter files, with how the American
               | government flags content on Twitter and then Twitter bans
               | it
               | 
                | Yes, this is pretty much what it is, and now it is a
                | legal requirement, and requests by the government
                | have to be prioritized and processed 'without delay'.
               | 
               | And if the government abuses the requests, then the
               | website can complain to ... the government.
               | 
                | I don't understand why you think this is normal. It
                | is not. This "government unable to stop itself" is
                | precisely the reason why press freedom was written
                | into constitutions.
                | 
                | Much of the EU doesn't rank very low in corruption.
                | What this means is, every 4 years the new government
                | will be re-staffing the "national censorship service"
                | (Digital Services Coordinator) with its cronies. The
                | countries which need a free press the most will be
                | affected the worst.
        
               | Jensson wrote:
               | > Much of EU doesn't rank very low in corruption. What
               | this means is, every 4 years the new government will be
               | re-staffing the "national censorship service" with its
               | cronies.
               | 
               | But Google could challenge that to the EU court, and the
               | EU court isn't full of Bulgarian cronies. Or do you think
               | that the EU court would take the side of Bulgarian
               | cronies, really?
               | 
                | If Bulgaria wanted to censor the internet they would
                | already have their own laws to do it. As you said,
                | only extremely authoritarian countries censor the way
                | you describe here. I doubt Bulgaria would get away
                | with it; they would get kicked out of the EU if they
                | started to create a CCP-style big brother state.
               | 
               | (I used Bulgaria as an example since it is ranked the
               | most corrupt country in EU)
        
               | seydor wrote:
               | There is no EU court, only national courts can refer a
               | case to the ECJ for consultation. There is also no way to
               | "kick a country out of EU"
        
         | Jensson wrote:
          | These aren't press companies, they are content platforms.
          | It doesn't affect any press companies; the article states
          | which services it affects, and it is limited to large
          | platforms.
          | 
          | These companies are still allowed to post whatever they
          | want, they just have to follow some rules about being a
          | platform for content created by others. If they don't want
          | to be a platform, they are free to stop being one.
        
       | slowmovintarget wrote:
       | "in mice"
       | 
       | or something like that.
       | 
       | The internet is about to get a lot more balkanized and more
       | heavily regulated. That doesn't make it safer, it just puts
       | government back in the driver's seat for who gets to decide what
       | is allowed where, sans those messy election things.
       | 
       | > Proponents of the legislation say the bill will help bring an
       | end to the era of tech companies' self-regulating. "I don't want
       | the companies to decide what is and what isn't forbidden without
       | any separation of power, without any accountability, without any
       | reporting, without any possibility to contest," Verdier says.
       | "It's very dangerous."
        
         | [deleted]
        
       | ParksNet wrote:
       | Straight out of the World Economic Forum: 'Digital Identity'
       | playbook.
       | 
       | De-anonymize speech online, to protect the ruling class, and
       | enforce any narrative you desire.
        
         | mistrial9 wrote:
         | > De-anonymize speech online
         | 
          | Anonymity for political speech has been known and debated
          | for four hundred years in the West. Of course some actors
          | come down on one side or the other. Twitter v0.1 was
          | supposed to be a failsafe for that; now it's 2023.
        
       | newaccount74 wrote:
       | The EU regulations need a lot more enforcement and agencies
       | really need to go after companies trying to exploit loopholes.
       | 
        | For example, consumers are entitled to return goods and
        | services purchased online within 14 days, with some
        | exceptions. Almost all app stores include some weasel wording
        | about how that 14-day rule doesn't apply, or they employ some
        | kind of intermediate currency to get around the rule (e.g.
        | you have to buy Minecoins to buy content in Minecraft).
       | 
        | The result is that a lot of digital markets are a wild west
        | where the consumer protection rules don't apply (e.g. if you
        | buy a fake mod in Minecraft, you just lost 5 EUR and there is
        | nothing you can do about it).
       | 
       | Another way companies get around the 14 day return policy is to
       | just not offer services starting today, but you have to buy
       | services 14 days ahead of time, so when the service starts you no
       | longer have a right to reverse the contract.
        
         | kwhitefoot wrote:
         | > Another way companies get around the 14 day return policy is
         | to just not offer services starting today, but you have to buy
         | services 14 days ahead of time, so when the service starts you
         | no longer have a right to reverse the contract.
         | 
          | The supplier might think that that circumvents the law; I
          | suspect that the courts might think otherwise. Some
          | loopholes only exist in potentia and evaporate under court
          | scrutiny, but no one can be certain which they are until a
          | case is brought.
        
         | endofreach wrote:
         | This is just plain wrong. The 14-day return policy is meant for
         | physical goods in the first place. It was not meant as a trial
         | period, but a protection because you can't actually see the
         | product when buying online.
         | 
          | Also, the 14 days do not start with the purchase date, but
          | with the delivery date.
         | 
          | If you buy a digital product online, it is the equivalent
          | of a physical purchase in a physical store. So nobody needs
          | to "get around" the policy. They just need to let you know
          | it does not apply.
          | 
          | And I do think it is fair.
          | 
          | If someone rips you off, that's a different story; there
          | are laws for that, and those cases aren't meant to be
          | covered by this policy.
        
         | krzyk wrote:
          | IANAL, but the above won't fly in court. At least the last
          | part: the 14-day return policy applies from the time you
          | receive the goods/service, not when you give a business
          | money for that good/service.
         | 
         | E.g. you can return preordered goods, for which you sometimes
         | wait a month.
         | 
          | Even Steam allows returning a game within 14 days (provided
          | you played it at most 2 hours, which is fair).
        
           | debugnik wrote:
           | > Even steam allows returning a game in 14 days
           | 
            | I believe that's their own policy; digital goods that are
            | consumable immediately can be sold with a waiver of your
            | return period.
        
       | justaman wrote:
       | The internet doesn't need to be "safer".
        
         | Guthur wrote:
         | Safer for them.
        
           | deathhand wrote:
           | Nothing allowed that could influence elections. Oh gee, I'm
           | sure that wouldn't be abused at all.
        
       | Barrin92 wrote:
       | _" The DSA will effectively outlaw shadow banning (the practice
       | of deprioritizing content without notice), curb cyberviolence
       | against women, and ban targeted advertising for users under 18.
       | There will also be a lot more public data around how
       | recommendation algorithms, advertisements, content, and account
       | management work on the platforms, shedding new light on how the
       | biggest tech companies operate"_
       | 
       | Although it comes very late, better than never. I think this bill
       | is fantastic. It brings important decisions on how platforms work
       | from tech companies to the public, which is where they ought to
       | belong in the first place.
        
       | vlovich123 wrote:
       | > That said, the bill makes it clear that platforms aren't liable
       | for illegal user-generated content, unless they are aware of the
       | content and fail to remove it.
       | 
        | Is there any clarity as to what "aware of" constitutes here?
        | For example, telephone providers are aware that people are
        | using their service for illicit things, but knowing _which_
        | account or phone call is illicit suddenly makes that all less
        | clear.
       | 
        | Honestly, I'm not quite as bullish on the ability to regulate
        | safety on the internet, considering that a not-insignificant
        | share of privacy violations is perpetrated by ad networks
        | collecting information on behalf of intelligence agencies to
        | work around those pesky constitutional provisions. The
        | history of the internet is filled with "make you safer"
        | legislation that achieves questionable results at best.
        
         | vlovich123 wrote:
         | Oh goody.
         | 
         | > Only if the flagged content is evidently manifestly illegal
         | can such notices give rise to 'actual knowledge'. According to
         | the text of the Digital Services Act (section 63), "Information
         | should be considered to be manifestly illegal content and
         | notices or complaints should be considered manifestly unfounded
         | where it is evident to a layperson, without any substantive
         | analysis, that the content is illegal or, respectively, that
         | the notices or complaints are unfounded."
         | 
          | So glad we're leaving this up to a vague obviousness
          | standard. Not like copyright and other kinds of illicit
          | content are notoriously hard even for experts to decipher
          | and get right.
        
           | Jensson wrote:
            | American copyright law already makes these platforms
            | delete anything that could hint at copyright
            | infringement; I don't see how this could make things
            | worse.
        
         | VWWHFSfQ wrote:
         | Anecdotal
         | 
         | I used to run a live video streaming website (a la Twitch.tv)
         | for a specific niche of live content. But once the pirates
         | discovered the site they started streaming live sports and
         | other copyrighted content. I started getting DMCAs from MLB,
         | NFL, EPL. All very threatening emails with PDFs of legalese
         | containing screenshots of my (small, niche) website streaming
         | their content.
         | 
         | I would always just immediately shut off and ban the streamers
         | that were mentioned.
         | 
         | > Is there any clarity as to what "aware of" constitutes here?
         | 
         | I became "aware" by the DMCA notice.
        
           | vlovich123 wrote:
           | Running a large scale service is completely different I think
           | though. My understanding is that most DMCA notices are
           | automated and a good fraction of those not actually valid.
           | Similarly, these systems from the provider side already have
           | automated DMCA takedown mechanisms. The bigger problem is
           | what happens when the counter party files a DMCA counter
           | notice - now you are aware there's potentially infringing
           | content but you're not allowed to take it down. Of course,
           | I'm sure this law takes the DMCA into account.
           | 
           | The trickier part though I'm referring to is not DMCA but
           | community moderation. Someone flags a picture of your child
           | bathing naked as child porn. Is the provider now on notice as
           | having been informed? How do they validate the circumstances
           | of the photo to make sure it's yours? These aren't easy
           | questions and looking at it through the DMCA lens is
           | insufficient because this law goes way beyond that. The
           | section 230 lawsuits in front of SCOTUS right now are not
           | dissimilar to what DSA is trying to regulate and yet I don't
           | see extra clarity here.
        
       | [deleted]
        
       ___________________________________________________________________
       (page generated 2023-03-06 23:00 UTC)