[HN Gopher] An Update on the UMN Affair
       ___________________________________________________________________
        
       An Update on the UMN Affair
        
       Author : chmaynard
       Score  : 339 points
       Date   : 2021-04-29 15:19 UTC (7 hours ago)
        
 (HTM) web link (lwn.net)
 (TXT) w3m dump (lwn.net)
        
       | lacker wrote:
       | Key takeaway from this update:
       | 
       |  _One final lesson that one might be tempted to take is that the
       | kernel is running a terrible risk of malicious patches inserted
       | by actors with rather more skill and resources than the UMN
       | researchers have shown. That could be, but the simple truth of
       | the matter is that regular kernel developers continue to insert
       | bugs at such a rate that there should be little need for
       | malicious actors to add more._
        
         | david_allison wrote:
         | > the simple truth of the matter is that regular kernel
         | developers continue to insert bugs at such a rate that there
         | should be little need for malicious actors to add more.
         | 
          | Dmitry Vyukov[0] estimated in 2019 that each kernel release
         | contains >20,000 bugs. As an outsider looking in, it seems that
         | the kernel needs a lot more boots on the ground focusing on
         | maintenance.
         | 
         | [0] https://www.youtube.com/watch?v=iAfrrNdl2f4 ~0:00-3:30 for
         | statement.
        
           | cortesoft wrote:
           | 20,000 NEW bugs?! Or is it the same 20,000 that keep being
           | brought along every release?
        
           | cratermoon wrote:
            | I don't know whether 20,000 is a large or small number for a
            | codebase the size of the Linux kernel. How about bugs per
            | line of code? In _Code Complete_, Steve McConnell writes:
            | "Industry Average: about 15 - 50 errors per 1000 lines of
            | delivered code." As of 2020, there were 27.8 million lines in
            | the kernel git repo. By McConnell's measure, the number of
            | bugs in the kernel is a rounding error.
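            | 
            | McConnell's arithmetic can be checked directly. A rough
            | sketch (the figures come from the comment above; the defect
            | rates are industry estimates, not measured kernel data):

```rust
fn main() {
    // Figures from the comment above: ~27.8M lines in the 2020 kernel
    // tree, and McConnell's industry average of 15-50 defects per
    // 1000 delivered lines of code.
    let lines: f64 = 27_800_000.0;
    let low = lines / 1000.0 * 15.0;  // lower-bound defect estimate
    let high = lines / 1000.0 * 50.0; // upper-bound defect estimate
    println!("estimated defects: {:.0} to {:.0}", low, high);
    // -> estimated defects: 417000 to 1390000
    // Against that range, the cited ~20,000 bugs per release really
    // is a rounding error.
}
```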
        
             | [deleted]
        
           | hinkley wrote:
           | I feel like I've had this conversation at work at least a few
           | times. It's the ratio between new code and maintenance. Can
            | you solve this by adding more maintainers? Yes. Can you solve
            | it by reallocating people to maintenance? Maybe (depending on
            | whether they decide to quit instead).
           | 
           | But you can also flip it and ask if this feature focus is
           | really helping or hurting. The parts of the work that look
           | like 'feature factory' work should be treated as suspect. On
           | some projects that can be half of the new work. We get so
           | consumed trying to figure out if we can do something that we
           | don't stop to think if we _should_.
           | 
           | In a volunteer organization, however, you have to create a
           | certain amount of busy work in order to make sure that new
           | volunteers have stepping stones to becoming bigger
           | contributors. Did this need to be done? No, but we needed a
           | way to get Tom from point A to point C, where we really need
           | his help, and so we had him do B even though it's not really
           | much of a priority.
        
           | exmadscientist wrote:
           | > As an outsider looking in, it seems that the kernel needs a
           | lot more boots on the ground focusing on maintenance.
           | 
           | Following kernel development a decade ago, that was _exactly_
           | my opinion as well. Maintenance of older, creakier stuff gets
           | little attention. Adding shiny new buggy features was all the
           | rage. Code and architecture review was spotty at best, and a
           | lot of shiny new things caused damage to old things, but no
           | one cared because the new things were shiny.
        
         | tester756 wrote:
          | I said that on the first post, but yeah, people kept screaming
          | "ethics" "ethics", haha, as if APT groups gave a single fuck
          | about your ethics.
        
           | AnimalMuppet wrote:
           | If UMN is not an APT group, then it is reasonable to expect
           | them to have some ethics.
           | 
           | If UMN _is_ an APT group, then it is reasonable to ban them.
        
             | tester756 wrote:
              | They wanted to show / remind people of a way which could
              | allow an APT to do some very bad stuff.
        
         | chmaynard wrote:
         | Instead of submitting patches with known bugs, the UMN
         | researcher could have done a retrospective study of patches
         | from other kernel developers containing bugs that were later
         | found and fixed. Not only was he acting in bad faith, he was
         | stupid.
        
           | cpeterso wrote:
            | The researcher also could have coordinated with Linus and
            | other Linux leads, so they would know which patches would be
            | "bad".
           | They all want the Linux review process to produce high
           | quality code and could have suggested other ways to test the
           | process.
        
             | scarmig wrote:
             | This seems ideal, but Linus would probably have accurately
             | pointed out that it's already well-known that it wouldn't
             | be hard for a malicious actor to insert bad code, so
             | there's nothing really to be learnt here, at the cost of
             | wasting the time of reviewers who are already overwhelmed.
        
           | mumblemumble wrote:
           | Only looking at patches that were later found and fixed would
           | not be measuring the same thing as what they were doing.
           | 
           | Not saying what they were doing was in any way ok, but I do
           | doubt there's any way to measure what they were trying to
           | measure without deliberately introducing bugs into some sort
           | of review stream. Merely using observational data would
           | produce results that are approximately as useful as those of
           | every other observational design. (to wit: not very)
        
             | hinkley wrote:
             | If you're filing a patch with a pattern you think will get
             | by the review process, why not look to see if that pattern
             | already existed in the code? That would give you the same
             | information and you're potentially uncovering zero days
             | instead of creating them, assuming those issues haven't
             | already been fixed.
        
           | izgzhen wrote:
           | That is not considered high impact by the research community
           | ;)
        
           | hangonhn wrote:
            | There was also a high degree of arrogance on his part. That
            | he thought his needs and priorities outweighed those of the
            | much, much bigger community, and/or that he didn't need
            | permission from anyone in charge of the codebase, is just
            | astounding.
        
           | xucheng wrote:
           | Alternatively, I think the research could be done in an
           | isolated experiment environment among consented participants.
           | For example, they could ask maintainers to review some set of
           | patches outside the context of Linux kernel. They would then
           | ask maintainers opinions on the said patches using some form
           | of questionnaires. If the participants are well informed the
           | nature of the experiments, there would not be any ethic
           | concerns. For incentive, the researchers could pay for any
           | maintainers willing to participate the experiments.
        
         | dekhn wrote:
         | I assume this is happening already; I would expect nation
          | states to have already inserted a number of exploits so that
          | they are prepared, in case of war, to exploit the enemy's
          | servers.
        
           | SirYandi wrote:
           | This just made me think, I wonder if there is some sort of
           | Trident nuclear warhead submarine equivalent in cyberspace.
            | Software-based mutually assured destruction.
        
           | [deleted]
        
       | MilnerRoute wrote:
       | The thing that troubles me is how hundreds of good-faith patches
       | came under suspicion.
       | 
       | We can say that the researchers are entirely responsible for that
       | as well -- for creating that need for suspicion. But I do wonder
       | if we were too quick to also fall into a pattern of outrage and
       | absolute rejection.
       | 
       | This story went viral partly because it seemed much worse -- "And
       | they're still submitting malicious patches to this day!" -- than
       | it actually was.
        
         | bigwavedave wrote:
         | I dunno, I think this is one case where outrage and rejection
         | weren't too quick in coming.
         | 
         | Let's say you're a software architect (dunno if you are or
         | not). You hire four new software devs, two of them from
         | Colorado State University, and you put the two from CSU on the
         | same team. After a year, you find out one of them has been
         | secretly working for CSU's cyber security research department
         | and poisoning your product with patches that do nothing or
         | subtly introduce bugs on purpose purely so their professor can
          | write a research paper about your company's process. Naturally,
          | you fire the developer; that leaves a bad taste in your
          | mouth, and CSU promises not to interfere again. Then you get
         | suspicious and start checking the work of the other CSU dev
         | more closely and find out they're submitting bad patches too!
         | They claim that these bad patches were submitted because their
         | experimental static code analyzer didn't find any issues with
         | them.
         | 
         | What conclusion should you reach?
        
           | MilnerRoute wrote:
           | You should conclude "I don't know whether or not these
           | patches were submitted in good faith."
           | 
           | People sometimes submit bad patches. Not all of them are
           | malicious.
           | 
           | I don't think we've fully addressed or explored the scenario
           | where a young, bright-eyed aspiring student coder does in
           | fact make an innocent mistake. And then is accused of
           | deliberate malicious sabotage. And large chunks of the
           | internet also start condemning them for deliberate and
           | malicious sabotage.
           | 
           | What does that say about our community?
        
             | kuratkull wrote:
             | It's also important to understand the situation you are in
             | - if you are the second CSU dev you got some extra due
             | diligence to do. This is the real world, it's not fair.
        
       | adamrezich wrote:
       | interesting that they admit that going forward, they don't really
       | have the resources to fully vet all submitted patches to the
       | degree necessary to prevent something like this from happening
        | again. seems like, if nothing else, this has shown the Linux
        | kernel to be pretty vulnerable to similar, more malicious future
        | attacks. there's no real easy solution to this problem
        
         | detaro wrote:
          | That's not really news though, and IMHO that's one of the major
          | points of criticism against the researchers: if what they did
          | had shown anything new or unsuspected, it would have been one
          | thing, but it didn't, and they could have shown the same thing
          | with less impact.
        
           | UncleMeat wrote:
           | The research was a _little_ more compelling than that, since
           | they did use an automated tool to find the partial vulns that
           | they could sneakily upgrade to complete vulns. That is a
           | little more interesting than introducing an entire complete
           | vuln in a patch and hoping nobody notices.
           | 
           | Still very unsurprising, but a little more interesting than
           | how the work has been presented in media.
           | 
           | I do also still believe that Oakland was incredibly foolish
           | for accepting this work and that the PC has almost as much
           | egg on their face as the researchers.
        
       | flowerbeater wrote:
       | This is a reasonable and balanced analysis of the situation. In
       | retrospect, it seems like the reversion of the 190 patches was an
        | overreaction that ended up causing a lot of confusion: many
        | people, even on HN, misread the comments on the reversions as
        | meaning that bad patches had been committed to the source tree
        | or to stable.
       | 
        | But besides the lesson that one ought not to be deceptive when
        | submitting patches, there is also the lesson that the kernel is
        | not as well reviewed as one might hope, and that with some
        | effort it's certainly possible to add an undetected
        | vulnerability. That's probably one thing that led to the drama:
        | the fundamental trust and work of the kernel was attacked, and
        | the maintainers felt the need to fight back to protect their
        | reputation.
        
         | detaro wrote:
          | I don't think the latter part is true; my impression is that
          | the kernel people are very well aware of the limits of their
          | review ability and don't pretend to be unfoolable.
        
           | flowerbeater wrote:
           | There's a wide range of degrees between "unfoolable" and "can
            | be done by a persistent student". I think the impression (at
            | least my impression) used to be that it was possible but
            | quite unlikely without state-level effort, whereas now we
            | understand that a properly advised student can get most of
            | their attempted vulnerabilities inserted.
        
             | detaro wrote:
              | The sheer number of regular bugs that aren't caught is
              | already a good indicator that not much special effort is
              | needed. (And "just a persistent student" isn't _that_
              | little, given that the group also contributed regularly to
              | the kernel and was studying its security, and was thus
              | quite familiar with the field - exactly the kind of people
              | a nation state would employ for that.)
        
           | Mathnerd314 wrote:
            | The harder part is probably fooling all of the static
            | analysis tools that get run on the kernel: Coverity,
            | Coccinelle, Smatch, and so forth.
        
         | tytso wrote:
         | 190 patches were _not_ reverted. 190 patches were _proposed_ to
         | be reverted, and those reverts have been going through the
         | normal kernel review process. In many cases, the patches were
         | confirmed to be correct, and so the revert was dropped. In 42
          | cases, those commits were found to be inadequate in some way:
          | in some cases the commit didn't actually fix the problem, or
          | introduced another problem, or, I believe in one case, actually
          | introduced a security problem(!). Call that last set of
          | patches "good faith hypocrite commits".
         | 
          | Remember, too, that many of these commits were in random
         | device drivers. Consider how often random Windows device
         | drivers crash or cause random blue screens of death. Not all
         | "SECURITY BUGS (OMG!)" are created equal. If it's in some
         | obscure TV digitalization card in the media subsystem, it won't
         | affect most systems. Core kernel subsystems tend to have much
         | more careful review. As the ext4 maintainer, I'm a bit slow in
         | accepting patches, because I want to be super careful. Very
          | often, I'll apply the patch, and then look at the changed
          | code in the context of the entire source file before approving
          | the commit. Just looking at the diff might not be enough to
         | catch more subtle problems, especially dealing with error
         | handling code paths.
         | 
         | The problem with device drivers is that in some cases, they are
         | written by the hardware engineers that created the hardware,
         | and then as soon as the hardware ships, the engineers are
         | reassigned to other teams, and the device driver is effectively
         | no longer maintained. One of the reasons why some maintainers
         | are super picky about allowing device drivers to be admitted to
          | the kernel is the concern that the driver author will
          | disappear after the patch is accepted, and that means the
          | subsystem maintainer is now responsible for maintaining the
          | driver. So if you want to sneak a SECURITY BUG (OMG!) into the
          | kernel, targeting some obscure device driver is going to
          | be your simplest path. But that's really only useful if you are
          | gunning for an IEEE S&P paper. The obscure device is not likely
          | to be generally used, so it's not that useful.
         | 
          | (Unless, of course, you are a hacker working for Mossad, and
          | you are targeting a super obscure device, like, say, a nuclear
          | centrifuge in use by Iran... in that case, the fact that the
          | security vulnerability only works on a very small number of
          | systems is a feature, not a bug. :-)
        
       | Animats wrote:
       | _That could be, but the simple truth of the matter is that
       | regular kernel developers continue to insert bugs at such a rate
       | that there should be little need for malicious actors to add
       | more._
       | 
       | That's the trouble with the one giant kernel architecture. Too
       | much trusted code.
       | 
       | The QNX microkernel barely changed from year to year, and the
       | number of kernel bugs approached zero. It's only about 65K bytes
       | of code. Yet it could do most of what Linux does.
        
       | dcow wrote:
       | > They failed in their effort to deliberately insert bugs, but
       | were able to inadvertently add dozens of them. ... kernel
       | maintainers (and maintainers of many other free-software
       | projects) are overworked and do not have the time to properly
        | review every patch that passes through their hands ... code going
       | into the kernel is often not as well reviewed as we like to
       | think.
       | 
       | Sounds like there's some room for improvement in the process of
       | how lighter weight contributions are introduced, reviewed, and
       | land in the kernel.
       | 
       | Perhaps there is a technical solution that would help ease the
       | load off maintainers and shift some of the burden to patch
       | authors.
       | 
       | --
       | 
       |  _When I find myself in times of trouble / mother Mozilla speaks
       | to me / whisper words of wisdom / don't use C._
       | 
       |  _And in my hour of darkness / Rust is installed in my system
       | tree / emitting tokens of wisdom / don't use C._
       | 
       |  _And then the broken hearted people / maintaining kernel code
       | agree / Rust's safety is the answer / don't use C._
       | 
        |  _For though they may be parted / Rust will prevent Use After
        | Free / they will see the answer / don't use C._
       | 
       |  _And when the code is cloudy / the borrow checker confronts me /
       | thou shall specify reference lifetimes / don't use C._
       | 
       |  _I wake up to the sound of safe code / not one vuln surrounding
       | me / rust is in the kernel / no more C._
        
         | azinman2 wrote:
          | Rust doesn't magically make your code bug-free. The fixes/bugs
          | you reference are highly unlikely to all be memory-management
          | related (e.g. use-after-free), and you can still do unsafe
          | things in Rust, even with memory.
        
           | operator-name wrote:
            | Rust and other languages with strong type systems don't
            | magically make your code bug-free, but their type systems are
            | more expressive.
            | 
            | Used correctly, this can make all sorts of unwanted program
            | states unrepresentable. Rust's shining feature is its
            | linear types, but that doesn't mean the only kinds of bugs it
            | eliminates are memory-management related.
        
           | dcow wrote:
            | Of course you can. But it _is_ much harder. I'm just having
            | some fun.
           | 
           | Anecdotally, Rust coerces you into writing sound code and
           | really helps break some destructive habits that C breeds. I
           | rarely need to debug Rust code and when I do the bugs are
           | logic errors not memory errors. It's a noticeable difference.
           | For lightweight contributions to the kernel and "peripheral"
           | contributions (where some big company writes a massive messy
           | driver because the market forces their hand.. not because
           | they enjoy curating and actively helping to maintain linux),
           | the type of discipline Rust enforces is invaluable.
           | 
           | Also, what do you mean that Rust won't solve use after free?
           | The whole point of the borrow checker is tracking and
           | enforcing reference lifetimes in order to prevent using a
           | pointer to memory that's been dropped from scope and no
           | longer has an owner (has been freed). Sure it can't prevent
           | all issues magically just yet (there are limitations) but
           | it's a positive force in the right direction.
           | 
           | Rust forces developers to do upfront what is usually
           | relegated to careful code review. You're forced to think
           | about the memory implications of your code. If the issue is
           | that we can't thoroughly review the abundance of code
           | submitted to the kernel, then e.g. requiring that, unless
            | justified, new patches shall land in safe Rust, seems like it
            | would directly address the core issue that there aren't
            | enough eyes on the code. Obviously it's more nuanced than
            | that; there's existing code, and you can't just say "all new
            | contributions must be Rust"... the example is just
            | illustrative.
           | 
           | The existing proposal and work on carefully incorporating
            | Rust into the kernel is super awesome. And I'm excited!
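            | 
            | The borrow-checker mechanics described above can be sketched
            | in a few lines (a minimal illustration, not kernel code; the
            | commented-out line is the use-after-free the compiler
            | rejects):

```rust
// Minimal sketch of the borrow checker rejecting a use-after-free:
// once ownership is given up and the memory is freed, any outstanding
// reference is a compile-time error, so the bug never reaches a binary.
fn main() {
    let buf = vec![1u8, 2, 3];
    let first = &buf[0];               // borrow of `buf` begins here
    println!("first byte: {}", first); // ...and ends after this last use
    drop(buf);                         // ownership released; memory freed
    // println!("{}", first);          // uncommenting this extends the
    //                                 // borrow past `drop(buf)` and
    //                                 // fails to compile (error E0505)
}
```

            | This compile-time check is what the comment above calls
            | "tracking and enforcing reference lifetimes".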
        
       | antattack wrote:
        | Given the high number of buggy patches being accepted, it seems
        | the kernel developers could use a bug-bounty program.
        
         | Edman274 wrote:
         | I'll insert the bugs, and you can fix them, and we'll make bank
         | together!
        
       | truth_ wrote:
       | > a buggy patch originating from an experimental static-analysis
       | tool run by another developer at UMN. That led developers in the
       | kernel community to suspect that the effort to submit
       | intentionally malicious patches was still ongoing. Since then, it
       | has become apparent that this is not the case, but by the time
       | the full story became clear, the discussion was already running
       | at full speed.
       | 
       | So they found a convenient scapegoat.
       | 
       | I am not buying it.
        
       | occupy_paul_st wrote:
       | > Of the remaining four, one of them was an attempt to insert a
       | bug that was, itself, buggy, so the patch was actually valid
       | 
       | Absolutely legendary
        
       | notyourday wrote:
        | The key takeaway, for me, has been that the Foundation that sent
        | the demand letter does not have the balls to grant the Stupid
        | Prize to those who so diligently tried to win it.
        | 
        | A good demand letter awarding the Stupid Prize would have told
        | the university that it would be banned until it:
       | 
       | * fires the assistant professor
       | 
       | * terminates all the students that participated
       | 
       | with a note that should any of those students or a professor go
       | to a different university that university would be banned as
       | well.
       | 
        | Had that prize been awarded, it would definitely have made it
        | into future ethics classes.
        
       | staticassertion wrote:
       | > That said, there are multiple definitions of "malice". To some
       | of the developers involved, posting unverified patches from an
       | experimental static-analysis tool without disclosing their nature
       | is a malicious act. It is another form of experiment involving
       | non-consenting humans.
       | 
        | What an absurd statement. Malice implies intent, i.e. the goal
        | was to do harm. There was never any malice - not in the
        | hypocrite commits and certainly not in the static-analysis
        | findings.
        
       | thamalama wrote:
       | reminds me of another potentially critical mistake,
       | https://mncomputinghistory.com/gopher-protocol/
       | 
       | 'According to Alberti, licensing "socially killed gopher."
       | Second, changes to internet hardware gradually made Gopher less
       | appealing. As put by McCahill, Gopher was designed within "the
       | limits of what the technology (at the time) could do." It was
       | deliberately text rich because images were slow to load on the
       | comparatively slow modems of the early 1990s. The decision to
       | prioritize text made Gopher relatively fast while the more image-
       | oriented World Wide Web languished in slow speeds (the "World
       | Wide Wait"). However, by 1994 improvements in modem technology
       | had turned this asset into a liability.'
        
       | cycomanic wrote:
        | Apart from the understandable annoyance (anger) at being
        | experimented on, and the scandal that this somehow got approved
        | by the UMN ethics board, I did find the reaction to this somewhat
        | interesting, in that the kernel community was willing to
        | immediately ban everyone from UMN (a university of significant
        | size) and investigate/revert all patches originating from UMN. In
        | contrast, I don't believe the reaction was anywhere near that
        | strong when it came out that the NSA had likely subverted a
        | crypto standardisation process. Should the kernel community have
        | reacted at least equally strongly? (If they did and I'm just not
        | aware of it, please educate me.)
        
         | ilamont wrote:
         | From the UMN response signed by the Computer Science &
         | Engineering department head and associate department head:
         | 
         |  _The study was focused on understanding a system by
         | identifying mechanisms through which security issues could be
         | introduced in Linux software. Therefore, purely as a technical
         | matter, this study did not qualify as "Human Subjects Research"
         | and received no further scrutiny or oversight from the IRB.
         | Importantly, even if one believed the study did involve human
         | research subjects (perhaps due to the interactions and the
         | incidental collection of email addresses), the study would
         | clearly be exempt from IRB oversight pursuant to 45 CFR
         | 46.104(D). In other words, the UMN IRB acted properly in this
         | case._
         | 
         | Full response:
         | 
         | https://drive.google.com/file/d/1z3Nm2bfR4tH1nOGBpuOmLyoJVEi...
        
           | cycomanic wrote:
            | OK, I guess this shows my ignorance of what constitutes human
            | research (although I'm outside the US and definitions might
            | be different here). I have to say, this is a good response
            | letter from the department.
        
         | anonydsfsfs wrote:
         | > In contrast I don't believe the reaction was anywhere near
         | that strong when it came out that the NSA likely subverted a
         | crypto standardisation process
         | 
         | It was pretty strong. There were many calls to remove the chair
         | of the IETF CFRG group because of his association with the NSA:
         | https://news.ycombinator.com/item?id=6942145
        
       | temp8964 wrote:
       | I totally agree this is a very balanced review of the affair.
       | 
       | The fact that "kernel maintainers ... are overworked and do not
       | have the time to properly review every patch that passes through
       | their hands" is not a situation created by the group of
       | researchers.
       | 
       | If the Linux kernel is so crucial to the real world industry, and
       | the kernel team is so distressed at their work, this is a problem
       | need to be seriously addressed. Proactively punish the whole UMN
       | does nothing to address the real issue here.
       | 
       | If the team is so distressed at this tremendously important work
       | and they constantly feel they are on the edge of collapsing, may
       | be you really really really should ask for outside help and even
       | demand more contributions from the big techs.
       | 
       | Nobody should overwork, especially those work on something so
       | crucial to the whole industry.
       | 
       | Edit: I just read a line from the Rust Foundation page [1]: "Good
       | software is built by happy, well-supported people."
       | 
       | 1. https://foundation.rust-lang.org/
        
         | matheusmoreira wrote:
          | > Proactively punishing the whole of UMN does nothing to
          | address the real issue here.
          | 
          | The _real_ issue here is human experimentation without consent,
          | or even notice. Why do they think they can run experiments
          | on free-software developers without even contacting them about
          | it?
        
         | iforgotpassword wrote:
         | I wish I could just start reviewing random patches that get
          | submitted/accepted into Linux, but it's a daunting task.
          | Although I'm good at C, lacking good knowledge of the relevant
          | subsystem in the kernel makes it impossible to find anything
          | but very trivial errors, which usually get caught by static
          | analyzers afaict. I'm afraid of introducing noise that costs
         | maintainers even more time, by commenting that something might
         | be wrong because "I think you cannot sleep here" or "I think
         | this needs to be in the critical section" when I'm not 100%
         | sure.
         | 
         | The sad thing is that the Linux kernel isn't a small simple
         | project, which manifests in many ways. I once debugged and
         | fixed an issue in i915. Finding out where exactly the patch
         | should go, which repo and branch it should be based on, how to
         | format the commit message, etc took at least twice as long as
         | finding and fixing the issue. You end up in some doc listing
         | instructions for a dozen mail clients to configure them so they
         | don't fuck up your patch when you send it. I guess that's why
         | there is "git send-email"; at some point it was easier to add
         | email sending capabilities to freaking git than to deal with
         | email clients, because the whole process is so archaic. I
         | wonder how many people get discouraged by this whole process
         | and just give up. I'd like to think it's just filtering out low
         | effort contributions, but I'm not really sure.
         | 
         | Long story short, I think the approachability of the Linux
         | kernel is pretty low and I believe there are a bunch of capable
         | people who'd happily get involved in some way, but are
         | discouraged by the high bar to entry. It's a far cry from
         | GitHub pull requests and issues, usability-wise.
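The mail-based workflow described above can be sketched roughly as follows. Everything here (the throwaway repo, file, subject line, and addresses) is a made-up placeholder, not a real kernel submission:

```shell
#!/bin/sh
# Sketch of the kernel's patch-by-email workflow using a throwaway repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo "example change" > driver.c
git add driver.c
git -c user.name=dev -c user.email=dev@example.com \
    commit -q -m "drm/i915: example fix"

# Turn the latest commit into a mail-ready patch file (0001-*.patch).
git format-patch -1

# In a real kernel tree you would then pick recipients and send the patch
# directly, bypassing GUI mail clients that tend to mangle whitespace:
#   scripts/get_maintainer.pl 0001-drm-i915-example-fix.patch
#   git send-email --to=<maintainer> 0001-drm-i915-example-fix.patch
ls 0001-*.patch
```

The point of `git send-email` is exactly what the comment says: the patch travels as plain text, untouched by a mail client's line wrapping or encoding.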
        
         | finnthehuman wrote:
         | >the real issue here
         | 
         | One phrase that always drives me up the wall is "the real issue
         | here." There can be more than one issue. For anything
         | complicated, there are going to be at least two issues at hand.
         | Banning a university to get the attention of leadership to
         | corral their research department is a valid response to one of
         | the real issues here.
        
           | temp8964 wrote:
           | I don't see the university itself as the real issue here. Sting
           | research is rare by nature; it likely won't happen again for
           | many years (or ever). It's not as if, absent a ban on UMN,
           | someone else from their CS department will run another sting
           | on the Linux kernel...
           | 
           | What is far, far more important is the overwork problem of the
           | kernel team. And a malicious attack from bad actors is a far
           | more important threat than an improper research study.
        
         | splithalf wrote:
         | How about the jokers at UMN make an effort to "help out"?
         | Sheesh. If the publicly funded universities can't lend a hand
         | or do their fair share, what hope have we that big tech will
         | come in and save open source? Do the universities have money
         | problems?
        
         | CrankyBear wrote:
         | It's a known problem, but getting kernel maintainers up to
         | speed is not an easy job.
        
         | [deleted]
        
         | hinkley wrote:
         | Burnout is a very, very real problem for volunteer
         | organizations in general.
         | 
         | There's also a middle ground where you are doing the most you
         | can sustain. If you run a volunteer group where everyone thinks
         | "I could do more" all the time then there are a lot of missed
         | opportunities.
         | 
         | However, when everyone thinks, "I can't do any more", then any
         | additional insult on top of that becomes a much bigger deal,
         | because it is now 'too much'. For the group's sanity you
         | probably want some fraction to feel underutilized at all times.
         | When 'too much' happens, they get an opportunity to do more,
         | and that extra effort is immediately appreciated.
         | 
         | But even then, sooner or later you'll have two or three dramas
         | that overlap in time and energy and people will get grumpy. Has
         | the kernel team been dealing with some other bullshit this last
         | year? I mean, besides participating in the pandemic with the
         | rest of us, which counts as at least 2 dramas. I think right
         | now you still have to cut people a little slack for getting
         | upset about things you think they shouldn't.
        
         | josefx wrote:
         | > is not a situation created by the group of researchers.
         | 
         | As far as I can tell from various threads on the issue, the
         | university has been involved in at least two major time
         | wasters:
         | 
         | * Breaking the trust model and forcing the maintainers to
         | revert all already accepted patches on suspicion alone
         | 
         | * Running a primitive static code analysis tool that got
         | overeager correcting impossible bugs, without marking the
         | generated patches as tool output.
         | 
         | > Proactively punish the whole UMN does nothing to address the
         | real issue here.
         | 
         | Proactive? The university's review board signed off on that
         | mess; there is nothing proactive about banning a group of bad
         | actors after the fact.
        
           | staticassertion wrote:
           | > * Running a primitive static code analysis tool that got
           | overeager with correcting impossible bugs and not marking the
           | generated patches as such.
           | 
           | This is not a major time waster. It happens constantly -
           | students everywhere are doing this, and the patches are
           | accepted constantly.
           | 
           | Further, while Greg classified the entirety of the analyzer's
           | findings as useless, he was incorrect.[0]
           | 
           | > The university's review board signed off on that mess,
           | there is nothing proactive about banning a group of bad
           | actors after the fact.
           | 
           | In terms of the researchers submitting malicious patches it
           | was done without edu addresses, which means that banning them
           | solves nothing.
           | 
           | [0] https://lore.kernel.org/lkml/b43fc2b0-b3cf-15ab-7d3c-25c1
           | f2a...
           | 
           | In fact, most of the fixes fit into the bucket of
           | "potentially useful".
        
           | SiempreViernes wrote:
           | Proactive probably refers to also banning the 6,700 staff,
           | along with the 46,000 students, who are not part of the
           | research project, from ever contributing.
        
             | monocasa wrote:
             | That staff and student body is governed by the same ethics
             | board that allowed this to happen.
        
               | cycomanic wrote:
               | Facebook experimented on people (some might have even
               | been kernel programmers), but Facebook programmers are
               | still contributing to the kernel. Let's not even talk
               | about all the other organisations who have done some
               | pretty unethical things, I mean lots of corporations who
               | have or still are violating the GPL (on Kernel code even)
               | are allowed to continue to contribute to the kernel.
        
               | monocasa wrote:
               | They didn't experiment on the kernel process itself.
               | 
               | And it would be counterproductive to set up the
               | incentives so that people who aren't contributing in the
               | first place can't.
        
             | zopa wrote:
             | Realistically, if a UMN student submits a patch from their
             | personal, non-.edu email address, will anyone notice or
             | care?
        
         | [deleted]
        
         | slim wrote:
         | Why are you presuming that the problem of trust would be solved
         | by having more paid developers involved? Usually a problem of
         | trust is solved by having fewer people involved.
        
         | dkersten wrote:
         | Maybe it doesn't fix the underlying issues of overworked
         | maintainers, but what the "hypocrite commits" team did was
         | definitely wrong.
         | 
         | I mean, a hacker breaking into a bank for research without
         | explicit permission (i.e. contracts signed, etc.) would result
         | in prosecution. You don't pull a stunt like that.
        
           | wnevets wrote:
           | According to the researchers, they never actually committed
           | any "hypocrite commits". If that is true, then in your
           | example it would be like accusing someone of a crime for
           | discussing how to break into a bank for research.
        
             | dkersten wrote:
             | From the article:
             | 
             |  _In response, the UMN researchers posted an open letter
             | apologizing to the community, followed a few days later by
             | a summary of the work they did [PDF] as part of the
             | "hypocrite commits" project. Five patches were submitted
             | overall from two sock-puppet accounts, but one of those was
             | an ordinary bug fix that was sent from the wrong account by
             | mistake. Of the remaining four, one of them was an attempt
             | to insert a bug that was, itself, buggy, so the patch was
             | actually valid; the other three (1, 2, 3) contained real
             | bugs. None of those three were accepted by maintainers,
             | though the reasons for rejection were not always the bugs
             | in question._
             | 
             | So three submissions were malicious and buggy.
        
               | wnevets wrote:
               | There appears to be a disconnect. Their FAQ says
               | otherwise.
               | 
               | > We did not introduce or intend to introduce any bug or
               | vulnerability in the Linux kernel. All the bug-
               | introducing patches stayed only in the email exchanges,
               | _without_ being adopted or merged into any Linux branch,
               | which was explicitly confirmed by maintainers. Therefore,
               | the bug-introducing patches in the email did not even
               | become a Git commit in any Linux branch. None of the
               | Linux users would be affected. The following shows the
               | specific procedure of the experiment. [1]
               | 
               | https://www-users.cs.umn.edu/~kjlu/papers/clarifications-
               | hc.... [1]
        
               | angry_octet wrote:
               | My understanding from the LWN article is that the patches
               | were rejected by the maintainers, not that the authors
               | pulled them after they passed review.
               | 
               | Lacking informed consent from kernel management (i.e.
               | Greg K-H) or external academic oversight, we can't be certain
               | what they planned; I can't see why they deserve the
               | benefit of the doubt.
               | 
               | CS researchers need to catch up with the life sciences:
               | pre-registered research, trial plans, and external ethical
               | oversight by experts in both the technology and ethics.
        
         | ajross wrote:
         | > Proactively punish the whole UMN does nothing to address the
         | real issue here.
         | 
         | Sorry, but academics using the presumption of good faith that
         | their status brings to do black/gray hat security work using an
         | open source project of critical social importance as a dupe is
         | a very real problem on its own. And it absolutely requires a
         | separate solution. Other academic fields treat research ethics
         | as critically important. Something like this would simply end
         | the career of someone in medicine, for example. Maybe computer
         | science should too.
         | 
         | That the public kernel maintainers need more support and
         | resources is a problem also. But the solution is different. We
         | should be willing to discuss both and not use the latter as a
         | way to make excuses for the nutjobs at Minnesota.
        
           | hinkley wrote:
           | I see this spectacle as punishing the UMN IRB which has
           | demonstrated an inability to prevent issues like this
           | occurring in the future.
           | 
           | If any readers have not learned that the trick to inter-
           | organizational progress is to get the right questions into
           | the right ears, now is a good time to look at that concept.
           | You can tell your own people until you're blue in the face
           | that the customer has a very expensive XY problem, or you can
           | buddy up to one of their engineers and have them ask their
           | management chain for Y, at which point the seas part and you
           | have budget to solve the problem correctly.
           | 
           | The IRB's bosses need to be audited, and as far as I'm aware
           | the IRB hasn't volunteered that it was part of this dustup
           | and promised to do anything about it. Which means someone has
           | to _make_ them do it. Which means their boss needs to care. I
           | don't know how universities are organized, but I'm betting
           | that the president/chancellor of the school has hire/fire
           | control over that group. And if not them, then the board. How
           | do you get the board's attention?
           | 
           | More importantly, if I'm a Gatech student and I don't think
           | my university is being ethical about open source, I can make
           | them care by pointing out how UMN got exiled for doing the
           | same sorts of things we are suggesting doing. If UMN just
           | gets a finger wag, they'll just shrug and wait for theirs.
        
           | temp8964 wrote:
           | Publishing stings are not the UMN team's invention. The
           | Sokal affair [1] is probably the most famous one. There are
           | many others too [2]. In the Grievance Studies affair [3],
           | three researchers submitted 20 fake articles. They must have
           | wasted tons of the review teams' work.
           | 
           | 1. https://en.wikipedia.org/wiki/Sokal_affair
           | 
           | 2. https://en.wikipedia.org/wiki/List_of_scholarly_publishing
           | _s...
           | 
           | 3. https://en.wikipedia.org/wiki/Grievance_studies_affair
        
             | ajross wrote:
             | There's a big difference between publishing a fake article
             | and trojaning the kernel.
        
               | whimsicalism wrote:
               | Not in terms of the reviewers' time.
        
               | [deleted]
        
               | philwelch wrote:
               | The Linux kernel is infrastructure. If it has bugs or
               | security flaws, that directly impacts people's lives.
               | 
               | Intentionally publishing pseudointellectual nonsense in a
               | journal of pseudointellectual nonsense by wasting the
               | time of peer reviewers in pseudointellectual fields of
               | study does not have any direct effect in the real world.
               | The stakes are simply not comparable. To be blunt, even
               | the value of the reviewers' wasted time is not
               | comparable.
        
             | HarryHirsch wrote:
             | Sokal and Pluckrose-Boghossian-Lindsay were polemics
             | against misguided academic practice, you can make a case
             | that these were piss-takes, not research projects.
             | 
             | Note that there were real-world repercussions against
             | Boghossian from his institution.
        
           | hoppyhoppy2 wrote:
           | >Something like this would simply end the career of someone
           | in medicine, for example.
           | 
           | Not so at UMN!
           | 
           | https://en.wikipedia.org/wiki/Death_of_Dan_Markingson
           | 
           | The physician at the center of it is still employed (as a
           | physician) at UMN.
        
             | __blockcipher__ wrote:
             | Thanks for linking to that. I'd never heard of that case
             | before; that is so incredibly horrifying. They should be in
             | jail right now for having a known psychotic patient sign an
             | "informed consent" form.
             | 
             | As an aside, there is such a long history of blatant
             | unethical behavior and pseudoscientific medical procedures
             | throughout the entire history of medicine (and especially
             | so-called "public health" - take AZT as one tiny example),
             | that it shocks me that people have this attitude of "oh
             | yeah bloodletting and lobotomies and clitoridectomies and
             | male circumcision as punishment for masturbation were wrong
             | but everything's fine now and there's no giant
             | institutional problems with the medical orthodoxy anymore"
        
           | benrbray wrote:
           | Isn't the idea that bad faith actors will always tell you
           | they're acting in good faith?
           | 
           | The researchers should be banned from contributing, but the
           | Linux maintainers shouldn't be assuming all contributions are
           | in good faith. Both things can be true. Trust but verify.
        
             | wombatpm wrote:
             | By banning the University, they bring attention to the
             | proper authorities and light a fire under them to resolve
             | the issue. The IRB should not have allowed this to happen.
             | I've worked in the academic survey industry and there are
             | strict rules you must follow with human subjects. Last I
             | heard, the kernel team was still mostly human. You don't
             | get to secretly do things to your subjects.
             | 
             | I believe even the old Folger's commercials where they
             | secretly replaced people's coffee with Folger's Crystals
             | would not pass muster with respect to informed consent.
        
               | benrbray wrote:
               | Oh yes I completely agree UMN deserves the ban. The code
               | review process for Linux kernel commits clearly needs to
               | be improved as well. Hence "both can be true".
        
               | dcow wrote:
               | Did you read the LWN article and the TAB statement?
               | 
               | > The writing of a paper on this research [PDF] was not
               | the immediate cause of the recent events; instead, it was
               | the posting of a buggy patch originating from an
               | experimental static-analysis tool run by another
               | developer at UMN. That led developers in the kernel
               | community to suspect that the effort to submit
               | intentionally malicious patches was still ongoing. Since
               | then, it has become apparent that this is not the case,
               | but by the time the full story became clear, the
               | discussion was already running at full speed.
               | 
               | and
               | 
               | > The LF Technical Advisory Board is taking a look at the
               | history of UMN's contributions and their associated
               | research projects. At present, it seems the vast majority
               | of patches have been in good faith, but we're continuing
               | to review the work. Several public conversations have
               | already started around our expectations of contributors.
               | 
               | The recent shit storm was not caused by a malicious act
               | on the part of anybody at UMN. The "overworked"
               | maintainers made a mistake and ascribed malice to the
               | actions of a good-faith contributor who also happened to
               | be from UMN.
               | 
               | I still see people in this thread assuming that UMN has
               | some history of submitting malicious patches and the
               | wholesale ban is a justifiable response. That response
               | might be justified if some internal ethical system at UMN
               | is broken, but that is not the case, at least here. This
               | history is clean save the one incident regarding the
               | research paper, which has been withdrawn.
        
               | nominated1 wrote:
               | > The recent shit storm was not caused by a malicious act
               | on the part of anybody at UMN. The "overworked"
               | maintainers made a mistake and ascribed malice to the
               | actions of a good-faith contributor who also happened to
               | be from UMN
               | 
               | This is not true and is explained in the article:
               | 
               | > That said, there are multiple definitions of "malice".
               | To some of the developers involved, posting unverified
               | patches from an experimental static-analysis tool without
               | disclosing their nature is a malicious act. It is another
               | form of experiment involving non-consenting humans.
               | 
               | This is the "recent shit storm" and can in no way be
               | described as "good-faith" particularly after the
               | disgusting response from the researcher after being
               | called out on their shit.
        
               | staticassertion wrote:
               | This is so very frustrating, and why I wish the LWN
               | article more strongly explained how wrong Greg was.
               | 
               | > particularly after the disgusting response from the
               | researcher after being called out on their shit.
               | 
               | The researcher was slandered publicly by Greg. Greg
               | repeatedly accuses the researcher of _purposefully_
               | submitting _malicious_ patches, which he NEVER did. Greg
               | also does so in an extraordinarily insulting way,
               | diminishing the student's skillset, somewhat ironically
               | as the student's research did in fact find bugs in the
               | kernel.
               | 
               | The researcher very rightfully responded the way he did -
               | by calling out the slanderous remarks and removing
               | himself from the Linux Kernel, a project that will no
               | longer benefit from his contributions (which, if you
               | bothered to look into, are far better than Greg gave
               | credit for).
               | 
               | Greg is very much in the wrong here. He is the one who
               | made repeated false accusations.
               | 
               | Please do not continue the very unfortunate attacks
               | against a student who was only trying to submit good-
               | faith patches to the kernel.
        
               | nominated1 wrote:
               | I don't appreciate being told what I can and cannot say
               | in this manner. As for attacking people, you're doing a
               | whole lot more than me, which unlike you, wasn't my
               | intent.
        
               | staticassertion wrote:
               | > I don't appreciate being told what I can and cannot say
               | in this manner.
               | 
               | I didn't tell you what to say, I'm pleading with you and
               | everyone else to stop spreading misinformation that is
               | costing an innocent man his reputation.
               | 
               | > As for attacking people, you're doing a whole lot more
               | than me, which unlike you, wasn't my intent.
               | 
               | You called Aditya's response "disgusting".
        
               | HarryHirsch wrote:
               | You omitted the fact that Aditya's advisor was Kangjie
               | Lu, who ran the research project with the bogus patches.
               | It would be natural to assume that anything from that
               | general direction would be more of the same.
        
               | staticassertion wrote:
               | I didn't "omit it", it's irrelevant. It obviously would
               | _not_ be natural since it was _incorrect_, and a natural
               | response would not be to throw blind, incorrect
               | allegations and ridiculous insults due to a suspicion.
               | 
               | Greg overreacted. Linus agrees, other kernel maintainers
               | agree. The only person who won't come out and admit it is
               | Greg himself, and his overreaction has cost the kernel a
               | valuable asset as well as the reputation of an innocent
               | researcher, who now gets comments like "they're
               | disgusting" from people who took Greg's accusations at
               | face value.
        
             | bronson wrote:
             | > the Linux maintainers shouldn't be assuming all
             | contributions are in good faith
             | 
             | They don't. The kernel has been under threat for a very
             | long time. Here's an example from 2003:
             | https://lwn.net/Articles/57135/
        
       | openasocket wrote:
       | > the other three (1, 2, 3) contained real bugs. None of those
       | three were accepted by maintainers, though the reasons for
       | rejection were not always the bugs in question.
       | 
       | Wait, so none of the maintainers were actually fooled by these
       | hypocrite commits? In 2/3 the maintainers explicitly caught the
       | use-after-free bug on their own, and in the third they had some
       | unrelated feedback. Maybe I mis-read their paper, but they made
       | it sound like if they hadn't told the maintainers "wait, don't
       | merge this, it's bad" they would have introduced vulnerabilities
       | into the kernel. But that isn't what happened at all!
       | 
       | Again, maybe I mis-read the paper, but it seems like we can add
       | dishonesty, if not outright fraud, to the list of problems with
       | these researchers.
        
         | iudqnolq wrote:
         | Possibly? It depends on whether you think every patch they
         | submitted was part of the research.
         | 
         | > Thus, one of the first things that happened when this whole
         | affair exploded was the posting by Greg Kroah-Hartman of a
         | 190-part patch series reverting as many patches from UMN as he
         | could find. Actually, it wasn't all of them; he mentioned a
         | list of 68 others requiring manual review because they do not
         | revert easily.
         | 
         | > Most of the suspect patches have turned out to be acceptable,
         | if not great, and have been removed from the revert list; if
         | your editor's count is correct, 42 patches are still set to be
         | pulled out of the kernel.
         | 
         | > For those 42 patches, the reasoning behind the revert varies
         | from one to the next. In some cases, the patches apply to old
         | and presumably unused drivers and nobody can be bothered to
         | properly review them. In others, the intended change was done
         | poorly and will be reimplemented in a better way. And some of
         | the patches contained serious errors; these definitely needed
         | to be reverted (and should not have been accepted in the first
         | place).
        
       | shp0ngle wrote:
       | "It's easier to ask for forgiveness than for permission", huh?
        
         | kuratkull wrote:
         | The older you are, the more you understand the importance of
         | knowing this.
        
       | shockeychap wrote:
       | > But if we cannot institutionalize a more careful process, we
       | will continue to see a lot of bugs and it will not really matter
       | whether they were inserted intentionally or not.
       | 
       | I've always known this to be the case with many FOSS projects.
       | (Remember the Heartbleed bug?) I'm a little surprised it's an
       | issue for something this critical that's used by all of the
       | world's largest technology companies, including all of FAANG.
        
       | returningfory2 wrote:
       | HN meta point
       | 
       | > The old saying still holds true: one should not attribute to
       | malice that which can be adequately explained by incompetence.
       | 
       | This is _exactly_ the comment I left on the original HN thread,
       | which was downvoted [0]. I can understand the kernel maintainers
       | being overwhelmed by the situation and jumping to the worst case
       | interpretation, but it's interesting to see HN commenters doing the
       | same, too.
       | 
       | https://news.ycombinator.com/item?id=26890520
        
         | tester756 wrote:
         | HN got really emotional about this particular matter.
        
           | waiseristy wrote:
           | HN also often suffers the same fate as other social
           | networking sites, where comments that don't align with the
           | masses get downvoted.
        
         | cycomanic wrote:
         | I read your comment and felt the same way. I also thought you
         | didn't deserve the down votes (I think I even gave you an
         | upvote to counteract, even though I hardly ever vote).
        
         | raegis wrote:
         | If I dropped by a UMN researcher's office and asked for 30
         | minutes of their time for something they consider unimportant,
         | they could reasonably reply, "no, I'm too busy with research".
         | Likewise, a kernel developer should not be forced to
         | unwittingly waste their time. The "malice" here is that the
         | researchers may consider their time more valuable than others'.
         | Incompetence is a separate thing.
         | 
         | Lastly, kernel developers are naturally going to trust computer
         | scientists at major universities. The researchers know this
         | (subconsciously or not) and took advantage of it. Patches from
         | smith@joeschmoe.com will automatically get more scrutiny.
        
           | cycomanic wrote:
           | But that is not what happened. In the bug submission
           | experiment they used Gmail addresses, not their UMN addresses.
           | The whole thing came up when one of the students (I think he
           | was a graduate student working in the research group)
           | submitted patches from his overeager static analysis tool
           | using his uni address. So if anything, the work from the uni
           | address received more scrutiny than the other patches.
        
             | raegis wrote:
             | OK, agreed, I was wrong on my second point.
        
             | [deleted]
        
       | dcow wrote:
       | So, basically, Greg is kind of a jerk too. I mean I respect the
       | dude for all the work he does, but man, his back and forth with
       | an honest person and his rush to ascribe malice look pretty bad
       | in this light. An apology might be in order.
        
         | tofuahdude wrote:
         | What? Greg's open-source work was maliciously attacked by the
         | exact same actor, who then submitted incompetent patch
         | requests. I think he responded in a totally appropriate manner.
         | 
         | He does not owe these people anything - and certainly not
         | kindness after their track record.
        
           | dcow wrote:
           | It wasn't the same person and it wasn't an attack. Read the
           | LWN article.
           | 
           | > The writing of a paper on this research [PDF] was not the
           | immediate cause of the recent events; instead, it was the
           | posting of a buggy patch originating from an experimental
           | static-analysis tool run by another developer at UMN. That
           | led developers in the kernel community to suspect that the
           | effort to submit intentionally malicious patches was still
           | ongoing. Since then, it has become apparent that this is not
           | the case, but by the time the full story became clear, the
           | discussion was already running at full speed.
        
             | tofuahdude wrote:
             | By "actor" I was referring to the institutionally
             | untrustworthy UMN, but since you seem interested in
             | splitting hairs, you should probably realize in doing so
             | that Aditya Pakki === Aditya Pakki.
             | 
             | Notice that Aditya is one of the signers of the apology
             | letter: https://lore.kernel.org/lkml/CAK8KejpUVLxmqp026JY7x
             | 5GzHU2YJL...
             | 
             | And that several of the identified vulnerabilities were
             | tied to Aditya: https://lwn.net/ml/linux-
             | kernel/YH+zwQgBBGUJdiVK@unreal/
        
               | re wrote:
               | Aditya's commits--though some were low-quality and/or
               | buggy, including the one you linked--were not part of the
               | intentionally deceptive "hypocrite commits" project.
        
         | re wrote:
         | I don't think Greg was totally off base, though I do think he
         | could have been more restrained. Aditya was essentially
         | spamming the reviewers with bad patches and doing zero due
         | diligence, including, significantly, not responding to anyone
         | pointing out flaws in the commits[0]; he did get the benefit of
         | the doubt at first and had the opportunity to clear things up.
          | Ultimately, though, he was collateral damage from his
          | advisor's involvement with the previous study. One of the
          | risks of deception studies, especially when performed
          | unethically or irresponsibly, is precisely that loss of
          | trust.
         | 
         | Edit: While I don't think that suspicions or distrust were
         | unreasonable, Leon's declaration that the commits were a direct
         | continuation of the "hypocrite" research did escalate things
         | unnecessarily[1]. I'm disappointed, though not surprised, by
         | how many people--both commenters and tech news outlets--ran
         | with that as the story at face value.
         | 
         | [0]
         | https://lore.kernel.org/lkml/20210407155031.GA1014852@redhat...
         | https://lore.kernel.org/lkml/bd3c84bc-6ae0-63e9-61f2-5cf64a9...
         | https://lore.kernel.org/lkml/20210407153458.GA28924@fieldses...
         | 
         | [1] https://lore.kernel.org/lkml/YH+zwQgBBGUJdiVK@unreal/
        
       | mjw1007 wrote:
       | I think that's an evenhanded writeup of the situation, but I also
       | think the author should have mentioned in the article that he is
       | himself a member of the technical advisory board whose actions
       | he's reporting on (if
       | https://www.linuxfoundation.org/en/about/technical-advisory-...
       | is up to date).
        
       | shockeychap wrote:
       | > Consider, for example, the 5.12 development cycle (a relatively
       | small one), which added over 500,000 lines of code to the kernel
       | over a period of ten weeks. The resources required to carefully
       | review 500,000 lines of code would be immense, so many of those
       | lines, unfortunately, received little more than a cursory
       | looking-over before being merged.
       | 
       | I am more than a little astounded that new functionality is being
       | added to the kernel THAT fast.
        
       | einpoklum wrote:
       | > The corollary -- also something we already knew -- is that code
       | going into the kernel is often not as well reviewed as we like to
       | think.
       | 
        | This may be true, but it's not really a corollary of what
        | happened. In fact, it seems there aren't the resources to
        | determine how many of the patches are actually bad and should
        | really be reverted.
        
       | re wrote:
       | Some of the past related threads:
       | 
       |  _"They introduce kernel bugs on purpose"_ -
       | https://news.ycombinator.com/item?id=26887670 (1954 comments)
       | 
        |  _UMN CS&E Statement on Linux Kernel Research_ -
       | https://news.ycombinator.com/item?id=26895510 (320 comments)
       | 
       |  _Open letter from researchers involved in the "hypocrite commit"
       | debacle_ - https://news.ycombinator.com/item?id=26929470 (374
       | comments)
       | 
       |  _Linux Foundation's demands to the University of Minnesota_ -
       | https://news.ycombinator.com/item?id=26955414 (168 comments)
        
       | slaymaker1907 wrote:
       | I really think this whole thing would make for entertaining
       | content on one of the YouTube drama channels. Overall, I thought
       | the kernel team was right to be upset about being experimented on
       | without consent, but the exchanges on both sides had some notable
       | incidents of immaturity. I didn't think the attacks against the
       | static analysis guy were totally justified. They were calling him
       | an idiot for putting in an unnecessary null check. To be clear,
       | he then also overreacted, and that really made me think that it
       | only takes one side to start de-escalating things.
       | 
       | The most disappointing response IMO was from the University
       | itself. I didn't like how they deflected off the issue of why
       | there was no IRB questioning of this. Just because human
       | experimentation is narrowly defined federally doesn't mean you
       | can't have higher standards.
        
       | belinder wrote:
        | So the paper is not being published? Then what was the whole
        | point of the research?
        
         | jasonjayr wrote:
         | They were forced to retract the paper as a result of this
         | debacle.
        
         | kjs3 wrote:
         | The point was to get a paper published. That's it.
        
       | toss1 wrote:
       | Key point:
       | 
       | >>"kernel maintainers (and maintainers of many other free-
       | software projects) are overworked and do not have the time to
       | properly review every patch that passes through their hands. They
       | are, as a result, forced to rely on the trustworthiness of the
       | developers who submit patches to them. The kernel development
       | process is, arguably, barely sustainable when that trust is well
       | placed; it will not hold together if incoming patches cannot, in
        | general, be trusted."
       | 
       | Yet these UMN "researchers" deliberately chose to violate that
       | trust.
       | 
       | Are they so ethically clueless and arrogant that they didn't
       | think to review the plan? Did they hold any kind of ethical
       | review and pass it, or fail it and decide to proceed anyway?
       | 
       | In any case, they've now alerted the entire world that the Linux
       | dev process and OS dev processes in general are incredibly
       | vulnerable to malicious actors.
       | 
       | This vulnerability is also shown by another #1 HN story[1]
       | 
       | [1] https://blog.netlab.360.com/stealth_rotajakiro_backdoor_en/
        
         | detaro wrote:
         | How does the netlab link have anything to do with Linux
         | development processes?
        
           | toss1 wrote:
           | >> "... a backdoor targeting Linux X64 systems, a family that
           | has been around for at least 3 years."
           | 
           | The point is that OS systems fundamentally depend on the
            | trustworthiness of the contributors and the diligence and
           | bandwidth of the maintainers.
           | 
            | Both are examples showing that the diligence and bandwidth
            | are not infinite, and can be relatively easily overwhelmed
           | outmaneuvered.
           | 
           | The UMN hack showed how easily the trust can be violated, and
           | the netlab 3 year lifetime before discovery shows how easily
           | flaws can stay hidden.
           | 
           | The saying is "many eyes make shallow bugs". In an ideal case
           | and with systems that are small relative to the body of
           | people maintaining and actively scrutinizing it, that's true.
            | The reality is that the scale of Linux and most OSes is now
            | well beyond the set of people scrutinizing them. How much
           | source code did you review before you installed your latest
           | build?
        
             | kuratkull wrote:
             | You are confused about the kernel vs. userland.
        
             | detaro wrote:
             | The "backdoor" is a program running on Linux. It's not a
             | modification to any system component. "will run arbitrary
             | binaries" is not an issue with lack of trustworthiness.
             | (The way it came onto that system might be, but we don't
             | know anything about that)
        
       | endisneigh wrote:
        | My takeaway from this affair is how we again and again
       | overgeneralize. It never made any sense to blame UMN in general
       | for an action taken by arguably only a single department, and
       | even holding them culpable is a stretch (this is in response to
       | some people in another thread saying no one from UMN should ever
       | be able to contribute to the kernel ever again to teach them a
       | lesson).
       | 
       | It's not really surprising a bunch of academic-types were too far
       | removed from open source etiquette - that removal is pretty much
       | the source of the drama (this entire thing would've been avoided
       | if a few key people were informed ahead of time).
       | 
        | My other takeaway is that social engineering still seems to be
        | the most powerful kind of attack. From this to the Twitter
        | fiasco, it's worth trying to make your workflow resilient
        | against social engineering - by the author's own admission, the
        | process
       | leans a bit too much towards pre-established trust and
       | credibility. Unfortunately that trust isn't immutable, nor is it
       | everlasting so it's a spot that can be exploited.
        
         | kayodelycaon wrote:
         | > It never made any sense to blame UMN in general for an action
         | taken by arguably only a single department
         | 
         | UMN is ultimately responsible for the actions of the people
          | under their organization. This includes oversight and dealing
         | with the actions of those people when they have crossed lines.
        
           | castlecrasher2 wrote:
           | Agreed, and I'm stunned this is even in question.
        
         | sidr wrote:
         | If the university doesn't review research methodologies that
         | directly impact others (I'm not talking about societal impact
         | of the work down the line), then they should be considered bad-
         | faith actors. This is the equivalent of punishing a university
          | for not having an IRB for human experiments.
         | 
         | I assume individual members from UMN may still contribute
         | without a UMN address. They just won't be given any special
         | consideration (implicit or explicit) as belonging to a
         | particular organization.
        
           | detaro wrote:
           | By that standard, basically no university meets your
           | threshold. (in other places the IRB might have decided
           | differently, but also that's not extremely likely)
           | Researchers are just not monitored that closely usually.
        
             | hn8788 wrote:
             | They'd already complained to UMN about the research, and
             | nothing was done about it. It wasn't just the IRB that
             | allowed it to continue.
        
               | cycomanic wrote:
                | What do you mean "continue it"? The research of
                | submitting buggy patches had already ended (they even
                | had a draft paper). The whole row started because one of
                | the researchers submitted patches from an in-progress
                | (and apparently not very good) static analysis tool. The
                | bad quality of the patches led GKH to accuse the
                | submitter of continuing the experiment. As the article
                | states, this was not the case.
        
               | detaro wrote:
               | Is that the case? I hadn't seen anything conclusive that
                | that had actually happened. (Some people said they
                | contacted the IRB, but presumably that was part of what
               | triggered the IRB to look at it)
               | 
               | But looking into this and clearly documenting it is part
               | of what UMN needs to do.
        
               | hn8788 wrote:
               | In the most recent email chain where they banned UMN, GKH
               | said that they were going to have to complain to the
               | university again. From what I understand, the IRB review
               | was initiated by the researchers themselves after
               | receiving backlash from the original paper.
        
             | jldugger wrote:
             | >By that standard, basically no university meets your
             | threshold.
             | 
             | The question then, is what to do about this much larger
             | scale problem, not to simply assert the premise is wrong.
        
             | HarryHirsch wrote:
             | But researchers don't operate within a vacuum, research is
             | a social effort. The faculty member and the graduate
             | student who dreamt up this line of research have discussed
             | it with others and they all thought it was a splendid idea.
        
         | namelessoracle wrote:
         | Agree or disagree with whether it should have been done but
         | blaming UMN was a social tactic to embarass the University so
         | that UMN would apply pressure to the researcher and deal with
         | them.
        
           | mturmon wrote:
           | The social tactic could have been changed from "we will
           | revert all recent patches originating from UMN" to "we are
           | forced to review all recent patches originating from UMN, and
           | to enforce stricter scrutiny of UMN patches going forward."
            | The second is more honest and also applies social pressure.
        
             | hmfrh wrote:
             | The second option also doesn't result in any big negatives
             | for UMN associated personnel because the kernel devs take
             | on all the work of enforcing stricter scrutiny.
        
             | raegis wrote:
             | That sounds like a good first warning. But they persisted
             | even after Greg KH complained to UMN and told them to stop.
             | A more severe punishment for a second offense is needed,
             | IMO.
        
           | infogulch wrote:
           | > social tactic
           | 
           | This, but not derisively. Note that the university's IRB
           | granted an exception for this behavior, and neither UMN nor
           | the department responded with consideration for the concerns
           | raised by the kernel team until _after_ they banned the whole
           | domain.
        
         | einpoklum wrote:
         | > It's not really surprising a bunch of academic-types were too
         | far removed from open source etiquette
         | 
         | Umm, there's etiquette in academia too you know... even FOSS
         | etiquette in academia.
        
           | shadowgovt wrote:
           | And that fact highlights the mistakes made here.
           | 
           | A fellow researcher intentionally pushing bad data into a
           | colleague's study as some kind of penetration test on the
           | colleague's scientific methodology would be justifiably seen
           | as betrayal, and a possibly bad faith betrayal at that since
           | research grants are a finite resource.
           | 
           | The researchers in the story appear to have failed to
           | extrapolate that sort of collegial courtesy to the open
           | source community maintaining Linux.
        
             | einpoklum wrote:
             | The researchers in this story acted unethically - as
              | academics. I'm saying it's not a culture difference.
        
       | parksy wrote:
       | The situation has echoes of the scholars who a few years back
       | submitted fake papers to journals, some of which passed peer
       | review and were published. That was an attempt to backdoor the
       | scientific method, really.
       | 
        | I don't condone guerrilla penetration testing, whether it's on the
       | Linux kernel or a science journal, and I am sure that the ethical
       | implications will be subject to much outrage and debate for some
       | time to come, but when it occurs and is successful it is
       | concerning because if a small team of scholars can do it, so can
       | anyone else.
       | 
       | The inner workings of the Linux kernel maintenance has been kind
       | of a black box for me for years, I just naively trust it works
       | and my next update won't compromise my system, so it's concerning
       | that this happened and will be following to see how this plays
       | out.
        
       | a_ghoul wrote:
       | After doing some academic research I have a sour taste in my
       | mouth for it, and this makes it even worse. Disappointing from
       | UMN.
        
       | at_a_remove wrote:
       | Twenty years ago, this incident would have been met with a "Your
       | scientists were so preoccupied with whether or not they could,
       | they didn't stop to think if they should." A decade back, "play
       | stupid games, win stupid prizes" would have won out. Now, I think
       | The Kids would say "Fuck around and find out" (this is often
       | accompanied by a stencil of an open-mouthed alligator).
       | 
       | No matter the form of the reaction, I am still left wondering at
       | the proportions of disregard, hubris, naivete, and perhaps sheer
       | lack of consideration of consequences that went into this whole
       | affair. For the life of me I simply cannot imagine anyone patting
       | me on the back and saying, "Good show on that paper, old boy!
       | Smartly done!" Was there nobody in the whole group who had
       | reservations? Were any expressed or were they simply hidden away?
       | I am sincerely curious as to how it managed to get to this point.
        
         | defaultname wrote:
         | I'll be the odd one out and say that I see nothing wrong with
         | their study. There are much more skillful, much better
         | resourced actors who can do what the UMN did much more
          | discreetly.
         | 
         | I mean, the UMN flaws were _egregious_ and blatantly obvious.
         | They should have been met with ridicule at first glance. If
          | they aren't (and they weren't), there are problems that need
         | to be addressed.
         | 
         | But there's a torch mob about this -- a very misled, and I
         | would say foolish torch mob -- and as a result comments like
         | mine will invariably be moderated down to invisible. I guess if
         | we just brigade against some junior researchers it'll help
         | defend the kernel from actually malicious actors...
        
           | rideontime wrote:
           | What a rhetorical power move it is to preemptively decry
           | "torch mobs" for having your comment disapproved of. You
           | might also consider the possibility that you'll get downvoted
           | for suggesting that "ridicule" is an appropriate response in
           | a code review.
        
             | defaultname wrote:
             | Maybe you haven't been paying attention, but the "torch
             | mobs" have been pretty active as this situation has
             | unfolded. There is nothing "preemptive" about it. The
             | zeitgeist is pretty obvious.
             | 
              | As to the "ridicule", putting aside the out-of-place
              | attempt to claim the moral high ground over a rhetorical
              | turn of phrase, yes, their commits were so egregiously
              | flawed that ridicule was the only response.
        
           | simion314 wrote:
            | It seems a more ethical and constructive experiment would
            | have had the students work with historical data. You could
            | derive a metric for how easy it is to slip a bug into the
            | kernel, and maybe identify problematic trends, like times
            | of day/year when bugs can slip in, or time before a
            | release.
            | 
            | Do we really need someone to prove that a malicious backdoor
            | can be added? Since real bugs happen, it is already clear
            | that the system is not perfect.
            | 
            | Imagine if this were allowed: then all contributions would
            | need to be checked not just for the regular kinds of
            | problems, but for all those stealthy tricks. So instead of
            | working on the kernel, you would need to figure out how
            | each commit is trying to trick you, or whether it is fine
            | on its own but part of a bigger plan.
        
           | dang wrote:
           | > _and as a result comments like mine will invariably be
           | moderated down to invisible._
           | 
           | Please don't downvote-bait or superciliously posture like
           | that. It degrades discussion noticeably and breaks the site
           | guidelines: https://news.ycombinator.com/newsguidelines.html.
            | Note the second-last one. Also this one: " _Please don't
            | sneer, including at the rest of the community._" Also this
            | one: " _Don't be snarky._"
        
             | abnry wrote:
             | Okay, but he gets a comment from you and avoids a faded out
             | comment. If his comment was simply faded out, you would not
             | have replied. Frankly, you need to recognize that there is
             | a mob mentality on this topic and that legitimate comments
             | are getting downvoted.
        
             | mumblemumble wrote:
             | I'm not sure which is the greater chilling effect, the way
             | downvoting works out in practice on this site, or having
             | the site moderator dismiss you as "sneer"ing and being
             | "snarky" if you openly acknowledge the downvote's chilling
             | effect.
        
               | oblio wrote:
               | Downvotes on HN are capped at -4. Go to Reddit and notice
               | it's unbounded.
        
           | dsr_ wrote:
           | You completely ignore the issue of consent.
           | 
           | The UMN has placed itself on the same level as those
           | miscreants, because they screwed up in several different
           | ways.
           | 
           | Finally, maturity is, in large part, knowing the difference
           | between having a capability and exercising it. As an
           | institution, the UMN has revealed itself to be immature. What
           | they do about it now determines how they will be treated in
           | future.
        
           | ampdepolymerase wrote:
           | When a state level attacker carries out a successful attack
           | via the kernel, then the developer community will hide behind
           | shocked_pikachu.jpeg and call for more social engineering and
           | red team training for the Linux foundation. Until then,
           | people are happy being complacent and any attempt to change
           | the status quo will be met with hostility.
        
           | kortilla wrote:
           | Your comment will be downmodded because you mentioned it in
           | advance and you haven't actually stated why you think it was
           | fine to abuse the open source maintainers other than "someone
           | else can do it".
           | 
           | "I beat my kids so some weirdo on the street doesn't!"
        
           | balozi wrote:
           | Their particular problem in this case is that they are
           | researchers at a reputable university. They are not supposed
           | to be caught doing this, ever. Poorly supervised students
            | coupled with disinterested faculty advisors is a recipe
            | for this sort of nonsense.
        
             | cycomanic wrote:
             | Actually the supervisor was directly involved in this. I
             | think it's much more the high pressure publish or perish
             | culture, possibly coupled with the pressures of tenure-
             | track hiring (I'm not entirely sure if the supervisor was
             | already tenured).
        
           | zajio1am wrote:
           | > I'll be the odd one out and say that I see nothing wrong
           | with their study. There are much more skillful, much better
           | resourced actors who can do what the UMN did much more
            | discreetly.
           | 
            | Putting forward these patches under false pretenses is
            | deceptive behavior toward other people, and that is
            | problematic regardless of whether the intent is malicious
            | or just doing research.
        
           | floatingatoll wrote:
            | If UMN's ethics review(s) had approved this experiment in
            | cooperation with Linus/Greg, there would be nothing to be
            | outraged about.
           | 
           | The outrage, and the core ethical violation, centers around
           | intentionally wasting the time of others for selfish benefit
           | without appropriate review and consent.
        
             | dekhn wrote:
             | There is no such thing as UMN ethics board, although they
             | do have an ethics office: https://integrity.umn.edu/ethics.
             | I don't think they get individually involved in specific
             | studies unless there's been some sort of compliance
             | violation.
             | 
             | It's also questionable whether the IRB should have
             | considered this human subjects research and not given them
             | an exemption. It's also questionable whether, if the IRB
             | had done that, the IRB would have stopped the study or
             | asked for revisions to the study design (they would if they
             | were paying close enough attention).
             | 
             | Professors at universities are typically given large
             | amounts of freedom to conduct studies without heavy prior
             | approval, it's a tradeoff.
        
               | shadowgovt wrote:
                | And it's a trade-off that the Linux maintainers have
                | now justifiably taken them to task for, since it pushed
                | the cost onto external parties.
        
               | hinkley wrote:
               | As the top level stated, "fuck around and find out."
        
               | floatingatoll wrote:
               | Okay, "UMN's ethics review(s)" encompasses whatever the
               | exact ethical review process(es) are. Updated.
        
               | dekhn wrote:
               | ethics is mostly self-regulated; when you apply for
               | grants, the University ensures you're not proposing to do
               | anything illegal (risk protection for the U), but
               | subsequently doesn't actively monitor most researchers to
               | ensure they are being ethical. Same for publications. In
                | those cases, the response is entirely reactive:
                | investigations and actions come after the problem hits
                | the press.
        
             | tmotwu wrote:
             | > If UMN's ethics board has approved this experiment in
             | cooperation with Linus/Greg
             | 
             | No, informed consent must be with all participants and
             | maintainers reviewing the patches. Why does Linus/Greg get
             | to decide that for others?
        
               | angry_octet wrote:
               | With senior endorsement it would be easy to recruit a
               | pool of participants.
        
               | tmotwu wrote:
               | Yes, opt-in informed consent from maintainers and
               | reviewers of the patches.
        
               | hinkley wrote:
               | I guess that depends on whether you consider this a
               | sociology experiment or white hat work.
               | 
               | I'm not sure that I agree that sociology experiments have
               | 'informed consent' the way you appear to be thinking of
               | it. Yes, you know you're in an experiment, but if you
               | know what the experiment actually is, then your reactions
               | are not authentic and you skew the results (which always
               | makes me wonder about clever people in experiments).
               | 
               | In white hat stories, it's not always the case that
               | everyone knows ahead of time, but 'enough' people know.
               | Those who do know bear part of the responsibility of
               | ensuring that things don't 'go too far', and they give
               | organizational consent but not personal consent. Although
               | I confess that OSS might be a little fuzzy here because I
               | didn't sign anything when I started. You can't tapdance
               | around informing me by pointing to some employment
               | agreement.
        
               | tmotwu wrote:
               | You are free to disagree. Obviously, not every scenario
               | can be navigated using an arbitrary policy for conduct,
               | which was what clearly happened here. 'Informed consent'
               | in the context of cybersecurity research is described in
               | the Menlo Report [1].
               | 
                | And FYI, not all white hat stories are clean in their
                | approaches; that in itself remains a controversial topic
               | for another discussion. Furthermore, employees in an
               | organization are under a different set of contractual
               | obligations, full of caveats, to their employers. In some
                | ways, they've already "consented" to specific bare
                | minimums (white-hat work can be framed as the security
                | awareness training required in your job role).
               | 
               | Open source contributors and reviewers are individual
               | third party actors. No one has established any tolerance
                | limits. So "enough" people doesn't really apply here,
                | because no one was appointed as an arbiter to decide
                | that.
               | 
               | [1]
               | https://www.dhs.gov/sites/default/files/publications/CSD-
               | Men...
        
               | floatingatoll wrote:
               | That is not as cut and dried a decision as you frame it
               | to be.
               | 
               | California emissions testing for vehicles includes
               | licensed smog test stations and a process where
               | undercover inspectors bring cars that are in violation to
               | those stations. If the smog test station is incompetent,
               | they will be cited and perhaps stripped of their
               | operating license.
               | 
               | If another state decided that they'd like to start
               | performing random tests upon their network of smog test
               | stations, without any retaliation to those stations, then
               | it would not be a violation of ethics for that state to
               | send undercover cars through the stations.
               | 
               | It _would_ be unethical to _punish_ those who fail
               | undercover tests, _unless_ the state had announced that
               | random undercover testing was beginning and that
               | punishments would be applied for failures.
               | 
               | The researchers were not attempting to modify the
               | behavior of the participants, nor did they seem to be
               | interested in naming and shaming specific maintainers, so
               | it's not as simple as "anyone who comes into contact with
               | the experiment must be fully informed".
        
               | iudqnolq wrote:
               | A professor is not a government. Also, all governments
                | will use undercover officers without warning first.
        
               | floatingatoll wrote:
               | The analogy is from California to Linus/Greg, not UMN.
        
               | quickthrowman wrote:
               | California controls the licenses for smog test stations.
               | I would imagine there's a clause in the contract that
               | says "California, at any time, may do random undercover
               | inspections of the smog testing facility to ensure
               | compliance" which the owner of the licensed smog station
               | would be aware of.
               | 
               | Do you see how that differs from an academic randomly
               | experimenting on an open source project with no notice or
               | warning?
               | 
               | Retail store owners/managers contract out "mystery
               | shoppers" to test compliance with retail store policy and
               | procedure. This example is also nothing like the UMN
               | experimenting on Linux, since there's a contract and both
               | parties are aware.
               | 
               | A similar example to the UMN/Linux situation would be an
               | academic doctor deciding to randomly test blood donor
               | screening by sending in HIV positive people to lie about
               | their status in order to donate tainted blood and only
               | telling the Red Cross or whoever after the blood has been
               | donated.
        
             | zajio1am wrote:
             | > If UMN's ethics board has approved this experiment in
             | cooperation with Linus/Greg, there would not be anything to
             | be outraged about.
             | 
             | I do not think that would help. This was done on public
             | mailing list and deceptive behavior was also against third
             | parties (other reviewers). I do not think Linus/Greg can
             | give consent to that.
        
               | floatingatoll wrote:
               | Within the scope of the project, Linus and Greg
               | absolutely could deem this acceptable on a project basis.
               | Individual contributors might then leave the project once
               | this comes to light, but if the project owners say "We
               | need you to submit an 8 hour video of paint drying with
               | every commit", that's their right to do so, and if they
               | say "We chose to allow researchers to waste hours of your
                | collective project time for an experiment", that's their
               | right to do as well. If they want to guide the project
               | down seemingly-unproductive paths because they truly
               | think it's worth it, then they will.
        
               | pvg wrote:
               | The difference is one of these gives you the opportunity
               | to consent or opt out of the wasting of your time, the
               | other ones doesn't. You aren't describing ethically
               | equivalent things.
        
               | floatingatoll wrote:
               | With an experiment of this nature, it is not always
               | possible to get universal consent-or-refusal, due to the
               | nature of the experiment. If you're doing an experiment
               | with footpaths and you want to close specific paths to
               | see how people's behavior changes, you _do_ need to seek
               | permission from the owner of the footpaths, and you _do_
               | need to perform some sort of ethics review to ensure you
                | aren't creating severe mobility issues for those being
               | experimented upon, but you would _never_ be expected to
               | seek permission from each person traveling those
               | footpaths to waste up to a minute of their time by
               | closing the footpath they intended to traverse. Whether
               | or not that time waste is acceptable is ultimately the
                | decision of the owner/operator of the footpaths, not of
               | the individuals using it.
               | 
               | So, then, it is similar with a 'secret experiment upon
               | unwitting people' and the Linux project. The
               | owner/operators are Linus and Greg, and as the experiment
               | cannot be pre-announced without tainting the outcomes,
               | they are the ultimate "go or no-go" decision makers --
               | just as the owner/operator of the footpaths would, too,
               | have a right to refuse an experiment upon their
               | participants. The individual Linux contributors who
               | participate in the Linux project have no implicit
               | authority whatsoever in any such consideration, and would
               | not be offered opt-in or opt-out consent prior to it
               | being performed. If the good of the project requires
               | pentesting the processes that contributors operate, the
               | project has every right to do so; it's for the good of
               | the project, and contributors' time spent will have been
               | net valuable even if the specific contributions are felt
               | to have been "discarded" or if the contributors feel that
               | their time was "wasted".
               | 
               | This lack of individual authority in many respects is not
               | comfortable or appealing for open source contributors to
               | consider, but it's critical for us to confront it and
                | learn lessons from it. We do not have perfect authority
                | over how open source projects use our contributions,
                | whether in time, money, or code. Some percentage of our
               | contributions will always end up being discarded or
               | wasted, and sometimes that will be upsetting to
               | contributors. These are real aspects of project
               | participation regardless of whether secret experiments
               | are approved by the project owners/operators or not. I
               | hope that this event helps us develop better empathy for
               | large projects, such as Linux or other operating systems,
               | when they make decisions that benefit the project rather
               | than contributors.
        
               | operator-name wrote:
               | In a previous thread a user suggested the following:
               | 
               | Get consent from the project leaders and announce
               | publicly their intentions and a time window of a few
               | months. Then randomly submit the patches as originally
               | outlined.
               | 
                | Although this would not produce as strong a result, it
                | is far more ethical and similarly effective. Companies
                | use this kind of method all the time.
        
             | temp8964 wrote:
             | Wouldn't Linus/Greg be heavily criticized for such
              | cooperation then? The nature of this experiment precludes
              | obtaining full consent from the process being
              | experimented on.
        
               | phnofive wrote:
                | They likely wouldn't have cooperated with the study, but
               | advised that there is no point to it as there is no
               | mechanism to catch the behavior they described.
               | 
                | It's sort of like asking "What if we volunteered for the
               | parks department and slipped a load of salt into the
               | fertilizer?" - of course bad actors can find new ways to
               | circumvent security.
        
               | floatingatoll wrote:
               | "We found out that we have no process for confirming that
               | we're applying the correct fertilizer to the soil in
               | question, and accidentally salted the earth of a small
               | patch of flowers during the test. While not ideal, the
               | damage is small and contained and repairs are underway."
        
               | floatingatoll wrote:
               | No one's discussing that, and I think that's the most
               | valuable question of all. Was this experiment so
               | worthwhile that it ought to have been approved? Or was
               | the experiment itself so irrelevant that ethical
               | compliance or not, it shouldn't have needed doing? Is
               | random testing of gatekeepers an appropriate process in
               | Linux development?
        
               | cycomanic wrote:
               | Actually I think this is a very good point you're
                | raising. Maybe the kernel community as a whole should
               | consider a way of checking their processes. Now randomly
               | submitting buggy patches is probably not the right way,
               | but there very well might be some interesting research
               | (on the processes) that could be done. So maybe a
                | document that laid out what would and would not be
                | acceptable could be helpful.
        
           | drknownuffin wrote:
           | I agree strongly. People demanding this be pursued in
           | criminal court are seriously getting bent out of shape. "We
           | can mob these people because they're super in the wrong and
           | also utterly helpless to defend themselves against my
           | righteous fury," is ridiculous.
           | 
           | They made an ethical blunder, but this wasn't Tuskegee 2.0.
        
         | sp332 wrote:
         | And it's not too rare, either. Tests of "supply chain attacks"
         | have actually uploaded sample packages to package repos like
         | NPM and PyPI, and the professional researchers collected bug
         | bounties. Meanwhile the volunteer repo admins often take the
         | blame and have to clean up all the fake packages too.
         | https://arstechnica.com/information-technology/2021/02/suppl...
        
           | angry_octet wrote:
           | The difference is that the bug bounties are being collected
           | from an endorsed bug bounty program.
           | 
           | Fake and malicious packages jam public collections regularly,
           | and are not part of white hat research. It's a trash fire
           | which seemingly only gets better with exploits forcing people
           | to change.
        
         | kodah wrote:
         | > Was there nobody in the whole group who had reservations?
         | 
         | I've seen very well-intentioned people do very stupid,
         | seemingly callous, etc things that would make anyone who didn't
         | know them question their quality as a human being.
         | 
         | My answer to these kinds of situations has been to remedy it
         | and move on quickly. The more you dwell, the more possibilities
          | and random connections pop into your head, and the easier it
          | is to believe everybody is just a selfish, evil, etc. person;
          | which is far from the truth if you've chosen to live life a
          | little.
        
         | dylan604 wrote:
         | There are plenty of examples of the entire decision tree being
          | tone deaf in the real world, not just in academia.
          | Oftentimes, this turns out to be because the group involved
          | is not very diverse, so the group "experience" is not as
          | wide. How many
         | times has something with a double meaning been used in
         | marketing where the world gasps and asks "was there not one
         | person involved of ____ race/sex/age/etc?"
        
         | senderista wrote:
         | Wonder if anyone involved was on the tenure track. If so I
         | assume they can kiss tenure goodbye.
        
           | azhenley wrote:
            | It looks like the lead faculty member is likely going up
            | for tenure in a year or two.
        
         | itronitron wrote:
         | >> Was there nobody in the whole group who had reservations?
         | 
         | Based on my experience (not at UMN), reservations expressed by
         | juniors are dismissed and the seniors keep their reservations
         | to themselves, lest they also be dismissed.
        
         | tomc1985 wrote:
         | The academic ivory tower is real. I have seen tenured
         | professors with tenuous grips on the reality outside of
         | academia. Something like this kernel affair seems pretty
         | plausible to me
        
       | eximius wrote:
       | What I want to know is whether the researchers had ever privately
       | asked anyone in the chain of the kernel whether they would be
       | willing to participate as a filter.
       | 
       | Bad example for obvious reasons, but if Linus was given the
       | usernames that would be submitting bad patches and forewarning of
       | which ones were valid and which ones weren't, it could have been
       | done such that the experiment was valid and the commits were not
       | included in the final tree.
       | 
       | I think it is a rather mild restriction to the experiment that a
       | single person must be able to find an excuse to recuse themselves
       | - as simple as "I don't have time to look at this, can someone
       | else" would have worked.
        
       | woofie11 wrote:
       | ... And I'd still like some update on the UMN IRB, which exempted
       | this research.
       | 
       | Graduate students will make mistakes, and as much as this was
       | stupid, reckless, and unethical, graduate school is a time to
       | learn.
       | 
       | Professors will make mistakes. That's less excusable.
       | 
       | IRBs are set up exactly to prevent stuff like this. My experience
          | is they don't work, and UMN's clearly doesn't.
        
         | dekhn wrote:
         | IRBs are set up to prevent costly lawsuits against the
         | university, although they were originally intended to avoid
          | human experimentation like the Nazis (and many eugenicists
          | and other doctors in the US) did.
        
           | HarryHirsch wrote:
           | There are informal mechanisms as well. Someone will say to a
           | hotheaded or greedy faculty member "look, you can't boot M
           | off this patent, three years ago that other faculty member
           | booted N off a few patents, there were issues, and then we
           | had to pay tens of thousands for all the amended filings!"
        
       | carbocation wrote:
       | > _The old saying still holds true: one should not attribute to
       | malice that which can be adequately explained by incompetence._
       | 
       | To the contrary, my takeaway from these events is a further
       | erosion of the credibility of Hanlon's razor.
        
         | edoceo wrote:
          | The razor holds, IMO, because this isn't _adequately_
          | explained by incompetence alone. The crafty/sneaky approach
          | has some hints of malice.
        
           | carbocation wrote:
           | That is actually totally fair. But I will offer a quibble:
            | having to focus on the "_adequately_" part makes the razor
           | less interesting.
        
             | admax88q wrote:
              | But... it's there. Without the "adequately" it's not
              | Hanlon's razor, it's some other razor that is probably
              | less accurate.
        
         | hderms wrote:
         | People jump to Hanlon's razor way too quickly. The mere
         | knowledge that people are likely to brush off things as
         | incompetence instead of malice is a potential tool in the hands
         | of the malicious.
         | 
         | Yes, all things being equal, it's better to err on the side of
         | assuming someone didn't have malicious intent, but some people
         | like to use it like a blunt instrument and the fact they do is
         | dangerous in such a complex world.
         | 
         | Additionally, it's incredibly biased in its application simply
         | because only a good-natured person would be inclined to brush
         | off malicious action as incompetence. Malicious actors would be
         | unlikely to do the same.
        
       | bumbledraven wrote:
       | > _42 [UMN] patches [out of the 190 listed by Greg Kroah-Hartman]
       | are still set to be pulled out of the kernel... some of the
       | patches contained serious errors; these definitely needed to be
       | reverted (and should not have been accepted in the first
       | place).... all of those [42] patches were accepted by subsystem
       | maintainers throughout the kernel, which is not a great result.
       | Perhaps that is a more interesting outcome than the one that the
       | original "hypocrite commit" researchers were looking for. They
       | failed in their effort to deliberately insert bugs, but were able
       | to inadvertently add dozens of them._
        
       | jolux wrote:
       | I am not terribly surprised to learn that code review processes
       | for the kernel are not as stringent as they should be. However, I
       | do wonder whether the kernel's defect rate is significantly
       | different from that of NT or XNU, and whether the situation could
       | be significantly improved. I thought there was large
       | institutional investment in Linux kernel development, but if the
       | existing developers are horribly overworked and stand no chance
       | of properly reviewing every patch, even from existing
       | contributors, clearly the project needs more resources.
       | 
       | I'm glad that this situation has prompted reflection on these
       | issues for kernel developers though. It seems like the best
       | possible outcome.
        
       | maest wrote:
       | > [LWN subscriber-only content]
       | 
       | I'm guessing you weren't supposed to share this link?
        
         | woofie11 wrote:
          | Nope. LWN deliberately built link sharing as a feature: if
          | you subscribe, you can share.
         | 
         | LWN is a very fair and ethical organization. This struck what
         | felt to them (and to much of the community) as a good balance.
        
         | swyx wrote:
         | right below that it says "The following subscription-only
         | content has been made available to you by an LWN subscriber."
         | 
         | feature, not bug
        
           | WA9ACE wrote:
           | I'm so very happy that's a feature, I subscribed to LWN a
           | while back from reading some subscriber only links on HN.
        
         | janfoeh wrote:
         | LWN links are intentionally shareable. See the "Welcome .." box
         | at the top:
         | 
         | > The following subscription-only content has been made
         | available to you by an LWN subscriber.
        
         | [deleted]
        
         | gwd wrote:
         | When (as a subscriber) you make such a link, it says that if
         | such links are "abused", the ability to make them may go away.
         | Having the occasional article posted to HN seems to be OK with
         | them (or they probably would have added a specific request not
          | to post them to news aggregation sites). More likely, it's a
          | wholesale "Free LWN for all" page with weekly subscriber
          | links to all the articles that they're worried about.
        
         | Diederich wrote:
         | FYI: the owner of lwn.net is on record as being ok with people
         | widely sharing subscriber only content links.
        
       | feral wrote:
       | I found the reaction to this surprising.
       | 
       | When Security Researcher tell a company about a bug they've
       | found, and the company reacts badly, this community is usually
       | strongly in favor of the Security Researcher.
       | 
       | The overworked corporate sysadmins don't get a lot of empathy.
        | The assumption seems to be that their software shouldn't
       | have been vulnerable in the first place.
       | 
        | Now here's a security researcher studying how bugs can be
        | smuggled into the Linux kernel. It probably should be secure,
        | and probably should face scrutiny. People have tried to
        | smuggle bugs into it
       | before.[0]
       | 
       | So why is the opinion so strongly against the Security Researcher
       | here?
       | 
        | What's the big difference in principle?
       | 
       | [0] https://freedom-to-tinker.com/2013/10/09/the-linux-
       | backdoor-...
        
         | kstrauser wrote:
         | My opinion: security researchers attack an organization's
         | products, like their code or website or services. UMN attacked
         | the organization's _people_ to test their defenses.
         | 
         | Lots of companies run bug bounty programs and thank you for
         | finding vulnerabilities in their product. Humans don't scale so
         | well, though, and if you pester the hell out of a company's
         | office manager trying to social engineer them, that company is
         | going to be super freaking annoyed at you.
         | 
         | If UMN had analyzed the Linux code to find problems, then
         | patched them, the kernel team would be happy. They didn't. They
         | spammed the human maintainers with a flood of patches and lied
         | about why they were sending them. They conducted an impromptu
         | and unanticipated phishing test, and you just don't do that.
        
         | hyper_reality wrote:
         | One big difference is explained in the article. The fact that
         | code going into the kernel is not as secure as we might hope is
         | already known to the open source community. Maintainers are
         | overworked and none would be surprised if you told them that it
         | would be possible to smuggle in backdoors. This is not a "bug",
         | but an issue with time and resources, and because the
         | researchers attempted to _add_ bugs to demonstrate it just
         | makes it worse.
         | 
         | On the other hand, security researchers are finding
         | vulnerabilities that weren't previously known. They've
         | discovered specific exploitable bugs, rather than introducing
         | new ones. Following disclosure, the company can patch the
         | vulnerabilities and users will be safer. Which makes that a
         | laudable thing to do.
        
       ___________________________________________________________________
       (page generated 2021-04-29 23:00 UTC)