Linus Torvalds: XZ Utils Breach Raises Questions About Trust in Open Source Development

Comments • 86

  • @monad_tcp
    @monad_tcp 23 days ago +55

    3:42 This is a really insightful realization. I wish society would learn that; it keeps becoming more totalitarian in the effort to have lots of rules to catch bad behavior, and as an excuse to scoop up all the data from citizens. We're trading freedom for security and ending up with neither.

    • @grokitall
      @grokitall 22 days ago +2

      Robert Heinlein has a character in The Moon Is a Harsh Mistress who says "you make whatever rules make you feel comfortable, and if I need to break them I will do my best not to get caught." Paraphrasing here, as I don't remember the exact wording.

    • @codingblues3181
      @codingblues3181 16 days ago

      Secular liberalism is as authoritarian as any other, just with different mechanisms.

  • @user-oj7uc8tw9r
    @user-oj7uc8tw9r 20 days ago +74

    I blame Jia Tan every time something breaks when I use Linux

    • @vilian9185
      @vilian9185 19 days ago

      lmao gonna do that too

    • @TomJacobW
      @TomJacobW 4 days ago

      DINKLEBEEEEEERG!

  • @ErikBergfur
    @ErikBergfur 25 days ago +92

    Pity they never addressed the chance that a lot of successful attacks like the XZ one are already active and in use... undetected...

    • @unaimillian
      @unaimillian 23 days ago +36

      The pity nobody cares about is the number of attacks that target proprietary software, as its source code and executables are observed by far fewer people.

    • @Raspredval1337
      @Raspredval1337 23 days ago +2

      @@unaimillian exactly

    • @ErikBergfur
      @ErikBergfur 23 days ago +17

      @@unaimillian Sure, it's an age-old argument. But why use it to ignore the threat? We can control the OSS/FOSS dev process but not the closed-source one...
      The XZ attack shows how the attackers play the long game and are skilled at hiding backdoors in "plain sight".

    • @autohmae
      @autohmae 23 days ago

      @@unaimillian The executables are observed a lot; updates especially have people looking at the changes compared to the previous version, trying to figure out how to break into systems.

    • @angrydachshund
      @angrydachshund 22 days ago

      Agreed, the whole thing is cope, because the FOSS world is already dead; they just can't admit it yet.

  • @jackthatmonkey8994
    @jackthatmonkey8994 24 days ago +26

    It feels similar to logging to me. If you keep logs but don't audit them, then what do those logs do?
    Relying on open source in a context where security really matters feels much the same to me, however daunting that implication is. Though I'd rather live in a world where "it's open source so it's safe" holds up.
    I don't use Arch, btw

    • @whohan779
      @whohan779 24 days ago +3

      You should really give Arch a try. 😉

    • @sergeykish
      @sergeykish 23 days ago +1

      Visibility. For example, the AUR has more visibility than instructions on a forum, and there are people who review PKGBUILDs. Same all the way down.

    • @Dragon905
      @Dragon905 21 days ago +1

      You can use logs and open code if, for example, you are suspicious of a behaviour on your system. Of course logs don't do anything on their own, but in case of doubt, reference the logs or the code!

  • @Nilruin
    @Nilruin 17 days ago +10

    I love how nobody in frame in the audience is looking at the stage and paying attention.

    • @jamesclark7380
      @jamesclark7380 15 days ago

      I think it's more of a monkish respectful bow thing.

    • @TomJacobW
      @TomJacobW 4 days ago +2

      @@jamesclark7380 It's a weird angle; looking up for so long is tough on the neck. In either case, hearing the words is more important than seeing two mouths move.

  • @Turalcar
    @Turalcar 24 days ago +40

    For every attack like this in open source there are a dozen (I made up the number) in closed source that go undetected for years.

    • @gusthomas6872
      @gusthomas6872 24 days ago +7

      You made up the number because it is unknowable! Assuming a large number is probably good in that case.

    • @jfbeam
      @jfbeam 23 days ago +4

      For every one _known_ in OSS, there are many _UNKNOWN._ Have you audited every line of code for every package installed on your Ubuntu desktop? (I didn't think so. Nobody has; not even people who would know what they're looking at.) Just as there are many unknown backdoored commercial programs. Of the two, one is much, much harder to detect, but neither is trivial.

    • @dansanger5340
      @dansanger5340 23 days ago +2

      Another example of the denialism in the open-source world.

    • @philipstephens5960
      @philipstephens5960 23 days ago +2

      @@jfbeam Well, what SHOULD be happening is that every pull request goes through a thorough code review. Even imperfect code reviews should increase the chances of catching malicious code.

    • @autohmae
      @autohmae 23 days ago

      @@jfbeam Actually, even "many" is an unknown.

  • @TeaTree-e8y
    @TeaTree-e8y 24 days ago +5

    Would love to hear some of your thoughts and advice on repurposing content. You seem to have a really good pipeline here, and I'm attempting something similar using the newer tools.

  • @ZombieLincoln666
    @ZombieLincoln666 14 days ago

    Linus is so smart

  • @bloodyorphan
    @bloodyorphan 7 days ago

    Using smaller teams for security and fidelity reasons was the answer for Solaris; it's less true for Microsoft these days.
    I think the better answer is to do better quality assurance, especially for non-standard ports in the base kernel etc., before a release candidate can be considered for actual release. You know the default ports for every service, so make sure spurious ports are not enabled before publishing.
    You can automate something like that (see the sketch below) and have a much greater sense of security as a result.
    Keeping the hours down for said smaller team.
    Bless Ya's
    **EINSTEIN**
    ;-)
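
    A minimal sketch of that kind of automated port audit, assuming a Linux release-candidate image with iproute2's ss tool available; the EXPECTED_PORTS allowlist is a hypothetical placeholder for whatever the real service inventory defines:

        #!/usr/bin/env python3
        """Fail a release-candidate check if unexpected TCP ports are listening."""
        import subprocess
        import sys

        # Hypothetical allowlist: the ports this image is expected to listen on.
        EXPECTED_PORTS = {22, 80, 443}

        def listening_ports() -> set[int]:
            """Return the TCP ports currently in LISTEN state, via `ss -tlnH`."""
            out = subprocess.run(
                ["ss", "-tlnH"],  # TCP, listening sockets, numeric, no header
                capture_output=True, text=True, check=True,
            ).stdout
            ports = set()
            for line in out.splitlines():
                cols = line.split()
                if len(cols) >= 4:
                    # "Local Address:Port" is the 4th column; the port follows the last ':'
                    ports.add(int(cols[3].rsplit(":", 1)[-1]))
            return ports

        if __name__ == "__main__":
            unexpected = listening_ports() - EXPECTED_PORTS
            if unexpected:
                print(f"FAIL: unexpected listening ports: {sorted(unexpected)}")
                sys.exit(1)
            print("OK: only expected ports are listening")

    Run inside the built image (for example as a CI gate), it exits non-zero whenever a service is listening on a port outside the allowlist.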

  • @nixigaj11
    @nixigaj11 24 days ago +17

    sauce: th-cam.com/video/cPvRIWXNgaM/w-d-xo.html

  • @evikone
    @evikone 16 days ago +3

    Trust? Perhaps. Among the programmers of the 70s and 80s (even 90s), a prevailing idea was to never trust anything; being a paranoid programmer was a good thing. We know that the first rule of security is that any security is "only as good as its weakest link", and the weakest link is always "trust". These days, everything is moving to such a high level that no one is paranoid anymore, and everyone trusts everything so long as it appears to be trustworthy. That's my snazzy sassy.

  • @Hagaskill
    @Hagaskill 23 days ago +1

    Did you listen to the audio?

  • @TheRealStevenPolley
    @TheRealStevenPolley 24 days ago +5

    Who is Jia Tan, really?

    • @TheGeorey
      @TheGeorey 23 days ago +30

      The friends we made along the way

    • @ecereto
      @ecereto 23 days ago +18

      Most probably not a real person, but a state-sponsored group.

    • @Daydream_Dynamo
      @Daydream_Dynamo 19 days ago

      The guy/girl behind the XZ security breach fiasco!!!!

  • @notthere83
    @notthere83 22 days ago +5

    I understand that one has to be able to trust maintainers. But maintainers merging malicious code reminds me of something I've been advocating for at companies for years: never approve/merge a PR that you don't understand!
    It is preposterous that there are many people who believe they should be able to just trust someone else's code. Usually this is of course about introducing bugs rather than malicious code, but the principle is the same.
    And then there are those who argue that that's an unreasonable burden on open source maintainers. To which I would say: if you don't want to accept this basic responsibility towards society, you should probably just keep your fun projects, which may or may not cause harm, private.

    • @kazioo2
      @kazioo2 21 days ago +3

      Even experienced programmers don't just see all the bugs in code. A carefully crafted bug hidden in code can be easily missed even by an expert who understands it all.

    • @notthere83
      @notthere83 21 days ago +1

      @@kazioo2 Just because something can happen doesn't mean that you shouldn't make a reasonable effort to avoid it.

    • @notthere83
      @notthere83 21 days ago +1

      @@kazioo2 Automated testing is of course essential too. Yes, something can slip through here as well. But - see above.

    • @foldionepapyrus3441
      @foldionepapyrus3441 18 days ago +1

      When you are talking about a project as complex and diverse as the Linux kernel, no one person, however gifted, could really hold all the interactions of the current version in memory well enough to know there isn't a carefully crafted bit of malicious code sneaking in as they review a submission. At some point with a complex project, since you can't spend a week or four figuring out every detail of every interaction that this one submission out of the forty that have come in has, all you can do is check that the code standard and comments are up to scratch. You simply have to trust the fellows who have earned it for anything to happen at all.
      Which is still better than trusting the fellow programmers employed by your company - those people are there for the wage, and the industry standard seems to mean being frequently treated poorly, ignored by the management and marketing/sales folks, put under insane time pressure, etc. - so they just don't care about anything but getting paid. That makes them ripe targets for getting paid more by slipping something "useful" in. Whereas most open source developers do it because they also use and care about what they are developing (hopefully they are still getting paid as well).

  • @jfbeam
      @jfbeam 23 days ago +23

      "Didn't do it very well." Bull. F'ing. Shit. They did it expertly; that's why it pissed Linus off so much. So much so that EVERY patch EVER submitted from ANYONE with that university's domain was pulled, despite the researchers not using university email addresses. (For the record, they used Gmail.) They reacted _entirely_ out of spite. NONE of their bad code ever actually made it to the mainline kernel; they proved their point by getting maintainers to accept their bad code, and upper levels blindly accepted the commits from those maintainers. If I want to test how blindly you trust shit, I'm not about to tell you beforehand. They proved it was possible - even trivial - and that it could have already happened.
      Sure, these things "get caught"... at some point in time, through sheer, random f...ing luck. How many of these sorts of things have people NOT randomly stumbled into? That we don't know. And that's the point.

    • @timoruohomaki
      @timoruohomaki 23 days ago

      So what would be a better way to do all that at that scale? Of the original UMN study, one patch did make it into repositories but didn't cause any harm, according to K-H. A large number of the patches were generated with software they developed in an earlier project; they fixed some bugs but were not otherwise very high-quality code. It wasn't really a big loss for all of that to be reverted.

    • @coversine479
      @coversine479 23 days ago +24

      I’m pretty sure he meant they didn’t do it well in an ethical sense, not a technical sense

    • @grokitall
      @grokitall 22 days ago

      @@coversine479 There are ways to do such research which protect both the subject and the researchers. They ignored all of them, acting like bad actors.
      Upon discovery, they got roasted for it, and so did everyone in their chain of supervision.
      In the meantime, the kernel was left with the only choice being to block every submission from them until they could audit the credibility of the contributors.
      The reason the maintainers were pissed is that maintainer time is always less than what is needed, and they wasted a lot of it. If they had done the study properly, they could have said upon being discovered: so-and-so knew, and here is the list of patches from only these email addresses. But that process took way too long.

    • @vilian9185
      @vilian9185 19 days ago +1

      lmao wasn't it him who banned the university? Read the mailing list; the university also pissed off the devs there too.

    • @grokitall
      @grokitall 19 days ago

      @@coversine479 No, he meant both. The idea of doing the study was fine, but there are a number of ethical and technical steps that should be taken prior to starting it which they completely failed to even consider.
      The first of which is: should we even do it, and if so, what rules should we set up?
      The standard way to do this is for the university to look at the size of the project and see if it is big enough to absorb any potential harm caused by the study, and to document the potential harm prior to beginning the study so as to minimise it when setting the rules of engagement. They did not do this.
      As this was a code study, the next step should have been to find someone connected to the project who did not do code review, who could be a point of contact and potentially hold a full audit trail of all the submissions. They did not take either step, as far as I have been able to discern. This is what pissed off the devs: having discovered someone who looked like a bad actor and traced them back to the university, it was impossible for a while to determine whether it was a student or faculty, and whether this was a one-off or systematic.
      This is what caused the fallout. Yes, they blocked the Gmail account, but they should then have been able to ask the developer what was going on and get a reply of "here is what we were doing, these people knew about it, and here is every patch involved". They could not do any of that, so the university got blocked until that information could be independently created and confirmed, at which time the university got unblocked.
      They implemented the study protocols so badly that they were not only technically bad and ethically questionable but, since hacking is illegal to some extent in most countries, their behaviour skirted around being criminal. All of these problems would have been caught if a proper review had been done by the university's legal and ethics board prior to starting the project. Not doing so not only slimed the researchers but brought the university into disrepute for allowing it to happen.

  • @MarkHall-cf6ji
    @MarkHall-cf6ji 20 days ago +3

    3:15 is such bullshit. He says having no rules is good because attackers don't follow rules, but they got mad when a school tried to submit malicious code, and that's exactly what bad guys will do too.

    • @vilian9185
      @vilian9185 19 days ago +12

      People were mad about being tested on. Also, he was talking about trust, and the university broke that trust; the university also pissed off the devs on the mailing list, so it's their own fault.

    • @MarkHall-cf6ji
      @MarkHall-cf6ji 19 days ago

      @vilian9185 But he contradicted himself by talking about the benefits of having no fixed rules because the bad guys are likely to use those rules to their advantage. Well, that's exactly what the school did to the Linux dev process, which is only fair imo.

    • @MarkHall-cf6ji
      @MarkHall-cf6ji 19 days ago +1

      @vilian9185 He then twisted it into a positive by pointing out that despite having no fixed rules the devs were able to detect all attempted attacks. This is survivorship bias, as they're only aware of the attacks they've detected. For all they know, the Linux kernel could be teeming with backdoors submitted by bad guys; they just haven't noticed yet.

    • @vilian9185
      @vilian9185 19 days ago +5

      @@MarkHall-cf6ji Fine then, good luck creating rules that only make the devs' job harder and don't stop anyone malicious.

    • @XGD5layer
      @XGD5layer 18 days ago +1

      @@MarkHall-cf6ji Rules are only needed for cohabitation; they don't matter in open source, but they will impact engagement.

  • @multinaute
    @multinaute 19 days ago +1

    never trust CCP