There are some people who won’t touch anything to do with open source projects because they feel it might have security issues. What does open source actually do for security, and how does it change things?

  • PrecisePangolin@lemmy.ml · 8 months ago

    In my opinion it makes a project even more secure. Many eyes are able to inspect the code and review it for known and unknown vulnerabilities. It is a cat and mouse game anyway, you might as well broadcast all the flaws in hopes of people catching them and helping to fix them.

      • Otter@lemmy.ca · 8 months ago (edited)

        I think the argument is usually

        If bad people see the code, they can spot vulnerabilities and exploit them

        But that’s not really how it works, because it doesn’t cost anything to attempt an exploit. People generally aren’t going to read through the code to spot a weakness when they can just run an automated tool that tries common vulnerabilities. Open source or closed source, bad code will fail the same way.
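
        To make that concrete, here’s a minimal sketch (in Python, with a hypothetical target URL and probe paths) of the kind of automated tool attackers actually run: it just fires requests at well-known paths and reports what answers. At no point does it need to read anyone’s source code.

        ```python
        # Illustrative sketch only: low-effort automated probing of the kind attackers
        # aim at any server, open source or not. The target and paths are hypothetical.
        import urllib.request
        import urllib.error

        TARGET = "https://example.com"  # hypothetical target

        COMMON_PROBES = [
            "/.env",          # leaked environment files
            "/.git/config",   # exposed git metadata
            "/phpmyadmin/",   # forgotten admin panels
            "/wp-login.php",  # default CMS login pages
        ]

        for path in COMMON_PROBES:
            url = TARGET + path
            try:
                with urllib.request.urlopen(url, timeout=5) as resp:
                    print(f"{url} -> HTTP {resp.status} (worth a closer look)")
            except urllib.error.HTTPError as err:
                print(f"{url} -> HTTP {err.code}")
            except urllib.error.URLError as err:
                print(f"{url} -> unreachable ({err.reason})")
        ```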

        I see it as a lock. With open source, you know how the internal mechanism is supposed to work, and you can judge how secure it is. With closed source, someone says “trust me” and doesn’t show you how the inside works. It could just be an “if something metal is inserted, unlock” mechanism.

        Ultimately the best thing is to look for open source software that’s been audited. If no one has checked the FOSS code, then you don’t actually know it’s safe. Once an audit has happened, you get the best of both worlds.


        One other concern might be “if it’s open source, then everyone can see my password!”

        Which is just… wrong. Your passwords and other secrets are data you provide at runtime; they aren’t part of the published source code.

        • Otter@lemmy.ca · 8 months ago

          Oh and in practice, companies might pick a closed source paid product over a free and open source one.

          But it’s not the product, it’s the legal/financial agreements. Companies like to externalize the risk instead of taking it on themselves. They like being able to sue someone if things go wrong.

          The other company might be running the FOSS software themselves anyway; they’re just the ones taking on the responsibility.


          Oh and finally, a lot of open source products and protocols are used by closed source companies.

          e.g. the Signal protocol is used by Facebook for some of its messaging features

  • meseek #2982@lemmy.ca · 8 months ago

    People who don’t touch open source are mouth breathers. So next time someone says they won’t use it because it’s FOSS, you know who the weakest link in the building is.

    As others have mentioned, the code tends to be more secure because it’s freely available for review. With closed source, you have no idea what’s going on.

    Look at it this way. FOSS is like a real safe. You can see it. Touch it. Kick it. Punch it. Closed source is like a blanket and I tell you there’s a safe under there. No you can’t touch it or see it. Trust me tho, there’s a safe.

    Which would you store your money in?

    • BaguetteSamurai@lemmy.one · 8 months ago

      In fairness, this is only the case when people are actually inspecting the code. That safe could be a cake that looks like a safe, but if nobody tastes it, there is no real benefit (in terms of security, at least).

      • andruid@lemmy.ml · 8 months ago

        It’s a harder con to build a real-looking fake safe and hope no one actually tests it than to just lie about what’s behind a curtain no one is allowed to look behind.

  • Captain Beyond@linkage.ds8.zone · 8 months ago (edited)

    It doesn’t really. In theory more eyes on the code means more chance for a security bug to be found, either by white hat researchers or black hat exploiters. In practice this doesn’t really pan out; not only are most free software projects small hobbyist endeavors, but even large free software projects with many eyes on them, such as OpenSSL and curl, have had critical security vulnerabilities over the years. When it comes to security issues, having the right eyes on the code matters more than having many eyes.

    The original promise of free software, the four freedoms, is all it guarantees. In my opinion this is enough to prefer free software over proprietary.

    • JustEnoughDucks@feddit.nl · 8 months ago

      Don’t your OpenSSL and curl points prove the opposite? Every program will have vulnerabilities, and those projects had critical security vulnerabilities that were found and fixed.

      But yes, I agree that 95% of open source projects have absolutely zero security testing. That might not matter for some embedded applications, but it matters a great deal for public-facing container plugins, for example. Then again, most closed source software hasn’t been pen tested either.

  • Monkey With A Shell@lemmy.socdojo.com · 8 months ago

    In theory it helps to have multiple people verify the code. In reality, unless a project has wide use and a fairly clean core, it won’t likely get reviewed by anyone. Even so, the author laying their cards out for anyone to look at provides a level of trust on its own.

  • ErwinLottemann@feddit.de · 8 months ago

    the reason why many companies try to avoid open source software is support. they usually can’t throw money at the creator to fix their problems or build custom solutions for them. which isn’t really accurate anymore these days.

    • andruid@lemmy.ml · 8 months ago

      To be honest, I’m a FOSS advocate, but when I recommend software I absolutely mention that getting devs (capable of fixing that software) under an SLA for critical bugs is what they absolutely should do, or else accept the security and operational risk of insecure software.

      • andruid@lemmy.ml · 8 months ago

        This risk extends even more to non-FOSS software, though, since organic fixes can’t happen and the company that owns it HAS to be the one to fix it for you. Not all purchase agreements say they will, and again it is our organizations that bear the risk then.

  • cogitoprinciple@lemmy.world · 8 months ago

    The only potential security issues would be related to a lack of maintenance on a particular project. If an open source tool has not been updated for a year or more, it may have unpatched security vulnerabilities. I usually won’t use something if it has not been updated in a year or more.

    However, people who make that claim seem to subscribe to security by obscurity. They may think that the source code being public makes it more likely to be exploited. But I would say it is a strength, since many people can verify the security of a project and submit patches. In typical proprietary software, a security vulnerability could exist for years without being patched, because no one knows it’s there. It may or may not be exploited in that time.

    It is our responsibility to choose the digital tools we use wisely, and to be mindful of a lack or drop-off of maintenance on a particular project.

  • NuXCOM_90Percent@lemmy.zip · 8 months ago

    In terms of actual vulnerabilities? Probably comes out comparable? You have more eyes which means more opportunities for code review. But that is going to boil down to how rigorous the code review is and whether it is just people rubber stamping “trusted” developers.

    It’s controversial for a lot of reasons, but a couple of years back there was the university professor and his grad student who intentionally introduced vulnerabilities into one of the big projects. I forget at what point it was caught or which project it was, but something like that happens every few years. And it likely happens a lot more than we know about.

    But mostly? When I am assessing software for a production situation, the security of an open source library versus a proprietary one isn’t even on the list. Depending on the company, I investigate the contributors, but that happens whether it is a company or a GitHub page.

    What really matters to me is how critical it is and what the support model is. If a vulnerability takes a week to get properly fixed, or results in significant development slowdowns in the aftermath, it is worthless to me. Whereas a company that is on the hook to go all hands on deck and crunch their developers (because that always helps and doesn’t cause problems down the line…) to fix an issue within N hours? That shit means I don’t lose any sleep when the poo hits the fan.

  • andruid@lemmy.ml · 8 months ago

    Being FOSS doesn’t make something secure, but it does make it more possible for people to actually test and secure it (people with less interest in it being seen as secure, and more interest in it actually being secure).

  • ZenFriedRice@lemmy.ml · 8 months ago

    In college I was told over and over that SECURITY BY OBSCURITY IS NOT ACTUALLY SECURITY. So using thoroughly tested and examined security techniques from open source software is the gold standard.

    There are known secure algorithms that cannot be cracked by modern computers. So there’s no reason to reinvent the wheel and just hope your own scheme is better, despite the decades of refinement and research behind modern open algorithms.
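
    As a minimal sketch of what that looks like in practice, here is a Python standard library example (the parameter choices are illustrative, not tuned recommendations): hash passwords with an open, heavily reviewed KDF like scrypt and compare digests in constant time, rather than inventing your own scheme.

    ```python
    # Sketch: lean on well-studied, openly specified primitives from the standard
    # library instead of rolling your own. Parameters below are illustrative only.
    import hashlib
    import hmac
    import secrets

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Hash a password with scrypt, an open and widely reviewed KDF."""
        salt = secrets.token_bytes(16)
        digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        return salt, digest

    def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
        digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
        # Constant-time comparison avoids leaking information through timing.
        return hmac.compare_digest(digest, expected)

    salt, stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, stored))  # True
    print(verify_password("wrong guess", salt, stored))                   # False
    ```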

  • Zink@pawb.social · 8 months ago

    Nothing, really. Anything you get from the Play Store is just as capable as a FOSS binary. The main difference between open source and closed source is that open source runs the risk of malicious clones being created, with the benefit that anyone can review the code to check for viruses.

    I might be wrong though idk

    • SamXavia@kbin.run (OP) · 8 months ago

      @Zink Yeah, I would guess some people might pass off a piece of software as ‘safe’ when it isn’t, having just ripped the code from an open source project.