Not having a package installed DOES decrease your potential attack surface. But this is more about decreasing the burden of picking a solution. For example, say you are setting up a Kubernetes cluster and need to pick an ingress controller. You can read through the documentation and maybe even check various message boards to figure out which options are good. But you need to sift through the FUD, and you often end up needing an expert to make an informed decision anyway.
Or you can rely on the company you are paying to have already done that and likely have already contracted this out to an expert to figure out which solutions are well maintained and have solid update policies.
Because, getting back to a CVE: some software has a policy of backporting security fixes to the current LTS (or even a few of the previous ones). Others will just tell you to upgrade to the latest version… which can be a huge problem if you were holding at 3.9 until 4.x became stable enough to be worth the massive API changes. A “properly” curated package repository not only prioritizes the former but does so at every level, so you don’t find out you were dependent on some random package maintained by a kid who decided he is going to delete everything and fuck over half the internet (good times).
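The difference between those two policies is visible right in the version string. Here is a minimal sketch in Python, assuming Debian's common convention that stable security/point updates carry a `+debNuM` suffix while the upstream version stays put (real Debian versioning has more edge cases than this regex covers):

```python
import re

def looks_like_stable_backport(version: str) -> bool:
    """Heuristic: Debian stable updates are typically versioned like
    '1.1.1w-0+deb12u1' -- the upstream part ('1.1.1w') is unchanged and
    only the '+deb12u1' suffix bumps, i.e. the fix was backported rather
    than forcing an upgrade to a new upstream release."""
    return re.search(r"\+deb\d+u\d+$", version) is not None

print(looks_like_stable_backport("1.1.1w-0+deb12u1"))  # → True  (backported fix)
print(looks_like_stable_backport("3.1.2-1"))           # → False (plain upstream version)
```

The point of a curated repository is that nearly everything it ships stays on the first pattern: the upstream version (and its API) you deployed against does not change underneath you.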
And yes, you can go a long way by reading the bulletins from the various security researchers. But that is increasingly a full-time job that requires a very specialized background.
Given infinite money and infinite time? Sure, hire your own team of specialists in every capacity you need. Given the reality, you look for a “secure”/“enterprise” OS where you can outsource that and pay a fraction of the price.
As for the 2 am wake-up call: if you have global customers, then “wait until next morning” might mean a full work day where they are completely vulnerable, getting hammered, and deciding that every single loss is your fault because you couldn’t maintain a piece of software. Or, if your customers/data are sensitive enough, a sufficiently bad breach is the end of the company itself (plus an investigation to see who is at fault).
Which all comes back to why this is a non-issue for consumers. Enterprise OSes already exist and are not some evil scheme MS is working toward. And the vast majority of companies don’t even need them (though they really should run them, and consider paying for the support package on top…). So there is absolutely zero reason the “home” version would ever be locked away behind one.
This is what you get when you pay.
Security backports to old versions of software that have fallen out of support.
Or, you know… you can get it for free with Debian, which circles back to my initial argument.
Well, no.
You can get it for free with Debian, or even Ubuntu on an LTS version. Just not forever.
The reason enterprises want to pay money for extended long-term support is so they don’t have to keep jumping major versions (with the possibility of breaking whatever unique environment they had going) every couple of years.
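To put rough numbers on that, here is an illustrative Python sketch using approximate lifecycle dates for one Ubuntu LTS release (the dates are ballpark figures for illustration, not a substitute for the vendor's published schedule):

```python
from datetime import date

# Approximate lifecycle of Ubuntu 22.04 LTS (illustrative figures only):
release = date(2022, 4, 21)    # initial release
free_eol = date(2027, 4, 30)   # ~5 years of free standard support
paid_eol = date(2032, 4, 30)   # ~10 years with the paid extended tier

def years_between(a: date, b: date) -> float:
    """Elapsed time between two dates, in average-length years."""
    return (b - a).days / 365.25

free_window = years_between(release, free_eol)
paid_window = years_between(release, paid_eol)

# Paying roughly doubles the time before a forced major-version jump.
print(f"free: {free_window:.1f} years, paid: {paid_window:.1f} years")
# → free: 5.0 years, paid: 10.0 years
```

That gap is the product being sold: several major-version migrations avoided per decade, each of which risks breaking whatever unique environment was built on top.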
Even the Linux kernel itself scaled back how long it is willing to support LTS releases, leaving long-term users with the work of sourcing backports themselves or constantly validating new releases.
I’m very comfortable running Sid at home, but there the annoyance is limited to one person if I have to spend a couple hours combing through git diffs.
Ten years between OS refreshes is money.