I was thinking about the recent story about DB (Deutsche Bahn) looking for a Windows 3.1 administrator.
A classic issue I’ve seen working in heavy industry is that hardware lasts longer than Windows versions. So 10 years ago, you bought a component for the product you design, or a full machine for your factory, which only came with a Windows XP driver.
10 years later, Windows XP is obsolete; upgrading to a more recent Windows might be an option but would cost a shitload of money.
I therefore have the impression that Linux would offer more control to the professional user in terms of product lifecycle and patch deployment. However, there is always that stupid HW which doesn’t have a Linux driver.
Market share. If you look at the server side though, you find the total opposite.
It does, but in the ’90s/’00s a computer typically meant Windows.
The ops staff would all be ‘Microsoft Certified Engineers’, the project managers had heard Microsoft FUD about open source, and every graduate would have been taught programming via Visual Studio.
Then you have regulatory hurdles. For example, in 2010 I was working on an ‘embedded’ platform on a first-generation Intel Atom. Due to power constraints I suggested we use Linux. It worked brilliantly.
Government regulations required antivirus software from an approved list and an OS that had been accredited by a specific body.
The only accredited OSes were Windows, and the approved antiviruses only supported Windows. Which is how I got to spend 3 months learning how to cut XP Embedded down to nothing.
One thing to also remember is that 15 years ago there was a lot of anti-Linux marketing. To be fair, Linux sucked back then.
Yeah, especially the desktop and interface. That’s changing though; I already saw a lathe running Ubuntu MATE.
It’s happening slowly. A year ago my employer had all PLCs, and we are starting on Linux + Pis.
PLCs just have so many issues and none of them are being resolved.
> Linux sucked back then
Linux already ran the vast majority of the web and internet services back then. I don’t think qualifying it as “it sucked” is particularly accurate. Remember, fifteen years ago Vista was still very current, with Windows 7 having just been released.
Linux was a terrible experience for the standard home user and non-programmer professional users. You could make it work for some stuff, but today I’d feel comfortable telling anyone with basic computer and troubleshooting skills that they can make Linux work for them. Meanwhile, 7 or so years ago I was an engineering student who tried Ubuntu and found it couldn’t do what I wanted and took too much work to do what it could.
Look at your keyboard and you’ll get your answer (hint: the Microsoft symbol).
Microsoft integrated itself into every aspect of the industry (and beyond) to become a monopoly, not just to make technology.
Linux, on the other hand, is just a technology; it doesn’t really care about market share, profits, or being a monopoly.
https://en.m.wikipedia.org/wiki/Embrace,_extend,_and_extinguish
I also look at it as Coca-Cola vs. natural, freshly squeezed juice.
2 main reasons in my view:
- Windows is the de facto standard for desktop and user management, so each corp has at least one guy used to the interface to do first-level debugging.
- Windows comes with support, Linux doesn’t. So corps don’t want to employ a Linux admin “just in case”. That’s the main reason I keep hearing from sysadmins I know.
There’s plenty of support for Linux. RHEL, for example; their entire business is selling support. SUSE, CentOS, and Debian all have people specifically supporting enterprise users.
Sure, there’s not a hotline with a guy on the other end who may or may not be more knowledgeable than a five-year-old post on Reddit or Stack Exchange or wherever… but they do have enterprise-grade support.
> There’s plenty of support for Linux.
This is the way tech is. Once someone gets an idea in their head, it doesn’t go away. The IT guy at my job told me this about two years ago. OK, yeah, buddy, it’s not like I used to sysadmin a RHEL system for years. No support contracts, or IRC rooms, or websites, or books, or man pages, no nothing.
I just stopped arguing with people about this stuff. I get a contract and I give them the best design I can come up with. They tell me they want to use some ancient thing and I give it to them.
I knew a RHEL admin who developed a custom theme for KDE that made everything look like XP… it was all the modern bells and whistles under the covers, but it looked ancient, and for some reason that kept his clients happy.
Oh, I believe it. I have a folder full of schematics named “for morons” on my work computer. It achieves most of the basic functionality of a modern design, except every part of it looks like it was made in 1994 or so. Your tax dollars at work, btw.
Did your buddy make that grass hill thing as well as the background?
Not sure. I made the mistake of telling him I grew up on Red Hat, so he gave me a ~’98 version of GNOME.
You said it: money. They want you to have to buy new.
Yep, a place I was working at had a plotter that only had drivers for Windows ME. Not 98, not 2000; only Windows ME. I gave up after 30 minutes of scouring the internets and left the thing alone. But it was mind-boggling to me that they had a precious and fragile desktop that was the only thing that could talk to that plotter. That was around 2003-2004; VMs were not around much. I hope that the guy who worked there after me cloned the machine into an image.
Others have given good reasons. I just wanted to point out that you are generally supposed to use a dedicated computer that is firewalled or, preferably, airgapped. And never patch anything unless absolutely required.
The main reason is that there is no single Linux operating system. Linux is basically just the kernel. Everything else around this kernel (tools, applications, libraries) is highly customisable and exists in the form of various Linux distributions. The fact that these distributions are very different from each other makes it almost impossible to certify industrial products for “the Linux” operating system. There are just too many variations of it.
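The fragmentation point above can be seen concretely: most modern distributions identify themselves via /etc/os-release, and each one reports a different ID and version, so “certify for Linux” really means “certify per distribution”. A minimal sketch (the sample contents are illustrative, not taken from any real machine):

```python
# Parse /etc/os-release-style data to see which "Linux" we are actually on.
# The sample string stands in for the real file, whose values differ per distro.
sample = 'NAME="Ubuntu"\nID=ubuntu\nVERSION_ID="22.04"\nID_LIKE=debian\n'

info = {}
for line in sample.splitlines():
    key, _, value = line.partition("=")
    info[key] = value.strip('"')

# A vendor certifying "for Linux" would have to handle every (ID, VERSION_ID) pair.
print(info["ID"], info["VERSION_ID"])  # → ubuntu 22.04
```

On a real system you would read the file instead of the sample string; the point is just that the (ID, VERSION_ID) pair is different on every distribution a vendor might have to support.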
Purchasing decisions are often made by people who are not IT experts. They are heavily advertised and lobbied to by Microsoft. Also, the people who make purchasing decisions now won’t have to live with the consequences twenty years down the line. So they go with the easy option.
It should be noted that the hardware doesn’t necessarily last longer than the Windows support. In large server farms that have been running for a long time, it’s normal for the occasional odds and ends to go down and not come back up; hard disks in particular come to mind.
You think Windows 3.1 was old… the IRS’s largest database (and much of the financial world) runs on emulations of the original IBM mainframes, specifically running COBOL.
Also, Linux has a reputation of being used by pirates. A lot of industrial tools are expensive, and the fear is that letting them run on Linux is likely to lead to them being pirated, so corpo dicks freak out over it. The reality is, it’s probably going to be pirated anyway, unless your tool is so obscure, useless, or just downright awful that nobody bothers. But don’t tell them that.
Finally, supporting multiple environments is relatively expensive and leads to increased complexity, so developers focus on the environment they think will be most frequently used by their end users. Which is why most games are “exclusive” to Windows, for example.
At least the whole automobile industry runs on Linux (and now more and more Android)
Linux is in a shitton of products nowadays, but often people don’t recognise it as such.
In terms of the DB / Windows 3.1 story, the explanation is simple: it’s a matter of timing. Windows 3.1 is old as fuck, and while Linux is in theory one year older, it was a hobby project back then and, more importantly, did not yet provide any graphical interface, which is what Deutsche Bahn used Windows 3.1 for: as a graphical interface. While Unix would have been an option, those systems were often hardware-vendor specific (AIX for IBM, HP-UX for HP), and the then-standard supplier Siemens-Nixdorf did not provide its own OS AFAIK. OS/2 was basically still Microsoft at that point, so there was little reason not to use MS.
The other point is the incredibly long development and usage times of industrial equipment: if you start to design a new high-speed train from scratch, it can easily take 15 years from start to finish, and the decision of which OS to use is made rather early on. And that train will then be used for 30-40 years. The complete IT business will change A LOT during that time. And maybe you bring out a newer model but need it to be backward compatible for some reason. And bam, you’re using Windows 3.1 in 2030.
As a matter of fact, I know of at least one nuclear plant still being controlled by a Digital Equipment PDP-11 and one conventional power plant controlled by a Robotron system.
Which brings us to the old “Never change a running system”. If your application works under Windows XP (and usually these things do after so many years), why do you need a new OS? Unlike consumer systems, these systems are often in a walled garden or not connected at all, and there is literally zero reason to change them.
Nowadays things are different: we have many more outside connections in hardware, and Linux/Unix is in many more products than people think. I personally would even be fairly sure that it’s in more products than MS nowadays; since they got rid of CE, there has been a steep decline in their market share AFAIK.
I don’t think it’s the marketing crap; business people like to have a warranty and someone to blame when something goes wrong.
Now, the GPL and MIT licenses explicitly state that there is no warranty, not even an implied warranty.
Because someone in sales ordered it to be done that way.