Old software

One argument I often see for new features or changes in GNU/Linux is a form of ageism: dismissing something because it’s “too old” or “antiquated.”

This is frequently said about components like Xorg. However, this argument overlooks an important fact: just because something is old doesn’t mean it hasn’t been updated. Many of these systems, including Xorg, receive regular modern updates.

Consider that much of what we use today, including the Linux kernel itself, is written in C. C is around 52 years old, having been introduced long before many users were born. C++, the object-oriented extension of C, emerged in 1985. Even desktop environments like GNOME and KDE, which date back to the mid-to-late 1990s, are older than many of today’s users.

The key point here is that age often indicates that something works, not that it’s obsolete. If these technologies were broken or inadequate, they wouldn’t have remained in use for so long. Instead, they’ve been continually developed, extended, and improved over time. Programming languages and system components are not like beat-up old cars—they aren’t designed to be temporary. In fact, many are built to last for decades, if not longer. Take COBOL, for example: developed in 1960, it’s still in widespread use today, particularly in financial systems that many of us rely on heavily.

When considering new programming languages or technologies, it’s important to apply skepticism rather than buying into the hype. Just because something is new doesn’t mean it’s better. Promises of innovation are common, but they don’t always hold up under scrutiny. That’s not to say that new tools are inherently bad, but we should evaluate them critically. Time-tested, proven systems often hold more value than the latest trends, and proponents of new technologies rarely highlight their shortcomings.

9 Likes

I'm an early adopter, but I don't actually disagree with your main point. The catch though is this: COBOL continues to maintain our financial systems because it works and those systems are critical. I am not a coder and can't speak to COBOL's pros and cons, but I do know that it's virtually unused for new purposes, to such an extent that the financial sector has a really rough time replacing retirees.

Yep, COBOL is still getting the job done, and it doesn't need to be replaced. Except... it does what it does fine, but isn't well suited to tons of modern purposes, resulting in a serious dearth of people who've taken it up. Unless you went into college with the goal of programming for the financial sector, odds are that you studied much more modern languages, or C/C++ for their (from what I understand) unparalleled flexibility. I've seen more projects in x86 assembly than in COBOL. If maintaining the old thing becomes extremely difficult because no one knows how anymore, then perhaps evaluating a move is worthwhile. I say that with one big caveat: no critical system should ever move over without its replacement having run in parallel for a WHILE, for real battle testing. At the company I work for, we moved from Python to Go for a number of services handling millions of concurrent users, but we didn't just compile, test internally, and cut over. Load balancing shifted traffic over partway, and we spent time making sure things didn't blow up.
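To make that cutover idea concrete, here's a minimal sketch in Go of the kind of weighted traffic split I'm describing: a small share of live requests goes to the new service while the rest stays on the legacy one. The backend URLs, port, and 10% figure are made-up placeholders, not details of our actual setup, and in practice you'd usually do the weighting in the load balancer itself rather than in a hand-rolled proxy.

```go
// A toy weighted-cutover proxy: roughly newPercent% of requests go to the
// new Go service, the rest to the legacy Python service, so both run side
// by side while the replacement gets battle tested.
package main

import (
	"log"
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		log.Fatal(err)
	}
	return u
}

func main() {
	// Placeholder backends; substitute your real service addresses.
	legacy := httputil.NewSingleHostReverseProxy(mustParse("http://legacy-python-service:8000"))
	replacement := httputil.NewSingleHostReverseProxy(mustParse("http://new-go-service:9000"))

	const newPercent = 10 // start small, raise it as confidence grows

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if rand.Intn(100) < newPercent {
			replacement.ServeHTTP(w, r) // a small slice of live traffic hits the new service
		} else {
			legacy.ServeHTTP(w, r) // everything else stays on the proven path
		}
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The point of that knob is that you can watch error rates and latency at 10%, then 50%, then 100%, and roll back instantly by setting it to zero—which is exactly the "run in parallel for a WHILE" part.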

Anyway, I doubt your real point was the eternal sanctity of COBOL so much as something else. GNOME's recent antics? The feud between System76 and the Linux kernel team? A general trend you've observed?

All of the above?
I see the article you posted - which is a reasonable reference. However, COBOL does not remain in use merely because of declining (can we say "arcane"? :stuck_out_tongue_winking_eye: ) COBOL expertise.
COBOL is extremely efficient at managing vast volumes of data; nothing we have today even comes close.
It has also expanded into niches where, thanks to its longevity, components have become highly tailored and specialized within institutions.

Suggesting that COBOL only remains in use due to one factor oversimplifies it.
Even if there were many COBOL programmers, it would very likely still remain in use, with no viable replacement. And COBOL programmers have among the best job security and pay - so there is a definite incentive for career-minded people to move into it.

2 Likes

Those were actually three separate articles. I wasn't actually saying it's in use because its programmers are hard to find—that's a reason it might fall out of use: the need to maintain the system and inability to find people who can. I didn't (mean to) imply any particular reason it's still in use, but if I were going to, it'd be because it works and replacing it is extraordinarily dangerous compared to most systems.

At least two of the articles I linked were specifically about that demand. What I was suggesting is that given the dearth of qualified developers, if that trend continues, that in and of itself is a reason to consider preparing a new system before it becomes impossible to replace devs in adequate numbers. A stitch in time saves nine: we may not need to move off of COBOL now, and we may not need to in ten years or twenty. But if we hit the point that we just can't fill those positions, full stop, I'd rather have moved to something else before that point than have a .NET guy sweating as he asks ChatGPT how to do X or Y in COBOL. <_<

1 Like

Ah, I see. Yes, I think I must have misinterpreted your meaning.
I am glad it got clarified in the further posts, though, since it drives the point home.

You might be surprised to hear this @Aravisian but I 100% agree with everything you said here. Just because something is old doesn't mean that it's outdated. Old just means the product is tried and true, has had the bugs shaken out, and works as expected.

Just because a product is new doesn't necessarily mean it's better. There are always bugs in new products, problems to be solved, before the product is ready for prime time. Being an early adopter isn't as amazing as one might think.

The majority of the time, I'm the type of person who likes to be a late adopter, for the reasons stated above. Take me, for example: I am still using Zorin OS 16. Why? Because it just works! And personally, as long as the support holds out on it, there's no reason for me to upgrade.

Now yes, there is something to be said that if you buy a brand new computer with the latest hardware in it, there might be kernel-related reasons why you need something newer. Perfect example: when I bought my MSI computer in 2021, I had to use Pop!_OS while I waited for Zorin OS 16 to come out with a new enough kernel to utilize my hardware.

But take notice of what I said above: I had no interest in brute-forcing my way into Zorin OS 12 with hopes that manually forcing a new kernel into it could magically make my new computer work; I'd likely cause more harm than good. So I decided to use Pop!_OS then, because it had a newer kernel at the time.

Two takeaways here...

(1) Don't assume an OS or piece of software is bad simply because it's old; it's tried and true, and likely runs better than a brand-new release.

(2) If what you have now works perfectly fine, then it's better to wait and enjoy what works great now, until the next release. Take me, for example: I am waiting to upgrade until OS 18 is released, because I have no reason to upgrade right now, not when OS 16 works just fine, as long as I have support for it, which I do right now.

In conclusion, yes, it's true that Linux goes way back, to 1960 as @Aravisian said. What I think is truly amazing is how far Linux has come. It used to be a simple command-line tool, and now it's a full-fledged GUI operating system that, in its own way, competes with Windows & iOS, as a new take on how an operating system should work: an OS that works for you.

Because at the end of the day, ain't an OS that works for you, and not for a company's ill-gotten gains, a better way? IMHO, I think it is the better way, the FLOSS & FOSS way. Great post @Aravisian !


I have to clarify this statement, you understand. The Linux kernel was introduced in, I think, 1991. It was joined with GNU, which had started in the early 1980s.

COBOL was released in 1960. And the C programming language still used in the Linux kernel was introduced in 1972.
So, Linux goes way back to 1991.
GNU goes back to 1983, I believe.
The C language, which is utilized by Linux, goes back to 1972...

COBOL is unrelated directly to Linux or Gnu and was used as an example of Old and Still Rockin'.

And just for clarity, GNU was originally written for the problematical Hurd kernel, but Stallman was struggling to devise a working kernel for GNU (GNU's Not Unix). Then Linus Torvalds, an undergraduate who had been using Minix, developed Linux. This became the working kernel for GNU, so it is called GNU/Linux. Linux is NOT the OS. No GNU, no Linux, and vice versa.

Side-note: Hurd Running Under the Bochs Emulator – OSnews

Well, this is a big topic, I would say. I agree that the newest isn't always the best. The great difference here is, in my opinion: time.

Take something like Xorg, for example. It has had years upon years for improvements and to show what it can do. New software doesn't have that. So, theoretically, it should get the time to improve and develop, too. That it isn't perfect at the beginning and has bugs, issues and stuff like that is not unexpected, I would say. The thing is: how will it be later? That is something only time will tell.

On the topic of time, I think it's also worth considering this from the perspective of the era in which competing technologies were created, especially when talking about decades of difference.

For example, Wayland was created in 2008; it will turn 16 years old on September 30th. In those 16 years it still has not managed to match the success that X11 had in the same period of time since its release in 1984. This is in spite of having more experience to learn from, to avoid past mistakes and things like that, having much better technology like faster computers and whatnot, better tooling for writing code, reviewing it, sharing it with others, etc. Not to mention the existence of a global internet to look stuff up at any time...

This is not to say bad things about Wayland or the people working on it. It just goes to show that technologies should be chosen based on merit, and not because of FOMO.

And on the topic of package formats, this was just published a few hours ago by one of Vivaldi's QA engineers, as they have just recently released the Snap version:

1 Like

Great job! :wink:
@Aravisian

I thought Ubuntu was derived from Debian... I'll study more about it

Ubuntu is derived from Debian, but at some point they decided to develop their own package format and nudge .debs out. I can't recall if they backtracked any or not, but if I recall correctly, when 24.04 LTS dropped, people had to manually restore the ability to use .deb files at all.

1 Like

They did it on 20.04, as well.
The Software Store was replaced by the Snap Store.

3 Likes

Sooooooo, it's safe to say the software store disappeared in a snap! Ohhhhhhhh, SNAP!

Chris Pratt Oh Snap GIF


2 Likes

The other item that was dropped was the recognition of GNU/Linux in the bootloader; instead, in their haughtiness, they state Ubuntu with Linux. Remember, "Haughtiness comes before a fall and Pride comes before destruction!"

I remember reading that Debian wanted to set up a Debian OS consortium to work together, and Mark Shuttleworth, owner of Canonical, declined.

Having said that, I have found a blog, which I posted elsewhere, by an eminent British Linux engineer about a link between Debian and suicides, due to bullying and other tactics used against developers in the Debian ecosystem. I never realised until recently that the creator and founder of Debian, Ian Murdock, who met his wife Debra while she worked in a coffee shop (hence the OS name Debian), had been estranged from his wife and children for some time and had committed suicide.

Side note: What is interesting to note is that I am getting more downloads of the unofficial manual for Zorin 15 than for 17!

2 Likes

Thank you all! This community is one of the reasons why I love this Linux world. You are all amazing.
@Aravisian @zenzen @Ponce-De-Leon @StarTreker @swarfendor437 @Locklear93

2 Likes

What an amazing story! I've never heard of this and I follow Linux channels on YouTube.

1 Like

Re: Old software.
Remember, when software was developed back in the last century, hardware was much more restricted than it is now. Code had to be finely crafted to be efficient, to the extent that machine code was used for time-critical tasks. Today's apps are exponentially larger in size than similar apps that ran on the first IBM PCs, but the functionality is not exponentially superior or faster. Just bloated.

3 Likes

Sad, but true. A lot of people still think that because computers are faster today, we shouldn't worry about optimizations. There's obviously some value in using quick, cheap tricks to get things moving forward, but this is not sustainable in the long run.

A while back I saw this video talking about how this is nonsense, with detailed examples of how large companies actually do care about it.

The only problem with this is that companies only put in the effort when there's some return from it. For most companies, it's more important to continue to deliver new features as quickly as possible, as otherwise they fall behind the competition. This means that the consumer does not directly benefit, at least not immediately and definitely not as much.