It has a healthy appetite.
This is a common complaint.
While I can tell you that it’s not likely to change, let me throw out a few reasons that might explain why it’s happening.
Software growth over time
Software grows over time mostly because competition drives new features and capabilities. Market demand favors programs that evolve and add functionality. As software is updated, it’s developed for current, more powerful systems, leaving older machines behind. It’s not about laziness or selling new machines; it’s about competition, complexity, and effort.
It’s mostly about competition
Far and away, the number one reason software only gets bigger is that products compete with each other for customers.
Competition means that program vendors need to add more features and capabilities: more of this, more of that. Over time, it all takes more code to implement those features, capabilities, and bug fixes, including the patches you get from time to time.
Companies rarely remove code or features.1 Over time, they just add, making things bigger.
None of this happens overnight. It doesn’t even happen over a year. But over the years, you start to notice the program that used to be this big and had certain running requirements is now that big and either runs slowly on your old hardware or requires more hardware to do its job.
It’s not about you
I get it: not everybody wants more and more. And yet the market speaks with its wallet and its downloads. Programs that do more are more popular. More people buy them.
That’s the measure that program vendors use.
Programs that continue to grow and innovate get used by more people than programs that stagnate.
That may not describe you, but it does describe the majority, and often the vast majority. Companies that don’t continue to improve their products with features and functionality eventually die away. Speed and stability are selling features, but beyond a “good enough” point, they’re not something into which manufacturers heavily invest resources, because they rarely impact individual buying or retention decisions.
With great power comes… more features
Another contributing factor here is what I’ll call the march of technological progress. Machines sold today are significantly more capable than machines sold just five or ten years ago.
With hard disks in the terabytes and RAM measured in gigabytes, software developers often develop for the latest and greatest systems, or at least the current ones, packing in as much functionality as they can to stay ahead. Older, less capable machines are a shrinking market by definition, so it’s difficult to justify developing software with those machines in mind.
So that leaves those of us with older, less capable machines in a difficult spot.
Realize, though, that we’re contributing to this problem to some degree, because we’re asking more of our machines now than we did five or ten years ago. Maybe we’ve accumulated ten years of email. Maybe we’re doing way more on the internet than we used to. Maybe it’s something else.
And maybe it’s not you, but once again, the needs of the majority of folks have grown. The software we expect to meet those needs has grown accordingly, while our trusty old machine hasn’t.
It’s not laziness
A complaint I get from time to time is that software engineers must be lazy or they’d write better code.
I’m sure a few individuals meet that description, but in my experience, the size of software programs has nothing to do with laziness. It’s mostly about being overworked. With the pressures of creating new features to maintain competitiveness, there’s often little time for anything else. I have had the experience of writing software, wanting more time to make it better (faster, smaller, more stable), and being overruled because the software was considered good enough and needed to get launched in order to compete. And I’ve heard the same from many of my colleagues.
I tend to hear this complaint from people who’ve never (or not recently) written software. The landscape is incredibly, almost inconceivably complex. The effort required to fix this pet peeve would likely be more than you realize. In many cases, it’s not worth delaying and/or destabilizing2 the product.
There’s no conspiracy
Some people believe that software is intentionally designed to force people to buy new machines.
No. Just no.
There are so many alternatives to purchasing new devices that this entire line of thinking makes no sense. Software vendors design to the standards of today’s machines, and if you have yesterday’s machine, then yes, you might be left behind. But the reality is that we all get new computers for a variety of other reasons — at which point we’re caught up and the cycle repeats.
But no one is forcing you to do anything.
Do this
I’m not saying software vendors are always right, always make the correct decision, or always have your best interests in mind.
But by and large, most are simply trying to compete, and that involves many tradeoffs along the way. Size, speed, and stability are often part of those tradeoffs. And for the most part, those decisions are not made lightly, even if you and I disagree with the outcome.3
Footnotes & References
1: Though it does happen, the things removed are typically small.
2: Every change — every change — risks introducing more problems. Always.
3: I’m looking at you, OneDrive “backup”.
Someone always pipes in about Linux on an article like this. This time, it’s me.
With the number of free and web apps available, most people would get along fine with Linux. I recommend Linux Mint. It’s relatively lightweight, has the apps most people need (with thousands more available to download for free), and has a user interface that’s easy for Windows users to adjust to.
The only inconvenience I find is that it’s not as plug-and-play as Windows. For example, I have to manually connect Bluetooth devices, whereas Windows recognizes them automatically, or remembers them once you’ve connected manually.
I’m sure you’re right, and yet... it’s hard to take, because this supposed insatiable yen for shiny new features over any reliability and speed doesn’t accord with the priorities of anyone I know or work with.
My field, and the software I am most familiar with, Geographic Information Systems, is dominated by a major company, and the mode of operation does indeed seem to be like you describe. However, in every shop I’ve worked in, no one was slavering at the mouth for the latest new versions; instead, we would delay installing them for months because we knew our workflows would be disrupted and we’d have a slew of new bugs to deal with. Everyone’s biggest complaints were always about speed and stability. If we needed new features, we wrote them ourselves. Why did we stay with this software? Because it dominated the industry and was what we knew, and we didn’t know enough about any better alternatives. And mostly we were never given any choice. Hardly free-market competition.
I don’t know how it is in other industries, but perhaps those who make the purchasing decisions are vastly divorced from the users’ actual needs and desires.
That’s the problem with technology. It’s like two steps forward and one step backward (but probably more accurately, 20 steps forward and one step backward). Adding new features invariably adds new bugs. I was very happy with Windows 3.11. It was stable and did everything I needed to do. The alternative would have been to say that Windows 3.11 is a fine OS, with no need to upgrade other than to make minor improvements. Or maybe we should have stuck with DOS. An OS has hundreds of features that most people don’t need, but each feature not used by most is still useful for hundreds of thousands of people.
There are a lot of similar issues in the physical world. Some examples:
Unleaded gas. Many people had to get new cars, and until I got a car with a catalytic converter, I had to use an additive to replace the lead that was removed. Lead was a major cause of death and mental illness in the 20th century.
LED bulbs. They significantly reduce the amount of energy consumed and save a lot of money. I know a couple of people who hate them for the kind of light they give off. Technological advances generally help many but negatively affect others.
Interesting article. I have both old and newer computers: up to 10 years old for the old ones, and two to four years old for the newer.
None of them have problems running Windows 10 or 11.
The software I use comes from the dim past as well as today. No problem! I’ll get back to you in another 10 years, should I make it. BTW, some of the old stuff is often more demanding of resources than the latest.
Chris
When I started learning about GNU/Linux, its philosophy was what most impressed me: create one small thing that performs one task very well, then combine many small things to perform much larger tasks.

Software application vendors could/should adopt a similar philosophy: build and sell a base component, then develop modules to add new features/functionality. With this philosophy, software updates would provide security/reliability improvements (managed by a maintenance team), and new feature modules could be produced by a development team. The vendor would still establish a revenue stream by selling new modules to customers who want/need them, and customers would be able to more specifically define which modules (if any) they need, so they have the functionality they require and nothing they don’t.

Ultimately, such a concept could help a vendor obtain more customers. I don’t know how practical this concept would be for the Windows world, but it works very well with GNU/Linux. It’s why Linux-based distributions can work so well on older hardware and provide modern features/interfaces.
Yet another stream of consciousness from
Ernie (Oldster)
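To make the modular idea above concrete, here’s a minimal sketch, written for illustration only; every name in it (the FEATURES registry, the spellcheck module, and so on) is hypothetical, not any vendor’s actual design. The base product ships only the core; feature modules register themselves, and a module that was never installed costs nothing.

```python
# A minimal sketch of "base component plus optional modules".
# All names here are hypothetical.
from typing import Callable, Dict, List

# The base product ships with only this registry and the core loop.
FEATURES: Dict[str, Callable[[], str]] = {}

def register(name: str):
    """Decorator a feature module uses to plug itself into the base app."""
    def wrap(fn: Callable[[], str]) -> Callable[[], str]:
        FEATURES[name] = fn
        return fn
    return wrap

def run(requested: List[str]) -> None:
    """Core loop: run only the modules the customer actually installed."""
    for name in requested:
        feature = FEATURES.get(name)
        if feature is None:
            print(f"{name}: not installed (and costing nothing)")
        else:
            print(f"{name}: {feature()}")

# A separately shipped module; deleting this block removes the feature
# without touching the core above.
@register("spellcheck")
def spellcheck() -> str:
    return "0 typos found"

if __name__ == "__main__":
    run(["spellcheck", "cloud-sync"])  # "cloud-sync" was never installed
```

The point is only that the core’s update surface stays small: each module can be maintained, sold, or skipped independently, which is the maintenance/development split Ernie describes.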
That’s how structured programming works, but it doesn’t guarantee that one module won’t interfere with another. It also invites software bloat: when a module is no longer called, it isn’t always removed, and the dead code accumulates. Even Linux constantly requires upgraded computers to keep up with the OS’s evolution.
I’ve spent decades as a software developer, and never once did I wake up in the morning and say “today I shall write something to degrade the computer’s performance”. However, my bosses sometimes had other ideas.
Once, back in the 1980s, my supervisor was so eager to get the latest version so she could send it to the customer that she was *literally* standing over my shoulder while I wrote the code and compiled it. This was back in the days of 8-inch floppy disks and the CP/M operating system. When the compile cycle finished, she reached past me and snatched the disk out of the drive so she could go to the room with the modem and transmit the >>COMPLETELY UNTESTED<< software to the poor customer.
On another occasion a different boss asked me how long it would take to add a specific set of features to an existing software package. "About six or eight months", I replied. "Well can you do it in two weeks? That's the install date we just promised." Yes, I did manage to add the requested features in just two weeks, but as you might guess the resulting mess was a nightmarish monster of bugs and glitches. It took almost a year to iron out the worst problems, and my boss was never interested in letting me take the time to rewrite the package to do it the right way. After all, the customer had already paid us, so now it was time to move on to other projects.
I do not believe modern managers are one bit better than they were forty years ago. Now, like then, we're lucky the software runs at all.
Program designers don’t do anything with the intention of degrading performance. As new hardware is developed, new capabilities are added. New features are added to operating systems and programs to take advantage of those capabilities.
A few examples: when CDs came out, floppies started to become obsolete. Computers came with CD, later DVD, and even Blu-ray drives. Computers that didn’t support those became obsolete.
When I upgraded an OS, my scanner became obsolete and the scanner company had moved on and no longer updated their drivers for that scanner.
There are many non-computer-related technological advances that render older technologies obsolete.
If we stuck with the “if it ain’t broke, don’t fix it” practice, we’d still be using floppies on DOS.
If a program needs to be big because of added functionality, that’s perfectly OK. But here are a few different perspectives on what makes code big. First, in any “well-written” program, most of the code deals with error detection and recovery: basically, preventing crashes when the user does something unexpected or stupid. Second, blame it on Microsoft (mostly). Today’s code development environments (tools) auto-generate hundreds of lines of code which do nothing functional for the user. Try writing a Hello World program in Visual Studio. Finally, blame it on the internet. Just about every piece of software wants to connect online to do something, anything. I just bought a vacuum cleaner and it wants to be online. WTH?
I have a fridge with WiFi capability. It wants to chat with your vacuum cleaner.
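On the first point above, a quick illustration; this is my own sketch, not from any particular codebase. The happy-path logic of a routine is often a single line, while the error detection and recovery around it ends up being most of the code:

```python
# Happy path only: one line of actual logic.
def average_naive(values):
    return sum(values) / len(values)

# The same routine with the kind of error detection and recovery a
# "well-written" program carries; the defensive code outweighs the logic.
def average_safe(values):
    if values is None:
        raise ValueError("values must not be None")
    if not hasattr(values, "__iter__"):
        raise TypeError("values must be iterable")
    cleaned = []
    for v in values:
        try:
            cleaned.append(float(v))
        except (TypeError, ValueError):
            continue  # skip junk input rather than crash
    if not cleaned:
        raise ValueError("no usable numbers in input")
    return sum(cleaned) / len(cleaned)

print(average_safe([3, "4", None, "oops"]))  # -> 3.5
```

One line of arithmetic grew to roughly fifteen lines of defense, and that ratio is typical once real-world input is involved.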
“That may not describe you, but it does the majority, and often the vast majority.”
Many features aren’t aimed at the majority of users; they are aimed at vendors’ real customers. Microsoft and most other software companies earn their money from corporate users who require capabilities most people have never imagined. Consumer computer manufacturers get Windows at a steeply discounted price, so most of Microsoft’s efforts go toward satisfying corporate customers’ needs.
Though I understand why software is getting larger, why software is developed based upon the capabilities of current hardware, and that developers are some of the most overworked professionals, I find business-imposed arbitrary restrictions obnoxious.
MS refuses to upgrade my OS to Windows 11 because the processor isn’t an 8th-generation processor. My notebook, purchased at a Microsoft Store (why did I do that?????), has an i7-6700HQ (4 cores/8 threads), which is the only component flagged by “PC Health Check”. With the notebook’s 16GB of RAM and 1.5TB of storage, I’ve run multiple Windows 10/11 VMs simultaneously.
https://learn.microsoft.com/en-us/answers/questions/735111/why-windows-11-does-not-support-intel-core-i7-6700?page=2#answers
BTW, love your askleo.com “Find your answer”. I see you addressed this issue a couple years ago:
https://askleo.com/windows-11-is-not-supported-on-my-newer-pc-what-can-i-do/
Unfortunately, time has not yet resulted in MS veering from its course of alienating customers. I’ve been looking around for new hardware, but SO many of the newer notebooks are plagued with non-upgradeable, soldered-on LPDDR RAM. Some notebooks priced at $500-600 have processors with a CPU Mark over 25,000. While these new processors are great and 16GB of RAM is fine for Windows 11, Windows 12 might find 16GB of RAM less than ideal, especially when you want to run multiple VMs like I do. Beyond that, I would want the ability to upgrade a new notebook to more than twice the RAM of my current system, i.e., over 32GB.
So, what to do? I know Windows 11 can be installed in ‘LabConfig’ mode, but I am concerned about the security of that workaround. My next option was Linux, but I feel security MAY be lacking in certain areas. For all of MS’s failures, I’m sure they are not pushing unvetted code to their customers. It seems from my limited understanding of Linux that contributors, far and wide, push updated software to Linux repositories, which are then used to update Linux systems. Bless these contributors, but faith is not enough. I’m much more comfortable with a trust-but-verify approach.
Are you aware of any Linux distros which have corporate-managed ISOs and, probably more importantly, software repositories?
Along with a corporately maintained Linux variant is a need for antivirus software. I read repeatedly that Linux can’t have viruses, but if people can push unvetted software to repositories, maybe Linux users are simply unaware they have a virus.
Red Hat, SUSE, and others have corporate-managed ISOs and repositories.
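On the trust-but-verify point: whichever distro you choose, you can at least check a downloaded ISO against the checksum the vendor publishes on its official site. Here’s a minimal Python sketch; the file name and published hash below are placeholders, not real values.

```python
import hashlib

# Placeholders: substitute the ISO you actually downloaded and the
# SHA-256 sum published on the distro's official site.
ISO_PATH = "downloaded.iso"
PUBLISHED_SHA256 = "0" * 64  # placeholder, not a real hash

def sha256_of(path: str) -> str:
    """Hash the file in chunks so large ISOs don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB at a time
            h.update(chunk)
    return h.hexdigest()

actual = sha256_of(ISO_PATH)
print("OK: checksums match" if actual == PUBLISHED_SHA256
      else f"MISMATCH: got {actual}")
```

Most major distros also sign the checksum file itself with GPG, which protects you even if the download mirror is compromised, so verifying that signature is the stronger habit.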