Tech news and discussions about computers and hardware

Is there a limit on specs beyond which a computer could be useful for your whole lifetime? I mean, a computer from '95 was not useful in 2005, a computer from 2005 was somewhat useful in 2015, and I bet a 2015 computer will still be useful in 2025, so the gap is closing.

What people need to realize is that in the ten years between 1995 and 2005, computers went from 133 MHz Pentiums to the 3800 MHz Pentium 4. You might as well call it a 30× performance gain. In one decade. We will never see that again.

From 2005 to 2015 we went from 65 nm single cores to 14 nm quad cores with a great integrated GPU and an integrated northbridge. Not quite as dramatic, but still very significant. Clock speeds improved in ten years from 3800 MHz to maybe 4500 MHz. Not very impressive at all.

So far, from 2015 to 2021, we have gone from four cores to eight and boosted clock speeds from 4500 MHz to 5100 MHz. Not actually very good at all, if I am being quite honest. Still, IPC on the i9-11900K and 5800X is fantastic, and L3 cache is quite substantial at 32 MB for the 5800X and 16 MB for the 11900K.
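To put those three decades side by side, here is the quick arithmetic behind the comparison. The figures are the ones quoted above, not independently verified:

```python
# Per-decade clock-speed gains, using the numbers quoted in the post.
decades = {
    "1995-2005": (133, 3800),    # Pentium -> Pentium 4, MHz
    "2005-2015": (3800, 4500),   # Pentium 4 -> Skylake-era boost clocks
    "2015-2021": (4500, 5100),   # quad-core era -> i9-11900K boost
}

for span, (start, end) in decades.items():
    print(f"{span}: {start} MHz -> {end} MHz  ({end / start:.1f}x)")
```

The first decade comes out to roughly 28.6×, which rounds to the "30×" above; the next two come out to about 1.2× and 1.1×.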

The chips from 2015 use 40 W to 65 W, while chips today can easily use above 150 W and, in the case of the stock i9-11900K, as much as 218 W. So much for efficiency.

We can’t just keep adding cores and bumping up clock speeds by 100MHz every other generation. We obviously need to creatively integrate. I love the Apple M1 Pro for its efficiency and balance. With the right code enhancement it will eventually play any game you throw at it. The temps and efficiency numbers are good. It does admittedly get demolished by the M1 Max in some tasks but honestly not as many as I might have expected.

In many ways, I think the M1 Pro is a true endgame chip. Buy one today with a 4TB SSD and 32GB and you could realistically use it for ten or fifteen years—assuming it is reliable and the amazing screen holds up.

Another endgame chip I see is the six-core Ryzen 5 5600G. Less so than the M1 Pro, but still something with enough power and efficiency to be a real contender for a solid decade or two. I am a little bit disappointed by the GPU on this SKU, but it has solid basic functionality and is capable of 60 fps gaming on a secondary or home-theater-type box that will fit neatly on a shelf next to a console.


Interesting point made by Brett Bergan


Are you familiar with Moore's Law? See this:
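For reference, Moore's Law as commonly stated (transistor counts doubling roughly every two years) is just compound growth, which a two-line sketch can show:

```python
# Moore's Law as commonly stated: transistor counts double roughly
# every two years. Pure compound growth, no physics involved.
def transistors(start, years, doubling_period=2.0):
    return start * 2 ** (years / doubling_period)

# Ten years at a two-year doubling period is 2**5 = 32x.
print(transistors(1_000_000, 10))  # -> 32000000.0
```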


More interestingly:


I liked reading the interesting things people wrote; it made things clearer and easier for me to understand. Where does the Linux GUI come in? GNOME or Adwaita - Xfce, if I'm not wrong.


That's why my laptop is about to say goodbye. But well, I can't let her go. No replacement yet. hahahaha


The life of a computer is limited by the features included with newer chips as they are manufactured. Some of the newer features offered to make development easier are not backward compatible. Developers choose which features to support and utilize in their applications. A subroutine that uses multiple cores will never get used on a single-core processor, slowing the application down to a crawl. This would technically still be usable, but would you use a program on a laptop that takes ten minutes to render a web page, or half an hour to render a 1080p image?
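The multi-core subroutine example above can be sketched as runtime dispatch: use the cores you find, and fall back to a serial loop on a single-core machine. The function names and the per-tile workload here are hypothetical, purely for illustration:

```python
import os
from multiprocessing import Pool

def render_tile(tile):
    # Stand-in for real per-tile rendering work (hypothetical workload).
    return sum(i * i for i in range(tile * 1000))

def render_image(tiles, cores=None):
    # Detect the core count at runtime unless the caller overrides it.
    cores = cores or os.cpu_count() or 1
    if cores > 1:
        # Parallel path: spread the tiles across the available cores.
        with Pool(cores) as pool:
            return pool.map(render_tile, tiles)
    # Single-core fallback: identical result, just serial (and slower).
    return [render_tile(t) for t in tiles]

if __name__ == "__main__":
    print(render_image(range(4)))
```

Both paths produce the same output; only the wall-clock time differs, which is exactly the "usable but painfully slow" situation described above.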

At some point you will need to get newer hardware, or the machine will be too slow to do much of anything. You may be able to use something that is a few years old, but even then it will get slower, unable to handle the improvements coded into software because of the lack of features on the older hardware. Purposefully or not, developers determine the life-cycle of computers now, instead of manufacturers.


Graphene is very good, but why is it not popular?


I don't know, but my guess would be economics. In just about every field, a breakthrough in the lab doesn't translate to the assembly line - in fact, it rarely does.
I used to do a lot of Radio Control flying. RC hobbyists tend to monitor technology a lot. In that hobby there is a running joke about the numerous "breakthroughs" in battery technology that never make it to the consumer.


Because the material is better than diamond, businesses could not keep selling products.

Likely some other obstacle. Graphene batteries have been on the market and they're not that expensive:

A GUI needs X11 first; then the desktop environments are layered on top of that.


I think that developers need to focus on developing software that runs on the oldest hardware, rather than the newest hardware. That is the problem! Every new piece of software is made to run on the newest flashy hardware! Everyone is just consumin', consumin', consumin'... I think we have it all wrong. The better the code, the better it will run on an ancient battleship. That means fewer upgrades for everyone and a more sustainable planet.

Take the web, for example... when I was growing up, all those webpages were built in HTML/CSS. That was it! Then along came Java and Flash. Then came WordPress with its epic bloat: plugins and all kinds of heavyweight themes, etc. This is not efficient.


This is a good and strong point. E-waste is piling up as quickly as carbon emissions. And the only thing that seems to matter is the bottom line.
If a line is to be drawn, I think it is up to us to draw it, not those that profit from shifting the goal posts.


I wish my laptop had a battery like that, so I could give it to my daughter and she could pass it on to one of her kids, and so on :stuck_out_tongue_winking_eye:


I remember a story from history: tights that never got damaged were also a great product. The design was sold off and never saw the world.
If a product were too good, the business economy could die. OK, food is most important. As for electronics addiction: if it all died, I would go back to books, more hobbies, and more dating. xD


I found the RC thread that lists all the lab successes: Some company announces groundbreaking new battery - again. - RC Groups
The one you cited is linked there too.


I'm always happy to read threads like this. I also enjoyed reading the article that Carmar mentioned:

At the same time it is confronting. For people who are not struggling to survive and keep their families alive, it seems the majority are mostly after economic gains and convenience.

There are no economic incentives to create expensive-to-manufacture batteries that will never be replaced (consumed). Economically, it is preferable to produce products that are consumed (rapidly, but at a rate and price point that is acceptable to consumers) so that long-term economic gains are assured.

For consumer software, there are no economic incentives to optimise code so that it requires less energy to run or can run on older hardware. A new feature, however, might attract new users. More convenience might do the same.

If these are the primary goals, then runtime libraries and packaging formats like snap and Flatpak are great. You can reach more people, you can focus on adding features rather than wasting time on compatibility and troubleshooting, it's convenient for users, it's sandboxed so you don't need to worry about security as much, and finally it's easier (and more economical) to offer support when problems do occur.

For all their merits, they do the exact opposite of trying to optimise for computing resource and energy consumption.

  • They require more storage capacity → more resources required to manufacture storage devices on the developer, distribution, and user side.
  • They require more bandwidth and more energy to transfer between developer and user, for every single update.
  • They require more processing power; all the bundled libraries need to be processed and transferred to (more) memory.
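The storage point in that list is easy to put rough numbers on. This is a toy back-of-envelope model with invented round sizes, purely to illustrate how bundled runtimes scale compared with a shared one:

```python
# Toy model: storage cost of bundled runtimes vs one shared runtime.
# All sizes are made-up round numbers, purely for illustration.
APP_CODE_MB = 20   # the application itself
RUNTIME_MB = 150   # libraries/runtime each bundled package ships

def bundled_total(n_apps):
    # Every app carries its own private copy of the runtime.
    return n_apps * (APP_CODE_MB + RUNTIME_MB)

def shared_total(n_apps):
    # One system-wide runtime, shared by every app.
    return RUNTIME_MB + n_apps * APP_CODE_MB

for n in (1, 10, 50):
    print(n, bundled_total(n), shared_total(n))
```

With one app the two models cost the same; with ten apps the bundled model in this toy example already uses several times the storage, and the gap keeps widening with every app added.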

I know someone would argue that, in this way, you could run a snap version on an otherwise outdated system that cannot run the latest version of an OS. I would say that, if economic restrictions and incentives did not apply, it would be possible to code a version of that program that runs on the outdated system too.

I'm not opposed to snaps and flatpaks, but like streaming music and many other conveniences, it does not come without a cost.

I get a kick out of optimising a 10-year-old PC to the point that it is more responsive and has lower power consumption than a brand-new device, while offering the same functionality. But it takes effort (the opposite of convenience) and offers a less visually and audibly stimulating experience. (Why on earth would you play a video in 720p when you can watch 4K, right? Right...?) Thus, sadly, there isn't much economic incentive to do so.

Sorry this has become a bit of a rant.

Long story short, I wish the lifetime of computers was better, and I think, like Carmar pointed out, we're reaching an economic ceiling for innovation. Computers haven't really become more powerful. They just do things faster so they can do more simultaneously in a time frame that is acceptable to users, and even have the capacity to do so while offering a better user experience. Gains in efficiency are mostly countered by increased usage.

Edit: grammar.


OK, but Jeff Bezos said the human eye can only see 255 dpi or something like that, so is 4K a scam or something? A lot of "future" technology reuses old technology, hoping people will believe in it and buy it so it can be sold at a higher price. Sometimes people say they don't see a difference between cheap and expensive. Sometimes, with some companies, you are just paying for the logo.