Hi, everyone. Before I switched over 100% to Zorin OS, I was a Windows user for a long time. I'm also a macOS user (on a laptop I use when I have to be "mobile"). Now, with Windows (and to a lesser degree, macOS), system optimization is a "thing," as I'm sure many of you know. My question is this: is system optimization a "thing" on Linux, too (i.e., for performance)?
Is it even necessary? I know there are Stacer and BleachBit available. But all Stacer seems to do is clean out space (as its main function). Can I do without Stacer and let the OS handle itself? Does Linux self-optimize, as macOS does? Should I just let it be and merrily focus on day-to-day computing? Are there any things I can do to "optimize" Linux? Thanks to those who respond.
P.S.: Specs are an i5-9400 CPU, 32 GB RAM, and a RX 580 with 8 GB VRAM.
P.P.S.: I also have a 250 GB SSD that I use as my primary system drive for Zorin.
I use that (with code in there to clear ZFS snapshots older than X days) from a keyboard shortcut (Zorin menu > Settings > Keyboard Shortcuts; scroll all the way to the bottom, click the + button).
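For anyone curious about the snapshot-cleanup part, here's a minimal sketch of what such a script can look like. The pool name and retention period are placeholders, not the values from my actual script:

#!/bin/bash
# Destroy ZFS snapshots older than DAYS days. POOL and DAYS are placeholders.
POOL="rpool"
DAYS=14
CUTOFF=$(date -d "-${DAYS} days" +%s)

# -H drops the header, -p prints the creation time as epoch seconds.
zfs list -H -p -t snapshot -o name,creation -r "$POOL" |
while read -r NAME CREATED; do
    if [ "$CREATED" -lt "$CUTOFF" ]; then
        echo "Destroying $NAME"
        sudo zfs destroy "$NAME"
    fi
done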
For defragging, the only sure way is to copy everything from Drive 1 to Drive 2 (or clone Drive 1 to Drive 2), wipe Drive 1 via: sudo dd if=/dev/zero of=/dev/sdxX bs=512 status=progress
... where sdxX = the drive and partition (sda1, sdd4, sdh2, etc.).
... then copy (not clone) everything back from Drive 2 to Drive 1. The OS automatically writes to the first free space, and since it's all free space (because the sectors have been zeroed), it's the same as defragging.
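Put together, the sequence looks roughly like this. One note: zeroing the whole partition destroys the filesystem, so you have to recreate it before copying back. Device names, mount points, and the ext4 choice are placeholders for illustration:

# 1. Copy everything off the source partition, preserving links/ACLs/xattrs:
sudo rsync -aHAX /mnt/drive1/ /mnt/drive2/
# 2. Unmount and zero the source partition (this destroys the filesystem;
#    a larger block size than bs=512 just speeds up the zeroing):
sudo umount /mnt/drive1
sudo dd if=/dev/zero of=/dev/sdX1 bs=1M status=progress
# 3. Recreate the filesystem, remount, and copy the data back:
sudo mkfs.ext4 /dev/sdX1
sudo mount /dev/sdX1 /mnt/drive1
sudo rsync -aHAX /mnt/drive2/ /mnt/drive1/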
With ZFS, you can do the defrag while the system is running live... it essentially removes a drive from the pool, adds it back into the pool, resilvers and scrubs the drive, then initializes the drive by writing zeroes to the free space.
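If you want to see the moving parts, those steps map roughly onto these OpenZFS commands. This is a hedged sketch for a mirrored pool; "rpool", /dev/sda, and /dev/sdb are placeholder names:

sudo zpool detach rpool /dev/sdb            # drop the drive from the mirror
sudo zpool attach rpool /dev/sda /dev/sdb   # re-add it; ZFS resilvers the data
sudo zpool scrub rpool                      # verify every block's checksum
sudo zpool initialize rpool /dev/sdb        # write a pattern over the free space
sudo zpool status rpool                     # watch resilver/scrub progress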
You optimize when you identify some sort of bottleneck in your overall performance or productivity. Optimizing without a reason or goal in sight, just for the sake of doing so, is procrastination.
Well, that's just the thing: it's not necessary until you've identified a reason to do it. By definition, optimizing means taking something that works and making it better.
But here are some areas where you can obtain some gains relatively easily (illustrative commands after the list):
Cutting down on non-native package formats like Snap and Flatpak.
File system, depending on the type of work you're doing.
The kernel itself; newer versions usually bring performance improvements.
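A few sketches for those three areas. The app ID, the fstab line, and the HWE package name are examples, not prescriptions (the HWE name assumes an Ubuntu 22.04 base, as in Zorin OS 17; adjust for your release):

# 1. Package formats: see what Snaps/Flatpaks you run; prefer native .debs.
snap list
flatpak list
# e.g., swap a Flatpak for the distro package (app ID is just an example):
flatpak uninstall org.gimp.GIMP
sudo apt install gimp

# 2. Filesystem: mount options such as noatime skip access-time writes.
#    Example /etc/fstab entry (edit carefully, keep your own UUID):
# UUID=xxxx-xxxx  /  ext4  defaults,noatime,errors=remount-ro  0  1

# 3. Kernel: check what you're running, then opt into a newer HWE kernel.
uname -r
sudo apt install --install-recommends linux-generic-hwe-22.04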
Depending upon how much memory you've got (if you've got scads of it, there's no need), you might look into NoHang, the Linux low-memory handler.
I've got 64 GB now, so I don't use it, but back when I only had 12 GB I did, and I tried to crash the system by consuming all memory... couldn't do it, and the machine barely even hiccuped.
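For reference, nohang lives at https://github.com/hakavlad/nohang. One way to get it, sketched here assuming you build from the git source (packaging varies by distro):

git clone https://github.com/hakavlad/nohang.git
cd nohang
sudo make install                                    # installs configs and services
sudo systemctl enable --now nohang-desktop.service   # the desktop-tuned profile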
There is also BleachBit (a bit like CCleaner on Windows), which should be available in the Software store as an apt package. I installed it but do not feel the need to use it if the system is running OK.
I would also say I find that software updates now remove the oldest kernel soon after updating to a new one, which sort of defeats the need for my monthly autoremove exercise to flush it out.
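If you still want to do it manually, the usual pair of commands (standard apt/dpkg, nothing exotic):

dpkg --list 'linux-image*'    # list installed kernel packages
sudo apt autoremove --purge   # remove old kernels and other unneeded packages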
For me, whilst the system is running OK, there is not much maintenance to be done; I just use Disks to check that my small HDD still has free-space headroom.
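The terminal equivalent, if you prefer it:

df -h    # show used and free space on every mounted filesystem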
Thanks to everyone. I think I will just focus on the "merrily focus on day-to-day computing" aspect. No need for optimizations (through BleachBit, for example).