Monday, April 15, 2013

'Bitrot' Not a GNU/Linux Issue

by Dr. Roy Schestowitz

In computing, everything should ideally scale linearly or logarithmically where possible, except perhaps for hardware innovation, which can be nearly exponential in some respects owing to multidimensionality and various other factors. Linux takes good advantage of hardware and, owing to reuse of code, programs are rarely bloated. With Windows, contrariwise, the common practice/advice is to assume that bloat is normal and that reinstallation is a routine task which mitigates it. Those are two separate issues: one concerns scalability, the other the steps needed to remedy the problem. In GNU/Linux, where malware is rare, optimising a system is usually possible without radical measures like a clean install.

It is not uncommon to see BSD or GNU/Linux systems running for years without a reboot or a reinstallation. These systems, which first found widespread use in (gradually more mission-critical) servers, required a high degree of fault tolerance, robustness, and stability, with minimal downtime or rebuilding time. Windows, which emerged primarily through the desktop, took over a decade to get the basics of networking and user privileges almost right -- an issue that still makes it attractive to rogue programs.

The three Rs -- restart (the application), reboot, and reinstall -- have made infamous a class of box-booters who are sometimes synonymous with Microsoft-certified administrators. Whereas UNIX and Linux professionals tend to deal with complicated issues of automation and troubleshooting, many of their Windows-centric counterparts spend their days wrestling with performance (setting aside restrictive licensing that impedes expansion) and malware, which are two related but separable issues. Over the years I have narrowed down the low efficiency of maintaining Windows clusters (requiring more administrators per cluster) to what some call bitrot: the notion that digital data -- or an executable program -- inevitably erodes over time, requiring one to revert it to a pristine condition.

A solid GNU/Linux distribution is unlikely to slow down or break down on its own. On my main workstation, for example, I have not had to reinstall the operating system since 2008, except when switching between distributions (Mandriva was losing its corporate backing at the time). I could use the system for months at a time without any reboot, and install over a thousand packages without slowdown or performance degradation of any kind. It is harder to achieve the same thing with Windows, judging by the people I speak to. The three Rs are essential there.

More and more enterprises pursue GNU/Linux and people who know how to maintain it. Continuity of service with minimal intervention requires a system that will not 'rot' over time or be rendered obsolete simply because the company with exclusive rights to the source code decides so.

-- Dr. Roy Schestowitz



  1. The time to rot can be surprisingly short. My new VM of Win 7 is already noticeably slower. My old VM, which had HBSS and Visual Studio

  2. I tend to think that there is an inherent design flaw in the registry which manifests within a short period of time: once the b-tree becomes highly unbalanced, each subsequent key insertion into the registry (database) becomes correspondingly slower. A highly unbalanced b-tree has many levels of nodes which require traversal, whereas a balanced b-tree is one or two levels at most, allowing quick insertion.

    Once the database takes a whack (as in when there is a lock-up; there's no way to flush the i/o buffer to write back the keys) it slogs.

    With Linux, at least (Alt-SysRq + r-e-i-s-u-b), one can be assured that an unresponsive PC can be gracefully shut down and I/O buffers will be written through to disk. Not so with NTFS on Windows.

  3. On most desktop Linux systems you have programs designed for KDE and Gnome - or neither. This leads to a lot of redundancy (what you are calling "bloat").

    The main reason for Windows slowing down is the insane "culture" of the environment, where so many programs add useless (and even detrimental) toolbars and startup items. If you remove those then most of what you refer to as "bit rot" will be solved. I use a tool which makes this easier, but even using the built-in tools it is not that hard (though the very idea that it is the norm is insane). It also makes no sense to have a registry where programs save their preferences. It would make a lot more sense to have these saved to a folder where each program had its own preferences file, which could be easily removed or shared if needed. The Windows design is simply inferior here -- I have never heard a good reason for, or a benefit of, the design it uses.

    With rebooting: this is not needed nearly as often on Windows XP and above as it was on the 9x series, though - as with Linux - it does sometimes help (you might be able to log out and back in and not fully reboot, but for the user there is not much difference here).

  4. The registry is purely a database and its index primary keys get 'hosed' so much so that a single key insertion can make the system come to a crawl on a highly imbalanced b-tree.
    Bitrot is the process by which an executable on disk has one or more bits flipped over time (magnetic state on HDD media; probably less of an issue with SSDs). It happens.

  5. The Registry is an absurd design choice. No argument from me on that.

    With bitrot: any system can have "one or more bits" flipped - in other words, data can be corrupted and this can harm a system. On a well made system it should be pretty easy to re-install the OS without having to re-install most programs (some drivers and the like might need to be re-installed, but not general user-level programs). Windows gets this very, very wrong.

  6. I use Windows 7 - in a VM, but that should not really matter for the purpose of this discussion. Been using it for years and it works fine. I do not install and uninstall programs on it nearly as often as I do with my primary OS.

    Windows also does not allow for installation of a program in, say, a folder on your desktop, so you can test it and, if you decide you like it, easily move it to the "standard" place for applications. As a workaround you can create shortcuts to a folder and run uninstallers for software you do not like... far from elegant.

    Lots to not like about Windows... just do not think Roy's article is fully accurate / well supported.

  7. I have to agree with the author, having used Windows since Win95 right up until just after 7 (I'm a Linux GNUbe; I've only 2 years' use under my beard). I suppose I could be described as a power user, and thankfully I wasn't dependent on any .exe files. I noticed in Windows, including 7 (widely regarded as their best offering), I'd be spending a lot of my time scanning for malware and rebooting. Yet on the very same hardware Linux would be twice as fast at booting, opening applications, etc. Another plus is that I don't have to spend a percentage of my day doing malware scans and rebooting (now I choose when reboots occur, and I don't need to reinstall every 6 months). For me there is no comparison. And I too wholeheartedly agree that the Windows registry is inherently flawed by its very nature, not to mention the NT kernel.

  8. This was precisely one of the reasons I switched to Linux in the first place. It's nice to not have to worry about my boxen slowing down after a few months.

    In the Windows world, I became accustomed to a procedure called "slicking": back up important files, reformat the hard drive, and reinstall Windows and all the software. It was (and still is) one of those common-knowledge aspects of Windows administration. Ever since I experimented with Linux, I realized that such a need to constantly reinstall and reimage software was simply unacceptable; if a free-as-in-beer OS developed by a bunch of volunteers can properly handle years of use without reinstallation, then why can't a $100+ OS developed by a multi-billion-dollar company do the same?

    I've got machines running heavy loads 24/7 on Linux that are just fine and dandy with months of continuous use; at the very most I might have to restart a daemon or two if one of them gets wedged on something, and that's just a simple "/etc/rc.d/rc.$daemon restart" that gets $daemon back up and running in seconds. Windows, on the other hand? Gotta schedule reboots every two weeks to keep things running well.

    All in all, very nice article.

  9. I'm a strong Linux supporter, but I think it is worth pointing out that the people who boast about never needing to reboot will be running very out-of-date kernels, probably with vulnerabilities and definitely missing newer features.
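The b-tree imbalance described in comments 2 and 4 above can be illustrated with a short sketch. This is an analogy only -- a naive, unbalanced binary search tree in Python -- and it models nothing of the real Windows registry, which uses its own hive format; it merely shows how insertion order alone can turn logarithmic depth into linear depth.

```python
import random

# Naive binary search tree -- illustrative only; NOT how the Windows
# registry actually stores keys.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Iteratively insert a key, returning the (possibly new) root."""
    node = Node(key)
    if root is None:
        return node
    cur = root
    while True:
        if key < cur.key:
            if cur.left is None:
                cur.left = node
                return root
            cur = cur.left
        else:
            if cur.right is None:
                cur.right = node
                return root
            cur = cur.right

def depth(root):
    """Iteratively compute the maximum depth of the tree."""
    best, stack = 0, ([(root, 1)] if root else [])
    while stack:
        node, d = stack.pop()
        best = max(best, d)
        if node.left:
            stack.append((node.left, d + 1))
        if node.right:
            stack.append((node.right, d + 1))
    return best

# Keys arriving in sorted order degenerate the tree into a linked list:
# every subsequent insertion walks the entire chain.
unbalanced = None
for k in range(1000):
    unbalanced = insert(unbalanced, k)

# The same keys in random order keep the tree shallow.
keys = list(range(1000))
random.seed(0)
random.shuffle(keys)
balanced = None
for k in keys:
    balanced = insert(balanced, k)

print("sorted-order depth:", depth(unbalanced))  # 1000 levels deep
print("random-order depth:", depth(balanced))    # a few dozen levels at most
```

Note that real B-trees rebalance themselves during insertion, so the degenerate case shown here strictly applies to unbalanced binary trees; whether registry hives suffer an analogous pathology is the commenters' conjecture, not an established fact.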