Your computer might be at risk
I dislike all three major OS — Windows, Mac, Linux.
Although I write software for a living and write about writing software in my spare time, I don’t like computers much. I’m not afraid of them. I’m not completely incompetent at operating them. They just don’t behave the way I’d like. Here’s an example.
Hard drive failure
The family computer wouldn’t boot. Instead it came up with white text on a blue screen listing register contents in hex. A hardware check confirmed my fears: Error Code 7, hard drive failure.
This was irritating but not so very surprising: it’s an aging computer which had been running Windows XP progressively more slowly, and which the kids used almost exclusively for playing games. On more than one occasion it had frozen and needed a hard reboot. What’s more saddening, though also not so very surprising, is that our domestic backup policy isn’t as good as it should be. There were some files on the dead drive we wanted to recover.
And even though Gail and I each have our own laptops, it soon became apparent how important a part of family life the defunct desktop was. Our children enjoy playing computer games. They like to watch Merlin on iPlayer. We needed their machine back.
Fortunately I had a laptop and internet access. After hunting through shelves and drawers I reckoned I could get my hands on most of the original install media. It didn’t take long to come up with a recovery strategy.
Recovery strategy
The plan was:
1. disconnect the dead drive
2. connect a new drive
3. install Windows on the new drive
4. reconnect the dead drive
5. boot Windows from the new drive
6. try and recover files from the dead drive
This article is not meant to be a tutorial on recovering from hard drive failures (if that’s what you’re after, the sanest source of information I found was this forum post by Patrick Keenan), but I do want to stress that when you get to stage 5 you should not allow Windows to run a disk check on the dead drive, which is what it defaults to doing.
Be ready to skip the check by pressing the Any Key!
“Free” file recovery software
The first few stages of this plan were tedious but straightforward. I nursed the machine through the Windows XP installation onto a new drive. Since the desktop was designed to accommodate a second hard drive, slotting in the dead drive alongside the new one was idiot-proof. One minor wrinkle: the system didn’t automatically register the new drive. I had to press F12 to interrupt the boot sequence, enter the BIOS and explicitly enable the drive. By now, I felt quite comfortable with the BIOS screens: this stuff works!
On reboot I skipped the default disk check, as mentioned. My hope was that the raw data on the dead drive could still be read and that some lower-level file reader utility could patch together enough of the file system for me to recover what I needed. By the way, at this point the PC had no network access. Evidently Windows XP hadn’t installed a driver for the wireless PCI card. Fortunately the USB ports were functional, so I could download software on the laptop and transfer it to the desktop.
I wasn’t prepared to pay for any file recovery software (yet). The value of the lost files wasn’t that high, especially if I couldn’t get them back. Surely I could find some freeware? I took a couple of wrong turnings here and ended up feeling rather gullible. The top few search hits led me to software I could download and demo for free, which claimed it could do the job, but which I’d have to pay to actually use.
DiskInternals NTFS Reader turned out to be what I was looking for. I’m sure there are others. Happily I’d soon managed to locate and recover the files we wanted, and could disconnect the faulty drive. Phew!
Restoring the machine
Restoring the files was one thing. Next I needed to restore the machine itself to full functionality. The screen resolution wasn’t right. Screen repaints were jaggy. It couldn’t access the internet. There was no sound (apart from an annoying system beep). Etc. etc.
What’s going on here? I’m running a licensed copy of the world’s most popular operating system on hardware purchased from the world’s best known PC dealer, not some bleeding edge Linux distribution. Surely it should just work!
Hobbyists may delight in tinkering with and fixing computers. Not me. I’m not clueless, though, and I did realise that what I needed to do was keep downloading and installing drivers until everything behaved. It would have been nice if the system had been a little more self-aware. Why couldn’t it tell me why it wasn’t working? Why couldn’t it at least provide a hardware manifest in some standard format?
I won’t bore you with the details of what followed. You don’t need me to tell you about the repeated reboots, broken download links, quirky installers; like me, you’ll be immune to the screenfuls of THREATENING LEGAL MUMBO JUMBO YOU HAVE TO AGREE TO. But I do want to highlight a couple of egregious examples of what I’m talking about.
Netgear Wireless PCI card
I had the original install media for the wireless PCI card, and this got the machine connected to the internet. I didn’t just have to reboot to complete the installation, though: I also needed to physically remove the card and reseat it halfway through the process. Good job the PC chassis was already open. As if all this wasn’t bad enough, I also had to bypass a grim warning from Windows XP about destabilising my system. Lovely!
Then I discovered wireless internet access only worked for system administrators, a level of privilege no one should really need to run at. Googling the problem suggested I was not alone. So I downloaded the latest drivers, uninstalled the original version and installed the new one (power off, remove the card, power on, install software, power off, insert card, power on, finish installing the software), only to find the driver problems still had not been fixed!
To work around the problem I had to follow the install procedure for a third time, in this case not clicking through the recommended path, but rather allowing the Windows network connection manager to run the wireless service. Even then I had to stop the Netgear software from auto-starting.
Grrrr!!
Flash install problems
The kids like playing Adobe Flash games so I installed the Flash browser plugin. Well, I tried. Apparently I’d run out of disk space.
An alarming message, considering H:\WINDOWS\system32 was a directory on my brand new, unpartitioned and almost empty 500 GB hard drive! A little more googling revealed that the Flash installer actually required the presence of a C: drive. After mounting a USB flash drive and fiddling about in the control panel to remap drive letters, I managed to get Flash plugged in.
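For the record, I did that remapping by hand in the control panel, but the same drive-letter shuffle could be scripted. Here’s a rough, hypothetical sketch of how it might look, driving diskpart from Python; the volume number is a guess you’d need to check against diskpart’s own list volume output, and it needs to run from an administrator account.

```python
# Hypothetical sketch only: reassign a drive letter by feeding diskpart a script.
# Assumes Windows with diskpart available, administrator rights, and that
# volume 2 is the USB stick you want to appear as C: (check "list volume" first).
import os
import subprocess
import tempfile

DISKPART_SCRIPT = """\
select volume 2
assign letter=C
"""

def remap_drive_letter(script):
    """Write a diskpart script to a temporary file and run it with diskpart /s."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as handle:
        handle.write(script)
        path = handle.name
    try:
        subprocess.check_call(["diskpart", "/s", path])
    finally:
        os.remove(path)

if __name__ == "__main__":
    remap_drive_letter(DISKPART_SCRIPT)
```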
Unfunnily enough, I recently faced a similar but less tractable issue at work, trying to get the Flash plugin working on a Red Hat Linux box. Why should running Flash require me to upgrade libc!?
The antivirus racket
Checklist:
- Hardware happy? ✓
- Software happy? ✓
- Computer happy? ✘
Despite the clean install of properly licensed software on a new hard drive, Your computer might be at risk. Clicking the balloon to fix the problem directs me to the Microsoft antivirus partners page, where plenty of companies will be happy to help — at a price, that is.
This, I think, is the final straw. I’m no longer willing to play along with this racket. The antivirus software is itself a virus, a virus carried by my reconstructed computer from the moment I installed Windows; and clicking the threatening balloon will only make things worse. In the computer’s former life, I installed some free antivirus software which turned out to be free as in free beer: no money changed hands, but it was hardly without cost. It soaked up resources, hammered the hard drive mercilessly, and was forever interrupting me, cajoling and wheedling me to upgrade to the professional edition.
You idiot!
As a software professional I should know better. I should know better than to run Windows. I should know about security. I should know about backups. I had a lucky escape, and I’ll admit I was lazy, but the truth is this kind of thing happens to computer users, both professional and amateur, all the time, and many have been burned far worse than me. I think the model of standalone computers with their own hard drives running their own operating system is all wrong, and bolting on an external hard drive and some backup software doesn’t make it all right.
Alternatives
Right now, I’m not inspired to go out and buy another computer plus the additional hardware and software required to make the house a secure and safe place for computing. It’s a home, not a server farm!
Yes, if we went the Apple route and purchased hardware and software in beautiful (but expensive!) self-contained packages, the driver problems ought to disappear and Time Machine would allow me to chase backwards and recover anything and everything. I’ll admit I’m tempted, but I have my concerns about vendor lock-in: if being tied to a software application is bad, being tied to a platform is worse. And yes, if we went the Ubuntu route I don’t think we’d be troubled by the antivirus protection racket, but I suspect I’d still end up grubbing round for drivers, possibly with even less success, and I’m sure it wouldn’t be such a good games platform.
In an interview on stifflog, a number of well-known programmers answer questions about their favourite tools (operating systems, languages, editors, etc.). I sympathise with Peter Norvig’s rather negative answer.
“I dislike all three major OS — Windows, Mac, Linux. I like Python and Lisp. Emacs.”
Personally, I prefer the model of computing as a service: a service you connect to from a variety of devices in a variety of locations. I don’t really care where exactly my data is; I just want (secure) access to it, wherever, whenever. I don’t want a house full of wires and hard drives. Backup should be transparent to me as a user, and it should be straightforward for service providers to implement: bytes are easy to replicate, transmit and distribute, and there are economies of scale. (For what it’s worth, my personal backup strategy, based around travelling light, internet access, version control, cron and rsync, does what it can to emulate this model, but it certainly isn’t for everyone.)
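To make that parenthesis slightly more concrete, here’s the general shape of the thing: a small script you might run nightly from cron, mirroring a working directory to a remote machine with rsync. It’s only a sketch; the source path, destination host and schedule are placeholders, not my actual setup.

```python
#!/usr/bin/env python
# A minimal sketch of a cron-driven backup using rsync over ssh.
# The paths and host below are placeholders, not a real configuration.
# Example crontab entry (also hypothetical):
#     30 2 * * * /home/me/bin/backup.py
import subprocess
import sys

SOURCE = "/home/me/work/"                    # trailing slash: sync the directory's contents
DESTINATION = "me@backup.example.com:work/"  # any machine you can ssh to

def backup():
    """Mirror SOURCE to DESTINATION, preserving attributes and pruning deleted files."""
    command = ["rsync", "--archive", "--compress", "--delete", SOURCE, DESTINATION]
    return subprocess.call(command)

if __name__ == "__main__":
    sys.exit(backup())
```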
I don’t think this is a Utopian view and I certainly don’t claim it’s original: computing has been gradually moving in this direction for some time now. Perhaps, though, we should refuse to accept the dystopia we’re currently living with.
The future of operating systems
You’ll have to look hard for an honest and open insider’s assessment of the state of Mac and Windows, but the Unix community have always been quick to acknowledge design limitations. In an interview on Slashdot, Rob Pike explains how weak the Unix design has become in the networked world we live in.
The major things we saw wrong with Unix […] back around 1985, all stemmed from the appearance of a network. As a stand-alone system, Unix was pretty good. But when you networked Unix machines together, you got a network of stand-alone systems instead of a seamless, integrated networked system. Instead of one big file system, one user community, one secure setup uniting your network of machines, you had a hodgepodge of workarounds to Unix’s fundamental design decision that each machine is self-sufficient.
We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let’s face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy.
Earlier in the same interview, Rob Pike clears up a misconception. Although he co-authored a book about Unix, he cannot take credit for its creation.
Ken Thompson and Dennis Ritchie created Unix and deserve all the credit, and more.
Which leads us from Slashdot to Google, the computing equivalent of listening to, say, the White Stripes. On the Ask a Google engineer forum, Jeff B from Woodstock GA asks a leading question.
I believe the future of operating systems is a “Cloud OS.” Do you agree?
Ken Thompson from Mountain View CA responds.
yes
Back in the present
Ironically, the remade computer is actually tolerable to use at present. It boots in seconds, has decent peripherals, and (at last!) carries no craplets. Having spent a few hours getting to know it, from BIOS, through drivers, to control panel, I’m on better terms with it than ever before. Maybe it’s not such an old sad thing after all.