I couldn’t tell you when I actually got my first computer. I can tell you that my first was an old used Tandy TRS-80 Color Computer from someone who threw in loads of games and software. It came with a 5 1/4-inch floppy drive and an actual color ink-jet printer! I had the computer for a little while before it started to become difficult to load programs, and then the thing just stopped working. I don’t know how old it was when I got it, but it was old.
My next computer was a used Apple Performa 400, the first computer I had that was really usable. It ran some version of System 7, had a GeoPort 56k modem, and a 15-inch monitor. The computer came with some productivity software (ClarisWorks, PageMaker) and some games (SimCity, Myst, Doom II). The hard drive was 750MB. Although the thing never died (computer or hard drive), the system would just stop working from various System 7-related crap. I lost the system twice in the year or so I owned it, and finally I let it go.
As you can tell from the title, this post is obviously not about those machines. No, this post is about systems I have had that never wanted to go. They never died, I never had major hardware issues, just the occasional software hiccup. I’m going to start with my first REAL machine: a no-name-brand custom PC purchased sometime in 1998.
My parents finally decided it was time for me to get a real system. After years of begging for a real computer, our friend Rocco decided to put it on his credit card for my parents and actually get me a full, new computer system. We drove to a computer store way out in the middle of nowhere and looked around a bit. My dad and Rocco had no idea what to look for, but I did. I found one machine that stood out from the others: a 300MHz AMD K6-2 processor, 32MB SDRAM, 56K modem, 8MB onboard VRAM, 52x CD drive, 4GB hard drive with Windows 95, a beautiful 17-inch monitor, and even 2 USB ports! This was it. I wanted it, and I got it for $975 with a one-year warranty.
I was one of the only people on my block to have a 56k modem and a computer that could play games and movies and music. I was one of the first people out of all of my friends with a new computer all to himself. I started eBay-ing at around 15 or 16, when I got my first job. With that money I started to upgrade my system: I upgraded the video to a Savage 4 PCI card, added a Diamond sound card, got up to 96MB RAM, bought a CD-R drive (which was $300 when I first got it), added a 10GB and a 12GB hard drive for music and programs, and used the original 4GB Seagate drive as the system drive. I installed so many versions of Linux and Windows it could make your head spin. I formatted it so many times I lost count. It was on almost all the time: hours and hours of intense (at the time) gaming, loud music, viruses, everything. I just worked that machine for everything it was worth, and nothing inside it ever failed. Then I finally “upgraded” to a 450MHz P3 Dell Gigaplex, which was given to me by my old school library. I re-purposed the old machine as my own personal webserver (I had cable internet in my house, which was another thing I really adopted first: getting the best internet in the neighborhood…before cable it was DSL).
The old webserver lasted through most of my undergraduate career, sitting in my bedroom at home (and then our house in South Philly) idling on my cable connection 24/7 for maybe 3 years. I used it to host various things (pictures, my former website, friends’ pics, class projects, etc.). In 2007 or so I finally decided to take the server offline and drop the cable internet. I was never home anymore, and my parents didn’t go online at all, so it was just an extra $45/mo that wasn’t going anywhere. The system was still using the original 4GB Seagate drive, original USB controller, original RAM, the original video card (not the Savage 4), and even the original power supply. All working when I took it offline. It was slow, the CMOS battery kept dying (I replaced it twice in its lifetime), and it was loud from the old drives and old dusty fans, but it still worked perfectly.
The replacement Dell was tossed for an upgrade I got in college, another old Dell Dimension XPS with a 700MHz P3 (the old Dell wasn’t dead, I just traded it for the other Dell). It had been a webserver years before, and I used it as my computer in my dorm for a few semesters. Many papers were written and many hours of movies and gaming happened on this machine in my dorm too. It’s now my parents’ only computer at home. It’s running XP, has wireless, 512MB RAM, and 2 hard drives, and it still works fine.
My replacement for that, a PowerMac Blue and White G3, is sitting in my basement in Philly. It has a G4 upgrade in it, a 20GB and a 40GB hard drive, a DVD-R/RW, and 512+MB RAM. That computer is from 1999, and it still works. I got the Mac from my current employer, who was throwing it out. The original 60GB drive died on it (only the second drive failure in my life; the other was in my old Toshiba laptop, 5+ years into its life) and the system needs to be reinstalled…but it still WORKS.
I have a lot of hardware like the Dell and the Mac: my old third-gen iPod (still used for storage!), my old Toshiba laptop (needs a new hard drive, screen hinges are breaking), and other various systems I have built over my life. These all still work fine. Is it something I do to my systems that makes them last longer? I like to think so. I like to think that I take good care of my systems and try to get the most life out of them. Maybe it’s also that I just don’t throw systems out at all until they are completely useless to me. Whatever it is, I have surprisingly good luck with hardware. I’m going to be replacing the Dell at home with an actual new computer sometime in the near future, but I can assume that Dell will be reused as a server somewhere.
So I wonder: why do people have so many issues with new machines they spent thousands on, while I have almost no issues at all on older systems? Do they not make computers like they used to? Is hardware getting crappier? Or is it that performance comes with the high price of high failure rates? I can safely assume that a 10,000RPM 700GB drive will get much more wear and tear than a 300GB 7200RPM drive, but shouldn’t failure rates drop as technology improves? We have better materials and better methods of testing hardware now. Should we just stick to old hardware because it’s “safer”? Or should we accept the high failure rate because performance is worth that much more? I wonder.