Installing Red Hat 7.3
I couldn't help but notice that David Coursey, executive editor at ZDNet, started installing Red Hat 7.3 at just about the same time I did on Thursday.
He's dug himself a small hole.
Note to David: You are installing a 2002-era Linux on a Windows 95-era box, and while you can easily get GNU/Linux to run well on such a machine, you have to make some informed choices about your environment, which you haven't. (Yet.)
You've got the problem of the genie in the movie Aladdin, which Robin Williams so memorably vocalizes: "Itty bitty living space..."
In particular, choosing KDE as your default desktop is not a good idea on such a small, old machine. It has a lot of shared libraries which, well, aren't shared with major productivity applications such as OpenOffice and Mozilla, which are GTK-based. You would save an enormous amount of RAM by using GNOME as your default instead (and it is Red Hat's default, actually). Whether you stick with KDE or switch to GNOME, I'd suggest burning a few (<60) bucks to get up to 128MB or 256MB of RAM for your machine, and/or switching to an even more lightweight window manager such as icewm or matchbox.
Note 2: Yep, getting the screen set up properly is a pain. But you only have to do it once. It is rather amazing how many otherwise intelligent people running Windows don't know enough to change their screen resolution, refresh rate, or color depth on Windows, either.
OK, enough kibitzing on David's article. I look forward to hearing more about his adventures on Monday. Here's the story of mine.
My primary machine at home has been a dual 180MHz Pentium Pro computer for nearly 6 years now. "Lugosi" has been a great computer. I'm very attached to him; he's done a million things for me - I'm typing up this blog on him now! He's still fast enough for most of what I do, but one of his CPU fans has had a death rattle for months now. The sound has been driving me bonkers.
Fry's wanted an outrageous 50 bucks for a replacement fan.
I'm sorry, Lugosi, your time has come.
Thursday night I stopped at Central Computer and picked up the new components. They were:
Asus V333 motherboard                        $155.00
Athlon 2000+ CPU                             $169.00
256MB DDR 2700 DRAM                           $69.00
80GB Seagate hard drive                      $125.00
Logitech iTouch wireless keyboard/mouse       $79.00
All I really needed was the motherboard, CPU, and RAM ($440 with tax), as I could cannibalize the rest of the parts from another computer. But I wanted a single, bigger, quieter hard drive (Rattle and Hum, the album by U2, is vastly preferable to the same tunes by "Lugosi and His 3 Hard Drives"), and the wireless keyboard was just too cool, so...
It took me about an hour and a half to assemble the dang thing. It took forever to mount the CPU fan, and in doing so I cut my hand on the backplate. Bleeding into a new computer is always a good thing; it's ancient geek voodoo magic to ensure its long life and reliability. I ran some tests, tweaked some BIOS settings, installed an old network card, and packed everything up to take into work Friday morning.
The IT department at my company conveniently keeps a mirror of the Red Hat distribution on a server, accessible via HTTP and via NFS. It saves on external bandwidth and beats schlepping CDs around. They also keep the mirror updated with the latest RPMs and our corporate-specific applications. With the aid of a single floppy, also provided by IT, it took me about 5 minutes to boot the machine and get through the configuration steps, and then 16 minutes to get through the network install. The time I spent building and installing a faster machine compares favorably with the time David spent installing onto an old one.
On reboot, I noted three problems: the network did not see my NIS server, Anaconda had lost my USB mouse and keyboard, and the X11 graphical environment would flicker the display a couple of times and then fail to start, leaving me with a text-only login prompt. Time to log in as root and figure things out.
OK, it turned out that in setting medium firewall security during the install, I'd disabled every possible attack against my machine - and also made it less useful than I needed it to be. Security is always a compromise between usefulness and, well, security. So I used the lokkit tool to add support for ssh, http, and dns in this configuration, but I still couldn't convince NFS or NIS to work, so I said "screw it" temporarily and disabled security altogether. (Note to my IT guys - temporarily... temporarily...) Even with this level of security disabled, there are plenty of other, less invasive, internet-proven security measures still in place automatically; a Linux box is more secure than a Windows box would be at this stage.
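For the record, a sketch of the lokkit incantation I mean - the flags here are from my memory of the Red Hat 7.x tool, so double-check them against lokkit --help on your own box before trusting them:

```shell
# Re-apply medium security, but open the ports for the services we need.
# Numeric ports are used rather than service names to be unambiguous.
lokkit --quiet --medium --port=22:tcp --port=80:tcp --port=53:udp
```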
The USB problem was nifty. For some reason the USB drivers weren't being started on bootup. I googled for the Linux USB HOWTO, and after a little reading I figured out that the USB kernel modules weren't being inserted. I had a bad moment when I did a modprobe of the core USB driver by hand - and my keyboard "locked up"! A few moments of twiddling later, I realized that I had enabled USB but not my USB keyboard driver. Grr. OK, one nice thing about Linux is that you can log into it from another machine using telnet or ssh and do the same work, so I did that and, following the USB HOWTO, inserted more driver modules until my keyboard worked again.
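The module-inserting session went roughly like the following sketch. The module names come from the 2.4-kernel-era USB HOWTO; depending on your motherboard chipset you may need usb-ohci instead of usb-uhci, so treat these as illustrative:

```shell
# Load the USB core, then the host controller driver for this chipset.
modprobe usbcore
modprobe usb-uhci      # some chipsets want usb-ohci instead

# Input-layer glue: HID parsing, plus keyboard and mouse event handlers.
modprobe hid
modprobe keybdev
modprobe mousedev

# Check that the modules actually registered.
lsmod | grep usb
```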
Rather than figure out the right way to enable USB on boot, I just added
modprobe wacom # My graphics tablet
to the /etc/rc.d/rc.local file, which is the last thing run as part of the system startup, and added a line to /etc/fstab:
none /proc/bus/usb usbdevfs defaults 0 0
so that the USB device filesystem would be mounted (made available) at boot.
Now, a typical Windows or NT guy would have rebooted multiple times at this point, but me, I just did a:
telinit 3; sleep 5; telinit 5
to restart graphics.
Hot dang, it came up, at the right resolution and everything. I fiddled with the default Red Hat tools and environment for a while, and decided that, while they were pretty good, Ximian's GNOME desktop was better. So I popped over to Ximian's site and started an install of their entire desktop suite while I went off to get some real work done. (The install would have gone much faster if my IT guys mirrored Ximian as well as Red Hat, but I digress.)
Ximian has a cool thing called Red Carpet, which makes it really easy to get new applications and update existing ones. I spent a little time updating the machine to the most current recommended upgrades, notably Mozilla 1.0, subscribed to the OpenOffice channel and installed OpenOffice, and debated seriously about trying the GNOME 2.0 test release.
I did a couple of other things to improve my user experience: notably, I changed my Mozilla preferences to enable tabbed browsing by default and to open new tabs in the background (this last is a godsend for browsing over slower links), and started a copy of several gigabytes of corporate apps. Then I did what every veteran NT and Windows administrator longs to do in the middle of an install... I went home.
(Interlude for dealing with Valley traffic and dinner excluded.)
From home, I logged into the new machine and resumed working.
"Wait, wazzat?", Cryeth the Windows user! "But your new machine is at your office!".
Well, out of the box, Linux supports client/server networking and client/server X11 graphics. This is an extra-cost option for Windows NT/2000 users (Windows Terminal Services - I couldn't help but laugh when I looked up this link - Microsoft has announced an important licensing hotfix. Linux programmers focus on producing better applications, not licensing schemes!)
Close to 100% of Linux applications work transparently over the network (the ones that don't are things like DVD players and games that require extraordinary amounts of bandwidth). No matter where I go, on whoever's computer I'm on, I can be at my desktop, running my applications. I don't even have to be running a Linux box where I am; there are many easy ways to take advantage of this capability - PuTTY is a great Windows ssh tool, and you can either go the free route (Cygwin's XFree86) for X or get a commercial X11 server for your PC, such as StarNet's product. Mac OS X is Unix underneath, so getting X11 running on it is straightforward, though I still recommend finding an ssh client.
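The whole remote-desktop trick boils down to one OpenSSH flag. A minimal sketch, with a placeholder hostname standing in for my office machine:

```shell
# Log into the office box with X11 forwarding enabled (-X).
# mydesktop.example.com is a placeholder, not my real hostname.
ssh -X me@mydesktop.example.com

# Once logged in, any graphical application runs remotely
# but draws its windows on the local display:
mozilla &
```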
But as much as I like point-and-drool interfaces (I regularly run XEmacs, Mozilla, and Evolution over my DSL link), sometimes using a command line is more efficient. I happily browsed the web using the "links" tool; I got the audacity audio editor (which is comparable to CoolEdit) and its related libraries, libdvdcss, and lm-sensors (useful for seeing things like the temperature inside your machine), among other useful things. Lots of interesting tools and applications for Linux can be found by browsing sourceforge and rpmfind.
Graphical installers are all the rage in the Windows/Mac world, but I really don't understand what's so hard about typing
rpm -Uvh *.rpm
, especially when you can install a whole bunch of packages, all at once, after your download frenzy.
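In full, a download-frenzy session amounts to something like this (the directory and package names are just examples):

```shell
cd ~/downloads
# -U installs or upgrades as needed, -v is verbose,
# -h prints hash-mark progress bars.
rpm -Uvh *.rpm

# Confirm what landed (example package names):
rpm -q audacity lm_sensors
```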
Maybe people on other OSes like going through graphical installer after installer, clicking next button after next button, reading EULA after EULA, and clicking "yes, I agree" to terms written in the most obtuse legalese - but I don't. I do Linux because I don't have to put up with that sort of crap.
OK, so after about 10 hours or so (4 of active use), I've got a mostly built machine that does most of what I need it to do. I'm a happy camper. There are things left to do (like getting printing set up, and fixing a performance issue I noticed) that I'll blog about later.
In conclusion I just wanted to point out some things that typical reviewers rarely point out about Linux:
A network-based install of the core OS took 16 minutes, and it delivered a useful, full desktop and office productivity suite out of the box, as well as hundreds of other applications. Getting hung up on the ease of the install is a barrier to Linux, but once you're over that hump it gets easier. (I think it would be only fair for this veteran Linux user to tell his horror story about installing Windows 2000, but not today.)
I did a significant portion of the install remotely. It didn't require rocket science.
I didn't have to coordinate with license lawyers, corporate or otherwise.
I upgraded to the latest and greatest stuff using an innovative network-based installer.
The software cost was... nothing. The hardware cost was... next to nothing.
More news as it happens. I hope that sharing my install, methods, and tools will make your Linux experience a more enjoyable one. Have a good weekend!
One of my axioms about technology is that any given new, revolutionary technology has to be 5x better than the old to replace it. The more categories you push that 5x figure into, the better your chances. Case in point: StrongARM-based handhelds have, in less than 2 years, begun to outsell Palm-based ones. Why? They have 4x more memory (32MB minimum), 6x more processor power (206MHz clock vs. 32), and 5x more flash. It was obvious to me when I saw the first sample boards using this chipset, over two years ago, that the Palm chipset for PDAs was doomed.
Similarly, due to the 5x price/performance advantage of the x86 part, massive cost savings on licenses, and 5x improvements in overall reliability, the film industry seems to be making a mass move to x86-based GNU/Linux on both their render farms and their artistic workstations. A recent article in Linux Journal talks about the wholesale replacement of NT and SGI-based workstations with Intel-based ones.
It is rare to see revolutions in technology that are this clear-cut. For example, Intel's new XScale is a merely evolutionary, not revolutionary, upgrade of the StrongARM (SA) part. Intel got lucky with the SA, and to this day doesn't seem to understand the advantages of SOC designs in small form factors. The doubled clock is not a big enough upgrade to really supplant the SA. I predict XScale takeup in PDAs will be slow until we see full integration of the XScale into something as compact as the SA, with new features and functionality that exceed it (say, for example, USB 2.0 host mode).
Sometimes evolutionary changes ultimately lead to revolutions. Moore's law means that there can potentially be a 5x revolution in CPU/memory/flash design every 3.5 years. Flash memory chips have finally reached the point where they can supplant hard disks in many low-end applications (although, admittedly, today's prices reflect a temporary oversupply), as density has finally reached useful sizes - with 512MB possible in a 2 1/2 inch form factor. Flash chips already had advantages in power consumption and speed over conventional hard disks, but price and capacity were limiting factors. Until last year, anything over 16MB was uncommon. Today, 64MB flash is at a sweet spot, according to pricewatch: it's only 26 bucks, 3x cheaper than even the cheapest hard drive you can buy today.
Today, under Linux, you can build a pretty functional word processor, web browser, email client (or server), and even a wireless blogger/router/VPN box in 64MB of flash, and still have plenty of room left over to write hundreds of documents and store thousands of emails. The gigabyte race in the hard drive market and the massive bloat of common PC software have left people with the impression that you actually need all this space to do useful work, which is simply untrue. You can fit many a term paper into a megabyte of flash. I got through college on 360K floppies alone.
I'm building a little Crusoe-based machine just like I describe above, using MontaVista's cross-development tools. The results so far are impressive - I'm waiting for the battery to die on the box I'm playing with, and I'm going on a couple of days now.
Nope, it's not a full-fledged system (in particular if you are an MS feature-freak), but it's good enough to meet the majority of my needs, and even a veteran MS Word user would be able to get by in it.
The machine is totally silent. I put a good-feeling wireless keyboard and mouse on it, which boosts my mobile productivity. It's a little larger than the laptop, but still fits into my laptop bag. It's got 802.11b and VPN capability, so I can do useful work in a coffee shop, and X, so I can run applications from my desktop at the office. It's rather nice.
This gets me to another axiom, somewhat inspired by The Innovator's Dilemma, a wonderful book that I will write about from time to time. I call it "The Rule of Good Enough". What I've got is "good enough" to totally replace one of my desktops and my laptop, for what I do, and has compelling advantages over both. It has rough edges, sure.
More news as it happens.
Electric Cars - why not?
Back when I lived on the side of a mountain, I'd look out, most days, over Silicon Valley at a sea of dark, impenetrable smog. Most of the time I could make out Mount Diablo, 60 miles away, and nothing else.
It doesn't have to be this way, but the barriers that face the Valley in clearing up the smog seem nearly insurmountable.
California's government once mandated that 2% of all cars sold in the state be emission-free by 1998.
In 1997, GM announced with great hoopla and excitement the EV1, an electric two-seater that used many advanced new technologies to maximize its range and performance.
Free recharging stations were established in popular areas such as Fry's Electronics, California opened the HOV lanes to electric cars, and everyone sat back and waited for the inevitable take-up by consumers and the average environmentalist in the street.
And waited. And waited...
GM sold fewer than 200 cars in California before they pulled the plug.
The cars have basically three problems: range, price, and performance.
The performance problem is largely solved. The EV1 can do 0 to 60 in a matter of seconds, which is far better than the old diesel beater that got me partway through school.
Price is a really hard one. Even with a major government subsidy, the two-seater EV1 cost $35,000, which is a bit more than your average environmentalist can afford.
Worse, the car had a maximum range of about 55-130 miles. Most cars have a range of 300 miles or more, with some pushing 600, which reduces your fill-up time to once per week, at 6-8 minutes per stop. A typical Californian drives 60+ miles a day, which leaves little margin for error for the inevitable lunch or coffee-shop run.
And "filling up" a GM EV1 takes 6-8 hours.
Run out of juice on Highway 85? Getting a jump from a stranger takes on a whole new level of obligation when you're talking about an hour or more to transfer enough electricity to get you home.
The electric car problem is compounded by the lack of infrastructure. It's the public transportation problem all over again.
I've got a way of solving half the recharging problem: why not change out the batteries instead of recharging them? You would pull into a service station, drop out the nearly discharged batteries, and pick up new ones, freshly charged overnight. Driving patterns for most people don't change much over time; there might be a few alternate routes, but mostly you spend each day going from point A to point B and back again.
Or better yet, have the swap-out operation take place at your office location.
Yes, this approach has inherent problems. The batteries on a Toyota weigh 1,600 pounds, which isn't something that grandma is going to be able to easily deal with.