I haven't written any code in 9 months, and precious little these last 2 years. I've been a build monkey for nearly 9 million lines of code, on more platforms and boards than I could count. It was my responsibility to port it, to fix it, to build it, to add to it - and it was just too much. Periodically over 2002 my vision would blur, and C itself would become garbage. Computer text became difficult to read. I would spend days not being able to do anything with the code at all. I had my eyes checked; they are fine. I'm thinking maybe I have low blood pressure issues - I'm working on that theory. Oxygen starvation to the brain, not enough sleep, definitely. I know I was getting burned out, but I don't really know what that means, medically, brain-activity wise - it just seemed that a skill I'd had in spades (C programming) for 15 years went hiding every time I reached for it. At rest, passively, I could see code on the backs of my eyelids and manipulate it. Eyes open. Nothing. A blur. A system crash.
I got by on seeing the patterns of the build - I could see the pattern of an error, the pattern of "make", of the C compiler - as the stuff whizzed by. It was like reading the Matrix directly, except that when I got close enough to really look at it, it fuzzed out; it didn't become anything like reality at all. I had to have two monitors at 1280x1024 and multiple virtual desktops just to retain enough context on the different environments I was working in, and god help me if X crashed - I'd never find anything again. I spent something like 87 days out of 90 like that.
I wasn't sleeping. And then I had a really scary day - a day I couldn't read at all. It was also kind of liberating, and I'll talk about that soon, but it was mostly - terrifying. Gradually reading came back - though retaining anything requires the act of writing it down and reviewing it multiple times, and I couldn't sit still for more than a few minutes. I blogged for a while using text-to-speech software, and I've had to stare at a computer with what Buddhists call "a beginner's mind" for months now as I relearned how to use it.
Programming is something I've done since I first learned to program a TI-99 calculator. It's a world I'd mastered. It's a simple world. It's my living. I've spent these past months thinking: Am I washed up? You only know two programmers still doing it older than you... where do they go after 40? Do they get chopped up and ground into fertilizer like the horse in Animal Farm? They don't seem to turn into managers... does someone roll up to your house in an inflated ball and take you away to the old programmers' home?
It's not... entirely my problem... that I can't get the world to shut up long enough for me to think. The outside has been screaming into my world ever louder. It's not just the banner ads, the distractions of the office, the glowballs in front of everything you see on TV, Google, and the branding of the web itself - it's snuck into user interface design on the one system I care about - Linux.
Most of the new GUI stuff today has been a lemming-like orgy of emulating what's wrong with the other computer interfaces. Thankfully - thanks to the melting pot - alternatives exist.
Mac-think has run amok. The original research that created the Xerox Star and the Mac showed that inexperienced users - people who didn't grow up on computers - worked better with visible reminders on the screen of how to "do the next action". Good research. Solid research. But that research never tracked the productivity or skill improvement of those users over time - and no one has really followed up on it since - and while this style of interface is a great introduction to productivity on computers, it's also... training wheels. A ball and chain. A source of visual distractions...
Homo Computatis has evolved since that Mac research. Speaking computer is part of the grammar you learn growing up. Six-year-olds, instead of asking daddy "why is the sky blue?", reach for Google.
Physical devices and interfaces have changed, too. Adopt a beginner's mind for a minute, with me.
Take a hard look at your keyboard. Pretend like you've never seen it before. Do you have a scrolling mouse, and cursor keys? Look up at your screen. Why do you have a vertical scrollbar on every window?
Look down at your keyboard, and up at the screen. Why is there no correlation between the row of function keys and the menu items on the screen?
Play with your hand-motor control - Can you easily hit the scrollbar with the mouse?
Why is it so wide by default? Why can't you turn it off?
The Mac-like interfaces depend on a sight-recognition vocabulary - text patterns like (File, View, Help) - for the people who remember text. Icons for the people who remember pictures. And help files, maybe, for the people who remember keystrokes. Still, no user interface is quite the same - in fact, there's barely any real school of thought or standard for any of these concepts anymore - it's all buried in the noise. Everything has to be in everything for everybody, in an orgy of overengineering.
The browser comes by default with 4 rows of buttons and text, and a status bar. In browsing the web itself you get 1-3 banner ads, some sort of branding for whatever page you are on, and a few lines of text. I didn't really notice this until I spent two years trying to browse the web with a string of 320x240 resolution devices - nowadays it is impossible to get to any useful data, immediately, anywhere, at resolutions below 640x480. Try it.
As part of that 320x240 project I hacked GTK to give you a little more real estate by default. But it still wasn't good enough, and browsing the web hasn't taken off on smaller devices...
All that stuff on a normal screen is very busy, very distracting. All these graphical objects compete for your mindshare, all the time. You have to either consciously blur them out or consciously find ways to eliminate them. It's time to target everything we've learned about user interface design directly at the kinds of brains we are trying to reach.
Recently slashdot.org picked up on the idea and published a story about the vastly simplified window manager evilwm. It has just the basic commands required to manipulate windows and workspaces. It's wonderful - it felt like I'd got back 10% of my brain and my screen real estate when I installed it. I'd only add one feature - the ability to get back to the previous window - backtab. The commands required to use it fit on a Post-it. Great. I can memorize those and throw away the Post-it eventually. And once I looked at that screen with that beginner's mind, I noticed - really noticed -
that I run 5 programs on a daily basis. That's it. All of them can be stripped down; with a little reading and memorization I can get more room to think.
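For anyone who wants to try the same experiment: a window manager this minimal doesn't need a desktop environment at all. Here's a sketch of an ~/.xsession, assuming evilwm and xterm are on your $PATH - the exact session file your display manager reads varies by distribution, so treat the filename as an assumption:

```shell
#!/bin/sh
# A minimal X session: no panel, no icons, no desktop - just a flat
# backdrop, one terminal, and the window manager.
xsetroot -solid grey20   # plain background, nothing competing for attention
xterm &                  # one terminal to start from
exec evilwm              # evilwm *is* the session; it ends when the wm exits
```

Everything else gets launched from that one xterm, by name, from memory.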
Mozilla can run in a full-screen mode with F11. I haven't figured out how to get rid of the right scrollbar, but after getting rid of all the extra stuff, even the overly busy interface of Blogger leaves room for me to see what I'm writing.
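For what it's worth, the content-area scrollbar can be suppressed per-profile. This is a sketch assuming a Gecko build of that era, which honored the proprietary -moz-scrollbars-none overflow value in userContent.css; the profile directory name below is hypothetical (Mozilla generates a random salted one), so substitute your own:

```shell
# Hypothetical profile path - substitute your real salted profile directory.
PROFILE="$HOME/.mozilla/default/abc123.slt"
mkdir -p "$PROFILE/chrome"
# userContent.css styles web pages themselves; -moz-scrollbars-none was a
# Gecko-only overflow value that hid the scrollbars entirely.
cat >> "$PROFILE/chrome/userContent.css" <<'EOF'
html { overflow: -moz-scrollbars-none !important; }
EOF
```

You still scroll fine with the keyboard and the mouse wheel - the pixels just come back to the page.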
Add three Post-its over the brand logos on the monitor itself, and two over the mouse and keyboard - and finally, finally, there isn't an errant thought in sight. It's relaxing. There's enough space left to create... I went from room for a paragraph on the screen to room for this entire article - I can read it at a glance, comprehend its chaos, and edit it... I could probably edit it better (sorry), but I'm off to try something I haven't done in a while.
Programming. Without an icon in sight.