on the edge

computers & technology, books & writing, civilisation & society, cars & stuff


Greg Black



If you’re not living life on the edge, you’re taking up too much space.


FQE30 at speed






Worthy organisations

Amnesty International Australia — global defenders of human rights

Médecins Sans Frontières — help us save lives around the world

Electronic Frontiers Australia — protecting and promoting on-line civil liberties in Australia






Software resources


GNU Emacs


blosxom


The FreeBSD Project

Sat, 12 Jun 2010

Blog Moving to New Home

After six years of this temporary blog, I’ve finally set up something intended to be its successor. If you want to follow the new blog, it’s at Tea Parties and Unicorns.

If you want to subscribe to the feed, use one of these:

    Atom: http://teapartiesandunicorns.blogspot.com/feeds/posts/default
     RSS: http://teapartiesandunicorns.blogspot.com/feeds/posts/default?alt=rss

The technology used in the old blog doesn’t allow any automatic redirection, so this has to be done manually. I have updated Planet Humbug to the new feed, but it would be good if maintainers of other Planet installations could do the same.


Fri, 28 May 2010

Customer Prevention

A friend of mine used to joke that his job title was “Customer Prevention Officer”—and it wasn’t entirely a joke, since he made just one sale in four years with that employer, although admittedly it was worth many millions of dollars. I see lots of customer prevention by businesses that I try to buy goods and services from, and that puzzles me in times of economic gloom like the present.

So here’s a tip for tradies: if you promise to turn up to quote on a multi-thousand dollar job and don’t bother to turn up or even ring to make excuses, don’t be surprised that I won’t even consider giving you the job.

While my wife is away in Europe, I plan to replace the roof on the house and to have all the decks around the house and the pool rejuvenated. So I’ve been seeking quotes from various companies that do such jobs. Some of them have been excellent, making a convenient time to visit and then turning up right on time and getting their quote to me very promptly. But, for each job, there’s been at least one potential supplier who has made an appointment with me and just failed to show up. I hate that.


Fri, 21 May 2010

Buying Books with Booko

I buy lots of books, most of them technical, and spend about $3k a year at Amazon. In fact, once I found out how easy it was to find and buy books there, it became my only port of call when looking for a book. But a few months ago I wanted a book that I could not get from Amazon and had to look around for some help.

I found a few places that sold books and a few that provided information about book sellers. Of the latter, one that looked interesting to me was an Australian site—Booko—which claims to find the best deals for buying books in Australia. I’ve used their service a few times now and it has been excellent, both in finding books that I could not source from Amazon and in finding them at good prices.

Just this morning, a book arrived from England a week after I ordered it, and the total cost was AUD $14.86. It wasn’t a pamphlet, but a 330-page trade paperback. Other purchases have worked out pretty well too. I recommend Booko to Australian book buyers.


Wed, 19 May 2010

Is Clojure the Answer, or Assembler?

The ongoing saga of my project of learning new programming languages and eventually getting some real software written with one or more of them has been derailed again—this time by a new(ish) entry in the Lisp family, Clojure.

I already knew about it, but had been disinclined to delve into it because of its foundations in Java, a language I really dislike. But I’ve seen a few tempting things about it recently and the stuff I’ve been reading seems to show that you can use it without having to get into real Java. If that’s true, I’m interested.

The principal features I need in any language that’s going to engage me are useful tools for managing concurrency, a coherent and not overly verbose syntax to make things easy for the human readers, decent performance and portability to all the free operating system platforms I care about.

Of course, there’s also the business of getting close to the machine—something I think all programmers need to be comfortable with—and for that I’m looking at X86 Assembler. I last did lots of coding in Assembler when the Z80 was king of the hill and 2 MHz was fast. Lots of my small business customers had machines that had no hope of running anything serious unless it was written in Assembler. But I’ve hardly looked at it since the decline of the Z80, so it seemed like time to complement my focus on very high level languages with a bit of low level stuff.

I’ve found a few references to modern Assembly Languages and plan to get up to speed a bit with that over the next few months as a counterpoint to my functional languages.


Wed, 12 May 2010

Dual Monitors for Productivity?

I keep seeing blogs and mainstream media stories claiming that two monitors help productivity, with just a few saying the opposite. So, having a spare monitor, I thought I’d try it out.

Lack of experience with dual monitors made me a bit slow to work out how to get my video card driving them both—and the result is still not ideal. The card has one HDMI output, one DVI output and one VGA output and can drive them all, although the HDMI/DVI combination gave a poor result: the screen attached to the HDMI treated screen blank as a synonym for a bright blue display about 80 per cent of the time. The DVI/VGA combination works, but seems wrong.
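For anybody trying the same thing under Linux, xrandr is the quickest way to poke at what the card will do. Output names vary from driver to driver, so the ones below are only an example:

    # show the card's outputs and their current state
    xrandr --query
    # drive two screens side by side (output names are driver-specific examples)
    xrandr --output DVI-0 --auto --primary --output VGA-0 --auto --right-of DVI-0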

Once I had them both working, I had to decide how to use them and came up against squirrelly behaviour from Gnome, similar to the silly way Apple’s OS X behaves: you have to choose which screen displays the panels with all the useful little gadgets you choose to have available. And, since my normal modus operandi is to use a single screen with eight Gnome desktops and about fifty open windows, I ran into other issues.

The second display is attached to the same set of desktops as the first one, but the desktop chooser thing is only on the first one. So I have to move the mouse a lot further than usual to switch desktops if I’m on the second screen. And the little desktop selector things are twice as wide as usual because the screen is twice as wide, so they are clumsy to use and don’t attract my eyes and mouse as they used to do.

So I’m not sure whether this is a good idea or not. Maybe if I was using a pair of 19-inch screens it would seem more useful than the pair of 24-inch monitors I’m using which are so wide that I have to move my head a lot to take them in.

Or maybe I need to sit with somebody using dual monitors and see how they work them. If I was doing pair programming, I could imagine that the extra screen might be very useful. Or if I played games or watched TV, I could imagine using the second screen for that stuff. But that’s not me.

Anyway, I’ve moved everything around now to fit in a display that physically occupies 1,100 mm of desk width, so I might as well play with it for a week or so to see if interesting benefits jump up and hit me in the face. And maybe I’ll put the second screen back on the shelf to gather dust until it’s needed.


Fri, 07 May 2010

Banks Puzzle Me

I recently agreed with a dealer to buy a car. After the haggling over prices and features was over, I was asked how I would pay for it. I said I’d give them a personal cheque (check for the Americans) and they said that was fine. That was no surprise, as that’s how I’d paid for the previous couple of cars I’d purchased there.

So yesterday I hunted down the cheque book. It’s rarely used and in fact the previous cheque was dated exactly 21 months ago, for the last car I bought there. But then the dealer rang me apologetically to tell me that the head office had just issued an edict that personal cheques were no longer acceptable and would I please bring a bank cheque.

I had to go to the bank in question this morning anyway, so that was no problem. I did my other business and then put my cheque for a bit over $23k on the counter and asked for a bank cheque. No problem. Fair enough, I expected that.

The bit that surprised me was that the teller, who did not know me at all, asked for no evidence that I was me; did not verify the signature on the cheque at all; did not verify that the account had backing for $23k; and didn’t even worry about the fact that the cheque I gave her was made out wrong. She just explained how I should do it next time and then gave me a crisp bank cheque for $23,310.00.

Of course, I know that the cheque I gave her was good and that there was no reason to refuse me the bank cheque. But the lack of checks was a bit of a surprise.


Wed, 05 May 2010

Time for a Smaller Car

I’ve been driving biggish cars for the last 25 years, largely to cater for the need to cart kids and luggage around. Today I made a small step towards reducing my footprint on the earth by putting a deposit on a Hyundai i30 1.6L SX CRDi 5 door hatchback which should use about 100 litres less fuel per month than my current VE Commodore.

No doubt there will be the occasional whine from family members who usually turn to me to provide the big comfortable car, but I expect to be able to cope with that.


Fri, 30 Apr 2010

That's My Name!

I’ve been asked about the email addresses I give to businesses a few times recently, so it seems like time to write it up. When a business I deal with requests an email address from me, my first inclination is to simply refuse. If it’s a bank, I always refuse—I like to consign all email purporting to be from banks to /dev/null without further thought. If it’s a business that has some reason to send me email, e.g., to inform me about stuff I’ve ordered online, then I provide an email address.

Those email addresses are always of the form <businessname@notyourname.com> and they allow me to identify businesses who hand over email addresses to third parties (either intentionally or by incompetence) and to quickly cancel the address they use.

Some businesses protest about this use of their name and I tell them they have no choice. If they want to email me, they must use the address I give them. If they refuse to use the address I provide, I don’t do business with them. Some businesses say that they alone have the right to use their name and I tell them that I’m not infringing on their name, because only they will be using it if they keep it properly locked up.

And, as we all know, some businesses think that an email address provided to facilitate some real transaction allows them to spam you with all sorts of marketing you don’t want. I like having an easy way to stop that in its tracks.
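The mechanics will vary with the mail system, but any MTA with per-address aliasing can run this scheme. With Postfix, for instance, a virtual alias map does the job: each business gets a one-line alias, and cancelling an address is just a matter of deleting its line and re-running postmap (the names here are purely illustrative):

    # /etc/postfix/virtual (illustrative entries only)
    examplestore@notyourname.com    greg
    anothershop@notyourname.com     greg
    # to cancel an address, delete its line, then run: postmap /etc/postfix/virtual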

As a side note, when I was looking for a new domain name just for this use, it was quite hard to find real names that had not been taken by the domain name squatter scum—but that won’t surprise anybody. More annoying was the number of names that have been registered by squatters without even having an associated name server.


Wed, 07 Apr 2010

Erlang or Haskell?

I recently wrote about my plans to learn some new programming languages and indicated that my thoughts were leaning towards Erlang and Haskell. I’ve now made some progress with what appear to be the most suitable books for me about each language—although I’m yet to write anything beyond the extremely trivial in either.

But I am starting to get some initial feelings about which way I want to go, although there are clear concerns with both languages. Nevertheless, based on my reading so far, I think I’ll focus first on Erlang—at least unless I stumble over something that seems like a total impediment to further progress.

For the record, I have several books about both languages, but the two I’m using at the moment are Real World Haskell by Bryan O’Sullivan, John Goerzen and Don Stewart; and Erlang Programming by Francesco Cesarini and Simon Thompson.


Mon, 29 Mar 2010

The End of Some Email Addresses

Along with everybody else who uses email, I have waged a long and always somewhat losing battle against the spammers of the world. Since almost all my incoming spam is addressed to old email addresses that I stopped giving out years ago and since almost nothing I care to read is now addressed to those addresses, I am about to kill them.

So, if you are thinking of emailing me at any of my “gba.oz.au” or “gbch.net” addresses, do it soon because all email to those addresses will be rejected by the server starting from a date in the very near future. There will be no whitelist, just a blanket ban.

Email addressed to me at my other domains will still be delivered, although it will be subject to pretty draconian anti-spam filtering and suspect items will just silently disappear into a black hole with no indication to the sender of the outcome.


Tue, 09 Mar 2010

My Next Programming Languages

I’ve been thinking (and talking) about learning some new programming languages over the past couple of years and it seems like time to make a decision about what to tackle. I’m not talking about learning a language just well enough to be able to poke at some crufty code that needs a tweak—what I’m interested in is learning languages well enough to seriously use them. And that, as Peter Norvig says, takes time. Which means I can’t learn every language out there.

After a lot of reading and thinking, I’ve decided that it has to be a functional language and that brings me to Erlang and Haskell. There are, of course, other candidates, but these two seem to offer the best opportunities for me at present, not least because both languages now have what appear to be good books available. And I like learning, at least in the beginning, from books.

It will be a while before I have anything to report about this plan, as I will first need to fit in my reading and practice until I can get something that I define as interesting ready for testing. At this stage, my tentative plan is to work with one language until I can write something easy but useful like a web server with it and then to reimplement the same thing in the other language and see how the two languages stack up.

It might not be a web server, but it will be about that size. And it will be something I don’t need to write (since, to keep with the web server example, I already have mature software handling that task for me), because I want to do this without any time pressure so that I can really delve into it. And it’s quite possible that I’ll just love whichever language I start with and not bother to even learn the other one. I really hope that I don’t hate them both, but I’m ready for that outcome too.


Sun, 07 Mar 2010

Goodbye RapidXen, Hello Linode Virtual Hosting


Update: Fri, 12 Aug 2011: See my RapidXen Update for newer, more cheerful news about my RapidXen setup and ongoing relationship. The original post follows.

 

About 18 months ago, I grabbed a couple of virtual hosts from RapidXen after reading some positive reviews of their services and prices. The prices are good and—for the first year or so—the service was pretty good too. So I extended my initial contracts, added a third VPS to the mix, and started hosting some of my services on these things.

Late last year, the wheels fell off the RapidXen service. Outages became longer and more frequent. Information about what was happening became impossible to get until well after the event. Then, because they were unhappy with their provider in one of the locations I was using, they announced a move to another city and another provider.

Fair enough, and maybe even a good thing. They at least gave plenty of warning about the move and announced that there’d be several hours of downtime while the equipment was removed from the racks in one place, put into trucks, and transported 400 miles south.

But the downtime and the outages were much longer than stated, and no explanations were offered until weeks later. Apparently, provider A refused to let them unbolt the servers in the old datacentre, hence the resulting fiasco. I can live with things going wrong. I understand that problems are sometimes out of people’s control. What I can’t tolerate is a complete failure (or refusal) to communicate truthfully to affected customers the real state of affairs.

So, despite the pain involved in moving my services elsewhere, I started setting up alternatives. Step 1 was to acquire a Linode once one became available in Fremont. There were some glitches with payment, but the Linode people were remarkably prompt to respond to my queries and fixed things effectively. Later there were a couple of other difficulties where I needed customer support; that too was almost instant and solved the problems nicely.

After running the Linode for a while and finding that it was reliable over a few months, I recently migrated most of the services I had been hosting with RapidXen to the Linode and that appears to be working well.

And, so as not to have all my eggs in the one basket again, I also acquired a second Xen-based VPS from a friend in Chicago, and I have migrated some other services to it, also successfully. The third string to my bow will be to use either Amazon or Google to host some other bits for me, but the final decision on which way to go is still in the future.

So I’m feeling much happier about the state of my Internet presence now. And I’m also feeling much better about my off-site data backup arrangements. All in all, the recent changes seem to be a good thing.


Fri, 05 Mar 2010

An Update on Mercurial Updates

I recently whined about my inability to discover how to keep up to date with Mercurial. And the fine denizens of the intarwebs came—partly at least—to the rescue. I now know how to grab the latest release and have successfully installed it on the seven different systems I cared about.

Even better, I also know how to clone the mercurial-stable repository (and, obviously, how to grab updates when I want them). For the benefit of others who may have the same question:

    hg clone http://selenic.com/repo/hg-stable

That will create a clone of the stable branch and put it in a directory called ./hg-stable. Then it’s a matter of doing hg incoming in that directory to see if there are any updates, followed by hg pull and hg update to get them into the working directory tree.
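In other words, once the clone exists, staying current is a three-command habit:

    cd hg-stable
    hg incoming     # list changesets sitting upstream, if any
    hg pull         # fetch them into the local repository
    hg update       # apply them to the working directory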

There’s still one little imperfection: I have not yet found any source of announcements about new releases or even important updates. I suppose I’ll survive without that, although it is nice to receive a notice when there’s an important fix available. And I can always set up a cron job to let me know if there are updates to consider.
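Something like this in a crontab would do it, since cron mails a job’s output to its owner and hg incoming exits with status 0 only when there are new changesets (the schedule and path are just examples):

    # daily at 08:00: say so if hg-stable has grown upstream
    0 8 * * * cd $HOME/hg-stable && hg incoming -q >/dev/null && echo 'hg-stable has updates'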

Anyway, thanks intarwebs, you rock.


Tue, 02 Mar 2010

Mercurial Updates

As revealed recently, I’ve decided to stick with Mercurial as my DVCS. But I’m not really inclined to use the out-of-date packages that some of my operating systems provide. Then I had an epiphany: it’s a VCS, so I should be able to just use it to keep itself up to date. Software is such wonderful stuff.

Sadly, my Google-fu is not up to the task of finding out how to accomplish this—or else the Mercurial folk don’t support it, but that seems unlikely. Somehow, I haven’t even managed to find out how to subscribe to an RSS feed to tell me about updates.

If any Mercurial-using person out there happens to know the secret answers to these puzzles, I’d be grateful for a pointer.


Sun, 21 Feb 2010

How To Drive Customers Away

My wife and I dine out frequently, sometimes out of laziness but mostly because we find dining out together to be a real pleasure. We have our regular restaurants but we also make a point of trying new places reasonably often. Today, the first two regular places we thought of were full when we tried to book, so we tried a new (to us) place that had been on our radar for some time.

We were both astonished at the number of things they got wrong, to the extent that we won’t go back there.

When we asked for a table, the waitperson neither replied nor even indicated that we should follow but marched briskly off to a table for two and waited, apparently impatiently, for us to get ourselves to the table and seated. She then dashed off to get us the water we requested.

By the time she returned, we knew we needed another table. This one was close to a giant screen, with one seat facing it, displaying some intensely boring winter olympic pseudo-sport. Not that the merit of the sport was the issue; it was the in-your-face TV in a place whose prices, at least, suggested that it believed itself to be in the “fine dining” category. We don’t go out to expensive restaurants to watch TV.

So we politely requested a table away from the screen. The waitperson treated this request as though it was bizarre, and an imposition, and probably impossible to comply with. We insisted and so she led us into another section of the restaurant and seated us at one of the ten vacant tables set for two.

Then another waitperson approached and rattled off the items on the relatively short menu that were not available, together with a much shorter list of specials that apparently were available. The fact that the Black Lip Mussels appeared on both the unavailable and the available lists was brushed aside.

We placed our orders for an entree, a couple of mains, a side dish and a couple of glasses of wine. Off went the waitperson. Some time later he returned to tell us that the wine we had ordered (at $12 a glass) was also on the unavailable list and he suggested another of the same variety but at $16 a glass. When we didn’t leap at that, he offered us something quite different without mentioning the price. We settled for the $16 wine of the variety we had originally requested. He said he would bring us a taste of the other wine anyway. I don’t know why he said that—he certainly did not bring us the taster.

The food, when it eventually arrived, was excellent. But the $129 bill for a small lunch for two abstemious diners, together with the multiple faux pas along the way, ensured that we will remember the name of the place only so that we never go back.


Thu, 18 Feb 2010

Chrome Is Useless For Printing

A few days ago, I announced a decision to try Chrome as a replacement for Firefox. I said that I would keep using it unless it fails to do something that I really want and it has now done just that.

Chrome has no print dialog or print preferences, and it insists on sending pages to the printer as US Letter. My printer is loaded with A4 paper and it knows that. So my printer—quite correctly—refuses to print the output from Chrome.

So I googled for an hour or so and discovered that my experience was common and well-known and that, even in version “5.0.307.5 dev”, there’s no solution. How utterly lame.

Since I’ll have to use Firefox whenever I need to print anything, I’ll have to go back to using Firefox. If somebody tells me that Chrome has been fixed, I’ll try it again. But I must say that my general attitude towards Chrome and Google is pretty negative right now. And, even if they give me back the two hours I wasted this afternoon, I’ll still be pretty unimpressed.


Mon, 15 Feb 2010

About Turn on Version Control Systems

Just the other day I wrote about my plan to switch from Mercurial to Bazaar for my version control system. Since then, I’ve had a few days away by the sea and away from computers and email and blogs and all that stuff.

One of the things I was thinking about during my little break was my ongoing problems with procrastination. And a little light went off in my head—changing from an almost-perfect DVCS to a possibly minutely-better DVCS is almost certainly a ploy to avoid getting on with things that actually matter to me.

So I’m going to abandon that plan to switch to Bazaar and I’m going to keep using Mercurial—at least until I find that Mercurial just can’t do what I need. And, of course, I won’t be reporting on the outcome of my experiment with Bazaar.


Wed, 10 Feb 2010

Switch from Firefox to Chrome

I whine a lot about Firefox and it continues not to improve at a satisfactory rate, so I decided to have a look at Chrome. My first few experiments showed me that it was far from ready for prime time on either OS X or Linux, but various people encouraged me to try the developer version instead of the regular one and so I gave that a fly last weekend.

And then I announced three days ago on Twitter that I was switching to Chrome on both OS X and Linux. So far, so good. There are things I don’t particularly like, some of which might change for the better and others of which I’ll obviously have to learn to live with. But, for the most part, I like it better than Firefox. It seems quite a bit faster. And, although it consumes a lot of system resources, it seems to leave me with a system that still allows me to do other things. So far, it hasn’t crashed.

Some elements of its handling of tabs please me a lot, other elements not so much. It did a good job of importing my Firefox settings, although it insisted that I shut down Firefox before it would do the import. Under Linux, it seems to have trouble getting access to the sound system, although many YouTube videos are better silent.

The Linux instance I have running has been going for three days. It is using a bit over 2 GB of memory—which I think is rather a lot, but I can live with it on my main machine. It has 45 processes. I have 7 windows and 68 tabs open—light use for me, but I’m not doing much with it at this time of the year.

Under OS X, it’s much less busy as I just fire it up when I need it and never leave it running for long—the machine is a laptop which is only used when I’m away from home.

The verdict after three days: I’ll keep using it for a while until it either does something dreadful or fails to do something that I really want. It would be really nice to be able to stop whining constantly about my browser.


Thu, 04 Feb 2010

Another Look at Version Control Systems

I’ve been using version control systems forever—well, back to the days of SCCS anyway. Every few years, I survey the scene to see if there’s something that better fits my current needs. That’s how I came to use RCS instead of SCCS. Then I found CVS and, after some hesitation, migrated all my RCS repos to CVS. And then I found I hated some of the weaknesses of CVS and migrated back to RCS.

There things stayed until Subversion was ready for real world use. I chose not to migrate old work, but just started using svn for new projects and then for new work on old RCS-managed projects. That went pretty well and served me for some years.

But, as Subversion was hitting its stride, other people were working on distributed revision control systems and I started watching those projects. From time to time, I would spend a few days having a good look at the obvious contenders. A couple of years ago I felt there were a few that were ready to be considered: Git, Mercurial, Darcs, Bazaar all seemed interesting. After some consideration, I chose Mercurial and I have been happy with it.

But Bazaar, or bzr as it’s called on the command line, had been a close second in my assessment. Bzr was let down by some performance issues and appeared to have a few other minor rough edges.

Recently, I’ve had another look at the various DVCSes as part of another project and I think there’s very little to choose between Git, Mercurial and Bazaar. It comes down to comfort with the command structure and support for the workflows that you might want to adopt. For me, Git is still too clunky to use—it takes more typing to get the same result. But I think Bazaar has just moved ahead of Mercurial in terms of workflow options and it seems to have caught up in the performance area.

So I’m going to use Bazaar for a couple of new projects and I’m also going to convert a couple of active Mercurial projects over to Bazaar. And, in a few months, I’ll have an opinion about the wisdom of that choice and I’ll write about that in due course. I know I haven’t exactly explained my choice, but that’s deliberate because it really is a fine distinction and I’m pretty certain that Git, Mercurial and Bazaar are all fine systems.


OS X Fails to Please

I’ve been using Apple laptops for a number of years in order to have access to some specific capabilities, but I have always found it hard to come to terms with the limited functionality of OS X as a work environment. Nevertheless, when I acquired my MacBook Pro recently, I decided to just go with the flow and learn to use Snow Leopard as it was meant to be used. And that has worked out quite well for the purposes that I normally use the MacBook for—email, IRC and web browsing while on the road.

But I recently had a reason to use it for my normal work stuff. I needed to visit a Mac retailer for some minor item and stopped to look at the 27-inch iMac, where I became entranced by the display and, to a lesser extent, by the neat overall package.

This led to thoughts of possibly buying one of these things, which in turn led to thoughts of discomfort with OS X.

So I decided to try out OS X on a decent-sized display instead of the teensy thing on the 13-inch MacBook. I hooked the MacBook up to a 24-inch display to see how things might work. This brought me into contact with Apple Fail Number 1: getting stuff onto the display you want it on is a black art, and in some cases it’s only possible to start an application, see where it lands and then drag it to the desired display. That was hugely unimpressive, but wasn’t the point of the exercise, so I tried to ignore it while doing my testing.

I believe I succeeded in concentrating on the factors that would be relevant with a single large display running OS X. To give it a fair go, I used this setup for three days as my desktop environment. But that was as much as I could stomach. Gnome—whether under FreeBSD, or OpenSolaris, or Linux—is just so much better to work with than OS X that it’s really not even a contest.

The upside of this is that I’ve saved $3k that I had put aside for the iMac which I could now partly apply to the bicycle that I’ve been thinking about buying as part of my fitness program. Another upside is that I won’t be constantly chafing against all the annoying little restrictions that Apple impose on their customers. So, although I will slightly regret the decision not to add something shiny to my desk, I think I’m probably more pleased than sad.


Fri, 29 Jan 2010

To Do Lists and Life

In my life as a stellar-class procrastinator, I have evolved a variety of techniques aimed at convincing myself that I’m doing something about my desire to get stuff done. Probably the most well-used of these—apart from straight-out avoidance—has been via the creation of numerous lists of things to do. At its best, this results in lists of lists and lists of lists of lists. And, since I am a software person, that then results in an occasional burst of time-wasting in search of the ideal software tool for making lists.

Mostly, I find a few tools that I haven’t seen before and a few that I have tried but which I hope might have improved just that little bit to make them useful. Always, I spend a day or two playing with the tools I find and sometimes I choose one—only to discard it a week or so later.

Usually, I keep these little excursions into procrastination secret. But today’s find—todoist—has already impressed me as being much better than anything I’ve tried previously, so I’m going to give it a mention here in the hope that it will force me to just get on with ticking off stuff from my new lists (and also to let people who like lists know about a new toy).

And, to be honest, the other reason for posting about it is so I can tick off the item about blogging every Friday.


Fri, 11 Dec 2009

Issues With OpenSolaris — The GNU Tools

After my recent post about giving up on OpenSolaris I received a few requests for more information from some people who were prepared to jump through the hoops of contacting me despite the lack of comments on this blog. This is one such followup. I plan more.

For reasons which are probably understood somewhere inside Sun, but which I believe to be at least in part a requirement to support legacy software, OpenSolaris is still delivered with antique, if genuinely Unix, software tools. If an innocent newbie whines about the fact that the supplied awk or tar—to pick just two examples out of many—is unable to handle just about any task that someone might expect in 2009, she will be told that the GNU utilities are available. Not only are they available, but they are available in multiple ways.

To take the case of awk, old awk is /usr/bin/awk, GNU awk is /usr/gnu/bin/awk and that one turns out to be a symlink to /usr/bin/gawk. So you can get GNU awk either by calling it gawk or /usr/gnu/bin/awk, which might tempt you to put /usr/gnu/bin in your $PATH before /usr/bin. If you do that, you might manage for hours or days until, for example, you need to read a man page. At that point, man will be mysteriously broken, because it depends on the old Solaris versions of some of the formatting tools rather than the GNU versions. The idiots who built man for current versions of OpenSolaris have apparently forgotten something I thought all Unix people had known for at least the last 25 years: in system tools, you exec full pathnames rather than relying on the user’s $PATH to find the right ones.
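The principle is easy to illustrate. A system script that locates its tools via the caller’s $PATH breaks as soon as the caller reorders it; one that names full pathnames keeps working regardless (the paths and the one-liner are illustrative):

    #!/usr/bin/sh
    # Fragile: picks up whichever awk appears first in the caller's $PATH
    awk -F: '{ print $1 }' /etc/passwd

    # Robust: name the exact tool the script was written and tested against
    /usr/bin/awk -F: '{ print $1 }' /etc/passwd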

One last note for today: despite the fact that “everybody” knows that shell scripts start with #!/bin/sh, in OpenSolaris they start with #!/usr/bin/sh, even though the historical formulation would work. Why? Because at some point, they did away with /bin and moved everything into /usr/bin. While it’s true that /bin survives as a symlink to /usr/bin, gratuitous changes like that really don’t help anybody.

Sane readers will quite possibly feel that this little essay is hardly sufficient reason to abandon OpenSolaris—and I would agree with that. But there is much more and I’ll try to cover some other issues in the near future.


Tue, 01 Dec 2009

Second Thoughts on OpenSolaris

After my recent post about migrating to OpenSolaris, I’ve had cause to re-think. For me, there was one really significant argument in favour of the move and one rather significant other argument. The biggie was ZFS, the other was dtrace. They are both compelling, but the more I push against all the things that make OpenSolaris painful the more I think that it’s not worth the pain.

I had been thinking of writing at length about the reasons for turning away from OpenSolaris, but I think I might defer that (at least unless I see significant interest in my thoughts).

That leaves the question of which way to jump. I’m a long-term FreeBSD user and FreeBSD does have ZFS and dtrace—so there’s some incentive to go back there. But the truth is that many of the things that make OpenSolaris a pain also apply to the BSD camp. Again, I can expand on that if people care.

When I first went to FreeBSD, it was the early days of Release 2 and Linux was far from ready for prime time. Linux is still not ideal, but it has enough people working on it to ensure that big nasty things like Gnome and Firefox and OpenOffice are as up-to-date as one might reasonably hope for; and it has lots of people writing software aimed at Linux that becomes painful to port to other systems. So, for what it’s worth, I think Linux is the path of least pain. On top of that, all my VPSes in the USA run on Debian and so I need to have Linuxisms burned into my fingertips anyway.

Which Linux? I’ve had just enough experience with Ubuntu that I think that will be my choice for boxes that have a desktop function and I’ll stick to Debian for my servers for now.

One benefit from this decision is that I expect to be able to power down most of the 7 computers that pump heat and noise into my office at home while consuming many kW of power. There will inevitably be things to regret in this plan, but I think I can live with that. I will keep at least one powerful box with 4 disks in it for temporary installations of alternate operating systems for those occasions when I want to check some software on various platforms, but I’ll only power it up when I’m actually using it.


Fri, 20 Nov 2009

Migration to OpenSolaris

This entry is really just to verify that the migration of the development site for this blog to an OpenSolaris host has been completed successfully. The deployment site is still running Debian and that’s not likely to change any time soon.


Premature Upgrade

You are a technically competent geek who has been in the sysadmin world for decades. You have many machines under your care. One of your machines is a fax server that sends hundreds of faxes a day. Your operating system is going through the pre-release stages of getting a new major release out, which has many new features and many changed features, as you might expect in a change from release 7.2 to 8.0. A release candidate for 8.0 is announced, so you grab it.

So far, all is good. But then you blindly upgrade your one fax server to the release candidate and discover that the completely new (and not at all secret) serial I/O system doesn’t work quite right with your hylafax setup. You already know, from at least 10 years of experience with it, that hylafax is demanding and that issues with the serial hardware or software result in bad things happening.

This is where you are supposed to say, “Oops, silly me. I should have learned not to do that by now. Quick, let’s unwind that to a known working setup real fast before this turns into a disaster.”

But no, this person decides to conduct pointless experiments instead of unwinding his mistake. And he also finds spare time to complain to the providers of the free operating system he has relied on for so long.

This just doesn’t make any sense to me. I’d love to think that people could learn from their own mistakes and even from other people’s mistakes—but sometimes that seems like a foolish dream.

And yes, I deliberately avoided mentioning names or providing URLs. I’m not interested in having a go at any individual, just using a real current case as a cautionary tale.


Sun, 15 Nov 2009

Apology to Apple: I Was Wrong

Earlier today, I whined about Apple’s deletion of the line-in port on my MacBook Pro. Today, while fiddling with sound settings, I discovered by accident that there is a line-in port.

Careful inspection of the case showed only one possible candidate, clearly labelled for audio output. Just for giggles, I plugged in an audio input and told QuickTime to record it and presto, that worked.

It still seems silly to make one receptacle do dual duty, but I can live with that. It seems worse than silly to mislabel it, but at least I now know that it serves two functions. Hopefully, anybody who took any notice of my post yesterday will now know the truth.


Apple Gives With One Hand

A few months back, I bought a MacBook Pro for a bunch of reasons, of which the most important was my expectation that I would be able to just do things when I wanted to. That expectation has been largely met and — apart from the usual annoyances arising from the irritating OS X user interface and its inability to be configured to my taste — I’ve been quite pleased with it.

A couple of days ago, I wanted to do something and immediately thought of the MacBook as the enabling tool only to be thwarted by Apple’s decision to remove the audio line-in jack from later models of the MacBook Pro. Yes, they added a couple of other trinkets to compensate, but still this deletion is incomprehensible to me. If they had actually run out of space on the edges of the beautiful case, I’d have understood; but there’s plenty of room left.

Yes, I can probably buy a USB sound thingy and expect it to work. But my visitor wanted a copy of an old audio cassette with an interview that we’d done in the past, and I had thought it would be easy to manage on the spot — I was sure I’d seen somebody with an audio cable plugged into a MacBook Pro before, and it hadn’t occurred to me to check that my more recent machine had the same capability.


Thu, 05 Nov 2009

Firefox Keeps Finding New Ways to Fail

I have been using—and whining about—Firefox since it first appeared. It has improved in many ways since the early days, and I am pleased about that. But it still finds new and astonishing ways of driving me crazy.

I have been using an Ubuntu-badged variant with the ridiculous name Shiretoko, and it appears to be based on Firefox-3.5.3 (which I know is not the latest, but it’s pointless going through upgrade pain when I am on the verge of changing a few other elements of my desktop—new motherboard, new memory, and a non-Linux operating system).

The new misfeature in this Firefox is that it constantly appears to freeze, for between 8 and 25 seconds. This is sometimes accompanied by greying out the Firefox windows, which seems to say that Firefox knows that it’s dragging its feet. The only good thing is that, whatever it happens to be doing while it’s doing nothing useful, it doesn’t also elect to bring the rest of the machine to its knees—all other windows and processes are able to operate quite normally while Firefox is thinking. (Which is why I’m able to write this as I wait.)

The machine is a quad-core 64-bit Intel thing with 8 GB of memory, so it should be more than enough to handle simple web browsing. By simple web browsing, I include accessing static web pages on my LAN, and even that is afflicted by this bizarre behaviour.

A partial list of the activities that provoke this behaviour includes: pressing a key to scroll the page down; clicking on anything; attempting to grab a scroll-bar to do the obvious; typing in an input field; and so on. This makes net banking a fraught and perilous exercise, as it’s necessary to wait for up to 30 seconds to see if the click you made was actually registered or not—you don’t want to be clicking bank buttons more than once, but it’s not a lot of fun if the bank website times you out near the end of some complex transaction while you waited to see if your click was being processed.

It’s almost enough to make me go back to bricks and mortar banks. It’s certainly enough to make me hope I can get some other software installed pretty soon. And it’s certainly sufficiently annoying that I’d be hard-pressed to maintain my normal exemplary politeness if I met a Firefox developer any time soon.

Oh, that bit about my normal exemplary politeness was a joke. Just so you know.


Wed, 28 Oct 2009

Two Weeks of Dspam

It’s now two weeks since I set up dspam-3.9.0-BETA1 to handle my home network’s incoming email and it’s time for a review. I began by training dspam with a recent corpus of about 70k spam messages and 10k ham. Then I passed everything through dspam and checked its accuracy.
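For anybody wanting to do the same, dspam ships with a dspam_train utility for exactly this kind of bulk training from a stored corpus. The invocation is roughly like this (the username and paths are examples, not my real ones):

    # train a user's filter from directories of stored spam and ham
    dspam_train gjb /path/to/spam.corpus /path/to/ham.corpus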

In my home situation, we can live with some missed spam turning up in our inboxes, but we can’t live with false positives. Dspam made one false positive out of 11,873 messages processed and that was in the first few hours. I’m ready to stop checking for false positives now and have started just dropping the spam on the floor.

Over the two weeks, I’ve seen only 17 spams per day out of the 435 that arrive for me (a catch rate of about 96 per cent), and my wife has seen only 4 per day out of her 320 (nearly 99 per cent). I’m calling this a great success and have decided that it’s sufficiently good that I don’t need to implement any other anti-spam measures at all.

The minor downside with the methodology I’m using is that any false positives will never be reported to anybody now that the testing phase is over. If anybody sends me a genuine email that dspam thinks is spam, I won’t see it and the sender won’t get a bounce. I can live with that.


Wed, 14 Oct 2009

Clawing Back Time

I have a project to reduce or eliminate some of the things that take up a fair bit of time but which don’t seem to provide commensurate benefits. This project advances in fits and starts, but today sees a couple of steps forward.

One of the things I did was to deactivate my Facebook account. That was quickly done, although I did need five attempts before I got a captcha that I could decode to complete the deactivation process.

The other thing, which took most of the day, was to set up a new dspam installation to help in the fight against the incoming deluge. I’ve been a keen dspam user for ages, ever since I set up a test installation to see if it had promise. Finally, I’ve updated it to the latest version, put it into place properly, trained it on a corpus of 10k ham and 70k spam that I’ve carefully assembled over the past few weeks, and am now watching it do an almost perfect job.

So this is all good. Now I need to move on and do some of the things I want to do with the free time—but first we are having a little holiday for a couple of weeks.


Sat, 10 Oct 2009

Mail Client Software Keeps Getting Clunkier

There was a time, before the WWW, when email client software was clumsy beyond belief—those who remember the original mail command and UUCP bang-path email addresses will know that things improved over a decade or two.

Then, with the growth of the Internet, graphical mail user agents (MUAs) appeared. Some of them were better than others, but they all suffered from some irritations. At the same time, the text-based MUAs continued to be developed.

Then, just when you might have expected that we were on the threshold of some really good software, things just stopped. I like to blame Microsoft for developments that I don’t like, but I don’t know if that’s fair in this instance and it’s not really important.

One of the early graphical MUAs was Exmh and, despite some clunkiness, it was a pretty useful utility. So much so that I persuaded my wife to use it when she decided to enter the email age. And she has been happy with it for about twelve years. I also used it for a few years, but eventually changed to a text-based MUA as I found myself dealing with ever-increasing quantities of email and discovered that I preferred the speed of the keyboard over the purported convenience of the mouse.

And there things stayed for several years. Recently it became necessary to update my wife’s computer. It was a seven-year-old box running an almost equally old operating system; the hardware was on its last legs; and some of the software (e.g., Mozilla-1.x, OpenOffice-1.0) was simply inadequate to handle modern websites and data. There were also more than a few security vulnerabilities in the operating system.

The search for a replacement, which she wanted to be silent, first led to selection of a Sunray thin client workstation. A number of factors resulted in the abandonment of that plan, but one thing that happened while I was exploring it was the discovery that Exmh, which has not been further developed since version 2.7.2 was released in January 2004, was probably not going to be an option on the intended Solaris platform.

That was no surprise, since it’s what happens to older software that doesn’t match the dominant design. So it became necessary to research alternatives that she could live with. There’s no shortage of choice and I won’t list any of them here. Where there is no choice is in the user interface—yes, there are differences, but they are insignificant against the overall architecture. And all of them, although faster than Exmh at things like actual message display, are much slower and more painful to use. I tested several, reviewed all those I could discover, and they were all the same.

Eventually, I set up the one I thought best (for various other technical reasons not relevant to this discussion) and tried to teach my wife how to work with it. This was a disaster. She was already upset about the other changes I was going to force on her for the “upgrade”, but she uses email frequently now for her work and the modern software simply didn’t cut it for her.

Fortunately, the Sunray project died for other reasons and I had to find an alternative. And that machine, an Eee PC that was originally intended for me, runs Ubuntu and still provides Exmh as an optional package. Crisis averted. For now.

Sadly, I see no signs of any of the MUA authors making any effort to make their software more functional—adding bling is popular, but you’d think these people would use their software and would get frustrated with its clunky behaviour and would therefore want to improve it.

I still hope that something better than anything we have now will arrive in the next three to four years so that, the next time I have to upgrade my wife’s computer, I’ll be able to introduce her to a new MUA that she will be able to learn to like.


Tue, 06 Oct 2009

An Error of Thumb

Late last night, as I positioned our largest sharp kitchen knife to make a cut, I thought to myself, “If I observed somebody else holding a knife like this, I would warn them about the risk of injury that was attached to such incorrect knife usage.”

Moments later, as I was trying to stop the blood spurting all over the kitchen, I realised that I could now complete my warning with a remark about how effectively I had demonstrated the said risk. Fortunately, since I was holding the knife in my more dominant hand, the thumb I attempted to sever was on the less dominant hand. But it still keeps getting in the way and every bump sends urgent don’t do that messages to what passes for my brain.

That makes two “accidents” with that knife in two months. It might be time to learn a lesson while I still have the usual number of thumbs.


Wed, 30 Sep 2009

Another Go At ZFS?

A few days ago, I wrote: “I’m going to install Ubuntu and fuse-zfs on one of my machines …” To my surprise, especially in the absence of comments on this blog, I got quite a bit of feedback about that idea—all of it indicating that this idea would fall well short of my expectations for ZFS.

What to do? One idea that has occurred to me is to tentatively blame the motherboard/memory in the system that was giving me grief under OpenSolaris. That then allows me to justify buying another motherboard, new memory and video card and having another crack at running OpenSolaris with native ZFS on my workstation.

I’ve bought those bits and pieces and plan to install them and experiment in the very near future. In the meantime, I will continue my planned server setup on a different box that will also run OpenSolaris and ZFS, but that machine won’t be bothered by pesky irritants such as a screen or keyboard. I expect it to behave nicely. As for my desktop box, we’ll have to wait and see.

That will also force me to learn to love OS X since my MacBook Pro will be pressed into service on the desktop for a while. Perhaps I’m not too old to learn new tricks …


Fri, 25 Sep 2009

Using Linux and Wanting ZFS

I’ve been using Linux in a limited manner for about four years, meaning that it has been installed on at least one machine that I use fairly regularly over that period. In common with all the other operating systems that I have used, it has its good points and its shortcomings. But, in recent years, the strengths have got stronger while also becoming more important to me and many of the weaknesses have been addressed.

However, it is what I now regard as the failure of my quite intense efforts over the last year to make friends with Solaris Express and OpenSolaris that has forced me to look hard at Linux as my main operating system for the near future.

There is just one serious fly in the ointment—the unfortunate fact that none of Linux’s multiplicity of file systems is a match for Sun’s ZFS. At first, I thought ZFS was something that was at least theoretically a good thing, but its unfamiliarity made it seem like something that you could live without. However, in a remarkably short time, ZFS becomes ridiculously easy to use and that’s when I started to see just how big a step forward it is. I really don’t want to go back to old-style Unix file systems.

Unfortunately, due to the old wrangles over which open source licences are good and which are not, the Linux people don’t feel able to adopt ZFS. I see that btrfs is being developed and that it is hoped that it will bring the features that I love in ZFS to Linux. But btrfs is years away, and I need a file system today.

I’m going to install Ubuntu and fuse-zfs on one of my machines at the start of next week to see how well that combination works. It’s far from ideal—fuse-zfs has been pretty well abandoned, as far as I can tell; and it is well behind the zpool/zfs versions that are now in Solaris. But if it works well enough I’ll give it a go and then I’ll cross my fingers hoping that Sun might fix the ZFS licence problem as they have finally managed to do with Java.
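If it goes ahead, the experiment itself should be cheap to set up. The project is packaged for Ubuntu (under the name zfs-fuse, if I have it right) and the pool and file system commands work as they do on Solaris; the device name below is just an example:

    # install the FUSE port and build a pool on a spare disk
    sudo apt-get install zfs-fuse
    sudo zpool create tank /dev/sdb
    sudo zfs create tank/home
    sudo zfs set compression=on tank/home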

My other option would be to setup a server running FreeBSD with their implementation of ZFS and to use it as a file store for my Linux desktop machines. I’d rather just run Linux, but we’ll have to wait and see.


Wed, 23 Sep 2009

Perils of Perfectionism

I began this blog in May 2004, largely as an experiment to see if this new-fangled blogging thing was something I wanted to do. I kept at it, more or less, until the end of 2007—by which time I’d decided that I did want to continue blogging.

And that’s when things went wrong. I had already more or less decided that I really needed two blogs—one for more technical material and one for all the rest. And I knew that I wanted some features that were not provided by the simple software that I had been using for my experiment. So it was then just a simple matter of choosing names for the new blogs and selecting one of the full-featured blogging platforms and I could get on with things.

Almost 20 months later, I’m still wrestling with these weighty questions. And I’m completely out of what habit I had developed of writing. I have chosen and then rejected about a gazillion names for the new blogs; and I have chosen and then rejected and then re-chosen and then re-rejected just about every blogging platform in existence. Nothing seems to be just right for my needs.

And so, even more belatedly than usual, I have come to realise that my perfectionism has become the obstacle to getting anything done.

The way forward from here now seems clear—I will resume writing posts in this blog (in a new location, but retaining its original URL) and when I make those difficult decisions about names and blogging platforms, I’ll announce them here and allow this blog to rest in peace.


Fri, 21 Dec 2007

Xmas Parking Perfection

At this time of year, pressure on parking spaces at Indooroopilly (and no doubt other shopping centres) gets pretty intense. But this morning there were a few spaces available in the reserved parking outside the post office for people to use when collecting their mail from the PO boxes there.

Even so, the hero of this story felt that the effort of parking her tiny car in a marked space was too much, so she just stopped in the roadway, locked her car and dashed across to get her mail. By the time she had the box open, both owners of the cars she had blocked had arrived at their cars and were looking a bit irritated to see they had no way out. However, everybody relaxed as the pest dashed back to her car with her mail.

This was when she discovered that she had managed to lock her keys inside the car. She had also locked her phone in there with the keys. So she borrowed a phone from one of her victims, eventually worked out what number to call for information, got the number for RACQ, called them, waited for a bit, and finally announced that they would be there—probably within an hour to an hour and a half. The trapped owners were as ecstatic as you might expect. The drivers of all the cars that had to squeeze past looked unhappy. And I had emptied my PO box by then, so I went home.


Wed, 08 Aug 2007

Code Craft falls down hard

I know it’s not possible to write a big book without having any errors fall through the cracks, and I don’t make a habit of public excoriation of people for things that can be forgiven — but there are unforgivable things.

Take Code Craft by Pete Goodliffe, published by No Starch Press, as an illustration. Here we have a 580-page tome dedicated to the practice of writing excellent code, and on page 13 it has an egregious example of unforgivable content.

Before getting to the details, I would mention that neither the book nor the website gives any information that I could find in a reasonable amount of time about how to report errata. Had there been such an avenue, I’d have taken it. As it is, this seems the easiest approach.

This is in Chapter 1, On the Defensive, subtitled Defensive Programming Techniques for Robust Code. Under the heading Use Safe Data Structures, he gives the following example of some C++ code:

    char *unsafe_copy(const char *source)
    {
        char *buffer = new char[10];
        strcpy(buffer, source);
        return buffer;
    }

He then gives the correct explanation of the problem with this code when the length of the string in source exceeds 9 characters. After some discussion, he says it’s easy to avoid this trap by using a so-called “safe operation” and offers this idiotic solution:

    char *safer_copy(const char *source)
    {
        char *buffer = new char[10];
        strncpy(buffer, source, 10);
        return buffer;
    }

In case the reader doesn’t know how the C string library (which is what is being used here, despite the otherwise C++ content) works, let me point out that strncpy is guaranteed not to solve the problem under discussion. The strncpy function will only copy at most the specified number of characters, but — in the critical case where the source string is too long — it will not add the very important NUL-terminator character. And so users of the returned buffer will still fall off the end of it and cause breakage.
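
For what it’s worth, here’s a minimal sketch of the obvious repair (my code, not a correction offered in the book): since strncpy won’t add the terminator when it truncates, add it yourself. Silent truncation is still questionable defensive programming (checking the length and failing loudly would arguably be better), but at least the returned buffer is always a valid string:

    #include <cstring>

    char *fixed_copy(const char *source)
    {
        char *buffer = new char[10];
        strncpy(buffer, source, 9);  // copy at most 9 characters into the 10-byte buffer
        buffer[9] = '\0';            // guarantee NUL-termination even when source is too long
        return buffer;
    }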

Every C or C++ programmer who has been paying attention knows what is wrong with the C string library and knows how to use it correctly. So an error of substance like this should simply never have happened. It’s not a typo. It’s not a trivial error. It’s just plain wrong. And there’s no excuse for it.

I’m sure the author has many good things to say in this book and many of the sentences I have skimmed certainly do make sense. But stuff like this makes it impossible for me to suggest that it has any place on the budding programmer’s bookshelf. That’s a shame, because we need books that do what this book purports to do.

What irritates me most about this is that none of the book’s reviewers spotted this glaring error and none of the online reviews that I found noticed it either. This means that nobody with even a tiny clue has been looking at it.


Wed, 04 Jul 2007

Python 3000

About three years ago, I announced my plan to move away from Python for future development work. I returned to that theme twelve months ago in a couple of posts about recent experiences with Python.

It seems time to update things now. I have just been reading Guido van Rossum’s Python 3000 Status Update in an attempt to understand what the future holds for Python.

Clearly, the Python people have decided to make major changes to Python, such that software written for Python 2.x will need work before it can be expected to run on Python 3. Equally clearly, a great deal of work has gone into creating mechanisms to assist programmers with the necessary translations when the time comes, and that’s something I applaud.

However, I have long been unhappy with Python’s continual introduction of what I see as gratuitous changes and have been looking at alternatives. Now seems like the time to jump ship. My plan is to do some serious testing with alternative languages so that—when the time comes for me to write some new thing—I will be ready to do it in some non-Python language.

This post is just to mark the point where that decision was finally made and to link to the Python 3000 paper that marked the tipping point.


Wed, 30 May 2007

A new approach to spam filtering

About three years ago, I first considered DSPAM as a potential solution to the incoming tide of spam that was drowning me and increasingly overwhelming SpamAssassin, my then tool of choice. I wrote a couple of blog entries that discussed my research and included references to papers by Jonathan A. Zdziarski (the author of DSPAM) and Gordon Cormack (who, with Thomas Lynam, wrote an evaluation of anti-spam tools). I also mentioned some of my discussions with both Zdziarski and Cormack and said I would report more when I had more information.

Much time has passed and the spam problem has, as we all know, continued to get worse. Last December, having become completely fed up with the worsening performance of SpamAssassin, I decided to install DSPAM for testing. I elected not to bother training it, but allowed it to do its thing and contented myself with informing it of its errors. The downside was that I had to look at every incoming message, whether spam or not, to be sure of the classification. I have examined 82,931 messages in the last five months and I’m amazed at how well DSPAM works.

Overall, it has caught 98.92% of all spam and its false positive rate has been 0.02%. Most of the errors were in the first and second months while it was learning. Now, it is catching over 99.2% of spam with a false positive rate below 0.01% and there have been no false positives at all for a couple of months. For my wife, the learning was a little slower because she receives much less total email than me and her legitimate email volume is so small that it’s a bit of a challenge to get enough for training. However, even in her case, the detection rate is up to 98.90% and false positives have also disappeared.

I was going to modify qmail to reject messages that were deemed to be spam, but I’ve decided that it’s too much work, given the ickiness of the qmail code and the excellent performance of DSPAM. I also toyed with the idea of changing MTA, but I have not found an MTA that I would be willing to use that also has the ability to do what I want. I may one day decide to write my own MTA for in-house use, but for now I’m going to stick with qmail and the other modifications I had made to it in the past and—starting right now—I’m going to stop my practice of reviewing incoming spam in case any legitimate email is lurking there.

In other words, from now on, if anybody emails us and DSPAM thinks it’s spam, nobody will ever see the message. There will be no bounce, there will be no error message, there will be no sign that the message was lost. But it will be irretrievably lost. I have decided that the time spent on reviewing the spam is not worth the rewards, when the chances of finding a real message seem to be less than one in a million now. Especially since anybody who might need to contact us, and whom we would care to hear from, has other ways of reaching us.

I am really delighted to have got to the point where my spam load consists of hitting ‘S’ once a day to tell DSPAM about something it has missed.


Fri, 20 Oct 2006

Software quality

It’s hard to find examples where the two words in my title belong together. People of all kinds–users of software, software developers, and those who teach the next generation of developers–have been pontificating for decades about both the problems with software and various approaches that might help to solve them. But, as a general rule with almost no exceptions, software still sucks. And it’s getting worse, not better.

Anybody who happens to read this already knows that software is a problem, since they have to be using quite a bit of software just to read a blog–and it’s my contention that just using software is enough to drive you to drink.

I’m a software developer, so I am fully aware of the difficulty of creating high quality software. It is indeed difficult to produce software with no bugs and, for most software, it’s probably impractical–or at least not worth the cost. But that’s not to say that the quantity of bugs in the stuff we all have to deal with every day is justifiable.

Here are a couple of examples from some of my appliances. I have a DVR. It has a 60G hard disk and a DVD writer. And it doesn’t have to do much. So why does it take 30 seconds to boot up? Why does it have to boot up after making a recording? Why can’t I set it to record something that starts less than five minutes after the previous item? Why can’t it sort titles alphabetically using the same rules as anybody else?

To elaborate on the last point, as it’s a classic example, consider the following list of titles:

  • Apple
  • Apple Tree
  • Azimuth

That’s sorted in the way that any sane person would expect. Getting software to sort it that way is child’s play. The Unix sort program will sort it that way by default. So what does my DVR do? Behold:

  • Apple
  • Azimuth
  • Apple Tree
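
To make the child’s-play claim concrete, here’s a trivial sketch (mine, purely illustrative; C++ chosen for concreteness): the default string comparison in just about any mainstream language or library produces the sane ordering without any effort at all.

    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <vector>

    int main()
    {
        std::vector<std::string> titles = { "Azimuth", "Apple Tree", "Apple" };
        std::sort(titles.begin(), titles.end());  // plain lexicographic comparison
        for (const std::string &title : titles)
            std::cout << title << '\n';           // Apple, Apple Tree, Azimuth
    }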

I’ve had enough time to study its behaviour now, so I can choose titles for things that it can sort the way I want–but it’s completely crazy that I should ever have been driven to think about this. I don’t want to belabour the point about this one appliance too much, so I’ll limit myself to one other bizarre fault. It has, as I mentioned earlier, a 60G disk. So it can store quite a number of off-air programs for viewing at more convenient times. Well, it could do that if it didn’t have a limit of 7 recording slots. That’s right–seven. The first VCR I ever owned, more than 20 years ago, could be programmed for more than seven recordings.

I’m sorry, I lied. I’m going to mention one more thing about this device–because it’s faintly possible that this is a deficiency in the hardware rather than the software (not that I believe that for a minute). The advertising material and the manual for my DVR claim that it can copy between media at accelerated speed. That seems to be a reasonable capability, given what we know about other such equipment, so I expected it to work. But it doesn’t. Copy an hour of video from the hard disk to a DVD and it takes exactly one hour. Copy an hour of video from a DVD to the hard disk and that also takes exactly one hour.

And, for another instance that might be hardware but probably isn’t, consider the fact that it makes DVDs that quite likely can’t be read by at least a few of the 8 other DVD readers in the house. Sometimes, to show just how clever it is, it can make a DVD that it can’t read itself. And when that happens, the only way to get control of it again is to remove the power and physically remove the offending disk.

That device has a flotilla of other hideous bugs that make it a nightmare. My wife just leaves it to me to drive it. She is pretty smart.

Now, let’s turn to my phone. No, let’s not. At least, not beyond mentioning that it’s slow and as buggy as hell too. Having once worked as a consultant for one of the mobile phone manufacturers–where my task was to teach their programmers C and to help them with some of the interesting bits of their software–I’m not surprised to see that this phone has software that I wouldn’t spit on.

The trouble is that software is ubiquitous in our world. Even if you never touch a thing called a computer, you can’t escape it. So you’d imagine that we–the software creators of the world–would have figured out some of the basics of making software by now. But we simply haven’t come close. And nothing I see in our tertiary institutions makes me think that’s about to change all of a sudden.

One of the good things about the world of software is Free Software. (For anybody who doesn’t recognise why those two words were capitalised, have a look at this definition for some insight.) Sadly, the Free Software people are at least as bad as the rest of the software community when it comes to quality. The Free Software crowd have a bunch of silly slogans written by would-be philosophers without much insight, such as the famous “Given enough eyeballs, all bugs are shallow”, usually attributed to one of the worst poseurs in the community.

The problem is that bad software is much easier to get done than good software. Of course, if you consider the subsequent investment of time by the software authors while they try to address the worst bugs, then the apparent speed of the favoured method seems less of a sure thing. And, heaven forbid, if we consider the time lost by the unfortunate users of this software, then the equation becomes ridiculous. Time spent on whatever process we can find that results in fewer bugs going into the product will be amply rewarded. And, as many people are now showing, it is almost certain that taking the time to get it right in the first place will be quicker than rushing bug-infested rubbish out the door–certainly once the developers have had time to become established in this new pattern of work.

Just as I avoided listing brand names of my appliances above, I’m not going to single out individual pieces of software for criticism here. But it’s pretty safe to say that any high-profile piece of software is almost certainly riddled with thousands of maddening bugs. And the bigger the software, the worse it will be, for reasons that will be obvious.

As a final comment on the sad state of things, I’m going to look at the state of programming languages–again without naming names. I’m not talking about the suitability of our languages for developing great software, although that is indeed an important matter. I’m just talking about the woeful state of the software in the interpreters and compilers themselves. I recently acquired a 64-bit computer, not so much because I needed its extra capabilities, but as a platform for weeding out any little buglets in my own code that a 64-bit machine might expose. As it happens, I have not found any. But I have been amazed at the number of languages that I wanted to use in my testing that are simply not able to run on a 64-bit platform, despite the fact that 64-bit systems have been around for years. Not to mention all the other applications that are not yet available for my 64-bit platform. This is really sad.
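
For anybody who hasn’t chased these buglets, here is a minimal illustration (my sketch, not code from any particular offender) of the assumptions that a 64-bit rebuild shakes out: on a typical LP64 Unix system, long and pointers grow to 8 bytes while int stays at 4, so any code that stuffs a pointer into an int, or assumes that long is 32 bits wide, quietly breaks.

    #include <cstdio>

    int main()
    {
        // Typical 32-bit platform: 4, 4, 4.
        // Typical 64-bit (LP64) platform: 4, 8, 8.
        std::printf("int:     %zu bytes\n", sizeof(int));
        std::printf("long:    %zu bytes\n", sizeof(long));
        std::printf("pointer: %zu bytes\n", sizeof(void *));
    }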

And it’s so bad that, in the next few weeks when my operating system comes out with its next release, I’m going to install the 32-bit version on my workstation so that I’ll be able to use all the stuff I want.

I’ve been working all my life at improving the way I do things, and I will keep doing that. I’m happy to talk with people about ways of improving software. And I really think it’s way past time for the software development community to get off its collective butt and start looking hard at injecting quality into software.