People I've worked with or for, or who have read my writing here and elsewhere, have probably figured out that I'm a huge fan of standards just about everywhere they make sense: data formats, user interfaces, and so on. After all, why should we have to relearn how to drive a car simply because we buy a new Ford in place of a Toyota that the Government doesn't want us driving anymore? (You see very few old — or even fully-paid-for — cars in Singapore.) The steering wheel, pedals, and other controls are in a familiar layout; any slight differences are quickly adapted to.
Not so with the Western world's most widely sold word processing software: Microsoft Word 2007 for Windows shipped with a different, unique interface (whether it deserved the label 'innovative' is beside the current point). Bloggers bloviated, many users were deeply confused, and corporate help-desk calls (and support/training costs) spiked. People running Windows PCs were very rarely neutral about the change.
A year later, Microsoft shipped Word 2008 for the Mac. Although there were some interface changes, the points of loudest discussion in the Word:Mac user community seemed to be
- the omission of Visual Basic for Applications as an attempted cross-platform macro language; and
- the new semi-proprietary document format, which allowed flawless interchange with Windows users (VBA notwithstanding).
Interface changes, per se, didn't spark nearly as much angst as had the Windows version of the year before. While a certain amount of this should no doubt be attributed to Microsoft's experience with the earlier release, the main reason was both different and obvious.
When developing Mac applications, it's much easier and better to follow the Apple Human Interface Guidelines than to "roll your own" interface. Developers, including Microsoft, are well aware of the ways in which the Mac development tools make your life easier if you structure and style your app to meet the Guidelines (and user expectations), as opposed to how much scutwork needs to be reinvented from scratch to do things differently. Users benefit even more, as the amount of learning needed to use a new app, or a new version of an existing app, is much less than the average under Windows or Linux. And, unlike far too many Windows programs, Mac programs are usually highly discoverable; the user may not know how to accomplish a desired action, but there is one (and preferably only one) obvious path to follow, and mis-steps are generally not heavily penalised.
Right, "everybody" knows this, so why did I spend five paragraphs restating the reasonably obvious? Because the real intent of this post is to draw your attention to a phenomenon which is a necessary outcome of that standardisation and discoverability: it is much easier to switch from one Mac app that performs a given task to another than it is on Windows. Most Mac users periodically switch between different applications for a given purpose, even keeping two or three installed on their systems. When you ask them why, they don't (generally) point to bugs or deficiencies in one product over another; they merely switch between them as their use cases change. For example, though I have both Microsoft Office and Apple iWork on this Mac, I will often create or open smaller Word documents in a simpler application such as AbiWord instead. It doesn't have all the features of Word or Pages, but it has the basics, and it loads more quickly and uses fewer resources than its two "big brothers."
The average Mac user is also generally more willing to try new applications, and generally owns more of them, than the average Windows user. Since she is confident in her ability to pick up and use a new program, generally without resorting to a manual or even online help, discussion between users and developers is much more open; both have seen a good bit of "the competition" and know what they like and don't like.
More rarely than is the case elsewhere, but not rarely enough, this easy migration from one app to another is due to real or perceived defects in a previously-used program. This happened to me recently; the program I had been using for a few months as my main Twitter client was not showing me all the tweets of people I was following in the "mainline" stream that I would see when I looked at each person's stream individually. Once you start following more than two or three people, the mainline becomes absolutely indispensable; you simply don't want to take the time to look at each stream in isolation. So, I moved to another client, Nambu (now in "private beta" for a new release; version 1.2.1 can be found via web search).
Two immediate observations: I already know how to use this, even though Nambu has a far more dense presentation than my earlier client. And, because of that "dense presentation", it now takes me about a fifth as much time to get through my morning and afternoon Twitter catchups as it did previously. (Multi-column view is the killer feature, guys; there's only one little thing I'd like to see different...)
Again, why make noise about this? Simple: I've been a Windows user (usee?) and developer quite literally as long as there's been a "Windows"; I ran across my 1.0-beta developer kit floppies (5-1/4", of course) a couple of weeks ago (thinking about having them bronzed...or mounted on a dartboard. Maybe both.) But the nasty truth is, I very rarely change applications that perform a given task in Windows. The pain level and the (hopefully temporary) hit on my productivity aren't worth it until the pain becomes absolutely excruciating. I don't have that problem with the Mac, at all. I can try out new applications at will, even daily-workflow applications, secure in the knowledge that
- I already know how to use this, at least well enough to get started, and
- I can go back — or on to another candidate — any time I want to.
There's a word for the feeling you get from that kind of freedom, that control over your computing experience: