
Saturday, 14 July 2012

Getting Your Stuff Done, or Stuff Done To You

This is the response I wanted to leave to "MrChimei" on the spot-on YouTube video, "Steve Jobs Vs. Steve Ballmer". Since YouTube has such a tiny (but understandable) limit on comment size, a proper response would not fit. Therefore...


Let me put it this way. It doesn't matter whether you're speaking out of limited experience, or limited cognition, or what; your flippant attitude will not survive first contact with reality (to paraphrase von Moltke).

I'm a Windows developer who's been developing for Windows since Windows 1.0 was in early developer beta, on up to Windows 8. I had nearly ten years professional development experience on five platforms before I ever touched Windows. I had three stints at Microsoft back when that was cool, and sold most of my stock when it was still worth something.

I've also supported users of Windows and various other operating systems, from groups of 3-5 small businesspeople on up to being comfortably high in the operational support pecking order in a Fortune 100 company. I've seen what helps and doesn't help intelligent non-geeks get their work done.

Both in that position, and in my own current work, I've observed and experienced order-of-magnitude-or-better differences in productivity, usability, reliability, supportability… all in Apple's favour. I've worked with and for people who became statistics junkies out of an emotional imperative to "prove" Windows better, in any way, than other systems. The next such individual I meet who succeeds, out of a sample of over 20 to date, will be the very first.

In 25 years, I have never experienced a Windows desktop machine that stayed up and fully functional for more than approximately 72 hours, *including* at Redmond, prior to a lightly-loaded Windows 7 system.

In the last 6 years of using Macs and clones half-time or better, I have never had a Mac fail to stay up and working for at least a week. In the last five years, my notes show, I've had two occasions where a hard reset to the Mac I'm typing this on was necessary; both turned out to be hardware faults. Prior to Windows 7, any Windows PC that did not need to be hard-rebooted twice in a given fortnight was a rarity. Windows 7 stretched that out to 6 weeks, making it by far the most stable operating system Microsoft have shipped since Windows NT 3.51. (Which I will happily rave about at length to any who remember it.)

For many years, I too was a Windows bigot. The fact that Unix, then OS/2, then Mac OS had numerous benefits not available in Windows was completely beneath my attention threshold. The idea that (on average over a ten-year period) some 30% of my time seated at a Windows PC was devoted to something other than demonstrably useful or interesting activity was something that I, like the millions of others bombarded by Ziff-Davis and other Microsoft propaganda organs, took as the natural order of things.

Then I noticed that Mac users were having more fun. "Fine," I thought, "a toy should bring amusement above all." Then I noticed that they were getting more and better work done. "Well," I said to myself, "they're paying enough extra for it; they should get some return on their investment. I'm doing well enough as is."

And then, within the space of less than a year, all five of my Windows systems were damaged through outside attack. "Why?" I asked. "I've kept my antivirus current. I've installed anti-spyware and a personal firewall in addition to the (consumer-grade) router and firewall connecting me to the Internet. I don't browse pr0n or known-dodgy sites. I apply all security patches as soon as they're released. Why am I going to lose this development contract for lack of usable systems?"

I discovered a nasty little secret: it's technically impossible to fully protect a Windows PC from attacks mounted with tools that a reasonably bright eight-year-old can master in a Saturday afternoon. People responsible for keeping Windows PCs running have known this for over a decade; it's why the more clueful ones talk about risk mitigation rather than prevention, with multi-layered recovery plans in place and tested rather than leaving everything to chance. For as long as DSL and cable Internet connections have been available, it's taken less time to break into a new, "virgin" Windows PC than to fully patch and protect it against all currently-likely threats.

People used to think that using cocaine or smoking tobacco was healthy for you, too.

What I appreciate most about the Mac is that, no matter what, I can sit down in front of one and in a minute or less, be doing useful, interesting work. I don't have the instability of Windows. I don't have the sense that I'm using something that was designed for a completely different environment, as Windows too closely resembles the pre-network (let alone pre-Internet) use of isolated personal computers. Above all, I appreciate the consistency and usability that let me almost forget about the tools I'm using to work with my data, or with data out in the world somewhere, and concentrate instead on what I'm trying to accomplish.

One system treats its users as customers, whose time, efficiency and comfort are important and who know they have choices if they become dissatisfied. The other platform treats its users as inmates, who aren't going to leave no matter what... and if that's true, then quality sensibly takes a back seat to profitability.

Which would you recommend to your best friend? Or even to a respected enemy?

Tuesday, 28 September 2010

IE9 and Standards? Nothing to see here, folks; move along...

This is a re-post of a comment I left on the post "HTML5 Support in Internet Explorer 9" on Louis Lazaris' Impressive Webs blog (which I highly recommend for people working in or interested in Web development). I've slightly edited a few places for clarity. The comments that had been left by people were almost universally complaints about Microsoft in general and about IE's continuing history of imperfect-at-best compliance with either-you-do-or-you-might-as-well-not-try standards. Having worked in and with Microsoft on several occasions, and being active in a number of open-source projects besides, I have a slightly different view, as I state below. Thanks for reading.


Actually, I'm amazed that it's progressed as much as it has.

Let me be clear; I'm not in any way praising IE; the industry as a whole needs to bury it — decisively, irretrievably and imminently. We as development professionals need to have the professional self-respect to tell current and potential clients, "We develop to current Web standards that are supported across various browsers and platforms. To the degree that any particular release of Microsoft Internet Exploder supports the markup, styling and behaviour of any particular site, that's well and good. However, without monumental additional time and budget resources, no attempt will be made to provide IE support when those resources could instead be used to improve the experience for users of modern browsers."

I firmly believe that the fact that IE hasn't progressed as far as, say, Chromium can be laid squarely at the feet of Microsoft's existing code base. Microsoft's developers and managers are fully experienced with the reality that complete green-field rewrites of existing projects almost never succeed. They've got a code base where the list of major Windows modules and subsystems that do not have dependency relationships with IE could be read aloud comfortably in a single breath. That was done initially by choice; now, it doesn't matter how well-meaning the intentions or how competent the team, they have to live with the codebase they have. It's all legacy code. The developers at Microsoft are (with rare exception) not morons, but living in the sewer while they're trying to make something that can stand next to the shiny new browsers next door has to be a psychologically toxic exercise. Their baby is blowing up the Web left and right; they know it, and they know they can't do a damned thing about it without staging a coup d'état, replacing dozens of levels of management and senior executives and fundamentally changing the culture of the organisation. Don't hold your breath waiting for that to happen.

That isn't sour grapes or a diss against the IE developers; it's simple reality. Microsoft do some amazing things — just not so much for (or on) Windows. Unfortunately for all of us, Windows and Office for Windows are the herds of cash cows for Microsoft, and anything that could be seen as even potentially disrupting that would get shot down faster than you can say S-E-C; the investors would never stand for it. And, with the system and the rules the way they are, they'd be perfectly right. Innovation isn't as important to the bean-counters (or the regulators) as "maintaining and enhancing shareholder value," and MSFT have had enough problems with that lately. (Just compare their share values to, say, AAPL over the last ten years.) Doing anything "risky" is simply politically (and legally, really) impossible.

So, no matter how many times the IE team jump up and down and say "look at all our neat new features," without mentioning the standard features left unimplemented because they pulled off numerous miracles just making what they have work, the response has to be "nothing to see here, folks; move along."

And move along we shall.

Tuesday, 23 February 2010

Again, Standards Matter

People who I've worked with, or worked for, or read my writing here and elsewhere, have probably figured out that I'm a huge fan of standards just about everywhere they make sense: data formats, user interfaces, and so on. After all, why should we have to relearn how to drive a car simply because we buy a new Ford in place of a Toyota that the Government doesn't want us driving anymore? (You see very few old — or even fully-paid-for — cars in Singapore.) The steering wheel, pedals, and other controls are in a familiar layout; any slight differences are quickly adapted to.

Not so with the Western world's most widely-sold word processing software (for instance): Microsoft Word 2007 for Windows shipped with a different, unique interface ('innovative' or not is beside the current point). Bloggers bloviated, many users were deeply confused, and corporate help-desk calls (and support/training costs) spiked. People running Windows PCs were very rarely neutral about the change.

A year later, Microsoft shipped Word 2008 for the Mac. Although there were some interface changes, the points of loudest discussion in the Word:Mac user community seemed to be

  • the omission of Visual Basic for Applications as an attempted cross-platform macro language; and
  • the new semi-proprietary document format, which allowed flawless interchange with Windows users (VBA notwithstanding).

Interface changes, per se, didn't spark nearly as much angst as had the Windows version of the year before. While a certain amount of this should no doubt be attributed to Microsoft's experience with the earlier release, the main reason was both different and obvious.

When developing Mac applications, it's much easier and better to follow the Apple Human Interface Guidelines than to "roll your own" interface. Developers, including Microsoft, are well aware of the ways in which the Mac development tools make your life easier if you structure and style your app to meet the Guidelines (and user expectations), as opposed to how much scutwork needs to be reinvented from scratch to do things differently. Users benefit even more, as the amount of learning needed to use a new app, or a new version of an existing app, is much less than the average under Windows or Linux. And, unlike far too many Windows programs, Mac programs are usually highly discoverable; the user may not know how to accomplish a desired action, but there is one (and preferably only one) obvious path to follow, and mis-steps are generally not heavily penalised.

Right, "everybody" knows this, so why did I spend five paragraphs restating the reasonably obvious? Because the real intent of this post is to draw your attention to a phenomenon which is a necessary outcome of that standardisation and discovery: it is much easier to switch from one Mac app that performs a given task to another than it is on Windows. Most Mac users periodically switch between different applications for a given purpose, even keeping two or three installed on their systems. When you ask them why, they don't (generally) point to bugs or deficiencies in one product over another; they merely switch between them as their use cases change. For example, though I have both Microsoft Office and Apple iWork on this Mac, I will often create or open smaller Word documents in a simpler application such as AbiWord instead. It doesn't have all the features of Word or Pages, but it has the basics, and it loads more quickly and uses fewer resources than its two "big brothers."

The average Mac user is also generally more willing to try new applications, and generally owns more applications, than is the average for Windows. Since she is confident in her ability to pick up and use a new program, generally without resorting to a manual or even online help, there is a much more open discussion between users and developers, since both have seen a good bit of "the competition" and know what they like and don't like.

More rarely than is the case elsewhere, but not rarely enough, this easy migration from one app to another is due to real or perceived defects in a previously-used program. This happened to me recently; the program I had been using for a few months as my main Twitter client was not showing me all the tweets of people I was following in the "mainline" stream that I would see when I looked at each person's stream individually. Once you start following more than about two or three people, the mainline becomes absolutely indispensable; you simply don't want to have to take the time to look at each stream in isolation. So, I moved to another client, Nambu (now in "private beta" for a new release; version 1.2.1 can be found via web search).

Two immediate observations: I already know how to use this, even though Nambu has a far more dense presentation than my earlier client. And, because of that "dense presentation", it now takes me about a fifth as much time to get through my morning and afternoon Twitter catchups as it did previously. (Multi-column view is the killer feature, guys; there's only one little thing I'd like to see different...)

Again, why make noise about this? Simple: I've been a Windows user (usee?) and developer quite literally as long as there's been a "Windows"; I ran across my 1.0-beta developer kit floppies (5-1/4", of course) a couple of weeks ago (thinking about having them bronzed...or mounted on a dartboard. Maybe both.) But the nasty truth is, I very rarely change applications that perform a given task in Windows. The pain level and the (hopefully temporary) hit on my productivity aren't worth it until the pain becomes absolutely excruciating. I don't have that problem with the Mac, at all. I can try out new applications at will, even daily-workflow applications, secure in the knowledge that

  • I already know how to use this, at least well enough to get started, and
  • I can go back — or on to another candidate — any time I want to.

There's a word for the feeling that having that kind of freedom, that control over your computing experience gives you:

Empowerment.

Monday, 21 December 2009

When Standing on the Shoulders of Giants, Don't Trip

I've been doing a lot of software installation lately, on Mac OS X and various BSDs and Linuxes. Doing so reminds me of one of the major banes of my life when developing for Windows. If you're a Windows usee, you're acutely familiar with the concept.

DLL Hell.

On Windows, as you install a dozen apps, each of them will try to install their own versions of many system-wide (or vendor-wide) libraries. The poster child for these is msvcrt.dll, the Microsoft C Runtime Library, upon which virtually everything in or running under Windows depends in some fashion. Many unhappy man-millennia have been spent by admins and PC owners the world over trying to resolve compatibility issues, usually introduced when a newly-installed app overwrites a more recent version of a library with the earlier version that app shipped with. Other things — including other libraries the app may depend on or even Windows itself — may break, since features and bug fixes they rely on (present in later versions of the library) are suddenly gone.

Why think about this, when I'm as likely to write (or even use) any Windows software in the next month as I am to win the local lottery without buying a ticket?

Because one of the things I added to my new MacBook Pro was the MacPorts software, which is a Mac analogue to the BSD ports collection. The Linux equivalent to this is (more or less) your package manager, whether it be aptitude or yum or portage or whatever. Windows has no equivalent concept; every app you install is its own universe, or thinks it is (with the "benefits" noted earlier), and there is no central, one-stop "update everything" capability.

Modern operating systems may not have DLL hell, but they do tend to have numerous different versions of different libraries and support frameworks installed. Since those can be accessed by apps as either "give me the most recent version" or "give me version x.y.z", no equivalent to DLL hell takes place. But it does tend to eat up disk space. And it makes it harder to mentally keep track of what's installed; installing the MacPorts edition of Bluefish, an HTML editor, installs some eighty other "ports" — direct or indirect dependencies of Bluefish (which itself is primarily a Linux app). Some of these dependencies are last-month recent updates; a few are several years old. But MacPorts determines which packages are needed by Bluefish (and its dependencies, and...) and installs the latest version (by default) of those dependencies, or the specific version asked for. Thus, several files with the same basic name, but different version numbers encoded into the filename, can coexist peacefully in virtually any modern OS.
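
To make the dependency bookkeeping concrete, here's a toy Ruby sketch of the kind of problem a ports system has to solve: given a declared dependency graph, compute an install order in which every dependency is built before the port that needs it. The port names and dependency lists below are made up for illustration (they are not Bluefish's actual Portfile), and MacPorts' real resolver is of course far more sophisticated than a bare topological sort.

    # Toy dependency resolver: a topological sort over a declared dependency
    # graph, so every port lands after the ports it depends on.
    # The graph is illustrative only, NOT the real Bluefish dependency list.
    require 'tsort'

    class PortGraph
      include TSort

      def initialize(deps)
        @deps = deps
      end

      def tsort_each_node(&block)
        @deps.each_key(&block)
      end

      def tsort_each_child(port, &block)
        @deps.fetch(port, []).each(&block)
      end
    end

    deps = {
      'bluefish' => %w[gtk2 libxml2 enchant],
      'gtk2'     => %w[glib2 pango cairo],
      'pango'    => %w[glib2 cairo],
      'cairo'    => %w[libpng],
      'glib2'    => [], 'libxml2' => [], 'enchant' => [], 'libpng' => []
    }

    # Dependencies come out before the ports that need them.
    puts PortGraph.new(deps).tsort.join(' -> ')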

Recent Microsoft system software, particularly the "managed" software using the .NET Framework, avoids most of the cavalier instances of DLL hell, but introduces other quirks and brittleness in compensation.

But I'm still a bit unnerved by a software package — any package — that has 80+ designated dependencies. I'm certainly thankful that I don't have to manage that dependency matrix myself.

If modern software development is "standing on the shoulders of giants", to appropriate a Newton paraphrase, then careless dependency introduction would be the functional equivalent of drunken tap-dancing: almost impossible not to trip.

Tuesday, 4 August 2009

The Debate between Adequacy and Excellence

I was clicking through my various feeds hooked into NetNewsWire, in this case The Apple Core column on ZDNet, when I came across this item, where the writer nicely summed up the perfectly understandable strategy Microsoft have always chosen and compared it with Apple and the Mac. Go read the original article (on Better Living without MS Office) and then read the comment.

As I've commented on numerous times in this blog and elsewhere (notably here), I'm both very partial to open standards (meaning open data formats, but usually expressed in open source implementations) and to the Apple Mac. As I've said before, and as the experience of many, many users I've supported on all three platforms bears out, the Mac lets you get more done, with less effort and irritation along the way, than either Windows or Linux as both are presently constructed.

But the first two paragraphs of this guy's comment (and I'm sorry that the antispam measures on ZDNet apparently don't permit me to credit the author properly) made me sit up and take notice, because they are a great summation of how I currently feel about the competing systems:

The Macs vs. PC debate has been going on for about 25 years or so, but the underlying debate is much older. What we are really discussing is the difference between adequacy and excellence. While I doubt I would want to be friends with Frank Lloyd Wright or Steve Jobs, both represent the exciting belief in what is possible. While Bill Gates and Steve Ballmer rake in billions, their relative impact on the world of ideas is miniscule.

Bill Gates understands that business managers on the whole are a practical, albeit uninspired and short-sighted bunch. By positioning Microsoft early on to ride into the enterprise with the implicit endorsement of one of the biggest, longest-lived, and influential suppliers of business equipment, Gates was able to secure Microsoft's future. Microsoft's goal has never seemed to me to be to change the world, only to provide a service that adequately meets business needs. Microsoft has also shown from early on a keen awareness that once you get people to use your product, your primary goal is not to innovate to keep your customers, but, rather to make leaving seem painful and even scary. Many companies do this, but Microsoft has refined this practice into an art.

He then expands on this theme for four more paragraphs, closing with

Practically speaking Microsoft is here to stay. But I am glad that Apple is still around to keep the computer from becoming dreary, to inspire people to take creative risks, to express themselves, and to embrace the idea that every day objects, even appliances like the computer, can be more than just the sum of their functions.

Aux barricades! it may or may not be, depending on your existing preferences and prejudices. But it does nicely sum up, more effectively and efficiently than I have been able to of late, the reasons why Apple is important as a force in the technology business. Not that Microsoft is under imminent threat of losing their lifeblood to Apple; their different ways of looking at the world and at the marketplace work against that more effectively than any regulator could. But the idea that excellence is and should be a goal in and of itself, that humanity has a moral obligation to "continually [reach] well past our grasp", should stir passion in anyone with a functioning imagination. Sure, Microsoft have a commanding lead in businesses, especially larger ones — though Apple's value proposition has become much better there in the last ten years or so; it's hard to fight the installed base, especially with an entrenched herd mentality among managers. But, we would argue, that does not mean that Apple have failed, any more than the small number of buildings designed by Frank Lloyd Wright and his direct professional disciples argues for his irrelevance in architecture. If nobody pushes the envelope, if nobody makes a habit of reaching beyond his grasp, how will the human condition ever improve? For as Shaw wrote,

The reasonable man adapts himself to the world. The unreasonable man persists in trying to adapt the world to himself. All progress, therefore, depends upon the unreasonable man.

And that has been one of my favourite quotes for many years now.

Tuesday, 14 July 2009

Expanding the Omniverse

Anybody who's worked with me in about the last 25 years knows that I've been preaching the idea that software craftfolk should never stop learning. Further, I've always believed that learning new languages or tools is one of the easiest ways to accomplish this, keeping the mind supple and open to new ways of doing things. And by and large, I've kept this up, learning enough of a new or long-neglected language to at least be able to read and patch code every two or three months. (You do the math.)

One of the languages that I learned rather early on in its lifecycle is Ruby. (Wikipedia has a rather good article summarizing the history, if you're unfamiliar with it.) Early releases (1.0 through 1.4 or thereabouts) were interesting — they showed the power and promise of the concepts that Ruby is built on, without the bloated inscrutability of, say, Perl. As so often has happened, I'd learned just enough to be dangerous, and then got sucked back into the charnel house that is Microsoft Windows development. By the time I had the time and inclination to start messing with Ruby again, an unfortunate thing happened.

Ruby on Rails.

Not that Rails isn't a great tool for building 37signals-type Websites; it clearly is. But it became a victim of its own hype and started being used for everything imaginable — famously including Twitter (a hype explosion in its own right.) Rails was enough to push me — and apparently a good number of other folks — away from Ruby and onto other languages, notably Python. And so I spent the bulk of the next couple of years in PHP, Python, C++, Objective-C, D (another bit of unsung genius), and managed to keep busy.

We developers have a well-known cliché for what drives us to do new things or participate in development projects; "scratching an itch". To scratch an itch in this context is to solve a problem that we ourselves are facing, or to do something that otherwise interests us. What got me motivated to get back up to speed in Ruby wasn't the Rails hype, or even seeing all the nice APIs that Twitter and Repertoire had made available. My problem was a bit simpler and more immediate.

Apple's Mail app started crashing under the load I was giving it. For about a year, I'd had a mail store that averaged about 2 GB and I was getting on the order of 600 to 800 emails a day. I don't mean to be critical of the app; it just wasn't designed to do what I was asking it to, certainly not when sharing 2 GB of RAM with the rest of the system. By the time I migrated away from Mail, I had over 400 filtering rules defined, to slice and dice incoming emails into appropriate folders where I could deal with them as I chose.

In late May, 2009, I up and migrated my email from Apple Mail to Microsoft Entourage. For those of you whose only exposure to Microsoft email software has been Exchange or Lookout! ("Outlook") Express, you're in for a pleasant shock. Entourage runs quite happily in the system as it is (even if I can't run certain other apps at the same time without upgrading RAM), and doesn't give me the maddening ten-minute freezes that were common with Apple Mail as it tried to figure out what to do next. Importing my existing mail store was a breeze. The filtering rules even came along for the ride, and Microsoft's rules editor is a real treat. There was, however, one small problem.

Somehow, the ordering of rules had become scrambled during the import, and after a few weeks of hand-editing to fix the biggest problems, I started looking for a program that would let me import, export, reorder and bulk-edit Entourage's rules. So far, I haven't been able to find one. (If anybody knows of anything useful, please add a comment or email me.) OK, I thought, no problem; nearly everything on the Mac is scriptable. I should just need to learn how to access Entourage's rule set from AppleScript or something similar, then I can write the tool I want. Not trivial, but certainly something that seemed conceptually quite practical.

So I started learning AppleScript, and casting about for tools and sample code that talked to Entourage from AppleScript. While searching, I ran across Matt Neuberg's site. Dr. Neuberg wrote the definitive guide on AppleScript — and then found something that worked better for him: rb-appscript. He's written an online book about it (eventually to be published on dead trees). At this point, I said "ok, let's get started with Ruby again and see what we can do."
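
As a first taste of what I'm after, here's a minimal rb-appscript sketch that lists Entourage's rules in their current order. I'm assuming (without having verified it against the dictionary) that Entourage exposes a rules collection with name and enabled properties; check the actual scripting dictionary before trusting any of this.

    # Minimal sketch using rb-appscript (gem install rb-appscript): dump
    # Entourage's mail rules in order. Assumes, unverified, that the
    # scripting dictionary exposes `rules` with `name` and `enabled`.
    require 'appscript'

    entourage = Appscript.app('Microsoft Entourage')

    entourage.rules.get.each_with_index do |rule, index|
      name    = rule.name.get
      enabled = (rule.enabled.get rescue 'unknown')
      puts format('%3d  %-40s  enabled: %s', index + 1, name, enabled)
    end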

If anybody has any pointers or suggestions, please comment. Thanks for reading.

Wednesday, 8 July 2009

The Best Tool for the Job

One of the nice things about growing up around (almost exclusively) men who were master mechanics, carpenters or other such highly skilled tradesmen was that I developed an appreciation both for "the best tool for the job at hand" and for "making do with what's available" — and, whichever of these applied, for accomplishing the task at hand to the best of one's ability.

As I've progressed through my software and Web career, I've become highly opinionated about the tools I use, just like any other experienced software craftsperson I've ever known. You and I might use different tools to accomplish what functionally is the same task, but so long as we each have practical, experiential bases for those preferences, we should just go ahead and get what needs doing done. (There's an argument in there for open standards as a requisite for that to happen, but that's another post.)

Too many people who should know better have religious-level devotion to or hostility towards certain companies and/or products. Yes, that includes me; I know I've said some pretty inflammatory things, usually when I felt someone was expressing a religious belief masked as a technical opinion. No doubt they've felt the same about me and any others who were incautious enough to oppose their evangelism (or reactionism, depending on the circumstances). In general, it should be pretty evident to everyone with a personal or professional involvement in IT or personal electronics that trends are driven as much by "what I say three times is true!" as what actually can be shown to be true. That's how mediocre-at-best products become "industry Leaders"; inertia and close-mindedness set in, reinforced by a well-funded, continuous and strident marketing/branding campaign.

I was having a discussion about this online recently, with a former associate who's long had me pegged as an ABMer ("Anything but Microsoft"). I can understand how he formed that opinion; I've long complained about the (innumerable) defects in the "market-leading" operating system, and about how slowly progress has been made in cleaning up the most egregious faults (such as security). But I've also worked at Microsoft in Redmond — three different times — and I've always been impressed by the number of truly gifted people working there. They've had their triumphs and tragedies (anyone used Microsoft Bob lately?). They've had to deal with widely differing process and management effectiveness as they transfer between or liaise with different groups. They've ignored a lot of what has been done outside the company, but they've also created some amazing things inside; too many of which unfortunately never make it into public products.

And the quality of their work product varies as much as any of the factors that go into it. Cases in point: compare, say, Windows Vista with Windows Mobile or the XBox; compare Microsoft Outlook (forever known as "Lookout!" to security/admin people) with Entourage; compare Word for Windows to Word for the Mac — what I understand is a completely different code base (and visibly so) that "just happens" to be able to flawlessly read and write documents shared with Word for Windows.

I also reread a blog post I wrote last December where I detailed the issues I was starting to have with Apple's own Mail app for the Mac. I have a mail store that's hovered somewhere above 2 GB for the last year. I receive 100-200 legitimate emails per day (and up to 700 spams). I presently have over 230 filtering rules defined for how to handle all that mail. Those rules have been built up over the last five years or so — first using Mozilla Thunderbird, then Apple's Mail.app, and now a new system; a progression that also speaks eloquently about the value of open standards. I have never, to my knowledge, lost a saved message whilst transferring from one package to its successor. The few hiccups each transition has had with filtering rules have all been relatively easy to find and fix, with the newest app making that process breathtakingly simple.

The new mail app? As you've no doubt guessed, Microsoft Entourage. It, like every other Mac app I've ever used, Just Works as expected (at least until you get out to the far, bleeding edges). If Microsoft made Windows and Office for Windows as well as they make Entourage (and the rest of their Office:Mac products), they really wouldn't have to worry about competition — and they'd richly deserve that. The market-friendly price for their Mac product (where their major, worthy competitor sells for US$79) is just icing on the cake.

I don't hate Microsoft. I just wish they would stick to what they do as well as or better than anyone else, and drop the crappy products that can never be anything but hypersonic train wrecks — like Windows and Internet Exploder. I wish that ever more fervently every time I'm asked to help some hapless Windows usee fix "why my computer doesn't work". That would also make Microsoft's long-suffering stockholders — including current employees, former employees and myself, among others — feel a lot better.

Wednesday, 27 May 2009

News Flash: Microsoft Reinvents Eiffel, 18 Years On

One of the major influences on the middle third of my career thus far was Bertrand Meyer's Eiffel programming language and its concept of design by contract. With such tools, for the first time (at least as far as I was aware), entire classes of software defects could be reliably detected at run time (dynamic checking) and/or at compile time (static checking). I worked on a couple of significant project teams in the mid- to late '90s that used Eiffel quite successfully. Further, it impacted my working style in other languages; for several years, I had a reputation on C and C++ projects for putting in far more assert statements than my colleagues considered usual. More importantly, it made me start thinking in a different way about how to create working code. Later, as I became aware of automated testing, continuous integration and what is now called agile development, they were all logical extensions of the principles I had already adopted.
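
For readers who never met Eiffel, here's a rough sketch of the design-by-contract idea, rendered in Ruby rather than Eiffel purely for brevity: the caller owes the precondition, the routine owes the postcondition, and a violation of either fails loudly at run time. Eiffel expresses this with first-class require/ensure clauses and class invariants, and can check some contracts statically; this sketch shows only the run-time flavour.

    # Design by contract, roughly, in Ruby (illustrative only; Eiffel has
    # built-in require/ensure clauses and class invariants).
    class Account
      attr_reader :balance

      def initialize(balance = 0)
        @balance = balance
      end

      def withdraw(amount)
        # Precondition: the caller's obligation.
        raise ArgumentError, 'amount must be positive' unless amount > 0
        raise ArgumentError, 'insufficient funds'      unless amount <= @balance

        old_balance = @balance
        @balance   -= amount

        # Postcondition: the routine's obligation to its caller.
        raise 'postcondition violated' unless @balance == old_balance - amount
        @balance
      end
    end

    account = Account.new(100)
    account.withdraw(30)      # => 70
    # account.withdraw(500)   # raises ArgumentError: insufficient funds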

This all happened over a period of 15 or so years, in a field where anyone with more than 2 or 3 years' experience is considered "senior". But for me, and most other serious practitioners who I knew and worked with, two to three years was really just about as long as it took to answer more questions than we raised. That, in most crafts, is considered one of the signs of becoming a journeyman rather than a wet-behind-the-ears apprentice.

Then, a few hours ago, I was reading a blog entry by one David R. Heffelfinger which mentioned a project at Microsoft DevLabs called "SmallBasic". Another project that the same organization developed is called "Code Contracts"; there's a nice little set of tools (which will be built into the upcoming Visual Studio 2010 product), and a nice introductory video. Watch the video (you'll need Silverlight to view it), and then do some research on Eiffel and design-by-contract and so on, and it's very difficult not to see the similarities.

So, on the one hand, I'm glad that .NET developers will finally be getting support for 20-year-old concepts (by the time significant numbers of developers use VS 2010 and .NET 4.0). Anything that helps improve the developer and user experiences on Windows (or, in fact, any platform) is by definition a Good Thing™.

On the other hand, I see more evidence of Microsoft's historical Not Invented Here mentality; beating the drum for "new and wonderful ideas for Windows development" that developers on other platforms have been using effectively for some time. While the Code Contracts project indirectly credits Eiffel — the FAQ page links to Spec# at Microsoft Research, which lists Eiffel as one of its influences — it would have been nice to see precursor techniques acknowledged and explained more explicitly. Failure to do so merely reinforces the wisdom of Santayana as applied to software: "Those who cannot remember the past are condemned to repeat it", as well as "Fanaticism consists in redoubling your efforts when you have forgotten your aim." This last is something that we who wish to improve our craft would do well to remember.

What do you all think?

Tuesday, 16 December 2008

Happy Updating....

If you're a Windows usee with a few years' experience, you've encountered the rare, monumental and monolithic Service Packs that Microsoft release on an intermittent basis (as one writer put it, "once every blue moon that falls on a Patch Tuesday"). They're almost always rollups of a large number of security patches, with more added besides. Rarely, with the notable (and very welcome at the time) exception of Windows XP Service Pack 2, is significant user-visible functionality added. Now that SP3 has been out for seven months or so, it's interesting to see how many individuals and businesses (especially SMEs) haven't updated to it yet. While I understand, from direct personal experience, the uncertainty of "do I trust this not to break anything major?" (that is, "anything I use and care about?"), I have always advised installing major updates (and all security updates) as quickly as practical. Given the fact that there will always be more gaping insecurities in Windows, closing all the barn doors that you can just seems the most prudent course of action.

I got to thinking about this a few minutes ago, while working merrily away on my iMac. Software Update, the Mac equivalent of Windows' Microsoft Update, popped up, notifying me that it had downloaded the update for Mac OS X 10.5.6, and did I want to install it now? I agreed, typed my password when requested (to accept that a potentially system-altering event was about to take place, and approve the action), and three minutes later, I was logged in and working again.

Why is this blogworthy? Let's go back and look at the comparison again. In effect, this was Service Pack 6 for Mac OS X 10.5. Bear in mind that 10.5.5 was released precisely three months before the latest update, and 10.5.0 was released on 26 October 2007, just under 14 months ago. "Switchers" from Windows to Mac quickly become accustomed to a more pro-active yet gentle and predictable update schedule than their Windows counterparts. The vast majority of Mac users whom I've spoken with share my experience of never having had an update visibly break a previously working system. This cannot be said for Redmond's consumers; witness the flurry of application and driver updates that directly follow Windows service packs. XP SP2, as necessary and useful as it was, broke more systems than I or several colleagues can remember any single service pack doing previously... by changing behavior that those applications and drivers had taken advantage of or worked around. Again, the typical Mac customer doesn't have that kind of experience. Things that work just tend to stay working.

Contrast this with Linux systems, where almost every day seems to bring updates to one group of packages or another, and distributions vary wildly in the amount of attention paid to integrating the disparate packages, or at least ensuring that they don't step on each other. Some recent releases have greatly improved things, but that's another blog entry. Linux has historically assumed that there is reasonably competent management of an installed system, and offers resources sufficient for almost anyone to become so. Again, recent releases make this much easier.

Windows, on the other hand, essentially requires a knowledgeable, properly-equipped and -staffed support team to keep the system working with a minimum of trouble; the great marketing triumph of Microsoft has been to convince consumers that "arcane" knowledge is unnecessary while simultaneously encouraging the "I'm too dumb to know anything about computers" mentality — from people who still pony up for the next hit on the crack pipe. Show me another consumer product that disrespects its paying customers to that degree without going belly-up faster than you can say "customer service". It's a regular software Stockholm syndrome.

The truth will set you free, an old saying tells us. Free Software proponents (contrast with open source software) like to talk about "free as in speech" and "free as in beer". Personally, after over ten years of Linux and twenty of Windows, I'm much more attracted by a different freedom: the freedom to use the computer as a tool to do interesting things and/or have interesting experiences, without having to worry overmuch about any runes and incantations needed to keep it that way.

Friday, 15 August 2008

(C/C++ != C) && (C/C++ != C++)

A thought which ran through my mind as I was browsing some job requirements recently... Why are recruiters still hung up on "C/C++", years after even Microsoft got around to shipping a reasonably compliant compiler (depending on your prejudices and code needs, anywhere from Visual Studio 6 in 1998 to VS.NET 2003)?

"C/C++" started life (or zombiehood) as a Microsoft marketing term in the early 1990s with the release of Version 7 of their C compiler, which included "some C++ features". MSC 7 wasn't a "real" C++ compiler, but companies such as Borland (now CodeGear), Watcom (now part of Sybase), IBM and others were shipping compilers that implemented the bulk of the (then-)Draft Standard in a (largely) portable, consistent fashion, so Microsoft was able to muddy the waters by calling their product "C/C++", secure in the knowledge that many of their customers had too little C++ experience to see through the marketing.

Incidentally, this (non-Microsoft) competitive innovation spurred numerous advances, such as Alexander Stepanov's (of AT&T, later at HP) Standard Template Library (STL). Microsoft, in response, introduced a "Container Class Library" which was in practice quite inferior, since it required contained objects to be derived from the Microsoft Foundation Class library's CObject class and (if memory serves) did not support either multiple inheritance or thread safety. Since Microsoft's compilers at the time did not properly support important Standard C++ features such as templates and runtime type information (RTTI) that were needed for the STL, the compiler defects created market opportunities for companies like Rogue Wave and Dinkumware to create products with similar but not identical function.

This was also the period when Microsoft really started to push developer lock-in: the practice of introducing non-standard and/or proprietary "features" which were made central to the development process. Despite the existence of numerous superior (in design, function and productivity) class libraries such as Borland's ObjectWindows Library, Inmark's zApp library, the previously-mentioned Rogue Wave toolkits, and others, Microsoft's MFC carved out huge market share and mindshare, largely because:
  • it was bundled with the Microsoft C ("C/C++") compiler;
  • its limitations and defects mapped most closely to those of the underlying compiler;
  • it came with a primitive but usable GUI builder, for "click-and-drool" development; and
  • it was relentlessly praised by the Microsoft-beholden tech press of the day.
That last point should never be underestimated; publishers of less-than-laudatory articles, such as C Users Journal and Will Zachmann (when he was writing for PC Magazine), would find themselves cut off from Microsoft's press briefings, rumor mill and other means of "keeping up with the competition". This was meant as punitive, to "hurt" the "offenders"... who promptly wrote up the entire sordid affair, built a certain amount of loyal sympathy from the industry grass-roots, and survived quite well, thank you very much.

Getting back to "C/C++"... the term was a marketing fix for a technical problem, and it rapidly gained "mindshare" with its intended audience: marginally to non-technical people (senior managers, HR people, etc.) who wanted or needed to sound technically knowledgeable. Microsoft was able to play on their lack of real language knowledge, coupled with follow-the-herd instincts, to help force adoption in enterprises from the top down. While this helped to increase sales, and helped preserve Windows' market share and lock-in in the enterprise for nearly two decades, it seriously retarded the take-up of standard, portable C++ in the industry (as intended). It also gave companies like ParcPlace (with Smalltalk) and NeXT, later Apple (with Objective-C), incentives to use "alternative" languages, either to gain some "control over their own destiny" independent of a competitor, or simply because C++ at the time was not up to the tasks which they wanted to accomplish.

In any event, by around 2000 (plus or minus a half-decade), Microsoft had caught up with where the rest of the industry had been for a decade or so (bringing serious, proprietary backward-compatibility baggage along with them). The marketing need for the "C/C++" Newspeak was gone, but the corporate world that had learned the newfangled technical language back in the day was still in place, bound only by the Peter Principle (whose bar, thanks to the new technology throughout the enterprise, had been set depressingly high). Consequently, you still run across job ads with text like this (from the Singapore Straits Times of 13 August 2008):

C/C++ EMBEDDED SOFTWARE Engr. Contract. Call 6xxx7085

Truly informative about the needs; at first blush, seemingly written by a completely non-technical HR person. (I didn't follow up the advertisement to actually verify this, however). What's the point of this whole rambling rant? To try to impress upon you, my half-dozen Loyal Readers, a technical truism that has been around as long as there have been technical gadgets: "80% of what you know will be obsolete in n months; the other 20% will never be obsolete. Using that 80% beyond its shelf life just makes you look silly." Or, if not 'silly', then at least 'locked in to an out-of-date technology or idea.' And that, with very high likelihood, does not deliver a competitive advantage to your organization.

Thursday, 10 July 2008

Standard Standards Rant, Redux: Why the World-Wide Web Isn't "World-Wide" Any More

The "World Wide Web", to the degree that it was ever truly universal, has broken down dramatically over the last couple of years, and it's our mission as Web development professionals to stand up to the idiots that think that's a Good Thing. If they're inside our organization, either as managers or as non-(Web-)technical people, we should patiently explain why semantic markup, clean design, accessibility and (supporting all of the above) standards compliance are Good for Business. (As the mantra says, "Google is your most important blind customer," because your prospective customers who know what they're looking for but don't yet know who they're buying it from find you that way.) Modern design patterns also encourage more efficient use of bandwidth (that you're probably paying for), since there's less non-visible, non-semantic data in a properly designed nest of divs than in an equivalent TABLE structure. Modern design also encourages consistent design among related pages (one set of stylesheets for your entire site, one for your online product-brochure pages, and so on). Pages that look like they're related and are actually related reassure the user that he hasn't gotten lost in the bowels of your site (or strayed off into your competitor's). It's easier to make and test changes that affect a specified area within your site (and don't affect others). It's easier to add usability improvements, such as letting users control text size), when you've separated content (XHTML) from presentation (CSS and, in a pinch, JavaScript). Easier-to-use Web sites make happier users, who visit your site more often and for longer periods, and buy more of your stuff.

Experienced Web developers know all this, especially if they've been keeping up with the better design sites and blogs such as A List Apart. But marketing folks, (real) engineers and sales people don't, usually, and can't really be expected to -- any more than a typical Web guy knows about internal rate of return or plastic injection molding in manufacturing. But you should be able to have intelligent conversations with them, and show them why 1997 Web design isn't usually such a good idea any more. (For a quick Google-eye demo, try lynx).  Management, on the other hand, in the absence of PHBs and management by magazine, should at least be open to an elevator pitch. Make it a good one; use business value (that you can defend as needed after the pitch).

That's all fine, for dealing with entrenched obsolescence within your own organization. What about chauvinism outside — from sites you depend on professionally, socially or in some combination? For years, marginalized customers have quietly gone elsewhere, with at most a plaintive appeal to the offenders, pointing out that a good chunk of Windows usees don't browse with Internet Explorer anymore (check out the linked article, from a major business-tech Website in 2004(!!); the arguments are much stronger now). But some companies, particularly Microsoft-sensitive media sites like CNet and its subsidiary ZDNet, still don't work right when viewed with major non-Windows browsers (even when the same browser, such as Opera or Safari, works just fine with that site from Windows). And then there are the sites whose Web presence is the entire company, but which haven't yet invested the resources in the competent design needed to move their site construction beyond a point-and-drool interface virtually incapable of producing standards-compliant work, and which instead present a site that a) actively checks for IE and snarls at you if you're using anything else, and b) is so badly broken and inaccessible that people stay away in droves. (Yes, I'm looking at you — every click opens a new window).

When we encounter Web poison like this, we should take the following actions:

  • Notify the site owner that we will use a better (compatible, accessible, etc.) site, with sufficient details that your problem can be reproduced (flamemail that just says "Teh site sux0rs, d00d!" is virtually guaranteed to be counterproductive);
  • When you find an acceptable substitute, let that site's owners know how they earned your patronage. Send a brief thank-you note to one or two of their large advertisers (if any), as well as to the advertisers on the site you've left (if you know any). Politely thank them for supporting good Web sites, or remind them why their advertising won't be reaching you anymore (as appropriate);
  • Finally, there really ought to be a site (if there isn't already) where people can leave categorized works/doesn't-work-for-me notes about sites they've visited. This sounds an awful lot like the original argument for Yahoo!; I can see where such a review site would either die of starvation or grow to consume massive resources. But praise and shame are powerful inducements in the offline world; it's long past time to wield them effectively online.
I'm sure that there are literally millions of sites with Web poison out there, and likely several "beware" sites as well. For the record, the two that wasted enough of my week this week to deserve special dishonor are ZDNet and JobStreet. Guys, even Microsoft doesn't lock people out and lock browsers up the way you do; I can browse MSDN and Hotmail just fine on my Mac, on an old PC with Linux, or on an Asus Eee. And if you need help, I and several thousand others like me are just an email away. :-)

Tuesday, 24 June 2008

Browser Support: Why "Internet Explorer 6" Really Is A Typo

(Experienced Web developers know that the correct name for the program is Microsoft Internet Exploder — especially for version 6.)

Case in point: I was browsing the daringfireball.net RSS feed and came across an article on the 37signals blog talking about Apple's new MobileMe service dropping support for IE6. The blog is mostly geared towards 37signals' current and potential clients who, if not Web developers themselves, are at least familiar with the major technical issues involved. Not surprisingly, virtually every one of the 65 comments left between 9 and 13 June was enthusiastic in support of the move; not because the commenters necessarily favor Apple (though, clearly, many do), but because anybody who's ever cared about Web standards knows that IE6 is an antediluvian, defiantly defective middle finger thrust violently up the nostril of the Web development community; the technological equivalent of the Chevrolet Corvair: unsafe at any speed.

The degree to which this is true, and to which this truth continues to plague the Web developer and user communities, was brought into sharp focus by three of the comments on the post. The first, from 37signals' Jason Fried, estimates that 38% of their total traffic is IE, of which 31% is IE 6.0 (giving a grand total of 11.8% of total traffic — not huge, but significant). The second is from Josh Nichols, who points out that Microsoft published a patch to solve the problem with IE6 in January 2007; he notes, however, that an unknowable number of users may not have applied that patch. Finally, Michael Geary points out that later versions of Internet Explorer (version 7 and possibly the now-in-beta Version 8) also have the problem of not being able to "set cookies for a 2×2 domain, i.e. a two-letter second level domain in a two-letter top level domain with no further subdomain below that" (including his own mg.to blog domain). The fact that relatively few domains fall into that category can be argued to be part of the problem; users running IE, particularly an out-of-date version of IE, are likely to be less experienced, and more likely to blame the failure on "something wrong with the Internet" than to recognize and solve the problem correctly. For those people and companies who've paid for those perfectly legitimate domains, the negligence and/or incompetence of the browser supplier and/or user mean that they're not getting their money's worth. And ICANN, the bureaucracy "managing" the domain-name system, is now "fast-tracking" a proposal to increase the number of top-level domain names (TLDs) in use. (In time-honored ICANN custom, the press release is dated 22 June 2008 and "welcome[s]" "Public Comments" "by 23 June 2008." Nothing like transparency and responsiveness in governance, eh?)
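
Geary's "2×2 domain" condition is easy to state precisely, so here's a small Ruby sketch of the shape he describes. The two_by_two_domain? helper is mine, purely for illustration; it only checks the shape of the hostname and deliberately ignores the separate question of which such domains are shared public suffixes.

    # Hypothetical helper: does a hostname have the "2x2" shape described
    # above, i.e. a two-letter second-level label directly under a
    # two-letter TLD, with no further subdomain?
    def two_by_two_domain?(host)
      labels = host.to_s.downcase.split('.')
      labels.length == 2 && labels.all? { |label| label.match?(/\A[a-z]{2}\z/) }
    end

    %w[mg.to www.mg.to example.com co.uk].each do |host|
      puts format('%-12s %s', host, two_by_two_domain?(host))
    end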

Wednesday, 28 June 2006

Promises Kept, Credibility Gaps, and Microsoft: Are we Customers or Consumers?

As reported on Slashdot, quoting Quentin Clark's WinFS team blog (which spun the item mercilessly), and commented on widely, particularly by rjdohnert and Kamal:

WinFS is dead. The name has been understood for a decade or so to refer to a "Windows File System", recently rechristened in Microsoftspeak as "Windows Future Storage" (to imply a lack of commitment to a product, or in fact to anything specific at all). In any form recognisable as the product/technology that Microsoft have hyped unrelentingly whenever they needed something to keep users (and developers) committed to the Next Windows Version, the plug has been pulled for what promises to be the very last time. This could be viewed in a number of ways; the least uncharitable explanation that conceivably touches upon our shared reality is the subject of the remainder of this item.

Yet another case of Microsoft overpromising and underdelivering? Since they really don't care about providing great software to consumers (either end users or developers), there is no real penalty for failing to keep promises (though they do, in true Rove/O'Reilly fashion, try to spin the sucker positive as hard as they can, just to keep the yokels giving the slack-jawed "wow....they say it's cool" and, as Michalski originally wrote, crapping cash).

There is absolutely no reason to keep waiting for a relational file store in Windows or any product except SQL Server (and possibly some future version of Office that requires SQL Server). There is no reason whatever to believe Microsoft will keep ANY promise made to developers or end users, now or in the future. There is absolutely no reason to believe that any gee-whiz "technology preview" given by Microsoft will ever turn into a real, stable, usable product unless that product is announced (with a ship date) at the show or conference where the demo is made. Stability and usability of said product will, as with all previous Microsoft releases, have to wait for the second service pack.

What this boils down to, in other words, is a matter of trust, and commitment, and honesty, and all the values that a company which values its customers (and workers) is expected to incorporate into its ethos. That Microsoft deliberately chooses not to do this, as it has proven on numerous occasions, shows its complete and consistent contempt for those poor schmucks it sees as consumers, not customers.

We, as developers and users, have two choices. We can either continue to prove Microsoft right, gulping whatever product they deign to deliver, crapping out whatever cash they choose to take, abjectly powerless to exert any change over their behaviour. Or, we can refuse to play their game any more. There are other tools to develop products for Windows. Most of these have the additional benefit of being cross-platform.

"Cross-platform". There's a quaintly radical word in these times. The idea that people could use a variety of systems, tools, applications, to get their work done. Companies don't have to pay US$600 to buy an office "suite" with a heavy-duty word processor, spreadsheet, and yadda yadda for a manager whose work is primarily limited to short memos? Revolutionary. Selecting tools based on the needs of the user rather than the "default" "choice" for the entire organisation? If one choice of office layout doesn't fit everybody from the managing director to the secretarial pool, then by what logic should they use the same software tools to do their work? How ma many users of, say, Microsoft Word use more than a tiny percentage (say, 5%) of the "features" in the product? (According to surveys dating back to 2000, roughly 5%). By looking at the situation as a need to give each user tools appropriate for the task at hand, rather than imposing a uniform "solution" and adapting the task to the "solution"?

This whole WinFS affair is yet another bit of weight pushing the Good Ship Microsoft towards (or past, in some opinions) the tipping point. Those already on board might do well to examine their options; those considering extending their 'booking' may wish to reconsider. The main forces arguing that no 'realistic' options exist have been marketing-driven, rather than technically- or business-driven. Consumers blindly take whatever they're given; customers demand products that meet their needs. It is high time that those who purchase and use business computer software systems, and the tools to work with them, avail themselves of their options.