Saturday 30 January 2010

Piling On: More Kibitzing about the iPad

What do I think about the new iPad? Glad you asked. What's that? You didn't, really – or not many of you did. But that's OK; at least two other writers argue that we're going from an era of "nearly universal literacy" to "nearly universal authorship", so here's my two rupiah's worth.

John Gruber, the blogger behind Daring Fireball, is a justly respected voice on a variety of topics, notably all things Apple. Two of his posts on the iPad, The iPad Big Picture and Various and Assorted Thoughts... are a good starting point if you've somehow just managed to crawl out of a bubble that protected you from hearing anything about it for the past, oh, six months or so, or more crucially the last three days. I also enjoyed reading Stephen Fry's thoughts on the device.

But one piece I would recommend above all: anybody who is wondering what this iPad thing/phenomenon really means, or who simply cares about free expression, open societies, and all the other progress that humanity has made these past few centuries, should seriously ponder Alex Payne's thoughts On the iPad. Al3x makes some very good, if deeply disturbing, points.

As many people have pointed out, the iPad differs in one critical respect from every personal computing device that Apple have ever built, from the Apple I on up to the iMac that I am typing this on. The iPad is a device, first, foremost and specifically, for the consumption of digital "content". As James Kendrick says, Apple just want us to push the 'Buy' button, in an endless, mindless Pavlovian dystopia.

"Hang on," you say, "it can't really be as bad as all that. You're just indulging in histrionics!" I truly hope so. But consider: Apple has sold every Mac, as well as all their earlier models, on the basis of a small group of generally positive, empowering ideas:

  • "It just works better."
  • "The power to be your best."
  • And, of course, "Think different".

One of the (many) things that make the Mac special is the fact that every single Mac sold comes with a copy of the full development toolset for Mac OS X, right in the box. Anybody who wants to invest the time and effort can become a Mac developer. (The money involved, of course, is in buying a Mac in the first place, but you were going to do that anyway, right?) You. The smart-aleck kid down the street. Your Aunt Tillie. Anyone. That has been one of the core strengths of the Mac platform and product: the ability for anyone, without genuflecting before any sort of gatekeeper, to write anything they can conceive of, using some pretty great tools. And thousands, tens of thousands, have. You don't need to be a big corporation. You don't even need to ask Apple's permission, or use Apple's site to market your creation. All you need is an idea, some persistence, and the willingness to learn.

Apple made a radically different statement with the iPhone. If you want to write a "real" app for the iPhone, you need to submit your bits to the App Store, a process that seems the antithesis of openness, collaboration and fairness. As a practical matter, you need to join the iPhone Developer Program, and pay a fee. Doing so subjects you to various license agreements, limitations on what can be developed, and so on.

Apple justified this by saying "hey, the iPhone is a phone, an appliance. We have to keep some control over things, to make sure that our (non-technical) customers have the best possible experience — and incidentally to ensure that we continue to honor agreements we've made with carriers like AT&T." And the developer community grumbled and moaned, but large numbers went along. And the App Store, by almost any measure, is a raging success in the aggregate — even though most individual developers aren't making much at all.

Now comes the iPad, which has been widely, often dismissively, described as a "super-sized iPod Touch." Which, in several senses, it is. But whereas the "iTouch" is an accessory in one's life, being a combination PDA, music player, and (ostensibly) simple application platform, the iPad promises to be all that and more. Specifically, it's being pitched as this "magical" piece of technology which you'll "always" have at hand. Why? Well, you can listen to music on it, or watch videos, or run the same apps you ran on your iTouch or iPhone along with a new generation of "super-sized" apps. Oh, and Apple did announce that their office-software packages would be available on the iPad — so you can use it for "work stuff."

What about the kind of creative outlets that have been highlighted at each Mac introduction since the 1980s? Well, um, good luck with that. Because, even though this is a "real" computer, it's "really not"; it's an up-sized iPod Touch, which is (officially, mostly) a passive device.

The iPad is being pitched as a "magical", but inherently passive, device that consumers will use to buy (or, usually, rent) "content". There'll be "social media" like Facebook, Twitter and so on; the iPhone already supports those. Anything that can be done entirely over the Web, without the use of plugins like Java or Flash (blogger.com, for instance), is kosher, too. But anything truly creative, "revolutionary", "game-changing", is going to have to survive the App Store gauntlet — which means that it's going to have to be consistent with Apple's view of how the iPad "should" be used. As a passive device which consumers use to buy access to content.

Think "57 Channels and Nothing On," a million times over. Perfect for the top-down, don't-make-me-think society. (I'm sure it will be very popular here in Singapore, for precisely that reason.)

Apple, you can do better. We know, because you've mostly done much better before, and encouraged us, developers and users, to do great things with what you've built for us. The iPad isn't so much a step in the wrong direction, it's a leap of faith worthy of Wile E. Coyote — but we know how far off the edge of the cliff we've gone. And the only way to go from here, is down.

Thursday 28 January 2010

What's wrong with this picture?

The New York Times front page, as displayed in Apple's iPad sales page

Apple chose the only name that ever made sense for their new change-the-world device: the iPad.

So what's not to like about it? Well, that depends on who you are and what you really wanted this to be. No surprise there.

My biggest problem with it was summed up well by James Kendrick in his post, Thoughts on the iPad — Just Push the Buy Button, says Apple: it really is primarily about media consumption – i.e., paying big corporations for temporary, passive access to data. Something to switch your creative intellect off, not on.

And those of us who are often called the "Apple faithful" were hoping, praying if you will, for something so much more. Which in fact this could still be. But for the first time in Apple history, a product with serious computing power is being positioned in the marketplace such that creative, collaborative expression by individuals is almost a subversive act.

We've fallen a long, long way from "Think Different". It's now much, much harder to argue that Apple is still a computer company dedicated to creativity and the pursuit of excellence than to argue that Apple has become "just another" corporate big-media company, helping to turn the Internet from what was once called "the greatest opportunity for expressive, collaborative democratic action in human history" into just another television set. Active creation becomes passive "consumption." All hail the primacy of content, and don't even think about upsetting the applecart. That would be Thinking (too) Differently.

Tuesday 19 January 2010

ANFSD: Newman and Redford, Not Streisand and Farrow

The leads in the classic 1973 movie, "The Sting", of course. Both leads were male, playing roles that were prototypically male in our social order. Why is that?

Clay Shirky has some ideas about that, which he wrote up as "A Rant About Women". Not "A Rant Against Women", mind you. Some did not see that as a difference that makes a difference, notably Zo in this post on her (misnamed) humorlessbitch.com blog. The rest of this post, beyond this explanatory header, is the comment I left on her post, the 190th comment in 30 hours. I have slightly reformatted it, as the original could only use CAPS for emphasis.

A tip of the hat to Venessa Miemis (@venessamiemis on Twitter) for tweeting about this.


Maybe this is because I have the ability to look at this as a Shirkyesque male who “only told lies I could live up to, and I knew when to stop.” The sorry fact that he’s pointing out – too obliquely – is that many/most arenas of professional and creative endeavour have been created, led and/or taken over by men who fit the portrait that Shirky is painting. Risk-taking, self-promotion and so on are seen as predominantly “male” character aspects/flaws, in large part because society has been and is fundamentally sexist – and is likely to remain so unless and until women can function within the existing framework successfully enough to alter it.

That’s what I get from reading Shirky’s piece; not a snarky bit of braggadocio that “men are on top because we know how to game the system,” so much as “until increasing numbers of women in a variety of fields – professional, academic, artistic, and so on – can work within this aspect of the system, it won’t be changed.” Not because change wouldn’t benefit men as well as women. It would. Rather, because those “gatekeepers” who control the system, having got to where they are now by gaming the system, lack the will if not the means to change it.

We are at a turning point in human history, of a kind that hasn’t been seen in well over a thousand years. Institutions and conventions are changing all around us, and, not knowing how to change and survive, they change and die. Others watch the process attentively, determined against reality to avoid the same fate. But if we do NOT come up with a way to fundamentally change our society, to make it more equitable, transparent and open, we risk a new Dark Ages that will compare to the European mediaeval one as a broken fingernail compares to a petrol bomb.

I believe that is what Shirky fears, as do I.


Comments, please.

Sunday 17 January 2010

Fixing A Tool That's Supposed To Help Me Fix My Code; or, Shaving a Small Yak

Well, that's a couple of hours I'd really like to have back.

One of the oldest, simplest, most generally effective debugging tools is some form of logging. Writing output to a file, to a database, or to the console gives the developer a window into the workings of code, as it's being executed.
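For anyone who hasn't leaned on this technique before, the idea can be sketched in a few lines. This is a deliberately minimal, hypothetical file logger, not log4php's API; the function and file names are made up for illustration:

```php
<?php
// Minimal file-based logging -- a sketch of the technique, not log4php.
// Each call appends one timestamped, levelled line to a log file,
// giving you a running trace of what the code did and when.
function log_msg($file, $level, $msg) {
    $line = date('Y-m-d H:i:s') . " [$level] $msg\n";
    file_put_contents($file, $line, FILE_APPEND);
}

log_msg('debug.log', 'DEBUG', 'entering the parser');
log_msg('debug.log', 'ERROR', 'unexpected token at offset 42');
```

Packages like log4j and log4php wrap this same idea in configurable levels, output targets and message formats, so you can turn the firehose up or down without touching the code being debugged.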

The long-time "gold standard" for such tools has been the Apache Foundation's log4j package for Java. As with many tools (e.g., PHPUnit), a good Java tool was ported to, or reimplemented in, PHP. log4php uses the same configuration files and (as much as possible) the same APIs as log4j. However, as PHP and Java are significantly different languages, liberties had to be taken in various places. Add to this the fact that PHP has been undergoing significant change in the last couple of years (moving from version 5.2 to the significantly different 5.3 as we wait for the overdue, even more different, 6.0), and a famous warning comes to mind when attempting to use the tool.

Here be dragons.

The devil, the saying goes, is in the details, and several details of log4php are broken in various ways. Of course, not all the breakage gives immediately useful information on how to repair it.

Take, as an example, the helper class method LoggerPatternConverter::spacePad(), reproduced here:

    /**
     * Fast space padding method.
     *
     * @param string    $sbuf      string buffer
     * @param integer   $length    pad length
     *
     * @todo reimplement using PHP string functions
     */
    public function spacePad($sbuf, $length) {
        while($length >= 32) {
          $sbuf .= $GLOBALS['log4php.LoggerPatternConverter.spaces'][5];
          $length -= 32;
        }
        
        for($i = 4; $i >= 0; $i--) {    
            if(($length & (1<<$i)) != 0) {
                $sbuf .= $GLOBALS['log4php.LoggerPatternConverter.spaces'][$i];
            }
        }

        // $sbuf = str_pad($sbuf, $length);
    }

Several serious issues are obvious here, the most egregious of which is acknowledged in the @todo note: "reimplement using PHP string functions." The $GLOBALS item being referenced is initialized at the top of the source file:

$GLOBALS['log4php.LoggerPatternConverter.spaces'] = array(
    " ", // 1 space
    "  ", // 2 spaces
    "    ", // 4 spaces
    "        ", // 8 spaces
    "                ", // 16 spaces
    "                                " ); // 32 spaces

If you feel yourself wanting to vomit upon and/or punch out some spectacularly clueless coder, you have my sympathies.

The crux of the problem is that the function contains seriously invalid code, at least as of PHP 5.3. Worse, the error messages that are given when the bug is exercised are extremely opaque, and a Google search produces absolutely no useful information.

The key, as usual, is in the stack trace emitted after PHP hits a fatal error.

To make a long story short(er), the global can be completely eliminated (it's no longer legal anyway), and the code can be refactored so:

    public function spacePad(&$sbuf, $length) {
        $sbuf .= str_repeat( " ", $length );
    }

Of course, this makes the entire method redundant; the built-in str_repeat() function should be called wherever the method is called.
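For anyone who wants to convince themselves the one-liner is behaviour-preserving, here is a quick side-by-side sketch. The loop version below reproduces the original's logic (with the lookup table made local rather than a $GLOBALS entry, so it runs standalone); both append exactly $length spaces:

```php
<?php
// The original algorithm: append 32-space chunks while $length >= 32,
// then use the bits of the remainder to pick 16/8/4/2/1-space strings
// from a lookup table. Net effect: exactly $length spaces appended.
function spacePadLoop(&$sbuf, $length) {
    $spaces = array(" ", "  ", "    ", "        ",
                    str_repeat(" ", 16), str_repeat(" ", 32));
    while ($length >= 32) {
        $sbuf .= $spaces[5];
        $length -= 32;
    }
    for ($i = 4; $i >= 0; $i--) {
        if (($length & (1 << $i)) != 0) {
            $sbuf .= $spaces[$i];
        }
    }
}

// The refactored version: one built-in call does the same job.
function spacePad(&$sbuf, $length) {
    $sbuf .= str_repeat(" ", $length);
}

$a = $b = "x";
spacePadLoop($a, 37);   // 37 = 32 + 4 + 1, three table lookups
spacePad($b, 37);
var_dump($a === $b);    // identical results
```

The bit-twiddling trick made sense in log4j, where repeated Java string concatenation was once genuinely expensive; in PHP it buys nothing over the built-in.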

I'll make that change and submit a patch upstream tomorrow... errr, later today; it's coming up on 1 AM here now.

But at least now I can use log4php to help me trace through the code I was trying to fix when this whole thing blew up in my face.

At least it's open source under a free license.

Friday 15 January 2010

Virtual Real Work

(As posted in a comment on a very detailed Ars Technica review.)

After giving PD5 and Fusion 3 full evaluations, I'm going back to Fusion for another cycle.

A lot of what I do is testing software in odd developer builds of various BSDs, Linuxes and OpenSolaris; Fusion gives me the least trouble when venturing away from the Microsoft megalith. I have XP and Win7 VMs too; I just don't fire up either more than once or twice a month. I typically have 2-3 other VMs up and on my network at any given time.

I also took a good look at VirtualBox; it's the best of the bunch for running OpenSolaris (wonder why?), but for anything else, particularly the oddball Linuxes, it gave me as much trouble as PD 5. Where Parallels would crash like a drunken teenager in a Maserati, VirtualBox just would not install several important systems. So much for that.

VMware isn't as slick in some areas as PD5 (LOVE that one-click 'exclude from TM' feature, and the VM list is nice), but it DOES have one feature that's absolutely non-negotiable: it tends to work better, more often. That's worth the extra US$10 or so for the upgrade (plus an additional new license for my new MBP).

I've put about 20 hours over the past 2 months into formal evaluation of the hypervisors mentioned (along with a couple of hundred hours of just using them to run VMs with which I was working, and not trying to think "is this better than...?"). The pain level with Parallels has been fairly consistent; VMware only surprised me a couple of times. Parallels also gives you only half the review time that VMware does before you have to pony up. That tells me that they aren't as confident in their product (and rightfully so); there are problems that merely suggest themselves in 15 calendar days of varied use that you could really get a handle on in 30. Not having confidence, borne of experience, in my ability to solve subtle issues as they arise wasn't the only reason for rejecting Parallels Desktop 5, but it certainly didn't help.

The impression I get from using both products, from reading what has been written and from chatting with other users, is very consistent. VMware portray themselves as a company that tries, and often succeeds, to do things well - starting with a truly usable, stable, versatile virtualisation platform. Parallels, on the other hand... by bringing out releases well before they're really ready, and with a shorter evaluation time, give the impression that they're desperate for your money. Give it to the company that actually earn it.

Tuesday 5 January 2010

Address Book him, Danno!

This is a follow-up to a tweet I left yesterday, where I was praising a great little app called ABMenu. This little guy just sits in the system menu bar and gives you easy, two-click access to any entry in your Address Book. I'd installed it earlier on my main iMac. This morning, I noticed that it wasn't installed on the new, two-week-old MacBook Pro, and fixed that.

Then of course I noticed that the new system didn't have all the entries that the old one did. The customary way of syncing data like this between Macs is to use Apple's MobileMe service; the Mac can also use LDAP or other enterprise-style directory services. "Fine," I thought, "this definitely sells a lot of MobileMe subscriptions." (In fact, getting one is on my New Year's resolution list; the reasons for the delay have been Complicated™.) Continuing the thought process: "There's gotta be a dedicated app out there that will let me sync just Address Book."

A couple of minutes browsing VersionTracker brought me to address-o-sync, which does just what it says on the tin. One obviously reasonable but slightly irritating limitation: the app has to be installed and running on both Macs in order to sync; address-o-sync can't reach across your local net and pull "raw" Address Book data. (It compensates nicely, with a couple of features that would be impractical otherwise.)

So, I've updated my "how to bring up a new Mac" checklist and starred both ABMenu and address-o-sync on my installed-app lists. Take a look, and see if you agree.

(Now if I could only get the Hawaii Five-O theme song out of my head.)

Monday 4 January 2010

The Audacity of Hype

This is the content of a comment which I posted to a TechCrunch article on The World Doesn't Need Someone Telling Us What We Don't Need In Tech. For what it's worth, I hope the iSlate/iPad/iTablet/iWhatever is real this time; if we have another Charlie Brown, Lucy and the Football moment, I think the market will see this as one of Apple's rare blown opportunities - to the tune of maybe $50 off the share price.

I'd be interested in your thoughts.


I also disagree with the "comfort zone" limitation idea; cf. Shaw and the Unreasonable Man. Apple's successes have all been revolutionary innovations; the few times that a "me too" product has come out of Cupertino (Apple III, Lisa), it's sunk at high warp speeds.

If the iPad is even halfway affordable (by Apple standards, anyway), they'll sell like iPods on steroids; I'd certainly buy one. Even having a big, beautiful iMac and a trusty MacBook Pro, I can see several useful areas for an iPad. More-convenient-than-laptop media browsing and ebook reading is the one everybody latches on to. But I could take an iPad onto the train every morning and get stuff done - standing on a lurching train filled with people from cultures that don't know how to move in crowds efficiently. I could take it into meeting-room meetings and hallway meetings more easily and effectively than a full-clamshell laptop. And it would probably be easier to gather a couple of people and run through a quick presentation with an iPad than a laptop, too. (Think "elevator pitch.")

Sure, it would fill a niche and scratch a few itches that I already know about. But for it to be truly revolutionary, it will need to find uses that I/we can't really predict yet, because we've never really had something capable of filling them - we've just sort of worked around limitations we weren't really aware of. (See 'iPhone' for previous example.) I've got more faith in Apple to pull something that revolutionary out of their hats than I do in any other tech company today — or in the 30 years I've been in this industry.

We ain't seen nothin' yet.

Sunday 3 January 2010

ANFSD: The Present Isn't Always Perfect

If anybody has had doubts about the decline and fall of the English language over the past two or three decades, current media (print and online) should clear that up very nicely.

If I see another reviewer who starts off with "I've had the (Product X) for a week and I am loving it," it will be very difficult to restrain myself from throwing a brick through the display. There's this widespread tendency to use the progressive with verbs of state, and the present perfect where a simple tense would do.

Part of this, I understand, is simple cultural differences. Many of these writers either come from or work in an environment heavily influenced by South Asian variations of British English. Somebody in a position of educational or institutional power, somewhere along the line, decided that if the "present perfect" was indeed "perfect", then it was better to use that in any conceivable sentence.

This influence is relatively new in the larger English-literate world. I don't remember seeing this style when I started writing seriously 30 years ago, or even during the time of my first real multi-national distributed development team back in the early 1990s. People had varying levels of language ability, just as we all had varying abilities in other areas. One of the great things about those early teams was that each participant clearly understood that they had something to learn; that other teammates had skills and abilities that they could learn from. Since that time, the offshoring/outsourcing industry has devolved into what too often seems a series of sealed bubbles bumping against each other, but that's a topic for another post.

Just let me say that, as has happened so many times before throughout history, language patterns mingled (or 'mangled', depending on your views). Absent widespread, effective English language instruction in American schools over the last 30 years (yet another topic unto itself), many younger American technical folks adopted the "new", "cool", "exotic" English usage borrowed from their peers. And since the software (and Web) industry is unrelentingly ageist, at least as much as, say, theoretical physics (where, I've been told, you're over the hill by your mid-20s), what the "cool kids" were doing quickly became the norm for the industry.

It may well be that "all this has happened before, and all this has happened again." That does not make it any more comfortable, or comforting, to those who are aware of how imperfect the present really is.

Oh, and by the way, Happy New Year! One is sorely tempted to say "It can't possibly be as bad as 2009 was..." Don't give in to the temptation. $DEITY tends to take that as a personal challenge... one that you or I can't possibly win.