Tuesday, 31 August 2010

My reaction to the new, pico-sized iPod Nano

This is the content of a comment I originally posted on ZDNet, in the Talkback section for the article New iPod Nano rumored to shrink further, by Jason D. O'Grady.

Cupertino, CA (1 April 2012) (UPI) Apple Inc. (NASDAQ: AAPL) today introduced the new iPod Implant, which is to replace both the existing Nano and Shuffle models.

At the launch event at the now-traditional Yerba Buena Center, retiring Chairman Emeritus Steven P. Jobs announced that the new device would be available immediately. Roughly the size and shape of two typical breath mints, the newest iPod model is designed to be implanted next to the user's cochleas where, using very low-powered vibrations, it plays stereophonic music which the user hears in "perfectly detailed clarity," according to Mr. Jobs.

Apple announced a partnership with The Curanderismo Group, a nationwide chain of medical practitioners who have been designated as "preferred installers" of the new device. At the event, Dr. Yuri Gallyutsinogenov, President of the Group, explained, "Installing the iPod Implant is a simple surgical procedure, which nearly all our practitioner members are now trained in. For the low, low price of $199, a customer can buy the new device and have it implanted on the same day by our specialists."

Mr. Jobs was quick to clarify that the price quoted by Dr. Gallyutsinogenov was just for the installation procedure. The iPod Implant, available with storage capacities of 8, 16 and 32 gigabytes (GB), is priced from $199 to $399 for the device itself.

Tuesday, 17 August 2010

In Praise of Robust Tools and of The Future™

One of the best points about doing Web development in PHP is that it's so widely used: several respectable estimates, by organizations that get paid to find these things out, put PHP's share of all sites on the World Wide Web somewhere north of 50%. This includes numerous content management systems (CMSes) such as WordPress, Joomla! and Drupal.

One of the far-less-good points about doing Web development in PHP is that it's so widely used. The (evolving) "standard" set of tools, in true open-source fashion, is assembled and maintained by an ad-hoc band of individual and small-corporate luminaries, on infrastructure that worked just fine back when any given server was hit a couple of thousand times a year. By continuing to rely on (what to the outside Net appear to be) individual servers without terribly huge pipes connecting them to the larger Internet, that infrastructure fails completely when demand scales up by a factor of a thousand or ten.

Web developers using PHP are highly dependent on an extension architecture/platform called PEAR, the "PHP Extension and Application Repository." Singular. Well, singular for any given extension or application. So, as the user base scales upward and popular tools go through vigorous release cycles, the servers they're hosted on (and the network choke points between large subsets of users and those servers) start having reliability problems: transfers slow to a crawl, or fizzle out entirely. (One such transfer this evening proceeded at a sedate 600 bytes per second. Not kilobytes: bytes. That's 1980s dial-up speed.)

Developers in other languages have similar tools: Rubyists have gems, available from sites like rubygems.org; Pythonistas have eggs. But somehow I hear a lot less grumbling, and do less of my own, when the subjects of gems or eggs come up; they seem to Just Work™, which suggests that the underlying network infrastructure is a lot more robust: larger "pipes," more distributed servers (as with a CDN), or both.

Trying to access servers like this from Second World cities like Singapore that apparently devote more resources to content filtering/monitoring than easing congestion doesn't help either. Hey, SingNet and M"D"A, how come it takes ten hops to get off an island that's barely that many miles across? If we want to plug Singapore into the "new global economy," having grotesquely under-resourced connections is not helpful.

But back to the main subject: if PHP is going to continue its phenomenally successful growth, then the infrastructure is going to have to decentralise, aggressively. Having essential tools like PEAR and the PHPUnit repository available only from a single point ensures that single points of failure will continue to seriously compromise the PHP ecosystem. No access to servers means no access to tools; no access to tools means developing PHP software with reasonable efficiency and economy becomes severely degraded.

In other words, the status quo is a limiting factor on future growth and success. However, as many will be quick to point out, building infrastructure costs money – for hardware, for connectivity, for the paid labour of skilled craftspeople and professionals to create, install, operate and maintain this enhanced infrastructure. No explicit means of funding such an endeavour presently exists, at least not to my knowledge.

So where do we go from here, PHP community?

Monday, 16 August 2010

Proof Against Most Idiots

Fair warning: Geekish laudatory rant ahead. Still with me? Good!

I suppose it says something less than complimentary about our Craft that, when things actually work in a sensible fashion, recovering from Stupid User Actions™, it's surprising enough to be noteworthy. But I seriously doubt that I'm the only one who's noticed this.

Case in point: yesterday, I started downloading the Debian Linux testing-version DVDs. Yes, plural; there are 8 of them, as the standard software repository comes along with the test build. Anyway...

As most of us do, I was multitasking pretty heavily last night; when I knocked off, I just put my iMac to sleep as usual. It was long after I had gone to bed that I remembered, "hey, wasn't I downloading a passel of Linux DVDs when I slept the system?"

Expecting to have several "interesting" error messages displayed when I woke the system up (if in fact the file system hadn't been somehow damaged... yes, I was used by Windows for far too long), I powered up the system and was about to log in when BAM!... 15 minutes of phone calls. Thoroughly distracted afterwards, I log in, check my email, read a couple of new bug reports for a project I'm working on, and then remember "hey! everything's working!"

I switch to the Terminal window for the download (which was using GNU wget 1.12 if you care), to be greeted by the following display (excerpting):

--2010-08-15 19:45:15--  http://cdimage.debian.org/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Resolving cdimage.debian.org (cdimage.debian.org)...,
Connecting to cdimage.debian.org (cdimage.debian.org)||:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso [following]
--2010-08-15 19:45:20--  http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Resolving caesar.acc.umu.se (caesar.acc.umu.se)... 2001:6b0:e:2018::142,
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)|2001:6b0:e:2018::142|:80... failed: No route to host.
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4696719360 (4.4G) [application/octet-stream]
Saving to: “debian-testing-amd64-DVD-6.iso”

89% [===============================================================================================>           ] 4,214,102,016 --.-K/s   in 13h 40m 

2010-08-16 09:26:01 (83.6 KB/s) - Read error at byte 4214102016/4696719360 (Operation timed out). Retrying.

--2010-08-16 09:26:04--  (try: 2)  http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)||:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 4696719360 (4.4G), 482617344 (460M) remaining [application/octet-stream]
Saving to: “debian-testing-amd64-DVD-6.iso”

100%[++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++==========>] 4,696,719,360 1.26M/s   in 6m 26s  

2010-08-16 09:32:50 (1.19 MB/s) - “debian-testing-amd64-DVD-6.iso” saved [4696719360/4696719360]

Note the read error reported at 09:26:01 on 2010-08-16: the software was complaining that it had to recover from a timeout. Yes, I'm sure thirteen-plus hours significantly exceeds whatever timeout value is coded into wget... but it never missed a beat (or a byte; the SHA1 checksum matched afterwards).
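The mechanism behind that seamless recovery is ordinary HTTP: the client checks how much of the file is already on disk, asks the server for just the remainder with a Range header, and the server answers 206 Partial Content (exactly what the log above shows on the retry). Here's a minimal sketch of that bookkeeping in Go; this is an illustration of the technique, not wget's actual source, and the file names are made up:

```go
package main

import (
	"fmt"
	"os"
)

// resumeOffset reports how many bytes of a partially downloaded file are
// already on disk. A resuming client sends this offset in an HTTP
// "Range: bytes=<offset>-" header; a server that answers with
// "206 Partial Content" then supplies only the remainder.
func resumeOffset(path string) int64 {
	info, err := os.Stat(path)
	if err != nil {
		return 0 // nothing on disk yet: start from byte zero
	}
	return info.Size()
}

func main() {
	// Illustrative only: a throwaway partial file standing in for the ISO.
	tmp, err := os.CreateTemp("", "partial-*.iso")
	if err != nil {
		panic(err)
	}
	defer os.Remove(tmp.Name())
	tmp.Write([]byte("0123456789")) // pretend 10 bytes already arrived
	tmp.Close()

	fmt.Printf("Range: bytes=%d-\n", resumeOffset(tmp.Name())) // Range: bytes=10-
}
```

wget simply repeats this dance on every retry, which is why a multi-hour stall costs nothing but time.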

I'm including the obligatory "don't try this at home, kids!" warning... but, if you aren't sitting in front of Windows, isn't it nice to know you can?

Sunday, 15 August 2010

Where Do We "Go" From Here? D2 Or Not D2, That the Question Is

Unless you've been working in an organization whose mission is to supply C++ language tools (and perhaps particularly if you have, come to think of it), you can't help but have noticed that a lot of people who value productive use of time over esoteric language theory have come to the conclusion that the C++ language is hopelessly broken.

Sure, you can do anything in C++... or at least in C++ as specified; how you do it in any particular C++ implementation varies. That's a large part of the justification for libraries like the (excellent) Boost: to tame the wilderness and present a reasonable set of tools that will (almost certainly) work with any (serious) C++ compiler in the wild.

But, as several language professionals have pointed out, very, very few (if any) people know all there is to know about C++; teams using C++ for projects agree (explicitly or otherwise) on what features will and won't be used. Sometimes, as in the case of many FLOSS projects, this is because compiler support has historically been inconsistent across platforms; more generally, it's to keep the codebase within the reasonable capabilities of whatever teammates poke at it over the life of the project.

I've been using C++ for 20+ years now, and I still learn something — and find I've forgotten something — about C++ on every project I embark on. Talking with other, often very senior, developers indicates that this is by no means an isolated experience. Thus, the quest for other language(s) that can be used in all or most of the numerous areas in which C++ code has been written.

If you're on Windows or the Mac, and happy to stay there, this doesn't much affect you beyond "there but for the grace of $DEITY go I." Microsoft folks have C#, designed by Anders Hejlsberg, who previously designed and led development of the sublime Delphi language. C# essentially takes much of the object model from Delphi and does its best to avoid the various land mines encountered by Java and particularly C++, while remaining quite learnable and conceptually familiar to refugees from those languages.

On the Mac, Objective-C is the language of choice, providing commendably clean, understandable object-oriented wrapping around the nearly-universal C language (which is still fully usable). Objective-C is the reason most Mac-and-other-platform developers cite for their high productivity writing Mac code, along with the excellent Mac OS X system libraries. Objective-C is supported on other platforms, but the community of users on non-Apple or -NeXT systems is relatively small.

Google created the Go language and first made it public in late 2009. The initial design was led by Robert Griesemer, Rob Pike and Ken Thompson. Pike and Thompson are (or should be) very well-known to C and Unix developers; they were among the pioneers of both. Robert Griesemer, currently with Google (obviously), was apparently previously best known in the Oracle data-warehousing community, but a quick search turns up a few interesting-sounding language-related hits prior to Go.

Go is described as "simple, fast, concurrent, safe, fun" and open source. It doesn't try to be all possible things to all possible users; it has a relatively small, conceptually achievable mission, and leaves everything else to either The Future™ or some to-be-promulgated extension system.
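The "simple, concurrent" claim is easy to see in practice: goroutines and a WaitGroup from the standard library's sync package express fan-out parallelism in a handful of lines. A small sketch (the squares function is just a toy workload I've invented for illustration):

```go
package main

import (
	"fmt"
	"sync"
)

// squares computes 0²..(n-1)² with one goroutine per element; each
// goroutine writes to its own slot, so no further locking is needed.
func squares(n int) []int {
	results := make([]int, n)
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(k int) {
			defer wg.Done() // signal completion even if the work panics
			results[k] = k * k
		}(i)
	}
	wg.Wait() // block until every goroutine has called Done
	return results
}

func main() {
	fmt.Println(squares(5)) // [0 1 4 9 16]
}
```

Compare that with the thread-management boilerplate the equivalent C or C++ would need, and the "fun" part of the slogan starts to look plausible too.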

Go is small and young, has a corporate sponsor acting as benevolent dictator, and has an increasingly active community around the language; if it doesn't fall into the pits of internal dissension or corporate abandonment (as, say, OpenSolaris has), it looks set to keep that community happily coding away for some years to come. Given a choice for a greenfield project between C and Go, I can see many situations where I'd choose Go. I can see far fewer where I'd choose C++ over either Go or D.

D is another, rather different, successor language to C++ (which, as its own name states, considers itself an improved or "incremented" C). Where Go is starting small and building out as a community effort, D was designed by Walter Bright, another C-language luminary: he created the first native C++ compiler (i.e., the first not producing intermediate C code to be compiled "again"). Unlike Go, D's reference implementation is a revenue-generating commercial product. Unlike Go, D has an extremely rich language and set of core libraries; one can argue that these features make it closer to C++ than to the other languages discussed here. Yet it is that richness, combined with an understanding of the two major competing implementations, that makes D attractive to C++ and Java refugees: they can write the same types of programs as before, in a language that is cleaner, better-defined and more understandable than C++ can ever be.

D2, or the second major specification of the D language, appears to have wider community acceptance than D1 did, and we can confidently expect multiple solid implementations (including GCC and, hopefully, LLVM)... if the larger community gets its house in order. I wrote a comment on Leandro Lucarella's blog where I said that Go "could be the kick in the pants that the D people need to finally get their thumb out and start doing things 'right' (from an outside-Digital-Mars perspective)."

Competition is generally a Good Thing™ for the customers/users of the competing ideas or products. It's arguably had a spotty record in the personal-computing industry (85%+ market share is not a healthy sign), but the successor-to-C++ competition looks to be entering a new and more active phase... from which we can all hope to benefit.