Tuesday, 18 August 2009

Responding while Forbidden: Gender and Other Issues in OSS and PHP

This post is what would have been a response to a post on Elizabeth Naramore's blog, which she titled Gender in IT, OSS and PHP, and How it Affects Us *All*. Quite a good post, actually, with a long and often thoughtful (but as often thoughtless) comment thread following. I was hoping to respond to a comment on the post, but that apparently is no longer allowed, even though the "Leave a Reply" form at the bottom of the page is functional. There's also a mysterious "Login/Password" pair on the page, but no indication of which ID one uses, or how to go about getting one.

Following is the content of the reply-that-wasn't: I (perhaps unreasonably) think this has some points worth pondering. Please do read the original post first and then come back here - where the "comment" feature definitely does work.

@A Girl: Great that you're doing AP CompSci next year. As someone who's been in the Craft for 30 years, I have a great sentimental attachment to your idea that "teachers and professors have the chance to shape the mindsets of their students towards women in the industry". If we were a true profession, where essentially all practitioners have a certain common level and content of educational background combined with qualifying experience (e.g., apprenticeship, internship), I'd agree wholeheartedly.

The fact is that many if not most of the people in the industry - both the "coders in the trenches" and the ex-coders who got promoted into management ("because they were such great coders", thereby removing two qualified people from an organization) - far too many of these people have no formal education in CS (or, often, anything else). And by the time they realize how important that might be, they're old enough to be facing ageism in the workplace already - not confident enough to put a "big hole in the middle of [their] career" and go back to school. It doesn't help that schools the world over do such a lousy job of outreach and marketing to those potential students; they're focused on the Executive MBAs and other graduate-level returning students, who can have their pricey programs paid for by an employer. For Joe or Jane Schmuck, trying to keep their heads above water against cut-throat competition from planeloads of new arrivals with mimeographed certifications - people who've been taught their entire lives never to think outside the box - things get really tough out there. I'm not surprised that enrollment in CS programs is down. I'm amazed beyond words that it's still as high as it is; a less starry-eyed observer might expect the number of CS majors to closely track, say, majors in Phoenician economics.

A lot of the new entrants into CS and IT over the last ten years or so have degrees - they're just not in the "obvious" field. Someone, and I wish I could find the original, wrote an article in one of the industry mags (like C/C++ Users Journal, not IEEE Computer) arguing that, to be good at software in the modern era, one needed exposure to "behavioral science, psychology, linguistics, human factors, sociology, philosophy, rhetoric, ethnology, ethnography, information theory, economics, organizational politics, and a dozen other things - and please, please learn to write competently in English!" I've had that taped above my display for years now. So it's not that we're not educated; the problem - and it is a problem - is that there is no universal common body of knowledge (CBOK) for software "engineering", which is one of the necessary precursors of any true profession. We're not going to get a CBOK without either a broad consensus within the industry, or a system imposed by outside (and very narrowly-focused) forces in government or the larger economy. Given the prevailing social and political attitudes of current practitioners ("herding libertarian-poseur cats" is a phrase not infrequently heard), such an imposition would seriously disrupt the Craft and, by extension, any industry or field dependent upon software (which by now is pretty much everything).

How to solve the problem - and, in so doing, help redress the pandemic sexism, racism and ageism (in huge parts of the world, recruiting with explicit age limits is perfectly legal, and here in South Asia, you're old for coding at 28)? I've got no idea. When I first started doing this, I thought that within the next thirty years or so (from 1979), we'd be able to turn this informal craftwork - which had taken the industry away from the "educated CS types" - into a real profession. Now? I'd say we're still 30 to 50 years away - unless we have a software equivalent of the New London School explosion and a "solution" gets imposed from outside. We need to grow up, and quickly.

Monday, 17 August 2009

We Interrupt This Program...

We interrupt this tutorial to interject an observation about goals, methods and promises: the goals we have for ourselves as people and as professionals; the methods we use to pursue those goals; and, perhaps most importantly, the promises we make, both to ourselves and to our customers, about what we're doing and why.

I consider this after reading the Website for some consultants who've done some (relatively nice, relatively low-key) link-dropping on LinkedIn. I'm not naming them here, not because I don't want to draw attention to them (their own site is very clean and well-done), but because the point that I'm going to be making here isn't just limited to them - we, as a craft and an industry of Web development, have some Serious Problems™.

The "problem" is nicely summarized by this group's mission statement:

Our mission is to produce the perfect implementation of your dreams.

What could possibly be wrong with that?

As a goal, implied but left unspoken, absolutely nothing; both as practitioners and as clients, we tend to set ever-higher goals for ourselves. Indeed, that's the only way the "state of the art" - any "art" - can advance. But we who practice the art and craft of software (including Web) development (as opposed to the engineering discipline of hardware development) have a history of slashed-beyond-reality schedules and budgets coupled with a tendency for stakeholders not to hear "if all goes well" as a condition to our latest schedule estimate. We have a history, perceived and actual, of promising more than we can deliver. Far more attention is paid by non-technical people to the "failures" and "broken promises" of software than to things done right. For a craft whose work is accruing increasing public-policy and -safety implications, the effect of unrealistic expectations, brought about by poor communication and technical decisions being made by people who aren't just technically ignorant but proud of the fact, is disturbing. What started as a slow-motion train wreck has now achieved hypersonic speeds, and represents a clear and present danger to the organisational health and safety of all stakeholders.

I don't mean to say that projects always fail, but an alarming number of them do. If, say, dams or aircraft were built with the same overall lack of care and measurable engineering precision that is the norm in commercial software development, we'd have a lot more catastrophic floods, and a lot fewer survivors fleeing the deluge by air. When I entered this craft thirty years ago (last May), I was soon led to believe that we were thirty to fifty years away from seeing a true profession of "software engineering". As a time frame beginning now, in 2009, I now think that was almost laughably optimistic.

Why have things gotten worse when we as a tool-building and -using society need them to get better? Some people blame "The Microsoft Effect" - shipping software of usually-dubious quality to consumers (as opposed to 'customers') who have bought into the (false) idea that they have no realistic choice.

It's more pervasive than that; commercial software development merely reflects the fashion of the business "community" that supports it, which has bought into one of the mantras of Guy Kawasaki's "The Art of Innovation", namely "don't worry, be crappy." Not that Kawasaki is giving bad advice, but his precondition is being ignored just as those of other software people have been: the key sentence in his "don't worry, be crappy" paragraph is "An innovator doesn't worry about shipping an innovative product with elements of crappiness if it's truly innovative" (emphasis mine).

In other words, if you really are going to change the world, nobody will notice if your Deus ex Machina 1.0 has clay feet, as long as you follow up quickly with a 1.1 that doesn't... and follow that with a 2.0 that changes the game again. But the gap between 1.0 and 1.1 has to be short, Kawasaki argues (in the next paragraph, titled "Churn, Baby, Churn"), and the version after that has to come along before people (like possible competitors) start saying things like "well, he just brought out 1.1 to fix the clay feet in 1.0." If the customers see that you're bringing out new versions as fast as they can adapt to the previous ones, but that each new version is a vastly superior, revelatory experience compared to the earlier release that they were already delighted by, they'll keep giving you enough money to finish scaling the "revolutionary" cliff and take a (brief) rest with "evolutionary" versions.

Business has not only forgotten how important that whole process is to its continued survival; it has removed the capability for its bespoke software (and Web) infrastructure to use and reuse that model. All that remains is "it's OK if we ship crap; so does everybody else." That's the kind of thinking that made General Motors the world-bestriding Goliath it is today - as opposed to the wimpy also-ran it was (emphatically NOT) half a century ago.
We really don't need any more businesses going over that sort of cliff.

What we do need, and urgently, are two complementary, mutually dependent things. We need a sea change in the attitude of (most) businesses, even technology businesses, towards software - to realise and acknowledge that the Pointy-Haired Boss is not merely a common occurrence in the way business manages software, but actively threatens the success of any project (and business) so infested. Just as businesses at some point realise that "paying any price to cut costs" is an active threat to their own survival, they need to apply that reality to their view of and dealings with the technical infrastructure that increasingly enables their business to function at all.

Both dependent on that and as an enabler of that change, the software and Web development industry really needs to get its house in order. We need to get away from the haphazard, by-guess-and-by-golly estimation and monitoring procedures in use on the majority of projects (whose elaborate Microsoft Project plans and PowerPoint decks bear less and less resemblance to reality as the project progresses), enforce use of the tools and techniques that have been proven to work, and mount an organised, structured quest to research improvements and New Things. Despite what millions of business cards and thousands of job advertisements the world over proclaim, there is no true discipline of "software engineering", any more than there was "oilfield engineering" in widespread use before the New London School explosion of 1937. Nearly 300 people died in that blast; we have software-controlled systems that, should they fail, could hurt or kill many more - or cause significant, company- or industry-ruinous physical damage. We should not wait for such an event before "someone" (at that point, almost certainly an outside governmental or trans-governmental entity) says "These are the rules."

While I understand and agree with the widespread assertion that certification tests in their present form merely demonstrate an individual's ability to do well on such tests, we do need a practical, experiential system - probably one modelled on the existing ones for engineering, law or medicine. Not that people should work 72-hour shifts; there's enough of that already. Rather, there should be a progression of steps from raw beginner to fully-trusted professional, with a mix of educational and experiential ingredients required to ascend that progression, and continuing education and certification processes throughout one's entire career.
The cost for this is going to have to be accepted as "part of the system" by business; if business wants properly competent engineers, and not just the latest boatload of unknowns with mimeographed vendor certs, then they're going to have to realize that that benefit does not come without cost to all sides. The free ride is over - for all the stakeholders at the table.

Tuesday, 11 August 2009

Tutorials, best practices and staying current

A gent by the name of Brian Carey has written a very nice little tutorial on "Creating an Atom feed in PHP", and gotten it published on the IBM DeveloperWorks site. In the space of about ten pages, Brian gives a stratospheric overview of what Atom is and why PHP is a good language for developing Atom-aware apps, and then gets into the tutorial - defining a MySQL database table to hold the data used to 'feed' the Atom feed, and writing code to get the data out and put it into the form that a reader such as NetNewsWire expects an Atom feed to be in.

Now, to be fair, Brian describes himself as "an information systems consultant who specializes in the architecture, design, and implementation of Java enterprise applications", and the paper is clearly meant as a whirlwind tutorial, not to be taken by the careful/experienced reader as necessarily production-quality code. And if this had been published in, say, 2002 or so, I'd have thought it a great how-to for banging out some PHP 4 code. But this is 2009: PHP 4 is a historical artifact, and blogs and industry journals of seemingly every stripe are decrying the poor quality (security, maintainability, etc.) of PHP code... much of which is still written as if it were the turn of the century, ignoring PHP 5's numerous new features and the best practices that both spawned them and grew from them.
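For a taste of what those PHP 5 features buy you, here's a minimal sketch of the same Atom-generation job done PHP 5-style, with DOMDocument building the XML instead of hand-echoed strings, and DateTime handling the dates. Everything here - feed title, URLs, the entry array - is invented for illustration, not taken from Brian's paper:

```php
<?php
// A bare-bones Atom 1.0 feed, PHP 5-style: DOMDocument builds the XML
// (no hand-echoed angle brackets), DateTime handles RFC 3339 dates.
// All feed and entry data is made up for the sake of the example; a
// real feed would pull $entries from the database.
$entries = array(
    array(
        'title'   => 'Hello, Atom',
        'link'    => 'http://example.com/posts/1',
        'id'      => 'http://example.com/posts/1',
        'updated' => 1250000000,           // Unix timestamp, as from a DB
        'summary' => 'A first post.',
    ),
);

$ns  = 'http://www.w3.org/2005/Atom';
$doc = new DOMDocument('1.0', 'utf-8');
$doc->formatOutput = true;

$feed = $doc->appendChild($doc->createElementNS($ns, 'feed'));
$feed->appendChild($doc->createElementNS($ns, 'title', 'Example Feed'));
$feed->appendChild($doc->createElementNS($ns, 'id', 'http://example.com/'));
$feed->appendChild($doc->createElementNS($ns, 'updated',
    date(DATE_ATOM)));                     // DATE_ATOM == RFC 3339 format

foreach ($entries as $e) {
    $entry = $feed->appendChild($doc->createElementNS($ns, 'entry'));
    $entry->appendChild($doc->createElementNS($ns, 'title', $e['title']));
    $link = $entry->appendChild($doc->createElementNS($ns, 'link'));
    $link->setAttribute('href', $e['link']);
    $entry->appendChild($doc->createElementNS($ns, 'id', $e['id']));
    $when = new DateTime('@' . $e['updated']);   // '@' = Unix timestamp
    $entry->appendChild($doc->createElementNS($ns, 'updated',
        $when->format(DATE_ATOM)));
    $entry->appendChild($doc->createElementNS($ns, 'summary', $e['summary']));
}

$xml = $doc->saveXML();
header('Content-Type: application/atom+xml; charset=utf-8');
echo $xml;
```

Because DOMDocument owns the serialisation, escaping and well-formedness stop being your problem - which is rather the point of the whole exercise.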

So what's really wrong with doing things like they did in Ye Olden Tymes™?

  • Procedural code makes it harder to make changes (fix bugs, add features, change to reflect new business rules) and be certain that those changes don't introduce new defects. This is largely because...
  • While it is possible to test procedural code using automatable tools like PHPUnit, it's a lot harder and more complex than testing a clean, object-oriented design.
  • Hard-coding everything, interspersing 'magic values' throughout code, is a major hindrance to future reuse - or to present debugging.
  • Using quick-and-dirty, old-style database APIs exposes you to the risk of input data that is more 'dirty' than it should be - opening the door to SQL injection and other nastiness.
  • Not staying current exposes your code to the risk that it's either using features that have since been deprecated or removed entirely, or (arguably worse), the risk that new features (such as standard library additions) make some of your existing code redundant at best.
Each of these, to varying degrees, is true of much of the PHP code I've read in the last couple of years, including that in the DeveloperWorks paper. For example, the DW paper's code uses the old-style PHP 4 mysql_* API rather than a more abstract/portable layer such as MDB2. And there's the rather terse hand-rolled date3339() function for converting a timestamp to RFC 3339 output format - a job now handled nicely by the standard PHP DateTime class.
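To make those last two points concrete, here's a hedged sketch of the replacements. I've reached for PDO rather than MDB2 (both protect you the same way; PDO ships with PHP 5.1+), and the table and column names are my inventions, not the paper's:

```php
<?php
// Hypothetical replacements for two of the trouble spots above. The
// 'entries' table and its columns are invented; the DW paper's actual
// schema may well differ.

// 1. A parameterised query (PDO shown; MDB2 offers the same protection).
//    The untrusted $since value is bound, never interpolated into the
//    SQL string, closing the injection door that mysql_query() with
//    string concatenation leaves wide open.
function fetch_recent_entries(PDO $pdo, $since)
{
    $stmt = $pdo->prepare(
        'SELECT title, posted FROM entries
          WHERE posted > :since
          ORDER BY posted DESC'
    );
    $stmt->execute(array(':since' => $since));
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// 2. DateTime instead of a hand-rolled date3339(): the DATE_ATOM
//    constant is exactly the RFC 3339 profile that Atom's <updated>
//    element expects.
$when = new DateTime('@1250000000');        // '@' = Unix timestamp, UTC
echo $when->format(DATE_ATOM), "\n";        // 2009-08-11T14:13:20+00:00
```

Two lines of standard library against a screenful of hand-rolled date math; that trade is the whole argument for staying current.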

How would I have done things instead? Read the next few posts in this blog to find out. And, of course, comments are always welcome.

Thursday, 6 August 2009

Smokin' Linux? Roll Your Own!

As people who've encountered the "business end" of Linux have known for some time, the system (in whichever distribution you prefer) greatly rewards (some would say 'requires') tinkering and customisation. This can be done with any Linux system, really; some distros, like LinuxFromScratch and, to a lesser degree, Gentoo and its derivatives, explicitly assume that you will be customizing their base system according to your specific needs.

Now the major distros are getting into it. There have been various Ubuntu and Fedora customisation kits on the Net, but none, as far as I can tell, as directly supported (or as easy to use) as those from OpenSUSE, the "community-supported" offering from Novell (who also offer SUSE Linux Enterprise Desktop and Server).

Visit the OpenSUSE site, and prominently visible is a link to the OpenSUSE Build Service, which "allows developers to package software for all major Linux distributions", or at least those that use rpm packaging, the packaging system used by Red Hat, Mandriva, CentOS, and other similar systems. But that's not all...

SUSE now have a new service, SUSE Studio, which allows users to create highly customized systems based on either the community (OpenSUSE) or enterprise versions of SUSE Linux. These "appliances" can be put together on the basis of "patterns", such as lamp_server (LAMP, or Linux/Apache/MySQL/PHP Web server) or technical_writing (which includes numerous tools like Docbook). You can even supply your own (either self-built or acquired elsewhere) RPM packages to include in the appliance you're building, and SUSE Studio will deal with the dependency matching (warning you if packages are required that aren't either among its standard set or uploaded by you).

Startup scripts, networking, basically anything that is usually handled through the basic installation or post-installation configuration - all can be configured within the SUSE Studio Web interface.

And then, when you've got your system just the way you want it, you can build it as either an ISO (CD/DVD) image to be downloaded and burned onto disc, or as a VM image for two of the most popular VM systems (VMware and Xen).

But wait, there's more...

Using a Flash-enabled browser, you can even "test drive" your appliance, testing it while running (transparently) in an appropriate VM hosted within the SUSE Studio infrastructure. Especially if you have a relatively slow connection, this will let you do preliminary "smoke testing" without having to download the actual image to your local system. Once you're ready to do so, of course, downloading is very nearly a single-click affair. Oh, and you're given (presently) 15 GB of storage for your various builds - so you can easily do comparative testing.

What don't I like about it? In the couple of hours I've been messing around with it today, there's really only one nagging quibble: when you "test drive" your new creation, the page you're running it in is a standard, non-secure http Web page. The page warns you that any data and keystrokes sent will not be encrypted, and recommends the use of ssh if that is a concern (by which most people will think https). But there's no obvious way to switch, and shutting down the running appliance (which has already started by the time you read the warning) itself involves sending keystrokes over that unencrypted connection...

In fairness, this is still very clearly a by-invitation beta offering (but you can ask for an invite), and some rough edges are to be expected. I'm sure I'll run into another one or two as things go on. I'm equally certain that all the major problems will be smoothed out before SUSE Studio goes into general public availability.

So, besides the obvious compulsive hackers and the people building single-purpose appliance-type systems, who would really make use of this?

One obvious use case, which the SUSE Studio site describes, is as a canned demo of a software system. If you're an ISV, you can add your software or Web app to a SUSE Studio appliance, lock down the OS image to suit (encrypting file systems and so on), and hand out your discs at your next trade show (or have them downloadable from your Website). No worries about installing or uninstalling from prospective customers' systems; boot from the CD (or load it into a VM) and they're good to go.

Another thought that hit me this morning was for use as an interview filter. This can be in either of two distinct modes. First, you might be looking for people who are really familiar with how Linux works. Write up the specs of a SUSE Studio appliance (obviously more demanding than just the click-and-drool interface) and use an app of your own devising to validate the submitted entries. This validation could be automated in any of several ways.

The second possible interview filter would be as a programming/Web dev system. As a variation on the "ISV" example above, you load up an appliance with a set of tools and/or source files, ready to be completed and/or fixed by your candidates. They start up the appliance (either a live CD or VM), go through your instructions for the test, and then submit results (probably an encrypted [for authentication] archive of all the files they've touched, as determined by the base system tools) via email or FTP. On your end, you have a script that unpacks the submission into a VM and uses the appropriate automated testing tools to validate it. I can even see this as a business model for someone who offers this capability as a service to companies wishing to have a better filter for prospective candidates than resume-keyword matching - which as we all know is practically useless due to the high number of both false negatives and false positives.

What do you all think?

Tuesday, 4 August 2009

The Debate between Adequacy and Excellence

I was clicking through my various feeds hooked into NetNewsWire, in this case The Apple Core column on ZDNet, when I came across this item, where the writer nicely summed up the perfectly understandable strategy Microsoft have always pursued and compared it with Apple and the Mac. Go read the original article (on Better Living without MS Office) and then read the comment.

As I've commented on numerous times in this blog and elsewhere (notably here), I'm both very partial to open standards (meaning open data formats, but usually expressed in open source implementations) and to the Apple Mac. As I've said before, and as the experience of many, many users I've supported on all three platforms bears out, the Mac lets you get more done, with less effort and irritation along the way, than either Windows or Linux as both are presently constructed.

But the first two paragraphs of this guy's comment (and I'm sorry that the antispam measures on ZDNet apparently don't permit me to credit the author properly) made me sit up and take notice, because they are a great summation of how I currently feel about the competing systems:

The Macs vs. PC debate has been going on for about 25 years or so, but the underlying debate is much older. What we are really discussing is the difference between adequacy and excellence. While I doubt I would want to be friends with Frank Lloyd Wright or Steve Jobs, both represent the exciting belief in what is possible. While Bill Gates and Steve Ballmer rake in billions, their relative impact on the world of ideas is miniscule.

Bill Gates understands that business managers are, on the whole, a practical, albeit uninspired and short-sighted, bunch. By positioning Microsoft early on to ride into the enterprise with the implicit endorsement of one of the biggest, longest-lived, and most influential suppliers of business equipment, Gates was able to secure Microsoft's future. Microsoft's goal has never seemed to me to be to change the world, only to provide a service that adequately meets business needs. Microsoft has also shown from early on a keen awareness that once you get people to use your product, your primary goal is not to innovate to keep your customers, but rather to make leaving seem painful and even scary. Many companies do this, but Microsoft has refined the practice into an art.

He then expands on this theme for four more paragraphs, closing with

Practically speaking, Microsoft is here to stay. But I am glad that Apple is still around to keep the computer from becoming dreary, to inspire people to take creative risks, to express themselves, and to embrace the idea that everyday objects, even appliances like the computer, can be more than just the sum of their functions.

A cry of Aux barricades! it may or may not be, depending on your existing preferences and prejudices. But it does nicely sum up, more effectively and efficiently than I have been able to of late, the reasons why Apple is important as a force in the technology business. Not that Microsoft is under imminent threat of losing its lifeblood to Apple; the two companies' different ways of looking at the world and at the marketplace work against that more effectively than any regulator could. But the idea that excellence is and should be a goal in and of itself, that humanity has a moral obligation to "continually [reach] well past our grasp", should stir passion in anyone with a functioning imagination. Sure, Microsoft have a commanding lead in businesses, especially larger ones - though Apple's value proposition has become much better there in the last ten years or so; it's hard to fight the installed base, especially with an entrenched herd mentality among managers. But, I would argue, that does not mean Apple have failed, any more than the small number of buildings designed by Frank Lloyd Wright and his direct professional disciples argues for his irrelevance in architecture. If nobody pushes the envelope, if nobody makes a habit of reaching beyond his grasp, how will the human condition ever improve? For as Shaw wrote,

The reasonable man adapts himself to the world. The unreasonable man persists in trying to adapt the world to himself. All progress, therefore, depends upon the unreasonable man.

And that has been one of my favourite quotes for many years now.