
Tuesday, 27 April 2010

Let's Do The Time Warp Agai-i-i-i-n!! (Please, $DEITY, no...)

For those who may somehow not be aware of it, LinkedIn is a (generally quite good) professionally-oriented social-networking site. It is not Facebook, fortunately; it's not geared towards teenagers raving about the latest corporate boy band du jour. It can often be, however, a great place to network with people from a variety of vocational, industry and functional backgrounds: to make contacts, share information, and so on.

One of the essential features of LinkedIn is its groups, which are primarily used for discussions and job postings. In the venerable Usenet tradition, these discussions can have varying levels of insightful back-and-forth, or they can degenerate into a high-fidelity emulation of the "Animal House" food fight. As with Usenet, they can often give the appearance of doing both at the same time. Unlike Usenet, one has to be a member of LinkedIn to participate.

One of the (several) groups I follow is LinkedPHPers, which bills itself as "The Largest PHP Group" on LinkedIn. Discussions generally fall into at least one of a very few categories:

  • How do I write code to solve "this" problem? (the 'professional' version of "Help me do my homework");

  • What do people know/think about "this" practice or concept?

  • I'm looking for work, or people to do work; does anybody have any leads?

As veterans of this sort of discussion would expect, the second type of discussion can lead to long and passionate exchanges with varying levels of useful content (what became known on Usenet as a "flame war"). The likelihood of such devolution seems to be inversely proportional to the specificity of the question, and directly proportional to the degree to which the concept in question is disregarded by, unfamiliar to, or unknown to those with an arguable grasp of their Craft.

It should thus be no surprise that a discussion on the LinkedPHPers group of "Procedural vs Object Oriented PHP Programming" would start a flame war for both of the above reasons. With 58 responses over the past month as I write this, there are informational gems of crystal clarity buried in the thick, gruesome muck of proud ignorance. As Abraham Lincoln is reported to have said, "Better to remain silent and be thought a fool than to speak out and remove all doubt."

What's my beef here? Simply that this discussion thread is re-fighting a war that was fought and settled a quarter-century ago in the programming world at large. The reality is that any language with a reasonable implementation of OOP (encapsulation/access control, polymorphism and inheritance, in that order of importance by my reckoning) should be used that way.

Several of the posts trot out the old canard about a performance 'penalty' for using OOP. In practice, that penalty shows up only in the sharpest edge cases: simple, tiny, standalone classes that should never have been written as classes in the first place, because they don't provide a useful abstraction of a concept within the solution space. Such classes are generally produced by developers who lack professional grounding in the concepts involved, quite often by copying and pasting code they don't understand into projects they also don't understand. That crowd sharply limited the evolution and adoption of C++ in the '80s and '90s, and many of their ideological brethren have made their home in Web development using PHP.
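
If you doubt how small that overhead actually is, a micro-benchmark along the following lines settles it. This is a minimal sketch (the names and loop count are invented for illustration, and absolute timings will vary with PHP version and hardware), but the shape of the result doesn't change: the per-call difference is invisible next to a single database query.

    <?php
    // Sketch: timing a plain function call against an equivalent
    // method call, a million times each.
    function add_plain($a, $b) { return $a + $b; }

    class Adder {
        public function add($a, $b) { return $a + $b; }
    }

    $n = 1000000;

    $start = microtime(true);
    for ($i = 0; $i < $n; $i++) { add_plain($i, $i); }
    $procedural = microtime(true) - $start;

    $adder = new Adder();
    $start = microtime(true);
    for ($i = 0; $i < $n; $i++) { $adder->add($i, $i); }
    $oop = microtime(true) - $start;

    // Both loops complete in a fraction of a second; the gap between
    // them is a rounding error in any real Web request.
    printf("procedural: %.4fs  method: %.4fs\n", $procedural, $oop);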

Yes, I know that "real" OOP in PHP is a set of tacked-on features, late to the party; first seriously attempted in PHP 4, with successively evolving implementations in 5.0, 5.2 and 5.3, with the semi-mythological future PHP 6 adding many new features. I know that some language features are horribly unwieldy (which is why I won't use PHP namespaces in my own code; proven idea, poor implementation). But taken as a whole, it's increasingly hard to take the Other Side ("we don' need no steeeenkin' objects") at all seriously.

The main argument for ignoring the "ignore OOP" crowd is simply this: competent, thoughtful design using OOP gives you the ability to know and prove that your code works as expected, and that data is accessed or modified only in the places and ways intended. OOP makes "software as building blocks" practical, an idea whose lineage runs back to the Simula language of the mid-1960s. It also enables modern software proto-engineering practices such as iterative development and continuous integration, "best practices" that have been proven in the field to increase quality and decrease risk, cost and complexity.
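
To make the encapsulation point concrete, here is a minimal sketch (the class and its rule are invented for this example, not taken from any real project). The only way to change the balance is through a method that enforces the rule, so when something goes wrong you know exactly where to look:

    <?php
    // Sketch: private state plus a validating method means the data
    // can only change in the places and ways intended.
    class Account {
        private $balance = 0;   // invisible outside this class

        public function deposit($amount) {
            if (!is_numeric($amount) || $amount <= 0) {
                throw new InvalidArgumentException('Deposit must be positive');
            }
            $this->balance += $amount;
        }

        public function getBalance() {
            return $this->balance;
        }
    }

    $account = new Account();
    $account->deposit(100);
    // $account->balance = -500;   // fatal error: $balance is private
    echo $account->getBalance();   // prints 100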

The 'ignore OOP in PHP' crowd like to point to popular software that was done in a non-OOP style, such as Drupal, a popular open-source Web CMS. But Drupal is a very mature project, by PHP standards; the open-source project seems to have originated in mid-2000, and it was apparently derived from code written for a project earlier still. So the Drupal code significantly predates PHP 5, if not PHP 4 (remember, the first real whack at OOP in PHP). Perusing the Drupal sources reveals an architecture initially developed by some highly experienced structured-programming developers (a precursor discipline to OOP); their code essentially builds a series of objects by convention, not depending on support in the underlying language. It is a wonder as it stands – but I would bet heavily that the original development team, if tasked with re-implementing a Web CMS in PHP from a blank screen, would use modern OO principles and the underlying language features which support them.
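
For readers who haven't browsed such a codebase, the pattern looks roughly like this. This is a hypothetical sketch, not actual Drupal code: state travels in an array, a function-name prefix stands in for a class, and only discipline keeps callers from reaching inside.

    <?php
    // Sketch: an "object by convention" in procedural style...
    function node_create($title) {
        return array('title' => $title, 'published' => false);
    }

    function node_publish(&$node) {
        $node['published'] = true;   // nothing stops callers doing this directly
    }

    // ...and the same idea once the language supports it: the
    // convention becomes a contract the interpreter enforces.
    class Node {
        private $title;
        private $published = false;

        public function __construct($title) { $this->title = $title; }
        public function publish() { $this->published = true; }
    }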

And why would such "underlying language features" exist and evolve, especially in an open-source project like PHP, if there were not a real, demonstrable need for them? Saying you're not going to do OOP when using PHP is metaphorically akin to saying you intend to win a Formula One race without ever shifting above second gear.

Good luck with that. You might want to take a good, hard look at what your (more successful) colleagues are doing, adopt what works, and help innovate your Craft further. If you don't, you'll continue to be a drag on progress, a dilettante intent upon somehow using a buggy whip to accelerate your car.

It doesn't work that way anymore.

Monday, 4 January 2010

The Audacity of Hype

This is the content of a comment I posted on a TechCrunch article, "The World Doesn't Need Someone Telling Us What We Don't Need In Tech". For what it's worth, I hope the iSlate/iPad/iTablet/iWhatever is real this time; if we get another Charlie Brown, Lucy and the Football moment, I think the market will see it as one of Apple's rare blown opportunities - to the tune of maybe $50 off the share price.

I'd be interested in your thoughts.


I also disagree with the "comfort zone" limitation idea; cf. Shaw and the Unreasonable Man. Apple's successes have all been revolutionary innovations; the few times a "me too" product has come out of Cupertino (Apple III, Lisa), it has sunk at high warp speed.

If the iPad is even halfway affordable (by Apple standards, anyway), they'll sell like iPods on steroids; I'd certainly buy one. Even with a big, beautiful iMac and a trusty MacBook Pro, I can see several useful roles for an iPad. More-convenient-than-a-laptop media browsing and ebook reading is the one everybody latches on to. But I could take an iPad onto the train every morning and get stuff done, standing in a lurching carriage filled with people from cultures that don't know how to move in crowds efficiently. I could take it into conference-room meetings and hallway huddles more easily and effectively than a full-clamshell laptop. And it would probably be easier to gather a couple of people and run through a quick presentation on an iPad than on a laptop, too. (Think "elevator pitch.")

Sure, it would fill a niche and scratch a few itches that I already know about. But for it to be truly revolutionary, it will need to find uses that I/we can't really predict yet, because we've never really had something capable of filling them - we've just sort of worked around limitations we weren't really aware of. (See 'iPhone' for previous example.) I've got more faith in Apple to pull something that revolutionary out of their hats than I do in any other tech company today — or in the 30 years I've been in this industry.

We ain't seen nothin' yet.

Monday, 17 August 2009

We Interrupt This Program...

We interrupt this tutorial to interject an observation about goals, methods and promises: the goals we have for ourselves as people and as professionals; the methods we use to pursue them; and, perhaps most importantly, the promises we make, both to ourselves and to our customers, about what we're doing and why.

I consider this after reading the Website of some consultants who've done some (relatively nice, relatively low-key) link-dropping on LinkedIn. I'm not naming them here, not to spare them the attention (their own site is very clean and well-done), but because the point I'm going to make isn't limited to them: we, as a craft and an industry of Web development, have some Serious Problems™.

The "problem" is nicely summarized by this group's mission statement:

Our mission is to produce the perfect implementation of your dreams.

What could possibly be wrong with that?

As a goal, implied but left unspoken, absolutely nothing; both as practitioners and as clients, we tend to set ever-higher goals for ourselves. Indeed, that's the only way the "state of the art" - any "art" - can advance. But we who practice the art and craft of software (including Web) development (as opposed to the engineering discipline of hardware development) have a history of slashed-beyond-reality schedules and budgets coupled with a tendency for stakeholders not to hear "if all goes well" as a condition to our latest schedule estimate. We have a history, perceived and actual, of promising more than we can deliver. Far more attention is paid by non-technical people to the "failures" and "broken promises" of software than to things done right. For a craft whose work is accruing increasing public-policy and -safety implications, the effect of unrealistic expectations, brought about by poor communication and technical decisions being made by people who aren't just technically ignorant but proud of the fact, is disturbing. What started as a slow-motion train wreck has now achieved hypersonic speeds, and represents a clear and present danger to the organisational health and safety of all stakeholders.

I don't mean to say that projects always fail, but an alarming number of them do. If, say, dams or aircraft were built with the same overall lack of care and measurable engineering precision that is the norm in commercial software development, we'd have a lot more catastrophic floods, and a lot fewer survivors fleeing the deluge by air. When I entered this craft thirty years ago (last May), I was soon led to believe that we were thirty to fifty years away from seeing a true profession of "software engineering". Restarting that clock now, in 2009, I think even that estimate is almost laughably optimistic.

Why have things gotten worse when we as a tool-building and -using society need them to get better? Some people blame "The Microsoft Effect" - shipping software of usually-dubious quality to consumers (as opposed to 'customers') who have bought into the (false) idea that they have no realistic choice.

It's more pervasive than that; commercial software development merely reflects the fashion of the business "community" that supports it, which has bought into one of the mantras of Guy Kawasaki's "The Art of Innovation", namely "don't worry, be crappy." Not that Kawasaki is giving bad advice, but his precondition is being ignored, just as those of other software people have been. The key sentence in his "don't worry, be crappy" paragraph is "An innovator doesn't worry about shipping an innovative product with elements of crappiness if it's truly innovative" (emphasis mine).

In other words, if you really are going to change the world, nobody will notice if your Deus ex Machina 1.0 has clay feet, as long as you follow up quickly with a 1.1 that doesn't... and follow that with a 2.0 that changes the game again. But that space between 1.0 and 1.1 has to be short, Kawasaki argues (in the next paragraph, titled "Churn, Baby, Churn"), and the version after that has to come along before people (like possible competitors) start saying things like "well, he just brought out 1.1 to fix the clay feet in 1.0." If customers see that you're bringing out new versions as fast as they can adapt to the previous ones, and that each new version is a vastly superior, revelatory experience compared to the release they were already delighted by, they'll keep giving you enough money to finish scaling the "revolutionary" cliff and take a (brief) rest with "evolutionary" versions.

Business has not only forgotten how important that whole process is to its continued survival; it has removed the capability for its bespoke software (and Web) infrastructure to use and reuse that model. All that remains is "it's OK if we ship crap; so does everybody else." That's the kind of thinking that made General Motors the world-bestriding Goliath it is today - as opposed to the wimpy also-ran it was (emphatically NOT) half a century ago. We really don't need any more businesses going over that sort of cliff.

What we do need, and urgently, are two complementary, mutually dependent things. We need a sea change in the attitude of (most) businesses, even technology businesses, towards software - to realise and acknowledge that the Pointy-Haired Boss is not merely a common occurrence in the way business manages software, but actively threatens the success of any project (and business) so infested. Just as businesses at some point realise that "paying any price to cut costs" is an active threat to their own survival, they need to apply that reality to their view of and dealings with the technical infrastructure that increasingly enables their business to function at all.

Both dependent on that change and as an enabler of it, the software and Web development industry really needs to get its house in order. We need to get away from the haphazard, by-guess-and-by-golly estimation and monitoring procedures in use on the majority of projects (whose elaborate Microsoft Project plans and PowerPoint decks bear less and less resemblance to reality as the project progresses), enforce use of the tools and techniques that have been proven to work, and mount an organised, structured quest to research improvements and New Things.

Despite what millions of business cards and thousands of job advertisements the world over proclaim, there is no true discipline of "software engineering", any more than there was "oilfield engineering" in widespread use before the New London School explosion of 1937. Nearly 300 people died in that blast; we have software-controlled systems that, should they fail, could hurt or kill many more - or cause significant, company- or industry-ruinous physical damage. We should not wait for such an event before "someone" (at that point, almost certainly an outside governmental or trans-governmental entity) says "these are the rules."

While I understand and agree with the widespread assertion that certification tests in their present form merely demonstrate an individual's ability to do well on such tests, we do need a practical, experiential system - probably one modelled on the existing ones for engineering, law or medicine. Not that people should work 72-hour shifts; there's enough of that already. Rather, there should be a progression of steps from raw beginner to fully-trusted professional, with a mix of educational and experiential ingredients required to ascend that progression, and continuing education and certification throughout one's entire career. The cost of this is going to have to be accepted as "part of the system" by business; if business wants properly competent engineers, and not just the latest boatload of unknowns with mimeographed vendor certs, then it will have to realise that that benefit does not come without cost to all sides. The free ride is over - for all the stakeholders at the table.

Tuesday, 4 August 2009

The Debate between Adequacy and Excellence

I was clicking through the various feeds hooked into NetNewsWire, in this case The Apple Core column on ZDNet, when I came across this item, in which the writer nicely sums up the perfectly understandable strategy Microsoft have always pursued and compares it with Apple and the Mac. Go read the original article (on Better Living without MS Office) and then read the comment.

As I've commented on numerous times in this blog and elsewhere (notably here), I'm both very partial to open standards (meaning open data formats, but usually expressed in open source implementations) and to the Apple Mac. As I've said before, and as the experience of many, many users I've supported on all three platforms bears out, the Mac lets you get more done, with less effort and irritation along the way, than either Windows or Linux as both are presently constructed.

But the first two paragraphs of this guy's comment (and I'm sorry that the antispam measures on ZDNet apparently don't permit me to credit the author properly) made me sit up and take notice, because they are a great summation of how I currently feel about the competing systems:

The Macs vs. PC debate has been going on for about 25 years or so, but the underlying debate is much older. What we are really discussing is the difference between adequacy and excellence. While I doubt I would want to be friends with Frank Lloyd Wright or Steve Jobs, both represent the exciting belief in what is possible. While Bill Gates and Steve Ballmer rake in billions, their relative impact on the world of ideas is miniscule.

Bill Gates understands that business managers are, on the whole, a practical, albeit uninspired and short-sighted bunch. By positioning Microsoft early on to ride into the enterprise with the implicit endorsement of one of the biggest, longest-lived, and most influential suppliers of business equipment, Gates was able to secure Microsoft's future. Microsoft's goal has never seemed to me to be to change the world, only to provide a service that adequately meets business needs. Microsoft has also shown from early on a keen awareness that once you get people to use your product, your primary goal is not to innovate to keep your customers, but, rather, to make leaving seem painful and even scary. Many companies do this, but Microsoft has refined this practice into an art.

He then expands on this theme for four more paragraphs, closing with

Practically speaking, Microsoft is here to stay. But I am glad that Apple is still around to keep the computer from becoming dreary, to inspire people to take creative risks, to express themselves, and to embrace the idea that everyday objects, even appliances like the computer, can be more than just the sum of their functions.

A call aux barricades it may or may not be, depending on your existing preferences and prejudices. But it does nicely sum up, more effectively and efficiently than I have managed of late, the reasons why Apple is important as a force in the technology business. Not that Microsoft is under imminent threat of losing their lifeblood to Apple; the two companies' different ways of looking at the world and at the marketplace work against that more effectively than any regulator could. But the idea that excellence is and should be a goal in and of itself, that humanity has a moral obligation to "continually [reach] well past our grasp", should stir passion in anyone with a functioning imagination. Sure, Microsoft have a commanding lead in businesses, especially larger ones, though Apple's value proposition has become much better there in the last ten years or so; it's hard to fight the installed base, especially with an entrenched herd mentality among managers. But, I would argue, that does not mean Apple have failed, any more than the small number of buildings designed by Frank Lloyd Wright and his direct professional disciples argues for his irrelevance in architecture. If nobody pushes the envelope, if nobody makes a habit of reaching beyond his grasp, how will the human condition ever improve? For as Shaw wrote,

The reasonable man adapts himself to the world. The unreasonable man persists in trying to adapt the world to himself. All progress, therefore, depends upon the unreasonable man.

And that has been one of my favourite quotes for many years now.

Saturday, 15 September 2007

Crap Is Not a Professional Goal

I recently, very briefly, worked with a Web startup based in Beijing (the "Startup"). The CEO of this company, an apparently very intelligent, focussed individual with great talent motivating sales and marketing people, takes as his guiding principle a quote from Guy Kawasaki, "Don't worry, be crappy".

The problem with that approach for the Startup, as I see it, is two-fold. To start, Kawasaki makes clear in his commentary that he's referring strictly to products that are truly innovative breakthroughs, exemplars of a whole new way of looking at some part of the world. Very few products or companies meet that standard (and even if yours does, Kawasaki declares, you should eliminate the crappiness with all possible speed). No matter how simultaneously useful and geeky the service offered by the Startup's site is, it is, at best, a novel and useful twist resting on several existing, innovative-in-their-day technologies. (A friend to whom I explained the company commented that "it sounds as innovative as if Amazon.com only sold romance novels within the city limits of Boston" — hardly a breakthrough concept.) Indeed, Kawasaki makes clear that he's talking about order-of-magnitude innovation; the examples he cites are the jump from daisy-wheel to laser printing and the Apple Macintosh.

The second, more insidious, problem with the approach, and the trap that many early-1990s Silicon Valley startups fell into, is that you take crappiness as a given, without even trying to deliver the one-two punch of true innovation and a sublimely well-engineered product that immediately raises the bar for would-be "me too" copycats. Sony, for instance, has traditionally excelled at this, as has the Apple iPod (which learned from the mistakes Kawasaki cites for the earliest Macintosh). Deliver crap, and anybody can compete with you once they understand the basics of your product; the wireless mouse is a good example.

If you tell yourself from the get-go that you'll be satisfied if you ship a 70%-quality product, what will happen is that, as time goes by, that magical 70% becomes 50%, then 30%, then whatever it takes to meet the date you told the investors. And if management doesn't trust engineering to give honest, realistic estimates (as is typical in software and pandemic in startups), you have a recipe for disaster: engineering takes a month to come back with an estimate that development will take 12-18 months; management hears '12' and automatically cuts that to 6 and pushes to have a "beta" out in 4. The problem is that, if you're dealing with an even marginally innovative product, things are not cut-and-dried; the engineers will have misunderstood some aspects of the situation, underestimated certain risks, and been completely blind to others. This was pithily summed up, in another field entirely, by Donald Rumsfeld:

There are known "knowns." There are things we know that we know. There are known unknowns. That is to say there are things that we now know we don't know. But there are also unknown unknowns. There are things we don't know we don't know. So when we do the best we can and we pull all this information together, and we then say well that's basically what we see as the situation, that is really only the known knowns and the known unknowns. And each year, we discover a few more of those unknown unknowns.

Companies that simultaneously forget the ramifications of this while taking too puffed-up a view of themselves are leaving themselves vulnerable to delivering nothing more useful or profitable than the old pets.com (not the current PetSmart) sock puppet — and probably nothing that memorable, either.

And that further assumes that they don't fall prey to preventable disasters like losing the only hard drive running a production server built with non-production, undocumented software. If Web presence defines a "Web 2.0" company in the eyes of its customers and investors, going dark could be very costly indeed.