Showing posts with label howto. Show all posts

Thursday, 18 August 2011

The Yak Shavery; J. Dickey, Proprietor

Starting a new project for a new client, coming up to speed on a bit of kit almost like something I poked around with a ways back, and a ferocious Singaporean flu do not make for a productive week. I'm almost back to where I expected to be by noon on Monday. Since it's noon on Thursday…

What I'm trying to learn and leverage is Chef, an automated-configuration tool for computer systems (mostly servers). If you want to set up a server (or a server farm) reliably and repeatably, Chef, or one of several available alternatives, will make your life much easier. Once you get your tool of choice working, that is.
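To give a flavour of what such a tool buys you: a Chef "recipe" is declarative Ruby describing the state you want, and Chef works out how to converge the system to it. Here's a minimal sketch; the package, service, and template names are purely illustrative, not from any real cookbook:

```ruby
# Hypothetical recipe: ensure NTP is installed, configured, and running.
package "ntp"

template "/etc/ntp.conf" do
  source "ntp.conf.erb"            # an ERB template shipped in the cookbook
  owner  "root"
  mode   "0644"
  notifies :restart, "service[ntp]"
end

service "ntp" do
  action [:enable, :start]
end
```

Run the same recipe on one box or a hundred and you should end up with the same result, which is the whole point of the exercise.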

Step One is, as various Chef amateur tutorials suggest, to start out with a plain-vanilla installation of your OS (Linux, in this case) of choice. Rather obviously, getting from bare metal (or bare VM disk platters, if you'd rather) to the basic working system should be push-button automatic as well. "Fine," you say, "nearly every major distro has its own automation system; surely I should be able to just pick up, say, a Red Hat Enterprise Linux clone like CentOS or Scientific Linux and use a Kickstart file, and it should Just Work. Right?"

Close; but then, "close" only counts for scoring purposes in horseshoes and (arguably) the use of hand grenades. This is neither, though it does have the ability to blow up in our faces with unpleasant consequences.

Putting together a basic Kickstart file to set up a base system (on which we can use Chef to complete installation and configuration; the whole point, remember) is itself pretty straightforward. Except for one "little" thing:

[Screenshot: a rather scary installer dialog warning that the drive must be reinitialized (not my actual screenshot; this one courtesy of Máirín Duffy).]

Hitting the "Reinitialize All" button will let the install finish as expected, but it's still a manual action in a process that's trying to eliminate manual actions.

A bit of Google-fu and some questions on the #centos channel of irc.freenode.net (thanks, Wolfie!) pointed me to the correct option to set when clearing the disk partitions, and all should have been peachy-keen and wonderful from that point on. Except it wasn't, of course.
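For anyone following along at home: if memory of similar setups serves, the relevant knobs are zerombr together with clearpart's --initlabel flag, which tell Anaconda to write a fresh disk label instead of stopping to ask. A Kickstart fragment along these lines (the partitioning choice is illustrative):

```
# Wipe and re-initialize the disks without the scary prompt
zerombr                       # initialize any invalid partition table unattended
clearpart --all --initlabel   # remove all partitions and write a fresh disk label
autopart                      # let the installer lay out a default partition scheme
```

With that in place, the "Reinitialize All" dialog should never appear.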

Tests conducted with two different versions each of three RHEL clones on my VMware Fusion 3.1.3 system all failed with a mysterious "cannot read repodata" fatal error thrown by the installer.

Except, of course, that mounting the DVDs in question and browsing them showed the /repodata directory and contents exactly as they should be. Fast-forward through a day of flailing and fuzzing different options "just to see what happens", and you have a classic yak shave.

So, knowing that I'll have to get the RHEL versions working eventually, and having been previously warned against using Chef-on-Ubuntu as a learning exercise, I'm now spinning up AutoYAST on openSUSE, which is (experientially) what I should have started with from the beginning.

The wall, where my head has banged against it repeatedly, can be repaired. The calendar? Time is the ultimate non-renewable resource.

(Anybody who has any suggestions for Making This Work with RHEL clones, ideally Scientific 6.1 and/or CentOS 6: enlightenment would be Greatly Appreciated.)

Tuesday, 28 September 2010

Don't Waste My Time.

What follows is a critique and a complaint, not a rant. What's the difference, you ask? A rant, in my view, is a primal scream of frustration and near-impotent rage that often doesn't let a few (allegedly) non-essential facts get in the way of a good thrashing. It also takes far more energy than I'm able to summon at present, for reasons that may become clear below.


As I've mentioned previously, I've been kicking the tires on the (relatively new) Kohana 3.0 application framework for PHP Web development. I'd previously used (and enthused about) the 2.3.x ("KO2") release; I was itching to get my hands dirty with 3.0.[1] After a couple of dozen hours spent plinking away at it, however, I'm reevaluating my choices.

Two of the things that endeared the earlier system to me were the good-to-excellent, copious documentation and the active, always-willing-to-help user community (which, quite frankly, are the backbone of any open-source project).

KO3 is an entirely different proposition from KO2. As I write this, the eighth bug-fix update to 3.0 is available on the Web site. Since this is a new "the same, but different" project, all the users are starting fresh, so there can't be the same level of everyone-helps-everyone-else camaraderie as with KO2. That puts a foreseeable, far greater burden on the documentation to help people get productive as quickly and effectively as possible. The documentation, however, remains in what can charitably be called a beta-quality state, in comparison both to the earlier release and to other PHP Web frameworks such as Symfony,[2] CakePHP, FLOW3 and (notoriously) Zend. With most of those frameworks, as with KO2, it was a quick, straightforward process to get from the "hello, world"-style demos, to a useful-but-simple site, to branching out from there. It's taken me four times longer to get half as far with KO3 as with KO2.

Judging by comments in the Kohana support forums, I'm not alone in this; the documentation has been roundly panned by virtually all users who've bothered to comment. There's been far too much defensive "no, it isn't bad, you just need to read a bit more, and use the source, Luke" attitude from the project folks. During the KO2 lifecycle, the attitude I understood from their responses was more along the lines of "we know there are a few problems; we're working on them," quickly followed by "take a look at this." I don't know if 3.0 is so much more complex than 2.x that they simply don't have the bandwidth to document things to their previous standards. Frankly, I don't care any more.

I've decided that my existing projects started in Kohana 2.3 will remain there, possibly moving to 2.4 when it becomes the officially supported release. But I do not plan to invest any more time and effort in Kohana 3.0, at least until the new project has had some time to mature. I fully recognise the potentially self-defeating attitude inherent in that viewpoint: virtually any open-source project depends on its community to plug the holes that the "official" project maintainers don't have time for or deliberately leave as open questions for the community. Well-run community projects are extremely collaborative, communication-rich environments.

Other projects take a "vendor/product" approach, essentially saying "Here's everything you'll need, soup-to-nuts; we'd really appreciate it if you built around the edges and took things deeper in your direction of interest, but the core that we're responsible for is solid." Those "vendors", the Zends and Sensio Labs of the world, have rich open-source offerings that they use as a platform to build (or offer) what actually pays the bills.

While I have a strong philosophical and experiential preference for community-driven (or at least community-oriented) projects, there are times when I just want to Get Things Done rather than embark on an endless voyage of discovery.[3] At those times I'll reach for something that "just works" well enough for me to accomplish whatever I'm trying to do at the moment, whether that's writing a book or bringing up a client's Web site. I know and accept that any new tool, or new version of a tool I've used previously, will require a certain amount of time to get up to speed on. I don't know everything (or necessarily anything) before I learn it; the most I can hope for (and what I really do insist on) is that things make sense within the logical and semantic frameworks[4] they're expressed in, and that that expression is accessible, complete and correct enough that I can feel I'm making progress. This invariably involves some sort of documentation, whether a "conventional" or online manual, a wiki, or a user forum; the format matters less to me than the attributes I mentioned earlier.

Kohana 3.0, as it currently stands, does not meet that standard for me. And so I'm back in a feverish learn-and-evaluate mode with at least two other frameworks. I have projects to finish, and several chapters of a book I'm working on that were initially written around Kohana 2 and will now need to be substantially redone.[5]

I intend to give each of those new candidate frameworks the same amount of time that it took me to get productive in Kohana 2.x (which was significantly less than the "industry leader," as I've previously mentioned). This is going to be an interesting week and a half or so, in the Chinese-curse sense of the word.

Footnotes:

1. Technical reasons for moving to KO3 included the switch from a model-view-controller (MVC) architecture to hierarchical MVC (or HMVC); if you know what these mean, you should know this is a Very Big Deal™. Additionally, I've found it a Very Bad Thing to tie myself to a legacy (obsolescent) project, and the end of KO2 is being made very plain on the Kohana Web site.

2. If the new, pre-release Symfony 2 code is as good as the documentation, we're in for a real treat.

3. I am typing this on a Mac rather than a Linux box, after all (though I have three Linux VMs running at the moment).

4. This implies, of course, that there are "logical and semantic frameworks" sufficiently visible, understandable and relevant to the task at hand.

5. I certainly don't want to fall into the same trap as a previous series of PHP books, which relied on the obsolete and inadequate Ulysses framework. Ulysses has been justly and productively panned, which has to reflect poorly on the aforementioned book (which I happen to own) and its authors.

Wednesday, 26 May 2010

Beating Your Head Against the Wall, Redux

...or, the "Monty Python and the Holy Grail" monks' guide to making your Mac desktop work like a server, instead of going and getting a copy of OS X Server like you should...

Mac OS X Server brings OS X simplicity and Unix power to a range of hardware systems. Most of the things that Server makes trivially simple can be done with the OS X desktop. Some of them, however, require the patience of Job and the ingenuity and tenacity of MacGyver... or so they at first appear.

One such task is installing Joomla!, a nice little Web CMS with handy features for users, developers and administrators alike. On most Unix-like systems, or even recent versions of Microsoft Windows, installation is a very straightforward process for any system that meets the basic requirements (Web server, PHP, MySQL, etc.) as documented in the (PDF) Installation Manual or the PDF Quick Start guide. On most systems, it takes only a few minutes to breeze from completed download to logging into the newly installed CMS.

The OS X desktop, as I said, is a bit different. This isn't a case of Apple's famous "Think Different" campaign so much as it appears to be a philosophical conflict between Apple's famous ease-of-use as applied to user and rights management, coming up against traditional Unix user rights management. Rather than the former merely providing a polished "front end" interface for the latter, some serious mind-games and subversion are involved. And I'm not talking about the well-known version control software.

When things go wrong with the simple installation process of a well-known piece of software, Google is usually Your Best Friend. If you search for joomla mac install, however, you quickly notice that most of the hits discussing the OS X desktop recommend installing a second Apache, MySQL and PHP stack in addition to the one that's already in the system — packages such as XAMPP, MAMP and Bitnami. While each of these packages appears to do just what it says on the tin, I don't like having duplicate distributions of the same software (e.g., MySQL) on the same system.

Experience and observation have shown that that's a train wreck just begging to happen. Why? Let's say I've got the "Joe-Bob Briggs Hyper-Extended FooBar Server" installed on my system. (System? Mac, PC, Linux; doesn't matter for this discussion.) When FooBar bring out a new release of their server (called the 'upstream' package), Joe-Bob has to get a copy of that, figure out what's changed from the old one, and (try to) adapt his Hyper-Extended Server to the new version. He then releases his Hyper-Extended version, quite likely some time behind the "official" release. "What about betas," you ask? Sure, Joe-Bob should have been staying on top of the pre-release release cycle for his upstream product, and may well have had quite a lot of input into it. But he can't really release his production Hyper-Extended Server until the "official" release of the upstream server. Any software project is subject to last-minute changes and newly-discovered "show-stopper" issues; MySQL 5.0 underwent 90 different releases. That's a lot for anybody to keep up with, and the farther away you get from that source of change (via repackaging, for example), the harder it is to manage your use of the software, taking into account features, security and the like.

So I long ago established that I don't want that sort of duplicative effort for mainstream software on modern operating systems. (Microsoft Windows, which doesn't have most of this software on its default install, is obviously a different problem, but we're not going there tonight.) It's too easy to have problems with conflicts, like failing to completely disable the "default" version of a duplicated system, to inject that kind of complexity into a system needlessly.

That isn't to say that it doesn't become very attractive sometimes. Even on a Mac OS X desktop — where "everything" famously "just works" — doing things differently than the "default" way can lead to great initial complexity in the name of avoiding complexity down the line. (Shades of "having to destroy the village in order to save it.")

The installation of Joomla! went very much as the (PDF) Installation Manual said it should... until you get to the screen that asks for local FTP credentials giving access to the Joomla! installation directory. Setting up a sharing-only user account on the system would appear to suffice, and in fact several procedures documenting this for earlier versions of Mac OS X describe doing just that. One necessary detail is different under 10.6.2, however: the "Accounts" item in System Preferences no longer allows specifying a per-user command shell... or, if it does, it's very well hidden.

Instead, I created a new regular, non-administrative user for Joomla!. I then removed the new user's home directory (under /Users) and created, in its place, a symlink to the Joomla! installation directory.
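In shell terms, that second step looks something like the sketch below. The paths here are stand-ins for illustration; on a real system you'd substitute the FTP user's home (e.g. /Users/joomla) and wherever the Joomla! tree lives under your Web root, and run the commands with sudo:

```shell
# Stand-in paths; substitute the real FTP user's home and Joomla directory.
JOOMLA_DIR="$PWD/joomla"        # hypothetical Joomla! installation directory
FTP_HOME="$PWD/joomla-home"     # stand-in for the FTP user's home directory

mkdir -p "$JOOMLA_DIR"
rm -rf "$FTP_HOME"              # drop the auto-created home directory...
ln -s "$JOOMLA_DIR" "$FTP_HOME" # ...and symlink it to the Joomla! tree
ls -ld "$FTP_HOME"
```

After this, FTP logins for that user land directly in the Joomla! directory, which is all the installer actually needs.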

Also, one difference between several of the "duplicating" xAMP systems I mentioned above and the standard Apache Web server on OS X (and Linux) is that in the default system, access to served directories is disabled by default; the idea is that you define a new Apache <Directory> directive for each directory or application you install. Failing to do this properly and completely will result in Apache 403 ("Forbidden") errors. Depending on your attitude toward security, you can either continue doing that, or change the default httpd.conf setting to Allow from all and use .htaccess files to lock down specific directories.
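For the per-application approach, a stanza in httpd.conf would look roughly like this under the Apache 2.2 that Snow Leopard ships; the path is illustrative, not the one from my setup:

```
<Directory "/Library/WebServer/Documents/joomla">
    # FollowSymLinks matters if, as above, a symlink points into this tree
    Options FollowSymLinks
    # Let Joomla!'s own .htaccess rules take effect
    AllowOverride All
    # Apache 2.2-style access control, permitting access to this directory
    Order allow,deny
    Allow from all
</Directory>
```

Remember to restart Apache (via the Web Sharing checkbox or apachectl) after editing the file.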

Once you have the underlying requirements set up (and FTP access is the only real think-outside-the-box issue), Joomla! should install easily. But if you're in a hurry and just trying to follow the documented procedures, you're quite likely to spend considerable time wondering why things don't Just Work.

And no, neither Fedora nor Ubuntu Linux "does the right thing" out of the box either. At least, not in my tests.