
Monday, 16 August 2010

Proof Against Most Idiots

Fair warning: Geekish laudatory rant ahead. Still with me? Good!

I suppose it says something less than complimentary about our Craft that, when things actually work in a sensible fashion, recovering from Stupid User Actions™, it's surprising enough to be noteworthy. But I seriously doubt that I'm the only one who's noticed this.

Case in point: yesterday, I started downloading the Debian Linux testing-version DVDs. Yes, plural; there are 8 of them, as the standard software repository comes along with the test build. Anyway...

As most of us do, I was multitasking pretty heavily last night; when I knocked off, I just put my iMac to sleep as usual. It was long after I had gone to bed that I remembered, "hey, wasn't I downloading a passel of Linux DVDs when I slept the system?"

Expecting several "interesting" error messages when I woke the system (if in fact the file system hadn't somehow been damaged... yes, I was used by Windows for far too long), I woke it up and was about to log in when BAM!... 15 minutes of phone calls. Thoroughly distracted afterwards, I log in, check my email, read a couple of new bug reports for a project I'm working on, and then remember: "hey! everything's working!"

I switch to the Terminal window for the download (which was using GNU wget 1.12, if you care), to be greeted by the following display (excerpted):

--2010-08-15 19:45:15--  http://cdimage.debian.org/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Resolving cdimage.debian.org (cdimage.debian.org)... 130.239.18.163, 130.239.18.173
Connecting to cdimage.debian.org (cdimage.debian.org)|130.239.18.163|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso [following]
--2010-08-15 19:45:20--  http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Resolving caesar.acc.umu.se (caesar.acc.umu.se)... 2001:6b0:e:2018::142, 130.239.18.142
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)|2001:6b0:e:2018::142|:80... failed: No route to host.
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)|130.239.18.142|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4696719360 (4.4G) [application/octet-stream]
Saving to: “debian-testing-amd64-DVD-6.iso”

89% [===============================================================================================>           ] 4,214,102,016 --.-K/s   in 13h 40m 

2010-08-16 09:26:01 (83.6 KB/s) - Read error at byte 4214102016/4696719360 (Operation timed out). Retrying.

--2010-08-16 09:26:04--  (try: 2)  http://caesar.acc.umu.se/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
Connecting to caesar.acc.umu.se (caesar.acc.umu.se)|130.239.18.142|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 4696719360 (4.4G), 482617344 (460M) remaining [application/octet-stream]
Saving to: “debian-testing-amd64-DVD-6.iso”

100%[++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++==========>] 4,696,719,360 1.26M/s   in 6m 26s  

2010-08-16 09:32:50 (1.19 MB/s) - “debian-testing-amd64-DVD-6.iso” saved [4696719360/4696719360]

Note the report of a read error at 09:26:01 on 2010-08-16. The software was complaining that it had to recover from a timeout. Yes, I'm sure ten hours or so significantly exceeds whatever timeout value had been coded into wget... but it never missed a beat (or a byte; the SHA1 checksum matched afterwards).
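
For the curious, that recovery is wget quietly issuing an HTTP range request for the missing bytes (note the "206 Partial Content" response above). My understanding is that the same trick works across separate invocations, too: had the process died outright, something like the following should pick up where it left off (a sketch; consult your own wget manual before trusting my memory of the flags):

wget -c --tries=0 http://cdimage.debian.org/cdimage/weekly-builds/amd64/iso-dvd/debian-testing-amd64-DVD-6.iso
openssl sha1 debian-testing-amd64-DVD-6.iso

Here -c (--continue) resumes from the existing partial file, --tries=0 retries indefinitely, and the openssl line computes the SHA1 to check against the SHA1SUMS file Debian publishes alongside the images.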

I'm including the obligatory "don't try this at home, kids!" warning... but, if you aren't sitting in front of Windows, isn't it nice to know you can?

Wednesday, 23 July 2008

Differences that Make Differences Are Differences

(as opposed to the Scottish proverb, "a difference that makes no difference, is no difference")

This is a very long post. I'll likely come back and revisit it later, breaking it up into two or three smaller ones. But for now, please dip your oar in my stream of consciousness.

I was hanging around on the Freenode IRC network earlier this evening, in some of my usual channels, and witnessed a Windows zealot and an ABMer going at it. Now, ordinarily, this is as interesting as watching paint dry and as full of useful, current information as a 1954 edition of Правда. But there was one bit that caught my eye (nicknames modified for obfuscation):

FriendOfBill: Admit it; Microsoft can outmarket anybody.
MrABM: Sure. But marketing is not great software.
FriendOfBill: So?
MrABM: So... on Windows you pay for a system and apps that aren't worth the price, on Linux you have free apps that are either priceless or worth almost what you pay (but you can fix them if you want to), and on the Mac, you have a lot of inexpensive shareware that's generally at least pretty good, and commercial apps that are much better. THAT's why Microsoft is junk... they ship crap that can't be fixed by anyone else.
FriendOfBill: So you're saying that the Linux crap is good because it can be fixed, and the Mac being locked in is OK because it's great, but Windows is junk because it's neither great nor fixable?
MrABM: Exactly. Couldn't have said it better myself.

Now...that got me to thinking. Both of these guys were absolutely right, in my opinion. Microsoft is, without question, one of the greatest marketing phenomena in the history of software, if not of the world. But it is unoriginal crap. (Quick: Name one successful Microsoft product that wasn't bought or otherwise acquired from outside. Internet Explorer? Nope. PowerPoint? Try again.) Any software system that convinces otherwise ordinary people that they are "stupid" and "unable to get this 'computer' thing figured out" is not a net improvement in the world, in my view. I've been using and developing for Windows as long as there's been a 'Windows'; I think I've earned the opinion.

Linux? Sure, which one? As Grace Hopper famously might have said, "The wonderful thing about standards is that there are so many of them to choose from." (Relevant to The Other Side: "The most dangerous phrase in the language is, 'We've always done it this way.'") As can be easily demonstrated at the DistroWatch.com search page, there are literally hundreds of active "major" distributions; the nature of Free Software is such that nobody can ever know with certainty how many "minor" variants there are (the rabbits in Australia apparently served as inspiration here). Since every distribution differs, by definition, from all the others, it is sometimes difficult to guarantee that programs built on one Linux system will work properly on another. The traditional solution is to compile from source locally with the help of ingenious tools like autoconf. Though this (usually) can be made to work, solving its problems disproportionately rewards deep system knowledge. The "real" fix has been the coalescence of large ecosystems around a limited number of "base" systems (Debian/Ubuntu, Red Hat, Slackware), with businesses offering testing and certification services. Sure, it passes the "grandma test"... once it's set up and working.
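
For those who've never danced it, the compile-from-source ritual usually looks something like this (a sketch with a hypothetical package name; the configure options vary per project):

tar xzf frobnicator-1.0.tar.gz
cd frobnicator-1.0
./configure --prefix=/usr/local
make
sudo make install

Each of those steps can fail in ways that quietly assume you know what a linker is... which is rather my point about rewarding deep system knowledge.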

The Macintosh is, and has been for many years, the easiest system for novice users to learn to use quickly. Part of that is due to Apple's legendary Human Interface Guidelines; paired with the tools and frameworks freely available, it is far easier for developers to comply with the Guidelines than to invent their own interface. The current generation of systems, Mac OS X, is based on industry-standard, highly-reliable core components (BSD Unix, the Mach microkernel, etc.) which underpin an extremely consistent yet powerful interface. A vast improvement over famously troubled earlier versions of the system, this has been proven in the field to be proof against most "grandmas".

A slight fugue here; I am active in the Singapore Linux Meetup Group. At our July meeting, there was an animated discussion concerning the upcoming annual Software Freedom Day events. The question before the group was how to organize a local event that would advance its purpose: promoting the use of free and open source software for both applications and systems. The consensus, as I understood it, worked out to "let's show people all the cool stuff they can do, and especially how they can use free software applications to do all the stuff they do right now with Windows." The standard example is someone browsing the Web with Firefox instead of Internet Explorer; once he's happy with replacement apps running under Windows, it's easier to move to a non-Windows system (e.g., Linux) with the same apps and interface. That strategy has worked well, particularly in the last couple of years (look at Firefox itself, and especially Ubuntu Linux, as examples). The one fly in the ointment is that other parts of the system don't always feel the same. (Try watching a novice user set up a Winprinter or wireless networking on a laptop.) The system is free ("as in speech" and "as in beer"), but it is most definitely not always free in terms of the time needed to get things working... and that time cannot always be predicted reliably.

The Mac, by comparison, is free in neither sense, even though the system software is based on open-source software, and many open-source applications (Firefox, the Apache Web server) run just fine. Apache, for instance, is already installed on every current Mac when you first start it up. But many of the truly "Mac-like" apps (games, the IRC program I use, a nifty note organizer, and so on) are either shareware or full commercial applications (like Adobe Photoshop CS3 or Microsoft Word:mac). You pay money for them, and you (usually) don't get the source code or the same rights that you do under licenses like the GNU GPL.
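
(If you want to see that bundled Apache for yourself: as I recall, you can tick "Web Sharing" in the Sharing preference pane, or run the following from Terminal and point a browser at http://localhost/.

sudo apachectl start

Either way, it's the same httpd underneath.)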

But you get something else, by and large: a piece of software that is far more likely to "just work" in an expectable, explorable fashion. Useful, interesting features, not always just more bloat to put a few more bullet items on the marketing slides. And that gives you a different kind of freedom, one summed up by an IT-support joke at a company I used to work for, more than ten years ago.

Q: What's the difference between a Windows usee and a Mac user?
A: The Windows usee talks about everything he had to do to get his work done. The Mac user...shows you all the great work she got done.

That freedom may be neither economic nor ideological. But for those who feel that the "Open Source v. Free Software" dispute sounds like a less entertaining Miller Lite "Tastes Great/Less Filling" schtick, and especially for those who realize that the hour they spend fixing a problem will never be lived again, this offers a different kind of freedom: the freedom to use the computer as an appliance for interesting, intellectually stimulating activity.

And having the freedom to choose between the other, seemingly competing freedoms... is the greatest of these.

Monday, 16 June 2008

g++ != gcc (arrrrrrgh!)

Coming back up to speed on Mac programming, now that I've finally got a shiny new iMac. Apple's Xcode IDE looks like a great tool (Objective-C, C++, C, etc., etc.), but I was hacking around at the command line, building some simple test code. Mac OS X being a fully-certified Unix operating system, that's an easy way to get something done while minimizing the number of known and unknown unknowns that need to be dealt with.

This mostly transpired between 2300 Sunday and 0130 Monday (last night/this morning). I installed CppUnit (Boost was already on the system), and kept running into the same problem.
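
(If you're playing along at home: CppUnit is available through MacPorts, which is where the /opt/local paths below come from; assuming your ports tree is current, something like

sudo port install cppunit

should fetch and build it.)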

jeffs-imac:Foo jeff$ gcc -I /opt/local/include -I /opt/local/include/cppunit foo.cpp -L /usr/local/lib  -lcppunit  -o foo
ld: in /usr/local/lib, can't map file, errno=22
collect2: ld returned 1 exit status

Hmmm. Maybe the library that's on there is screwed up somehow? Go to Sourceforge, pull down the library source, build it, install it, and try again.

Same thing. Fiddle with the code, fiddle with the command line, nothing fixes it. Go out and Google for help. The very first hit, from MacOSXHints, had a silly-sounding but tantalizing "clue":

You're right. I figured out the problem is that I was missing the -c switch when building the .o file with gcc. For some reason the linker doesn't complain about it, but when I try to link the shared lib with my main program I get the obscure can't map file error. Now it is working. Thanks.

Hmmm again. Go fiddle some more, this time compiling and linking my trivial proof-of-concept in separate gcc command lines. Still no joy.

I go back and play with building libcppunit again, wondering if I've missed some funky option to configure. Nope. It's pushing 0130, I need to be up at 6-something, and my brain is fried, so I shut down for the night.

(Later) in the morning, something's niggling at the back of my mind, saying I missed something while watching libcppunit compile, so I do it again. Yep, cue the "you dumb palooka!" moment: it's not using gcc to compile; it's using g++. For those who've never had the (dubious) pleasure: gcc is the "general-purpose" front end to the GNU Compiler Collection, a set of (numerous) language systems, of which g++ is the C++-specific toolchain (and standalone front end). All compilers in the Collection produce object code in a compatible format (the same back end is used for almost everything), so usually all you have to do is invoke the One Command to automagically compile your Ada, FORTRAN, Java, whatever. And, to be honest, it had been a few months since I'd dealt with C++ on gcc/g++ from the command line. (Thank you, Eclipse!) But I remembered a bit of wisdom lost in the Mists of Time...

They can both compile your C++ (in a single step). But: They're. Not. The. Same.
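
A minimal illustration of what I mean, reconstructed from memory (the exact error text varies by platform and version): the gcc driver will happily compile a .cpp file as C++, but it won't link in the C++ runtime library unless you name it yourself; g++ does both.

// hello.cpp: trivial C++; the iostream machinery lives in libstdc++
#include <iostream>

int main() {
    std::cout << "hello" << std::endl;
    return 0;
}

gcc hello.cpp -o hello            # fails at link time: undefined symbols from libstdc++
gcc hello.cpp -lstdc++ -o hello   # works once the C++ runtime is named explicitly
g++ hello.cpp -o hello            # works out of the box

Whether that's the whole story in my case, I won't swear to... but it's the canonical gotcha.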

So, I re-do my (now-) two-step process, substituting g++ for gcc:

jeffs-imac:Foo jeff$ g++ -c -I /opt/local/include/ -I /usr/local/include/cppunit/ foo.cpp
jeffs-imac:Foo jeff$ g++ foo.o -L/opt/local/lib -lcppunit -o foo
jeffs-imac:Foo jeff$

Ta-daaaaa!  OK, can we go back to a single step? That is, compiling and linking (using g++) in one go:

jeffs-imac:Foo jeff$ g++ -I /opt/local/include -I /opt/local/include/cppunit foo.cpp -L /usr/local/lib -lcppunit  -o foo
ld: in /usr/local/lib, can't map file, errno=22
collect2: ld returned 1 exit status
jeffs-imac:Foo jeff$ 

Nope. This is where the MacOSXHints commenter was right on the money. But why? I used to be knee-deep in the (FORTRAN-specific) code, a feeling akin to being knee-deep in the dead on occasion, and the answer doesn't come immediately to mind. Any ideas?