Eric S. Raymond's Blog

September 10, 2014

Review: A Call to Duty

A Call To Duty (David Weber, Timothy Zahn; Baen Books) is a passable extension of Baen Books’ tent-pole Honorverse franchise. Though billed as by David Weber, it resembles almost all of Baen’s double-billed “collaborations” in that most of the actual writing was clearly done by the author on the second line, with the name on the first line there as a marketing hook.



Zahn has a bit of fun subverting at least one major trope of the subgenre; Travis Long is definitely not the kind of personality one expects as a protagonist. Otherwise all the usual ingredients are present in pretty much the expected combinations. Teenager longing for structure in his life joins the Navy, goes to boot camp, struggles in his first assignment, has something special to contribute when the shit hits the fan. Also, space pirates!


Baen knows its business; there may not be much very original about this, but Honorverse fans will enjoy this book well enough. And for all its clichéd quality, it’s more engaging than Zahn’s rather sour last outing, Soulminder, which I previously reviewed.


The knack for careful worldbuilding within a franchise’s canonical constraints that Zahn exhibited in his Star Wars tie-ins is deployed here, where details of the architecture of Honorverse warships become significant plot elements. Also we get a look at Manticore in its very early years, with some characters making the decisions that will grow it into the powerful star kingdom of Honor Harrington’s lifetime.


For these reasons, if no others, Honorverse completists will want to read this one too.


September 7, 2014

Review: The Abyss Beyond Dreams

The Abyss Beyond Dreams (Peter F. Hamilton, Random House/Del Rey) is a sequel set in the author’s Commonwealth universe, which earlier included one duology (Pandora’s Star, Judas Unchained) and a trilogy (The Dreaming Void, The Temporal Void, The Evolutionary Void). It brings back one of the major characters (the scientist/leader Nigel Sheldon) on a mission to discover the true nature of the Void at the heart of the Galaxy.


The Void is a pocket universe which threatens to enter an expansion phase that would destroy everything. It is a gigantic artifact of some kind, but neither its builders nor its purpose is known. Castaway cultures of humans live inside it, gifted with psionic powers in life and harvested by the enigmatic Skylords in death. And Nigel Sheldon wants to know why.



This is space opera and planetary romance pulled off with almost Hamilton’s usual flair. I say “almost” because the opening sequence, though action-packed, comes off as curiously listless. Nigel Sheldon’s appearance rescues the show, and we are shortly afterwards pitched into an entertaining tale of courage and revolution on a Void world. But things are not as they seem, and the revolutionaries are being manipulated for purposes they cannot guess…


The strongest parts of this book show off Hamilton’s worldbuilding imagination and knack for the telling detail. Yes, we get some insight into what the Void actually is, and an astute reader can guess more. But the final reveal will await the second book of this duology.


Request for code review: cvs-fast-export

Sometimes reading code is really difficult, even when it’s good code. I have a challenge for all you hackers out there…



cvs-fast-export translates CVS repositories into a git-fast-export stream. It does a remarkably good job, considering that (a) the problem is hard and grotty, with weird edge cases, and (b) the codebase is small and written in C, which is not the optimal language for this sort of thing.


It does a remarkably good job because Keith Packard wrote most of it, and Keith is a brilliant systems hacker (he co-designed X and wrote large parts of it). I wrote most of the parts Keith didn’t, and while I like to think my contribution is solid, it doesn’t approach his in algorithmic density.


Algorithmic density has a downside. There are significant parts of Keith’s code I don’t understand. Sadly, Keith no longer understands them either. This is a problem, because there are a bunch of individually small issues which (I think) add up to: the core code needs work. Right now, neither I nor anyone else has the knowledge required to do that work.


I’ve just spent most of a week trying to acquire and document that knowledge. The result is a file called “hacking.asc” in the cvs-fast-export repository. It documents what I’ve been able to figure out about the code. It also lists unanswered questions. But it is incomplete.


It won’t be complete until someone can read it and know how to intelligently modify the heart of the program – a function called rev_list_merge() that does the hard part of merging cliques of CVS per-file commits into a changeset DAG.
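To make the problem concrete, here is a toy sketch of the easy half of the job – emphatically my illustration, not Keith’s algorithm. Per-file CVS commits are candidates for the same changeset when they share an author and comment and land within a small time window; the struct, the names, and the 300-second fuzz value are all assumptions for illustration.

    /* Toy matching predicate -- NOT cvs-fast-export's actual code.
     * CVS records a separate delta per file, so commits that were
     * logically one change differ slightly in timestamp. */
    #include <stdbool.h>
    #include <string.h>
    #include <time.h>

    #define COMMIT_TIME_WINDOW 300  /* seconds of allowed clock skew */

    struct file_commit {
        const char *author;
        const char *comment;
        time_t date;
    };

    /* Do two per-file commits belong to the same changeset clique? */
    static bool same_changeset(const struct file_commit *a,
                               const struct file_commit *b)
    {
        if (strcmp(a->author, b->author) != 0)
            return false;
        if (strcmp(a->comment, b->comment) != 0)
            return false;
        double skew = difftime(a->date, b->date);
        return skew > -COMMIT_TIME_WINDOW && skew < COMMIT_TIME_WINDOW;
    }

Even with a correct predicate, turning the resulting cliques into a changeset DAG that respects every branch’s internal ordering is where the real difficulty lives; that is the code that needs fresh eyes.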


The good news is that I’ve managed to figure out and document almost everything else. A week ago, the code for analyzing CVS masters into in-core data objects was trackless jungle. Now, pretty much any reasonably competent C systems programmer could read hacking.asc and the comments and grasp what’s going on.


More remains to be done, though, and I’ve hit a wall. The problem needs a fresh perspective, ideally more than one. Accordingly, I’m requesting help. If you want a real challenge in comprehending C code written by a master programmer – a work of genius, seriously – dive in.


https://gitorious.org/cvs-fast-export/


There’s the repository link. Get the code; it’s not huge, only 10KLOC, but it’s fiendishly clever. Read it. See what you can figure out that isn’t already documented. Discuss it with me. I guarantee you’ll find it an impressive learning experience – I have, and I’ve been writing C for 30 years.


This challenge is recommended for intermediate to advanced C systems programmers, especially those with an interest in the technicalia of version-control systems.


September 2, 2014

Reality is viciously sexist

Better Identification of Viking Corpses Reveals: Half of the Warriors Were Female insists an article at tor.com. It’s complete bullshit.


What you find when you read the linked article is an obvious, though as it turns out superficial, problem. The linked research doesn’t say what the article claims. What it establishes is that a hair less than half of Viking migrants were female, which is no surprise to anyone who’s been paying attention. The leap from that to “half the warriors were female” is quite large and unjustified.


There’s a deeper problem the article is trying to ignore or gaslight out of existence: reality is, at least where pre-gunpowder weapons are involved, viciously sexist.



It happens that I know a whole lot from direct experience about fighting and training with contact weapons – knives, swords, and polearms in particular. I do this for fun, and I do it in training environments that include women among the fighters.


I also know a good deal about Viking archeology – and my wife, an expert on Viking and late Iron Age costume who corresponds on equal terms with specialist historians, may know more than I do. (Persons new to the blog might wish to read my review of William Short’s Viking Weapons and Combat.) We’ve both read saga literature. We both have more than a passing acquaintance with the archeological and other evidence from other cultures historically reported to field women in combat, such as the Scythians, and have discussed it in depth.


And I’m calling bullshit. Males have, on average, about a 150% advantage in upper-body strength over females. It takes an exceptionally strong woman to match the ability of even the average man to move a contact weapon with power and speed and precise control. At equivalent levels of training, with the weight of real weapons rather than boffers, that strength advantage will almost always tell.


Supporting this, there is only very scant archeological evidence for female warriors (burials with weapons). There is almost no such evidence from Viking cultures, and what little we have is disputed; the Scythians and earlier Germanics from the Migration period have substantially more burials that might have been warrior women. Tellingly, they are almost always archers.


I’m excluding personal daggers for self-defense here and speaking of the battlefield contact weapons that go with the shieldmaidens of myth and legend. I also acknowledge that a very few exceptionally able women can fight on equal terms with men. My circle of friends contains several such exceptional women; alas, this tells us nothing about women as a class but much about how I select my friends.


But it is a very few. And if a pre-industrial culture had chosen to train more than a tiny fraction of its women as shieldmaidens, it would have lost out to a culture that protected its women and used their reproductive capacity to birth more male warriors. Brynhilde may be a sexy idea, but she’s a bioenergetic gamble that is near certain to be a net waste.


Firearms change all this, of course – some of the physiological differences that make women inferior with contact weapons are actual advantages at shooting (again I speak from experience, as I teach women to shoot). So much so that anyone who wants to suppress personal firearms is objectively anti-female and automatically oppressive of women.


Adverse selection and old technology

Yesterday I shipped cvs-fast-export 1.15, with a significant performance improvement produced by replacing a naive O(n**3) sort with a properly tuned O(n log n) version.
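For the curious: I won’t reproduce the actual cvs-fast-export diff here, but the generic shape of such a fix is to stop re-scanning a growing structure on every insertion and instead sort the whole array once with qsort(3). A minimal sketch, with invented struct and field names:

    /* Illustrative only -- not the actual cvs-fast-export change. */
    #include <stdlib.h>
    #include <time.h>

    struct rev {
        time_t date;
        /* ...other per-revision fields... */
    };

    /* Comparator: order revisions by commit date, oldest first. */
    static int by_date(const void *a, const void *b)
    {
        const struct rev *ra = a, *rb = b;
        return (ra->date > rb->date) - (ra->date < rb->date);
    }

    static void sort_revs(struct rev *revs, size_t n)
    {
        qsort(revs, n, sizeof(revs[0]), by_date);  /* O(n log n) */
    }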


In ensuing discussion on G+, one of my followers there asked if I thought this was likely to produce a real performance improvement, since on small inputs the constant setup time of a cleverly tuned algorithm often dominates the nominal savings.


This is one of those cases where an intelligent question elicits knowledge you didn’t know you had. I discovered that I do believe strongly that cvs-fast-export’s workload is dominated by large repositories. The reason is a kind of adverse selection phenomenon that I think is very general to old technologies with high exit costs.


The rest of this blog post will use CVS as an example of the phenomenon, and may thus be of interest even to people who don’t specifically care about version-control systems.



Cast your mind back to the point at which CVS was definitively superseded by better VCS designs. It doesn’t matter for this discussion exactly when that point was, but you can place it somewhere between 2000 and 2004, based on when you think Subversion went from a beta program to a production tool.


At that point there were lots of CVS repositories around, greatly varying in size and complexity. Some were small and simple, some large and ugly. By “ugly” I mean full of Things That Should Not Be – tags not corresponding to coherent changesets, partially merged import branches, deleted files for which the masters recording older versions had been “cleaned up”, and various other artifacts that would later cause severe headaches for anyone trying to convert the repositories to a modern VCS.


In general, size and ugliness correlated well with project age. There are exceptions, however. When I converted the groff repository from CVS to git I was braced for an ordeal; groff is quite an old project. But the maintainer and his devs had been, it turned out, very careful and disciplined, and committed none of the sloppinesses that commonly lead to nasty artifacts.


So, at the point that people started to look seriously at moving off CVS, there was a large range of CVS repo sizes out there, with difficulty and fidelity of up-conversion roughly correlated to size and age.


The result was that small projects (and well-disciplined larger projects resembling groff) converted out early. The surviving population of CVS repositories became, on average, larger and gnarlier. After ten years of adverse selection, the CVS repositories we now have left in the wild tend to be the very largest and grottiest kind, usually associated with projects of venerable age.


GNUPLOT and various BSD Unixes stand out as examples. We have now, I think, reached the point where the remaining CVS conversions are in general huge, nasty projects that will require heroic effort with even carefully tuned and optimized tools. This is not a regime in which the constant startup cost of an optimized sort is going to dominate.


At the limit, there may be some repositories that never get converted, because the concentrated pain of doing so overwhelms any time-discounted estimate of the costs of using obsolescent tools – or even the best tools may not be good enough to handle their sheer bulk. Emacs was almost there. There are hints that some of the BSD Unix repositories may be there already – I know of failed attempts, and assisted in one of them.


I think you can see this kind of adverse selection effect in survivals of a lot of obsolete technology. Naval architecture is one non-computing field where it’s particularly obvious. Surviving obsolescent ships tend to be large and ugly rather than small and ugly, because the capital requirement to replace the big ones is harder to swallow.


Has anyone coined a name for this phenomenon? Maybe we ought to.


August 27, 2014

Mysterious cat is mysterious

Our new cat Zola, it appears, has a mysterious past. The computer that knows about the ID chip embedded under his skin thinks he’s a dog.


There’s more to the story. And it makes us think we may have misread Zola’s initial behavior. I’m torn between wishing he could tell us what he’d been through, and maybe being thankful that he can’t. Because if he could, I suspect I might experience an urge to go punch someone’s lights out that would be bad for my karma.



On Zola’s first vet visit, one of the techs did a routine check and discovered that Zola had had an ID chip implanted under his skin. This confirmed our suspicion that he’d been raised by humans rather than being feral or semi-feral. Carol, our contact at PALS (the rescue network we got Zola from), put some more effort into trying to trace his background.


We already knew that PALS rescued Zola from an ASPCA shelter in Cumberland County, New Jersey, just before he would have been euthanized. Further inquiry disclosed that (a) he’d been dumped at the shelter by a human, and (b) he was, in Carol’s words, “alarmingly skinny” – they had to feed him up to a normal weight.


The PALS people didn’t know he was chipped. When we queried Home Again, the chip-tracking outfit, the record for the chip turned out to record the carrier as a dog. The staffer my wife Cathy spoke with at Home Again thought that was distinctly odd. This is not, apparently, a common sort of confusion.


My wife subsequently asked Home Again to contact the person or family who had Zola chipped and request that the record be altered to point to us. (This is a routine procedure for them when an animal changes owners.)


We got a reply informing us that permission for the transfer was refused.


These facts indicate to us that somewhere out there, there is someone who (a) got Zola as a kitten, (b) apparently failed to feed him properly, (c) dumped him at a shelter, and now (d) won’t allow the chip record to be changed to point to his new home.


This does not add up to a happy picture of Zola’s kittenhood. It is causing us to reconsider how we evaluated his behavior when we first met him. We thought he was placid and dignified – friendly but a little reserved.


Now we wonder – because he isn’t “placid” any more. He scampers around in high spirits. He’s very affectionate, even a bit needy sometimes. (He’s started to lick our hands occasionally during play.) Did we misunderstand? Was his reserve a learned fear of mistreatment? We don’t know for sure, but it has come to seem uncomfortably plausible.


There’s never any good reason for mistreating a cat, but it seems like an especially nasty possibility when the cat is as sweet-natured and human-friendly as Zola is. He’s not quite the extraordinarily loving creature Sugar was, but his Coon genes are telling. He thrives on affection and returns it more generously every week.


I don’t know if we’ll ever find out anything more. Nobody at PALS or Home Again or our vet has a plausible theory about why Zola is carrying an ID chip registered to a dog, nor why his former owners won’t OK a transfer.


We’re just glad he’s here.


Phase-of-moon-dependent bugs suck

I just had a rather hair-raising experience with a phase-of-moon-dependent bug.


I released GPSD 3.11 this last Saturday (three days ago) to meet a deadline for a Debian freeze. Code tested ninety-six different ways, run through four different static analyzers, the whole works. Because it was a hurried release I deliberately deferred a bunch of cleanups and feature additions in my queue. Got it out on time and it’s pretty much all good – we’ve since turned up two minor build failures in two unusual feature-switch cases, and one problem with the NTP interface code that won’t affect reasonable hardware.


I’ve been having an extremely productive time since, chewing through all the stuff I had deferred. New features for gpsmon, improvements for GPSes watching GLONASS birds, a nice space optimization for embedded systems, some code to prevent certain false-match cases in structured AIS Type 6 and Type 8 messages, merging some Android port tweaks, a righteous featurectomy or two. Good clean fun – and of course I was running my regression tests frequently and noting when I’d done so in my change comments.


Everything was going swimmingly until about two hours ago. Then, as I was verifying a perfectly innocent-appearing tweak to the SiRF-binary driver, the regression tests went horribly, horribly wrong. Not just the SiRF binary testloads, all of them.



My friends, do you know what it looks like when glibc detects a buffer overflow at runtime? Pages and pages of hex garble, utterly incomprehensible and a big flare-lit clue that something bad done happened.


“Yoicks!” I muttered, and backed out the latest change. Ran “scons check” again. Kaboom! Same garble. Wait – I’d run regressions successfully on that revision just a few minutes previously, or so I thought.


Don’t panic. Back up to the last revision where the change comment includes the reassuring line “All regression tests passed.” Rebuild. “scons check”. Aaaand…kaboom!


Oh shit oh dear. Now I have real trouble. That buffer overflow has apparently been lurking in ambush for some time, with regression tests passing despite it because the phase of the moon was wrong or something.


The first thing you do in this situation is try to bound the damage and hope it didn’t ship in the last release. I dropped back to the release 3.11 revision, rebuilt and tested. No kaboom. Phew!


These are the times when git bisect is your friend. Five test runs later I found the killer commit – a place where I had tried recovering from bad file descriptor errors in the daemon’s main select call (which can happen if an attached GPS dies under pessimal circumstances) and garbage-collecting the storage for the lost devices.


Once I had the right commit it was not hard to zero in on the code that triggered the problem. By inspection, the problem had to be in a particular 6-line loop that was the meat of the commit. I checked out the head version and experimentally conditioned out parts of it until I had the kaboom isolated to one line.


It was a subtle – and entirely typical – sort of systems-programming bug. The garbage-collection code iterated over the array of attached devices conditionally freeing them. What I forgot when I coded this was that that sort of operation is only safe on device-array slots that are currently allocated and thus contain live data. The test operation on a dead slot – an FD_ISSET() – was the kaboomer.
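In sketch form, the bug and the fix look something like this – a hypothetical reconstruction, not GPSD’s actual source; the struct and names are invented for illustration. The reason glibc screams is that FD_ISSET() with a garbage descriptor indexes outside the fd_set bit array, which the fortify checks flag as a buffer overflow:

    /* Hypothetical reconstruction -- not GPSD's actual code. */
    #include <stdbool.h>
    #include <sys/select.h>

    #define MAXDEVICES 8

    struct device {
        bool allocated;  /* true only for slots holding live data */
        int fd;          /* stale garbage when !allocated */
    };

    static void gc_dead_devices(struct device devs[], fd_set *watched)
    {
        for (int i = 0; i < MAXDEVICES; i++) {
            /* The buggy version did the FD_ISSET() test
             * unconditionally; on a dead slot, devs[i].fd can be
             * any stale value, and indexing the fd_set with it is
             * the kaboom. The fix: never touch unallocated slots. */
            if (!devs[i].allocated)
                continue;
            if (FD_ISSET(devs[i].fd, watched)) {
                FD_CLR(devs[i].fd, watched);
                devs[i].allocated = false;  /* reclaim the slot */
            }
        }
    }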


The bug was random because the pattern of stale data in the dead slots was not predictable. It had to be just right for the kaboom to happen. The kaboom didn’t happen for nearly three days, during which I am certain I ran the regression tests well over 20 times a day. (Wise programmers pay attention to making their test suites fast, so they can be run often without interrupting concentration.)


It cannot be said too often: version control is your friend. Fast version control is damn near your best friend, with the possible exception of a fast and complete test suite. Without these things, fixing this one could have ballooned from 45 minutes of oh-shit-oh-dear to a week – possibly more – of ulcer-generating agony.


Version control is maybe old news, but lots of developers still don’t invest as much effort on their test suites as they should. I’m here to confirm that it makes programming a hell of a lot less hassle when you build your tests in parallel with your code, do the work to make them cover well and run fast, then run them often. GPSD has about 100 tests; they run in just under 2 minutes, and I run them at least three or four times an hour.


This puts out little fires before they become big ones. It means I get to spend less time debugging and more time doing fun stuff like architecture and features. The time I spent on them has been multiply repaid. Go and do thou likewise.


August 26, 2014

Master Foo and the Hardware Designer

The newest addition to Rootless Root:



On one occasion, as Master Foo was traveling to a conference with a few of his senior disciples, he was accosted by a hardware designer.


The hardware designer said: “It is rumored that you are a great programmer. How many lines of code do you write per year?”


Master Foo replied with a question: “How many square inches of silicon do you lay out per year?”


“Why…we hardware designers never measure our work in that way,” the man said.


“And why not?” Master Foo inquired.


“If we did so,” the hardware designer replied, “we would be tempted to design chips so large that they cannot be fabricated – and, if they were fabricated, their overwhelming complexity would make it impossible to generate proper test vectors for them.”


Master Foo smiled, and bowed to the hardware designer.


In that moment, the hardware designer achieved enlightenment.


August 25, 2014

Spam alert

Yes, I’m aware of the spam on the blog front page. The management does not hawk dubious drugs.


Daniel Franke and I just did an audit and re-secure of the blog last night, so this is a new attack. Looks like a different vector; previously the spam was edited into the posts and invisible, this time it’s only in the front-page display and visible.


It’s a fresh instance of WordPress verified against pristine sources less than 24 hours ago, all permissions checked. Accordingly, this may be a zero-day attack.


Daniel and I will tackle it later tonight after his dinner and my kung-fu class. I’ll update this post with news.


UPDATE: The initial spam has been removed. We don’t know where the hole is, though, so more may appear.


UPDATE2: It’s now about 6 hours later and spam has not reappeared. I changed my blog password to a stronger one, so one theory is that the bad guys were running a really good dictionary cracker.


August 22, 2014

Review: Once Dead

Once Dead (Richard Phillips; Amazon Publishing) is a passable airport thriller with some SF elements.


Jack Gregory should have died in that alley in Calcutta. Assigned by the CIA to kill the renegade responsible for his brother’s death, he nearly succeeds – until local knifemen take a hand. Bleeding, stabbed, and near death, he is offered a choice: die, or become host to Ananchu – an extradimensional being who has ridden the limbic systems of history’s greatest slayers.



It’s a grim bargain. Ananchu will give him certain abilities, notably the ability to sense life at a distance and read the intentions of his enemies. But the cost is a near-uncontrollable addiction to danger and death. Gregory will literally be in constant struggle with an inner demon – and when the human who dragged his body from the alley is found insane, mumbling of the return of Jack the Ripper, a dark legend is reborn.


Airport-thriller action ensues as Gregory, believed dead by the CIA, goes freelance but is drawn into opposing a plot to cripple the U.S. with an EMP attack. There are lots of bullets, big explosions, a heavy from the Russian Mafia, treachery from rogues inside the CIA, torture scenes, exotic international locations, some sex, weapon porn, and a climactic Special-Ops-style assault on Baikonur. There’s not much surprising here, and the SF elements tend to recede into the background as the plot develops. There are clear indications that the author intends a series.


It’s not brilliant or terribly original, but it’s competently done. The author is a former Army Ranger; the gunplay, hand-to-hand fighting, and combat ops are written as by someone who has seen how it’s done right, if not done it himself. Read it on an airplane.

