Sony adds DRM against author’s wishes

It has come to my attention that the Sony Reader Store added DRM (Digital Restrictions Management) to copies of my novel Blue Screen of Death. I do not approve of DRM, and it was never my intention for my book to be sold with it.

I’m not sure why Sony did this. They do not add DRM to all of the books they sell, but Sony does have a terrible history of putting anti-consumer technologies into their media products, which is why I try to avoid purchasing Sony products.

I’ve begun the process of removing my book from the Sony store, and I do not plan to make future work available there. If you were one of the readers who purchased Blue Screen of Death from Sony, I would like to replace your copy with one that has no DRM. Contact me, tell me approximately when you purchased your copy and where you live, and I will send you a DRM-free copy of the book to replace the defective copy sold to you by Sony. (This offer ends when I believe I’ve replaced all the copies indicated in my Sony sales statements.)

DRM is not the solution to piracy. The solution is to provide good products at fair prices. Amazon, Barnes & Noble, iTunes, and other ebook retailers have authorized, DRM-free copies of Blue Screen of Death available at a bargain price. Please support these booksellers who honor the wishes of their authors.

Response to TSA Proposal to use AIT

In regard to Docket Number TSA-2013-0004:

The TSA’s proposal to use advanced imaging technology routinely as a primary screening method at transportation security checkpoints should be denied.

The proposal utterly fails to justify the costs against the infinitesimal incremental improvement in the ability of the TSA to detect a weapon that poses a significant threat to safe air travel. The proposal makes specious, unsubstantiated assertions, uses flawed reasoning, ignores certain costs, and misleadingly suggests that unrelated facts are somehow relevant to the alleged need for the AIT systems.


The TSA’s estimate of the net economic cost of the AIT systems is $2.4 billion over an eight-year period, which includes the early trial period when costs were significantly lower. The run rate for 2015 alone is $357 million.

By itself, this is a substantial incremental cost to justify. Certainly there are better ways to spend that kind of money that are less invasive and have a more substantial (and quantifiable!) net benefit. Consider the lives that could be saved by spending $357 million per year on roadway improvements in dangerous locations.

The TSA’s cost estimate is incomplete. The TSA screens approximately 1.8 million passengers per day or 660 million per year. The TSA analysis fails to compute the cost in lifetimes lost to the additional delay of AIT (and, as usual, they have provided no data to quantify this delay). If we conservatively assume that the security lines in airports are, on average, one minute longer with AIT than with traditional walk-through metal detectors (WTMD), that amounts to 18 lifetimes per year lost to waiting in line.
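For what it’s worth, the arithmetic behind that figure is easy to verify. Here is a sketch of the calculation (the 660 million annual passengers and one-minute delay come from the estimates above; the 70-year lifetime is my own assumption, as the proposal states no figure):

```cpp
#include <cassert>
#include <cmath>

// Rough check of the "18 lifetimes per year" estimate.
// total minutes of delay -> hours -> days -> years -> lifetimes
double lifetimes_lost_per_year(double passengers_per_year,
                               double extra_minutes_each,
                               double years_per_lifetime) {
    double total_minutes = passengers_per_year * extra_minutes_each;
    double total_years = total_minutes / 60.0 / 24.0 / 365.0;
    return total_years / years_per_lifetime;
}
```

With the figures above, `lifetimes_lost_per_year(660e6, 1.0, 70.0)` comes out to roughly 17.9, matching the 18 lifetimes quoted.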

Also consider that many passengers are sole proprietors, and thus “small entities” impacted by the rule change. Where has the TSA estimated the costs to them for time lost?

Non-Metallic Threats

The proposal argues that AIT is necessary because terrorists are focusing on non-metallic explosives, rendering the checkpoint WTMDs useless. They point to the shoe bomber, the underwear bomber, the alleged liquids plot, and to plots entirely outside of (and not targeted at) the United States. While it may be true that terrorists are considering non-metallic weapons, letting the TSA use AIT–even if AIT were 100% effective–wouldn’t have stopped any of these attempts, since none of them originated at airports where the TSA does screening.

Furthermore, the underwear bomb and the shoe bomb failed, in part, because it’s extremely difficult to detonate reliably a purely chemical explosive. Even the 1994 Bojinka Plot, which successfully detonated liquid explosives aboard an airliner headed for the U.S., relied on a detonator with metallic components that would have been detected by a modern WTMD. (It’s also interesting to note that today’s TSA rules wouldn’t even have stopped the bomber from bringing the liquid explosive through a checkpoint.)

Janet Napolitano was absolutely correct when she claimed success after the failure of Umar Farouk Abdulmutallab to detonate his bomb. The rules in place before AIT are sufficient to drive would-be terrorists into desperate, risky schemes that involve unreliable devices. The plane landed with the suspect in custody and no injuries, except to Abdulmutallab himself. This is exactly what success looks like.


Let’s look at the effectiveness of AIT. Oh, that’s right. We can’t. The TSA either doesn’t know or won’t tell us the rates of false-positives and false-negatives. (Though John Pistole testified that the false-positive rate was higher than desired.) Without this data, it’s impossible to do a cost-benefit analysis.

If we look to other countries to get clues as to the effectiveness of AIT machines, we find that the German interior ministry has declined to roll them out because there are too many false positives to make them a useful screening tool. Italy found that the machines are too slow and ineffective.

My own experiences have indicated a tremendously high false-positive rate, causing lines to slow to the point that TSA agents allow some passengers to go through the WTMD instead simply to relieve the backup. (If the TSA proposal can rely on nothing but anecdotal evidence, then so can I.)

By replacing WTMD with AIT, the TSA is actually reducing its ability to reliably detect metallic weapons. They assert that any metallic anomaly on a person would be detected by AIT as reliably as by WTMD. This is false, as has been publicly demonstrated multiple times. Furthermore, a peer-reviewed article in a scientific journal explains how AIT scanning can fail to detect PETN explosives shaped to conform to the body. Perhaps this is why the Israeli airport security doesn’t think they’re useful.

If we examine the TSA Blog posts on prohibited items that have been found, we learn that virtually no prohibited items have been detected with AIT. Some of those finds (like the gun strapped to a passenger’s ankle) would have been found just as easily with a WTMD. Others, like small ceramic knives, might actually be allowed today as the TSA has wisely revised the guidelines about small knives.

Like many of the TSA’s rules, AIT makes the traveling public less safe in small ways. To submit to AIT or a patdown, the would-be passenger must remove virtually everything but their actual clothes. With a WTMD, you can keep your wallet (identification, cash, bank cards) and your boarding pass. With AIT, you cannot. Instead, the passenger must relinquish control and view of their most vital, hard-to-replace items. While the consequences of an individual lost wallet or boarding pass are small compared to explosives on airplanes, the AIT process greatly increases the rate of such losses while doing nothing to thwart the large-but-very-rare threats.

Until the TSA installs AIT on every security lane at every checkpoint in every terminal at every airport, an observant terrorist with a non-metallic weapon can simply choose the WTMD line and avoid detection. Only the innocent are forced into the lose-lose proposition of AIT or patdown (often both). The proposal does not indicate whether the costs of getting to the point where every lane has an AIT scanner are included in the eight-year cost projections.

Without real data as to the effectiveness of these devices, the TSA cannot possibly expect the traveling public to bear the invasiveness of the scanners and expect all taxpayers to bear the tremendous cost of instituting this screening regime.


The TSA proposal asserts that there has been massive public approval of the body scanners, but they provide absolutely no data to back that assertion.

It seems, at nearly every airport, there is a checkpoint lane available that allows passengers to self-select for a WTMD instead of an AIT scanner. From observing the crowds at checkpoints, it’s very clear that the vast majority of people who recognize the opportunity to select the WTMD line will indeed choose that option over the AIT scanner. Every one of them should be considered an opt-out for AIT scanning.

The official way to opt out of AIT scanning is to submit to an invasive patdown. That’s a false choice. If TSA actually kept accurate tallies of opt-outs and opt-ins, the numbers would tell us nothing except whether passengers preferred one invasive form of search over the other. There’s no indication of how acceptable they find the concept of an invasive search overall.

Whether the opposition to AIT is merely a vocal minority is irrelevant. The burden is on the TSA to demonstrate that AIT scanning is a cost-effective, meaningful improvement to the screening process that remains within the bounds of their mandate to perform minimally-invasive administrative searches for weapons.


No screening method will stop every threat. Every screening method has direct and indirect financial costs and some amount of unwanted invasion of privacy. When evaluating the addition of a new screening method to the mix or, as here, evaluating a replacement of one screening method (WTMD) with another (AIT scanning), we have to weigh the incremental costs against the incremental benefits. If the new screening method is approved, the terrorists will find a way past it. This arms race never ends. We’ll never get to 100% detection of weapons even with an infinite budget and limitless tolerance for invasive searches.

“Security at any cost” is impossible and a bad strategy. At some point, we have to draw a line and realize that additional spending and sacrifice of personal dignity won’t significantly improve the screening process.

The TSA, as usual, has not actually done a cost-benefit _analysis_. Instead, they’ve totaled up the bills and provided a vague, unconvincing argument that they must do something about non-metallic threats. They assert–without any supporting data whatsoever–that AIT is effective at detecting non-metallic items. This reasoning is faulty in so many ways that I find it hard to believe they can argue the point with straight faces. Clearly, as a nation, we’d be better off spending $357 million per year bringing logic and statistics courses back into the core curriculum of our education system.

Silencing a Wine Refrigerator


[This is a reconstruction of a blog post I made back in 2007, before I bungled my WordPress backup and lost the old blog posts. Several people have requested it. I might find the pictures later.]

My brother and I just had one of our quarterly project weekends. I can’t believe it was time for another one already, since I hadn’t even blogged about the last one yet. So let me catch up by showing off the projects we did this summer.

My wife and I have a wine refrigerator in our dining room. It has always bugged me with the noise it makes. There’s an air circulation fan that runs all the time. The manufacturer says it’s to prevent condensation and mold. We have a friend with a fancier model that doesn’t have the circulation fan, and he has had problems with mold, so it’s probably a good feature. But the fan cycles: 20 seconds on, 10 seconds off. So not only did I hear it when it was running, I had to hear it whine as it revved up twice a minute. It drove me nuts.

So when my brother showed up for project weekend, I suggested we try to find a way to make the fridge run quietly. I had peeked at the fan, and determined that it looked a lot like a fan you’d have in a PC, so I hoped that there would be a super-quiet PC fan we could swap in. After all, there’s a significant number of people out there trying to make their PCs run silently.

We got the manufacturer and part number of the fan off its label. A quick web search gave us detailed specs, including dimensions, power requirements, air flow, and sound level. At Fry’s Electronics, we found a matching fan with a slightly lower decibel rating. I don’t recall the exact value, but it was something like 27 dB instead of 30 dB. I hoped it would be enough. Since decibels are logarithmic units, a small change might make a big improvement. And the fan was only $17.
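As a rough illustration of how those logarithmic units translate into actual sound power (a sketch; the 3 dB difference is just the approximate gap between the two ratings mentioned above):

```cpp
#include <cassert>
#include <cmath>

// Decibels express a power ratio on a logarithmic scale:
// ratio = 10^(dB_change / 10), so a 3 dB drop is roughly half the power.
double power_ratio_from_db(double db_change) {
    return std::pow(10.0, db_change / 10.0);
}
```

`power_ratio_from_db(27.0 - 30.0)` comes out to about 0.5, so even that small difference in the spec sheet could mean roughly half the radiated sound power.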

We took the false back out of the fridge and wired in the new fan. Our first test was like a dream. Even with the door open, the new fan was significantly quieter. With the door closed, I could hardly hear it at all.

But that test was just with the fan sitting inside the fridge. Once we re-mounted it to the false back, it got dramatically louder. The backing acted like a soundboard, amplifying the vibrations of the fan.

My brother smartly pointed out that we needed to dampen the vibrations, so he suggested we get some rubber grommets to replace the washers used when fastening the fan to the backing. ACE Hardware had just the thing, and it really did the trick.

We put the fridge back together, and though it’s not silent, it’s a fantastic improvement. I wish we had done it years ago. And I wish manufacturers would realize how important these little details are. Between the fan and the grommets, we spent less than $20. The incremental cost for the manufacturer to have started with a quieter fan and used rubber grommets would have been tiny, yet it would have made their product noticeably better.

Quick Review: The Starcrossed

The Starcrossed by Ben Bova

My rating: 3 of 5 stars

An amusing tale of the crazy mid-1970s television industry. It’s told as a science fiction story, but the science fiction isn’t really necessary. It’s based on the adventures of Ben Bova and Harlan Ellison in the making of The Starlost, a pathetic 1973 science fiction show. I’m sure there are lots of inside jokes for those who know the backstory, but it’s not necessary to appreciate the antics of these outrageous characters and their scheming. I laughed out loud a couple times during this fast read.

View all my reviews

Quick Review: Masters of Mystery: The Strange Friendship of Arthur Conan Doyle and Harry Houdini

Masters of Mystery: The Strange Friendship of Arthur Conan Doyle and Harry Houdini by Christopher Sandford

My rating: 3 of 5 stars

This wasn’t quite as entertaining a read as Hiding the Elephant by Jim Steinmeyer, but Masters of Mystery: The Strange Friendship of Arthur Conan Doyle and Harry Houdini by Christopher Sandford added dimensions to the era of spiritualism that the former book only touched on. I had never known that Doyle and Houdini had corresponded extensively despite their differing views on spiritualism. Nor did I know that Doyle–creator of the scientifically-minded detective Sherlock Holmes–was an outspoken advocate of mediums, seances, spirit guides, and the afterlife. These aspects were fascinating, but I think I was able to appreciate them better because I had read Hiding the Elephant first.

View all my reviews

Review: Timecaster

Timecaster by Joe Kimball

My rating: 3 of 5 stars

Timecaster is more of an action/adventure story set in the future than a hard science fiction novel. Nevertheless, J.A. Konrath (writing as Joe Kimball) puts some interesting what-if ideas out there, and they help the story to good effect. It’s a fast read–one cliffhanger after another. Overall, it was a nice mindless escape with the usual Konrath humor, lots of imaginative action sequences, and downright over-the-top violence and sex.

I have a few minor quibbles with the plot, which I’ll leave out to avoid spoilers. There are some anachronistic cultural references that should be long forgotten by the time of the story, but, as those are mostly just punchlines to throwaway jokes, they don’t adversely affect the story.

The book felt short, both in actual length and in the sense that the ending leaves you hanging. It’s as though Konrath held back the final act in order to sell a sequel. Someday, I might read the next one in the series, but it’s not high on my list.

Formatting for the nook was pretty good, with just a few missing characters and a couple of backwards apostrophes. I wish the legacy publishers wouldn’t put so much front matter in their ebooks; the reader has already bought the book and shouldn’t have to click through pages of jacket copy. It felt like padding, as did the useless glossary at the end. Perhaps if there were links to the glossary wherever the futuristic terms are used, it would have been useful. But, honestly, the meanings are clear enough in context, and, by the time you get to the end of the book and realize there’s a glossary, it’s too late to make a difference.

View all my reviews

Review: The Bug

The Bug: A Novel by Ellen Ullman

My rating: 5 of 5 stars

I stumbled upon a description of this book earlier in the year and added it to my reading list. Just a few weeks later, a friend who had read my book, Blue Screen of Death, said he enjoyed BSoD much more than The Bug, so naturally I moved The Bug to the top of my reading list.

Comparing them isn’t fair. Blue Screen of Death is a genre mystery. The Bug is a mainstream literary work, mostly a character study. Nevertheless, there were some striking similarities: both are narrated from the point of view of a young woman, both are set in a software company in Fremont, California, both start during an unusual heat wave, and both examine aspects of debugging—tracking down and fixing mistakes (bugs) in software.

But the similarities end there. At times, The Bug is almost like poetry, with beautifully crafted sentences and a rhythm that evokes the feeling of programming and of debugging. Ullman captures the fleeting elation when months of development finally pay off in some little way, and the utter despair of pounding your head against the same intermittent bug month after month after month. Ullman understands and conveys the emotional journey of software development at a level I never expected possible from a novel.

I loved the book. That said, I don’t know if it’s for everyone. It’s immersive. Deeply. Ullman explains the software terminology and concepts enough to understand—if you care to. If you don’t care to, you’ll quickly be lost, as the story hinges on abstractions and the mental labors of the characters. If she had written the same story about a field I’m not interested in—say auto mechanics—I’d probably have been bored to tears.

Conclusion: If you work in the software business, read it. If not, you might want to sample it before committing to the entire book.

View all my reviews

Quick Review: Spook Country

Spook Country (Blue Ant, #2) by William Gibson

My rating: 3 of 5 stars

The chapters alternate among three separate story lines that eventually intersect at the climax of the novel. I really enjoyed the characters in one of these lines, but the other two didn’t pique my interest until at least the halfway point.

But all along, it seemed to be building to something bigger, something more important. There were so many hints of backstory that I expected a lot more revelations and interesting interactions, but I was disappointed. If you’re paying attention, the ending isn’t a surprise. And if you aren’t paying attention, don’t worry, the last couple chapters recap exactly what happened.

I got more out of _Pattern Recognition_, the first book of Gibson’s Blue Ant series. I’ll probably read the third book, _Zero History_, because I already own it. If I didn’t, it would be a lower priority.

View all my reviews

Lost Opportunities

This week I came across some C++ code like this:

if (foo & 0x0FFFFFFF >= width * height) {
    /* copy width * height items to a buffer */
} else {
    /* handle error */
}
This is buggy code. Worse, it’s probably a security vulnerability. The code parses a particular file format. This if-statement is attempting to make sure the fields are internally consistent. Getting this check wrong probably means an attacker could craft a file to cause the parser to overrun a buffer, which is almost certainly an exploitable security bug.

Don’t see the problem? The variable foo in this case is a 32-bit integral type. The top bits are used for flags, and the remaining portion is a buffer size. The code is attempting to make sure that the buffer is at least large enough to hold width * height items.

Do you see the problem now? It’s a precedence problem. In C and C++, the & operator has a low precedence — lower even than the relational operators like >=. So the compiler interprets the condition as:

foo & (0x0FFFFFFF >= (width * height))

So first the code will compute the product width * height and then it will check whether 0x0FFFFFFF is greater than or equal to the product. This yields a boolean value: true or false. The bitwise AND is then performed between foo and the boolean. Well, sort of. First the boolean will be implicitly converted to an integral type to match foo’s type. That is, it becomes a 0 for false or a 1 for true.

The net result is that, whenever the product fits within 0x0FFFFFFF, the condition merely checks whether foo is odd; otherwise it is always false.
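A tiny demonstration makes the difference concrete (a sketch with illustrative names and values; buggy_check mirrors the original expression, and fixed_check adds the parentheses the author presumably intended):

```cpp
#include <cassert>
#include <cstdint>

// As written: parses as foo & (0x0FFFFFFF >= width * height), so once the
// comparison is true, the result depends only on foo's low bit.
bool buggy_check(std::uint32_t foo, std::uint32_t width, std::uint32_t height) {
    return foo & 0x0FFFFFFF >= width * height;
}

// As intended: mask off the flag bits, then compare buffer size to the copy size.
bool fixed_check(std::uint32_t foo, std::uint32_t width, std::uint32_t height) {
    return (foo & 0x0FFFFFFF) >= width * height;
}
```

With foo = 0x100 (a 256-item buffer, low bit clear) and a 2×2 copy, fixed_check passes but buggy_check fails; with foo = 1 (a 1-item buffer) and the same copy, buggy_check happily lets the copy proceed.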

C++, like C, is full of pitfalls like this. It’s easy to write code that does the wrong thing. Programmers aren’t superhuman. Bugs like this get written, but we have lots of techniques for catching these kinds of mistakes.

The first line of defense is the compiler. The incorrect expression is perfectly legal, so it’ll happily compile the code. But most modern compilers are aware of common language pitfalls and will issue warnings about suspicious code. For example, I spotted this problem because the Microsoft VC++ compiler pointed at the line and said:

warning C4554: '&' : check operator precedence for possible error; use parentheses to clarify precedence

There’s little excuse for ignoring (or worse, silencing) compiler warnings. Some programmers don’t want to be bothered with false positives.  In almost all cases, however, it’s possible to eliminate false positives by making the code more clearly express what you intended. For example, if you did intend to evaluate the bitwise-AND last, you could have added parentheses to make it explicit. This would not only eliminate the noisy warning, but it would also make the code more obvious to the next programmer to come along.

The next line of defense is to always step through new code in the debugger. This should be second nature to software engineers, but, unfortunately, many still don’t do it. It’s such a useful practice that Steve Maguire dedicated an entire chapter to it in Writing Solid Code. Stepping through the code while keeping an eye on the variables could have alerted the programmer to the fact that the condition wasn’t testing the right thing.

Unit tests definitely should have caught this mistake. If it’s important enough to check the condition, then it’s probably important enough that you have unit tests to check both outcomes. Alas, this code had no unit tests.
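For example, if the check were factored into a function (a hypothetical refactoring; the name buffer_large_enough is mine), a few assertions covering both outcomes would have flagged the bug immediately:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical refactoring of the validation into a testable unit.
bool buffer_large_enough(std::uint32_t foo, std::uint32_t width, std::uint32_t height) {
    const std::uint32_t buffer_length = foo & 0x0FFFFFFF;  // mask off the flag bits
    return buffer_length >= width * height;
}

// Minimal unit tests exercising both branches of the condition.
void test_buffer_large_enough() {
    assert(buffer_large_enough(100u, 10, 10));           // exactly big enough
    assert(!buffer_large_enough(99u, 10, 10));           // one item short
    assert(!buffer_large_enough(0xF0000000u, 10, 10));   // flag bits alone aren't capacity
}
```

The original expression fails the first of these assertions (100 is even, so the buggy condition rejects it), which is exactly the kind of failure that points straight at the offending line.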

And you might think that normal testing would stumble over the problem. In this case, it wouldn’t, because the condition was checking for a very unusual situation that’s not likely to occur with any normal data. And even if somehow a regular test case stumbled over the problem, it wouldn’t provide very good isolation for tracking the cause back to this line of code. File fuzzing could have caught the problem. Fuzzing should probably be done for all parsing code.

Having a peer code review your change–that is, having another programmer look it over–might have caught the problem. If one developer doesn’t see the bug, then a second might miss it, too. But that won’t always be the case. A fresh set of eyes with a different perspective might at least ask whether the expression does what is intended. And if you know your colleague is going to be reading your code, you might be motivated to make it as clear as possible. For example, you might make it a little more explicit like this:

const uint32_t buffer_length = foo & 0x0FFFFFFF;
if (buffer_length >= width * height) {
    /* copy width * height items to a buffer */
} else {
    /* handle error */
}
Notice how trying to make the code clearer to another human can “accidentally” fix the bug.

It’s hard to see how this bug could have made it into a product with all the usual bug defenses in place. I can only conclude that they weren’t all in place. Obviously, the compiler warnings were ignored or silenced. It’s a shame that programmers feel compelled to take such shortcuts. I love it when the compiler says, “um, this looks wrong.” It’s a golden opportunity to fix my stupid mistakes before anyone else sees them.