From the Wired article: "Most of the software we use today has its origins in the pre-Internet era, when storage was at a premium, machines ran thousands of times slower, and applications were sold in shrink-wrapped boxes for hundreds of dollars."
Anyone in high tech is used to seeing performance increases illustrated by a number followed by an "x": it means that the current version of the part in question is (to borrow another familiar techy term) "orders of magnitude" faster than a previous version.
Faster … remember that. We’ll come back to it.
The author of the Wired article wanted to emphasize, even exaggerate (nothing wrong with that), the difference between ye olde computers of the 1960s and '70s and today's machines. But "times" throws everything off, because times means to multiply, and multiplying a speed by a number greater than one can't leave you with less speed than you started with. Yet that's exactly what the author is asking you to do, because it's "times slower."
Since when is anything times slower?
AP's directive on words of decrease is to use "fewer" for individual items and "less" for bulk or quantity. Yet this still doesn't really help the author of the Wired article. If I had edited this story, I would've rephrased "thousands of times slower" as something rather innocuous but still accurate, like "much slower," "orders of magnitude slower" or "at a fraction of the speed of today's processors."
Unfortunately, attempting to actually quantify “thousands of times slower” as a fraction would still read awkwardly. Imagine this construction: “… machines ran one-one-thousandth times slower.” Ugh.
You might be able to get away with using percent, since we're all familiar with percent used to indicate a decreased price in a sale. But a decrease can never exceed 100 percent, so a thousandfold slowdown works out to a mere "99.9 percent slower," and "… machines ran 99.9 percent slower …" is just as awful. No thanks.
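For the numerically inclined, the arithmetic the article is fumbling can be sketched in a few lines. The speeds here are made-up round numbers, purely for illustration:

```python
# Illustrative numbers only -- a hypothetical modern machine.
modern_speed = 1_000_000_000  # operations per second
slowdown_factor = 1000        # "a thousand times slower"

# "1,000 times slower" really means dividing, not multiplying:
old_speed = modern_speed / slowdown_factor  # 1,000,000 ops/sec

# Expressed as a fraction of the modern speed:
fraction = old_speed / modern_speed  # 0.001, i.e. one one-thousandth

# Expressed as a percent decrease -- note it can never exceed 100:
percent_slower = (1 - fraction) * 100  # roughly 99.9, not "1,000 percent"

print(old_speed, fraction, percent_slower)
```

In other words, every phrasing of a slowdown is secretly a division, which is exactly why bolting "slower" onto a multiplier reads so badly.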
From another post: "A study by Larva Labs (the developers of the excellent Slidescreen app) estimates that Apple has paid out 50 times more money to developers than Google has."
(I should note that the author edited this: it originally read 50x.)
My big quibble here is why the author didn’t just offer up the quantity, since he was talking about money. Let’s say Google developers make $1,000 and Apple developers make $50,000. Now isn’t that impressive? And isn’t the disparity between the two payouts still remarkable?
Perhaps the author was manipulating the amounts to make his point. It's possible that Apple has paid out 50 gazillion dollars to a gazillion developers, whereas Google has paid out a single gazillion dollars to 10 developers. That makes Google far more generous per developer, but the statement proffered by the blogger would still be accurate: Apple would have paid out 50 times more money to developers than Google.
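A quick sketch with concrete, entirely made-up round figures shows how a true "50 times more money" claim can coexist with the other company being far more generous per developer:

```python
# Hypothetical figures only -- chosen to show how "50 times more money"
# on totals can hide who is actually more generous per developer.
apple_total = 50_000_000   # dollars paid out (made up)
apple_devs = 1_000_000

google_total = 1_000_000   # dollars paid out (made up)
google_devs = 10

# The blogger's claim holds on totals:
ratio = apple_total / google_total  # 50.0 -- "50 times more money"

# But per developer, the picture flips:
apple_per_dev = apple_total / apple_devs     # $50 each
google_per_dev = google_total / google_devs  # $100,000 each

print(ratio, apple_per_dev, google_per_dev)
```

Same true ratio, opposite impression: which is why offering the underlying quantities, not just the multiple, serves the reader better.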
Don’t get tripped up by “times” – avoid this expression unless the most impressive way to describe the increase (not decrease) would be to explain it as a multiple.