I’m sure there are lots of people who are even now blaming blogs and “new media” and God knows what else for the frenzy of stories about how iTunes sales are “collapsing” or “plummeting” or “hemorrhaging” or (insert sensationalized verb here), all of which were based on a loose interpretation of a Forrester sales report. The key takeaway for most was that iTunes sales were down 65 per cent.
Great story, right? So great that it turns out to be, well… not exactly true. Or rather, true only in a fairly limited sense. Forrester’s report was based on a relatively small set of credit-card data, and the research firm itself warned against extrapolating from it. So what did The Register do? It extrapolated wildly, put the word “collapse” into the mouths of the Forrester team, and then said iTunes sales were “collapsing” in the headline (Bloomberg wrote a story too).
Why did The Register do this? Probably because it made the story sound even more interesting, and because the writer, Andrew Orlowski, wanted to use the data as a springboard for a larger story about the death of DRM (digital rights management) and how the music industry might be forced to go the “blanket license” route.
Is this something unique to online media or the blogosphere? Hardly. Newspapers and TV networks do this kind of thing all the time. Staci at PaidContent is right that Rex Hammock had the best line: “Reporters’ inability to interpret statistics is ‘sky-rocketing’.”
Forrester analyst Josh Bernoff has a post here about the reaction to his initial piece on the report, in which he says that the data set was too small to justify any firm conclusions, but that this point “was just too subtle to get into these articles.” It wasn’t too subtle at all; it’s just that some outlets couldn’t bear to let the facts get in the way of a good story.