We've all got used to using case studies to make our points: business books like my own are awash with them (to illustrate what I fear might otherwise be rather dry and dull points for the reader). The blogosphere is rife with case-study arguments, too.
But beware: case studies are bad at establishing truth. Just like using individual cases to make legislation...
1. Case study thinking excludes the failures and the grey mush of moderate success or indifferent performance. This makes it hard to be clear about what the real causes of success are - btw "blueness" turns out not to be the secret of IBM's success under Lou Gerstner...
2. Case studies make it seem as if the success was inevitable (when all too often the underlying mechanisms mean that rewinding the tape would produce another - different - winner)
3. Case studies force things into an oversimplified narrative arc which we choose to reflect and support our preferred ideas about how things are (and how successful our work is - when was the last time a case study was presented in an "I'm not sure quite what this means or whether it really worked" manner...)
Even HBS thinks there might be a problem - with trying to teach from case studies, at least.
Remember: the reason you're being told the story you're being told is that it makes the point seem real. It makes it tangible. It makes it seem clear. And to whose benefit is that?
So the next time you hear someone extrapolating an explanation for success in, say, the music biz from why Leona Lewis (of all the talent TV contestants) got to be enormous in the US - and offering it as a recipe for doing the same again - beware. Go and get some real data about this and similar cases (successful or not).
And don't even get me started on "best practice" and "benchmarking"...
Let's try a bit harder, eh? What's the data say?