Saturday, 29 January 2011

Book review: Steven Johnson's 'Where Good Ideas Come From: The Natural History of Innovation'

I tend to avoid reading this kind of book. The Cluetrain Manifesto, The Tipping Point, Freakonomics, The Black Swan. They all hit the web, and they all pass me by in a largely undifferentiated wash of bold typography, sentence-length sub-titles and (too) easily summarised central points.

I'm not sure now why I ordered 'Where good ideas come from' at the library, but having done so, I dutifully picked it up and settled in to read it over the long weekend. The double line spacing immediately gave me the sense that I was reading an extended blog post, and by and large, reaching the end of the book hasn't much changed my first reaction.

Johnson identifies seven key situations or characteristics and one key context that foster innovation. The introduction defines that context: the city. In the same way that a coral reef nurtures a vast biodiversity compared to the same square meterage of empty sea, closely packed urban environments nurture the exchange of ideas: people working in one discipline are more likely to come into contact with ideas from another. From here we tumble through the seven situations/contexts:

The adjacent possible The phrase comes from the scientist Stuart Kauffman, and describes the situation in which life originated on Earth. At one time, Earth was awash with a small number of simple molecules. Each of these could combine with the others in a finite number of ways, given certain catalysts, and then go on recombining and catalysing. This handful of simple molecules couldn't transform overnight into a dandelion or an ostrich, because a whole bunch of innovations have to happen first (like respiration). But surrounding each instance of each molecule was the adjacent possible - a slightly hazy boundary of what might happen next. With each combination and transformation, the boundaries of the adjacent possible expand. Innovation fuels innovation.

Liquid networks Information and ideas travel best in liquid networks. Scientific breakthroughs occur not just in the lab setting - perhaps not even most often in the lab setting - but in discussion groups and cafeterias, where different perspectives can be brought to bear on a set of findings, or the 'error' in an experiment can be revised into proof for another concept. From here we move into modular office design, blah blah blah.

The slow hunch Ideas brew over time. Evidence is slowly gathered. Hunches work when they're connected to other bits of experience and evidence: this is why the Phoenix memo about 'suspicious persons' enrolling at American flight schools in mid-2001 didn't trigger any alerts that might have prevented 9/11: it was filed into the FBI's electronic black box, which is structured to prevent pieces of information from rubbing up against each other.

Serendipity The chapter heading pretty much sums it up, although Johnson does a nice job of rebutting the argument that the web, and in particular being able to search for information, has driven serendipity out of our lives. Also - stop working, go for a walk, it might help you process information better than labouring over it.

Error Some inventions are the outcome of wrong outcome after wrong outcome after wrong outcome. The guy who invented vacuum tubes did so without ever figuring out how they actually worked. X-rays and daguerreotypes were both accidental discoveries. You get the picture.

Exaptation This is where Johnson draws heavily on the evolutionary biologist Stephen Jay Gould. Exaptation is when an adaptation is further utilised for another end. Feathers originally evolved to provide insulation: down feathers continue to perform this function, whereas flight feathers act as airfoils. An idea or finding from one field can be adapted into another. It's like the notion of weak ties ("popularised by Malcolm Gladwell") but somehow better.

Platforms Stacked platforms don't just help information move; they recycle and amplify it. Coral reefs, beavers' dams, satellites and APIs. Government as platform. Twitter twitter twitter. You know the drill (or you don't, in which case this chapter might be a real eye-opener for you: the kind of thing you make your dumb-ass Web Strategy Advisory Group read so they'll go hey, yeah, go ahead, API everything!)

In the final chapter, 'The Fourth Quadrant', Johnson proposes two ways of investigating and visualising the history and conditions of innovation. One is the deep drill-down into a single case-study, where you hope the reader will take the points you're making and extrapolate them out. His book on Joseph Priestley and the 'invention' of oxygen is such an example. The second is to go wide, and try to categorise millennia or centuries of innovation and look for trends (as Johnson does here, looking at the speeding up of innovation before and after the establishment of cities, and looking at the last 500 years of invention in terms of individual vs. collaborative/distributed invention, and market vs. open contexts).

The book is subtitled 'The natural history of innovation', and Johnson likes to reach down into the primordial soup* of neurons and evolution to draw parallels to human innovation. Darwin is his key motif; his musings on coral atolls open the book, and Johnson returns to him frequently, on topics like slow hunches, serendipity and error. The thing is, I know my Gould. I've got a bit of a grip on Darwin. And recently, Nick Lane has thoroughly schooled me on evolution. While it's always a satisfying feeling to play 'snap' with an author (ah hah! I know that reference. I see your observation, and raise you this contradictory hypothesis!), I felt like all this bolstering was (a) a bit of a stretch - are neuron networks really like well-planned open offices? - and (b) lightweight compared to the reams and reams and reams of information that are out there - Johnson's few pages on the advantages of sexual reproduction compared to Lane's chapter, for example.

As a result, a lot of the book felt familiar, and often a little thin compared to richer examples of many of the topics that I've read elsewhere. Perhaps ironically then, the one theme I wish Johnson had focused on in more depth - perhaps even written about exclusively - was that of the commonplace book, the tradition of writers and thinkers keeping notebooks full of passages and quotations, mixed in with observations and recordings of their own. The commonplace book lay somewhere between self-help book, memory tool, encyclopedia, and meditative device. (Montaigne kept one, naturally, and also had his favourite passages inscribed on the rafters of his library.)

The English philosopher and physician John Locke started his first commonplace book in 1652, during his first year at Oxford. Over time he developed a method for indexing these books, a method he felt sufficiently important to warrant writing up as an appendix to his major work, 'An Essay Concerning Human Understanding'. Only two pages in each commonplace book were reserved for the index, which nonetheless had to swell to encompass whatever he transcribed into it, so he structured his index this way:

When I meet with any thing, that I think fit to put into my common place book, I first find a proper head. Suppose for example that the head be EPISTOLA, I look unto the index for the first letter and the following vowel which in this instance are E.i. If in the space marked E.i. there is any number that directs me to the page designed for words that begin with an E and whose first vowel after the initial letter is I, I must then write under the word Epistola in that page what I have to remark.

Even I recognise the 'if X, then Y' pattern of storing and retrieving data going on here. The commonplace book both stores information and allows one to retrieve it, and in the process of doing so, revisit and enrich the ideas one has already had.
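Locke's scheme is simple enough to sketch in code. Here's a minimal, hypothetical Python rendering of it (function names and example entries are mine, not Locke's): the index key is the head word's first letter plus the first vowel that follows it, and filing an entry follows exactly the 'if a slot for this key exists, add to it; otherwise create it' logic he describes.

```python
index = {}  # index key -> list of (head word, page) entries

def locke_head_key(head):
    """Index key after Locke's scheme: the head word's first letter
    plus the first vowel that follows it. 'Epistola' -> 'E.I.'"""
    head = head.upper()
    first = head[0]
    for ch in head[1:]:
        if ch in "AEIOU":
            return f"{first}.{ch}."
    return f"{first}."  # no vowel after the initial letter

def file_entry(head, page):
    """If a slot for this key already exists, append to it;
    otherwise create it - Locke's 'if X, then Y' lookup."""
    index.setdefault(locke_head_key(head), []).append((head, page))

file_entry("Epistola", 14)
file_entry("Episcopus", 14)   # same key: E + first following vowel I
print(locke_head_key("Epistola"))  # E.I.
```

Both heads land in the same two-character pigeonhole, which is the point: two index pages can cover an arbitrarily fat notebook, at the cost of scanning a short list once you get there.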

From Locke, Joseph Priestley, and Erasmus Darwin, Johnson steps through to 'Enquire Within Upon Everything', a hugely popular Victorian how-to guide for everything from making flowers in wax to burying relatives. Tim Berners-Lee named the first iteration of what would become the World Wide Web 'Enquire' after a copy of this manual he remembered from his childhood.

Johnson himself uses DevonThink, an app that stores and searches texts, and whose search algorithm forges relationships between them. It's interesting to see him recount using DevonThink as a writing tool:

I write a paragraph about something - let's say about the human brain's remarkable facility for interpreting facial expressions. I then plug that paragraph into the software, and ask DevonThink to find other passages in my archive that are similar. Instantly, a list of quotes appears on my screen ... Invariably, one or two of these triggers a new association in my head ... and so I select that quote, and ask the software to find a new batch of passages similar to it. Before long, a larger idea takes shape in my head, built upon the trail of association the machine has assembled for me.
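Johnson doesn't explain how DevonThink's algorithm actually works, but the core move he describes - ranking your stored passages by similarity to the paragraph you just wrote - can be sketched with a toy bag-of-words cosine similarity. This is a stand-in of my own devising, not DevonThink's method, and the archive passages are invented:

```python
import math
from collections import Counter

def bag(text):
    """Crude bag-of-words: lowercase, keep only alphabetic tokens."""
    return Counter(w for w in text.lower().split() if w.isalpha())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def most_similar(query, archive, n=2):
    """Rank archived passages by similarity to the query paragraph."""
    q = bag(query)
    return sorted(archive, key=lambda p: cosine(q, bag(p)), reverse=True)[:n]

archive = [
    "the brain reads facial expressions with remarkable speed",
    "coral reefs nurture a vast diversity of species",
    "neurons form networks that reshape themselves over time",
]
# the facial-expressions passage ranks first for this query
print(most_similar("how the human brain interprets facial expressions", archive, n=1))
```

Each retrieved passage can itself become the next query, which is exactly the trail of association Johnson describes the machine assembling for him.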

[Apropos this, I found a lovely article over summer called The Theatre of Memory, about libraries as more than data storage, you should totally read it.]

So all up: I think this might be one of those reviews that means you don't need to read the book yourself, unless you have a burning desire to do so. Instead, go out and read Nick Lane, and Stephen Jay Gould, and all sorts of other people writing about all sorts of other good things - that's where the good ideas come from.

*I read an essay recently by Michael Chabon where he talked about how he'll get obsessed with a word, and for days on end he'll have to fit 'monkeys' or 'washer' into conversation over and over again. He said this slipped into his published writing - an unusual word, say 'shrivelled', will appear twenty times in a single book, and occasionally get picked up by readers. Someone needed to go through 'Where good ideas come from' and edit out every second instance of the phrase 'primordial soup'.


staplegun said...

Whoa there Nellie!

Firstly, thanks for taking the time to read a book you didn't really want to, and then for taking the time to summarise it too - awfully good of you to go well beyond the call of duty methinks :)

Now, may I pose an off-topic question?

You said "a lot of the book felt familiar, and often a little thin compared to richer examples of many of the topics that I've read elsewhere" - um, I'm kinda wondering if your head is so full of science now that you can't read this kind of book (summary overview of a topic) from a layperson's point of view anymore? To me, from your summary, it looks like a perfectly reasonable book and topic coverage??

I'm not saying it's a bad thing, but just pondering out loud... After you build up enough topic expertise, when do you stop reviewing books as a fellow member of the public and begin critiquing them more as a learned, scientist-type person?

I like to think I have some success in explaining complex things to people not working in my field, but I constantly bump into gaps in people's knowledge I had accidentally taken for granted.

I see two ways around this:

1. Usability testing - try it out, watch for when the eyes glaze over, adjust that part

2. Ignore it - people get to understand whether the reviewer is on their level, or some other level that is easier to just ignore.

Is it possible to be 'objective'? Is that ever possible?

Courtney Johnston said...

I guess it's possible to be self-aware of your subjectivity. I certainly felt subjective reading this book - I knew why it felt thin to me.

The thinness is another question though. We talk often about content being 'repurposed', but we don't talk much about why, or what the benefits are (beyond not having to 'purpose' new content, I suppose). When online material that I'm familiar with (if even by osmosis) gets repurposed as a book, I guess I'm bound to feel jaded. But when scientific research gets repurposed as a pop science book, the author is acting as an intermediary between me and information I can't access (or probably wouldn't understand if I did access it). So I wonder if professional scientists read pop science books in their own field. Interesting.

On another point, I read an observation the other day - somewhere - that authors are subject experts, and editors are reader experts. Or, in the context you're offering, usability experts for text.

staplegun said...

I like that: author = subject expert, editor = reader expert.

I guess the problem here is the editor can't know whether the author has done the subject justice - just whether it is comprehensible to readers. Unless editors also have a role of reader 'advocate', in which case ideally they would be seeking second opinions on the author's subject expertise (a la peer review).

Errr, looks like I might need an editor to make that paragraph comprehensible!

If experts read pop science, I imagine it is either to scoff, or to become, as you say, self-aware (seeing how less expert people see their world). This second purpose would feed into the usability testing loop I mentioned.

I imagine (hope) a lot of experts read pop books as usually their work is all about research, and these books potentially have new insights (though they probably just skim-read). Well, that's what I do in my field of expertise anyway. :-)