Thursday 18 July 2013

The enchanted loom


Brains present a problem of scale. How small does one have to go to understand what is going on? 20 micrometres? 25 nanometres? A question of this sort generates a behaviour known as “a furtive wiki peek”. A micrometre is a millionth of a metre, a nanometre a billionth of a metre. Therefore there are 1,000 nanometres in a micrometre. A metre is a metal bar in Paris, probably France’s last remaining intellectual asset.
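For anyone who would rather compute than wiki-peek, the conversions above can be checked in a few lines of Python (a throwaway sketch, working in whole nanometres so the arithmetic is exact):

```python
# Lengths from the text, expressed in nanometres.
METRE = 10**9        # a nanometre is a billionth of a metre
MICROMETRE = 10**3   # 1,000 nanometres in a micrometre

print(METRE // MICROMETRE)  # 1000000 -> a micrometre is a millionth of a metre
print(MICROMETRE)           # 1000 nanometres in a micrometre, as claimed
```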

The deeper you dig into the brain, the higher the mountain of data you throw up behind you. Worse, you are not just mapping a dead city: your main interest is recording all the traffic, the cars, pedestrian flows, telephone messages, emails, and conversations in the street, perhaps even glances. Lots of data. Mounds of it. Terabytes. Yottabytes. OK, start with the familiar kilobyte, which has 1,000 bytes (1,024 in binary, but we will keep things simple because it is very hot in London). Then megabyte (1000²), gigabyte (1000³), terabyte (1000⁴), and so on upwards to yottabyte (1000⁸). This much data will take time to crunch.
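The ladder of decimal prefixes is just successive powers of 1000, which a short loop makes plain (a sketch, nothing more):

```python
# The decimal byte prefixes mentioned above, as powers of 1000.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]
for power, name in enumerate(prefixes, start=1):
    print(f"{name}byte = 1000^{power} = 10^{3 * power} bytes")
# terabyte comes out as 1000^4, yottabyte as 1000^8
```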

How do you record what is going on? Probe with a sharp needle? Slice and stain the brain? These musings are triggered by Alison Abbott, who has done a great job reviewing recent developments in brain research in Nature.

She arranges her thoughts into three headings: measuring, mapping and understanding.

Measuring one neurone with a probe was cutting-edge stuff in the 1970s. I can remember watching the Cambridge Psychology demonstrations with awe. Now a probe can record a couple of hundred neurones simultaneously. The newest silicon probes have 52 thin wires leading to 456 silicon electrodes, and can record from all layers of the brain simultaneously.

Mapping brain activity has usually been done by slicing the brain as thinly as possible, staining the slices to render the cells visible, and looking at them under a light microscope. Putting the slices together into a three-dimensional model is not trivial. Researchers took a decade to slice a brain into 7,400 layers, each 20 micrometres thick, and then spent 1,000 hours on two supercomputers to piece together the terabyte of data (finally, we have a reference measure). This revealed folds in the brain usually lost in two-dimensional cross-sections. Researchers now want to push on to 25 nanometres (one-thousandth of the thickness of an average cell). At that resolution you see “every damn little thing”. Yet another reference measure.
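A back-of-the-envelope check on the slicing figures is reassuring (the comparison to roughly 15 cm for one dimension of a human brain is my own rough yardstick, not a figure from the article):

```python
# Sanity check on the slicing numbers from the text.
layers = 7_400
thickness_micrometres = 20

total_micrometres = layers * thickness_micrometres
total_cm = total_micrometres / 10_000  # 10,000 micrometres per centimetre
print(total_cm)  # 14.8 cm, plausibly one dimension of a whole brain
```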

Understanding what the brain is doing is the most daunting part. One cubic millimetre of brain tissue, using the newest techniques, will generate 2,000 terabytes of data. A human brain would generate 200 exabytes (1000⁶ bytes apiece). This is a lot to handle. Thirty seconds of brain activity will generate more data than everything sent back by the Hubble telescope.
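To see those figures in a single unit, a quick conversion (a sketch using the decimal prefixes defined earlier):

```python
# Putting the data figures from the text into terabytes.
terabyte = 1000**4
exabyte = 1000**6

whole_brain = 200 * exabyte
print(whole_brain // terabyte)  # 200000000 terabytes for a whole brain
```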

At the moment, we are not all that well-placed to understand how the brain does the things it does. So, when you hear about the relationship between brain and behaviour, and between brain regions and intelligence, please understand that we are not yet at 25 nanometres, either in measuring, mapping or understanding.


  1. Sometimes it's possible to get data on too small a scale, so that you then have to consolidate them to find the pattern/behaviour/regularity that you were searching for.

    I take it that knowledge of the brain is still so slight that nobody knows the most fruitful scales to work on? In which case nearly everyone will press on to finer and finer scale in hopes of landing larger and larger research grants.

  2. Needs to be small scale if it is to work at all, otherwise the whole thing "doesn't compute" or so it is thought.