Processing speed is one of those terms that gets used in different ways, but it typically refers to performance on relatively simple tasks such as digit symbol substitution, simple and choice reaction time, letter cancelling, and visual and auditory inspection time. These tend to be repetitive tasks: straightforward to explain and to carry out, unlikely to be strongly influenced by school teaching or wider cultural factors, and somewhat clerical and dull compared to more demanding intellectual tasks like solving matrix puzzles, completing analogies or defining vocabulary words.
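To make these tasks concrete, a simple reaction-time trial is easy to sketch in code: wait an unpredictable interval, present a signal, and time the response. The toy below is my own illustration, not any standard laboratory protocol; the trial count, foreperiods and summary statistic are arbitrary choices.

```python
# A minimal sketch of a simple reaction-time task of the kind
# described above. All parameters are illustrative, not a
# standard protocol.
import random
import time

def run_trials(n_trials=5):
    """Run simple reaction-time trials; return times in milliseconds."""
    times = []
    for trial in range(n_trials):
        # Random foreperiod so the onset of the "stimulus" is unpredictable.
        time.sleep(random.uniform(1.0, 3.0))
        start = time.perf_counter()
        input("GO! Press Enter as fast as you can... ")
        elapsed_ms = (time.perf_counter() - start) * 1000
        times.append(elapsed_ms)
        print(f"Trial {trial + 1}: {elapsed_ms:.0f} ms")
    return times

if __name__ == "__main__":
    results = sorted(run_trials())
    # The median is less distorted by the occasional lapse than the mean.
    print(f"Median reaction time: {results[len(results) // 2]:.0f} ms")
```

Even this toy shows why such measures are attractive: there is nothing to teach, nothing cultural to know, and the score is a physical quantity in milliseconds.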
Of course, just because a task looks basic does not mean that it is fundamental. Nonetheless, it is tempting to believe that understanding these simple tasks could help us decode the factors involved in higher cognitive abilities. Processing tasks could give us an estimate of the fundamental “clock speed” of each person’s brain.
By analogy, if these tasks are the building blocks of all cognitive abilities, they would constitute a periodic table of cognition. We would rise above the confusion of multiple abilities and task-dependent skills and be able to specify the real ingredients of thought. Ian Deary bemoans the fact that cognitive psychologists are currently all working on their own, personal, and very disparate periodic tables, and hopes they can come together to get a better understanding of simple processing.
First steps first. Can processing speed explain anything? Journeying to Edinburgh for the conference, I speculated that processing is, almost by definition, an integral part of thinking. For example, is low-level processing required to work out whether one needs a passport to travel from London to Edinburgh? In the current political climate I think this probably counts as a high-level abstract problem. Conversely, estimating the time of departure from home should be a low-level problem, but on examination in my case it turned out to be another high-level problem. From the stated departure time you must subtract the delay imposed by security, the further delay involved in parking a car and then taking a shuttle bus to the terminal, and the further delay in driving to the airport imposed by a then-current Tube strike with its consequent overflow effect on car traffic; by recourse to high-level mathematics, you eventually prove that you should have left the previous day.
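For what it is worth, the "low-level" departure-time problem is just repeated subtraction of delays from a deadline. A sketch, with every figure invented for the sake of the joke:

```python
# Working back from a stated flight time by subtracting each
# anticipated delay. All times and delays are invented.
from datetime import datetime, timedelta

flight = datetime(2014, 4, 30, 9, 0)  # hypothetical 09:00 departure
delays = [
    timedelta(hours=3),   # check-in and security
    timedelta(hours=1),   # parking the car, shuttle bus to the terminal
    timedelta(hours=5),   # Tube-strike overflow onto the roads
    timedelta(hours=2),   # the drive to the airport itself
]

leave_home = flight - sum(delays, timedelta())
# With these figures the answer does indeed land on the previous day.
print(f"Leave home by: {leave_home:%A %H:%M}")  # Tuesday 22:00
```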
Low-level processing covers those mundane tasks which are integral to driving: recognising signals, responding to traffic conditions and noticing features of the environment. By analogy, perhaps measuring the speed of completing simple tasks will reveal the mental horsepower we can bring to bear on more complicated ones. Galton speculated that synaptic efficiency might explain why one individual is brighter than another, rather as the processing speed of a computer determines the degree of complexity it can cope with.
There are a number of problems with this view, not least that some notable researchers point out that it is a poor fit with the facts. Others, including a new wave of researchers, continue to find merit in the notion, and I will give pride of place to the most recent findings. Meanwhile, here is the programme of talks:
Expert Workshop on Processing Speed and Cognitive Ageing
Edinburgh, 30 April 2014
“Is the world too fast when we’re slowing down?”
Session 1: Getting Up to Speed on Slowing
Ian Deary: ‘10 Hard Questions about Processing Speed’
Patrick Rabbitt: ‘Reaction Times, Age, Intelligence and Memory’
Nicholas Mackintosh: ‘Correlations and Causes’
Paul Verhaeghen: ‘General Slowing Yields to Major Dissociations, and Other Small Victories from the Brinley Front’
Session 2: Quick Summaries of New Data
Geoff Der: ‘Does the Relationship Between Reaction Time and Intelligence Vary With Age?’
Stuart Ritchie: ‘Inspection Time and Fluid Intelligence in the Eighth Decade of Life’
Elliot Tucker-Drob: ‘Processing Speed and the Positive Manifold of Cognitive Ageing’
Discussant: Tim Croudace
Session 3: Axons and Alacrity
Mark Bastin: ‘What Does Diffusion MRI Tell Us about Relationships Between White Matter and Information Processing Speed?’
Rogier Kievit: ‘Processing Speed(s), White Matter Integrity and Fluid Intelligence: A Hierarchical Perspective’
Thomas Espeseth: ‘Processing Speed and its Components – Associations with Age and Indices of Brain White Matter Microstructure’
Discussant: James Thompson
Session 4: General Discussion
Chair: Ian Deary
I will begin by posting up Ian Deary’s introduction to the conference, which follows soon.
Interestingly, in a recent twin study, processing speed was found to be entirely dependent on g. So maybe it is a window into underlying mental horsepower?
Paper here
Thanks. It might be the window, but it is a pity there were not more "basic and physiological" measures of processing speed, like reaction time and inspection time. Those might give us the elusive building blocks. However, there are counter-arguments from Pat Rabbitt which I hope to post about later.
Am I wrong, or is females' processing speed far better than males'? :)
A paper by Paul Irwing found a substantial advantage for US women in processing speed.
http://www.sciencedirect.com/science/article/pii/S0191886911002212
Regarding the competing periodic tables idea: as a recent visitor to this blog I have not seen much focus on CHC (the Cattell-Horn-Carroll model). I write as a psychologist who does a lot of testing (so, practitioner rather than researcher), and I find the CHC model useful, particularly for understanding and helping students with learning disabilities. Any thoughts?
Thanks for the blog too - important for practitioners to try and 'keep up' with where the thinking and research in these areas are going.
The CHC model is implied in all discussions of g, so I don't bother to spell it out.
@James - I feel that IQ researchers have been and are exceptionally prone to what I term micro-specialization: developing hermetically sealed areas of expertise and enforcing extreme (but arbitrary) rigour within these domains.
This serves to keep IQ researchers out of trouble by preventing them from joining the dots - and it gives a plausibly deniable reason for suppressing research from those who do join the dots, i.e. that under a methodological microscope their work fails to meet the requisite standards of irrelevant precision.
Also, when a subject is micro-specialized, nothing ever gets *solved*; instead, ever more technical/methodological problems get raised, requiring ever more research projects, so there is no danger of running out of reasons to get grants.
But micro-specialization is not a minor quibble about the conduct of science - it is a major, lethal pathology of science.
(continued)
I wrote about micro-specialization here http://corruption-of-science.blogspot.co.uk/ and the main relevant section is:
Micro-specialization and the infinite perpetuation of error

Science, real science, is itself a specialization of philosophy, after which science itself specialized – at first into physical and natural sciences, and then into ever-finer divisions.
Scientific specialization is generally supposed to benefit the precision and validity of knowledge within specializations, but at the cost of those specializations becoming narrower and of a loss of integration between them.
In other words, as specialization proceeds, people supposedly know more and more about less and less - the benefit being presumed to be more knowledge within each domain; the cost that no single person has a general understanding.
*
However, I think that there is no benefit, but instead harm, from specialization beyond a certain point – an imprecise but long-since-passed point. Nowadays, people do not really know more, even within their specialization – often they know nothing valid at all; almost everything they think they know is wrong, because it is undercut by fundamental errors intrinsic to, and yet invisible within, that specialty.
We are now in an era of micro-specialization, with dozens of subdivisions within sciences. Biology, for example, has fragmented into biochemistry, molecular biology, genetics, neuroscience, anatomy, physiology, pharmacology, cell biology, marine biology, ecology...
*
...
*
In the world of micro-specialization that is a modern scientific career, each specialist's attention is focused on technical minutiae and the application of conventional proxy measures and operational definitions. Most day-to-day research-related discussion (when it is not about fund-raising) is troubleshooting – getting techniques and machines to work, managing personnel and coordinating projects... Specific micro-specialist fields are built around specific methodologies, for no better ultimate reason than that 'everybody else' does the same, and (lacking any real validity to their activities) there must be some kind of arbitrary 'standard' against which people are judged for career purposes (judging people by the real scientific criterion of discovering truths is of course not done).
('Everybody else' here means the cartel of dominant Big Science researchers who control peer review - appointments, promotions, grants, publications etc. - in that micro-speciality.)
Thus, micro-specialists are ultimately technicians and/or bureaucrats; as such, they cannot even understand fatal objections and comprehensive refutations of their standard paradigms when these originate from adjacent areas of science. So long as their own specific technique has been conducted according to prevailing micro-specialist professional practice, they equate the outcome with 'truth' and assume its validity and intrinsic value...
*
If we then combine this situation with the prevalent professional research notion that only micro-specialists are competent to evaluate the domain of their micro-speciality – and add in the continual fragmentation of research into ever-smaller micro-specialties – then we have a recipe for permanent and intractable error.
*
Vast and exponentially growing scientific enterprises have consumed enormous resources without yielding any substantive progress at the level of in-your-face, common-sense evaluation; the phenomenon has continued for time-spans of whole generations, and there is no end in sight (short of the collapse of science-as-a-whole).
According to the analysts of classical science, science was supposed to be uniquely self-correcting. In practice, now, thanks in part to micro-specialization, it is not self-correcting at all – except at the trivial and misleadingly reassuring level of micro-defined technical glitches and slip-ups.
I agree psychology has become Balkanised, with the associated small unstable principalities and internecine warfare. However, that might be due to a lack of an agreed grand theory. Physics got round the problem by some deep thinking and good experimentation; behaviour has proved harder to pin down in that way. Also, much of lab science has become a cottage business, almost like individual restaurants. Understandable. Attempts to bring in proper processes have foundered; they don't even work in Pharma. About half the work done in labs is private, furtive, and interesting. Curiosity is often messy.
There are different meanings of processing speed, and people present it in different ways. Put simply, processing speed is the speed at which we carry out a particular piece of work. It also reflects our skills and strategies, and how much effort we are putting in to achieve our goals.
Performance Coach