Sunday 20 September 2015

#ISIR15 Last day


Many talking points from yesterday, and I hope to get more of the presentations to send on to you later. However, the morning begins with a look at a very severe problem. Given that intelligence is so important, why is the public debate three or four decades behind the times?

Alice Dreger kicks off with Understanding Science Journalists and Why They Misunderstand You. Alice is an historian of science, science writer, and patient rights activist, and will draw several cautionary tales from her latest book, Galileo’s Middle Finger: Heretics, Activists, and the Search for Justice in Science.

For many scientists whose work touches on identity politics, dealing with science journalists can be a fraught endeavor. Work with them, and you risk being misrepresented. Decline to work with them . . . and you risk being misrepresented. This lecture will draw on the speaker's 20 years' experience working with and within science journalism to explore how the field has changed in the last two decades and how scientists can protect themselves today in the media.

She has a very good title, and she helps us understand our predicament: we have so many new and interesting results, but the concepts "intelligence" and particularly "IQ" are toxic. As Pinker tweeted: "Irony: Replicability crisis in psych DOESN'T apply to IQ: huge n's, replicable results. But people hate the message."

As a researcher and explainer I have personal experience of wading through so many misconceptions, and sometimes politely veiled disdain, that it is hard to stay absolutely calm and simply present the results.

But I am locked into my version of the conference bubble. Has there been any reporting on the conference, any at all? If there has been, can you send me some links?


  1. I did read about the conference at Steve Sailer's site. Not much information elsewhere, and I often search Google News for 'IQ' - perhaps I need to get out more.

    Even working as an Educational Psychologist, I am often struck by my colleagues' lack of interest in a lot of this material - and these are people who do cognitive testing for a living!

    1. Agree. They are often badly informed. They rely on a few key references to write their reports, backing up particular viewpoints. Lawyerly, not scholarly, as Buz Hunt says.