Neuromyths and the future of climate education: interview with Paul Howard-Jones

14 November, 2025

Ahead of the Blackham Lecture on 19 November, we spoke with Paul Howard-Jones, Professor of Neuroscience and Education at the University of Bristol, about the themes of his upcoming talk.


Your lecture is titled ‘Neuromyths: classrooms, culture, and climate change’. To start, could you explain what a ‘neuromyth’ is?

I think, these days, the term ‘neuromyth’ refers to a misunderstanding about the brain that’s popular in the public domain and also in education. Although the term is now commonly used in education, it was actually a neurosurgeon – Alan Crockard – who has first claim on it. He used it in his lectures and written articles in the 1980s to describe a misleading type of ‘received wisdom’ within medicine.

Could you share an example of a persistent neuromyth, and explain why it’s scientifically incorrect?

Learning Styles is undoubtedly the most prevalent and persistent neuromyth. More than 90% of teachers across the world believe that, because we process different sensory modalities in different parts of our brain, there is some benefit in identifying whether someone is a visual, auditory, or kinaesthetic learner and teaching to their preferred learning style.

But actually neuroscience doesn’t support this. Although some regions are more associated with one sense than another, the brain is just too interconnected for this idea to make sense. And neither psychology nor education can provide any convincing evidence for it either. Yes, we have preferred learning styles – and preferences can be somewhat stable over time – but there’s no advantage to teaching to them, and there’s possibly even a disadvantage. Presenting information in different forms – often making use of our different senses – can benefit all learners. We shouldn’t just stick to one style based on a self-report survey.

Why are we so susceptible to neuromyths, and what do they tell us about our society’s hopes and anxieties?

Wishful thinking plays a big role. We want to believe what we want to believe. And many neuromyths provide simple and attractive ways of thinking about things – and sometimes even of doing things. Neuromyths can be appealing because they appear to offer simple solutions. Teachers know, correctly, that adapting their teaching to the personal needs of students can be important – but it’s often time-consuming and difficult – because learners and their needs are complex. Teaching to learning styles appears to address this challenge of personalising learning in a way that’s easy to grasp and easy to do. But it just doesn’t work.

There are many other factors that make us susceptible. Neuroscience has an ‘allure’ – a sense of concrete authority and sparkle. That probably helps myths that appear to have some neuroscience behind them to proliferate further and faster. Then there is this cultural gap between the fields of neuroscience and education – each with its own concepts, aims, and language. That gap provides the space for myths to flourish without being scrutinised and challenged.

What is the tangible harm of neuromyths when they influence classroom practice and educational policy?

At best neuromyths are just a distraction. They absorb the time, energy, and money of educators – and they can even make their way into professional development – requiring teachers to spend their in-service training (INSET) time learning nonsensical rubbish rather than stuff which is really useful.

But they can also influence practice in negative ways, by displacing good practice – for example, teachers can believe they have differentiated for different types of learner using learning styles, when they haven’t – or more directly by encouraging practices that undermine learning. It has been shown, for example, in controlled psychological experiments, that teaching someone in their preferred learning style can actually reduce how much they learn.

Some misunderstandings (like ‘the myth of three’ – the idea that the brain has completed its development by three years old) can find their way into policy too. That myth probably contributed to some of the higher education funding cuts that occurred in the 1990s. While the importance of early years education shouldn’t be underestimated (which it still is in many countries), that doesn’t mean that adolescence isn’t a special time for learning too. It is – but just in different ways – because different parts of the brain develop at different times.

Your lecture connects neuromyths to climate change, one of the great challenges of our time. How do misunderstandings about the brain jeopardise effective climate change education?

Climate change education is about empowering action – and that has a lot to do with fostering and balancing emotions like hope and concern. But education doesn’t have a great history of incorporating emotions in learning, and it’s an area where there are a lot of misunderstandings. One of the most common is the idea that all stress is bad for the brain and bad for learning. Actually, positive stress – a moderate level of stress without significant threat – can help motivate you to achieve your goals, help you learn to cope with stress more generally, and can benefit your health.

You highlight misunderstandings about the ‘emotional brain’. What myth about emotion and rationality hampers our ability to foster pro-environmental action?

I would say that there’s a general myth that emotions are bad news for learning and reasoning – and that’s a huge barrier for effectively fostering pro-environmental action in schools, and also amongst adults. You hear people talking about things like the ‘reptilian brain’ – like we have an old part of our brain (in evolutionary terms) that’s always threatening to take over our rationality and learning. In actual fact, emotions are important for learning and vital for bringing about change in attitudes and behaviour. I’ve found this in my work with schools but also with the energy sector, in its attempts to encourage customers towards low carbon heating.

Would you call yourself a humanist, and which of Humanists UK’s campaigns is closest to your heart?

Yes, I would describe myself as a humanist. Even though I have a respect for religion – my father was an Anglican priest – I believe its dogmatic forms should always be challenged, and that humanism provides a suitable ethical and philosophical basis to do that. I would say Humanists UK’s campaign on ‘science, evolution, and creationism’ is closest to my heart. Evolution provides a framework for deriving meaning from biology – including the brain – and so it’s really important that people understand and accept the evidence that demonstrates its veracity.

Finally, what is the single most important idea you hope people take away from your Blackham Lecture, and how will it empower them to think or act differently?

Educators and communicators need to start considering emotions more thoughtfully and scientifically – or we’ll never provide the climate education our planet needs.