Psychology

Alchemical futures

Sometimes, I’m lucky enough to have the chance to teach a short course about science fiction to a group of 17-year-olds. I’m always intrigued to find out what ‘science fiction’ means to them. This week, kicking off, one lad went straight for super-powers. As it happens, I’ve never had this answer before, but what made me take note was how well he explained what he meant, quickly but thoughtfully: science fiction giving us access to other possible worlds, possible futures … what if … maybe … perhaps … one day … then I could … dream that … build that … I should add, he was the same student who homed in on science fiction and dystopian futures, so he wasn’t sitting there being idly optimistic.

I went through a phase in my teens of reading lots of Jung and, a little later, Freud, considering medicine and a future as a psychiatrist or psychoanalyst. I still have many of the books I bought then. Jung led me off on curious paths. Alchemy was in there, of course, and has endured as an interest — morphing along the way. I went off certain Jungians at some deep level after a conference (held in Windsor Great Park!), which struck my 18-year-old self as pretty bonkers and anti-science, and I used to get my Jungian books from a very odd bookshop in the middle of nowhere (deep, rural Gloucestershire) which the friends I persuaded to come along (or give me a lift there) ended up calling ‘the magic bookshop’. New Age, though we didn’t know it.

But alchemy’s never gone away. It couldn’t, could it? I loved that Royal Institution talk I went to back in 2006, ‘Alchemy, the occult beginnings of science: Paracelsus, John Dee and Isaac Newton’. The dream of a very special super-power, transforming both matter (world) and self.

Alchemy, originally derived from the Ancient Greek word khemia (Χημεία - in Modern Greek) meaning "art of transmuting metals", later arabicized as the Arabic word al-kimia (الكيمياء, ALA-LC: al-kīmiyā’), is both a philosophy and an ancient practice focused on the attempt to change base metals into gold, investigating the preparation of the “elixir of longevity”, and achieving ultimate wisdom, involving the improvement of the alchemist as well as the making of several substances described as possessing unusual properties. The practical aspect of alchemy can be viewed as a protoscience, having generated the basics of modern inorganic chemistry, namely concerning procedures, equipment and the identification and use of many current substances.

Alchemy has been practiced in ancient Egypt, Mesopotamia (modern Iraq), India, Persia (modern Iran), China, Japan, Korea, the classical Greco-Roman world, the medieval Islamic world, and then medieval Europe up to the 20th and 21st centuries, in a complex network of schools and philosophical systems spanning at least 2,500 years.  Wikipedia

And given a background in zoology and theology, I’ve not been able to get this out of my head since stumbling across it the other week:

Once, he called himself a “biologian”, merging the subject matter of life with the method of a theologian. More recently, he told me that he is an alchemist. In Defense of the Memory Theater

Isn’t that great? What a way to think of what you’re engaged on. The work.

It is, by the way, well worth reading all of Nathan Schneider’s post about his uncle, the “alchemist”:

The most remarkable memory theater I’ve ever known is on a computer. It is the work of my uncle, once a biologist at the National Institutes of Health, a designer of fish farms, a nonprofit idealist, and a carpenter. Now he has devoted himself full-time to his theater … [a] single, searchable, integrated organism. When he tells me about it, he uses evolutionary metaphors cribbed from his years researching genetics. The creature mutates and adapts. It learns and grows.


‘We run videogames in our heads’


It was a very great pleasure to welcome James Paul Gee to talk at school, shortly before we broke for half-term. James spent an hour in conversation with our students, examining what games and learning have to do with each other. He was in the UK to speak at Handheld Learning 2009; his talk from there is online.

At the heart of both talks, besides his zest for life and learning and his passionate engagement with his subject, is the critically important idea of situated meanings and their role in learning: ‘Comprehension is grounded in perceptual simulations [of experience] that prepare agents for situated action’ — Barsalou (1999).

Some photos of slides James used at St Paul’s (which illustrate what he means when he says, around 5m 50s into his Handheld Learning talk, ‘Our schools don’t use the best principles we know about learning, but our popular culture does’):

[Photos of three slides: James Paul Gee]

Many students who came to hear James talk had read Steven Johnson’s Everything Bad is Good for You (2005) and will have recalled Steven’s discussion of James’s thinking. Here’s Steven on ‘probing’, that process in learning to play a videogame where the player ‘probe[s] the depths of the game’s logic to make sense of it’ — exploring the rules, yes, but also something subtler and more complex, ‘the physics of the virtual world’:

The games scholar James Paul Gee breaks probing down into a four-part process, which he calls the “probe, hypothesise, reprobe, rethink” cycle:

  1. The player must probe the virtual world (which involves looking around the current environment, clicking on something, or engaging in a certain action).
  2. Based on reflection while probing and afterward, the player must form a hypothesis about what something (a text, object, artefact, event, or action) might mean in a usefully situated way.
  3. The player reprobes the world with that hypothesis in mind, seeing what effect he or she gets.
  4. The player treats this effect as feedback from the world and accepts or rethinks his or her original hypothesis.

Put another way: When gamers interact with these environments, they are learning the basic procedure of the scientific method.
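That cycle reads almost like pseudocode. Here, as a toy illustration (mine, not Gee’s or Johnson’s; the ‘door world’ and every name in it are invented for the example), is the probe, hypothesise, reprobe, rethink loop in Python:

```python
# A toy rendering of the probe / hypothesise / reprobe / rethink cycle.
# Illustrative sketch only: the 'world' and all names are hypothetical.

def play(world, actions):
    """Cycle through probe -> hypothesise -> reprobe -> accept-or-rethink
    until an action is found that opens the door."""
    for action in actions:
        effect = world(action)              # 1. probe the virtual world
        hypothesis = (action, effect)       # 2. form a hypothesis about it
        effect_again = world(action)        # 3. reprobe with it in mind
        if effect == effect_again == "door opens":
            return hypothesis               # 4. feedback confirms: accept
        # 4. feedback disconfirms: rethink, loop and probe again
    return None

def door_world(action):
    """A tiny deterministic world: only pushing opens the door."""
    return "door opens" if action == "push" else "nothing"

print(play(door_world, ["pull", "kick", "push"]))
# -> ('push', 'door opens')
```

The content is trivial; the shape of the loop (act, guess, test the guess, let the world’s feedback decide) is the point.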

It might be useful to summarise here James’s six headline slides from his Handheld Learning talk about what characterises videogames: an experience of being simultaneously inside and outside a system; situated meanings; action-orientated tasks; lucidly functional language; modding; passionate affinity groups. From his talk to us, some points I jotted down:

  • 700 games design courses have started in US universities in the last six years.
  • “We’re a profoundly contradictory people”: we worry about violence and videogames, and GTA is put in the spotlight, yet a very violent game like Postal goes largely unnoticed and America’s Army is free — funded with taxpayers’ money! (James talks about America’s Army here.)
  • Games are not like books: Doom has a poor story (and graphics), but very good mechanics and mechanics really matter in our appreciation of a game. Warren Spector thinks story is very important to games. The creator of Doom doesn’t. Of course, if it’s got good mechanics and a good story …
  • The modern world handles knowledge distinctively, working with large, broad, cross-disciplinary themes.
  • If education is only about standard skills, it will only get you a job with standard skills (probably off-shore). In the US and UK, three-fifths of workers are in the service industries.
  • Success at school may square with the job you get, but it doesn’t predict how well you’ll do in your job.
  • Games are about problem-solving. Our problems are now all complex ones — complexity and complex systems interacting. You must be able to think way beyond standard skills.
  • Cross-functional teams, a feature of games such as World of Warcraft, require very high order skills — greatly valued in high-tech firms. Working in such teams is exceedingly intense and demanding.
  • A game like Portal creates an embodied feel for physics and provides continuous assessment of your knowledge (performance). The game itself guides the experience.

  • Good games make you feel smarter than you are. Play first, learn later (situated meanings). Where school fails is when it’s like a bunch of manuals without the games — and that’s also a very good way to make the poor look stupid.
  • Yu-Gi-Oh cards and their associated ecosystem are a striking example of geeking out with passion. James showed us a card he’d taken from a seven-year-old — who understood it completely (complex, technical language made lucidly functional by being married to action in the game) and explained it to him.

  • Modding: not only ‘How can I use what this game design has given me to my best advantage?’, but also ‘How can I improve/develop this?’

  • As Will Wright said, his games designers can make better stuff than 90% of players — but not the other 10%.
  • Recommendations: Half-Life; Deus Ex (1); System Shock; Flower (PS3); Braid. My colleague, Olly Rokison, chipped in with Fable 2.

Here’s an interview with James from Gamezone, 2007:

What is it specifically about video games that help people learn?  Does it have more to do with the gameplay than the story, the visual content or the characters?

My book covers 36 good learning principles built into good games like System Shock 2, Rise of Nations, Arcanum, or even Tomb Raider: The Last Revelation.  But there are many more.  Let me just give a few examples.  First, humans are terrible at learning when you give them lots and lots of verbal information ahead of time out of any context where it can be applied.  Games give verbal information “just in time” when and where it can be used and “on demand” as the player realizes he or she needs it.

Second, good games stay inside, but at the outer edge of the player’s growing competence, feeling challenging, but “doable.”  This creates a sense of pleasurable frustration.  Third, good games create what’s been called a “cycle of expertise” by giving players well-designed problems on the basis of which they can form good strategies, letting them practice these enough to routinize them, then throwing a new problem at them that forces them to undo their now routinized skills and think again before achieving, through more practice, a new and higher routinized set of skills.  Good games repeat this cycle again and again—it’s the process by which experts are produced in any domain.

Final example: good games solve the motivation problem by what I think is an actual biological effect.  When you operate a game character, you are manipulating something at a distance (a virtual distance, in this case), much like operating a robot at a distance, but in a much more fine-grained way.  This makes humans feel that their bodies and minds have actually been expanded into or entered that distant space.  Good games use this effect by attaching a virtual identity to this expanded self that the player begins to care about in a powerful way.  This identity can then become a hook for freeing people up to think and learn in new ways, including learning, or at least thinking about, new values, belief systems, and world views, as the Army realized in building America’s Army.  If you stick with it, The Elder Scrolls 3: Morrowind does this brilliantly and people play the game very differently depending on the different ways in which they have invested in their character.  We would do better at teaching science in school if kids really invested in a scientist identity.  But you have to make it happen, you can’t just say “pretend.”

You can read a recent paper written by James and Elizabeth Hayes, his wife, here: Public Pedagogy through Video Games.

‘Passionate affinity groups’. That stays in my mind when I’m thinking about school and how education works, doesn’t work … and is changing. Here’s James’s slide about the qualities these groups exhibit, from his Handheld Learning talk:

[Slide: qualities of passionate affinity groups]



We are all Bayesians now

Intent on not being late for an evening session at Tinker.it! last week, I dropped by Bunhill Fields for too short a time, the light beginning to fail and a hurriedly printed-off, crumpled map for a guide.


Easy to find the memorials to Blake and his wife and Defoe. But I was on a quest for Thomas Bayes:

Bayes, Thomas (b. 1702, London - d. 1761, Tunbridge Wells, Kent), mathematician who first used probability inductively and established a mathematical basis for probability inference (a means of calculating, from the number of times an event has not occurred, the probability that it will occur in future trials). He set down his findings on probability in "Essay Towards Solving a Problem in the Doctrine of Chances" (1763), published posthumously in the Philosophical Transactions of the Royal Society of London.

It took me too long to find his resting place, railed off and not in a great state of repair, and my rushed photos weren’t worth posting, but here’s one from the ISBA site (taken by Professor Tony O'Hagan of Sheffield University and seemingly not in copyright):

[Photo: Thomas Bayes’s tomb]

The famous essay is online (PDF).

I need to spend more time in and around Bunhill Fields, but what prompted me to try to take it in as I sped across London was reading in Chris Frith’s book, Making up the Mind, how important Bayes is to neuroscience:

… is it possible to measure prior beliefs and changes in beliefs? … The importance of Bayes’ theorem is that it provides a very precise measure of how much a new piece of evidence should make us change our ideas about the world. Bayes’ theorem provides a yardstick by which we can judge whether we are using new evidence appropriately. This leads to the concept of the ideal Bayesian observer: a mythical being who always uses evidence in the best possible way. … Our brains are ideal observers when making use of the evidence from our senses. For example, one problem our brain has to solve is how to combine evidence from our different senses. … When combining this evidence, our brain behaves just like an ideal Bayesian observer. Weak evidence is ignored; strong evidence is emphasised. … But there is another aspect of Bayes’ theorem that is even more important for our understanding of how the brain works. … on the basis of its belief about the world, my brain can predict the pattern of activity that should be detected by my eyes, ears and other senses … So what happens if there is an error in this prediction? These errors are very important because my brain can use them to update its belief about the world and create a better belief … Once this update has occurred, my brain has a new belief about the world and it can repeat the process. It makes another prediction about the patterns of activity that should be detected by my senses. Each time my brain goes round this loop the prediction error will get smaller. Once the error is sufficiently small, my brain “knows” what is out there. And this all happens so rapidly that I have no awareness of this complex process. … my brain never rests from this endless round of prediction and updating.

… our brain is a Bayesian machine that discovers what is in the world by making predictions and searching for the causes of sensations.
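Frith’s ‘ideal Bayesian observer’ is Bayes’ theorem, P(belief | evidence) = P(evidence | belief) × P(belief) / P(evidence), applied over and over, with each posterior fed back in as the next prior. Here is a minimal sketch of that update loop (my illustration, not Frith’s; the probabilities are invented for the example):

```python
# A minimal sketch of Bayesian belief updating (illustrative numbers only).
# Each pass applies Bayes' theorem and feeds the posterior back in as the
# new prior: the loop Frith describes the brain running.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E): the belief in hypothesis H after seeing evidence E."""
    p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_evidence

belief = 0.5                  # start undecided about what is out there
for _ in range(5):            # five glimpses of moderately strong evidence
    belief = update(belief, p_e_given_h=0.8, p_e_given_not_h=0.2)
    print(round(belief, 3))   # 0.8, 0.941, 0.985, 0.996, 0.999
```

Weak evidence (likelihoods near 0.5) would barely move the belief; strong evidence moves it quickly, which is exactly the ‘weak evidence is ignored; strong evidence is emphasised’ behaviour Frith describes.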


Chthonic

If we dream of people we have long since forgotten or who have for long been dead, it is a sign that we have gone through a violent change within ourself and that the ground upon which we live has been completely turned over: so that the dead rise up and our antiquity becomes our modernity.

Reading Blind Pony’s To Die No More, the Nietzsche made me recall this:

Digging over at an outlying corner of the vegetable garden this afternoon, I unearthed a length of chain with a rusted spring clip on one end and recognised it as a goat tether I had last used twenty years ago or more. It was about a foot down under a retired compost heap and must have been buried, at least in part, by the ploughing of earthworms.

To discover my own life becoming archaeology like this was a shock. I had now lived my way into a timespan in which my own artefacts, tools or relics had become archaeological finds.

I was no longer digging up things from my own past metaphorically but literally.

From Roger Deakin.


Failure

I'm always fascinated by the way people talk about failure. Reminded of the subject by reading again James Dyson's famous remark, "Enjoy failure and learn from it. You can never learn from success" (at Dan Saffer's blog), here are some of my other favourite touchstone quotations and reference points on failure and its close relationship with learning, creativity and innovation. When we spend so much time training young people to jump through examiners' hoops, we ought to be very concerned about how we are also steering them away from taking risks — away from daring to fail, to be innovative and, yes, wrong. Effecting change in education that does something about this requires just as much visionary leadership and management as it does in business.

Failure is the rule rather than the exception, and every failure contains information. One of the most misleading lessons imparted by those who have reached their goal is that the ones who win are the ones who persevere. Not always. If you keep trying without learning why you failed, you'll probably fail again and again. Perseverance must be accompanied by the embrace of failure. Failure is what moves you forward. Listen to failure. Steve Wozniak

Tough task, to open a high-profile conference like Aula2006 (see this previous post for background) with a speech on "failure". But social software expert Clay Shirky dissected it carefully and out came an interesting insight: organizations that want to encourage innovation should focus on reducing the cost of failure rather than focusing on minimizing its likelihood, as most companies do today. LunchoverIP

"Getting good" at failure, however, doesn't mean creating anarchy out of organization. It means leaders -- not just on a podium at the annual meeting, but in the trenches, every day -- who create an environment safe for taking risks and who share stories of their own mistakes. It means bringing in outsiders unattached to a project's past. It means carving out time to reflect on failure, not just success. Perhaps most important, it means designing ways to measure performance that balance accountability with the freedom to make mistakes. People may fear failure, but they fear the consequences of it even more. "The performance culture really is in deep conflict with the learning culture," says Paul J. H. Schoemaker, CEO of consulting firm Decision Strategies International Inc. "It's an unusual executive who can balance these." BusinessWeek

Being set up for failure is to be set up for success. This week I plan to rejoice in my various failed trials and actions. I hope your week goes just as well for you too. John Maeda

Enlightened managers strive to be collaborative rather than controlling. Only through engaged conversations over time can managers create failure-tolerant work environments that invite innovation. This is not to say that a major achievement shouldn’t be applauded, or that repeated, avoidable mistakes should be tolerated. But astute managers mark the daily progress of small successes and failures with an evenhanded, open curiosity about the lessons learned and the next steps to take. Richard Farson

Dyson: There’s a famous Honda (NYSE:HMC) quote. I’ll get it slightly wrong, but in essence what it says is, “You’ve got to fail and then have the courage to overcome failure in order to succeed.” FastCompany.com

You once described the inventor's life as "one of failure." How so?
[Dyson:] I made 5,127 prototypes of my vacuum before I got it right. There were 5,126 failures. But I learned from each one. That's how I came up with a solution. So I don't mind failure. I've always thought that schoolchildren should be marked by the number of failures they've had. The child who tries strange things and experiences lots of failures to get there is probably more creative.

Not all failures lead to solutions, though. How do you fail constructively?
We're taught to do things the right way. But if you want to discover something that other people haven't, you need to do things the wrong way. Initiate a failure by doing something that's very silly, unthinkable, naughty, dangerous. Watching why that fails can take you on a completely different path. It's exciting, actually. To me, solving problems is a bit like a drug. You're on it, and you can't get off. I spent seven years on our washing machine [which has two drums, instead of one].
FastCompany.com

JP wrote something about failure recently and mentioned Esther Dyson's famous saying, 'Always make new mistakes'. (I have Esther Dyson's saying as a fridge magnet in both London and Wiltshire.) JP concluded:

Today, we are so enmeshed in blame cultures that organisations often get into Failure-Is-Not-An-Option syndrome. What happens in this syndrome is that people hide failure rather than prevent it, and over time that hiding culture gets deep into the organisation. This culminates in an even worse syndrome, The-Emperor’s-New-Clothes syndrome. Here, everyone knows that what they say is not true, yet no one does anything about it.

Without risk there is no learning. Without learning there is no life. We need to be careful about being too careful. 


Fame and Glory

At the weekend, the Observer Music Monthly published a column about YouTube videos of pop stars throwing tantrums.

This is Elton saying he makes music, not films, whilst his team and the film people stand around and endure the paddy. (There's an Olbermann special on another Elton flare-up, here.) Then, also behaving badly: Grace Jones; yet another Liam Gallagher moment (apparently in February this year and, this time, in front of his five-year-old); Preston (from Ordinary Boys) walking off Never Mind the Buzzcocks (Simon Amstell is merciless — who's behaving badly?); and Björk — 'my motherly instincts took over'.

But in a different league altogether is The Bee Gees meet Clive Anderson (1996). I've heard about this on and off over the years, but this is the first time I've seen it — and to see is to be amazed by Barry Gibb's reaction to Anderson's wind-ups. The best moment is the one the Observer picked out:

Barry Gibb: 'We used to be called Les Tossers.'

Anderson: 'You'll always be Les Tossers to me.'

Some of this might come in useful when we get to discussing (in ICT) online posting, privacy and forgetting. I'd want to work this in with danah's reflections on narcissism and "MySpace". Obviously this is germane:

One of the reasons that celebrities go batty is that fame feeds into their narcissism, further heightening their sense of self-worth as more and more people tell them that they're all that. They never see criticism, their narcissism is never called into check.

danah's focus, though, is designedly elsewhere:

What i do know is that MySpace provides a platform for people to seek attention. It does not inherently provide attention and this is why even if people wanted 90M viewers to their blog, they're likely to only get 6. MySpace may help some people feel the rush of attention, but it does not create the desire for attention. The desire for attention runs much deeper and has more to do with how we as a society value people than with what technology we provide them.

I am most certainly worried about the level of narcissism that exists today. I am worried by how we feed our children meritocratic myths and dreams of being anyone just so that current powers can maintain their supremacy at a direct cost to those who are supplying the dreams. I am worried that our "solutions" to the burst bubble are physically, psychologically, and culturally devastating, filled with hate and toxic waste. I am worried that Paris Hilton is a more meaningful role model to most American girls than Mother Theresa ever was. But i am not inherently worried about social network technology or video cameras or magazines. I'm worried by how society leverages different media to perpetuate disturbing ideals and prey on people's desire for freedom and attention. Eliminating MySpace will not stop the narcissistic crisis that we're facing; it will simply allow us to play ostrich as we continue to damage our children with unrealistic views of the world.

Once again, it's not the technology that's the problem, but as we "teach" the technology we can expect these social and ethical and psychological issues to make themselves known. Increasingly, I think the ICT teacher, and the teacher using ICT, is called upon (almost first and foremost) to be pastorally skilful. We haven't been focusing on this in ICT, instead looking nearly always at the technological skills. We need both, but I think the pastoral is going to prove crucial.


Cerebrotonic

No sooner do I post about Auden and include 'The Fall of Rome' ('Cerebrotonic Cato may / Extol the Ancient Disciplines'), than up pops 'cerebrotonic' in another blog post.

'Cerebrotonic' sounds like an Auden coinage, but isn't. Here's the OED:

A. adj. Designating or characteristic of a type of personality which is introverted, intellectual, and emotionally restrained, classified by Sheldon as being associated with an ECTOMORPHIC physique. B. n. One having this type of personality. So cerebrotonia (-ˈtəʊnɪə), cerebrotonic personality or characteristics.

1937 A. HUXLEY Ends & Means xi. 165 Dr. William Sheldon, whose classification [of types of human beings] in terms of somatotonic, viscerotonic and cerebrotonic I shall use. Ibid. xii. 193 The cerebrotonic is not such a ‘good mixer’ as the viscerotonic. 1940 W. H. SHELDON Var. Human Physique 8 In the economy of the cerebrotonic individual the sensory and central nervous systems appear to play dominant roles. 1945 A. HUXLEY Let. 2 Apr. (1969) 517 There was just enough of the somatotonic in his..cerebrotonic make-up to make him regret his cerebrotonia. 1950 —— Themes & Var. i. 121 Too secretively the introvert, too inhibitedly cerebrotonic, to be willing to take the risk of ‘giving himself away’. 1951 AUDEN Nones (1952) 28 Cerebrotonic Cato may Extol the Ancient Disciplines. 1954 R. FULLER Fantasy & Fugue iv. 75 You..unfortunately incline to the cerebrotonic ectomorph—you worry too much, you're too good looking, and you can't abandon yourself happily to booze.

The other blog post? Momus' Celebrating diversity means measuring difference. Momus writes about William Sheldon:

I discovered his writings when I was 20, and trying to understand my own problems and potentialities better. Sheldon proposed what seems at first like a very simple way to measure body types. He isolates three basic components: fatness, muscularity and thinness, which he calls endomorphy, mesomorphy and ectomorphy. … "Ectomorphy means linearity, fragility, flatness of the chest, and delicacy throughout the body," he wrote. "We find a relatively scant development of both the visceral and the somatic structures. The ectomorph has long, slender, poorly muscled extremities with delicate pipe-stem bones, and he has, relative to his mass, the greatest surface area and therefore the greatest sensory exposure to the outside world. He is thus in one sense overly exposed and naked to the world." …

I'm a classic ectomorph, which means that by temperament I'm a cerebrotonic. In ectomorph-cerebrotonics, "the sensory-receptor properties are well developed. As a consequence however the central nervous system (CNS) is soon overloaded and rapidly tires. The cerebrotonic has the gift of concentrating his attention on the external world as well as on his internal world. His vigilance and autonomic reactivity make him behave in an inhibited and uncertain way: introverted behaviour. He has problems with expressing his feelings and with establishing social relationships, and can very well bear to be alone. The elementary strategies of coping with life are perception, reconnaissance and vigilance, cognition and anticipation, and a certain amount of privacy." …

Personally, I like people who structure the world boldly, especially if their structurations ring true. I don't take any structuration as holy writ, though -- I like to play with them, snap them together and pull them apart. But I also like it when structurations make for lovely poetry. The way Sheldon describes the ectomorph has a behaviourist beauty, a 1940s severity. He has "a relative predominance of skin and its appendages, which includes the nervous system; lean, fragile, delicate body; small delicate bones; droopy shoulders; small face, sharp nose, fine hair; relatively little body mass and relatively great surface area".

"The cerebrotonic may be literate or illiterate," says Sheldon, "may be trained or untrained in the conventional intellectual exercises of his milieu, may be an avid reader or may never read a book, may be a scholastic genius or may have failed in every sort of schooling. He may be a dreamer, a poet, philosopher, recluse, or builder of utopias and of abstract psychologies. He may be a schizoid personality, a religious fanatic, an ascetic, a patient martyr, or a contentious crusader. All these things depend upon the intermixture of other components, upon other variables in the symphony, and also upon the environmental pressures to which the personality has been exposed. The essential characteristic of the cerebrotonic is his acuteness of attention. The other two major functions, the direct visceral and the direct somatic functions, are subjugated, held in check, and rendered secondary. The cerebrotonic eats and exercises to attend."

I know next to nothing about Sheldon and need to go back to Momus and read it all again. John Fuller, in his W H Auden: A Commentary, says only this apropos 'The Fall of Rome' and 'cerebrotonic':

Stanza 4: Auden was inclined to prefer the endomorphic type to either the ectomorphic ('Cerebrotonic Cato') or the mesomorphic ('muscle-bound Marines'). The typology is from W H Sheldon.

Momus, quoting Sheldon on endomorphs and mesomorphs:

For comparison, in endomorphs "The body is rounded and exhibits a central concentration of mass. The trunk predominates over the limbs, the abdomen over the thorax, and the proximal segments of the limbs predominate over the distal segments. The bones are gracile and the muscle system is poorly developed. Muscle relief and bone projections are absent. The body displays a smoothness of contour owing to subcutaneous padding. The head is large and spherical, the face is wide with full cheeks. The neck is frequently short and forms in side view an obtuse angle with the chin. The shoulders are high and rounded. The trunk is relatively long and straight, the chest is wide at the base. The limbs are comparatively short and tapering with small hands and feet."

"When mesomorphy predominates, the body is sturdy, hard and firm. The bones are large and heavy, the muscles well-developed, massive and prominent. The heavily muscled thorax predominates over the abdomen. The proximal and distal segments of the limbs are evenly proportioned. The bones of the head are heavy. The face is large in relation to the cranial part of the head. Massive cheekbones and square jaws are the rule. The arms and legs are uniformly massive and muscular, strongly built knees, massive wrists."

Ah, classificatory schemata: they have their own fascination.

Oh, and one other gem from Momus:

Interestingly, Sheldon met and befriended Aldous Huxley during a residence at a writers and artists' refuge at Dartington Hall in Devon, England. Huxley also recognized himself as an ectomorph and cerebrotonic, and saw it as a limitation …

(Have another look at the clip from the OED above. Wouldn't it be interesting if we could overlay the OED with transfers of social and intellectual relationships? … Hey OUP, open up the OED!) You'll have to click through to iMomus to hear what Huxley had to say.


Eternal Sunshine of the Spotless Mind: memory and film-making

I need to go out and buy the DVD of Eternal Sunshine of the Spotless Mind. I much enjoyed it when it came out and blogged about it twice, with excerpts from Steve Johnson's essay about it and from a review of the film. The film's back on my screen again. Via a del.icio.us link, today I came across a page detailing some of the work done on the film — and it's really impressive.

Last month, Mind Hacks on dream, memory and the film:

Seed Magazine has a video of a fascinating conversation between sleep scientist Robert Stickgold and film director Michel Gondry, director of Eternal Sunshine of the Spotless Mind. Stickgold has reinvigorated sleep research by investigating the borderlands of consciousness with a series of novel experiments.

Favourite quote from the Stickgold/Gondry video clip: 'the reason why cuts work in movies is' (Stickgold) … 'because we dream' (Gondry)/'because we're all familiar with them' (Stickgold). 

From Mind Hacks, then, to the following: 

Gondry's new movie, The Science of Sleep, also explores the mind's outer reaches. …

Link to fantastic article on the cognitive science of Eternal Sunshine of the Spotless Mind.

And there's plenty more at that last link.

Now there's a whole bundle of stuff and possibilities for teaching …


"bloody computer games … thin gruel indeed"

The quotation is from Michael Shayer, Professor of Applied Psychology at King's College, University of London, and appears in American Scientist's Smart as We Can Get?. To begin at the beginning:

Psychometricians have long been aware of a phenomenon called the Flynn effect—a widespread and long-standing tendency for scores on certain tests of intelligence to rise over time. … Ever since Flynn published his startling results, psychologists and educators have struggled to figure out whether people really are getting smarter and, if so, why. No clear answer has emerged. And now they have another curiosity to ponder: The tendency for intelligence scores to rise appears to have ended in some places. Indeed, it seems that some countries are experiencing a Flynn effect with a reversed sign.

'a Flynn effect with a reversed sign'. Or, at least, as some of the research from Scandinavia cited by American Scientist has shown, a plateau can be reached.

Back in January, the Guardian carried a lengthy piece about recent research conducted by Shayer:

New research funded by the Economic and Social Research Council (ESRC) and conducted by Michael Shayer … concludes that 11- and 12-year-old children in year 7 are "now on average between two and three years behind where they were 15 years ago", in terms of cognitive and conceptual development.

"It's a staggering result," admits Shayer, whose findings will be published next year in the British Journal of Educational Psychology. "Before the project started, I rather expected to find that children had improved developmentally. This would have been in line with the Flynn effect on intelligence tests, which shows that children's IQ levels improve at such a steady rate that the norm of 100 has to be recalibrated every 15 years or so. But the figures just don't lie. We had a sample of over 10,000 children and the results have been checked, rechecked and peer reviewed."

I remember being stopped in my tracks when I read this article. I recommend reading it in full: it goes into some detail about Shayer's distinguished, lifelong contribution to educational research, the attendant debates and controversies.

And Shayer's most recent research, its methodology and conclusions, will be discussed widely and with passion once it is published. Anyone doubting the storm that will break then need only ponder this (Guardian):

Those likely to be particularly discomforted by Shayer's findings are people who swear by the validity of GCSE and Sats results. The idea that most children are achieving the government level 4 targets in maths and science at key stage 2 is clearly anomalous with Shayer's findings, as is the notion that secondary schools are now taking children who are two years behind developmentally and still getting them up to GCSE speed in just five years.

So how does Shayer explain this? "The Qualifications and Curriculum Authority obviously insists that standards haven't dropped," he says, "but this doesn't fit all the evidence. A-level maths and science teachers often report that their students don't know as much as they used to. And some parts of the GCSE science syllabus, such as density, have been dropped. Examiners may well be asking easier questions and marking more leniently. These things can happen unconsciously.

"There is some evidence that the extra hour allocated to maths in primary schools under the numeracy initiative has had some impact on Sats scores, but there is greater evidence of teachers teaching to the tests. This means students can perform well in the tests without necessarily understanding the underlying concepts.

"… I would suggest that the most likely reasons are the lack of experiential play in primary schools, and the growth of a video-game, TV culture. Both take away the kind of hands-on play that allows kids to experience how the world works in practice and to make informed judgements about abstract concepts."

American Scientist winds up, saying that 'Flynn himself is much less gloomy about what appears to be happening':

For one, he points out that the situation varies quite a bit from country to country. "All the evidence is that the IQ gains in America are still robust," he says. And he notes that at the very time that scores were declining in the UK on the Piagetian tests that Shayer examined, British kids were making gains on a test called the Wechsler Intelligence Scale for Children or WISC. Flynn points out that results gathered with two versions of this test (WISC-III, introduced in 1991, and WISC-IV, in 2003) show the usual effect, a rise in raw scores over time. But he also notes that one subtest—on arithmetic reasoning—did show a decline.

Although Flynn cautions against generalizing the recent Danish and Norwegian experiences, he anticipates similar results will crop up elsewhere in the world. But he's not glum about it. Flynn is convinced that the cause of his eponymous effect has to do with changes in the environment that allow children more opportunity to exercise the kinds of skills probed in today's intelligence tests—changes like a shift to smaller family sizes, which allow parents more time to interact with each child, for example, or devotion of an ever-greater portion of kids' leisure time to abstract, mentally demanding games. He points out that in industrialized, middle-class countries (like those of Scandinavia) such influences must be reaching a point of saturation: "You can't really get the family much smaller than one or two kids." And the current craze for Sudoku puzzles notwithstanding, as Flynn says, "eventually, people do want to relax."



Changing our minds

Café Scientifique tonight in Oxford. I can't get to this ... and it looks pretty interesting — Dr Martin Westwell (Deputy Director of the Institute for the Future of the Mind, Oxford University) on 'Bending minds - how technology can change who you are':

Martin will talk about the mind, the brain and how pills to make you smarter, pills to make you forget, electrodes inserted into the brain, and devices to let you control computers just by thinking are all technologies that are with us now or are just around the corner. How do these technologies and the new experiences they bring transform and bend the human mind? How are we going to harness the new technologies to maximise the potential of individuals without sacrificing that individuality? What roles do scientists play in deciding how they are to be implemented?

And the Institute for the Future of the Mind?

In the 21st Century, technology will exert unprecedented influence indirectly and directly upon the brain and the critical issue is not whether, but how, such new experiences will transform the human mind. The Institute for the Future of the Mind is one of 10 research institutes in the new James Martin 21st Century School, made possible by a $100M benefaction to Oxford University, with the aim of finding solutions to the biggest problems facing humanity and identifying the key opportunities of the 21st century.

I'm interested in the brief profile there of Dr Westwell: 'Martin’s particular interest is in the way that young people form their minds and the influences of technology on this process in the future'.

Meanwhile, Peter Brunner, an American scientist, can be seen here, demonstrating a BCI — a brain-computer interface.
