Science

The enchanted loom

Symphony of Science has recently posted ‘Ode to the Brain!’:

 

‘Ode to the Brain’ is the ninth episode in the Symphony of Science music video series. Through the powerful words of scientists Carl Sagan, Robert Winston, Vilayanur Ramachandran, Jill Bolte Taylor, Bill Nye, and Oliver Sacks, it covers different aspects [of] the brain including its evolution, neuron networks, folding, and more. The material sampled for this video comes from Carl Sagan’s Cosmos, Jill Bolte Taylor’s TED Talk, Vilayanur Ramachandran’s TED Talk, Bill Nye’s Brain episode, BBC’s ‘The Human Body’, Oliver Sacks’ TED Talk, Discovery Channel’s ‘Human Body: Pushing the Limits’, and more.

Carl Sagan:

What we know is encoded in cells called neurons
And there are something like a hundred trillion neural connections
This intricate and marvelous network of neurons has been called
An enchanted loom

Wikipedia — Enchanted Loom:

The enchanted loom is a famous metaphor for the brain invented by the pioneering neuroscientist Charles S. Sherrington in a passage from his 1942 book Man on his Nature, in which he poetically describes his conception of what happens in the cerebral cortex during arousal from sleep:

The great topmost sheet of the mass, that where hardly a light had twinkled or moved, becomes now a sparkling field of rhythmic flashing points with trains of traveling sparks hurrying hither and thither. The brain is waking and with it the mind is returning. It is as if the Milky Way entered upon some cosmic dance. Swiftly the head mass becomes an enchanted loom where millions of flashing shuttles weave a dissolving pattern, always a meaningful pattern though never an abiding one; a shifting harmony of subpatterns.

The “loom” he refers to was undoubtedly meant to be a Jacquard loom, used for weaving fabric into complex patterns. The Jacquard loom, invented in 1801, was the most complex mechanical device of the 19th century. It was controlled by a punch card system that was a forerunner of the system used in computers until the 1970s. With as many as thousands of independently movable shuttles, a Jacquard loom in operation must have appeared very impressive. If Sherrington had written a decade later, however, he might perhaps have chosen the flashing lights on the front panel of a computer as his metaphor instead.

According to the neuroscience historian Stanley Finger, Sherrington probably borrowed the loom metaphor from an earlier writer, the psychologist Frederic Myers, who asked his readers to “picture the human brain as a vast manufactory, in which thousands of looms, of complex and differing patterns, are habitually at work”. Perhaps in part because of its slightly cryptic nature, the “enchanted loom” has been an attractive metaphor for many writers about the brain …

Oliver Sacks:

We see with the eyes
But we see with the brain as well
And seeing with the brain
Is often called imagination

‘Whole orchestras play inside our heads’ (Sagan).


Auden: aspects of our present Weltanschauung

Looking for something in Auden, I hit another passage, about human nature, art, tradition and originality (below), that I couldn’t put my finger on when I last needed it a few months ago. We’re edging towards the World Brain, but it can’t come fast enough:

It seems possible that in the near future, we shall have microscopic libraries of record, in which a photograph of every important book and document in the world will be stowed away and made easily available for the inspection of the student…. The general public has still to realize how much has been done in this field and how many competent and disinterested men and women are giving themselves to this task. The time is close at hand when any student, in any part of the world, will be able to sit with his projector in his own study at his or her convenience to examine any book, any document, in an exact replica. — H G Wells, ‘The Brain Organization of the Modern World’ (1937)

Auden. I’ve often referred to this passage and am very happy to make it ready to hand by pinning it here:

3) The loss of belief in a norm of human nature which will always require the same kind of man-fabricated world to be at home in. … until recently, men knew and cared little about cultures far removed from their own in time or space; by human nature, they meant the kind of behaviour exhibited in their own culture. Anthropology and archaeology have destroyed this provincial notion: we know that human nature is so plastic that it can exhibit varieties of behaviour which, in the animal kingdom, could only be exhibited by different species.

The artist, therefore, no longer has any assurance, when he makes something, that even the next generation will find it enjoyable or comprehensible.

He cannot help desiring an immediate success, with all the danger to his integrity which that implies.

Further, the fact that we now have at our disposal the arts of all ages and cultures, has completely changed the meaning of the word tradition. It no longer means a way of working handed down from one generation to the next; a sense of tradition now means a consciousness of the whole of the past as present, yet at the same time as a structured whole the parts of which are related in terms of before and after. Originality no longer means a slight modification in the style of one’s immediate predecessors; it means a capacity to find in any work of any date or place a clue to finding one’s authentic voice. The burden of choice and selection is put squarely upon the shoulders of each individual poet and it is a heavy one.

It’s from ‘The Poet and The City’, which I think appeared first in the Massachusetts Review in 1962 and was then included in The Dyer’s Hand (1963). Lots in this essay. ‘There are four aspects of our present Weltanschauung which have made an artistic vocation more difficult than it used to be.’ The others:

1) The loss of belief in the eternity of the physical universe. … Physics, geology and biology have now replaced this everlasting universe with a picture of nature as a process in which nothing is now what it was or what it will be.

We live now among ‘sketches and improvisations’.

2) The loss of belief in the significance and reality of sensory phenomena. … science has destroyed our faith in the naive observation of our senses: we cannot … ever know what the physical universe is really like; we can only hold whatever subjective notion is appropriate to the particular purpose we have in view. This destroys the traditional conception of art as mimesis …

4) The disappearance of the Public Realm as the sphere of revelatory personal deeds. To the Greeks the Private Realm was the sphere of life ruled by the necessity of sustaining life, and the Public Realm the sphere of freedom where a man could disclose himself to others. Today, the significance of the terms private and public has been reversed; public life is the necessary impersonal life, the place where a man fulfils his social function, and it is in his private life that he is free to be his personal self.


Alchemical futures

Sometimes, I’m lucky enough to have the chance to teach a short course about science fiction to a group of 17-year-olds. I’m always intrigued to find out what ‘science fiction’ means to them. This week, kicking off, one lad went straight for super-powers. As it happens, I’ve never had this answer before, but what made me take note was how well he explained what he meant, quickly but thoughtfully: science fiction giving us access to other possible worlds, possible futures … what if … maybe … perhaps … one day … then I could … dream that … build that … I should add, he was the same student who homed in on science fiction and dystopian futures, so he wasn’t sitting there being idly optimistic.

I went through a phase in my teens of reading lots of Jung and, a little later, Freud, considering medicine and psychiatry / psychoanalysis as a possible future. I still have many of the books I bought then. Jung led me off on curious paths. Alchemy was in there, of course, and has endured as an interest — morphing along the way. I went off certain Jungians at some deep level after a conference (held in Windsor Great Park!), which struck my 18-year-old self as pretty bonkers and anti-science, and I used to get my Jungian books from a very odd bookshop in the middle of nowhere (deep, rural Gloucestershire) which the friends I persuaded to come along (or give me a lift there) ended up calling ‘the magic bookshop’. New Age, though we didn’t know it.

But alchemy’s never gone away. It couldn’t, could it? I loved that Royal Institution talk I went to back in 2006, ‘Alchemy, the occult beginnings of science: Paracelsus, John Dee and Isaac Newton’. The dream of a very special super-power, transforming both matter (world) and self.

Alchemy, originally derived from the Ancient Greek word khemia (Χημεία - in Modern Greek) meaning "art of transmuting metals", later arabicized as the Arabic word al-kimia (الكيمياء, ALA-LC: al-kīmiyā’), is both a philosophy and an ancient practice focused on the attempt to change base metals into gold, investigating the preparation of the “elixir of longevity”, and achieving ultimate wisdom, involving the improvement of the alchemist as well as the making of several substances described as possessing unusual properties. The practical aspect of alchemy can be viewed as a protoscience, having generated the basics of modern inorganic chemistry, namely concerning procedures, equipment and the identification and use of many current substances.

Alchemy has been practiced in ancient Egypt, Mesopotamia (modern Iraq), India, Persia (modern Iran), China, Japan, Korea, the classical Greco-Roman world, the medieval Islamic world, and then medieval Europe up to the 20th and 21st centuries, in a complex network of schools and philosophical systems spanning at least 2,500 years.  Wikipedia

And given a background in zoology and theology, I’ve not been able to get this out of my head since stumbling across it the other week:

Once, he called himself a “biologian”, merging the subject matter of life with the method of a theologian. More recently, he told me that he is an alchemist. In Defense of the Memory Theater

Isn’t that great? What a way to think of what you’re engaged on. The work.

It is, by the way, well worth reading all of Nathan Schneider’s post about his uncle, the “alchemist”:

The most remarkable memory theater I’ve ever known is on a computer. It is the work of my uncle, once a biologist at the National Institutes of Health, a designer of fish farms, a nonprofit idealist, and a carpenter. Now he has devoted himself full-time to his theater … [a] single, searchable, integrated organism. When he tells me about it, he uses evolutionary metaphors cribbed from his years researching genetics. The creature mutates and adapts. It learns and grows.


‘the more I write, the more I shall have to write ... I shall never overtake myself’

I go silent on my blog without explanation. It may seem, in the short-term, like a blip, but in the long-term … the pattern becomes clear. — Tom Armitage, ‘Telling Stories’ (Reboot 8, Copenhagen, 2006) (pdf)

I’ve spent a lot of time over the last few months paring and pruning, trying to focus more closely on the things which really matter to me. I’ve got something to put down here soon about attention and curation, but before this new year runs away with me and everything, yet again, tilts Tristram Shandy–wards, I thought I might look back, sum up, take stock (a bit).

Here’s something I wrote for our annual school magazine about last year’s talks. (It goes over some of what I’ve written about here during 2009–2010 and I’ve given the links, in square brackets, where that’s the case.) It's very … potted.

ICT Talks 2009–10

This year, our talks continued to cross disciplines. We kicked off with Andy Huntington (RCA graduate, designer, musician) on interaction design [see 20.9.2009 entry]. In Digital Ground (MIT, 2004), Malcolm McCullough set out how interaction design ‘studies how people deal with technology — and how people deal with each other, through technology. As a consequence of pervasive computing, interaction design is poised to become one of the main liberal arts of the twenty-first century’. Andy, who has worked on interactive objects and experiences for clients from the BBC and the Science Museum to Nokia and the Bartlett School of Architecture, talked us through tapTap (‘The system is built up of individual knock boxes. Each box has its own memory and is completely self-contained. As you tap on the top of a box, the box waits for a few seconds and then taps back what it has heard. If you want more you add another box, and another, and another, tap, tap, tap’) and Beatbox (‘a physical programmable drum machine’). Later in the autumn we were delighted to welcome Usman Haque, architect and co-founder of Pachube (‘store, share & discover realtime sensor, energy and environment data from objects, devices & buildings around the world’):

Usman Haque

The domain of architecture has been transformed by developments in interaction research, wearable computing, mobile connectivity, people-centered design, contextual awareness, RFID systems and ubiquitous computing. These technologies alter our understanding of space and change the way we relate to each other. We no longer think of architecture as static and immutable; instead we see it as dynamic, responsive and conversant. Our projects explore some of this territory. — Haque Design + Research

Playing with tapTap and Beatbox, thinking how objects are now interacting with us through the internet, reflecting on how we can use Pachube … Ubiquitous computing has well and truly arrived and, as McCullough foresaw, educators need to address interaction design as a matter of urgency.
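The knock boxes themselves are a lovely, tiny interaction-design pattern: listen, pause, echo. As a purely illustrative sketch (nothing to do with the actual tapTap hardware or firmware), the behaviour Andy described might be simulated along these lines, with a box that remembers the rhythm of the taps it hears and taps it back after a short silence:

```python
import time

class KnockBox:
    """A toy simulation of a tapTap-style knock box (illustrative only):
    it remembers the rhythm of the taps it hears and, after a short
    pause, taps the same rhythm back."""

    def __init__(self, pause=2.0):
        self.pause = pause   # seconds of silence before replying
        self.taps = []       # timestamps of the taps heard so far

    def hear_tap(self):
        self.taps.append(time.monotonic())

    def reply(self):
        """Replay the remembered rhythm as printed 'taps'."""
        if not self.taps:
            return
        time.sleep(self.pause)                                # wait a few seconds
        gaps = [b - a for a, b in zip(self.taps, self.taps[1:])]
        print("tap")
        for gap in gaps:                                      # reproduce the rhythm
            time.sleep(gap)
            print("tap")
        self.taps = []                                        # ready for the next pattern

if __name__ == "__main__":
    box = KnockBox(pause=1.0)
    for gap in (0.0, 0.3, 0.3, 0.6):   # knock a simple rhythm on the box
        time.sleep(gap)
        box.hear_tap()
    box.reply()                        # the box taps it back
```

Add more boxes listening to one another and you get the ‘tap, tap, tap’ conversation Andy demonstrated.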

Also in the autumn, Adrian Hon came to talk about his games company, Six to Start [see 30.9.2009 entry]. He began by looking at the role of story-telling in human society, the reception of the first European novels, the ways in which our strong identification with literary heroes and heroines has been elicited and the striking role now played in our lives by online text. The main part of his talk focused on We Tell Stories — a project developed for Penguin: ‘six stories, written by six authors, told in six different ways — ways that could only happen on the web … released over six weeks’. Adrian, who left a career in neuroscience to co-found Six to Start with his brother, sets great store by narrative: ‘Writers are important. When a game’s graphics grow old, and the game mechanics become dated, all that’s left to remember is the story. As designers and writers of games, we all need to set a higher bar for ourselves’. His ambition for games is, indeed, remarkable: ‘Historians will look back hundreds of years from now, and they will say that the explosion of narrative and game forms that we have now was a momentous time that transformed the way that people think and see the world. … It’s hard to imagine a world without books; without Lord of the Rings, or Catch 22, or Pride and Prejudice, or Great Expectations. Equally, it’s already hard to imagine a world without games. Just imagine where we’ll be in a few decades’ time. We have the opportunity to make those new types of games and stories that will change people’s lives in the future, and there are so many possibilities.’

Professor Chris Frith, FRS, talked about how our brain generates emotions and thoughts, and he was followed soon afterwards by Professor James Paul Gee, the distinguished American scholar, on games and learning. In his book, Making up the Mind [see 22.9.2009 entry], Frith argues that, ‘on the basis of its belief about the world, my brain can predict the pattern of activity that should be detected by my eyes, ears and other senses … So what happens if there is an error in this prediction? These errors are very important because my brain can use them to update its belief about the world and create a better belief … Once this update has occurred, my brain has a new belief about the world and it can repeat the process. It makes another prediction about the patterns of activity that should be detected by my senses. Each time my brain goes round this loop the prediction error will get smaller. Once the error is sufficiently small, my brain “knows” what is out there. And this all happens so rapidly that I have no awareness of this complex process. … my brain never rests from this endless round of prediction and updating’. In Gee’s thought, the world of a complex game mirrors the functioning of the mind: ‘We run videogames in our heads’ [see 30.10.2009 entry]. At the heart of his critical understanding of games is the idea of situated meanings and their role in learning. Games are about problem-solving, and today’s problems are complex ones: complex systems interacting with other complex systems. Tackling them means working well beyond standard skills and learning how to be part of a cross-functional team — a very high-order skill common to play in many games.

Another theme this year has been how we are living in a time when information is becoming more accessible. We welcomed Timo Hannay, director of web publishing at Nature Publishing Group, to talk about open science, and in March we had the opportunity to hear Jimmy Wales, founder of Wikipedia [see 22.3.2010 entry]. Timo spoke about the nature of early scientific publishing and the rise of the expensive (and therefore relatively inaccessible) specialist journal. He explained projects he has helped to develop at Nature, including Connotea (a social bookmarking service for scientists), Nature Network (a social network for scientists) and Nature Precedings (‘a platform for sharing new and preliminary findings with colleagues on a global scale’). Jimmy, arriving straight from Heathrow, spoke to a packed hall on the origins, vision and role of Wikipedia. One thing to emerge from this very well-received talk: about 80% of the students present had edited Wikipedia. Next day at a Guardian conference for heads of media, the same question from Jimmy revealed that only about 30% of that audience had edited the online encyclopaedia.

Another highlight of the year was the chance to hear Stewart Brand and Brian Eno talk about the Long Now and Brand’s new book, Whole Earth Discipline.

Stewart Brand & Brian Eno
The idea of the active intellectual is very important, Brand said, and we’re very pleased that St Paul’s is the first school in the UK to join the Long Now and engage with its commitment to long-term thinking and sustainable living. This takes us neatly back to Pachube and the way we interact with technology. The future requires that the young grow up learning about the history of technology, of man’s long journey of inventiveness in manipulating nature and of the possibilities, for good and ill, that lie in this relationship we have with our world.


We are all Bayesians now

Intent on not being late for an evening session at Tinker.it! last week, I dropped by Bunhill Fields for too short a time, the light beginning to fail and a hurriedly printed-off, crumpled map for a guide.


Easy to find the memorials to Blake and his wife and Defoe. But I was on a quest for Thomas Bayes:

Bayes, Thomas (b. 1702, London - d. 1761, Tunbridge Wells, Kent), mathematician who first used probability inductively and established a mathematical basis for probability inference (a means of calculating, from the number of times an event has not occurred, the probability that it will occur in future trials). He set down his findings on probability in "Essay Towards Solving a Problem in the Doctrine of Chances" (1763), published posthumously in the Philosophical Transactions of the Royal Society of London.

It took me too long to find his resting place, railed off and not in a great state of repair, and my rushed photos weren’t worth posting, but here’s one from the ISBA site (taken by Professor Tony O'Hagan of Sheffield University and seemingly not under copyright):

Bayes 1 

The famous essay is online (PDF).

I need to spend more time in and around Bunhill Fields, but what prompted me to try to take it in as I sped across London was reading in Chris Frith’s book, Making up the Mind, how important Bayes is to neuroscience:

… is it possible to measure prior beliefs and changes in beliefs? … The importance of Bayes’ theorem is that it provides a very precise measure of how much a new piece of evidence should make us change our ideas about the world. Bayes’ theorem provides a yardstick by which we can judge whether we are using new evidence appropriately. This leads to the concept of the ideal Bayesian observer: a mythical being who always uses evidence in the best possible way. … Our brains are ideal observers when making use of the evidence from our senses. For example, one problem our brain has to solve is how to combine evidence from our different senses. … When combining this evidence, our brain behaves just like an ideal Bayesian observer. Weak evidence is ignored; strong evidence is emphasised. … But there is another aspect of Bayes’ theorem that is even more important for our understanding of how the brain works. … on the basis of its belief about the world, my brain can predict the pattern of activity that should be detected by my eyes, ears and other senses … So what happens if there is an error in this prediction? These errors are very important because my brain can use them to update its belief about the world and create a better belief … Once this update has occurred, my brain has a new belief about the world and it can repeat the process. It makes another prediction about the patterns of activity that should be detected by my senses. Each time my brain goes round this loop the prediction error will get smaller. Once the error is sufficiently small, my brain “knows” what is out there. And this all happens so rapidly that I have no awareness of this complex process. … my brain never rests from this endless round of prediction and updating.

… our brain is a Bayesian machine that discovers what is in the world by making predictions and searching for the causes of sensations.
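Frith’s ‘ideal Bayesian observer’ is, at bottom, Bayes’ theorem applied over and over: the posterior belief is proportional to the likelihood of the new evidence times the prior belief. As a purely illustrative sketch of the loop he describes (assuming, for simplicity, that the belief and the sensory noise are both Gaussian, which is my simplification rather than anything in the book), each pass shifts the belief by a fraction of the prediction error, with that fraction set by how reliable the evidence is relative to the current belief:

```python
import random

def update(belief_mean, belief_var, sample, noise_var):
    """One round of prediction and updating (Bayes' rule for Gaussian beliefs).

    Weak (noisy) evidence barely moves the belief; strong evidence dominates.
    """
    prediction_error = sample - belief_mean           # how wrong the prediction was
    gain = belief_var / (belief_var + noise_var)      # how much to trust the evidence
    new_mean = belief_mean + gain * prediction_error  # nudge the belief
    new_var = (1 - gain) * belief_var                 # uncertainty shrinks
    return new_mean, new_var

true_value = 3.0        # what is actually "out there"
mean, var = 0.0, 10.0   # a vague prior belief
noise_var = 1.0         # variance of the sensory noise

for step in range(10):
    sample = random.gauss(true_value, noise_var ** 0.5)  # a noisy sensory sample
    mean, var = update(mean, var, sample, noise_var)
    print(f"step {step}: belief = {mean:.2f} (uncertainty {var ** 0.5:.2f})")
```

Run it and the prediction error shrinks on each pass: Frith’s picture of a brain that ‘never rests from this endless round of prediction and updating’.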


What will remain of us

Smithsonian has an article about nautical archaeology, focusing on Dunwich:

The sea that brought trade to Dunwich was not entirely benevolent. The town was losing ground as early as 1086 when the Domesday Book, a survey of all holdings in England, was published; between 1066 and 1086 more than half of Dunwich’s taxable farmland had washed away. Major storms in 1287, 1328, 1347, and 1740 swallowed up more land. By 1844, only 237 people lived in Dunwich. Today, less than half as many reside there in a handful of ruins on dry land.

Here’s Henry James on Dunwich, from ‘Old Suffolk’ (originally published in Harper's Weekly, 25 September 1897) and collected in English Hours (1905):

If at low tide you walk on the shore, the cliffs, of little height, show you a defence picked as bare as a bone … [The land] stretched, within historic times, out into towns and promontories for which there is now no more to show than the empty eye-holes of a skull; and half the effect of the whole thing, half the secret of the impression, and what I may really call, I think, the source of the distinction, is this very visibility of the mutilation. Such at any rate is the case for a mind that can properly brood. There is a presence in what is missing — there is history in there being so little. It is so little, to-day, that every item of the handful counts.

The biggest items are of course the two ruins, the great church and its tall tower, now quite on the verge of the cliff, and the crumbled, ivied wall of the immense cincture of the Priory. These things have parted with almost every grace, but they still keep up the work that they have been engaged in for centuries and that cannot better be described than as the adding of mystery to mystery. … The mystery sounds for ever in the hard, straight tide, and hangs, through the long, still summer days and over the low, diked fields, in the soft, thick light. We play with it as with the answerless question, the question of the spirit and attitude, never again to be recovered, of the little city submerged. For it was a city, the main port of Suffolk, as even its poor relics show; with a fleet of its own on the North Sea, and a big religious house on the hill. We wonder what were then the apparent conditions of security, and on what rough calculation a community could so build itself out to meet its fate. It keeps one easy company here to-day to think of the whole business as a magnificent mistake.

I’m keeping The Rings of Saturn for when we have a chance to go walking in Suffolk, but, via John Naughton, here’s Sebald on Dunwich:

The Dunwich of the present day is what remains of what was one of the most important ports of Europe in the Middle Ages. There were more than fifty churches, monasteries and convents, and hospitals here; there were shipyards and fortifications and a fisheries and merchant fleet of eighty vessels; and there were dozens of windmills … The parish churches of St James, St Leonard, St Martin, St Bartholomew, St Michael, St Patrick, St Mary, St John, St Peter, St Nicholas and St Felix, one after the other, toppled down the steadily-receding cliff-face and sank in the depths, along with the earth and stone of which the town had been built. All that survived, strange to say, were the walled well-shafts, which, for centuries, freed of what had once enclosed them, rose aloft like the chimney stacks of some subterranean smithy, as various chronicles report, until in due course these symbols of the vanished town also fell down.

Thinking about Dunwich and nautical archaeology made me read again about the project to use 3D seismic data to map the North Sea Palaeolandscapes — lands of hunter-gatherer communities, lost as water levels changed over 8000 years ago:


Professor Vince Gaffney:

It's like finding another country. … At times this change would have been insidious and slow — but at times, it could have been terrifyingly fast. It would have been very traumatic for these people. … It would be a mistake to think that these people were unsophisticated or without culture. … they would have had names for the rivers and hills and spiritual associations - it would have been a catastrophic loss. … In 10,000 BC, hunter-gatherers were living on the land in the middle of the North Sea. By 6,000 BC, Britain was an island. The area we have mapped was wiped out in the space of 4,000 years. BBC News

From the project’s introduction:

The British continental shelf contains one of the most detailed and comprehensive records of the Late Quaternary and Holocene landscapes in Europe. This landscape is unique in that it was extensively populated by humans but was rapidly inundated during the Mesolithic as a consequence of rising sea levels as a result of rapid climate change. Previous researchers have recognised the rapid inundation may have preserved topographic features and caches of environmental data of high quality which may be used to provide insights into Holocene landscapes which, if located and sampled, may be unparalleled by terrestrial sites. Knowledge of the development of this landscape is also critical to our understanding of the impact of climate change on palaeobathymetry and shoreline sequences. It is clear that the exploitation of the Southern North Sea for energy and mineral resources, most notably aggregate extraction, remains a strategic goal for the UK and without adequate data this remarkable landscape is under significant threat from development. Furthermore, given that this landscape suffered changes comparable with those predicted for the British shoreline over the next century, the value in providing comparative data for the future impact of global warming seems clear.


Wi-Fi and health

Recently, I needed to prepare something for use at school that would act as a summary to date of this debate. I took as my markers some of the high-profile coverage Wi-Fi has received over the last year. It might be worth publishing this brief overview here.

1)  There's no basis for proceeding that's worthy of our consideration other than one based on the scientific evidence. There's masses of conjecture which generates fear, uncertainty and doubt.

2)  Let's start with mobile phones and phone masts - forms of wireless communication the radiation from which is (at source) far more powerful than that emitted by the kinds of wireless access points we'd be installing.

a)  December 2006: Journal of the National Cancer Institute, a study of 420,095 Danish cell phone users. They began subscribing to cellular phone services between 1982 and 1995, and the study examined their cancer rates through to 2002. The study 'finds no increased risk of tumors or leukemia in subscribers'; 'Even among the 56,000 people who have used the phones for more than a decade, researchers found no increased risk of cancer'.

b)  July, 2007: the Essex University phone mast study — 'when tests were carried out under double-blind conditions, where neither experimenter nor participant knew whether the signal was on or off, the number of symptoms reported was not related to whether the mast was on or off. Two of the 44 sensitive individuals correctly judged whether the mast was on or off in all six tests, compared with five out of 114 control participants. This proportion is what is expected by chance and was not increased in the sensitive group'.

3)  Now we come to the Panorama programme, 'Wi-Fi: A Warning Signal', which ran last May and which the BBC's editorial complaints unit subsequently (in November) conceded had not had 'adequate balance' and so had given 'a misleading impression of the state of scientific opinion on the issue'.

a)  There's a succinct and clear explanation of a fundamental flaw in the Panorama programme here. From the same source: 'Wi-Fi uses radio frequency (RF) waves that are "non-ionising" - that means they are not powerful enough to knock electrons off molecules in cells. One way they could harm cells is by heating them up. But this requires much higher power than is delivered by Wi-Fi networks or mobile phones (which use similar frequencies).  As every cautious scientist will tell you, you can never prove that something is absolutely safe and no one would want to gamble with the health of children. But there is good reason for thinking that Wi-Fi is, if anything, safer than the radiation from a mobile phone. The UK's Health Protection Agency says a person sitting within a Wi-Fi hotspot for a year receives the same dose of radio waves as a person using a mobile phone for 20 minutes'.

b)  Ben Goldacre, who, of course, writes the excellent Guardian 'Bad Science' column, took the Panorama programme to pieces and has also analysed the whole melange of ideas swirling round the "electrosensitivity" theme: Electrosensitives: the new cash cow of the woo industry; Wi-Fi Wants To Kill Your Children… But Alasdair Philips of Powerwatch sells the cure! ('Of course you should be vigilant about health risks. I don't question that there may be some issues worth sober investigation around Wi-Fi safety. But this documentary was the lowest, most misleading scaremongering I have seen in a very long time.')

4)  I felt it was probably worth my including the two Independent articles from last year, Danger on the airwaves: Is the Wi-Fi revolution a health time bomb? and Wi-Fi: Children at risk from 'electronic smog' (both from April). These will have lodged themselves in the minds of some — and they're truly bad. Ian Betteridge took both apart here, concluding, 'what really matters is that the quality of the Indie's reporting on this is abysmal. Printing scare stories isn't just bad journalism - it's bad behaviour that actually damages our culture, promoting bad, hokey ideas as fact and encouraging anti-scientific and anti-rational propaganda. I'd love to ask the editor of the Indie which they prefer - a world where science and reason are encouraged, or a world of cranks, quacks and charlatans'.

A paragraph or two summing up what we can say we know and how best, then, we should proceed?  I can't really do better than these, from the Guardian article already cited in 3a above:

The World Health Organisation's advice on this is very clear. "Considering the very low exposure levels and research results collected to date, there is no convincing scientific evidence that the weak RF signals from base stations and wireless networks cause adverse health effects."  And an HPA statement issued last week is equally adamant that Wi-Fi almost certainly does not pose a problem. "On the basis of current scientific information Wi-Fi equipment satisfies international guidelines. There is no consistent evidence of health effects from RF exposures below guideline levels and therefore no reason why schools and others should not use Wi-Fi equipment."

And apart from bogus TV experiments, what do we know about the strength of Wi-Fi radiation in homes, schools and businesses? Kenneth Foster, a Professor of Bioengineering at the University of Pennsylvania, took 356 measurements at 55 different sites in four different countries to find out. Even though he took his readings close to wireless routers, in all cases he found that the radiation level from Wi-Fi was far lower than international safety standards and often much lower than other radiation sources nearby.  Wi-Fi is a new addition to modern life and no scientist can say with her hand on her heart that it is perfectly safe - particularly in the long term. But there is no theoretical reason to expect problems and no good evidence for any harm. Of course we need more research to understand its effects more thoroughly and also sensible precautions. But misleading and irresponsible scare stories serve only to cloud the issue.


Narrating the work

E O Wilson in Consilience, quoted by Jon Udell:

The creative process is an opaque mix. Perhaps only openly confessional memoirs, still rare to nonexistent, might disclose how scientists actually find their way to a publishable conclusion. In one sense scientific articles are deliberately misleading. Just as a novel is better than the novelist, a scientific report is better than the scientist, having been stripped of all the confusions and ignoble thought that led to its composition. Yet such voluminous and incomprehensible chaff, soon to be forgotten, contains most of the secrets of scientific success.

As Jon put it elsewhere,

By narrating the work, as Dave Winer once put it, we clarify the work. There can be more than one narrator, but it makes sense to have one team member own the primary role just as other members own other roles.

The first Jon Udell piece referred to above focuses on Timo Hannay:

As director of web publishing for Nature Publishing Group, Timo Hannay’s projects include: Connotea, a social bookmarking service for scientists; Nature Network, a social network for scientists; and Nature Precedings, a site where researchers can share and discuss work prior to publication. The social and collaborative aspects of these systems are, of course, inspired by their more general counterparts on the web: del.icio.us, Facebook and LinkedIn, the blogosphere.

Jon's interview with Timo Hannay is here. I'm keeping a close eye on what Nature is up to.

Dave Winer's original usage runs: 'I think that narrating your work is the way to go'. I can see why Jon Udell likes that as much as he does.


Erasing that memory

When Eternal Sunshine of the Spotless Mind came out I was keen to go and see it — and I wasn't disappointed. I see I read the NYT review in April 2004 and then, when the DVD came out, blogged about it again (2006) and linked (via Mind Hacks) to Eternal Sunshine of the Spotless Mind and the mythical memory videotape.

Back in March 2004, I'd read Steven Johnson's review of the film in Slate, part of which ran:

Eternal Sunshine of the Spotless Mind is remarkably in sync with modern neuroscience, but in one respect the film put its emphasis in the wrong place. To be fair, it's a failing shared with a host of recent films about memory loss: Memento, 50 First Dates, Paycheck. Selective erasure of memories may not be a feasible procedure in the near future, but cosmetic memory enhancement is likely to be a reality in the next 10 years, just as targeted mood enhancers like Prozac have become commonplace over the past 10. You won't be able to sharpen your memory of a single person, but you may well be able to take a pill that will increase your general faculties of recollection. This is the ultimate irony of Eternal Sunshine and films like it. While the culture frets over the perils of high-tech erasure, we should really be worrying about the opposite: what will happen when we remember too much.

And then along comes this:

A single, specific memory has been wiped from the brains of rats, leaving other recollections intact. … The brain secures memories by transferring them from short-term to long-term storage, through a process called reconsolidation. It has been shown before that this process can be interrupted with drugs. But Joseph LeDoux of the Center for Neural Science at New York University and his colleagues wanted to know how specific this interference was: could the transfer of one specific memory be meddled with without affecting others? "Our concern was: would you do something really massive to their memory network?" says LeDoux.

To find out, they trained rats to fear two different musical tones, by playing them at the same time as giving the rats an electric shock. Then, they gave half the rats a drug known to cause limited amnesia (U0126, which is not approved for use in people), and reminded all the animals, half of which were still under the influence of the drug, of one of their fearful memories by replaying just one of the tones. When they tested the rats with both tones a day later, untreated animals were still fearful of both sounds, as if they expected a shock. But those treated with the drug were no longer afraid of the tone they had been reminded of under treatment. The process of re-arousing the rats' memory of being shocked with the one tone while they were drugged had wiped out that memory completely, while leaving their memory of the second tone intact.

LeDoux's team also confirms the idea that a part of the brain called the amygdala is central to this process - communication between neurons in this part of the brain usually increases when a fearful memory forms, but it decreases in the treated rats. This shows that the fearful memory is actually deleted, rather than simply breaking the link between the memory and a fearful response.

Greg Quirk, a neurophysiologist from the Ponce School of Medicine in Puerto Rico, thinks that psychiatrists working to treat patients with conditions such as PTSD [post-traumatic stress disorder] will be encouraged by the step forward. "These drugs would be adjuncts to therapy," he says. "This is the future of psychiatry - neuroscience will provide tools to help it become more effective."


Historic Quotes

An important scientific innovation rarely makes its way by gradually winning over and converting its opponents. What does happen is that its opponents gradually die out, and that the growing generation is familiarised with the ideas from the beginning. — Max Planck

Found here — a great source of quotations, some just plain "historic", others forecasting the future, greeting new discoveries/ideas, etc, and getting things badly wrong:

"What use could this company make of an electrical toy?" — The President of Western Union responding to Alexander Graham Bell's offer to Western Union of the exclusive rights to the telephone for $100,000 in 1876.
