Two sites

Google Earth (via Helmintholog) is well worth watching — both immediately and as the layers feature fills out (for the UK) and the resolution at high zoom improves for all mapped areas. Playing with it today made me think back to Matt Jones' piece, Practical Mirrorworlds.

And then my thanks to anti-mega for introducing me to Bleep, 'the digital distribution arm of Warp Records'. Excellent site and anti-mega's nudge got me to download Jamie Lidell's new album, Multiply.

My thanks, too, to Ol for telling me about Tom Vek's We Have Sound.

The significant role of user experience

There's been a lot of excitement about Google fusing Keyhole and Google Maps. I just wish the rest of the world could get a look in …

Of this integration, Jason writes:

The addition was pretty big news … The ability to view satellite images online has been around for years in the form of Microsoft's Terraserver … So why is everyone so excited about it? … it … has a lot to do with someone I wrote about a couple of years ago: it's the user experience, stupid:

Robert Morris from IBM argued last year at Etech 2002 that -- and I'm paraphrasing from memory here -- most significant advances in software are actually advances in user experience, not in technology. …

The satellite feature on Google is no exception. They took something that's been around for years, made it way easier to use (reposition & zoom maps without reloading, pinpoint addresses and routes onto the satellite imagery, toggle between sat and road maps, map size automatically scales to the browser window, etc.), and suddenly this old thing is much more useful and fun to play around with.

Jason's examples: map view, satellite view. On the latter, I put in a simple 'To here' address (210 3rd Ave, New York) and the result was this:

Don Park's take is well worth reading and John Battelle pointed me to MemoryMap, a 'Flickr/Google Maps mashup'.

Wikipedia and Heavy Metal

Watching Jon Udell's excellent screencast on Wikipedia's Heavy metal umlaut entry, I had one eye on the value of screencasting generally (see previous entry) and the other on what I was learning about Wikipedia. Ignoring the data and its accuracy/inaccuracy, for a while it took me back to days spent observing the early stages of growth in bacteria colonies — the parallel is superficial in lots of ways, but as words multiplied and covered blank space the 'hypnotic' (Udell's word) effect cast its spell and I almost felt I was watching something organic at work.

It is a fine demonstration of collaborative editing in action. The old questions remain, however: Jon Udell himself comments on the part of the entry that runs, 'This is a construction only found in the Jacaltec language of Guatemala' — 'and that fact, if it is a fact' …

Or, another example, the appearance and removal of references to heavy metal's sometime "interest" in Hitler and Nazi Germany. Udell remarks he wasn't surprised to see that go, but it left me wondering about the way the darker side of some bands and music gets treated. Even the plainer statement that 'The Nazi/Hitler theme is glorified by some heavy metal groups' was edited out — 'too strong', in Udell's commentary. Yet, from the Rolling Stones' darkest days, or Bowie's infamous Nazi salute, to heavy metal — here, surely, is something unglamorous and offensive that needs to be looked at critically and in detail. As it is, we're currently left with the thought that röckdöts are designed simply to give a 'tough Germanic feel' — no explanation as to why things Germanic should be considered 'tough'.

Mind-Web: I

Mind-web symbiosis, a tantalising prospect. For one thing, a mind's memory is now being under-written:

I'm slowly trying to teach myself the methodology that Doctorow has modeled for several years now: If you want to be able to find something in the future, don't bury it in your files -- blog about it, put it out on the Net, where Google will never lose it, and if for some reason you can't find it, someone else will probably have picked it up and saved it for you. Scott Rosenberg

And with this come changing social habits: the web and the individual machines we use (mine, friends', our workplace's, cafés' …) flowing into one — my hard drives, my memory sticks, my camphone … all now integrated with web-based applications. Prompted by Jeremy Zawodny, I turned up Paul Graham's 2001 essay, 'The Other Road Ahead':

When we look back on the desktop software era, I think we'll marvel at the inconveniences people put up with … When you own a desktop computer, you end up learning a lot more than you wanted to know about what's happening inside it. …

There is now another way to deliver software that will save users from becoming system administrators. Web-based applications …  For the average user this new kind of software will be easier, cheaper, more mobile, more reliable, and often more powerful than desktop software. … you won't ordinarily need a computer, per se, to use software. All you'll need will be something with a keyboard, a screen, and a Web browser. Maybe it will have wireless Internet access. Maybe it will also be your cell phone. Whatever it is, it will be consumer electronics: something that costs about $200, and that people choose mostly based on how the case looks. You'll pay more for Internet services than you do for the hardware, just as you do now with telephones. … if you look at the kind of things most people use computers for, a tenth of a second latency would not be a problem. My mother doesn't really need a desktop computer, and there are a lot of people like her. …

The whole idea of "your computer" is going away, and being replaced with "your data." You should be able to get at your data from any computer. Or rather, any client, and a client doesn't have to be a computer.

Jeremy's comment: 'The more I find myself using increasingly larger and cheaper USB memory sticks, my colocated server, and on-line services like Flickr, I realize how my desktop and laptop computers are becoming less and less important in the grand scheme of things. And when I think about how popular web-based mail systems (Y! Mail, Hotmail, GMail) are, it's apparent that a lot of folks are keeping their data elsewhere.'

It's already happened for many of us, but let's not forget how few in number "we" still are: my teaching colleagues were amazed last week that I have on my hard drive files of data stretching back many years, and I've long since grown used to their amazement that I keep e-mail. The gap between rich and poor nations preoccupies us, and quite rightly so, but the division within our own societies between the "digitally active" and the rest is so big it's often difficult to remember that it's there at all. To many of our co-workers and colleagues, computers are merely clever calculators, typewriters and terminals for Googling and grabbing an article.

We have a long way to go before we realise that vision of collaborative interaction, of many mind-webs working together across physical and temporal distances, that Jon Udell wrote about last December:

The blog network is made of people. We are the nodes, actively filtering and retransmitting knowledge. … The resemblance of this model to the summing of activation potentials in a neural system is more than superficial.

We need to do much more in schools to introduce students to the implications of the revolution that is the web. A point of entry that appeals to students is memory, and it would be interesting to devise a sixth-form course (in the UK, this means 17–18 year-olds) which takes in ancient memory practices, looks at tips and tricks (Dominic O'Brien's Learn to Remember, etc) and also embraces digital and other web-based developments that greatly enhance our ability to record and access individual and communal memory.

At the far reaches of this canvas might be the (surely highly speculative, if undoubtedly ambitious!) project being undertaken by Artificial Development — an attempt to create an artificial neural network cognitive system (as reported by Roland Piquepaille last July). Once engaged, students might be led to discuss work such as that of Tom Daniel, who 'integrates concepts in zoology, engineering, and mathematics'. This might then lead us back to the idea of the web as analogous to a neural system in which we are each a node, and to a wider discussion of the biological metaphors underlying some of the ways we are coming to understand what we are creating in this thing we call the web and, in turn, what this tells us about ourselves … I can envisage such a course — at once rich, allusive and tightly integrated with traditional knowledge and cutting-edge technology.

And when we tackle memory in this course, we must not forget forgetfulness! Anne Galloway wrote about this back in 2003, and linked there to Fabio Sergio (also writing in 2003):

All in all we are facing a future strung tight between the ideal, pacific world of the Memex, where man will be given "access to and command over the inherited knowledge of the ages", and one where Lenny Nero will feel at home, characterized by our collective inability to let go of our past. I keep hoping (and working) for the first scenario to become our future, but recognize it will require active involvement from everyone, driven by ample awareness of what's at stake.

Wikipedia and the search for truth

The debate about Wikipedia will no doubt continue for a long time. This post is written as a small contribution to it (and assumes some familiarity with the recent round of discussions).

Jimmy Wales is quoted in Wired News as saying, "We're after ... a standard that is suitable for the general reader … it should lead me to where I'm ready to learn more".

Wired News suggests that Wales may have been referring to more than Wikipedia in making these remarks: the indication is that he was arguing that no encyclopedia is a 'top-tier reference source'.

However these terms ('top-tier reference source', 'encyclopedia') are defined, the discussion surrounding Wikipedia must not lose sight of a reference work's truthfulness, its reliability. We value first-rate reference books for their attempt to approach the truth, however arduous the attempt and elusive the quarry. As contemporary users (and not as historians studying "encyclopedias"), we value them as perceptive, summary reports, consolidating both what is established as known and what might (or might not) be glimpsed at the frontiers of knowledge. To perform these tasks well demands remarkable skills, knowledge and personal qualities.

In his inaugural lecture as Professor of Poetry at Oxford (1956), Auden spoke of the 'unselfish courage' of scholars who 'read the unreadable' and so 'retrieve the rare prize'. Often the target of jest (Rabelais!), scholars perform indispensable, painstaking work, building up the knowledge and apparatus by which we can come closer to the truth about a subject.  In a scholarly resource, writes Danah Boyd (her particular example is taken from the Emile Durkheim Archive), we have 'citations as well as interpretation of both primary and secondary texts … We know the status of the author (here, a student in sociology), why he wrote this entry and who has checked his entry for verification. Yes, he could be lying, but this is much more reassuring than an entry written by N unknown people'. She adds, 'I also believe that there is something to be said about expertise. The eccentric PhDs with their narrow focus have spent years dedicated to understanding very particular areas of knowledge. They are invested in the accuracy of a particular topic, understand the different debates and are deeply aware of the consequences of poor interpretation. They research things actively, trying to express all sides. It is not simply their authority that makes their descriptions have weight - it is also what they have to lose'. They are under discipline and it is (or should be) exacting: Samuel Johnson defined a scholar as 'one who learns of a master'.

Todd Wilkens (More Smarter) says (in two postings):

Authority of information and knowledge is about quality not quantity.

… too much power in the hands of an elite can be extremely dangerous. However, the majority of us believe that knowledge is not merely a matter (of) rhetorical consensus. Some people really do have expertise and we should take advantage of it. Perhaps the wiki model as it stands now is not the right way to address this.

Ross referred at Many2Many to a paper by Andrew Lih, Wikipedia as Participatory Journalism: Reliable Sources? (pdf), in which Lih concludes:

Open content projects such as Wikipedia received their inspiration from the earlier open source software community that emerged from online collaboration for developing software. Linus Torvalds, leader of the Linux open source movement once said, “Given enough eyeballs, all bugs are shallow.” He was referring to software development, but it is equally relevant to Wikipedia. This use of more “eyeballs” is a rather unique feature of participatory journalism, as it benefits directly from more traffic and more users making their impression by scrutinizing or contributing to content. This tight feedback loop between reading and editing provides for very quick evolution of encyclopedic knowledge, providing a function that has been missing in the traditional media ecology.

Lih's research, as I understand it, is based explicitly on two assumptions that give him his "metrics" for measuring an article's reputation within Wikipedia:

The assumption is that more editing cycles on an article provides for a deeper treatment of the subject or more scrutiny of the content. …

With more editors, there are more voices and different points of view for a given subject

The number of times an article in Wikipedia has been edited, or the number of editors it has had, certainly tells us something about the level of attention it has attracted. But this, the very "open source-ness" of Wikipedia, is absolutely no guarantee of its scholarliness or its truthfulness.
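Lih's two metrics are simple enough to sketch. The following is a minimal illustration (my own, not Lih's actual code) of how "rigor" and "diversity" might be computed from an article's revision history; the revision tuples and editor names here are invented for the example.

```python
# A minimal sketch of Lih's two reputation metrics for a Wikipedia
# article, computed from a revision history. Each revision is assumed
# to be a (timestamp, editor) pair.

def article_metrics(revisions):
    """Return (rigor, diversity) for one article's revision history.

    Lih's assumptions: more edits imply deeper scrutiny ("rigor"),
    and more distinct editors imply more points of view ("diversity").
    """
    rigor = len(revisions)                                  # total edit cycles
    diversity = len({editor for _, editor in revisions})    # unique editors
    return rigor, diversity

# Hypothetical revision history, purely for illustration.
history = [
    ("2004-01-01", "alice"),
    ("2004-01-03", "bob"),
    ("2004-01-04", "alice"),
    ("2004-02-10", "carol"),
]
print(article_metrics(history))  # (4, 3)
```

As the paragraph above argues, both numbers are easy to compute and do measure attention — but nothing in either counter measures whether what survived the edits is true.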

A better model for collaborative investigation into truth might be drawn from the teamwork commonly involved in science. In my experience of this, groups within a research laboratory make presentations on their work in progress, other members of the team and guests discuss and criticise this, the work is re-considered, etc. Papers that are then submitted for publication are subjected to peer review and, once published, stand or fall by their tested veracity.

'Peer production of shared knowledge' (Clay Shirky) without these checks and challenges is not desirable in a work of reference. Some such accommodation within the wiki model is required.

Wikipedia: what is the main issue?

'the real issue here: media literacy' — foe romeo

Having the page history of a Wikipedia entry clearly and accessibly represented on that entry's page will be valuable. But in itself this can't be the core issue: the core issue is still the validity or otherwise of the information (see my previous entry on this). I don't think the threefold nostrum that "more is better, older is better, fresher is better" (see Clay Shirky's recent piece) somehow leads us to the promised land. Trustworthiness in the world of the Wiki Now doesn't lie simply in the number of contributors, the length of time an entry has been worked on or how recently it has been re-worked. In short, I share Danah Boyd's critical enthusiasm for Wikipedia: 'As a contributor to and user of Wikipedia, there is no doubt that i have a deep appreciation for it. … On topics for which i feel as though i do have some authority, i'm often embarrassed by what appears at Wikipedia'.

Take, for example, an issue to do with a minority group in our society: Roman Catholics. The Spectator published an article in its 18/25 December issue, We are all pagans now, in which the claim was made that Dominican monks are 'Satan spawn'. A reply from Timothy Radcliffe, former Master of the order, was published in the following week's issue, quoting Diarmaid MacCulloch, the (non-Catholic) Professor of Church History at Oxford University, and challenging the widespread view in our society that there was such a thing as "the" Inquisition, that the Dominicans were responsible for it and that they (and inquisitors in general) were very wicked.

Wikipedia has a number of entries to do with, or bearing on, "the" Inquisition. Reading these entries, I can hear a number of different editorial voices, detect differing currents of thought tugging the articles this way and that. What I can't tell, even if I had the benefits of the additional visualisation (as proposed), is how … authoritative (informed, academically impartial) these editors are. Turning from The Spectator to do some research on "the" Inquisition, I would find in Wikipedia much to fire me up (no pun intended) and set me off on further research, but I wouldn't feel confident that I am in touch with something I could then simply use in a lesson (I'm a teacher). I would be happy to say to students, 'Wikipedia says …; now let's do some further digging …'

In the case of a minority group and its rights (including its rights to be understood fairly across time), it clearly matters that we can, indeed, trust the reference "books". But the same would apply, mutatis mutandis, if I were wanting some information on nuclear physics or post-modern literature or …

I can't see that, in itself, it matters how many times a page has been edited: views that are prejudiced or partial can still get to stand as truth unless there is some careful consideration given to matters to do with governance and editing. As I said before, these are issues centred around accuracy, trust, collaborative editing and reputation, and these simply won't go away. I welcome the discussion being informed by matters to do with the design of Wikipedia. In the end, transparency to the truth is what matters and design is very important here — but it has to be in the service of that quest for truth, veritas (as the Dominicans say). (I, too, by the way, find the notion of 'authority' rather boring. The authority of the truth, however, is a different matter.)

I am struck by the fact that within Wikipedia there seems to be a recognition that these issues to do with governance and editing must be faced. IMSoP's contributions to Joi Ito's post (see my earlier post) indicate this.

Interesting new search engine,

… a free, ad-supported, reference search service, created to provide you with instant answers on over a million topics. As opposed to standard search engines that serve up a list of links for you to follow, displays quick, snapshot answers with concise, reliable information. Our editors take our content from over 100 authoritative encyclopedias, dictionaries, glossaries and atlases, carefully chosen for breadth and quality. For ultimate convenience, install 1-Click Answers™ software, and click on any word in any document on your screen for "Answers at your Fingertips". is the next-generation product of the GuruNet Corporation, which leverages its patented "Answer Engine" technology to bring instant answers to millions of internet users around the globe.

via log os

Wikipedia: when is an encyclopedia not an encyclopedia?

Danah Boyd:

I don't believe that the goal should be 'acceptance' so much as recognition of what Wikipedia is and what it is not. It will never be an encyclopedia, but it will contain extensive knowledge that is quite valuable for different purposes.

Clay Shirky:

So the idea that the Wikipedia will never be an encyclopedia is in part an ahistorical assertion that the definition and nature of encyclopediahood is fixed for all time, and that works like Britannica are avatars of the pattern. Contra boyd, I think Wikipedia will be an encyclopedia when the definition of the word expands to include peer production of shared knowledge, not just Britannica's institutional production.

Wikipedia is not a product, it is a system. The collection of mass intelligence that I value unfolds over time, necessarily. Like democracy, it is messier than planned systems at any given point in time, but it is not just self-healing, it is self-improving. Any given version of Britannica gets worse over time, as it gets stale. The Wikipedia, by contrast, whose version is always the Wiki Now, gets better over time as it gets refreshed. This improvement is not monotonic, but it is steady. …

So, is Wikipedia authoritative? No, or at least not yet, because it has neither the authority of the individual merchant or the commercial brand. However, it does have something that neither mechanism offers, which is a kind of market, where the investment is time and effort rather than dollars and cents. This is like the eBay model … Now when eBay launched, people were skeptical, because the site wasn't trustworthy. The curious thing about trust, though, is that it is a social fact, a fact that is only true when people think it is true. Social facts are real facts, and have considerable weight in the world. … Ebay has become trustworthy over time because the social fact of its trustworthiness grew with the number of successful transactions and with its ability to find and rectify bad actors. …

So, under what conditions might the Wikipedia become a kind of authority, based on something other than authorship or brand? And the answer to that question, I think, is when enough people regard it as trustworthy, where the trust is derived from the fact that many eyes have viewed a particular article. And here danah points to something interesting — she believes, and I believe with her, that a Wikipedia page created by a single user isn't as valuable as a page that has been edited by many users. … And once that social fact is established, authority, albeit of a more diffuse and systems-oriented sort, won't be far behind.

Many interesting points are raised here, but I believe there are problems too, already touched on elsewhere by others who have contributed to this on-going debate. (I would like to single out 'more is better': really? Who makes up the more, and who, at any given time, has had the last word?) For example, in the comments to Joi Ito's 29 August (2004) posting, Wikipedia attacked by ignorant reporter:

Liz Lawley (comment 8):

… while the back-and-forth of community editing may, over time, result in information with significant balance and validity, there's also the very real potential of an unsuspecting user coming across an article during a pendulum swing. With print reference sources, that back-and-forth occurs as well, but it's typically invisible to the end-user, who always receives the post-debate version.

Horst (comment 23):

Imagine a user who needs to find a bit of information and consults a wikipedia article. Can this user be certain that the article is correct as he finds it at any given point in time? The average unsuspecting user doesn't care about knowledge building processes, he is only interested in the result of such a process. Wikipedia, however, is one giant process that, by its very nature, is constantly changing its shape. For the average user (who does not care at all about version histories of the articles) there is no way of telling whether an article is more in flux or more stable, or whether it has just been defaced by a person with a sense of humour like Alex Halavais [see comment 5].

Furthermore, there are apparently plans to print Wikipedia or to produce it on CD-ROM (IMSoP, comment 27). What, then, of the argument from wiki-ness ('Wikipedia, by contrast, whose version is always the Wiki Now, gets better over time as it gets refreshed')? IMSoP continues:

In future, it is likely that the wiki-process will be used to build, improve, and correct articles, which are then verified before being labelled as "authoritative"; under such a system, arbitrary vandalism would not only be corrected, but would be invisible to genuine end-users (i.e. "users not engineers" as you put it). OK, so we're not yet sure how; that's a challenge, but it's not an impossibility just because we want to balance it against the clear advantages of the wiki approach. The Wikipedia interface already diverges quite significantly from the "classical" WikiWikiWay - it has separated discussion from content, the ability to protect pages, ban users, and yet do so to some extent within the spirit of openness that the project is founded on. So yes, it has had to become more complicated than a traditional wiki, and gets ever more so as it approaches in similarity to a traditional encyclopedia. It is neither the same as a "normal" wiki, nor is it the same as a "normal" encyclopedia; nonetheless, it has many real uses for real people. In short: it's not a wiki, it's not an encyclopedia, it's the one and only Wikipedia; and as it matures, it will find it's own, authoritative, place in the world.

(Consider, also, this from Matt Jones' September '04 posting, Authority and Autonomy: 'The wikipedia's structural strength and resilience confered by its form, also condemns it to be being in the constant flux of the wikinow - and that immediately erodes it's 'authority' in traditional terms …'. And this, from Dave Winer: '… the inherent weakness in the Wiki model, the consensus isn't always correct, esp when some people want to have their point of view prevail above all others.')

As is probably well known, Dispatches from the Frozen North performed a more covert series of acts of "vandalism" on Wikipedia than did Alex Halavais:

I was disappointed that all my changes in Wikipedia went unchallenged. … One way to solve this weakness is to create a formal fact-checking mechanism. In Wikipedia, contributions of new material are certainly valuable, but fact checking is even more important.

Clay's second posting concludes:

And the more macro point is that Wikipedia is still in the early days of experimenting with models of governance, editing, or, as here, presentation to the users.

Issues to do with governance and editing surely lie at the heart of this debate: issues, that is, centred around accuracy, trust, collaborative editing and reputation — see Ross Mayfield's post of August last year.

Footnote: I read in Reagle's A Case of Mutual Aid:

Wikipedia is the populist offshoot of the Nupedia project started in March of 2000 by Jimbo Wales and Larry Sanger. Nupedia's mission was to create a free encyclopedia via rigorous expert review under a free documentation license. Unfortunately, this process moved rather slowly and having recently been introduced to Wiki, Sanger persuaded Wales to set up a scratch-pad for potential Nupedia content where anyone could contribute. However, "There was considerable resistance on the part of Nupedia's editors and reviewers, however, to making Nupedia closely associated with a website in the wiki format. Therefore, the new project was given the name 'Wikipedia' and launched on its own address,, on January 15 [2001]" (Wikipedia 2004hw). [ History of Wikipedia. Retrieved on April 29, 2004 from <>]

Given this, I think it amusing that this whole recent round of debate about Wikipedia's authority and/or integrity and/or reliability has been sparked off by (amongst others) none other than Larry Sanger. Many of the "problems" he identifies seem to be characteristic of the radically open nature of wikis …