Wikis

Kayaking

The internet means you don’t have to convince anyone else that something is a good idea before trying it.
Scott Bradner, former trustee of the Internet Society (quoted in Here Comes Everybody)

The communications tools broadly adopted in the last decade are the first to fit human social networks well,
and because they are easily modifiable they can be made to fit better over time.
— Clay Shirky (Here Comes Everybody, p 158) 

Back before Easter, I was at the ICA for the Eno/Shirky evening. One of the books I then read over the break was Here Comes Everybody. I’ve been meaning for some time to put down a few notes about it here. This has grown to be a long post as I’ve added to it, wanting to get a few things out on the page and, so, clearer in my own mind.

It’s a great book to suggest to friends who are not familiar with the technologies Shirky discusses as it hides its knowledge well — but there are still leads to follow up. The modest ten or so pages of the Bibliography threw up a number of articles I'd either not heard of before or hadn’t visited in a long while. In the former camp, I recommend: Anderson: More Is Different (Science — 1972); R H Coase: The Nature of the Firm (pdf) — a 1937 economics paper; Richard P. Gabriel — Lisp: Good News, Bad News, How to Win Big: worse is better (1991); Alan Page Fiske: Human Sociality. (There’s an online “webliography” here.) And chapters 8–11, covering so many big topics — social capital; three kinds of loss (some solve-a-hard-problem jobs; some social bargains; negative aspects to new freedoms); small world networks; more on social capital; failure (‘open source … is outfailing’ commercial efforts, 245); more on groups (‘every working system is a mix of social and technological factors’, 260) — hit my Amazon Prime account hard. (Incidentally, there’s a Kevin Kelly piece on “more is different”, Zillionics, that appeared earlier this year. See also Kevin Kelly’s The Google Way of Science and Wired’s The Petabyte Age: Because More Isn't Just More — More Is Different.)

Further reading to one side, a number of things discussed in the book particularly interested me straightaway. Firstly, sociality, privacy and exposure online. Leisa recently posted Ambient Exposure, an update (of sorts) to her post of last March, Ambient Intimacy. The titles tell their own story. Early on, Clay writes about ‘how dramatically connected we've become to one another … [how much] information we give off about our selves’. This took me back to Adam Greenfield’s recent talk at the Royal Society (I’ve also been re-reading Everyware). Our love of flocking is being fed handsomely by means of the new tools Clay Shirky discusses so well.

Privacy is always coming up in conversations at school about online life, and what I’m hearing suggests our students are beginning to look at privacy and exposure with growing circumspection. Facebook’s People You May Know functionality has made some sit up and wonder where social software might be taking us. We’re slowly acquiring a stronger sense of how seduction through imagined privacy works (alone in a room, save for screen and keyboard) and a more developed understanding of what it means to write for unseen audiences. Meanwhile, there are things to be unlearned: ‘those of us who grew up with a strong separation between communication and broadcast media … assume that if something is out where we can find it, it must have been written for us. … Now that the cost of posting things in a global medium has collapsed, much of what gets posted on any given day is in public but not for the public’ (90).  In the Bibliography, Clay refers to a post of Danny O’Brien’s — all about register — which is a longtime favourite of mine, too.

Then there was what the book had to say about media and journalism. Simon Waldman, well-placed to pass comment, on chapters 3 and 4:

The chapters most relevant to media/journalism - ‘Everyone is a media outlet’ and ‘Publish first, filter later’ should be required reading for pretty much everyone currently sitting in a newspaper/broadcaster. It’s certainly the best thought through thing I’ve read on this, and the comparison to the decline of the scribes when the printing press came in is really well drawn. 

The summary to Chapter 4 (‘Publish, Then Filter’) runs, ‘The media landscape is transformed, because personal communication and publishing, previously separate functions, now shade into one another. One result is to break the older pattern of professional filtering of the good from the mediocre before publication; now such filtering is increasingly social, and happens after the fact’. ‘Filter-then-publish … rested on a scarcity of media that is a thing of the past. The expansion of social media means the only working system is publish-then-filter’ (98). (Language like this can sound a utopian note that rings on in the head long after the book’s been closed, as if we’d entered a world beyond old constraints. And look: the Praetorian Guard of elite gatekeepers is no more.)

I was interested, too, to read Shirky’s thoughts about the impact of new technologies on institutions. His application of Ronald Coase’s 1937 paper and, in particular, the idea of the Coasean floor (‘activities … [that] are valuable to someone but too expensive to be taken on in any institutional way’), was very striking: the new tools allow ‘serious, complex work [to be] taken on without institutional direction’ and things can now be achieved by ‘loosely coordinated groups’ which previously ‘lay under the Coasean floor’.

We didn't notice how many things were under that floor because, prior to the current era, the alternative to institutional action was usually no action. (47)

Later in the book (107), he comes back to institutions, taking what is happening to media businesses as not unique but prophetic — for ‘All businesses are media businesses … [as] all businesses rely on the managing of information for two audiences — employees and the world’:

The increase in the power of both individuals and groups, outside traditional organisational structures, is unprecedented. Many institutions we rely on today will not survive this change without significant alteration, and the more an institution or industry relies on information as its core product, the greater and more complete the change will be. The linking of symmetrical participation and amateur production makes this period of change remarkable. Symmetrical participation means that once people have the capacity to receive information, they have the capability to send it as well. Owning a television does not give you the ability to make TV shows, but owning a computer means that you can create as well as receive many kinds of content, from the written word through sound and images. Amateur production, the result of all this new capability, means that the category of "consumer" is now a temporary behaviour rather than a permanent identity.

‘Every new user is a potential creator and consumer’ (106) is reminiscent of Bradley Horowitz in Creators, Synthesizers, and Consumers (2006).

*****



TiddlyWiki

I've been playing around with a variety of wiki software with an eye on what I might recommend for colleagues at school. It was good to meet Jeremy Ruston at Reboot. Jeremy is the founder and CTO of Osmosoft, 'the publisher of TiddlyWiki, a popular and well-regarded free tool that is relied on by hundreds of thousands of people around the world to record, organise and share all kinds of information'.

Doc Searls' post yesterday, Food for rethought, is a good and quick reminder of some of the things that make TiddlyWiki interesting, but I particularly liked these comments of Jeremy's — made whilst demo-ing TiddlyWiki to Doc:

We don't have many weapons to use against really ineffectual people ...  It's reasonable to talk about software as being alive ...  It's symbiotic ... It needs a host geek in which to live ... The value in software is as much in its potential as in its functionality ...

I heard the news of BT buying Osmosoft from Jeremy when we were in Copenhagen. He blogged about it at the end of May:

I’m delighted to announce that the mighty BT has acquired my tiny little company Osmosoft Limited. I’m joining BT as Head of Open Source Innovation, and I’ll be building a crack open source web development team called BT Osmosoft. … BT is becoming a remarkable thing: a truly internet-scale consumer company that doesn’t rely on owning “secret sauce” software for its business. At the most senior levels, there’s an appetite to embrace open source that wouldn’t disgrace a web 2.0 startup. I’ll be working with a great many talented and interesting people, and I’m looking forward to it immensely. … I hope BT’s endorsement of TiddlyWiki will open up new applications that we haven’t thought of yet. To meet the challenges that they bring, I’ll continue to strive to keep the core of TiddlyWiki true to its origins as a lean, efficient non-linear personal web notebook.

I see TiddlyWiki has just had its 2.2 release. I'm off to take a closer look at it.


Paradigm shifts

I like the discipline of the del.icio.us 255-character limit for the excerpt from, or comment on, the item you're bookmarking there. But sometimes there's just too much that's good to be contained or summed up like that.

The amazing miracle of YouTube versus The Times, as everyone reading this blog surely already knows, is that YouTube is a platform where cream--user-uploaded videos--rises to the top, to be savored by the world, while The New York Times Company is an information organization that pays thousands of journalists, designers, business people and administrative types millions of dollars to create expert content that tells people what to think and what to like. And honey, that day is passing fast.

The point here--just to kick it a little harder--is that this is yet more evidence of how social media platforms are shifting the paradigms in a profound way. Not only does YouTube have a mass-market, video-on-the-web appeal that the more high-brow Times will never have (Is YouTube the next MTV?). Furthermore, it's a platform that gives Google the opportunity to morph into a multimedia MySpace ecosystem, way beyond what Orkut could ever be--and, most cruelly, it's something that teens and twenty-somethings care about, which may no longer be the case for The New York Times.

So Google bought YouTube, not a media company, and the fact that this doesn't even surprise anyone anymore and that it makes perfect sense, that, dudes, is a paradigm shift.

*

… consumerization will be the most significant trend to have an impact on IT over the next 10 years. … "Consumers are rapidly creating personal IT architectures capable of running corporate-style IT architectures," he [Gartner's director of global research, Peter Sondergaard] said. "They have faster processors, more storage and more bandwidth."

He advised corporate IT executives to adapt to the changes and prepare for what he called "digital natives," or people so fully immersed in digital culture that they are unconcerned about the effects of their technology choices on the organizations that employ them. … 

In a paper prepared by Gene Phifer, David Mitchell Smith and Ray Valdes, Gartner researchers noted that corporate IT departments historically have lagged behind popular technology waves, such as the arrival of graphical user interfaces and the Internet in business. They argued that the biggest impacts of Web 2.0 within enterprises are collaboration technologies--notably blogs, wikis and social networking sites--and programmable Web sites that allow business users to create mashup applications. … "Our core hypothesis is that an agility-oriented, bifurcated strategy--one reliant on top-down control and management, the other dependent on bottom-up, free-market style selection--will ultimately let IT organizations play to their strengths while affording their enterprises maximum opportunity as well," the Gartner report said.



Web 2.0: 'what the Web was supposed to be all along'

Tim Berners-Lee, interviewed by Scott Laningham for IBM developerWorks

BERNERS-LEE: … the original World Wide Web browser of course was also an editor. I never imagined that anybody would want to write in anchor brackets. We'd had WYSIWYG editors for a long time. So my function was that everybody would be able to edit in this space, or different people would have access rights to different spaces. But I really wanted it to be a collaborative authoring tool. And for some reason it didn't really take off that way. And we could discuss for ages why it didn't. You know, there were browser editors, maybe the HTML got too complicated for a browser just to be easy. 

But I've always felt frustrated that most people don't … didn't have write access. And wikis and blogs are two areas where suddenly two sort of genres of online information suddenly allow people to edit, and they're very widely picked up, and people are very excited about them. And I think that really for me reinforces the idea that people need to be creative. They want to be able to record what they think. … 

LANINGHAM: You know, with Web 2.0, a common explanation out there is Web 1.0 was about connecting computers and making information available; and Web 2 is about connecting people and facilitating new kinds of collaboration. Is that how you see Web 2.0? 

BERNERS-LEE: Totally not. 

Web 1.0 was all about connecting people. It was an interactive space, and I think Web 2.0 is of course a piece of jargon, nobody even knows what it means. If Web 2.0 for you is blogs and wikis, then that is people to people. But that was what the Web was supposed to be all along. 

And in fact, you know, this Web 2.0, quote, it means using the standards which have been produced by all these people working on Web 1.0. It means using the document object model, it means for HTML and SVG and so on, it's using HTTP, so it's building stuff using the Web standards, plus JavaScript of course. So Web 2.0 for some people it means moving some of the thinking client side so making it more immediate, but the idea of the Web as interaction between people is really what the Web is. That was what it was designed to be as a collaborative space where people can interact.

Now, I really like the idea of people building things in hypertext, the sort of a common hypertext space to explain what the common understanding is and thus capturing all the ideas which led to a given position. I think that's really important. And I think that blogs and wikis are two things which are fun, I think they've taken off partly because they do a lot of the management of the navigation for you and allow you to add content yourself. 

But I think there will be a whole lot more things like that to come, different sorts of ways in which people will be able to work together. 

The semantic wikis are very interesting. These are wikis in which people can add data and then that data can then be surfaced and sliced and diced using all kinds of different semantic Web tools, so that's why it's exciting the way people, things are going, but I think there are lots of new things in that vein that we have yet to invent.


Transcript here. Podcast here. (Found via Read/Write Web.)

There is something so generous and inspiring in this originating vision of Sir Tim's — made all the more so because it was there at the outset. Had the Web been widely understood in this way from the start, many walled gardens (I'm thinking particularly about schools) would have resisted it vigorously. But now, or (at least) for now, walls have been breached.

I spoke about the-web-as-the-read/write-web, and its implications for education, at Reboot and MicroLearning: see here.



Librarians and the future now

Yahoo! search blog — Mark Sandler, University of Michigan:

1,100 librarians recently swarmed on the seaside town of Monterey, California for a deep dive in search technology, and I was among them. Topics included desktop search, visual and clustering search, podcasting, taxonomies and metadata, RSS, blogs, wikis, online education, intranets, spyware, digitization, wireless access, and more. In today’s world of search engines, librarians are reaching way beyond the physical walls of the library.

To make library services more compelling, some librarians have begun experimenting with new virtual reference techniques like instant messenger and text-messaging to interact with patrons. Although some adults may be slow to adopt these techniques in the library, just imagine the usefulness to all the teenagers who already use instant messenger and text-messaging as their main methods of communication. Elsewhere, librarians discussed creating online library catalogs that allow patrons to tag, comment, review, share, recommend, and otherwise create a virtual community around records in the catalog. Imagine browsing through a library catalog and seeing other people’s reviews or recommendations for similar items. Sounds like what happens on many Web sites now, places like Yahoo! Local, My Web 2.0, Flickr, Furl, Amazon.com, etc.

… librarians are continuing to evolve their roles now that people rely so heavily on search engines. What does this mean?

  • For search, knowing when to use particular vertical and specialty engines, specialty databases, meta-search engines, advanced search syntax for the big engines, and so forth.
  • For news, helping people use RSS, email alerts … to know when new and relevant content is available online.
  • For sharing information, helping people find and share with others by using blogs, wikis, and tagging.

As the worlds of online and offline libraries continue to converge, I think this quote summarizes the conference perfectly: “In 2020, Internet Librarian will simply be called the Librarian Conference.”



Challenges for pupils … and teachers

Barb writes:

Answers that used to be difficult to find were disseminated by teachers and students were quizzed to see if they’d paid attention. Now the knowledge itself is no longer scarce — is there a sense in which we should be teaching our kids how to “pull” the information they need instead of “pushing” in advance what we think they might need to know? Is there a sense in which the always-on information field of the web may be shifting what we think of as education? What are your thoughts?

I think we are not just on the threshold of some fundamental alterations to the ways we teach, but already well down a road which will alter the very idea of what teaching is about. That this isn't necessarily clear to us, or even noticed by many, is hardly surprising. Pull, not push — we have a lot to do to show students (and colleagues) how this works and what differences it makes.

In an apparently unrelated posting, Folksonomy Definition and Wikipedia, Thomas writes:

The lack of understanding the medium of a Wiki, which is very fluid, but not forgetful, is astonishing. They have been around for three or four years, if not longer. It is usually one of the first lessons anybody I have known learns when dealing with a Wiki, they move and when quoting them one must get the version of the information. They are a jumping off point, not destinations. They are true conversations, which have very real ethereal qualities. Is there no sense of research quality? Quoting a Wiki entry without pointing to the revision is like pointing to Time magazine without a date or issue number. Why is there no remedial instruction for using information in a Wiki?

Personally, I love Wikis and they are incredible tools, but one has to understand the boundaries. Wikis are emergent information tools and they are social tools. They are one of the best collaboration tools around, they even work very well for personal uses. But, like anything else it takes understanding on how to use them and use the information in them.

Thomas' posting is important on a number of fronts — folksonomy (obviously), how to use Wikipedia — but just now these remarks about how to use wikis struck home as I was pondering the push/pull question. Yes, Barb, things are shifting in education, and amongst the pressing challenges for us and our pupils is to learn what revision means and how, in pulling knowledge, we must acquire research disciplines that have hitherto been fairly embryonic at the secondary level.
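
To make Thomas's point concrete: a citation that will still mean something later needs to pin the exact revision, not just the page. Here is a minimal sketch (mine, not Thomas's) of how one might do that in Python against the standard MediaWiki query API, using the folksonomy entry purely as an example; treat the endpoint and parameter names as assumptions about the current API rather than a recipe.

    import json
    import urllib.parse
    import urllib.request

    API = "https://en.wikipedia.org/w/api.php"

    def permanent_link(title: str) -> str:
        """Return a URL pinned to the article's current revision."""
        params = urllib.parse.urlencode({
            "action": "query",
            "prop": "revisions",
            "rvprop": "ids|timestamp",
            "titles": title,
            "format": "json",
        })
        request = urllib.request.Request(
            f"{API}?{params}",
            headers={"User-Agent": "revision-citation-sketch/0.1"},  # identify the script politely
        )
        with urllib.request.urlopen(request) as response:
            data = json.load(response)
        page = next(iter(data["query"]["pages"].values()))
        revid = page["revisions"][0]["revid"]  # the revision actually being quoted
        return f"https://en.wikipedia.org/w/index.php?title={urllib.parse.quote(title)}&oldid={revid}"

    print(permanent_link("Folksonomy"))

The resulting oldid link is the wiki equivalent of citing Time magazine with a date and issue number: anyone who follows it sees the text as it stood when it was quoted.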


Wikipedia and Heavy Metal

Watching Jon Udell's excellent screencast on Wikipedia's Heavy metal umlaut entry, I had one eye on the value of screencasting generally (see previous entry) and the other on what I was learning about Wikipedia. Setting aside the data and its accuracy or inaccuracy, for a while it took me back to days spent observing the early stages of growth in bacterial colonies — the parallel is superficial in lots of ways, but as words multiplied and covered blank space the 'hypnotic' (Udell's word) effect cast its spell and I almost felt I was watching something organic at work.

It is a fine demonstration of collaborative editing in action. The old questions remain, however: Jon Udell himself comments on that part of the entry that runs, 'This is a construction only found in the Jacaltec language of Guatemala' — 'and that fact, if it is a fact' …

Or, another example, the appearance and removal of references to heavy metal's sometime "interest" in Hitler and Nazi Germany. Udell remarks he wasn't surprised to see that go, but it left me wondering about the way the darker side of some bands and music gets treated. Even the plainer statement that 'The Nazi/Hitler theme is glorified by some heavy metal groups' was edited out — 'too strong', in Udell's commentary. Yet, from the Rolling Stones' darkest days, or Bowie's infamous Nazi salute, to heavy metal — here, surely, is something unglamorous and offensive that needs to be looked at critically and in detail. As it is, we're currently left with the thought that röckdöts are designed simply to give a 'tough Germanic feel' — no explanation as to why things Germanic should be considered 'tough'.


Comment spam and "nofollow"

"re Google’s rel="nofollow"  initiative, I am pleased to see that voices critical and/or doubtful are making themselves heard. With due acknowledgment of the anti-social nature of irresponsible self-promotion by linking to your own blog in comments, I share the anxieties of other small (and not so small) bloggers and left some thoughts on Anil Dash's post yesterday, The Social Impacts of Software Choices.

Will the "cure" be worse than the disease? Ben Hammersley thinks so: 'forcing comment spammers to cast a wider net will cause them to target the long tail of people who have no idea what to do'. There's also the issue of whether or not companies are right to have imposed this initiative on their customers, about which TDavid makes good points. Various writers have raised the problem that webmasters now have an easy way to 'abuse the tag and control the PageRank of their pages' (eg, Slowplay).

I was pleased to read John Battelle yesterday, questioning the rel="nofollow" development in a fair, calm and open-minded way. I would have hoped for more discussion within the blogosphere before this move was forced on so many of us. John Battelle wrote:

… what bothers me is that there may well be an ecology that evolves based on the link mojo in comments which we can't imagine, but that would be important and wonderful, and that will not develop if every comment has a tag telling search engines to ignore it. Like it or not, search engines are now processors of our collective reality, and fiddling with that requires some contemplation.

In an update to this same posting, John Battelle adds (leading off from observations about Anil Dash's post and the discussion-in-comments it attracted):

No Follow will discourage people from doing what I'll call "fully web-expressed writing" on other people's blogs - where they write in that rather post-modern way of linking as they write, which is what we all do in this bloggy world we live in. A deft web writer is like a spider pulling strands to support his or her central thesis - it's an emerging form of communication, and from what I can tell, it's going to be very important long term to our culture.

If as a commentator on someone's blog, you know that you're spending ten, twenty, or more minutes crafting a response, and that response - because it lives in someone's comments field - will be ignored by the conferrers of future societal attention (ie - search indexes) - then I can imagine many folks will simply avoid writing thoughtful responses in comments altogether. Instead, they'll post on their own site. It seems that one of the things No Follow will do - subtly or not - is discourage active and intelligent dialog on a post. That is not, to my mind, a good thing.

Ben Hammersley concluded:

… as respecting rel="nofollow" will involve losing an enormous amount of implicit metadata, any tools that are interested in that will be forced to ignore it. Technorati will have to choose if it’s a site that measures raw interconnectivity, or some curious High School metric of look-at-that-person-but-don’t-pay-her-any-attention that the selective use of the rel="nofollow" attribute will produce. For many purposes, this would mean the results are totally debased and close to useless.

And TrackBacks? Like John Battelle, I've been led to believe that they are affected by rel="nofollow". Is this true?


Wikipedia and the search for truth

The debate about Wikipedia will no doubt continue for a long time. This post is written as a small contribution to it (and assumes some familiarity with the recent round of discussions).

Jimmy Wales is quoted in Wired News as saying, "We're after ... a standard that is suitable for the general reader … it should lead me to where I'm ready to learn more".

Wired News suggests that Wales may have been referring to more than Wikipedia in making these remarks: the indication is that he was arguing that no encyclopedia is a 'top-tier reference source'.

However these terms ('top-tier reference source', 'encyclopedia') are defined, the truthfulness of a reference work, its reliability, must not be lost sight of in the discussion surrounding Wikipedia. We value first-rate reference books for their attempt to approach the truth, however arduous the attempt and elusive the quarry. As contemporary users (and not as historians studying "encyclopedias"), we value them as perceptive, summary reports, consolidating both what is established as known and what might (or might not) be glimpsed at the frontiers of knowledge. To perform these tasks well demands remarkable skills, knowledge and personal qualities.

In his inaugural lecture as Professor of Poetry at Oxford (1956), Auden spoke of the 'unselfish courage' of scholars who 'read the unreadable' and so 'retrieve the rare prize'. Often the target of jest (Rabelais!), scholars perform indispensable, painstaking work, building up the knowledge and apparatus by which we can come closer to the truth about a subject.  In a scholarly resource, writes Danah Boyd (her particular example is taken from the Emile Durkheim Archive), we have 'citations as well as interpretation of both primary and secondary texts … We know the status of the author (here, a student in sociology), why he wrote this entry and who has checked his entry for verification. Yes, he could be lying, but this is much more reassuring than an entry written by N unknown people'. She adds, 'I also believe that there is something to be said about expertise. The eccentric PhDs with their narrow focus have spent years dedicated to understanding very particular areas of knowledge. They are invested in the accuracy of a particular topic, understand the different debates and are deeply aware of the consequences of poor interpretation. They research things actively, trying to express all sides. It is not simply their authority that makes their descriptions have weight - it is also what they have to lose'. They are under discipline and it is (or should be) exacting: Samuel Johnson defined a scholar as 'one who learns of a master'.

Todd Wilkens (More Smarter) says (in two postings):

Authority of information and knowledge is about quality not quantity.

… too much power in the hands of an elite can be extremely dangerous. However, the majority of us believe that knowledge is not merely a matter (of) rhetorical consensus. Some people really do have expertise and we should take advantage of it. Perhaps the wiki model as it stands now is not the right way to address this.

Ross referred at Many2Many to a paper by Andrew Lih, Wikipedia as Participatory Journalism: Reliable Sources? (pdf), in which Lih concludes:

Open content projects such as Wikipedia received their inspiration from the earlier open source software community that emerged from online collaboration for developing software. Linus Torvalds, leader of the Linux open source movement, once said, “Given enough eyeballs, all bugs are shallow.” He was referring to software development, but it is equally relevant to Wikipedia. This use of more “eyeballs” is a rather unique feature of participatory journalism, as it benefits directly from more traffic and more users making their impression by scrutinizing or contributing to content. This tight feedback loop between reading and editing provides for very quick evolution of encyclopedic knowledge, providing a function that has been missing in the traditional media ecology.

Lih's research, as I understand it, is based explicitly on two assumptions that give him his "metrics" for measuring an article's reputation within Wikipedia:

The assumption is that more editing cycles on an article provides for a deeper treatment of the subject or more scrutiny of the content. …

With more editors, there are more voices and different points of view for a given subject

The number of times an article in Wikipedia has been edited, or the number of editors it has had, indeed tells us about the level of attention it has attracted. But this, the very "open source-ness" of Wikipedia, is absolutely not a guarantee of its scholarliness or its truthfulness.
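
Put crudely, Lih's two proxies reduce to a pair of counts over an article's revision history. The toy calculation below (my illustration, over an invented history, not Lih's code) shows how little the numbers themselves can say about truth:

    from collections import Counter

    # Hypothetical revision history for one article: (revision id, editor).
    revisions = [
        (101, "Alice"), (102, "Bob"), (103, "Alice"),
        (104, "Carol"), (105, "Bob"), (106, "Dave"),
    ]

    edit_count = len(revisions)                                # Lih's "editing cycles" proxy
    distinct_editors = len(Counter(u for _, u in revisions))   # Lih's "more voices" proxy

    print(f"edits: {edit_count}, distinct editors: {distinct_editors}")
    # edits: 6, distinct editors: 4 -- a measure of attention, not of accuracy.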

A better model for collaborative investigation into truth might be drawn from the teamwork commonly involved in science. In my experience of this, groups within a research laboratory present their work in progress; other members of the team and guests discuss and criticise it; the work is reconsidered; and so on. Papers that are then submitted for publication are subjected to peer review and, once published, stand or fall by their tested veracity.

'Peer production of shared knowledge' (Clay Shirky) without these checks and challenges is not desirable in a work of reference. Some accommodation of the wiki model to such checks is required.


Wikipedia: what is the main issue?

'the real issue here: media literacy' — foe romeo

Having the page history of a Wikipedia entry clearly and accessibly represented on that entry's page will be valuable. But in itself this can't be the core issue: the core issue is still the validity or otherwise of the information (see my previous entry on this). I don't think the threefold nostrum that "more is better, older is better, fresher is better" (see Clay Shirky's recent piece) somehow leads us to the promised land. Trustworthiness in the world of the Wiki Now doesn't lie simply in the number of contributors, the length of time an entry has been worked on or in how recently it has been re-worked. In short, I share Danah Boyd's critical enthusiasm for Wikipedia. Danah Boyd: 'As a contributor to and user of Wikipedia, there is no doubt that i have a deep appreciation for it. … On topics for which i feel as though i do have some authority, i'm often embarrassed by what appears at Wikipedia'.

Take, for example, an issue to do with a minority group in our society: Roman Catholics. The Spectator published an article in its 18/25 December issue, We are all pagans now. In it, the claim was made that Dominican monks are 'Satan spawn'. A reply from the former Master of the order, Timothy Radcliffe, was published in the following week's issue, quoting the non-Catholic Diarmaid MacCulloch, Professor of Church History at Oxford University, and challenging the widespread view in our society that there was such a thing as "the" Inquisition, that the Dominicans were responsible for it and that they (and inquisitors in general) were very wicked.

Wikipedia has a number of entries to do with, or bearing on, "the" Inquisition. Reading these entries, I can hear a number of different editorial voices, detect differing currents of thought tugging the articles this way and that. What I can't tell, even if I had the benefits of the additional visualisation (as proposed), is how … authoritative (informed, academically impartial) these editors are. Turning from The Spectator to do some research on "the" Inquisition, I would find in Wikipedia much to fire me up (no pun intended) and set me off on further research, but I wouldn't feel confident that I am in touch with something I could then simply use in a lesson (I'm a teacher). I would be happy to say to students, 'Wikipedia says …; now let's do some further digging …'

In the case of a minority group and its rights (including its rights to be understood fairly across time), it clearly matters that we can, indeed, trust the reference "books". But the same would apply, mutatis mutandis, if I were wanting some information on nuclear physics or post-modern literature or …

I can't see that, in itself, it matters how many times a page has been edited: views that are prejudiced or partial can still get to stand as truth unless there is some careful consideration given to matters to do with governance and editing. As I said before, these are issues centred around accuracy, trust, collaborative editing and reputation, and these simply won't go away. I welcome the discussion being informed by matters to do with the design of Wikipedia. In the end, transparency to the truth is what matters and design is very important here — but it has to be in the service of that quest for truth, veritas (as the Dominicans say). (I, too, by the way, find the notion of 'authority' rather boring. The authority of the truth, however, is a different matter.)

I am struck by the fact that within Wikipedia there seems to be a recognition that these issues to do with governance and editing must be faced. IMSoP's contributions to Joi Ito's post (see my earlier post) indicate this.