Putting the people to work

In his excellent science fiction novel ‘The Diamond Age’, the writer Neal Stephenson describes a world in which nanotechnology and nanobots are commonplace and ‘Matter Compilers’ can create objects at will.

However, there are no artificial intelligences on his imagined Earth, the technology having failed to deliver on the promises made by generations of researchers. Computers do lots of things, but they are unable to replace or even convincingly impersonate humans.

One consequence of this is that some of the characters in the book make their living by providing voices for virtual reality-based entertainment: although computers can produce convincing 3-D worlds, they cannot, in Stephenson’s alternative reality, substitute for real human intonation or emotion.

It’s an interesting idea, and when I first read the book in 1995 it resonated with my view that while non-human intelligence is perfectly possible, we will never actually manage to create it ourselves, because intelligence emerges from biology, not technology.

As it happens, we’ve got rather good at computer-generated speech that sounds pretty human-like, but of course the sentences being spoken are still written by humans, and spoken human-computer interaction remains limited to relatively constrained domains like calling airline reservation systems.

A newly launched project from MySociety, the charity that builds socially useful websites and services, has drawn my attention to another area where computers are simply not very good – watching television.

Their Parliamentary website, TheyWorkForYou.com, now includes video of debates in the House of Commons, but the feed they receive carries only the BBC-provided captions indicating who is speaking and what the debate is about.

This lets them chop the video into segments, but there’s no way to automate the process of linking a segment to a particular speech.

Rather than ask for a billion pounds to build an AI capable of doing this automatically, they’ve decided to go for a simpler approach. They are asking internet users to watch short video clips of Commons speeches and match them to the transcripts made for Hansard, the official record of parliamentary proceedings.

The site aims to do for speeches in Parliament what the SETI@Home project has done for radio astronomy data. But while it’s relatively easy to process digitised radio traffic looking for regular patterns that might indicate a signal from another civilisation, it’s a lot harder to tell which MP is talking at any one time, so they are using the spare processing cycles of the person sitting in front of the computer instead of the chip inside it.

It’s straightforward and, so they claim, rather addictive. You get an extract from a speech and a bit of embedded video; you press play, and when you hear the extract begin you press ‘Now’, so they know where in the video stream that text appears and can add it to their database.
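The mechanics are easy to picture. Here is a minimal sketch, in Python, of the sort of record such a tool might keep behind the scenes – the names and the speech identifier are my own invention rather than anything in MySociety’s actual code – pairing a Hansard extract with the offsets reported when volunteers press ‘Now’, and taking a median so that one slow reaction doesn’t skew the result.

# A minimal sketch, not MySociety's code: the record a transcript-timestamping
# tool might keep for one Hansard extract. The identifier format is invented.
from dataclasses import dataclass, field
from statistics import median

@dataclass
class SpeechMatch:
    speech_id: str                               # identifier of the Hansard extract
    offsets: list = field(default_factory=list)  # seconds into the video, one per volunteer

    def record(self, offset_seconds: float) -> None:
        # Store the moment at which a volunteer pressed 'Now'.
        self.offsets.append(offset_seconds)

    def agreed_offset(self) -> float:
        # The median smooths out slow reactions and the odd stray click.
        return median(self.offsets)

# Three volunteers watch the same clip and press 'Now' at slightly
# different moments; the median gives a usable timestamp.
match = SpeechMatch("hansard-2008-06-12-speech-123")
for reported in (41.2, 42.0, 43.5):
    match.record(reported)
print(match.agreed_offset())   # 42.0

In practice several people would see each clip, and it is the agreement between them that turns a guess into a reliable index entry.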

It’s similar in intent to the project running on Flickr to index the public domain images provided by the US Library of Congress, where public-spirited users are asked to use their spare time to tag them.

It is astonishingly difficult to extract meaning from the grid of pixels that a camera delivers to a program, and both initiatives show how far we have to go before we will have reliable automatic indexing of images, whether still or moving.

Well-indexed and properly tagged video is remarkably useful. A couple of years ago I saw a demonstration of a searchable video archive created by Cambridge IPTV. They had taken David Attenborough’s Life on Earth and indexed the entire series, so that it was possible to search by keyword or phrase and go straight to the relevant sections.

However, this was only possible because the shows had all been subtitled with closed captions, so the text was available in a machine-readable form. And the subtitling had been done by people, so the indexing was still reliant on human interpretation of the video stream.
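It is worth seeing how little magic is involved once the captions exist. The sketch below assumes subtitles in the common SubRip (.srt) format and uses made-up caption text – nothing was said about what Cambridge IPTV actually used, so this is purely illustrative – and builds a simple inverted index from each word to the times at which it is spoken.

# A rough sketch of keyword indexing from subtitles, assuming the common
# SubRip (.srt) format. Illustrative only; not Cambridge IPTV's pipeline.
import re
from collections import defaultdict

SRT_BLOCK = re.compile(
    r"(\d+:\d+:\d+),\d+\s*-->\s*\d+:\d+:\d+,\d+\s*\n(.*?)(?:\n\n|\Z)",
    re.S,
)

def build_index(srt_text: str) -> dict:
    # Map each word to the caption start times at which it appears.
    index = defaultdict(list)
    for start, text in SRT_BLOCK.findall(srt_text):
        for word in re.findall(r"[a-z']+", text.lower()):
            index[word].append(start)
    return index

# Invented sample captions, just to show the shape of the output.
sample = """1
00:01:12,000 --> 00:01:16,000
The iguanas bask on the volcanic rock.

2
00:02:40,500 --> 00:02:44,000
A marine iguana dives for algae.
"""

index = build_index(sample)
print(index["iguana"])   # ['00:02:40']
print(index["iguanas"])  # ['00:01:12']

A keyword search then becomes a dictionary lookup followed by a jump to the right point in the video – but only because a person has already written down what is being said.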

The MySociety tool is, at the moment, a specialised service for Parliamentary video, but like all their projects it’s designed to be replicated and used as widely as possible. Coupled with a job allocation service like Amazon’s Mechanical Turk, it could be a boon to anyone with a video archive that needs to be indexed.

Of course, it’s rather ironic that just as the net frees us from the tyranny of the television in the corner and offers us a multitude of new ways to express our creative potential, we are being encouraged to watch arbitrary segments of a large archive in order to help the librarians.

Bill’s Links

MySociety announcement

Try it here:

Tagging Flickr images:

SETI@Home:

Cambridge IPTV: