Hallucination should not be a dirty word

09.41, Friday 26 Jul 2024

One of my local schools, just down the road, is the Creative Computing Institute, part of University of the Arts London.

I was honoured to be a judge at the recent CCI Summer Festival. Students from the BSc and diploma courses were showing their projects.

Here’s the press piece: Creative industries experts recognise exceptional student work at the UAL Creative Computing Institute Summer Festival.

There was so much great work. All the awards have amazing winners, and there were many other projects right up there too.


I was judging the award for Innovative Materials: redefining the ways in which we use and conceive of the ‘stuff’ of computing practice.

Congratulations to the winners, Gus Binnersley, Kay Chapman, and Rebecca De Las Casas, for Talking to Strangers.

Artists’ statement: Talking to Strangers explores early theories of language development and symbiotic interspecies communication. Inspired by the work of linguist Jan Baudouin de Courtenay, this game of telephone explores his ‘bow-wow’ theory, which suggests that the beginnings of language involve progenitors mimicking sounds in their natural environment.

Ok I want to say something about this work and why it spoke to me, and about AI and hallucinations.


Let me describe the project, because I can’t find any pictures online:

  • Two sheets of metal hanging from the ceiling. Two telephone handsets, one at each end.
  • When you speak into one handset, your voice is transformed into a different kind of signal, transmitted through the metal, and reconstructed at the far end – that’s the game of telephone.
  • If you scratch or tap the metal, adding noise along the transmission path, the scratches and taps are reconstructed into what sounds like a voice.

Now my personal scoring rubric, for this particular award, was for using the material of computing – signal, in the case of this project – as an intrinsic part of the work. And to tell a story about that material, rather than using it in service of another story.

And the story about the continuity of data is an interesting one. Voice remains regardless of the substrate. The invention of the category of data is a big deal!

But data-as-material is a well-trodden investigation.

SO:

What grabbed me here was the accidental voice reconstruction.

The project group used off-the-shelf machine-learning voice-changing software made for streamers.

The scratches and taps on the metal were transformed by the proto-AI into fragments of voice: burbles and syllables that sound something like a person speaking, but not quite. You strain to hear.
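
How might that work in practice? Here’s a minimal sketch of the pipeline as I understand it – not the group’s actual code; the `voice_changer` hook and the file names are mine, standing in for whatever off-the-shelf streamer tool they plugged in:

```python
# A sketch, not the group's pipeline: load a recording of taps and scratches,
# push it through a voice-conversion model, save whatever comes out.
# `voice_changer` is a placeholder for any off-the-shelf neural voice changer.

from typing import Callable

import numpy as np
import soundfile as sf  # pip install soundfile

# A voice changer here is just: (audio samples, sample rate) -> audio samples.
VoiceChanger = Callable[[np.ndarray, int], np.ndarray]


def hallucinate_voice(in_path: str, out_path: str, voice_changer: VoiceChanger) -> None:
    # The 'noise': taps and scratches picked up from the sheet metal.
    audio, sample_rate = sf.read(in_path)

    # The model was trained to map *any* incoming audio onto a target voice,
    # so percussive noise comes back out as voice-like babble.
    babble = voice_changer(audio, sample_rate)

    sf.write(out_path, babble, sample_rate)


if __name__ == "__main__":
    # Identity stand-in so the sketch runs end to end; swap in a real model.
    hallucinate_voice("metal_taps.wav", "babble.wav", lambda audio, sr: audio)
```

The interesting part lives entirely in that one call in the middle: the model has nothing to go on but noise, and it answers with something like a voice anyway.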

(I didn’t ask but I got the impression that the group didn’t originally intend for this to be part of their project, even though it was part of their demo by the time I spoke with them. That’s what you get from working directly with material.)

And this is something new:

Where does the voice come from?

Novelty in the signal.


Signal vs noise.

The story of our networked age is noise. Data rot. Lossy compression. Entropy. Message attenuation over distance and time. Lost in translation.

And yet – with modern gen-AI – something new: Novelty on the wire. Originality from… somewhere?

If we’re to take that idea seriously then first we need to encounter it and experience it for ourselves.

That’s the work that Talking to Strangers was embarking on, for me.

The project had put its finger on brand new ‘stuff’ – so new we can barely see it – and somehow finding it at all is special.


Because novelty from computers is special.

I think it’s hard to come to terms with originality from computers and AI because it’s so counter to our experience of what data does.

But I was using a prototype of an AI system yesterday and the bot said back to me:

Oh, that reminds me of the time I accidentally entangled my toaster with my neighbor’s cat. Poor Mr. Whiskers meowed in binary code for a week!

A trivial example. But like, where does this even come from?

Here’s one of my posts from September 2020, just after I used GPT-3 for the first time:

Here’s what I didn’t expect: GPT-3 is capable of original, creative ideas.

(It had told me about a three-mile wide black ring deep in the Pacific Ocean.)

Now we call these “hallucinations”: the AI engineers try to hammer them out, people swap prompts to steer outputs with great reliability, Apple Intelligence irons out deviations from world knowledge, and SearchGPT gives chatbots ground truth.

It’s so easy to dismiss any output that looks new, calling it just a recombination of training data fed through the wood chipper. We often resist the idea that originality might be possible.

But here’s a thought: a major source of new knowledge and creativity for us humans is connecting far-flung ideas that haven’t previously met. (That’s why multidisciplinary projects are so great.)

And as I said back in that 2020 post:

It occurred to me that GPT-3 has been fed all the text on the internet. And, because of this, maybe it can make connections and deductions that would escape us lesser-read mortals. What esoteric knowledge might be hidden in plain sight?

So, just in how it’s trained, the conditions are there.

I began my defence when I spoke in Milan in April about hallucinations, dreaming and fiction.

And

I am even more convinced of it today.


Those babbling voices from the sheet metal are not noise in the signal. They’re the point. Sources of creation are rare and here’s a new one!

What would happen if we listened to the voices?

What if we built software to somehow harness and amplify and work with this newness? There are glimmers of it with Websim and so on. But I don’t think we’ve really grappled with this quality of gen-AI, not yet, not fully. We should!

Hallucination is not a bug, it’s the wind in our sails.


Congratulations again to the Talking to Strangers team, and thank you to UAL CCI for having me – a privilege and a joy to see all the work and speak with the students.

