16:44, Friday 28 Jan., 2011

Domestic robots

Wikipedia maintains a list of domestic robots. I knew of the Roomba, the autonomous vacuum cleaner, but I hadn't realised quite how many autonomous vacuums there are. I guess vacuum cleaners are the Hello World of robots.

There's a beautiful comparison of their patterns of movement that Russell Davies references in his post on 'designing behaviour': the top shot shows the pattern the Roomba used. The second one down shows the Neato. The Roomba pattern may be more efficient, but it just doesn't look right to a human brain. It's not how a human would do it. The Neato pattern looks more like how I would clean. The Roomba pattern is organic, but an alien organic. The Neato cleaner's pattern is rectilinear.
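Neither company publishes its navigation code, but the difference between the two patterns is easy to caricature in a toy simulation: bounce-and-wander versus lawnmower rows. Everything below (the grid world, the function names, the bounce rule) is invented for illustration; it is not either vacuum's actual algorithm.

```python
import random

def random_bounce_coverage(width, height, steps, seed=0):
    """Toy Roomba-style walk: go straight until you hit a wall,
    then pick a new random heading. Returns the fraction of grid
    cells visited."""
    rng = random.Random(seed)
    visited = set()
    x, y = width // 2, height // 2
    dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    for _ in range(steps):
        visited.add((x, y))
        nx, ny = x + dx, y + dy
        if not (0 <= nx < width and 0 <= ny < height):
            # Bounce: choose a fresh random heading at the wall.
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            continue
        x, y = nx, ny
    return len(visited) / (width * height)

def boustrophedon_coverage(width, height, steps):
    """Toy Neato-style sweep: back and forth in rows, like mowing
    a lawn. Returns the fraction of grid cells visited."""
    cells = []
    for row in range(height):
        cols = range(width) if row % 2 == 0 else range(width - 1, -1, -1)
        cells.extend((col, row) for col in cols)
    return len(set(cells[:steps])) / (width * height)
```

For the same number of steps, the systematic sweep covers every cell exactly once while the random walk wastes steps revisiting cells, which is why the rectilinear pattern reads as purposeful to a human watching it.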

Russell continues: That's going to be a thing - not just designing efficient, effective behaviour - but designing behaviour that's emotionally satisfying to the owner and appropriate to the character of the object.


Wikipedia claims that the first successful attempt to produce and sell a domestically-aimed robot was the Furby, launched in 1998. It was a toy - a plush owl with aesthetics that frankly creep me out, now I look back from the safety of this side of the millennium - and it had the illusion of intelligence. Get this:

Furby was crazy popular. (1.8 million units in 1998, 14 million in 1999; 40 million over the first three years.)

There's so much going on here. Furby's language of interaction is human and physical (light and movement). It responds to the environment. It develops. It learns and can be taught. It communicates with humans and its own kind. It doesn't do any of these things in a hugely sophisticated way, but it does everything just enough and it never, never breaks frame.

There's a checklist of the bare minimum you need to make something feel sentient, even if it's just in a fractional way, puppy-smart, and that checklist may have been discovered by Furby.

There's something that happens to your relationship with an object once that threshold is crossed, and that's why we use the word robots instead of saying products or objects.

(A short thought experiment: a kettle product that doesn't boil properly needs to be replaced. A kettle robot that doesn't boil properly will piss you off, or will need to be made redundant, or otherwise elicit an emotional reaction.)

Robots aren't merely artifacts that move. They're the fourth kingdom of nature.

The several kingdoms of nature

Here, by the way, is my personal list of kingdoms of nature:

Rocks. Rocks are slow life. When Ursula Le Guin mused on the language of ants, penguins and plants in her (beautiful) short story The Author of the Acacia Seeds, she speculated about how rocks would talk: the first geolinguist, who, ignoring the delicate, transient lyrics of the lichen, will read beneath it the still less communicative, still more passive, wholly atemporal, cold, volcanic poetry of the rocks: each one a word spoken, how long ago, by the earth itself, in the immense solitude, the immenser community, of space.

By rocks I mean all kinds of matter, from clay to stars. And I don't entirely mean that stellar nebulae are sentient, but I do mean there's a universe of interacting, unfolding things that can be understood only on their own terms -- like all of these kingdoms I have in my list. The rules of this kingdom we call physics.

Organic life! DNA-based, RNA-based, carbon-based. Plants and animals and lichen. This is a kingdom of stuff which is able to control probability: the metabolic pathways are highways of catalysed, otherwise-unlikely chemical reactions. And it is able to alternate between the two worlds of information and matter, from protein machines encoded in the letters of DNA, to the fizzing chemical mushy flesh that the protein machines build.

The third kingdom is corporations. The philosopher Manuel DeLanda, in A New Philosophy of Society, diagrams societies at multiple levels: social networks, organisations and governments, cities and nations. His book is a zoo of these inhuman macro buckyballs. Such massive animals have flows of money, power, and people instead of blood and nerves. In Platform for Change, Stafford Beer outlined the intrinsic behaviour of corporations: that they have a desire to continue their existence, and this dominates their response to stimuli. At the very smallest, cellular level, organisations are small groups of people, and their actions are dominated by group psychology -- at a national and planetary scale, by economics. But cities and corporations cannot be understood in the same terms as dumb matter or organic life, and that's why they're the third of my kingdoms of nature.

Robots are the fourth kingdom. By robots I mean everything from inorganic information processing to smart matter. But I contend that, because of the following two qualities, it's not possible to understand robots in terms of any of the three other kingdoms:

  1. These are sentiences - simple like Furby, or extraordinary artificial intelligences like Joshua in the movie WarGames - that we can relate to as animals, but that are not rooted in organic life. They will have different motivations for survival, different priorities, and different psychologies.
  2. These are creatures that live natively in cyberspace. Pragmatically, they talk through the Internet. A world made out of information, not atoms, where we don't have regular things like momentum, distance, heaviness, conservation of matter. Other rules apply, rules of which we have only indirect experience.

Taken together, those two qualities mean we can't treat robots as artificial people, or as magical moving puppets. They are, and will develop, their own new nature, which we - as members of the second kingdom of nature - have to explore, discover and understand afresh, on its own terms.

Back to Furby

Which brings me back to Furby, the electronic talking owl.

Furby has a spin-off called Shelby. Shelby is a grumpy electronic talking clam: When I placed one of my Shelby's in a group with 5 furbys, greetings were exchanged and then, for no reason I could discern, my Shelby started babbling at the furbys, then slammed its shell shut and stayed closed up. As if somehow it had been offended? The furbys ALL stopped talking at once when this happened and remained silent.

Nattie has a Shelby and tells her story: Shelby doesn't stop talking unless it doesn't get any response for five minutes or something... and ignoring it is agonizing, because it's being cute, and you just feel so awful when it says it loves you, or it tries to tell you a knock knock joke, and you know you can't respond. He'll outright say things like, "I want to PLAY!" and you feel like the worst person in the world.

Nattie named her Roomba 'Ricky.' They had a more loving relationship: When Ricky got stuck in a corner and started furiously backing up and rotating, backing up and rotating, we'd frown and stand watch over him, concerned: "What are you doing, Ricky?" When he couldn't get himself unstuck, we'd sigh, pick him up -- "Oh, calm down" we'd say when he whirred in the air -- and put him back down, like he was a toddler learning to walk. And when he finished cleaning the room and sang that -- er, emitted that triumphant little chime, his joy was our joy.

And then of course, one day Ricky will die and then where do you put your feelings? Robots, man. They're nothing but heartbreak. Robots ain't shit.

The question is, as it always is, how do we live together?

It's something to consider. A different bit of the brain activates when we're dealing with sentiences -- or, as it turns out, even when we imagine we're dealing with sentiences (I use sentiences to mean "intelligent things" of varying levels of intelligence, but not necessarily human or animal). It doesn't take much: just a human-like appearance or even, as in The Media Equation (Reeves and Nass, 1996), painting the computer the same colour as its user's t-shirt.

When we imagine something is intelligent, we simulate its mind inside our own, in order to anticipate it. We begin to think a bit like it, in some small way. We socialise with it, take cues from it.

On the one hand, this is very clever. Robots don't need their own brains: they can parasite on ours. Be intelligent simply by appearing to be intelligent.

On the other, do we want to relate to robots in this way? Sherry Turkle points out the risks of sociable machines: If convenience and control continue to be the values we hold uppermost, we will be tempted by sociable robots which, just as slot machines attract a gambler, promise us excitement programmed in, just enough to keep us in the game. ... We come to a point where we are so smitten by the idea of conversation with computers that we forget what human conversation about human problems is about: human meaning through the first-hand knowledge of the human life cycle, something of which robots will be forever innocent, no matter how "expressive" we make their faces or voices.

We don't get to choose what personality robots have

When Ben Bashford writes about Emoticomp he talks about objects with behaviours and personalities - robots - but questions how we should design those personalities. What is the watch-word we're after? He proposes politeness. A polite thing... is interested in me; is deferential to me; is forthcoming; has common sense; ... Etc.

...which is a great way to approach it. Polite robots would be lovely! But I don't think we get to choose. The nature of the fourth kingdom - their equivalent of evolution - is that they reproduce in the sales figures of technology corporations and the womb-factories of China. The testes of robots are the shelves of Toys-R-Us. Humans don't get to choose the personality of robots; the market does.

And judging by Furby and Shelby, our robots won't be polite but will be needy and paranoia-inducing, resembling helpless infants.

The half-breed children of robots and humans

I'll wrap on a final weird-future slippiness between kingdoms two and four, and the story starts with a phenomenon called Hello Little Fella, which is the human habit of recognising illusory faces in objects and the environment. Here's a favourite.

It's not just faces. There's a widespread habit of believing things have feelings, and, because we're human and because this is the 21st century, there's a community of people who fantasise about having sex with these inanimate things, then write stories about it, and it's called "anthropomorfic."

All of which, finally, brings us to an iPhone game in which you have a virtual girlfriend to woo. Each girlfriend comes from a barcode. This is Barcode Kanojo: I'm currently dating a can of Heinz tomato soup in Barcode Kanojo, but it wasn't my first choice. I wanted Heinz Beanz or a box of Shreddies, but both have already been taken by faster scanners.

It's an offensively brilliant idea. Barcode Kanojo's free iPhone app will scan any product you have knocking around your house and turn it into a delicate anime girl over whom you can obsess, masturbate, and fight. The game in Barcode Kanojo's game comes when another player scans the same bottle of bleach you just scanned ... In a sad parody of real life sexual politics, Kanojos will date only their creator until someone else who scanned the same tin of beans gives them more money and attention. Mostly money. (via)

All hail our weird new robot overlords indeed. Welcome to the fourth kingdom of nature, folks.


A weblog by Matt Webb, CEO of BERG, makers of BERG Cloud and Little Printer.


Interconnected is copyright 2000—2013 Matt Webb.