2003-05-03 the semiotcracy

I can only really explain the semiotcracy as a place the technological/social world is heading. It's all of: a mindset, a worldview, an ideology, a direction. If an invention or a law violates the semiotcracy, it's breaking expectation or "the nature of things", and it has to be corrected.

The following are areas in which the boundaries are being contested/formed:

. spam
. intellectual property
. mp3s and filesharing
. RFIDs
. Total Information Awareness

Okay, so what is it? The various underlying metaphors come from the computer world [or at least: the same underlying metaphors that made the computer world possible and have been strengthened by their success; they are also responsible for this]. These metaphors are interlinked and different aspects of something deeper I can't articulate, so the list that follows is pretty lame but it should do as a sketch:

* The conduit metaphor

We have this idea that we communicate by sending a bullet of information from one head to another: it's packed into words and unpacked at the other end. This implies all kinds of things, not least that there is something to be packed at the origin. Weinberger [groups.txt] hinted at this when talking about Knowledge Management, the largest failed social software experiment so far. Knowledge Management implies that there's this *thing*, "knowledge", inside a person's head that may be extracted and given to someone else. But knowledge is contextual; it's not something that can be contained. In fact, transferring knowledge in this way is so hard we have a specific word for people who are good at it: teachers. And good teachers are rare.

So ironically, the backlash against the conduit metaphor is articulated in the social software world - the field of putting real world "conduits" in the software world - whilst the real world adopts some of the computer world's metaphors. It's at times like this I realise how tightly bound linguistics is to ideas.
Is there a historical linguistics field, a study of past and future metaphor sets?

* Nouns and verbs

Dividing the world into objects, giving each object methods - uh, *verbs*, sorry - to manipulate them. Alan Kay and Smalltalk, MOOs, and so on. Now this is something that's been knocking around my head for a while, and I can't really express it, but there's something *wrong* with this model, and there's something wrong with using it so universally.

Cohen [turing.txt] attacked object orientation in a computing context, on two main fronts. Firstly: encapsulation. Consider a disk drive on a satellite. Spinning it up will impart rotational stability along that axis to the satellite, release heat and vibration and so on. There's no way to encapsulate opening a file on that disk because "the size of the problem is the size of the thing you have to fix it": there's nowhere to throw away the side effect. Secondly he talked about aspect oriented programming. Some attributes of the system simply must cut across the object hierarchy: we need a "multidimensional separation of concerns".

Again, it's ironic that criticism of these powerful metaphors is coming from the computing world. But I have both good and bad things to say about Cohen's talk. His comments about side-effects really resonated with me. I've been thinking a lot about software architecture -- and tesugen.com (probably the best weblog I read) has been thinking a lot *better* about it, which explains why this is on my mind: reading about city life-cycles inspiring software design encourages more cross-discipline thinking on the matter.

If the universe is something like a computer (which it's not) and yer fundamental particles are something like running code (which they're not), and life something like high-level programmes (which it's not): then there's something to be thought about in terms of side-effects.
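The "multidimensional separation of concerns" point is easier to see in code than in prose. A minimal sketch: timing (or logging, or auditing) is a concern that cuts across every class in an object hierarchy, so instead of duplicating it inside each method body, you weave it in from outside. A Python decorator is a poor man's aspect here; the class and method names are made up for illustration.

```python
import functools
import time

def timed(fn):
    """A poor man's aspect: timing cuts across the whole object
    hierarchy, so it's woven in from outside rather than written
    into every method body."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            print(f"{fn.__name__} took {elapsed:.6f}s")
    return wrapper

# Two unrelated classes; the timing concern crosses both without
# either one "containing" it.
class DiskDrive:
    @timed
    def open_file(self, name):
        return f"handle:{name}"

class Radio:
    @timed
    def transmit(self, data):
        return len(data)
```

The side-effect (the timing printout) still has to go *somewhere* - here it leaks to stdout - which is Cohen's encapsulation point in miniature.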
If *indeed* there's something to be said for the idea of *transformation*: that in some way everything is proportionate, and changes are always just transformations of other changes - and now this is the metaphor of reality as a mechanical thing, levers and cogs and whatnot - then side-effects always have to be dealt with *somewhere*. What to do with them? Well if you were designing a system from the ground up and you knew exactly where everything was going to be placed, you'd direct the side-effects somewhere: a heatsink, or a log file, or you'd make sure there weren't any side-effects. That doesn't happen; we still get buffer overflows. No matter how well the code is designed, there's always the possibility for a side-effect to crop up somewhere quite dangerous.

But this doesn't happen with real world side-effects. The giraffe's neck grew longer and the side-effect might have been an adverse effect on the population of butterflies high in the [whatever those tall trees with the spreading almost 2d branches giraffes eat are] forests. But the side-effect wasn't a supernova, or a kernel panic. And what's more, the world *wasn't* designed from the ground up as a consistent unit (say I...). It's evolved and changing.

So the universe needs to do two things: dump side-effects effectively and safely; and, even though there's an *incentive* for new developments to direct side-effects elsewhere (it would be preferable - for the butterflies - for there to be a supernova instead of a population crash), it has to be fair. So the universe uses *distance*. Given there's no way of knowing where the side-effects should be directed in the fairest possible way over all time, the universe uses the unbreakable mechanism of distance to distribute side-effects locally.

Maybe this is something we could learn in software? Or maybe not: there are advantages to doing it this way. Nature is maximally complex. Every side-effect has something taking advantage of it.
That's what environmental niches are. Programming to make something that's neatly encapsulated in nature, like a bacterium, seems really hard. Drexler [drexler.txt] says that a single bacterium holds 1Mb of code. That's really quite a lot. I don't think nanotech is going to be quite so easy to program as Drexler believes. Especially because - and this brings me back to Cohen - the bacterium can make use of universal attributes, like distance and who knows what else, that we get rid of in digital computers. We run an abstraction of the universe on a chip, and deliberately get rid of things like distance that the universe uses! No wonder our protein folding experiments use one of the top 500 computers on the planet for three solid months and only get 1ms into a real world simulation of a single protein.

Proteins are enormously complex; we can't emulate them easily, so we know they're doing something really hard -- but do we really want to do it in the same way? Could we chop away the "decision" part of the protein and use it to solve the travelling salesman problem? Probably not, but if we could it would be a slice across all kinds of weird basic vector dimensions. Not so with software! With software this is much easier. And so with software we search for our own useful attributes. We're building a rapidly evolvable system to get useful, efficient things. And that means we chose Unix over Smalltalk and Lisp. That means that Kay's [alankay.txt] frankly wonderful attack on the modern computing world wasn't entirely justified: yes they did great things in 1968 that we still haven't caught up with, and yes they published, and *big* yes we should all read more -- but it's not on my desktop and on a million other desktops, and somebody in Bulgaria didn't grab it, tweak it, and make it better, and that's not a success in my book.

Two things:
. I think it was Rusty from Kuro5hin who said something like (I paraphrase badly, he probably said nothing of the sort): "we didn't get Kay's world, but we have all of this, and it might be shit but here it is".
. There was an article on chip design and ground bounce in Computer the other month. It's easy, it said, to design a great chip that'll run quite happily, but to design a chip that'll run happily *and* start from cold, that's pretty hard.

But back to the underlying metaphors for the semiotcracy.

* The filesystem

Everything is labelled! Everything is contained and a container! Directories contain directories contain files contain, uh, content. All of which, extrapolating in either direction [since when did the direction metaphor apply to containerness?], means the *real world* contains directories, and content contains information. Which can be transferred to my head. A useful metaphor.

Now we start thinking of real world things like files. If music can be a file as an mp3, swapped and traded (Drexler again: the universal music playing machines. A CD player replaces the necessity to have a flute and a flute-player! Me: it's not robust though, is it? You can change the music on a flute, or hit someone with it. Or seduce someone), then why can't you ID3 tag a can of baked beans? This came up three times that I heard: Rheingold [smartmobs.txt] mentioned scanning the barcode on a product and googling for the vendor and product name; then again in Data Mining Social Cyberspaces [datamining.txt]; then again elsewhere -- perhaps Clay? perhaps Smart Dust on RFIDs?

So we have these metaphors, and they give us the staggering ability to throw away everything about the flute and the flute-player and the experience of being there - the ability to ask for encores, different music, conversation and so on and so forth - and focus on the single aspect: the noise, and take it away with us, play it on a CD, the universal music-playing-but-nothing-else machine.
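Rheingold's barcode-and-Google trick is trivially expressible once a product has a label. A sketch - the barcode, vendor, and lookup table are all made up, with a dict standing in for the web search:

```python
# Sketch of "scan the barcode on a product and google for the
# vendor and product name". The table and codes are invented;
# a real version would query a search engine instead.
PRODUCTS = {
    "5000000000000": ("ExampleCo", "Baked Beans 415g"),
}

def lookup(barcode):
    """Turn a scanned code into the query you'd feed to Google."""
    vendor, name = PRODUCTS[barcode]
    return f"{vendor} {name}"

print(lookup("5000000000000"))
```

The point is how little it takes: the barcode is already an ID3 tag for a can of baked beans; all that's missing is the habit of dereferencing it.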
(This is a characteristic shared by good - or at least successful, and following my own lead then successful is a good indicator for "good" - technologists. Tom Broxton pointed out the March 2001 thread between Ian [surname? hixie.ch] and Tantek [full name?] in which the two have a long and involved argument on the interpretation of the CSS2 spec, in which [I believe?] both were involved. That the thread could be so long, detailed and emotionally involving without once mentioning the intentions of the spec's authors illustrates their ability to regard the specification as an almost sacred text; to operate in a present system effectively contained or *encapsulated* by the past. Another example occurred in conversation with Cory Doctorow about the activities of the EFF. I was attempting to provoke a point of moral ambiguity before I understood that the EFF take the Bill of Rights as *axiomatic*. There's nothing wrong with that, but it's not what I was expecting, and something a good technologist wouldn't question.)

(Talking about it like this makes me wonder whether the semiotcracy isn't a natural outgrowth of the industrial mindset. The production line (or the Unix habit of chaining pipes...) was revolutionary when conceived, leading to workers as cogs within machines, and a mindset to accept orders: the decision-making ability is not within the worker's remit. It could be argued that this was a contributing factor to the *excuse* used after the atrocities of the Second World War that the perpetrator was "just following orders". Our repulsion when faced with this attitude effectively adjusts the system to say that the industrial attitude must not be used to *utterly* model society, but that's a recent refinement to the model. I wonder what the counterweight to this latest wrinkle will be?)
Holding these metaphors shapes our expectation of the world, and that's what the semiotcracy is about: it's the expectation that, as in the computer world, every object in the real world will come with a handle that allows it to be manipulated. Technologists exhibit a kind of disgust when faced with a file that cannot be copied, or a Personal Video Recorder that doesn't chunk the continuous audio-visual stream into discrete programmes that can be referred to, saved, etc. There is a push to overlay an ontology onto our world and apply these handles to everything, and a battle over what attributes these handles have.

Illustrations of the push: RFIDs, national ID cards, barcode scanners joined to Google, Google scanning catalogues for that matter. Location-based services, GeoURL, real world tagging of the electronic world, electronic tagging of the real world, "the mouse for physical objects" [datamining.txt]. Everything in the real world should have a URI -- the Semantic Web? Are we pushing like this because we want to, because we can, or because somewhere deep down we expect it? Why does Andrew McCargow want his mobile phone bill as an Excel document, not on paper? Why do people get excited about capturing voices as mp3? Tagging parts of the real world expands our ability to refer to things in conversation because it removes the necessity to have common terms of reference. Now everybody can make comments about this class of breakfast cereal. Now I can find conversations about this particular hat shop.

Fronts on the battle: how far should the semiotcracy go? We argue about what verbs/handlers are bound to objects, and we argue over control. Should government departments be allowed to give me an ID and share information? Should I be allowed to copy my music to different devices, or share it with friends? Should companies be allowed to send me spam? Should I be allowed to block it? Am I just an email address in their eyes?
Now my ID has been abstracted, to what lengths should others go to ensure I have just one? Who owns software? Is it okay for a car rental company to monitor the speed of the car I'm driving?

We're moving towards a world where everything must have a handle to allow it to be referred to. This handle must be sharable, and we must be able to communicate it. Innovations that don't include this handle are bad. Okay, so this is Adaptable Design [Dan Hill], or this is the battle over intellectual property, or this is a natural attribute of digital media; or this is a way of making objects in the physical world subject to greater evolutionary pressures, like the online world. But whatever it is, when corporations obfuscate the handle or don't put it in, consumers are upset - at least the technologist ones.

This is the great tendency or current of the modern physical world, as the technological metaphors infect everyday life: the push towards the semiotcracy, and its refinement, definition, and own system of ethics.
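The whole argument - a handle on everything, contested verbs bound to it - fits in a few lines of code. A sketch only, with every name hypothetical: a URI, some metadata (the ID3-tag-for-anything part), and a dictionary of verbs that somebody, and this is the contested bit, gets to decide the contents of.

```python
# Sketch of the semiotcracy's "handle": a URI, metadata, and a set
# of bound verbs. All names are hypothetical illustrations, not a
# real system or protocol.
class Handle:
    def __init__(self, uri, **metadata):
        self.uri = uri
        self.metadata = metadata   # the ID3-tag-for-anything part
        self.verbs = {}            # what you're permitted to do with it

    def bind(self, name, fn):
        """The contested front: who decides which verbs get bound?"""
        self.verbs[name] = fn
        return self

    def do(self, name, *args):
        if name not in self.verbs:
            raise PermissionError(f"{self.uri} has no verb {name!r}")
        return self.verbs[name](*args)

song = Handle("urn:example:song", title="Example Song")
song.bind("refer", lambda: song.uri)
song.bind("play", lambda: "playing " + song.metadata["title"])
```

`song.do("play")` works; `song.do("copy")` raises `PermissionError` - and the technologist's disgust at a file that cannot be copied is precisely that exception, surfacing in the real world.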