Interconnected

All posts made the week commencing Sunday 9 Feb., 2003:

The 16th rule of BlogNomic is you don't talk about BlogNomic. Given the game is leaking quite seriously into players' weblogs, the rule that "they should not specifically announce that a given post was made for the purposes of BlogNomic" should make things very interesting. Major motive obfuscation.

The Stop the War Coalition march, today in London... Find pictures on the BBC's travel-congestion webcams. Some screenshots mirrored here: one, two, three, four.

Big Music's Broken Record and RIAA Statistics Don't Add Up to Piracy, two articles saying basically the same thing: that while CD sales have fallen, this has more to do with above-inflation price rises and an even larger drop in the number of releases. (A while ago I asked whether music sales had really fallen. I don't believe piracy is a problem. The industry could easily create a value-added system people would want to pay for. Leave it to the market - including piracy - and we'll get there sooner.)

Collected Tears of the Weeping Nivbed, the work of Justin Cherry, is beautiful art. A coherent world. Pictures, animations and music. One pick isn't enough, really, so work your way through the entire Cg Gallery. Really.

Two steps.

  • Visit Perversion Tracker: Apparently Useless Software. Marvel at the cleverly satirical fake software reviews. Chuckle at the absurd smiley creator app ($12); the nickel, dime and quarter tracking app ($3). Funny-for-a-moment fictional utilities.
  • Slowly, slowly realise. Jesus god these apps are real.

[via Crazy Apple Rumors Site.]

The computational cost of any activity can be measured in instructions processed. In the early days, when the model was central mainframes and dumb terminals, whatever you did had a very low local cost and a high remote cost (but not very remote, because the mainframe wouldn't be very far away). Next, microcomputers: computation on the desktop. Whatever you do is processed locally. Now, when I hit a website, there's a certain cost locally, but there's computation at every switch and router and on the webserver itself. (I'm only talking about directly-caused computation here, so the instructions processed to create the website in the first place, or even the chips themselves, don't count.)

I'd like to see a graph of: Computation power consumed per person per second, how that's changed over the years, and where that computation occurs (on an axis from local to very remote). (And incidentally, the metaphors are hard here. Is computation consumed? Spent? If it's not used, you can't save it up. Maybe a better graph would be percentage of total computational capacity of the globe. I wonder what percentage of the maximum one consumes in reading this sentence?)

Joho's Car Talk Puzzler. Prisoners, a room with two levers, sending signals. It's a really hard puzzle, fun to think about.

I can't solve it. I devised a fairly efficient system based on using one lever to signify that the position of the other carried meaning, and then treating the sequence of prisoners entering the switch room like a circuit. Each prisoner could generate a signal (up or down), consume a signal, or retransmit. The job of over half of the prisoners was to generate a single down signal, and retransmit all others. All but one of the rest would consume a certain number (say, 4) of down signals, and then generate an up one. All others must be retransmitted. The last prisoner knows that when they've received the correct number of up signals, all prisoners have visited the room. Blah blah blah.

But, the switches are in a random position on the (random) date of the first visit! So it doesn't work. And my half-solution is hardly elegant -- not right for a puzzle answer. Alas, this is a problem with all proposed utopias. Too much top-down control, and no way to get there from here. (Incidentally, Es suggested the prisoners rush the warden and feed him to the alligators for being such a pompous arse.)
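For what it's worth, the answer usually given to this sort of puzzle sidesteps the random starting positions by having every prisoner send their signal twice, so one spurious signal from the initial state can't fool the counter. A rough simulation of that standard counter protocol (my sketch, nothing to do with my half-solution above; details like the prisoner count are invented):

```python
import random

def simulate(n=23, seed=0):
    """Simulate the textbook counter protocol for the two-switch puzzle.
    Switch A carries the signal; switch B is a dummy to toggle when a
    prisoner has nothing to say. Initial positions are random/unknown."""
    rng = random.Random(seed)
    A = rng.choice([True, False])      # signal switch, state unknown at start
    B = rng.choice([True, False])      # dummy switch
    leader = 0
    signals_sent = [0] * n             # times each non-leader has raised A
    count = 0
    visited = set()
    while True:
        p = rng.randrange(n)           # prisoners visit in random order
        visited.add(p)
        if p == leader:
            if A:                      # a signal is waiting: consume it
                A = False
                count += 1
                if count == 2 * (n - 1):
                    return visited     # leader declares everyone has visited
            else:
                B = not B
        else:
            if not A and signals_sent[p] < 2:
                A = True               # send a signal; sending twice absorbs
                signals_sent[p] += 1   # the unknown initial switch state
            else:
                B = not B
```

Counting to 2(n-1) works because at most one counted signal can come from the initial state, and each prisoner raises the switch at most twice, so all n-1 non-leaders must have visited by the time the count is reached.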

Dan Hon's article Inflection Point asks: What if we didn't have to save documents? What if all versions were always kept?

Firstly, the barriers between documents disappear. There's no point starting a new document if you can just edit one that is mostly what you want and email that off. So files come to represent activities, each a tree of connected documents that are variously printed, emailed, considered.

This is the meat of Dan's article -- how would the UI change in this case? I'm a fan of the division of metadata into pay-for and free varieties. Word frequencies, date of creation, these are metadata that occur as a by-product of the task at hand (writing). On the other hand, file name and location have to be added, if you want to find your document again. If these could be added for free, if documents were so easy to discover even without explicitly naming them -- what other pieces of metadata could be demanded from the operator to make their life easier down the road? And this in itself creates UI change: if the similarity of documents is valuable, then it has to be easier to mutate a document for a new version than to create one afresh.

Which is the real point. Given this endlessly mutable document, how to tell when there's a useful revision and not an in-progress one? Example. When I use my paper notepad at work, if I'm four or five pages on from a set of notes I want to add to, I'll still start a new page and rewrite what I need to from my old notes. Putting the lid on a document, archiving it, this serves some kind of purpose. And when it comes to computers, which don't by necessity have to follow any metaphor at all, it's important to break the symmetry and have somewhere to put these purposeful acts. As ugly and unaesthetic as different documents, opening and saving are, maybe not having infinite undo (in this paradigm) serves some kind of conceptual purpose?

But really this is about making computers more like the human aspects [putting a lid on something, for example, but not emulating loss] of the RL environment. Instead of binary save/open there's a less digital approach. Or to put it another way, clots in the Lifestreams; points of inflection on the activity curve where the document creator has paused to print, email, consider, consult, spellcheck. What Dan's talking about is softening the computer interface to take the best psychological parts of the utter mess that is my work desk, perversely good filing system that it is, and add them to what we know computers are good at. So hopefully not continuing the query versus hierarchy dichotomy, but finding a synthesis of the two. There's an angle here, I think, an interesting one for thinking about UI.

"At the simplest level a game designer needs to ask himself, 'how many different decisions am I offering to my players at any time?' In other words, how many different options make up the decision set which a player is faced with at any one time? The answer to that question affects the complexity of its game and also its playability". Designing Strategy: Decision Sets [via As Above] covers decision trees in games, and how to prune them back at the design stage to not overwhelm the gamer.
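The explosion being pruned is easy to put numbers on. A toy calculation (figures invented for illustration, not from the article):

```python
def tree_size(options_per_turn: int, turns: int) -> int:
    """Distinct play-throughs when every turn offers the same number of
    options. Even modest decision sets explode exponentially, which is
    why they need pruning back at the design stage."""
    return options_per_turn ** turns

print(tree_size(3, 10))  # 59049 play-throughs from just 3 options a turn
print(tree_size(2, 10))  # 1024 -- trimming one option shrinks the tree ~58x
```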

Conversational User Interfaces have been occupying me again recently. One problem specifically, which is how to design and model them. This is hard because a conversation (with an IM bot, for example) isn't a series of stateless request/response pairs, not if you're designing according to Grice's Cooperative Principle: "Participants assume that a speaker is being cooperative, and thus they make conversational implicatures about what is said". What this means is that the bot must make maximum use of all that the user [interviewer?] has said, or at least make it conversationally clear what has and hasn't been understood.

So, the tree of possible answers grows massively at every question. How to model/design this? Various plans, two of which can be seen (together with the indescribably ugly code itself) in this month's notes folder. A couple of early ideas: hoping only recent steps in the conversation would be important for a maximally cooperative answer; hoping that the tree would fold into major channels, conversational watercourses. The metaphor I'm happiest with is that the internal state of the bot is represented by a position (which may be discrete or smeared) in a dataspace. The conversation is represented by the trail of this cursor. An answer by the bot is an expression of this location on the map; how exactly it's expressed is governed by how the trail is shaped, how the current position was approached. The code is terrible, the CUI almost as bad -- but it's a framework to operate in, which is better than the last iteration.
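To pin the metaphor down, here's a minimal toy of the cursor-and-trail idea. The dataspace, the moves and the phrasing rules are all invented for illustration -- none of this is from the actual notes-folder code:

```python
from dataclasses import dataclass, field

@dataclass
class CursorBot:
    """Internal state is a position in a (here, 2D) dataspace; the
    conversation is the trail the cursor has left behind."""
    position: tuple = (0.0, 0.0)
    trail: list = field(default_factory=list)

    def hear(self, move):
        """Each utterance nudges the cursor; the trail remembers the path."""
        self.trail.append(self.position)
        self.position = (self.position[0] + move[0],
                         self.position[1] + move[1])

    def answer(self):
        """How the location is expressed depends on how it was approached:
        a big jump gets flagged, a small step is treated as continuity."""
        if not self.trail:
            return f"We're at {self.position}."
        dx = self.position[0] - self.trail[-1][0]
        dy = self.position[1] - self.trail[-1][1]
        step = (dx * dx + dy * dy) ** 0.5
        if step > 1.0:
            return f"Big jump -- now we're talking about {self.position}."
        return f"Still near {self.position}."
```

The point of the sketch is only that the same position can be voiced differently depending on the shape of the trail that led there, which is the cooperative-principle requirement in miniature.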

Biosemiotics: Towards a New Synthesis in Biology [via a sylloge comment]. "The semiotisation of nature as a trend in 20th century life science is discussed. The reasons for this trend is analysed and it is claimed that semiosis is an emergent property in our universe appearing with the first life forms nearly 4 billion years ago. From this tender beginning semiotic freedom has increased throughout organic evolution, and it is suggested that this fact holds the key to an eventual bridging of the gap between history in the sense of thermodynamic irreversibility and history in the sense of human culture. A unification of biology, a true 'modern synthesis', should base its understanding of evolution on a semiotic theory of life". Now I've no idea what biosemiotics is but it sounds awesome, and worthy of investigation based on its name alone. Furthermore a rich metaphor mine, I'd hazard.

The New BFS (Brendan File System) [via Play with the Machine], filesystem postulation: "File systems need to change. Current file systems are horribly out-of-touch with the realities of what users need to effectively find, organize, and modify their vast quantities of files. [...] Not a single proposal I've read has ever started by considering the most important motivator of good file system design: how will the user interact with it?"

Buy insects and mites for scientific research, from the Central Science Laboratory [via family of breath].

Konfabulator [via RAILhead design] is the coolest Mac OS X app I've seen in a long while. It's a manager for those tiny desktop widgets that everyone uses - calendar, battery monitor, weather - and these downloadable widgets look beautiful. Full use of the OS X transparency and shading. The app itself is polished to the extreme. Drag and drop installation, and the first time you run it, it explains what it does and how to use it. All apps should be this easy.

And here's the kicker. Konfabulator is fully open to developers -- and widgets look trivial to write. Each is an XML document that says what images go where, and what to do when there's an event (mouse click, mouse over, etc). It's all in Javascript, which makes things really easy, especially because you can also fire off AppleScript, shell commands and make connections over the www. One of the guys behind it is Arlo Rose, who was also partially behind Kaleidoscope on Mac Classic, the skinning engine that had an enormous community developing themes. He obviously wants to do the same here. And I can really see it happening. Tiny, lightweight widgets that look gorgeous and do powerful things.

Documentation is in the Widget Workshop (and it's comprehensive), or you can Show Package Contents of any widget to read the source. Their journal also gives you an idea about what the developers are thinking. This is very definitely one to watch.