2003-04-25
If you meet Alan Turing on the road, kill him
http://conferences.oreillynet.com/cs/et2003/view/e_sess/3645
Geoff Cohen
http://www.coherenceengine.com/

FAQ:
- What have you got against Alan Turing?
- You know he's dead already, right?

Turing came up with the Turing machine: a tape of symbols, infinite in length, and rules for moving along it. The title is a play on a zen koan: "If you meet the Buddha on the road, kill him." It's from a 9th century Chinese monk, talking about getting away from authority figures. The thing we need to get away from is the idea of the mathematical proof as the basis of programming.

The six lies -- a subset of the lies that we all believe so strongly we think they're fundamental truths.

ONE! The Lie of the Turing Machine. Turing said this machine could carry out a certain set of computations. But it's been turned around: people say that all computation can be done with Turing machines. Wrong way round! And untrue! Turing machines can't do quantum stuff, for example. Example here: protein folding. Nobody can do this. Somebody ran one of the top 500 computers in the world for 3 months to try and fold one protein, and they only got 1 microsecond into their simulation. But the protein DID it in a millisecond! It's a billion times faster than that computer! "I suggest to you that we're missing something." "In some sense, it figured out the answer."

TWO! The Lie of the Program. The Turing machine doesn't separate program from data. The "Harvard architecture" was the first to separate the two; modern chips have separate paths for instructions and data. Why should a program be a series of symbols? We should be thinking instead of: a manipulable process. Opening a text editor and typing a file is *so* tightly bound to our idea of programming.

THREE! The Lie of the User. There's no concept of the user in the Turing tape; the idea of interaction is alien to it.

FOUR! The Lie of the Bug. First computer "bug", 1945. [I'm not sure this is true. I think bugs came up in telegraph systems too.] Aaaah. They wrote it down in the log book because the term is *funny*. "Bug" does actually come from Edison. Oh, the lie is that the program can have a flaw or a mistake. Of course the program has flaws and bugs. You can't have a QA department that does a mathematical proof on the software. We need to think about the acceptability envelope of programs. It's not a point, it's bigger than that.

FIVE! The Lie of Modularity. Encapsulation of state... it doesn't work. If you spin up a disk on a satellite (which is basically a computer) you:
. draw power
. impart rotation about the spin axis
. cause vibration, etc
So you do these things every time you read a file! It doesn't matter on Earth. But up there, the size of the problem is the size of the thing you have to fix it (you can't throw the side-effect away into gravity or heat). And now, down here, we're resource constrained too. The size of the problem is the corporation; the thing to solve it is the same size. A hierarchy of decomposition, of problem solving -- that's not how modern problems look. [this is cool. distance! distance is the real world's way of spreading out side effects fairly. like all the bits of the universe have come to a fair and ethical way of doing it. software will need to do the same thing. because at the moment it directs it all somewhere, and it's often *not* fair.]

SIX! The Lies of Memory and Time. In computer memory, when you change a value the old one is gone. That's not like memory! Computing languages don't even have an idea of time! There's a programming language, "Elephant", designed with some properties closer to this.

....

The Eightfold Path: what do we do now? This is how to break our habits of suffering.

* It's Coming From Inside the House! "we've traced the call, it's coming from inside the house" This idea that we can build firewalls is wrong. You have to assume, when you write code, that your modules can be invoked by evil.
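A minimal sketch of the "invoked by evil" stance, in Python. The scenario and names (`transfer`) are mine, not from the talk: the point is that a module validates every argument at its own boundary instead of trusting whoever is calling it.

```python
def transfer(amount, balance):
    """Move money out of an account, assuming a hostile caller.

    Every check guards against an "evil" invocation: wrong types,
    bools masquerading as ints, NaN tricks, negative amounts,
    overdrafts. Nothing is taken on trust from the caller.
    """
    # Reject anything that isn't a real number (bool is an int subtype
    # in Python, so exclude it explicitly).
    if not isinstance(amount, (int, float)) or isinstance(amount, bool):
        raise TypeError("amount must be a number")
    if amount != amount:  # NaN is the only value not equal to itself
        raise ValueError("amount must not be NaN")
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount
```

A well-behaved caller pays a few comparisons for this; a hostile one gets an exception instead of a corrupted balance.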
* The Visible Computer. To have a testable program, you have to be able to peel back the interface and see what's happening inside:
. changeable -- you need to be able to edit things
. visible -- peelable
. explanatory -- you should be able to ask "why did it happen that way?"

* Billions and Billions. Thought experiment: what if your software was going to be used by a billion people? So:
. internationalisation, a little thing really
. 40 million blind users, 100 million dyslexic users, women, children, etc
. you cannot write solely for yourself

* Slice. Good assumption of AspectJ, aspect-oriented programming: "power [electricity] usage cannot be in a single module" -- in a PDA, for example. Modularity must be able to cross-cut the hierarchy: multidimensional separation of concerns.

* Four-dimensional programming. Coming back to time... Think that you should be able to go back in time, intervene, change things. -> Undo. At every level, it has to have undo. At *every* level. But undo erases history! Undo shouldn't do this; you should be able to change bits and redo it. Will Wright said a game could have a visible history, and you play it like a four-dimensional thing, and you can go and tweak at any point. [That would be beautiful. Little markers where you've tweaked, in time and space. Building a city or something.]

* Plan to Fail. Assume that every method call might fail. Assume the net connection is intermittent. [this is good. This is like the biological computing thing from last time.]

* Plan to Die. Two failure modes:
. you can write your program to never die, and it's the Y2K bug
. you can plan to die and leave your data stranded
So think about the ecology. Plan to die, then have a way of moving your data on.

....

Enlightenment. Interesting things, to round up. Apple's user interface guidelines book: maybe we need these for 21st century programming. This is how to make all this recoverable.
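The non-erasing undo idea above can be sketched as a history *tree* rather than a stack -- this is my illustration, not anything Geoff showed. Undo moves a cursor back through states; making a new edit after an undo starts a branch instead of discarding the redo path, so every state ever reached stays revisitable.

```python
class HistoryTree:
    """Undo that never erases: every state is kept as a tree node."""

    def __init__(self, initial):
        self._states = [initial]   # node id -> recorded state
        self._parent = [None]      # node id -> parent node id
        self._cursor = 0           # node we are currently "at"

    @property
    def state(self):
        return self._states[self._cursor]

    def do(self, new_state):
        """Record a new state. After an undo this branches the tree
        instead of overwriting the old future."""
        self._states.append(new_state)
        self._parent.append(self._cursor)
        self._cursor = len(self._states) - 1

    def undo(self):
        """Step back in time without deleting anything."""
        if self._parent[self._cursor] is not None:
            self._cursor = self._parent[self._cursor]
        return self.state

    def jump(self, node_id):
        """The four-dimensional part: revisit any point ever reached
        and tweak from there."""
        self._cursor = node_id
        return self.state
```

After `do("a")`, `do("ab")`, `undo()`, `do("ax")`, both "ab" and "ax" survive as siblings in the tree, and `jump` can land on either -- the little markers in time and space from the aside above.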
Last quote: "I attained not the least thing from complete unexcelled awakening, and that is why it's complete unexcelled awakening."

[Oh, he mentions offhand that they built a special kind of neural net, in a field called "hypercomputation", more powerful than Turing machines. Then they used real numbers to look at the links between the nodes, and it collapsed back to the Turing machine.] Funny.

A question: why should the next two doublings of computing power make loads of difference? And Geoff says that the doublings start meaning more -- 4.7 GHz -> 9.3 GHz means more than the MHz doublings did (he's been saying all along that increasing computing power lets us eat the cycles with time). The questioner laughs and says "I guess you're too young."