Optimisation makes me itchy.
A couple of examples. The Google Nest thermostat has Rush Hour Rewards, which will automatically tune temperatures before and during a Rush Hour to reduce energy use and lower grid costs (a “Rush Hour” is when everyone turns their air conditioning on at the same time).
Similar: Power Shaper by Carbon Co-op, which I’m sorry to pick on because lots of UK energy companies will be doing this with smart meters, but this is the one I saw first. (Thanks Rod McLaren for sending it my way.)
“Carbon Co-op technicians will visit your home and install equipment which will enable certain existing electrical appliances (such as electric vehicle chargers, heat pumps, immersion water heaters, battery storage) to be turned on/off remotely.”
“We turn things on/off only when we receive a request from grid operators and other parties.”
On the face of it, this makes a ton of sense.
We’re shifting to renewable energy. The wind and sun have their own schedule. But say everyone gets home at 7pm and plugs in their new electric car, or puts the kettle on at halftime in the football: that’s a demand spike, and that’s when a coal power station fires up, so the energy is supplied to the grid but it’s dirty energy.
Long term this gets fixed by having neighbourhood batteries to smooth the spikes. Ahead of that, demand can be adjusted by automatically turning things off. Feedback loops.
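To make that feedback loop concrete, here’s a toy sketch. Everything in it is invented for illustration (the capacity number, the 3 kW chargers, the rule for when to defer); it isn’t how Rush Hour Rewards or Power Shaper actually work, just the general shape of demand shifting:

```python
# Toy demand-shifting simulation. All numbers are made up for illustration.
import random

HOMES = 300
CAPACITY_KW = 900        # pretend clean-generation capacity for the neighbourhood
EV_KW = 3.0              # each home has a 3 kW deferrable load, say an EV charger

def simulate(demand_response):
    rng = random.Random(1)                   # identical baseline demand in both runs
    pending = {h: 2 for h in range(HOMES)}   # hours of charging still owed per home
    profile = []
    for hour in list(range(19, 24)) + list(range(0, 8)):   # 7pm through 7am
        # Baseline demand per home, with an evening peak from 19:00 to 21:00.
        load = sum(1.0 + (2.0 if 19 <= hour <= 21 else 0.0) + rng.uniform(-0.2, 0.2)
                   for _ in range(HOMES))
        for home in range(HOMES):
            if pending[home] == 0:
                continue
            # Naive: everyone charges as soon as they get home.
            # Demand response: charging is deferred while the neighbourhood
            # is near capacity.
            if demand_response and load + EV_KW > CAPACITY_KW:
                break
            load += EV_KW
            pending[home] -= 1
        profile.append((hour, round(load)))
    return profile

print("hour  naive  shifted")
for (hour, naive), (_, shifted) in zip(simulate(False), simulate(True)):
    print(f"{hour:4d}  {naive:5d}  {shifted:7d}")
```

In the naive run everyone plugs in at 7pm and the evening peak roughly doubles; in the managed run the same energy gets delivered, just spread flat across the night.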
BUT.
In Vernor Vinge’s space opera A Fire Upon the Deep (1992) there’s a planet called Namqem with a high technology civilisation 4,000 years old.
“Namqem was a triumph of distributed automation. And every decade it became a little better. Every decade the flexibility of the governance responded to the pressures to optimize resource allocation, and the margins of safety shrank.”
Which is a problem. As one of the characters says: “They’ve been accepting optimizing pressures for centuries now. … finally the optimizations have taken them to the point of fragility.”
‘The symptoms are classic. The last decade, the rate of system deadlocks has steadily increased throughout Namqem. See here, thirty percent of business commuting between the outer moons is in locked state at any given time.’ All the hardware was in working order, but the system complexity was so great that vehicles could not get the go-ahead.
So eventually, as an alternative to escalating resource wars, optimisation becomes complete: every embedded computing system an instrument for total social control.
But it doesn’t help, collapse comes, billions die, and so on.
I must have read Vinge at a particularly susceptible age. Because since then I see optimisation-through-complexity as a particular kind of danger.
Not optimisation on its own. Doing something with as little energy as possible is elegant.
And not complexity on its own either. Complexity has its own problems: reduced legibility, the creation of priesthoods to maintain it, etc.
But when you increase complexity in order to optimise, demand never really goes down. The optimisation becomes an opportunity to do more, and so the complexity gets locked in – there will never be the chance to remove it.
And that compounding complexity, layers upon layers of it, a nest of interlocking feedback loops, increases the risk of fatal, emergent complexity quakes.
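The worry is easier to see as a toy model than as a metaphor. Here’s a sketch (the model is entirely made up, not from Vinge and not a claim about any real system): components sit in a ring, a failed component dumps its load onto its neighbours, and the smaller the spare headroom each one carries, the further a single small shock propagates.

```python
# Toy cascading-failure model: an invented illustration of shrinking safety margins.
import random

def average_cascade_size(headroom, n=200, trials=3000, seed=0):
    rng = random.Random(seed)
    total_failed = 0
    for _ in range(trials):
        capacity = 1.0 + headroom                        # "optimisation" shrinks this
        load = [rng.uniform(0.5, 1.0) for _ in range(n)]
        load[rng.randrange(n)] += rng.uniform(0.3, 0.8)  # one small shock
        failed = set()
        frontier = [i for i in range(n) if load[i] > capacity]
        while frontier:
            i = frontier.pop()
            if i in failed:
                continue
            failed.add(i)
            for j in ((i - 1) % n, (i + 1) % n):         # dump load on the neighbours
                if j not in failed:
                    load[j] += load[i] / 2
                    if load[j] > capacity:
                        frontier.append(j)
        total_failed += len(failed)
    return total_failed / trials

for headroom in (0.8, 0.5, 0.3, 0.1):
    print(f"spare headroom {headroom:.1f}: "
          f"average components lost per shock = {average_cascade_size(headroom):.1f}")
```

With a generous margin the shock fizzles out; squeeze the headroom and the same small shock starts taking out most of the ring. Nothing about the individual components changed, only how tightly they were run.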
All of which colours my approach to everything from how I architect my code, to how I organise my finances, to what government policies I like.
Whenever I see something like Nest’s Rush Hour Rewards or Carbon Co-op’s Power Shaper, it makes me feel like we’re all taking one step closer to an invisible cliff edge, and the drop could be half a mile away, or it could be one inch.
I also have a high level of nervousness around magnets. I grew up with floppy disks (that would be wiped) and cathode ray tube screens (that would be permanently ruined). So magnets on wallets or toys – I’m on edge if they’re ever near electronics, and I watch them with a hawk eye until they’re at a safe distance. Magnets on laptops and iPads still seem wrong to me. Even though it’s fine now and has been for many years.
By which I mean, don’t take my views too seriously, I’m a mess of unfounded prejudices about emergent systems and ferrous solids.