1.
Voice systems are always listening, but it’s expensive (and invasive) to analyse everything picked up by the microphone. Hence wake-up words, which keep the rest of the system switched off until heard, and are, in theory, cheap to detect.
How the “Hey Siri” wake-up words work, by Apple’s machine learning team.
The wake-up word detector runs as a tiny brain. In the following, DNN stands for Deep Neural Network.
To avoid running the main processor all day just to listen for the trigger phrase, the iPhone’s Always On Processor (AOP) (a small, low-power auxiliary processor, that is, the embedded Motion Coprocessor) has access to the microphone signal (on 6S and later). We use a small proportion of the AOP’s limited processing power to run a detector with a small version of the acoustic model (DNN). When the score exceeds a threshold the motion coprocessor wakes up the main processor, which analyzes the signal using a larger DNN.
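To make that two-stage cascade concrete, here’s a minimal sketch in Python. This is not Apple’s code: the class names, frame size, and thresholds are assumptions for illustration. The point is the asymmetry: a tiny model scores every audio frame within the coprocessor’s power budget, and the expensive model only runs on the rare frames that look promising.

```python
# Minimal sketch of a two-stage wake-word cascade (illustrative only;
# class names, frame size, and thresholds are assumed, not Apple's).
import numpy as np

FRAME_SIZE = 1600        # e.g. 0.1 s of 16 kHz audio (assumed)
SMALL_THRESHOLD = 0.8    # trigger score for the cheap, always-on detector
LARGE_THRESHOLD = 0.95   # confirmation score for the bigger model


class TinyDNN:
    """Stand-in for the small acoustic model on the low-power coprocessor."""
    def score(self, frame: np.ndarray) -> float:
        # A real model would return P("Hey Siri" | recent audio frames).
        return float(np.clip(np.abs(frame).mean(), 0.0, 1.0))


class LargeDNN:
    """Stand-in for the larger acoustic model on the main processor."""
    def score(self, frame: np.ndarray) -> float:
        return float(np.clip(np.abs(frame).mean() * 1.1, 0.0, 1.0))


def listen(frames, small: TinyDNN, large: LargeDNN):
    """Score every frame with the cheap detector; only run the expensive
    model on frames that cross the first threshold."""
    for frame in frames:
        if small.score(frame) < SMALL_THRESHOLD:
            continue                       # main processor stays asleep
        if large.score(frame) >= LARGE_THRESHOLD:
            yield "wake word confirmed"    # hand off to the full recogniser


# In real use the frames would come from the microphone; random noise here.
if __name__ == "__main__":
    stream = (np.random.rand(FRAME_SIZE) for _ in range(100))
    for event in listen(stream, TinyDNN(), LargeDNN()):
        print(event)
```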
Compiled tiny brains. High-accuracy, low-power recognisers, super-focused single-feature fetishisers.
A.I. on dedicated silicon is getting cheeeeeap.
My hunch:
Give it a few years, and I reckon voice-on-a-chip and hand-gesture-sensitive-lensless-camera-on-a-chip and make-any-surface-touch-sensitive-on-a-chip and make-use-of-nearby-watches-and-headphones-on-a-chip will be so accurate, so power efficient, and so cheap that they will undercut the cost of physical interface components like buttons and screens – and therefore be used instead. For everything from kitchen scales to door locks. Which will change how we interact with products and what they look like.
2.
Dung Beetles Use the Milky Way for Orientation
This finding represents the first convincing demonstration for the use of the starry sky for orientation in insects and provides the first documented use of the Milky Way for orientation in the animal kingdom.
3.
What factories looked like in the age of steam:
The mechanical power came from a single massive steam engine, which turned a central steel drive shaft that ran along the length of the factory. Sometimes it would run outside and into a second building.
Subsidiary shafts, connected via belts and gears, drove hammers, punches, presses and looms. The belts could even transfer power vertically through a hole in the ceiling to a second or even third floor.
And then electricity:
But electric motors could do much more. Electricity allowed power to be delivered exactly where and when it was needed.
Small steam engines were hopelessly inefficient but small electric motors worked just fine. So a factory could contain several smaller motors, each driving a small drive shaft.
Electricity changed factory architecture:
A factory powered by steam needed to be sturdy enough to carry huge steel drive shafts. One powered by electricity could be light and airy.
Steam-powered factories had to be arranged on the logic of the driveshaft. Electricity meant you could organise factories on the logic of a production line.
Old factories were dark and dense, packed around the shafts. New factories could spread out, with wings and windows allowing natural light and air.
More here: Why didn’t electricity immediately change manufacturing?
4.
The fractional horsepower motor domesticated the factory drive shaft, taking it right into the home:
Electrification began in cities around 1915 and with electrification so too came the potential market for washing machines, refrigerators, vacuum cleaners and a host of other commercial appliances. … By 1920, over 500,000 fractional horse-power motors were powering washers and other appliances in America.
Back in 2012, I wrote about fractional artificial intelligence. Here’s a talk on the same topic from 2010. Watching it now is like watching somebody stumbling around in the dark, but I think this is what’s happening today.