Interconnected

How my Twitter bot makes personalised animated GIFs

Ben Brown noticed that my bot @5point9billion made him a personalised animated GIF when it tweeted him yesterday (on the occasion of light that left Earth as he was born, right at that moment passing the star Iota Pegasi, a little over 38 light years away). And he was curious about how it did that. So:

There's a previous write-up about @5point9billion here. From that post:

My new bot is called @5point9billion which is the number of miles that light travels in a year. The idea is that you follow it, tweet it the date of your birth (e.g. here's my starter tweet), and then it lets you know whenever you reach Aldebaran or wherever.

You get tweets monthly, and then weekly, and for the last couple of days... and then you pass the star. It feels neat, don't ask me why.

Since that write-up, I've also added a website to the bot. In addition to getting the realtime notifications on Twitter, you can sign in on the site and see what stars you've already reached.

Check this out: There's also a public view, with an animation. This is a 3D animated map of all the star systems we can see from Earth, within 100 light years. It sits there and rotates. You can type in your date of birth, and it'll show you what stars you've already reached.

I made this public view as a "kiosk" mode when @5point9billion was exhibiting at the Art of Bots show earlier this month. The stars were laid out on the floor, fanning out from the Sun, which was right by the kiosk. Here's a photo. It was good fun to walk out from the Sun till you found the star you'd just passed. And then to walk out to about 80 light years and think, hey, most people die around this point, and look at the stars falling just further from you and think, hey, I probably won't reach those. Huh.

The star map is drawn and animated in Javascript and WebGL using three.js which I really like.

And doesn't it look kinda the same as the personalised star map that the bot made for Ben? Yup.

Making animated GIFs

I knew I wanted to tweet out personalised, animated star maps, whenever a bot follower passed a star (there are over 500 followers, and between 2 and 5 of them pass a star each day).

Routes I considered but discarded pretty fast:

  • Generating the star maps offline. For sketching on my Mac, I use a Python drawing package called PlotDevice -- this is what I used to make the first quick-and-dirty star map. I don't like generating graphics offline because I want the ability to tweak and change my mind
  • Drawing the graphics frame by frame using a dedicated package like Cairo. But I already have star maps in Javascript for the browser. I don't like the idea of having two routes to draw the same graphics for different outputs. Feels like a lot of work

This is the rendering pipeline I settled on (there's a code sketch after the list):

  • The source animation is the same animation I use for the website... it's drawn in Javascript using three.js. It's just a page on my site
  • I already have queues and asynchronous processing on my website. The website is all Python because that's my preferred language, and I have my own Twitter bot framework that I'm gradually building up (this is a whole other story)
  • When a user passes a star, the machine responsible for that task adds a tweet to the send queue, and flags it as requiring media
  • At the appropriate time, the queue runner loads the animation page using PhantomJS, which is a web browser that can run headless on the server. It's possible to drive Phantom from Python using Selenium
  • Because the animation is created on demand, and generated just for this tweet, it can include personalised information like today's date, and the name of the user
  • The animation exposes a single Javascript function, step(), that renders the next frame. Phantom has the ability to reach into a page and make Javascript calls
  • Using Phantom, each frame of the animation is generated by calling step(), captured as a screenshot (a PNG) to an in-memory buffer, and then down-sampled to half its original dimensions (this makes the lines sharper)
  • Using images2gif (this is the Python 3 version of the library), the frames are assembled into an animated GIF, and saved as a temporary file
  • The GIF is optimised by shelling out to gifsicle, a command-line tool for that purpose
  • Finally, the media is uploaded to Twitter using Tweepy. Technically Twitter supports animated GIFs up to 5MB, but this is only available using a kind of chunked upload that Tweepy doesn't yet support, so the GIFs have to come in under 3MB. Twitter returns a media ID, which the code associates with the queued tweet in my send queue, and that is posted when its time comes round. (The send queue ticks every 40 seconds, because of Twitter's rate limits.)
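If that's hard to picture, here's a minimal sketch of the whole loop in Python. It's not the production code -- the URL, frame count, sizes and keys are all made up -- but the moving parts (Selenium driving PhantomJS, Pillow for the down-sampling, images2gif, gifsicle, Tweepy) are the ones listed above:

```python
import io
import subprocess
import tempfile

import tweepy
from PIL import Image
from images2gif import writeGif
from selenium import webdriver

ANIMATION_URL = "http://example.com/starmap?user=ben&draw=1"  # hypothetical
NUM_FRAMES = 60  # illustrative

# PhantomJS needs to be on the PATH for Selenium to find it
driver = webdriver.PhantomJS()
driver.set_window_size(1024, 512)  # capture at 2x; halved below
driver.get(ANIMATION_URL)

frames = []
for _ in range(NUM_FRAMES):
    driver.execute_script("step();")      # render the next frame in-page
    png = driver.get_screenshot_as_png()  # screenshot to an in-memory buffer
    img = Image.open(io.BytesIO(png)).convert("RGB")
    # Down-sample to half the original dimensions: the lines come out sharper
    img = img.resize((img.size[0] // 2, img.size[1] // 2), Image.LANCZOS)
    frames.append(img)
driver.quit()

# Assemble the frames into an animated GIF in a temporary file...
raw_gif = tempfile.NamedTemporaryFile(suffix=".gif", delete=False).name
writeGif(raw_gif, frames, duration=0.1, repeat=True)

# ...optimise it by shelling out to gifsicle...
optimised = raw_gif.replace(".gif", "-opt.gif")
subprocess.check_call(["gifsicle", "-O3", "--colors", "256",
                       "-o", optimised, raw_gif])

# ...and upload it to Twitter, keeping the media ID to attach to the
# queued tweet (keys are placeholders, obviously)
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)
media = api.media_upload(optimised)
media_id = media.media_id  # goes into the send queue with the tweet
```

(In the real thing, as above, the media ID is stored against the queued tweet rather than posted immediately.)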

If you're curious, here's the source animation on the website. And here's how it looks in a tweet.

If you want, knock the "draw=1" off the URL -- you'll get a blank page. Then call step() in your browser's Javascript console and see each frame being generated.

There's a wrinkle: Phantom doesn't support WebGL, so the star map animation in three.js had to be re-written to draw directly to canvas... which three.js supports but you have to add custom sprites and a few other things. It gets hairy, and I'm super happy to have worked with @phl on that side of things -- he looked after the Javascript drawing with his amazing code chops.

Another wrinkle: PhantomJS 2 (which this requires) installs on the Mac using Homebrew just fine, but is a pain to build on Ubuntu, which is what my server runs. There's a pre-built binary here.

In summary, this is a rendering pipeline which:

  • Fits my web-first approach... there's no separate drawing package just for these animations, so debugging an image is as simple as opening a browser window
  • Minimises the number of moving parts: I've added the ability to create images using Phantom but that's it, there's no separate drawing package or offline rendering
  • Is agile: I can tweak and change last minute

What else am I using this for?

I prototyped this rendering pipeline with another Twitter bot, @tiny_gravity, which just does a tiny particle simulation once every 4 hours. Sometimes it's pretty.

This animation doesn't use three.js for drawing; it uses processing.js, but the principle is the same. Again, the animation is just a webpage, so I can tweak the animated GIFs in the same way I tweak the rest of my website and bot behaviour. Here's that animation as a tweet.

One of the things I'm most enjoying about having multiple projects is how they cross-pollinate.

My main side project right now is my bookshop-in-a-vending-machine called Machine Supply. Here it is at Campus, Google's space for entrepreneurs in Shoreditch, London.

It tweets when it sells a book. Because of course it does.

The selection is changed over every Monday, and you'll notice that each of the books has a card on the front (here's a photo) because every book is recommended by a real human made of meat.

These cards and the shelf talkers (the label which says the item code and the price) are beautifully designed by my new friends at Common Works. But they're a pain to produce: the layout templates are in InDesign (which I don't have), so I have to send an Excel spreadsheet of the new stock over to Sam at Common Works, who then puts it into the template and prints.

My new process comes straight out of the @5point9billion code. The browser is my layout tool.

So Sam moved from InDesign to the web, and here are this week's shelf talkers as HTML. This is part of my admin site; I've temporarily turned off permission checking for this page so you can see. The template is automatically populated with details from the weekly planogram. (A planogram is the merchandising layout for a set of shelves or a store.)

And here's the exact same page as a PDF. The pipeline is taken from @5point9billion: Phantom is used to grab the webpage, and this time render it to a PDF, complete with vector fonts and graphics. Because it's a PDF, it's super exact -- which it needs to be to print right and fit neatly on the shelf edge.
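The Phantom side of that is tiny. Here's a sketch -- it leans on the rasterize.js script that ships in PhantomJS's examples directory, and the admin URL is made up:

```python
import subprocess

# rasterize.js (bundled with PhantomJS) takes a URL, an output filename
# and a paper size, and renders the page to a PDF with vector fonts
subprocess.check_call([
    "phantomjs", "examples/rasterize.js",
    "http://example.com/admin/shelf-talkers",  # hypothetical admin page
    "shelf-talkers.pdf",
    "A4",
])
```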

It's much quicker this way.

My rule for Machine Supply, as a side project, is that it should take the minimum of my time, never feel like an obligation, and I should be able to manage it on the hoof. As a hobby, it should be Default Alive.

So automation is helpful. I like that this mode of generating PDFs can be done without my laptop: I can do everything from my phone, and print wirelessly.

Anyway. You should follow @5point9billion! It's fun, and you get a personalised animated GIF every time you pass a star, generated with the most ludicrous rendering pipeline ever.

Hardware-ish coffee morning tomorrow

[Short version: Coffee morning on Thurs 14 April in Old St! I sent this out to the coffee morning newsletter last week. Subscribe to the newsletter here.]

My Dearest Droogs,

We haven't had a hardware-ish coffee morning at all this year. I've had no Thursdays because I've been avec job for a few months earning coins. I know, I know, but it happens to all of us sometimes. Still, done with that now, and fingers crossed I can avoid gainful employment for a little while longer.

Let's hang out and drink too much coffee and talk about hardware! Same bat-time, same bat-channel:

Thursday 14 April, 9.30am for a couple of hours, at the Book Club (100 Leonard St).

Same old format... If you're curious about, or working in... designing physical things, paper or weird sensors, installations, knitting, manufacturing, internet-connected doodads, retail for hardware startups, sculpture, investment, or whatever, please come along.

There's no formal intros so it's easy to sneak off if everyone is horrendous. (They're mostly not.) Come say hi to me when you turn up, and we'll make sure you chat with interesting folks. (Most everyone is interesting.) Everyone loves prototypes, so bring em along if you have em. There are usually one or two.

If you're a woman, or don't present or identify as a dude, please do feel welcome. It's a concern to me that this tech industry, while very human and egalitarian in its early days (this goes for mainstream tech and hardware startups too), appears to heavily skew to Mainly Dudes as time goes on. That's something I can push against, a tiny bit, by trying to ensure these coffee mornings don't go the same way.

(On which serious note: If you don't feel you would be welcome - obviously or in hidden ways - at a hardware-ish coffee morning, and you'd be willing to share your feedback privately with ideas of how I could improve the format, I'd like to hear. My personal email is matt AT interconnected DOT org. Thank you!)

See you on the 14th!

Matt

ps. I've got a new hobby and it's a robot bookshop that tweets. You can visit it! Here it is. It's called Machine Supply.

Filtered for vending machines

1.

30 bizarre vending machines from around the world.

Live hairy crabs, cupcakes, acne medication, a real live person who hands out sweets.

See also this ticket machine in Japan, where pressing the "help" button leads to an attendant appearing from a tiny hidden door. I can't figure out whether this video is real or not.

2.

Beautiful vending machines that sell fresh salads. (Made for office lobbies.)

The Auto Store... Versatile and modular, ASC provides a retail platform that can be easily adapted to any retail environment, integrating sales and smart locker compartments.

Smart lockers are smart. See Doddle which now has concessions at train stations, allowing for e-commerce click and collect -- and also returns.

3.

A brief history of book vending machines.

See also: The short story vending machine in Paris. Uses a receipt printer.

And not forgetting that the modern paperback was popularised by Penguin, together with a new form of distribution and the ability to sell books outside traditional bookshops.

The New Yorker on mass-market paperbacks:

More than a hundred and eighty million books were printed in the United States in 1939, the year de Graff introduced Pocket Books, but there were only twenty-eight hundred bookstores to sell them in. There were, however, more than seven thousand newsstands, eighteen thousand cigar stores, fifty-eight thousand drugstores, and sixty-two thousand lunch counters—not to mention train and bus stations.

So...

The mass-market paperback was therefore designed to be displayed in wire racks that could be conveniently placed in virtually any retail space.

I like this:

You can’t tell a book by its cover, but you can certainly sell one that way.

4.

Vending Machine (2009) by Ellie Harrison:

An old vending machine is reprogrammed to release free snacks only when search terms relating to the recession make the headlines on the BBC News RSS feed.

Related: Tim Hunkin's Novelty Automation: A new London arcade of satirical home-made machines -- and if you haven't visited already (it's near Holborn) you must, you must.

Filtered for quick links

1.

A robot that sits on your back and feeds you tomatoes while you run.

2.

By Mattel, toy race cars driven by live crickets.

3.

Video: Star Wars by Ken Loach.

4.

The bedrock under Manhattan and how it leads to taller buildings.

Lessons on finding flow

The following was written using The Most Dangerous Writing App, which deletes everything unless you type continuously for 5 minutes, on 29 February 2016 at 19:05. You get 5 seconds' grace. Discoveries are made. Output follows.

This reminds me of that game on Radio 4 where you have to speak continuously for one minute, with no hesitation, deviation, or repetition. Except here I don't think repetition matters. It's all about not stopping.

Which means maybe it's more like the movie Speed with Keanu Reeves where he couldn't slow the bus down below whatever it was, 40 mph, or otherwise it would blow up.

Explode.

Go bang.

Or maybe, it occurs to me, it's more like that neuroscience experiment where you try to say as many difficult challenges as possible for a whole minute. And the effort of that results in more blood flow to the brain, and because that's already a large amount of your oxygen usage anyway, that's detectable, and your head should be warmer, or you end up breathing faster, or something like that.

I don't remember.

The weird thing with this experiment is that it's not the paragraphs that are hard to figure out. I have enough time while I'm typing to choose something that comes next.

No. The problem is this:

It's when I get halfway through a sentence and I don't know exactly how to phrase what I want to say. So I usually pause for a second, delete, choose a different word. Or pause for longer, and in that gap go back and want to revise the previous sentence.

Which breaks my flow state. When I get lost in a particular word - a stutter if you like - I stop being able to think of what's happening in the next paragraph.

I feel that there's a lesson here in how I write usually.

Notes, discovered at this point 4 minutes in, that I need to remember for later, about how to write more fluently without using this app:

  1. I need to slow my writing down, in general, so that I can plan the next paragraph.
  2. I need to keep writing and keep moving forward. Don't go back, don't revise as I go. I can revise later, and that's editing. The point is to write without stopping.
  3. I need to capture this state without the app.

I've got to 5 minutes now, which is the stopping point, and already I found I have revised this sentence by deleting its second clause; I have gone back and added point 3 above which wasn't there before; I am pausing slightly to second guess myself.

So, lessons. Time to stop.

A week later

Looking back on what I wrote a week ago, I boil it down to this:

Writing and editing are separate tasks, and I should approach them in different ways and at different times.

I was only able to see this after finding flow for, what, four minutes. And this category of ideas that are only visible after some period of time, or some kind of journey... this is interesting to me.

I've been reading about scoring centuries in cricket and there's something resonant for me in those stories about getting to the magic 100: An individual game, every ball the same as the last but somehow not; a score made run by run. Don't think about the 100 when you start, just start. Every ball on its merits. Even the greats remind themselves to watch the ball every time one is bowled. You can't score runs from the pavilion.

Filtered for rambling thoughts

1.

Position. How fast position changes is velocity. How fast velocity changes is acceleration. How fast acceleration changes is... jerk. Then snap, crackle, and pop.

Here it is explained:

So each one is a measure of how fast the previous one is going. Position is the location of your car, velocity is the speed of your car, acceleration is how hard you have the foot on the gas. Jerk is how fast your foot is moving on the accelerator, snap is how fast your foot is accelerating on the accelerator. It can be conceptually visualized as the pedal controlling the thing you're looking at as you just keep repeating it.

And:

The thing is that large variations in 'snap' can be visible as "unnatural" or "uncanny" ... A very consistent 'snap', even when "jerk" is strongly controlled, can make things feel overly precise or planned. Imagine someone "doing the robot dance"
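In calculus terms, each one is just the time derivative of the one before: if \(x\) is position, then

\[
v = \frac{dx}{dt}, \qquad a = \frac{d^2x}{dt^2}, \qquad \text{jerk} = \frac{d^3x}{dt^3}, \qquad \text{snap} = \frac{d^4x}{dt^4}
\]

and crackle and pop are the fifth and sixth derivatives.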

2.

Scientifically accurate images of Earth's sky with Saturn's rings taking into account shadows and latitudes.

See also: Views of Jupiter from the top and bottom.

I was getting the tube back home the other day, just after sunset -- the sky was burnt umber, all deep orange and brown and shadowed. And because it was late, it was dim, that low light where it's not bright and not dark, but the clouds look painted on the horizon, unlit. But so, yeah, it looked like a gas giant, a planet just hanging there.

And I thought... Well, Nasa have their Pluto time widget which tells you at what time of day it's the same brightness as it is around Pluto, for where you are. So I worked it out for Jupiter, where sunlight is about 1/27th as bright as on Earth -- which, it turns out, for London, on a winter's evening, is about 6pm.
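(That 1/27th is just the inverse-square law -- Jupiter orbits at about 5.2 times Earth's distance from the Sun, so

\[
\frac{I_\text{Jupiter}}{I_\text{Earth}} = \frac{1}{5.2^2} \approx \frac{1}{27}.
\]

Sunlight thins out fast.)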

So by coincidence, that day in the late afternoon, I was looking out at a sunset with the colours and pattern of Jupiter, just as bright as Jupiter itself would look, if I was hanging by it in orbit, gazing across its deep clouds and churning storms.

See also: Images of whole galaxies as if they were teeny-weeny.

3.

Back in 2007, it used to be that tech startups were old Unix tools warmed over for the Web. grep is Google. finger is Facebook.

Then there was an era where tech startups were about individuals doing stuff publicly. YouTube, blogging, Twitter.

I think there's a similar, simple pattern now: There are a ton of big startups aimed at doing stuff your parents used to do for you.

Uber is being carted around. On-demand laundry and odd jobs is about having someone pick up after you. Food delivery is about being cooked for.

I know I have this preoccupation about being infantilised by brands -- cynically: modern coffee is a thin excuse for grown adults to drink hot sweet milk from a sippy cup.

But there's a difference between doing stuff for me (while I lounge in my Axiom pod), and giving me superpowers to do more stuff for myself, an online Power Loader equivalent.

And with the re-emergence of artificial intelligence (only this time with a buddy-style user interface that actually works), this question of "doing something for me" vs "allowing me to do even more" is going to get even more pronounced. Both are effective, but the first sucks... or at least, it sucks according to my own personal politics, because I regard individual alienation from society and complex systems as one of the huge threats in the 21st century.

This user experience "stance" is similar to the dichotomy we see in Internet of Things consumer products: Is it me controlling the product with my smartphone, or does the product have smarts of its own? I favour the second. There are a lot of smart home gadgets that you need a phone to control. Fine.

But when you use Sonos speakers you find that they connect to the streaming music services across the Internet themselves. You ask the speaker to "tune in" to music using your phone. Then somebody else can use their phone to adjust the track, change the album, whatever.

The difference between these two stances sounds minuscule and academic... but one approach leaves the product diminished, no more than a physically rendered version of an app. And the Sonos approach allows the speaker to stand alone, and consequently become more social and more part of the home.

I don't know how to refer to this design challenge (in Internet of Things, in artificial intelligence) except as stance. There must be a better way of talking about it.

4.

Some more imagery...

Beautiful photos of Tokyo.

Geometrica, patterns by Guy Moorhouse.

The gorgeous landscapes of Grand Theft Auto V.

Augustus and Caesarion

I'd like to read a story about these two.

Cleopatra, from an ancient civilisation and a family that rules an even more ancient civilisation, the first in 250 years to really put down roots and speak the language.

Caesar, bringing about the end of the republic, the expansionist warmonger of the upstart empire.

They fall in love. Love and politics. She has a son with him to cement the throne. For Caesar this is possibly his only son.

Caesar is assassinated in the death spasm of old Rome. Cleopatra falls in love with his right-hand man.

Caesar's adopted son - after a second war that engulfs the Mediterranean, the first being Caesar's civil war - succeeds him: Augustus.

Cleopatra's son, Caesarion, succeeds her.

Then the upstart empire takes the most ancient one, and Augustus kills Caesarion.

Did they meet? Caesarion escaped for a time and was lured back. Did they have a final conversation? That must have been something.

Two sons, two brothers. One by blood, one anointed; one with history on his side, the other with the future. Augustus was the founder of the Roman Empire, which would last 400 years.

Another story says that Caesarion escaped.

Filtered for some nice words

1.

From this article about how computers play chess,

The values [of moves] are commonly measured in units of 0.01 called centipawns -- figuratively hundredths of a pawn.

Centipawns!

See also: The micromort, which is a unit of risk measuring a one-in-a-million probability of death. For example, simply living in England and Wales exposes you to 24 micromorts daily; flying 12,000 miles adds one more.

2.

Look, it's awkward to mention anything by Ezra Pound -- and by "awkward" I mean, wasn't he a fascist and wildly antisemitic? Not the kind of ideas I want near me.

Anyway, he wrote Revolt Against the Crepuscular Spirit in Modern Poetry which is... well, full of feeling.

I bid thee grapple chaos and beget

Some new titanic spawn to pile the hills and stir

This earth again.

See also Ozymandias by Shelley, which I hadn't realised was so short, just a sonnet, but picture-packed to the rafters.

3.

The fall of Jersey: how a tax haven goes bust, an article which includes this gorgeous phrase:

45 square miles of self-governing ambiguity entirely surrounded by water.

The ridiculous physics of the free market.

4.

So there are rumours that gravitational waves have been detected.

But waves in what? Waves in the fabric of spacetime itself, or in physics words: it's perturbations to the Metric (a description of the curvature of spacetime), where zero-amplitude corresponds to Minkowski space.
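In the linearised picture, that reads: the metric is flat spacetime plus a small ripple,

\[
g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu}, \qquad |h_{\mu\nu}| \ll 1,
\]

where \(\eta_{\mu\nu}\) is the Minkowski metric. Set the ripple \(h_{\mu\nu}\) to zero and you're back to flat, empty spacetime.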

From the linked paper,

There are also a number of “exotic” effects that gravitational waves can experience ... scattering by the background curvature, the existence of tails of the waves that interact with the waves themselves, parametric amplification by the background curvature, nonlinear coupling of the waves with themselves (creation of geons, that is, bundles of gravitational waves held together by their own self-generated curvature) and even formation of singularities by colliding waves

You know that feeling where you're listening to choral music in Latin, or Buddhist chanting, and you don't know the words but the sound of them is enough? Yeah. Perturbations to the Metric.

Vincent van Gogh on the stars

Letter from Vincent van Gogh to Theo van Gogh, Arles, c. 9 July 1888:

That brings up again the eternal question: is life completely visible to us, or isn't it rather that this side of death we see one hemisphere only?

And:

For my own part, I declare I know nothing whatever about it. But to look at the stars always makes me dream, as simply as I dream over the black dots of a map representing towns and villages. Why, I ask myself, should the shining dots of the sky not be as accessible as the black dots on the map of France? If we take the train to get to Tarascon or Rouen, we take death to reach a star. One thing undoubtedly true in this reasoning is this: that while we are alive we cannot get to a star, any more than when we are dead we can take the train.

So it doesn't seem impossible to me that cholera, gravel, pleurisy & cancer are the means of celestial locomotion, just as steam-boats, omnibuses and railways are the terrestrial means. To die quietly of old age would be to go there on foot.

Which, to me, puts his Starry Night on a bigger canvas than it had before.

Filtered for background

1.

Time-lapse of scenery in Red Dead Redemption.

Western video game. Keeping my fingers crossed for a sequel.

2.

Homo erectus made world's oldest doodle 500,000 years ago.

Art by non-humans would make me feel we had more company in the universe. I'm not quite sure this counts, but it's close.

3.

The art of viewing moss.

Microscopic rainforests.

4.

The colour of e-ink... that grey screen that goes back to the first Amazon Kindle.

Or these new London bus stops using e-paper, the same.

Web pages used to always have a grey background -- inherited from the grey used by Mosaic in 1993.

The other day I stepped out of the tube and the sky was this medium grey -- not matte, not dark, not bright, not quite pearlescent, just... there. The colour of an e-ink screen before the words arrive.

See also, blue.