Given how many screens there are, you would have thought there would be more new stuff.
It’s been 20 years since Apple shipped the Mac OS X Aqua interface, with all its reflections and transparency – the one Steve Jobs called lickable.
So where’s my operating system which has a physics engine plugged in? One that moves the reflections along with the time of day, making the on-screen light source travel with the sun?
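To make that concrete, here's a minimal sketch in Python of how a window server could derive an on-screen light vector from the wall clock. The solar model is deliberately crude and the function names (`sun_angle`, `light_direction`) are made up for illustration; a real compositor would use proper astronomy plus the device's location, and feed the vector into its own shading pass.

```python
import math
from datetime import datetime

def sun_angle(now: datetime, sunrise_hour: float = 6.0, sunset_hour: float = 18.0) -> float:
    """Very rough sun elevation in degrees: 0 at sunrise/sunset, ~90 at solar noon.
    A real implementation would use latitude, longitude, and date; this toy version
    only needs the wall-clock time."""
    hour = now.hour + now.minute / 60.0
    if hour < sunrise_hour or hour > sunset_hour:
        return 0.0  # night: keep the light flat (or swap in a "moonlight" source)
    day_fraction = (hour - sunrise_hour) / (sunset_hour - sunrise_hour)
    return 90.0 * math.sin(math.pi * day_fraction)

def light_direction(now: datetime) -> tuple:
    """Map the sun's position to a 2D light vector the compositor could feed to its
    reflection pass: x sweeps across the screen through the day, y is elevation."""
    hour = now.hour + now.minute / 60.0
    x = (hour - 12.0) / 6.0  # -1 in the morning, 0 at noon, +1 in the evening
    y = math.sin(math.radians(sun_angle(now)))
    return (max(-1.0, min(1.0, x)), y)

print(light_direction(datetime.now()))
```

Highlights on buttons would then drift across the glass over the course of an afternoon, which is the whole point.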
It’s been 19 years since Pixar released Monsters, Inc. with all that CGI hair. Where are my hairy icons? Ones that get all long and knotted as the notification count goes up.
Why can’t I feel my phone? I found that paper from 2010 (when I was complaining about keyboards) about using precision electrostatics to make artificial textures on touchscreens.
I should be able to run my thumb over my phone while it’s in my pocket and feel bumps for apps that want my attention. Touching an active element should feel rough. A scrollbar should slip. Imagine the accessibility gains. But honestly I don’t even care if it’s useful: 1.5 billion smartphone screens are manufactured every year. For that number, I expect bells. I expect whistles.
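Here's a minimal sketch, again in Python, of what the mapping might look like: UI roles on one side, electrovibration parameters on the other, loosely in the spirit of that 2010 electrostatic-texture work. Everything here is hypothetical – the `Texture` fields, the role names, and the idea that a haptic controller would consume them – it's just to show how little plumbing the idea needs.

```python
from dataclasses import dataclass

@dataclass
class Texture:
    friction: float  # 0.0 = glassy, 1.0 = maximum electrostatic "roughness"
    bump_hz: float   # frequency of a repeating ridge pattern, 0 for smooth

# Hypothetical mapping from UI roles to texture parameters:
# active elements feel rough, scrollbars slip, notifications bump.
TEXTURES = {
    "button":       Texture(friction=0.7, bump_hz=0.0),
    "scrollbar":    Texture(friction=0.1, bump_hz=0.0),
    "notification": Texture(friction=0.9, bump_hz=8.0),  # the in-pocket bump
    "background":   Texture(friction=0.3, bump_hz=0.0),
}

def texture_under_finger(element_role: str) -> Texture:
    """What the (imaginary) haptic controller would render at the touch point."""
    return TEXTURES.get(element_role, TEXTURES["background"])

print(texture_under_finger("scrollbar"))
```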
There are probably all kinds of reasons why screens are basically sharper now and that’s it. Lack of competition. Developers wouldn’t support it. Whatever. Cars were better when they had fins. They don’t have fins now and they aren’t as good; I’m not interested in why. What’s the point of technology if we’re not going to have fun with it?
The Nintendo 3DS came out in 2011 with a parallax barrier layer on the screen that allowed everything to be slightly 3D. Autostereoscopy. It was awesome for 3D photos. Almost a decade later – surely this should be on tablet computers now and really really effective? Imagine the medical imaging applications.
Why are we stuck with only three subpixels, for red, green, and blue? Why isn’t there a fluorescent yellow subpixel to make alerts really pop? If we don’t play, we won’t find the uses.
In 2012, iOS 6 had metallic buttons with faux reflectivity. It’s 8 years on. Why isn’t there a fourth subpixel, and the fourth subpixel is a mirror? Come on, people, it’s 2020.