Day 1 notes from picking up a modern VR headset

19.47, Wednesday 20 Apr 2022

I’ve tried VR a few times over the past few years - enough to know that it’s amazing - but never spent real time with it. So I picked up a Meta Quest 2, which is a standalone headset with two handheld controllers.

This is my day 1 response. Every time I pick the thing up my views evolve, and I wanted to capture my earliest impressions.


(Side note #1: the Meta Quest used to be known as the Oculus Quest. Meta is the new name for Facebook, and it’s named for the “metaverse”, which is the imagined future immersive VR world that we all inhabit. My view? A lo-fi metaverse is possible.)

(Side note #2: The ecosystem is pretty confusing to a newb. It seems like other headsets are basically output devices for other things, like having a fancy head-mounted monitor for your PlayStation or Windows PC? I wanted to experience VR as something self-contained, like a phone or a desktop, so I went for the Quest 2… but it turns out that it’s not entirely standalone. You need a powerful Windows box to pair it with if you want to try certain games or apps.)


Keep in mind that I’m not a gamer. I play video games from time to time, and love a few, but I don’t have a gaming PC and it’s not something that really holds my attention.

Could VR one day displace my laptop or even my phone?

Apps not games: that’s what I’m into understanding. How far off is that? What are the design challenges?

Spoiler: No conclusions yet. After one day I’m still training my intuitions.


The operating system “frame” to all the apps is waiting for its Macintosh moment.

A big job of the OS, from a user’s perspective, is to help you find apps, launch apps, and give you a consistent experience. That helps app developers (discoverability!) and also users (familiarity!).

That’s what the “grammar” of the original Macintosh did so well: windows, menus, icons, cross-app copy and paste, drag and drop, and so on. If you learnt how to use one application, you could use them all.

Apple’s Human Interface Guidelines were revolutionary: a philosophy and a spec all at once. You can read them online, as a PDF hosted by Andy Matuschak: Human Interface Guidelines: the Apple Desktop Interface (1987).

The iPhone performed the same trick, only it updated the metaphor to make it more immediate: instead of clicking a mouse with your finger, which caused a cursor to click an icon, you literally tap an icon with your finger. There is no gap between the embodied action and the metaphorical action.

With Quest 2, there are floating on-screen “buttons” that you “tap” by directing a pointer with your hand controller and pulling a trigger.
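
For the technically curious, that laser-pointer tap is simple machinery underneath, and easy to sketch. Here’s a toy illustration in TypeScript (entirely my own invention, not Meta’s actual code): cast a ray from the controller’s pose, intersect it with the plane of the floating panel, then test the hit point against the button’s bounds.

```ts
// Toy sketch of a VR "laser pointer": intersect the controller's forward
// ray with the plane a floating panel sits on. All names are hypothetical.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// Returns where the ray hits the panel's plane, or null if the panel is
// parallel to the ray or behind the controller.
function rayHitsPanel(
  rayOrigin: Vec3,    // controller position
  rayDir: Vec3,       // controller forward vector, normalised
  panelPoint: Vec3,   // any point on the panel's plane
  panelNormal: Vec3   // the plane's normal, normalised
): Vec3 | null {
  const denom = dot(rayDir, panelNormal);
  if (Math.abs(denom) < 1e-6) return null; // ray parallel to panel
  const t = dot(sub(panelPoint, rayOrigin), panelNormal) / denom;
  if (t < 0) return null; // panel is behind the controller
  return {
    x: rayOrigin.x + rayDir.x * t,
    y: rayOrigin.y + rayDir.y * t,
    z: rayOrigin.z + rayDir.z * t,
  };
}
```

Pull the trigger while the hit point sits inside a button’s rectangle and that’s the “tap”. Quite a lot of machinery standing in for a finger touching glass.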

I understand that this is a really minor thing for me to focus on. I’m not saying: here is an example of poor design. No. Instead, this is a sign that we’re all still figuring out what this medium is for and what the natural interactions should be.

Likewise, when there are more apps, the OS-makers will be able to see what the common interface patterns are. Like, do they all organise in a faux 3D environment that the user moves around? Or on the inside of a sphere? Etc. At which point the OS will be able to provide standard tools for apps to draw these UIs.

It takes time and it takes work.

Case in point: this morning I learnt about Quest 2 hand tracking (find out how to activate it here). There’s a whole different way to tap buttons (now: pinch) and interact with interfaces, without using controllers at all.
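
The pinch is a nice example of how simple the new grammar is underneath. Here’s a minimal sketch using the WebXR Hand Input API (the joint names and getJointPose come from the real spec; the threshold value is my own guess, not anything from Meta’s SDK): a pinch is just the thumb tip and index fingertip getting close enough together.

```ts
// Minimal pinch detection with the WebXR Hand Input API.
// "thumb-tip" and "index-finger-tip" are real joint names from the spec;
// the 2 cm threshold is an assumption for illustration.
const PINCH_THRESHOLD_M = 0.02;

function isPinching(
  frame: XRFrame,
  hand: XRHand,
  space: XRReferenceSpace
): boolean {
  const thumb = hand.get("thumb-tip");
  const index = hand.get("index-finger-tip");
  if (!thumb || !index) return false;

  const thumbPose = frame.getJointPose(thumb, space);
  const indexPose = frame.getJointPose(index, space);
  if (!thumbPose || !indexPose) return false;

  // Distance between the two fingertips, in metres
  return Math.hypot(
    thumbPose.transform.position.x - indexPose.transform.position.x,
    thumbPose.transform.position.y - indexPose.transform.position.y,
    thumbPose.transform.position.z - indexPose.transform.position.z
  ) < PINCH_THRESHOLD_M;
}
```

A few lines of distance maths and “tap” becomes a completely different physical action. No wonder the grammar hasn’t settled.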

In the meantime there’s no consistent grammar to the apps. I’d like to see more wild experimentation tbh.


There is problematic asymmetry in physical social space.

The Quest 2 doesn’t come with headphones. It plays sound out of the sides. No headphones is good because it means I have some awareness of what’s around me… but not great for anyone else nearby. It’s noisy! I can’t see who I’m interrupting. There’s just the sound of someone closing a door on me.

I was fully expecting VR to be antisocial, and that’s completely fine. (Actually it’s kinda funny when I’m playing mini golf in the front room while my wife is watching TV. She gets to laugh at me and we’re still together.)

It’s the asymmetry in physical social space that surprised me.

So you have asymmetry with sound (you can interrupt people, but can’t tell that you’re interrupting them). You also have it with vision.

There’s a feature of the Quest 2 called passthrough. Here are some GIFs on their developer blog. The idea is that, in certain situations, you can see the room around you in fuzzy black and white. (The Quest 2 has external-facing cameras, and it can play the feed on the internal display.)

Passthrough is magical! (It’s used in some clever ways that I’ll mention below.)

But passthrough is weird because I can sometimes see people outside the headset and sometimes not. But they can’t tell.

The thing about gaze is that it’s reciprocal. If I’m looking at you, you can tell (and vice versa). Until now. Using passthrough to see my wife feels a bit like spying? Like, she has an expectation that I’m immersed, but secretly I’m peeping.

I feel like the headset should have cartoon eyes that appear on the outside when passthrough is engaged.


I find it physically demanding, with stinging eyes and motion sickness.

My eyes sting. I’m not blinking enough, I think? I’ll get used to it.

The headband positioning and weight feel kinda… off. If I don’t get the headset in precisely the right place (which is not its natural resting point), I get blurred vision. Industrial design tweaks will fix this.

My lower arms get fatigued quickly when using hand tracking (though, weirdly, not with the handheld controllers). Iterating the interaction design will stop this happening.

Motion sickness. Oh my god.

Years ago I tried an Oculus Rift with a low poly game and it was beautiful. I stood on an island beach at sunset, looking from the pink blocky trees out to the horizon. Looking up, I saw rain falling, and as it stopped and the clouds parted, stars sparkled. It moved me to tears.

A week later I tried the same game again - it was in the process of being ported to VR - and the debug code had been left in. The extra code dinged the latency, and the view apparently lagged a few milliseconds behind my head moving. It was enough to put me flat on my back in cold sweats for 20 minutes.

VR sickness is wild. I’m still prone to it.

I toured Anne Frank’s house using the Quest 2, and my goodness what an incredible experience. Well put together (I explored the space thoroughly) and a story clearly told. I had a lot to think about at the end.

Then I went straight onto a dinosaur-themed rollercoaster. I shouldn’t have chained those two apps. As if the emotional whiplash wasn’t enough, I was on my knees during the rollercoaster, took the headset off halfway through, and was queasy for the rest of the night.

I think it was something to do with the motion on the coaster? The third and fourth differentials of motion (jerk and snap) weren’t eased; you could sense the snap.

Any VR operating system needs to bake “correct motion” into the SDK provided to app and game developers. Developers should have to fight the code to make people feel sick.
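
To make “correct motion” concrete: one way an SDK could bake it in is to only hand developers easing curves whose higher derivatives vanish at the endpoints. A toy sketch (my own, not any real SDK’s function) using the 9th-order smoothstep, whose velocity, acceleration, jerk, and snap all go to zero as a motion starts and stops:

```ts
// The 9th-order smoothstep S4: its first four derivatives are zero at
// t = 0 and t = 1, so motion eased with it has no step change in
// velocity, acceleration, jerk, or snap at the ends of a segment.
// My own sketch for illustration, not a real SDK function.
function smoothstep4(t: number): number {
  const x = Math.min(Math.max(t, 0), 1); // clamp to [0, 1]
  // 126x^5 - 420x^6 + 540x^7 - 315x^8 + 70x^9, in Horner form
  return x * x * x * x * x * (126 + x * (-420 + x * (540 + x * (-315 + x * 70))));
}

// Hypothetical usage: progress along one coaster segment.
// const s = smoothstep4(elapsedTime / segmentDuration);
// cameraPosition = lerp(segmentStart, segmentEnd, s);
```

An SDK that only offered curves like this would make the sickening version the hard one to write.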


Here are four magical moments.

I’m such a Debbie Downer. Virtual reality is amazing. My observations sound like criticisms but they’re not; I’m just trying to get a sense of how far along the tech is. So let’s balance that with some great moments.

I’ve used VR before. I have an Oculus Rift, original Kickstarter edition, in a box upstairs. I tried VR back in the Virtuality days of the mid-90s, the last VR boom. Then periodically over the last few years.

Given all that, here’s what made me gasp on day 1 and what I’m still thinking about.

  1. Peeping through passthrough. The way it works is that you “draw” (in VR) a box on the floor. Within that box, you are immersed in 3D virtual reality. Near the edge, you see the box around you outlined as a grid. As you touch the edge, a hole appears… you can poke your head through, and you see the real world beyond, in black and white fuzzy passthrough. I found myself leaning out to have a chat or to grab a drink. Delightful.
  2. A Godzilla’s-eye view. Playing mini golf, I found a button that let me zoom out. Suddenly I was standing in the middle of this golf course arranged on a mountain, the mountain halfway up my chest. Walking just a foot or two, and bending down, and leaning close, I could examine bridges and trees and caves and courses. An incredible, examinable overview, in a way that would take multiple steps on any other device.
  3. Height, space, and scale. In the first room of Anne Frank’s house, there’s a steep staircase leading down, but it’s inaccessible from the tour. However I was able to kneel down, put my head through the bannisters, and peer over the edge, down the stairwell, and into the next room. I know this is what VR is all about, but the sense of being located continues to astound. What do we do now the gamut of interaction can include vertigo and awe? It’s like suddenly being given an extra colour.
  4. Objects that cross the threshold. When I pick up the real-life controls, they appear in VR space. Headset on, everything black and gone – except the controls in my hands are still there. And now they have extra green lights and details on them! Janus objects that face both ways into physical and fictive reality. The controls are real… but realer in VR. So many opportunities for play.

I can’t help but wonder about the non-game applications.

FOR EXAMPLE:

The Godzilla’s eye view of the golf course was 100% a better experience for getting an overview and examining detail than anything on a phone or a desktop. Imagine seeing a spreadsheet or a PowerPoint deck all at once, with all the interconnections overlaid in glowing arcs, and simply leaning closer to read the words or pick up a sheet to edit. It’s so much more immediate than working via windows and scrolling in a viewport. VR and mixed reality are tangibly better ways of dealing with large amounts of data, at macro and micro scales, and relating to it at your own pace.

That’s what I’m thinking about with these magical interaction moments: what if they were as fundamental to the future VR user interface as menus on a desktop, or scrolling a list on a phone?


As I said, I’m training my intuitions, so no conclusions yet. Mainstream VR (for apps, not just games) feels super close on the tech side, with a ton of work still to do on interaction design. The space is wide open. Exciting.
