Let me say it right at the start: in 10 years we won’t need screens anymore. Actually, let me revise that. It’s likely going to happen much faster than that.
Over the last two weeks, I’ve spent some time exploring what’s going on in the VR and AR scene, and I’m simply blown away.
Some quick definitions:
VR is Virtual Reality: you put on a headset that obscures everything else and gives you a fully 3D experience. Think 3D entertainment, 3D environments.
AR is Augmented Reality: “holograms” (for lack of a better word) are overlaid on the real world, so you still see everything around you, but with augmentations. Imagine being able to see your Gmail on a virtual computer screen hovering in space right in front of you. Now imagine you can put that virtual screen on the desk in front of you, and it will stay there when you move your head.
I had the pleasure of getting a demo of some really cool VR gear in Silicon Valley last year. Not only was it amazing to realize that they have the tracking down (no discernible lag when moving your head), but the screen resolution is also getting to the point where you don’t see pixels. Retina, anyone? The field of view has become substantially better too, and the total experience was really amazing — to the point where I have no doubt this will find plenty of mainstream applications.
Add in the increasingly ubiquitous small 360° cameras coming on the market, and it’s clear where this is going.
But what I’m REALLY excited about is where AR is going. Not only will it show you virtual things combined with the real world, it will let you manipulate what you see. 14 years ago, Minority Report gave us an idea (YouTube) of where things might be going. Tony Stark’s workshop (YouTube) in Iron Man updates this vision for today, and we actually aren’t too far away from it becoming a reality.
Why am I so convinced?
Because companies like Meta have many of those puzzle pieces figured out already. Their wearable hardware can map its surroundings in 3D. It can track gestures and 10 or more fingers in space to a precision of 2 millimeters or better. 2 milli-effing-meters — that’s 0.08 inches.
At companies like Caterpillar, Augmented Reality is already a reality (YouTube). When a service technician does a repair, they can overlay instructions right on the machine.
Meta and other companies demonstrated their latest VR/AR devices at TED in Vancouver just a few days ago. Meta’s press embargo will lift on Wednesday, March 2nd, but just today they gave a tour of their lab to Robert Scoble on live video. Here is an article (LinkedIn) based on that video. The video he writes about is here (Facebook). Robert Scoble’s take in the video is this: “One thing I realized very quickly: we’re soon not gonna buy pieces of glass anymore. We’re not gonna buy MacBooks. We’re not gonna buy TVs. […] With these kind of glasses on […] we’re gonna have virtual screens in front of us.”
Yes, these glasses are still bigger than I’d like them to be, but the tech is shrinking almost by the minute. Look at what smartphones could do in 2007 and what they can do now, 9 years later. I’m in awe of how far Meta has come in the 7 years they have been working on this. And even though these glasses won’t be released as consumer products just yet, they are getting ready for a whole slew of companies (Meta alone already has 1,000 enterprise customers!) to start building on top of the new hardware revealed at TED (and that they will reveal to the rest of the world on March 2nd).
Let me say it again: in 10 years we won’t need screens anymore. The personal computer has been around for about 30 years, the Internet for about 20, mobile computing for about 10. These paradigm shifts are speeding up. Far less than 10 years from now, we’ll use AR in our daily lives the same way we pull our smartphones out of our pockets today.
The possibilities are amazing. And a bit scary.