Wednesday, September 20

The New iPhone Is a Clear Piece of Glass

Update: It’s two days after the keynote. I was wrong. And so was Robert (Facebook link). I still have no doubt that we’ll get to that vision within the next year, but Apple is playing this the Apple way, i.e. slowly, easing everyone into it in good time.



One year ago, in October 2016, Robert Scoble posted this: “Just had a source that I trust tell me it is a clear piece of glass.” As you can imagine, he was met with some disbelief and even ridicule. And while Robert tends to get excited easily, this time I believe he’s mostly right.

Now I know it’s not smart of me to come out with bold statements just a few days before the actual reveal of a product, especially as I don’t have any first-hand information. I might look like a fool on September 12. But it’s all quite logical if you think about it.

We can safely assume a few things:

1. The iPhone 8 (or whatever it ends up being called) will have two 3D sensors: one pointing forward, away from you, and one pointing at you.

2. It will also have dual front-facing cameras.

3. It will come with iOS 11 and thus with Apple ARKit.

4. The iPhone also comes with world-class inertial sensors that measure rotation and acceleration and help the iPhone know where it is in space and where it’s pointing (see the sketch after this list).

5. The display will cover most of the front of the phone.
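Just to make the sensor part concrete: a minimal Swift sketch, using the real CoreMotion API, of reading those rotation and acceleration values. The 60 Hz update interval and the print statements are placeholders of my own, not anything specific to the new iPhone.

```swift
import CoreMotion

// Minimal sketch: reading the inertial sensors through CoreMotion.
let motion = CMMotionManager()

func startMotionUpdates() {
    guard motion.isDeviceMotionAvailable else { return }
    motion.deviceMotionUpdateInterval = 1.0 / 60.0
    motion.startDeviceMotionUpdates(to: .main) { data, _ in
        guard let d = data else { return }
        // Attitude tells you which way the phone is pointing;
        // userAcceleration is how it's being moved (gravity already removed).
        print("roll/pitch/yaw:", d.attitude.roll, d.attitude.pitch, d.attitude.yaw)
        print("acceleration:", d.userAcceleration.x, d.userAcceleration.y, d.userAcceleration.z)
    }
}
```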

In short: the iPhone will know where it’s pointing, and it will also know where your eyes are (the eyes part is crucial). And its camera sees what’s in front of it.
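To make the eyes part concrete, here’s a hedged sketch of how the 3D sensor pointing at you could provide that information. ARKit’s face-tracking configuration is real API; the EyeTracker class and what gets done with the head position are purely my own illustration, not anything Apple has announced.

```swift
import ARKit
import simd

// Sketch: track the viewer's head with the front 3D sensor.
// ARFaceTrackingConfiguration and ARFaceAnchor are real ARKit API;
// the EyeTracker class itself is hypothetical.
final class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var headPosition = SIMD3<Float>(0, 0, 0)

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // The translation part of the face anchor's transform is a good enough
        // stand-in for "where your eyes are" for this thought experiment.
        let t = face.transform.columns.3
        headPosition = SIMD3<Float>(t.x, t.y, t.z)
    }
}
```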

My guess, just six days ahead of the big reveal (and most likely I’m not the only one coming to this conclusion): the new iPhone will bring an AR mode which will make it *seem* like it’s made of glass. With all the information it has, it can put on the screen what your eyes would see if the screen weren’t there, making it appear as if the device were transparent. It can then overlay information on that picture, so it’s like you’re looking through a slab of glass at AR content.
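For the rendering side, the established technique is an off-axis (generalized) perspective projection: place the frustum’s apex at the eye and make it pass exactly through the physical screen rectangle, and whatever you draw lines up with the world behind the phone. Here’s a Swift sketch of that math (after Kooima’s “generalized perspective projection”); the function names, the OpenGL-style matrix convention and the choice of inputs are mine, not Apple’s.

```swift
import simd

/// Standard asymmetric frustum matrix (OpenGL-style, column-major).
func frustum(_ l: Float, _ r: Float, _ b: Float, _ t: Float,
             _ n: Float, _ f: Float) -> simd_float4x4 {
    return simd_float4x4(columns: (
        SIMD4<Float>(2 * n / (r - l), 0, 0, 0),
        SIMD4<Float>(0, 2 * n / (t - b), 0, 0),
        SIMD4<Float>((r + l) / (r - l), (t + b) / (t - b), -(f + n) / (f - n), -1),
        SIMD4<Float>(0, 0, -2 * f * n / (f - n), 0)
    ))
}

/// View-projection matrix for an eye looking "through" a screen rectangle.
/// pa, pb, pc are the screen's lower-left, lower-right and upper-left corners
/// in world space; pe is the eye position tracked by the front sensor.
func throughScreenMatrix(pa: SIMD3<Float>, pb: SIMD3<Float>, pc: SIMD3<Float>,
                         pe: SIMD3<Float>, near: Float, far: Float) -> simd_float4x4 {
    // Orthonormal basis of the screen plane (vn points from the screen toward the eye).
    let vr = simd_normalize(pb - pa)
    let vu = simd_normalize(pc - pa)
    let vn = simd_normalize(simd_cross(vr, vu))

    // Vectors from the eye to the screen corners, and the eye-to-screen distance.
    let va = pa - pe, vb = pb - pe, vc = pc - pe
    let d = -simd_dot(va, vn)

    // Frustum extents projected onto the near plane.
    let l = simd_dot(vr, va) * near / d
    let r = simd_dot(vr, vb) * near / d
    let b = simd_dot(vu, va) * near / d
    let t = simd_dot(vu, vc) * near / d

    // Rotate the world into the screen's basis, then move the eye to the origin.
    let rot = simd_float4x4(columns: (
        SIMD4<Float>(vr.x, vu.x, vn.x, 0),
        SIMD4<Float>(vr.y, vu.y, vn.y, 0),
        SIMD4<Float>(vr.z, vu.z, vn.z, 0),
        SIMD4<Float>(0, 0, 0, 1)
    ))
    var trans = matrix_identity_float4x4
    trans.columns.3 = SIMD4<Float>(-pe.x, -pe.y, -pe.z, 1)

    return frustum(l, r, b, t, near, far) * rot * trans
}
```

Feed it the head position from the tracking sketch above and rebuild the matrix every frame, and the camera image plus any AR overlays stay registered with the real world behind the screen.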

It’s a glass simulation. Remember the brushed-metal knobs on the music player whose lighting would change depending on how you held the phone, using the rotation sensor? That was a simulation that felt a little bit like magic. Remember when Apple introduced the API for depth layering, visible on the lock screen when you tilt the phone? That, too, was a simulation that felt like magic, and it was especially convincing when demoed through a 2D camera. It falls slightly short when looked at with your own eyes because of your stereo vision, but it demoed really, really well on stage.

Now we’ll get this new simulation that is way more than eye candy. It’s a logical step towards glasses: it makes the phone disappear even more than before and melt into the real world around it. You will just see the overlays hanging in mid-air. And it will be like magic.

Its only tiny shortcoming: you’re looking at a 2D surface with two eyes, so each eye gets the same image instead of the slightly different views it would get through real glass, and the expected stereo depth won’t be there. But it will demo really well on stage when seen through a 2D camera. And it will be close enough for real-world use. The only way to perfect it would be a 3D display, but I don’t think it’ll have that, or we would have seen supply-chain leaks by now.

So Robert pretty much got that one right: “it is a clear piece of glass.”

Let’s see on September 12 if his information was correct or if I will look like a fool.