Lance Nanek took it upon himself to play with Google Glass in debug mode to figure out what the device is capable of doing. Turns out the answer is a lot. Nanek discovered that Glass comes with 13 sensors, including ones for orientation and magnetic fields. He thinks that if developers really wanted to, they could create augmented reality apps. One example Nanek discusses is a location-based to-do list: say he's looking at the shed in his backyard, Glass could remind him that he needs to paint it.
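Glass runs Android under the hood, so a developer poking around in debug mode could enumerate those sensors with the standard Android SensorManager API. Here's a minimal sketch of what that might look like, assuming Glass exposes the usual Android sensor framework; the activity name and log tag are placeholders, not anything from Nanek's write-up:

```java
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

import java.util.List;

// Hypothetical activity for illustration: logs every sensor the device
// reports through the standard Android sensor framework.
public class SensorDumpActivity extends Activity {
    private static final String TAG = "GlassSensors"; // placeholder log tag

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        SensorManager sensorManager =
                (SensorManager) getSystemService(Context.SENSOR_SERVICE);

        // TYPE_ALL returns every sensor the hardware exposes -- orientation,
        // magnetic field, accelerometer, and so on.
        List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);

        Log.i(TAG, "Found " + sensors.size() + " sensors");
        for (Sensor sensor : sensors) {
            Log.i(TAG, sensor.getName() + " (type " + sensor.getType() + ")");
        }
    }
}
```

Reading the orientation and magnetic field sensors together is what would let an app work out what you're looking at, which is the building block for the kind of location-aware reminders Nanek describes.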
Now, in my headline I suggest this isn't going to come any time soon. Why? Because just about every review of Google Glass I've read says that using the camera cuts the battery life down to less than three hours. That'll obviously improve with time, but if you want full-blown, always-on augmented reality, that's going to take at least a hardware generation or two. Assuming Glass gets updated on the same sort of schedule as mobile phones, that means it'll be at least two years before the WOW version of Glass lands.
[Via: Engadget]