Apple Glass, the Cupertino-based tech giant's long-rumored AR spectacles, may be able to transform any surface into a virtual control panel, according to a patent granted to Apple.
The patent was filed four years ago, AppleInsider reported, but it was only granted this week.
- Apple Glass: Release date, design, features, price and more
- Apple VR headset: Rumors, release date, price, specs and what we want
- Apple Glass enters trial production -- Here's what that means
How Apple Glass will help you interact with your environment
Apple Glass, according to the patent, will superimpose virtual control panels — filled with interactive buttons — on real-world surfaces. For example, Apple Glass may be able to project home screen icons onto your wooden table, letting you tap the blue App Store button with your finger to prompt Apple Glass to open the tech giant's digital storefront.
The patent notes that as head-mounted AR devices shrink and move closer to the user's eye, fitting a touchscreen interface onto the physical display becomes damn-near impossible. A more feasible solution, it says, is to implement something called "occlusion-based interaction methods," or OCI methods.
"OCI method" is just nerd talk for being able to interact with a virtual display that is projected from a camera. The patent, though, admits that the conundrum with OCI methodology is making sure the AR device can "read" its user's hand gestures, but the inventors suggested the following solution to make Apple Glass responsive to human interaction.
"Physically equip the object or the human body (e.g. fingertip) with a sensor capable of sensing touch. This could be anything from a simple mechanical switch to a touch-pad or touch screen. It could, for example, also be based on electrical voltage applied to the body and closing a circuit when touching a real object. The limitation of such kinds of approaches is that they require modifications of the object or the human body."
The patent also suggests a time-of-flight camera — which measures depth by timing how long emitted light takes to bounce back from objects — to track users' finger gestures so that Apple Glass can "read" them accurately.
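The patent doesn't publish its algorithm, but the depth-based idea can be sketched roughly: if the headset knows how far away the surface is, a "press" can be registered whenever a tracked fingertip's depth closes to within a small threshold of that surface. The function name and threshold below are assumptions for illustration only.

```python
def detect_touch(fingertip_depth_mm, surface_depth_mm, threshold_mm=5.0):
    """Hypothetical sketch: report a touch when the fingertip (as measured
    by a time-of-flight depth camera) is within threshold_mm of the surface.
    Both depths are distances from the camera, in millimeters."""
    gap = surface_depth_mm - fingertip_depth_mm
    # The fingertip must be at or just above the surface, not behind it.
    return 0.0 <= gap <= threshold_mm


# Example: a fingertip 995 mm away hovering over a table 1000 mm away
# counts as a touch; one 900 mm away is still hovering.
print(detect_touch(995.0, 1000.0))
print(detect_touch(900.0, 1000.0))
```

A real implementation would also need hand segmentation and noise filtering, but the core test is this simple depth comparison.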
As mentioned, Apple Glass is designed to map its virtual buttons onto surfaces, and while the inventors ran into some difficulty figuring out how Apple Glass could "read" human touches on buttons superimposed onto a surface, they came up with a fascinating solution: heat.
"If two objects at different temperatures touch, the area where they touch will change their temperature and then slowly converge back to the initial temperature as before the touch. Therefore, for pixels corresponding to a point in the environment where a touch recently occurred, there reveals a slow but clearly measurable decrease or increase in temperature," the patent said.
In other words, Apple Glass may be able to determine which buttons in the virtual realm were touched by simply detecting temperature changes in the environment. Keep in mind, though, that this is just a patent. Its fascinating ideas may not materialize into the final production model of Apple Glass.
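The temperature idea the patent describes can be sketched in a few lines: compare a current thermal image against a baseline captured before the touch, and flag pixels whose temperature has shifted. The function and threshold below are illustrative assumptions, not Apple's implementation.

```python
def find_touch_pixels(baseline, current, delta_c=0.5):
    """Hypothetical sketch of heat-based touch detection: return (x, y)
    coordinates of thermal-camera pixels whose temperature differs from
    the pre-touch baseline by more than delta_c degrees Celsius --
    the warm (or cool) residue a fingertip leaves on a surface."""
    touched = []
    for y, (base_row, cur_row) in enumerate(zip(baseline, current)):
        for x, (b, c) in enumerate(zip(base_row, cur_row)):
            if abs(c - b) > delta_c:
                touched.append((x, y))
    return touched


# Example: a 3x3 table surface at a uniform 20 C, with one pixel warmed
# to 21.2 C by a recent fingertip press.
baseline = [[20.0, 20.0, 20.0]] * 3
current = [[20.0, 20.0, 20.0],
           [20.0, 21.2, 20.0],
           [20.0, 20.0, 20.0]]
print(find_touch_pixels(baseline, current))
```

Because the residue decays slowly, as the patent notes, the device would get a short window after each tap in which the touched spot is still measurably different from its surroundings.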
To stay abreast of Apple Glass news, check our oft-updated Apple Glass rumor hub.