Heard about the optician who got a job at the Apple Store? Not yet, but it looks like you might hear something along those lines as the latest leak claims the company’s long-expected AR spectacles will support prescription lenses.
One of the bigger problems with the idea of AR glasses has always been that in order to use them, one must be able to look through them safely.
That’s more difficult than it sounds, given that around half the world’s population probably uses some form of vision correction; you don’t want people suffering accidents just because they were wearing their digital specs.
To get a sense of the numbers, the Vision Council claims approximately 164 million U.S. adults wear eyeglasses. This has been a problem for device makers, who have either sold one-size-fits-all glasses that don’t actually work for half the people, or had to think about creating overlays you wear with existing glasses.
Neither alternative seems to have made any sense to Apple. The latest set of claims from Jon Prosser suggests the company has decided to offer its AR glasses as a basic frame, with prescription lenses costing more.
Apple is looking to charge around $499 for the frames, Prosser posits. It’s not especially surprising that this price puts the company’s offering at the high end of the frames industry.
The majority of consumers (50.9%) pay between $100 and $150 for frames, says the Vision Council, though Apple’s frames will be connected systems that work in conjunction with your iPhone to provide AR experiences throughout your daily life.
It is worth thinking about whether Apple will work with the 43,000 existing U.S. opticians to bring its product to market – or will begin offering its own eye examinations in store.
We know digital health is an important sector for Apple, one in which it believes a combination of advanced sensor technology and smart software can make a difference.
We also know that behavior modification can be an important adjunct to public health response – whether that’s staying indoors to prevent accidental contagion or choosing to walk a little more in order to close the rings in the Activity app.
Worldwide, 650 million adults are obese, according to 2016 WHO data, and it’s possible that technology may help create the environment required to get people to manage this aspect of health more effectively. Apple CEO Tim Cook told shareholders earlier this year, “The big idea is to empower people to own their health.”
That same logic may apply to ophthalmology. Think of it this way: if your glasses are already calibrated to your eyes, then they may also become smart enough to monitor what your eyes do.
Are your eyes showing reactions consistent with further erosion in sight? Then perhaps your spectacles will let you know.
Similarly, these systems may eventually become smart enough to detect things like early-onset diabetes. I don’t expect that in Apple Glass v.1, of course (unless the sensors already exist) – but the opportunity to develop technologies that transform the monitoring of eye conditions seems too sweet a spot to miss.
Apple will also be looking to make its Glass product into a new platform.
That's going to mean developer tools, APIs and functionality that seems, at least at first, likely to be defined by augmented information about where you are and where you are going.
Let’s examine how this might work. Think about how Apple Glass may provide people with very limited vision with contextualized information about the places they happen to be.
Used with the built-in LiDAR camera, these systems could provide turn-by-turn walking instructions that may give people with vision problems more independence than before.
I imagine it might work like this:
“Turn left in five, four, three paces and move to the right of the sidewalk. A person is walking toward you, they are wearing a blue mask and matching shoes while talking on their phone. They will pass by you in three, two, one. The sidewalk is now clear and you will reach the coffee shop in five, three, one pace. You have arrived. The door is on your right. Move your hand six inches to the left to find the door handle.”
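To make the idea concrete, a pace-counted cue like the one imagined above could in principle be assembled from a simple navigation step. The sketch below is purely illustrative – every name in it is invented for this article, and none of it reflects actual Apple software or APIs:

```python
# Hypothetical sketch: turning a navigation step into a pace-counted spoken
# cue like the one imagined above. All names here are invented for
# illustration; nothing below is an Apple API.
from dataclasses import dataclass

WORDS = ["one", "two", "three", "four", "five",
         "six", "seven", "eight", "nine", "ten"]

@dataclass
class NavigationStep:
    paces: int        # distance to the turn, in paces
    direction: str    # e.g. "left" or "right"
    landmark: str     # e.g. "the coffee shop"

def guidance_cue(step: NavigationStep) -> str:
    """Build a countdown cue such as 'Turn left in three, two, one paces...'."""
    countdown = ", ".join(WORDS[i - 1] for i in range(step.paces, 0, -1))
    return (f"Turn {step.direction} in {countdown} paces. "
            f"You will reach {step.landmark}.")

print(guidance_cue(NavigationStep(3, "left", "the coffee shop")))
# → Turn left in three, two, one paces. You will reach the coffee shop.
```

In a real product the pace counts and obstacle descriptions would of course come from live sensor data (such as the LiDAR depth map), with the string handed to a speech synthesizer rather than printed.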
If Apple chooses to move in that direction, it could utterly transform people’s perception of the world around them. Imagine wearing glasses that would read you your book, or translate one.
A similar logic means these devices could help deliver augmented education experiences, technical product support and field service advice within highly personalized one-to-one experiences.
Apple’s Glasses may be decidedly exciting in terms of AR experiences, but their most exciting potential may turn out to be that they deliver a revolution in accessibility for many – and usher in the voice-first experiences people are beginning to talk about.
We’ll see, of course, but Prosser seems to think we may find out more concerning Apple’s AR glasses plans later this year or early in 2021.
SOURCE: Jonny Evans