Quick Thoughts: The Long Road to AR Glasses
Paul Sagawa / Tejas Raut Dessai
203.901.1633 / .901.1634
psagawa@ / firstname.lastname@example.org
SEE LAST PAGE OF THIS REPORT FOR IMPORTANT DISCLOSURES
November 21, 2019
- An analyst reported that AAPL is hoping to launch AR goggles in 2020, with mass-market appropriate glasses to follow in 2023. We believe that this timeline is VERY optimistic.
- Current tech demands severe compromise amongst form factor, performance and cost, with the display as the gating factor. Existing products are either high-performance, expensive and bulky (e.g. MSFT HoloLens and Magic Leap), or compact and mid-priced but limited (e.g. GOOGL Glass).
- AAPL has acquired expertise/technology in AR and has filed AR-related patents. It has a 1,000-person AR team that was recently reorganized. It has pushed ARKit for smartphone apps with simple AR. CEO Cook is enthusiastic about AR goggles/glasses but cautious about the technical hurdles.
- AAPL is reputedly working on both high-end goggles with a dedicated processing unit (HoloLens style) and relatively simple glasses that extend an iPhone’s display (Glass style). Both face substantial technical challenges to improve upon rival products and major barriers to adoption.
- AAPL works on many ideas that are never released as products. For years, analysts touted its entry into TVs, self-driving electric cars, and other sexy categories. We would not expect AAPL to release a product unless it was very confident in the product and its likely reception.
In the first half of this decade, the Apple blogosphere was obsessed with the possibility of an Apple TV set. Walter Isaacson’s 2011 biography of founder Steve Jobs cryptically reported that Jobs felt he had “cracked” the problem with TV before his untimely death. Reports of a skunk works development team surfaced, and some on the Apple beat reported seeing a prototype TV built by an Apple supplier or in the company’s own labs. Analysts began pestering management on every quarterly call about the impending launch of the Apple TV, and some began including sales and profits for the product in their future models. Ultimately, the ferocious competition amongst TV makers made it very unlikely that the company could make Apple-like margins on such a product, and the company nixed the project – if it had ever been on track to release one at all.
A similar story followed Apple’s interest in the automotive market. Stories circulated of “Team Titan”, a hush-hush development effort run from a nondescript building well off the main campus. Supposedly, Apple was reconceiving the car from the bottom up, focusing on electric power, self-driving smarts and a reimagined ergonomic experience. Hires from Tesla and other automakers supported the conjecture, self-driving test vehicles showed up in Cupertino, and once again analysts started adding line items for automobiles in their long-term models. This too seems to have been a dead end.
The new obsession is augmented reality. Here, Apple already does have a commercial product in the market. ARKit is a software API that allows applications to tap distance reckoning tech in an iPhone to overlay digital objects into the field of view captured by the device’s camera. Despite hype and a copycat platform from Google for Android, developers have been flummoxed by phone-based AR. Most Pokémon Go players turn off the much-touted AR feature because it devours precious battery power and adds nothing to the game play (Exhibit 1). Few people have been using the ARKit capability built into Ikea’s app to see how the Bjork shelving unit might look in their living room. No one is walking around with their phone in front of their faces to view turn-by-turn walking directions displayed right on a camera captured image of the scenery in front of them. Phone-based AR is a technology looking for a use case.
Augmented Reality stans are not giving up. The key will be special glasses that can project a digital image into the wearer’s field of view without requiring them to stare into their smartphone screen. Unfortunately, the technology to make this happen is far from ready for the mass market. Right now, there are two techniques for projecting those images. The first requires the user to wear goggles that provide stereo video screens of the outside world – sort of like the backup camera in a car that replaces the rearview mirror for careful parking. The computer system then places the digital image into the video feed. This approach, favored by Facebook’s Oculus and some others, has the advantage that it works well, but it has the downside of looking absolutely ridiculous on top of being bulky and expensive. If people were hazed for wearing Google Glass, imagine the reaction to someone walking the street wearing a boxy mask about the size and shape of a milk carton.
Exh 1: Phone-based AR is resource-draining – Avg. Component Power Draw Breakdown for Pokémon Go
Exh 2: Overview of Waveguide Technology used in Augmented Reality Glasses
The second technique involves using a technology called waveguides to embed a projection system directly into transparent glass (Exhibit 2). The user looks through a pair of lenses, and the system can show images tied to the world around it. Sounds good, but waveguide tech has a long way to go. The system needs a separate layer of glass for each of the three hues needed for full-color projection, plus 3D images demand separate strata for near, mid-range and far-away projections. Nine layers of glass and the associated display drivers for each layer are bulky by definition and, at this early juncture, very expensive as well. Microsoft’s HoloLens and the Magic Leap One both use this approach, and each looks like a set of ski goggles, requires an adjunct processor wired to the unit, and costs thousands of dollars.
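The layer arithmetic above is worth making explicit, since it drives the bulk and cost problem. A minimal sketch (the function name and parameters are illustrative, not from any real display toolkit): one waveguide layer per color channel, multiplied by one stack per focal plane.

```python
# Illustrative sketch of the waveguide layer count described above.
# Names here are hypothetical, not from any real display API.

def waveguide_layers(color_channels: int = 3, focal_planes: int = 3) -> int:
    """Each color channel (R, G, B) needs its own waveguide layer,
    and each focal plane (near / mid / far) multiplies the stack again."""
    return color_channels * focal_planes

# Full-color projection at three focal depths:
print(waveguide_layers())  # 9 layers of glass, each needing its own driver
```

Each layer adds thickness, weight, and a display driver, which is why the stack resists the compact form factor that consumer glasses demand.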
Google Glass is cheaper and more compact but gets there by punting on true AR – Glass simply projects a flat image to the corner of the wearer’s field of view, allowing them to read information as on a smartphone display. Even this approach seems to require too much tech to be hidden from view, and the units still cost a thousand dollars. It has not found a wide market, and Google has pulled back to focus on industrial applications, as has Microsoft with its more expensive HoloLens product.
So basically, AR glasses are 1. bulky and unattractive, 2. super expensive, and 3. lacking in true AR functionality – pick at least two (Exhibits 3, 4). Development progress toward resolving these issues has been glacial – the gating factor is hardware display tech, not chips or software, where Moore’s Law or another predictable development trajectory can be confidently applied. Apple has bought AR companies and hired in a lot of expertise, but it is not clear that it has had any sort of breakthrough that might resolve this three-way trade-off. Someday we will get there, but it certainly won’t be in 2020 and probably not in 2023 either. Apple does not release products before they are ready for mass consumption. Google, Amazon, Facebook and Snap may build prototypes and see if they can generate interest, but Apple doesn’t work that way.
Exh 3: Technical limitations and poor UX are the primary hurdles to AR innovation – will be solved leveraging Cloud and 5G
Exh 4: Summary of major hurdles for mass-market adoption of AR hardware
Exh 5: Summary of AR products rumored in Apple’s pipeline
Reportedly, Apple is working on two different projects (Exhibit 5). The first would be AR goggles with an adjunct processor, likely intended for gaming. This would put it head to head with Facebook, which continues to struggle to gain traction with its Oculus products. Such a product could be built for 2020 but would likely cost well over a thousand dollars, with limited applications and a very limited end market. It’s hard to see a system pitched at high-end gaming fitting with Apple’s generally coherent brand and product strategy. The other project is a set of AR glasses intended to rely on an iPhone for most of its processing, with a form factor largely consistent with normal sunglasses. Here the ugly realities of AR display technology intrude. Nine layers of waveguides and the associated drivers contained in the glasses? They won’t look like normal glasses – even relying on the iPhone, they will carry a cigarette-pack-sized bundle of hardware and batteries, and the lenses will be Coke-bottle thick. Bluetooth? Probably not good enough, so a higher-bandwidth solution would have to be included in every iPhone.
Perhaps there is another way. Scientists have experimented with systems to project digital images directly into the eye and onto the retina. However, such a product would likely be deemed a medical device and face extensive testing for regulatory approval. Rumors that prototype testing has damaged the retinas of test subjects are significant cause for concern. Perhaps Apple has discovered a new way to deliver 3D digital images interpolated into a person’s field of view that doesn’t require nine layers of glass or a laser shooting into the eye. Perhaps, but it doesn’t seem like a very good bet.
Apple could go the Google Glass route and offer very limited AR functionality – a single 2D plane that can project images only rudimentarily related to the outside surroundings. This could be a practical product, but even then, there would be significant social pressure around a product that lets the wearer look up info and take pictures without obvious notice. That battle will take a while to play out, and Apple would seem unlikely to be the one to fight it. For a company that has tied itself tightly to the idea of privacy, smart glasses that could be used to ID strangers on the fly and to film surreptitious video would seem distinctly off brand.
Exh 6: Summary of broad approach and positioning by Apple and Google in AR
Technical and social acceptance issues aside, augmented reality goggles or glasses will need software. To deliver a product in 2020, Apple would need to get a LOT of developers on board ASAP to build applications (Exhibit 6). That we are not yet hearing about such an effort suggests either that the company has innovated in the field of contract law to build a truly ironclad NDA or that plans are still in the experimental phase. It is possible that a 2020 “launch” will just be an API, like ARKit, to enable developers to get to work. Odds are we will hear a lot more detail about Apple’s AR device strategy long before a pair of glasses hits the Apple Store.
Exh 7: Key Elements Necessary for Seamless AR Wearable Experience
A final note. 5G is coming. Most companies contemplating augmented reality plan to use the fast, low latency wireless standard to drive applications directly from the cloud, where massive hyperscale datacenters can store huge amounts of data and GPUs can render highly detailed graphics. Device centered Apple doesn’t think this way. It will grapple with storing and running AR apps directly on an iPhone and then using some new sort of local connectivity (Wi-Fi?) to communicate to the glasses. This approach would limit the sophistication of AR applications relative to cloud-delivered alternatives, presuming 5G lives up to its performance promises. That too is an if, but one in which we feel a bit more confidence.
So, Terminator-style AR glasses that instantly display salient information about the world around you, or that allow a group of friends to battle realistic-looking aliens together in a public park without wearing $2,000 ski masks, are still at least five years away and probably longer. If they are coming earlier than that, we will hear about it long before some contract manufacturer starts tooling up to build Apple Glasses. Our advice: Don’t add a new line item to your Apple models just yet.