Does Augmented reality need AR glasses to reach the masses?
Rumors about Apple’s AR glasses have been around for years. Is 2023 finally THE year of Augmented reality? Or have we been looking at the wrong suspects?
Despite its promises, AR has been underwhelming. It has found a few niches in social media (for marketing), remote real-time assistance (for maintenance and repair) and gaming (Pokémon Go, where AR is just one feature), but we don’t see daily, recurrent, ubiquitous usage of AR. AR glasses were supposed to take over the phone market and become the main computing platform. We’re definitely not there.
Who is to blame?
The usual suspects are:
- Hardware
- Software
- Content & use-cases
Hardware
Before talking about Apple’s glasses, let’s quickly see what’s already on the market: Microsoft has the HoloLens, Google has Google Glass, Meta added a see-through mode to its Oculus Quest VR headset and Snap offers its Spectacles. Then there are hundreds of smaller makers like Magic Leap, Nreal, MadGaze, Vuzix, Lumus, RayNeo, Dispelix, etc. This is not a complete overview of the market, just a way of saying that there are already plenty of options to buy. It’s fair to say that at this stage, we’re far from having a clear winner.
Apple’s past product successes need no introduction; everyone knows about hits like the Macintosh, the iPod or the iPhone. But the firm now faces a dilemma, as the laws of physics put a hard floor on how small today’s hardware can get. For its future flagship product, iGlass or whatever they’ll call it, Apple will have to choose between:
- Bulky, heavy, not-very-sexy glasses like the HoloLens or Magic Leap. This option lets Apple pack in all the sensors, chips, batteries and powerful displays required to create fully immersive experiences that are spatial and, if possible, context-aware.
- Thin, light and stylish glasses like the Ray-Ban Stories or Spectacles. This option will surely convince more people to wear them, but they won’t be able to do much, certainly not as much as mobile phones allow today. To increase computing capabilities, they could be tethered to an iPhone, but who wants a cable dangling around…
But you just can’t have it all. Even if, hypothetically, microchips shrank and magical new batteries were invented, those AR glasses would still be missing two pieces: software and content.
Software
Several layers of software come into play to create true Augmented reality. It starts with the device’s operating system and ends with the last layer, usually the website or native app that the AR experience runs on. On the OS side, two main frameworks are used: ARKit for iOS and ARCore for Android. While most AR glasses run on Android, roughly speaking ARKit and ARCore offer the same functionality on mobile phones: SLAM, plane detection, occlusion, face tracking, motion capture, light estimation, etc. These features let a phone camera render a 3D object so that it appears to really be there in the real world. So far so good.
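To make that concrete, here is a minimal ARKit sketch in Swift (the class and function names are ours, for illustration only) showing how a mobile app typically switches on those building blocks: world tracking, plane detection, people occlusion and light estimation.

```swift
import ARKit

// Minimal ARKit setup: world tracking (SLAM), plane detection,
// people occlusion and light estimation.
final class ARSetup {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Detect horizontal and vertical surfaces so virtual objects can be anchored to them.
        config.planeDetection = [.horizontal, .vertical]
        // Let real people occlude virtual content, if the device supports it.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }
        // Estimate real-world lighting so rendered objects blend in.
        config.isLightEstimationEnabled = true
        session.run(config)
    }
}
```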
Aside from ARKit and ARCore, there are plenty of niche SDKs, usually built on top of the native ones and focusing on areas not covered by Apple and Google: Vuforia is probably the best for image and object tracking, AR Foundation is the easiest tool for Unity developers, and then you have Wikitude, Kudan, Zappar, EasyAR, Eon, Echo3D, Layar, 8th Wall, Lightship, Augment, Vossle, ARGear, Blippar, etc.
They can all play a role depending on the use case, but fundamentally none of them has reached sufficient mass to create an infrastructure that makes AR accessible to anyone, anywhere. That’s because none of those SDKs focuses on AR persistence, which requires ultra-accurate positioning IN THE REAL WORLD to work.
If AR content remains ephemeral and short-lived, it doesn’t matter whether users access it from a mobile phone, AR glasses or any other device: it will last for one session and then… gone. If AR is to become part of our daily lives, software needs to provide that hyper-accurate positioning by default on every device. In other words, the real world needs to be readable by machines.
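To illustrate how partial today’s answers are, here is a short Swift sketch (function names are ours) of the two persistence primitives ARKit does expose: ARWorldMap, which only restores anchors on the same device in the same place, and ARGeoAnchor, which only works where Apple provides localization data.

```swift
import ARKit
import CoreLocation

// 1) ARWorldMap: save the current session's map so anchors can be restored
//    later, but only on the same device and in the same physical place.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

// 2) ARGeoAnchor: pin content to real-world coordinates, but it requires a
//    session running ARGeoTrackingConfiguration and is only available in
//    cities covered by Apple's Location Anchors data.
func addGeoAnchor(to session: ARSession, latitude: Double, longitude: Double) {
    let anchor = ARGeoAnchor(coordinate: CLLocationCoordinate2D(latitude: latitude,
                                                                longitude: longitude))
    session.add(anchor: anchor)
}
```

Neither primitive delivers ubiquitous, hyper-accurate positioning on every device, which is exactly the gap described above.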
Content
When it comes to Augmented reality, we’re in the realm of 3D. Many developers and designers have tried to put 2D content (text, photos, videos) in a 3D space but… what’s the point? There are brilliant 3D creators out there, but the hard truth is that they represent a small minority. Today, even if those AR glasses were ready, small AND powerful enough to render graphics, they would literally show us… our real world. Nothing more than our actual reality!
Of course, here and there, usually at the most iconic tourist spots in big cities, you’ll find a bit of content. At best, you’ll be able to see the size of extinct dinosaurs.
Content is lacking because only a tiny fraction of the global population knows how to create a 3D model.
But there are two pieces of good news!
- UGC. Facebook, Instagram and YouTube don’t produce content; they just give anyone the tools to do so. ARE4, along with other no-code platforms, now offers simple 3D creation tools. When creating AR/3D content becomes as easy as tweeting or sharing a photo on Instagram, we could witness an explosion of 3D content.
- GenAI. Speaking of an explosion, there is one already being generated by AI models. Text, photos, videos… 3D is next! So ask yourself: where will all those 3D models go? A picture of a 3D model posted on a 2D social network is a tragedy. Aside from video games, we believe there are better ways to anchor those three-dimensional creations to the real world.
Use-cases
AR can increase productivity and reduce costs in construction, building management, urban planning, public works, medicine, repair and more: many industries where utility prevails.
Yet we believe AR is best used in entertainment. It’s an immersive medium that could unveil its true potential soon. Eventually, users will decide on the best use cases; we simply foresee a real world overlaid with UGC where it makes sense to people: in their favorite places.
Conclusion
We’re only in April, but 2023 is already, undoubtedly, the year of GenAI. And that’s excellent news for Augmented reality! It will help solve the content issue by turning more end users into creators. Software is almost there: several building blocks are being laid, and it’s just a matter of time before they all come together. Hardware follows its own path, so expect years, if not a decade, before AR glasses replace mobile phones.
While Stardust is focusing on the software infrastructure, ARE4 is betting on that missing content layer.
As we’ve seen, we don’t need fancy AR glasses to unlock the true potential of AR: if content blends into your favorite places, Augmented reality can become part of your daily life.
Follow Neogoma to learn more about Augmented reality. If you’re a developer, check out the Stardust SDK. If you’re a creator, a business or simply an early adopter, you might want to give ARE4 a try. See you in the best of both physical and digital worlds!