
Apple isn’t saying too much, but AR is everywhere at WWDC

opinion

Jun 08, 2022 | 5 mins

Apple | Augmented Reality

Though Apple didn't say anything this week about the augmented reality glasses most people assume it's working on, AR is in the air at WWDC.

[Image: Apple WWDC22 developer sessions. Credit: Apple]

While they are virtual, the elements — earth, air, fire, and water — are as important in unreality as in reality. Those virtual elements must themselves be augmented with technologies to replace ordinary sense perception: spatial positioning, object detection, distance perception and more. Combine all of this and you have an operating system to emulate reality in unreality.

Apple is building it.

AR is everywhere at WWDC22

WWDC 2022 didn’t see Apple make any mention of the augmented reality (AR) glasses we all think it is working on.

And while AR enjoyed a couple of mentions during the company’s public-facing keynote on Monday, it didn’t really evangelize the tech. That’s quite unlike previous years; the company has usually had something to offer on the topic ever since CEO Tim Cook said, six years ago: “We are high on AR for the long run, we think there’s great things for customers and a great commercial opportunity.”

That it didn’t say much this year is conspicuous.

While the company may not have said much on stage, the developer sessions taking place during the show tell a different story. They appear to show overt and covert instances of philosophies and technologies to support AR at almost every turn.

Even the capacity to support multiple windows in SwiftUI apps may have significance, as the company moves toward new ways to interact with data. That amazing wide-angle-camera-driven Desk View tool to simultaneously show a user’s face and an overhead view of their desk could so easily be a tip of the proverbial hat toward new usability modes.

Do your fingers need a real keyboard when you can have a virtual one? When will we wear our Macs like sunglasses?

[Also read: Apple calls out Meta for hypocrisy]

What Apple is talking about, really

Apple’s WWDC developer sessions are festooned with features to promote, enable, or suggest how it is at an advanced stage of preparing the ground for augmentation across its platforms. Just glance at the sessions calendar and you’ll find relevant sessions, such as:

  • Machine learning in Metal to create ever more realistic gaming experiences.
  • MetalFX, a powerful API to enable high performance and high quality graphics effects.
  • ARKit 6, with which developers can create AR experiences rendered in 4K HDR for even more photorealistic scenes.
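That ARKit 6 item, at least, maps to concrete API. Here is a minimal, hedged sketch of opting into 4K HDR capture, assuming iOS 16 and a device whose camera offers the 4K format; `configure4KSession` is an illustrative helper name of my own, not an Apple API:

```swift
import ARKit

// Sketch: opting into ARKit 6's 4K HDR video capture (iOS 16+).
// Assumes the caller owns an ARSession, e.g. from an ARView or ARSCNView.
func configure4KSession(for session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // New in ARKit 6: ask for the recommended 4K video format,
    // where the device hardware supports one.
    if let format = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = format
    }

    // HDR capture is also new in ARKit 6; check support before enabling.
    if configuration.videoFormat.isVideoHDRSupported {
        configuration.videoHDRAllowed = true
    }

    session.run(configuration)
}
```

Because both the 4K format and HDR support vary by device, the guards above matter: on unsupported hardware the session simply falls back to the default video format.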

Helping computers understand where they are and what they can see is also an essential component of building AR. RoomPlan shows how Apple is building tech for this, even as the widening range of categories understood by its powerful Visual Look Up tool shows a growing understanding of surrounding environments. Live Text in video makes every word you can catch on camera actionable across your apps.
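RoomPlan’s developer-facing side is similarly approachable. A minimal sketch of a capture session, assuming iOS 16 on a LiDAR-equipped device (`RoomScanner` is an illustrative name of my own):

```swift
import RoomPlan

// Sketch: a minimal RoomPlan capture session (iOS 16+, LiDAR device required).
final class RoomScanner: RoomCaptureSessionDelegate {
    let session = RoomCaptureSession()

    func start() {
        session.delegate = self
        session.run(configuration: RoomCaptureSession.Configuration())
    }

    func stop() {
        session.stop()
    }

    // Called repeatedly as RoomPlan refines its parametric model of the room:
    // walls, doors, windows, openings, and recognized objects.
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        print("Walls so far: \(room.walls.count), objects: \(room.objects.count)")
    }
}
```

The interesting part is what arrives in the delegate callback: not a raw point cloud but a structured, semantic description of the room, which is exactly the kind of scene understanding AR glasses would need.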

If you can read the room, you can read the road, I guess.

And there’s more. Look at the growing harvest of location data now made available in Maps and MapKit, where you can explore whole cities in detailed 3D. LiDAR cameras give you depth, Ultra Wideband (UWB) can serve as an alternative local network, and supporting technologies such as Universal Scene Description round out the stack. Glance at those sessions once again and there are so many doing dual duty, supporting Apple’s existing platforms while also underpinning those we think are slowly entering the light.
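That detailed 3D city exploration is exposed to developers too. A small sketch of switching a map into the realistic-elevation mode, assuming iOS 16 (`enableRealisticTerrain` is my own helper name):

```swift
import MapKit

// Sketch: enabling MapKit's detailed 3D city experience (iOS 16+).
// Assumes an MKMapView already placed in the UI.
func enableRealisticTerrain(on mapView: MKMapView) {
    // .realistic renders terrain and landmarks with 3D elevation
    // in cities where Apple has detailed data.
    let config = MKStandardMapConfiguration(elevationStyle: .realistic)
    mapView.preferredConfiguration = config
}
```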

More than what’s visible

I’ve only scratched the surface of what we can see, but the takeaway is simple: Apple’s not yet ready to discuss its wider plans, but at WWDC 2022 it is gently equipping its developers with the tools they need to build increasingly sophisticated AR experiences.

The focus on (machine) intelligent perception and understanding of the immediate reality around the computer is the most profound piece of this puzzle. Solving it will empower Apple to offer tools with which to develop automated solutions for many roles. What we may end up calling realityOS for consumer experiences could very easily become an “industry OS” for intelligent manufacturing. Once you have perception, depth, location and object recognition, you have an automation opportunity.

Apple has all these things — and also makes the silicon to drive them.

I’ve already gotten too far ahead of the reality we are in right now. But while Apple seemed sotto voce on its plans for AR during its keynote, if you dig deeper, you’ll see that where it counts — in its contact with developers at the event — AR and the tech to support it are very much in vogue.

Meanwhile, everyone’s favorite enigmatic Apple rumor machine, Ming-Chi Kuo, believes the firm will shed more light on its AR plans at a special media event in January 2023, 16 years after the January 2007 introduction of the iPhone. That would be a move with a certain historical resonance, and it speaks volumes to the company’s growing confidence in the platform it seeks to build. To my mind, the company already has a huge stack of developer technologies in place to support this next step. It just needs a little more time to get things right.

Apple’s innovation engine remains at “full throttle,” Morgan Stanley has said. I agree.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

jonny_evans

Hello, and thanks for dropping in. I'm pleased to meet you. I'm Jonny Evans, and I've been writing (mainly about Apple) since 1999. These days I write my daily AppleHolic blog at Computerworld.com, where I explore Apple's growing identity in the enterprise. You can also keep up with my work at AppleMust, and follow me on Mastodon, LinkedIn and (maybe) Twitter.