Why augmented reality isn’t about augmenting reality

Delta Reality
6 min read · Sep 3, 2021

Emerging technologies fight to survive the hype cycle; it is not until they make it past the trough of disillusionment and reach the plateau of productivity that industries start to accept them and adopt them into their workflows, services and products.

Although AR is a fairly new technology, there are indications that it will be one of the leading technologies of the future.

As a seemingly new technology, augmented reality (AR) is no different, and we would place it somewhere between the slope of enlightenment and the plateau of productivity. There has never been a better time to grasp the concept of AR, so we propose a slightly different approach to understanding what augmented reality is, and how to use that understanding when creating your own use case for the new tech.

What exactly is augmented reality?

The augmentation of reality describes the process of overlaying digital content on top of or in the context of a physical space, i.e. reality. The reality here is observed primarily through a camera, while the augmentation is achieved using software. This software thus looks at reality through a camera or similar photosensitive device and processes this information. It then combines this information with other digital content and delivers it to the end user via some sort of display device, ranging from a mobile phone screen to a specialised lens in an AR wearable (e.g. a headset or goggles).
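To make that pipeline a little more concrete, here is a minimal sketch of one common way to wire it up on the web, using the WebXR Device API in TypeScript. This is our illustration, not a prescribed implementation: it assumes the @types/webxr type definitions are available and that `render` is a placeholder callback for whatever actually draws the virtual objects. The browser and device supply the camera view of reality; the software only adds the digital layer on top of it each frame.

```typescript
// Minimal sketch of the AR pipeline: camera passthrough is provided by the
// device/browser, our software only composites the digital layer on top of it.
async function startArOverlay(
  gl: WebGLRenderingContext,
  render: (views: readonly XRView[], frame: XRFrame) => void // hypothetical draw callback
) {
  // Ask for an AR session: the display device shows reality plus our content.
  const session = await navigator.xr!.requestSession("immersive-ar");

  // Make the WebGL context usable by WebXR and composite it over the camera feed.
  await gl.makeXRCompatible();
  await session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  // A coordinate system anchored to the physical space around the user.
  const refSpace = await session.requestReferenceSpace("local");

  // Per-frame loop: combine digital content with the observed reality.
  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) render(pose.views, frame); // one view per eye or per screen
    session.requestAnimationFrame(onFrame);
  });
}
```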

AR is distinct from holographic technology, which produces a somewhat similar visual stimulus but relies on inherently different approaches and produces different end results. AR, to some degree, is aware of its context and of the spatial characteristics of the space within which it is produced.

“Blurring the boundaries”

As humans, we rely heavily on a very similar set of skills. Like AR software, we too use photosensitive organs to collect information about the environment, about relationships between objects in the environment and about the relationship between our body and the observed objects. This complex cognitive skill is called spatial awareness.

We use our eyes to collect data about the objects around us, which is quite similar to what AR does.

Similarly, AR tries to understand the relationships between the objects it observes through the photosensitive device. It also keeps track of the position and rotation of the photosensitive device relative to the environment, and uses all of this awareness to augment and actually mix digital content with the observed reality. This software-based spatial awareness goes by the name of spatial computing. It is probably one of the most important technologies in the industry at this moment. Spatial computing changes the way software perceives its surroundings, redefining the boundary between the virtual and physical environments. The common term used to describe this process is ‘blurring the boundaries’.
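As a rough illustration of what that software-side spatial awareness can look like in practice, the sketch below continues with the same assumed WebXR setup as above: on each frame it reads where the device is in the room and where the centre of its view meets a real surface. The hit-test source is assumed to have been created earlier with `session.requestHitTestSource({ space: viewerSpace })`; this is a sketch of the idea, not a complete app.

```typescript
// Sketch of software-side spatial awareness: the device's pose in the room,
// plus a probe ("hit test") into the real surfaces it currently sees.
function readSpatialState(
  frame: XRFrame,
  refSpace: XRReferenceSpace,
  hitTestSource: XRHitTestSource
): XRPose | null {
  // Position and rotation of the photosensitive device relative to the environment.
  const viewerPose = frame.getViewerPose(refSpace);
  if (viewerPose) {
    const { position: p, orientation: q } = viewerPose.transform;
    console.log(
      `device at (${p.x.toFixed(2)}, ${p.y.toFixed(2)}, ${p.z.toFixed(2)}) m,`,
      `rotation (${q.x.toFixed(2)}, ${q.y.toFixed(2)}, ${q.z.toFixed(2)}, ${q.w.toFixed(2)})`
    );
  }

  // Does the centre of the view intersect a detected real-world surface?
  const hit = frame.getHitTestResults(hitTestSource)[0];
  const surfacePose = hit?.getPose(refSpace);

  // If so, we now have a point on a real floor, table or wall expressed in the
  // same coordinate system as our virtual content.
  return surfacePose ?? null;
}
```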

AR relies heavily on spatial computing, but it does not consist only of spatial computing. AR is best described and understood as an interface for spatial computing: the platform where humans can meet spatial computing and interact with it in a very intuitive way, in an area of now shared and common data, our shared environment. When understood as an interface for spatial computing, AR becomes a not-yet-invented mouse, touchscreen or keyboard, a new interface for the huge paradigm shift that the disruptive potential of spatial computing offers.

AR at Delta Reality

This potential is why we at Delta Reality look at AR in a different light and always insist on thinking of it as an interface. This way of thinking makes the subtle but crucial difference between using AR as a mere tool and using it as an environment for innovation. Our approach to spatial computing is, in a way, to think about it from the perspective of the computer.

“The Metaverse Park” — an app created by Delta Reality for Niantic’s Lightship Global Jam.

Instead of finding possible use cases in the pre-existing reality and then forcing them into a spatial computing paradigm and AR, we try to think of spatial computing as an extension of software into physical reality, and AR as its hands or skin: what can our software touch and feel using spatial information and how can this then be used to deliver something to end users? How can we use this interface to interact with the computer? How can we meet software in our common and shared environment? How can we help it grow its understanding, and how can we learn from it? What does it see that we do not see?

“The Metaverse Park” and its digital inversion.

At Delta Reality we tend to use verbs that describe human actions to describe the processes and features of AR software. In this way, we discover possibilities that we weren’t previously aware of due to the simple fact that they had not emerged from the usual software-related lingo.

Application of AR in the furniture industry

As an exercise, let us consider an AR furniture store app, a very common use case for AR technology. Thinking from the perspective of the software, we are connected to the webstore database and can access all of the products and their specifications. We know a lot about our products, from their dimensions, materials and colours to how they are assembled and even how to carry out simple repairs. We even know some DIY methods for customising them, since a talented employee from our design team has decorated her pieces in the office and everyone loved them.

By using AR, the user can see how a certain piece of furniture would look in their space.

As software, we could know a lot of this kind of inside information. Reaching out into the consumers' space, we could touch their environment and share this knowledge with them, showing them how each item in our webstore fits into their surroundings while letting them select colours and materials and choose the locations themselves.
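A hedged sketch of what that could look like in code, continuing with the WebXR assumptions from earlier: the `Product` shape, the `/api/products/` endpoint and the `placeModel` hand-off are purely illustrative names, standing in for whatever webstore backend and 3D renderer a real app would use.

```typescript
// Illustrative product record: what the software "knows" from the webstore side.
interface Product {
  id: string;
  modelUrl: string;                                  // 3D model of the item
  dimensionsM: { w: number; h: number; d: number };  // true size in metres
  materials: string[];                               // e.g. ["oak", "walnut", "white"]
}

// Hypothetical webstore endpoint; a real database/API will look different.
async function fetchProduct(id: string): Promise<Product> {
  const res = await fetch(`/api/products/${id}`);
  return res.json();
}

// Placeholder hand-off to whatever engine renders the scene (three.js, Babylon.js, ...).
function placeModel(p: Product, material: string, at: XRRigidTransform) {
  console.log(`place ${p.modelUrl} (${material}) at`, at.position);
}

// Combine the two sides: product knowledge from the store, spatial knowledge
// from the room, so the chosen item appears at true scale where the user pointed.
async function previewInRoom(
  frame: XRFrame,
  refSpace: XRReferenceSpace,
  hitTestSource: XRHitTestSource,
  productId: string,
  material: string
) {
  const product = await fetchProduct(productId);          // what we know
  const hit = frame.getHitTestResults(hitTestSource)[0];  // what we see
  const pose = hit?.getPose(refSpace);
  if (pose) placeModel(product, material, pose.transform);
}
```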

As the consumer moves closer to an augmented object, we could show them the fine textures in close-up, or offer to grab that texture and bring it closer to our shared eyes, i.e. the photosensitive device, to show them even more in-depth information about the material itself. If they love a piece of material they have in their home but don't know what it is, we could ask them to show it to us so that we can analyse it and perhaps find a match, or offer something similar. We could help them assemble furniture in real time, either automatically or with the help of a remote agent if needed.

The simplest and most cost-effective way to buy a decoration is to check through an AR app whether it would fit into the space.

Through AR we could show them how to decorate and customise their furniture or offer off-the-cuff customisation ideas based on the aesthetic impression we get from their environment. We could even teach them how to perform simple repairs in case anything gets worn out or broken. Finally, we could suggest new items in the future, based on our knowledge of their aesthetic choices and previous selections (or even their DIY customisations).

Conclusion

Although furniture store AR is a common use case, a simple shift in language and perspective offers some new insights and throws a few new ideas into the mix. We at Delta Reality find this approach useful and fruitful, both when developing AR apps for our clients and in internal research and development projects. That is why we find the term augmented reality a bit misleading: AR isn't just about augmenting reality. This small shift, from the idea of a digital overlay towards spatial awareness seen from the perspective of the software, has helped us craft some new and innovative solutions.

This correction in understanding has given us the opportunity to truly imagine and see the disruptive potential of the technology, and to actually comprehend what is often obscured by the wording of the term ‘augmented reality’.


Delta Reality

Award-winning XR development studio creating captivating, immersive and mind-blowing experiences using virtual reality, augmented reality and mixed reality.