Edge Computing for Augmented Reality

While our last post covered the basics of edge computing, in this post we will discuss a more specialized use case for edge computing technology: augmented reality. Augmented reality is an experience that overlays a digital sensory enhancement on top of the real world. In a simple example, imagine a construction worker using their phone to look at a two-dimensional blueprint on a piece of paper while the phone displays the blueprint in three dimensions. Another example is a fighter pilot seeing important locations overlaid on the ground through their heads-up display.

Augmented reality being used to add digital furniture to a room when viewed on a phone camera.

Augmented reality is certainly incredible, but it has been, and still is, frequently plagued by latency issues. Humans are used to immediate responses from their surroundings: when we pick up an object, the response is instantaneous. There is no perceptible delay between a physical action and its physical effect. With augmented reality, a frequent complaint is that there is a loading time, or delay, between when an action is made and when the digital experience changes to reflect that action. This is especially important when using head-mounted AR systems such as the Microsoft HoloLens or the Magic Leap.
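
To see why even small delays matter, consider a rough latency budget. The sketch below is a back-of-the-envelope illustration in Python; the ~20 ms motion-to-photon target and every stage timing are assumed round numbers for illustration, not measurements from any specific headset.

```python
# Back-of-the-envelope motion-to-photon latency check for an AR headset.
# The ~20 ms comfort target and every stage timing below are illustrative
# assumptions, not measurements from any specific device.

TARGET_MS = 20.0  # commonly cited motion-to-photon comfort threshold

on_device_pipeline_ms = {
    "camera capture": 4.0,
    "pose tracking": 3.0,
    "scene rendering": 6.0,
    "display scan-out": 5.0,
}

total = sum(on_device_pipeline_ms.values())
print(f"on-device pipeline: {total:.1f} ms "
      f"({'within' if total <= TARGET_MS else 'over'} the {TARGET_MS:.0f} ms budget)")

# Any remote processing adds a full network round trip on top of this budget.
for rtt in (10.0, 40.0, 100.0):  # illustrative LAN / regional / distant-cloud RTTs
    print(f"+{rtt:>5.0f} ms round trip -> {total + rtt:6.1f} ms total")
```

The point of the sketch is that the on-device stages alone nearly fill the comfort budget, so any architecture that adds a network round trip blows past it.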

A soldier wearing an augmented reality headset for combat training purposes.

The solution to these latency issues is edge computing. Using edge computing hardware and software, engineers can greatly reduce the time required to render or generate the augmented reality experience. Below we compare the existing computation architectures to show why edge computing is the best solution to the latency issue; a rough numeric sketch follows the list.

  • Cloud Computing - The augmented reality device transmits a video feed of the real world to a server over the internet. There the server interprets the video, adds the augmentations, and transmits the augmented feed back to the user. This solution has the lowest hardware costs, but it also has the highest latency, along with significant ongoing compute costs.

  • On-Premises Computing - The AR device transmits the video feed to a server located very close to the device, for instance in the server room of a factory that issues AR goggles to assembly line workers. The data is augmented there and transmitted back to the goggles. This solution has lower latency than cloud computing, but it still has to move data across a network, and there are hardware and staffing costs associated with running on-premises servers.

  • Edge Computing - The AR device has a powerful computer designed for video analysis and augmentation built into it. The device receives data from its camera, augments it, and displays it to the user near instantaneously. This very low latency makes augmented reality feel like an extension of the user's own senses rather than a clunky tool they have to wear. In turn, this improves both the user experience and the practical value of the AR tools. Edge computing has significantly lower latency than the other solutions, and the hardware costs of adding edge computers to devices are falling rapidly.
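
To make the comparison concrete, here is a minimal sketch in Python. Every number is an assumption chosen for illustration (a distant cloud round trip, a local-network round trip, and somewhat slower on-device compute); real figures depend heavily on the network and hardware involved.

```python
# Illustrative end-to-end latency comparison of the three architectures.
# Every number below is an assumption chosen for illustration only.

CAPTURE_AND_DISPLAY_MS = 9.0  # camera capture + display scan-out, shared by all

architectures = {
    "cloud":       {"rtt_ms": 80.0, "compute_ms": 8.0},   # distant data center, fast GPUs
    "on-premises": {"rtt_ms": 10.0, "compute_ms": 10.0},  # local server over Wi-Fi/LAN
    "edge":        {"rtt_ms": 0.0,  "compute_ms": 14.0},  # on-device, weaker silicon
}

for name, cost in architectures.items():
    total = CAPTURE_AND_DISPLAY_MS + cost["rtt_ms"] + cost["compute_ms"]
    print(f"{name:>11}: {total:5.1f} ms motion-to-photon")
```

Even with these generous assumptions for the remote options, the edge device comes out far ahead, because eliminating the network round trip outweighs its slower on-board compute.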

User experience is extremely important when new technologies are being developed. No one enjoyed using the first touch screens, but now we cannot imagine devices without them. Edge computing could be one of the driving technologies that greatly improves the user experience of augmented reality products. It is vital that technologists understand the impact edge computing devices will have on augmented reality and countless other industries.

We hope you enjoyed this article; if you did, please share it with your friends. Feel free to comment or leave questions below. Our founder can be reached at joseph@riaforce.com for direct questions!
