As augmented reality (AR) applications become more widespread, users expect increasingly sophisticated experiences: impressive visuals and interactions that adapt to, and are aware of, the user’s environment.
With the expanding processing power and popularity of mobile computing devices such as smartphones, tablets, and AR head-mounted displays (HMDs), AR is becoming more accessible in the consumer market. This growth has produced a flood of digital content, calling for a clear understanding of the content types that feed AR experiences and for streamlining disparate formats, including 3D models and codified AR content.
For the AR experience to improve, virtual content should behave realistically in its physical environment, and the user should be able to interact with it naturally and intuitively.
Advanced AR creation techniques like depth sensing (implemented by organizations at the Connecting stage of the AR Maturity Model) enable the next level of real-time environmental understanding, significantly improving how users can interact with the environment where virtual content is placed.
Until recently, technology capable of understanding users and their environment was expensive or restrictive. However, the depth and LiDAR sensors on the latest Android and iOS mobile devices, and on headsets like the Microsoft HoloLens and Magic Leap, now give developers low-cost, widely available real-time depth sensing.
They help AR applications understand the three-dimensional (3D) environment they are operating in and support new ways to blend the natural and digital worlds.
The Need for Environmentally Aware AR Applications
Image marker and natural feature registration algorithms have long been used to create AR experiences. Computer vision algorithms on mobile devices can detect these markers but have no awareness of the environment beyond the targets themselves. This lack of understanding can cause virtual content to seemingly float above real objects, appear inside them, or occlude objects it should appear behind, breaking the illusion that the virtual content exists in the real world.
Early attempts at environment awareness required manually modeling all the real objects in the user’s environment and online localization of the camera to ensure virtual objects interacted with real objects appropriately. This method is both time-consuming and inflexible, as any change in the environment requires recalibration.
To make AR experiences more realistic, proper environment awareness is essential. This includes correct occlusion (i.e., the ability of digital objects to appear behind real-world objects, which makes them feel like they are actually in the user’s space), collision detection, realistic illumination, and shadowing effects.
Though these features are not strictly necessary for augmented reality, applications that include such cues have been shown to establish a stronger connection between real and virtual content.
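At its core, depth-based occlusion comes down to a per-pixel comparison: a virtual fragment should be hidden whenever the sensed real surface is closer to the viewer at that pixel. The following Kotlin sketch illustrates the idea; the function name, parameters, and tolerance value are illustrative assumptions rather than any particular engine’s API.

```kotlin
// Minimal, engine-agnostic occlusion test for one pixel: hide the virtual
// fragment when the real surface sensed by the depth camera is closer.
fun isVirtualFragmentVisible(
    realDepthMeters: Float,        // depth of the real surface from a depth map
    virtualDepthMeters: Float,     // depth of the rendered virtual fragment
    toleranceMeters: Float = 0.02f // slack to avoid flicker at contact points
): Boolean {
    if (realDepthMeters <= 0f) return true // no valid depth reading: assume visible
    return virtualDepthMeters <= realDepthMeters + toleranceMeters
}
```

In practice, this comparison typically runs per pixel in a fragment shader against the depth map, but the underlying logic is the same.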
Fig 1: Environmental Awareness with depth tracking
The Significance of Depth Information
With advancements in imaging technologies and techniques such as contour-based object segmentation, depth from stereo and time-of-flight cameras, online SLAM, and depth-from-motion algorithms, acquiring the relevant information from the scene no longer requires offline calibration. The system can correctly process the environment even when objects change or are added and removed.
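To give a feel for how a raw depth map becomes scene geometry, the sketch below shows standard pinhole-camera unprojection: each depth sample, combined with the camera intrinsics, yields a 3D point in the camera’s coordinate frame. The names and types here are assumed for illustration.

```kotlin
// Pinhole-camera unprojection: one depth sample plus the camera intrinsics
// (focal lengths fx, fy and principal point cx, cy, all in pixels) yields
// a 3D point in the camera's coordinate frame.
data class Point3(val x: Float, val y: Float, val z: Float)

fun unproject(u: Int, v: Int, depthMeters: Float,
              fx: Float, fy: Float, cx: Float, cy: Float): Point3 =
    Point3(
        x = (u - cx) * depthMeters / fx,
        y = (v - cy) * depthMeters / fy,
        z = depthMeters
    )
```

Repeating this for every pixel of a live depth map produces a fresh point cloud each frame, which is why no offline modeling or calibration of the scene is required.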
Fig 2: AR on HoloLens with depth sensors
With depth sensors in play, AR content can be enhanced with realistic physics, surface interactions, environmental traversal, and more.
Depth sensing can also unlock new utility use cases. For example, remote assistance solutions that enable AR annotations on video calls (such as Scope AR’s WorkLink) use depth sensing to better understand the environment, so experts worldwide can apply real-time 3D AR annotations more precisely for remote support and maintenance.
When organizations enter the Leading stage of the AR Maturity Model, implementing depth sensing as part of their AR content strategy will enable:
- AR content to be more personalized and adaptive.
- The emergence of enterprise AR standards that allow systematic content reuse and improved cost-efficiency.
Implementing Depth Sensing in AR Experiences
Using ARCore, Google’s developer platform for building augmented reality experiences, or ARKit, Apple’s equivalent platform, developers can enable mobile devices to create depth maps from a single RGB camera, making the AR experience more natural. Dedicated depth sensors on the HoloLens and Magic Leap AR HMDs allow even more natural and realistic AR experiences.
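As a concrete illustration on the Android side, here is a minimal Kotlin sketch that enables ARCore’s Depth API on a session and reads one depth sample from a frame. It assumes an existing ARCore Session and Frame; session setup, rendering, and the coordinate mapping between the camera image and the lower-resolution depth image are omitted.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Turn on automatic depth estimation when the device supports it.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Read the depth (in millimeters) at pixel (u, v) of the depth image,
// or null if depth data is not yet available for this frame.
fun depthMillimetersAt(frame: Frame, u: Int, v: Int): Int? =
    try {
        frame.acquireDepthImage16Bits().use { depthImage ->
            val plane = depthImage.planes[0]
            val byteIndex = v * plane.rowStride + u * plane.pixelStride
            // Each pixel is a 16-bit depth value in millimeters.
            plane.buffer.order(ByteOrder.nativeOrder())
                .getShort(byteIndex).toInt() and 0xFFFF
        }
    } catch (e: NotYetAvailableException) {
        null // Depth is typically unavailable for the first few frames.
    }
```

On iOS, the equivalent capability is ARKit’s sceneDepth frame semantics, which exposes a per-frame depth map on LiDAR-equipped devices.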
Several low-cost consumer mobile devices with depth sensors are now available to enable environmental awareness and natural interaction. This has opened a competitive market for engaging AR experiences that leverage the device’s capability to examine a three-dimensional volume within the task space and realistically composite virtual content into the environment.
Final Word
As augmented reality enters the broad awareness stage among consumers worldwide, it is not just researchers but also enterprises, developers, and product marketers globally who are enthusiastic about the new opportunities and transformation that realistic AR will bring to mainstream use cases.
However, to check all the boxes, it is crucial for organizations moving through the Connecting and Leading stages of the AR Maturity Model to implement intelligent depth-sensing AR applications that broaden the path for immersive innovation everywhere.
Radiant Digital is an AR-led enterprise with deep proficiency in depth-sensing technology for AR applications. Make AR a genuine game-changer for your business by connecting with us today.