Beyond AR vs. VR: What is the Difference between AR vs. MR vs. VR vs. XR?
Augmented reality (AR) is an experience where designers enhance parts of users’ physical world with computer-generated input. Designers create inputs—ranging from sound to video, to graphics to GPS overlays and more—in digital content, which responds in real-time to changes in the user’s environment, typically movement.
Augmented reality has science-fiction roots dating to 1901. However, Thomas Caudell coined the term only in 1990, while developing technology to help Boeing workers visualize intricate aircraft systems. A major advance came in 1992 with Louis Rosenberg's complex Virtual Fixtures AR system for the US Air Force. Consumer AR releases followed, notably the ARQuake game (2000) and the design tool ARToolkit (2009). The 2010s witnessed a technological explosion, for example with Microsoft's HoloLens in 2015, that stretched beyond AR in the classical sense, while AR software became increasingly sophisticated, popular and affordable.
Under the umbrella term extended reality (XR), AR differs from virtual reality (VR) and mixed reality (MR). Some confusion exists, notably between AR and MR. Especially amid the 2020s’ technology boom, considerable debate continues about what each term covers. In user experience (UX) design, you have:
AR—You design for digital elements to appear over real-world views, sometimes with limited interactivity between them, often via smartphones. Examples include Apple's ARKit and Google's ARCore (developer kits) and the Pokémon Go game.
VR—You design immersive experiences that isolate users from the real world, typically via headset devices. Examples include PSVR for gaming, and Oculus and Google Cardboard, with which users can explore, e.g., Stonehenge using headset-mounted smartphones.
MR—You design to combine AR and VR elements so digital objects can interact with the real world; therefore, you design elements anchored to a real environment. Examples include Magic Leap and HoloLens, which users can use, e.g., to learn hands-on how to repair items.
Because of the slight overlap in interactivity, brands sometimes use "AR" interchangeably with "MR." "Augmented reality" remains the popular term, even though the original sense of AR design is to overlay digital elements on real-world views, e.g., GPS overlays on smartphone screens so users can find directions from street views. In that sense, digital elements are merely superimposed on real-world views, not anchored directly to them: the computer-generated content can't interact with the real-world elements users see, unlike in MR. The HoloLens is MR, for instance, because it interprets the space in a room and combines digital objects with the user's physical environment.
Augmented reality combines real-world sensory input with computer-generated real-time content using SLAM (Simultaneous Localization and Mapping).
This process involves three main steps:
Sensing and Tracking: The AR device senses the environment with cameras, accelerometers, gyroscopes, GPS, and even lasers to track the position and orientation of the user and their device.
Image Processing and Recognition: The system analyzes the sensor data and identifies objects or features in the environment that can be augmented. The device uses image processing and recognition algorithms to identify and track objects in real time.
Rendering and Display: The final step is to generate and display the computer-generated content on top of the real-world environment. This step renders and displays virtual objects in the correct perspective and position relative to the user's viewpoint. To the user it should seem as if the objects are really there, like a hologram.
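The three steps above can be sketched as a toy pipeline. This is a minimal, illustrative sketch in Python; the pose averaging, brightest-pixel "recognition," and pinhole projection are stand-in placeholders for the sensor fusion, computer vision, and rendering a real SLAM-based AR system performs.

```python
# Minimal, illustrative sketch of the three AR pipeline steps above.
# All helpers and numbers are simplified placeholders, not a real SLAM system.
from dataclasses import dataclass


@dataclass
class Pose:
    """Estimated camera position in world coordinates (orientation omitted)."""
    x: float
    y: float
    z: float


def sense_and_track(readings):
    """Step 1 (sensing and tracking): fuse raw sensor readings into a pose.
    Here we simply average (x, y, z) samples; a real system fuses camera,
    IMU and GPS data with SLAM."""
    n = len(readings)
    return Pose(sum(r[0] for r in readings) / n,
                sum(r[1] for r in readings) / n,
                sum(r[2] for r in readings) / n)


def recognize_anchor(frame):
    """Step 2 (image processing and recognition): find a feature to anchor
    digital content to. Here, the brightest pixel of a grayscale frame
    stands in for marker or feature detection."""
    _, location = max((value, (row, col))
                      for row, pixels in enumerate(frame)
                      for col, value in enumerate(pixels))
    return location


def render_overlay(pose, anchor_world, focal_length=500.0):
    """Step 3 (rendering and display): project a virtual object placed at
    anchor_world into 2D screen coordinates with a pinhole camera model,
    so it appears in the correct perspective for the user's viewpoint."""
    dx = anchor_world[0] - pose.x
    dy = anchor_world[1] - pose.y
    dz = anchor_world[2] - pose.z          # depth along the camera axis
    return (focal_length * dx / dz, focal_length * dy / dz)


# Example run: estimate the camera pose, pick an anchor in a tiny 2x2 frame,
# then project a virtual object 2 m ahead of the camera onto the screen.
pose = sense_and_track([(0.0, 0.0, 0.0), (0.2, 0.0, 0.0)])
anchor_pixel = recognize_anchor([[0, 1], [5, 2]])                 # -> (1, 0)
screen_xy = render_overlay(Pose(0.0, 0.0, 0.0), (1.0, 0.0, 2.0))  # -> (250.0, 0.0)
```

In practice, frameworks such as ARKit and ARCore handle the first two steps (tracking and scene understanding) for you, exposing anchors that your rendering layer attaches content to.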
AR designers made considerable strides in the 2010s—a decade full of invaluable AR lessons and examples while the required sensors became cheaper.
Pokémon GO is a noteworthy example: a GPS-based app that "inserts" Pokémon characters into users' environments so users can find and capture them on their device screens.
Google’s AR stickers are another prime example; users drop realistic images into their camera shots. Users find AR particularly appealing for its entertainment value. Still, AR’s mainstream future appears assured across a wide range of applications, including education inside museums. With AR applications, you can bring experiences closer to users in their environments through designs that are more directly engaging, personalized and—indeed—fun.
“Augmented reality is going to change everything.”
— Tim Cook, Apple’s CEO
A UX designer for AR needs to understand the context of use: the specific situation or environment in which users will use AR technology.
In this video, Frank Spillers, founder of UX consultancy Experience Dynamics, covers the key characteristics of context of use that you should consider to create successful AR experiences.
Video copyright info
Copyright holder: T0KEEYO Appearance time: 0:45 - 0:49 Copyright license and terms: CC BY Link: https://www.youtube.com/watch?v=cUpCBHmPCPo&ab_channel=T0KEEYO
Copyright holder: Lone Fox Appearance time: 0:49 - 0:53 Copyright license and terms: CC BY Link: https://www.youtube.com/watch?v=FshzYgDXS44&ab_channel=LoneFox
To design successful AR experiences, consider:
Safety—Remember users’ real-world contexts; don’t distract/mislead them into danger.
Overkill—Beware of drowning users’ senses with meaningless data; keep experiences contextualized.
Environment—Unlike desktop experiences, AR can happen anywhere. So, design primarily for users' contexts: whether they're outdoors or indoors, and moving or static. Whatever their setting, users expect pleasurable, user-friendly experiences. AR UX's Rob Manson identifies four user scenarios:
Public—interacting with software, using the entire body
Personal—using smartphones in public spaces
Intimate—sitting, using a desktop
Private—using a wearable
Comfort—Make comfortable designs to prevent physical strains and reduce cognitive load.
Security—AR data is rich, so design to ensure users’ data is secure.
Familiarize yourself with AR terminology and a new form of information architecture.
Constantly ask “Where are users?” and how they’ll apply and adopt your design.
Remember physical limitations—users hold devices longer while seated, etc.
Make interfaces automatic so users needn’t be prompted with commands. Consider voice controls.
Use AR-software-creating resources optimally (e.g., Apple’s ARKit).
Offer easy onboarding.
Provide clues and maximum predictability.
Prioritize screen real estate.
Design for accessibility.
When you design animations, consider how frame rates and processing power impact device compatibility.
Ensure your design interprets and responds to users’ head and body movements so users can act intuitively and freely without giving commands.
Ultimately, understand what users—in various contexts—expect before you try to meet their experience demands. Do user testing that covers all feasible conditions (lighting, weather, etc.).
Learn how to design your AR experiences with our UX Design for Augmented Reality course.
Find some vital AR considerations here in Designing for Augmented Reality.
Read a specialist’s detailed take on AR in The Principles of Good UX for Augmented Reality | by Tyler Wilson.
You can see Apple’s guidelines for designing for AR in Augmented reality | Apple Developer Documentation.
Read more about AR-MR-VR differences in the article Beyond AR vs. VR: What is the Difference between AR vs. MR vs. VR vs. XR?
Here’s the entire UX literature on Augmented Reality by the Interaction Design Foundation, collated in one place:
Take a deep dive into Augmented Reality with our course UX Design for Augmented Reality.
Augmented reality has emerged as a transformative technology, allowing us to blend the digital and physical worlds to enhance our daily lives. However, the path to creating seamless and intuitive user experiences in AR presents unique challenges. This course equips you with the knowledge and skills to overcome these challenges and unlock the full potential of AR.
UX Design for Augmented Reality is taught by UX expert Frank Spillers, CEO and founder of the renowned UX consultancy Experience Dynamics. Frank is an expert in AR and VR and has 22 years of UX experience with Fortune 500 clients, including Nike, Intel, Microsoft, HP, and Capital One.
In this course, you will explore the entire design process of AR, along with the theory and guidelines to determine what makes a good AR experience. Through hands-on exercises and discussions, you will explore and discuss topics such as safety in AR, how to determine whether AR is the right platform for your idea, and what real-world spaces have potential as stages for AR experiences.
In lesson 1, you will learn the origins of AR, what makes it unique, and its colossal impact on human-computer interaction.
In lesson 2, you will dive into user research practices tailored to AR and its unique characteristics.
In lesson 3, you will dig into how to prototype for AR and create low-fi but testable prototypes.
In lesson 4, you will learn the heuristics and guidelines to test your designs and ensure they are practical and user-friendly.
Throughout the course, you'll get practical tips to apply in real-life projects. In the portfolio projects, you'll build the foundation of an AR product. This will allow you to create a portfolio case study to entice recruiters or developers to make your dream a reality.
Use your industry-recognized Course Certificate on your resume, CV, LinkedIn profile, or website.