Scribe

Literary genius. Academic prowess


XR and Accessibility

September 23, 2022 by Shanna Finnigan

One of the biggest criticisms leveled at Extended Reality is its lack of attention to accessibility. Not everyone can use their senses to the same extent, so we each experience XR differently, and people with motor, visual, or auditory impairments are at an even greater disadvantage. XR is all about engaging with the world around you through your senses: it depends heavily on motion, visuals, and audio to create a memorable experience. So what happens when a disability prevents someone from engaging with the medium?

VR and AR designers are entering the age of new mediums, in which the usual accessibility solutions no longer apply cleanly. Dylan Fox and Isabel Thornton raise a good example of this in their overview of ethical practices in the space: “features like captions are well understood on rectangular 2D screens but present many new challenges in XR” (IEEE Global Initiative). Designers have struggled to completely reimagine user interfaces and experiences to accommodate as many people as possible. VR is already physically taxing for able-bodied users, and even more so for people whose motor dysfunctions cause constant head movement or paralysis. Many XR products have turned to eye-tracking to address this, but even that is not enough: many people who have had strokes or who have developed ALS, for example, cannot move their eyes at all. Because the field is so young, few solutions exist yet.

Ventures like Cognixion are trying to change this. Cognixion is a growing company that specializes in “Assistive Reality” solutions. Its main product, Cognixion One, is an AR headset that helps non-verbal users (or users who cannot communicate clearly in speech) converse with the people around them: caretakers, family members, friends, and so on. In a very simplified sense, this is how it works: an EEG at the back of the headset is secured to the user’s head and reads the brain waves in their occipital lobe, at the back of the brain. On the sunglass-like frames in front of the user’s eyes, they see a stereoscopic, hologram-like projection of a keyboard and other UI elements. The EEG detects the most minute changes in brain activity and deduces which letter the user is looking at on the display. Essentially, they are typing out words to the little computer inside their headset using their brain! Once a word is fully formulated, it is played back as audio and mirrored on the front side of the shades, so whoever the user is holding a conversation with can both see and hear what the user is saying.
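To make the idea concrete, here is a minimal sketch of a gaze-to-letter pipeline like the one described above. It assumes a common brain-computer-interface technique: each on-screen letter flickers at a distinct frequency, the EEG response over the occipital lobe is strongest at the frequency of the letter being looked at, and the strongest response selects the letter. All function names, frequencies, and readings here are illustrative, not Cognixion’s actual design.

```python
# Illustrative sketch only: letters are mapped to flicker frequencies, and each
# EEG "reading" is a dict of signal power measured at each frequency.

def pick_letter(eeg_power_by_freq, letter_freqs):
    """Choose the letter whose flicker frequency shows the strongest EEG response."""
    best_letter, best_power = None, float("-inf")
    for letter, freq in letter_freqs.items():
        power = eeg_power_by_freq.get(freq, 0.0)
        if power > best_power:
            best_letter, best_power = letter, power
    return best_letter

def compose_word(readings, letter_freqs):
    """Accumulate one letter per EEG reading until a 'space' selection ends the word."""
    word = []
    for reading in readings:
        letter = pick_letter(reading, letter_freqs)
        if letter == " ":
            break
        word.append(letter)
    return "".join(word)

# Three on-screen targets mapped to three flicker frequencies (Hz).
freqs = {"H": 8.0, "I": 10.0, " ": 12.0}
readings = [
    {8.0: 0.9, 10.0: 0.2, 12.0: 0.1},  # user looks at "H"
    {8.0: 0.1, 10.0: 0.8, 12.0: 0.2},  # user looks at "I"
    {8.0: 0.1, 10.0: 0.1, 12.0: 0.7},  # "space" target ends the word
]
print(compose_word(readings, freqs))  # prints "HI"
```

In a real headset the composed word would then be handed to a text-to-speech engine and mirrored on the display; the hard engineering lies in reliably extracting those per-frequency power values from noisy EEG signals, which this sketch takes as given.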

The field of XR is growing rapidly, and with that growth comes a responsibility for designers and engineers to be mindful of who they are designing for.



Filed Under: Editor Blogs, Extended Reality: Applications and Implications

