AUGMENTED REALITY APP
WHAT I DID
UI/UX Developer & Application Developer
Mixed reality: Your world is the canvas
Microsoft HoloLens is the first self-contained, holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you.
Initial UI Design
The goal of this project was to design a user-friendly mixed reality experience that increases efficiency when performing a set of given tasks during an EVA. For a Mixed Reality (MR) Heads Up Display (HUD) to be effective, the team designed an interface with the goal of being simple and intuitive. To ensure safety and flexibility, the interactions are hands-free, with the option of using gesture controls. Voice commands show and hide elements of the UI on demand.
The user retains control over the user interface (UI) through user-initiated interactions that show and hide obstructing overlays. The user is prompted for interaction with the system through an intuitive interface that incorporates guided step-by-step instructions and a schematics overlay. Usability testing was performed to ensure that the design implementation met our requirements, followed by iterations of development and design updates. The third set of testing at NASA Johnson also revealed some interesting features that could be added to the interface to enhance usability: voice-guided instructions alongside the written interface and animations, as well as audio cues for warning alerts.
The overall display is made up of four different components:
● EVA Checklist - Provides scripted step-by-step instructions for completing EVA tasks.
● Guided Animation of Steps - Provides visual aids for designated steps in the EVA checklist.
● Object Recognition - Enables the user to identify the tools needed for individual steps.
● Telemetry Data (MedBay) - Provides an automated communications process by which measurements and other data are collected at remote or inaccessible points and transmitted to receiving equipment for monitoring.
When the user activates the HoloLens EVA Task Manager, they are prompted to select a task to complete. Once the user selects a task, via voice command or hand gesture, the EVA Checklist panel appears to the left of the user's field of vision. This panel is slightly opaque light blue (#68B7CA) with a white (#FFFFFF) outline. At the top of the panel is the title of the task to be completed, also in white (#FFFFFF) (Font: Acumin Variable Concept, Style: Extra Condensed Black). The check-box beside each abbreviated step is highlighted red (#F9100C) to indicate the current step to the user. At the bottom right of the panel, a numbering scheme indicates the current step and how many steps are in the designated task. On completing a step, the user says the voice command “Next” to advance, and a check-mark appears to mark the step as complete.
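The checklist behavior described above can be sketched as a small state machine. This is an illustrative sketch, not the actual HoloLens implementation; the class name, step text, and method names are all hypothetical.

```python
class EvaChecklist:
    """Hypothetical sketch of the EVA Checklist panel's step logic."""

    def __init__(self, title, steps):
        self.title = title
        self.steps = steps                        # abbreviated step descriptions
        self.current = 0                          # index of the highlighted (red) step
        self.completed = [False] * len(steps)     # check-marks per step

    def status_line(self):
        # Mirrors the "current step / total steps" counter at the
        # bottom right of the panel.
        return f"{self.current + 1}/{len(self.steps)}"

    def on_voice_command(self, command):
        # Saying "Next" marks the current step complete (check-mark
        # appears) and highlights the following step, if any remain.
        if command.lower() == "next" and not all(self.completed):
            self.completed[self.current] = True
            if self.current < len(self.steps) - 1:
                self.current += 1

# Example task (step text is invented for illustration):
checklist = EvaChecklist("Disable Alarm", ["Locate panel", "Flip switch", "Confirm light"])
checklist.on_voice_command("Next")
```

After one “Next”, the first step carries a check-mark and the counter reads 2/3.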
The guided animation panel appears to the right of the user's field of vision. This panel is similar in appearance to the checklist: slightly opaque light blue (#68B7CA) with a white (#FFFFFF) outline. Within the panel, a 2D or 3D animation of the current step loops continuously.
When the user gazes at the current step's designated object in the Air Scrubber, it is highlighted red (#F9100C) to indicate the correct object for completing the step.
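The gaze-driven highlighting amounts to a lookup from the current step to its designated object. A minimal sketch, assuming a step-to-object mapping; the object names and function name are invented for illustration:

```python
# Illustrative mapping of checklist steps to their designated objects
# on the Air Scrubber (object names are hypothetical).
STEP_OBJECTS = {
    1: "intake_valve",
    2: "filter_housing",
    3: "power_switch",
}

def gaze_highlight(current_step, gazed_object):
    """Return the red highlight color (#F9100C) when the user gazes
    at the current step's designated object; otherwise no highlight."""
    if STEP_OBJECTS.get(current_step) == gazed_object:
        return "#F9100C"
    return None
```

Gazing at any other object leaves it unhighlighted, so only the object relevant to the current step draws attention.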
Telemetry data such as BPM (beats per minute), oxygen levels, temperature, and battery power are displayed in the MedBay panel. This panel is circular, opaque light blue (#68B7CA), and divided into four individual parts. There are two options for identifiers: text or icons.
When the user says “MedBay” or selects the icon with a one-finger click gesture, the telemetry data is displayed in the lower right of the user's field of view.
The results from the heuristic evaluation revealed the importance of creating an interface that displayed vital information intuitively and minimally. A second iteration of the interface was created using the feedback from the evaluation, which was then tested informally with attendees at the eMERGE AMERICAS conference. The findings were used by the team to create a third iteration of the interface, the version that underwent the most rigorous usability testing before test week.
The results of the comparative paper-versus-AR usability study show that participants completing the disabling-alarm procedure had a lower success rate with AR than with paper. On the rerouting-power task, participants experienced roughly equivalent success with paper and AR. As for time on task, participants spent more time completing tasks with AR than with paper. SEQ mean data suggest that participants found the disabling-alarm task more difficult with AR, while the rerouting-power task was found easier with AR. Given the small sample size across the usability study, we did not conduct statistical analysis of the results. Table 1 provides descriptive statistics of the task-based metrics.
Final UI Design
The final designs were created based on the system requirements and the results of the usability evaluations. The interface consists of two main panels, with EVA time as a fixed object available across panels. The EVA time is presented at the top right of the user's point of view and shows how long the user has been conducting the EVA.
The two main panels, the Telemetry Panel and the Task Selection Panel, are visible when a user first opens the interface. On the Telemetry Panel, data sets are grouped into five categories: Battery, Oxygen, Water, Suit, and Environment. Data sets are colored blue when in an optimal state; when a data set is not in an optimal state, it is highlighted and turned red. A fixed warning panel appears under the EVA time indicating which data set is not in an optimal state. Switches are grayed out when off and red when on, similar to warnings on car dashboards.
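The optimal/warning coloring can be expressed as per-category range checks. The category names come from the design above; the threshold values, units, and function names are illustrative assumptions, not real flight limits:

```python
# Illustrative optimal ranges per telemetry category
# (values and units are assumptions for the sketch).
OPTIMAL_RANGES = {
    "Battery": (20.0, 100.0),      # percent remaining
    "Oxygen": (19.0, 23.0),        # percent O2
    "Water": (10.0, 100.0),        # percent remaining
    "Suit": (3.5, 4.5),            # suit pressure, psi
    "Environment": (-10.0, 40.0),  # temperature, degrees C
}

def panel_color(category, value):
    """Blue (#68B7CA) when the data set is optimal, red (#F9100C) otherwise."""
    low, high = OPTIMAL_RANGES[category]
    return "#68B7CA" if low <= value <= high else "#F9100C"

def warnings(readings):
    """Categories listed in the fixed warning panel under the EVA time."""
    return [c for c, v in readings.items() if panel_color(c, v) == "#F9100C"]

# Example readings (invented): oxygen has dropped out of its optimal range.
readings = {"Battery": 85.0, "Oxygen": 17.5, "Water": 60.0,
            "Suit": 4.1, "Environment": 21.0}
```

With these readings, only Oxygen turns red and appears in the warning panel.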
On the Task Selection Panel are tabs indicating which tasks are available for the user to complete. Once the user selects the desired tab using the one-finger click gesture or says “Start [Name of Procedure]”, the Instruction Panel appears.
On the Instruction Panel, the title includes the name of the procedure and pagination. The content of the panel includes the step number, instructions, an animation showing how to complete the step, a directional button, and a schematic button. Users can issue voice commands such as “Next”, “Previous”, and “Open/Hide Schematic” to select the designated buttons.
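The mapping from recognized phrases to panel actions can be sketched as a simple dispatcher. This is a hypothetical sketch of the behavior, assuming panel state and handler names that are not from the actual implementation:

```python
class InstructionPanel:
    """Minimal stand-in for Instruction Panel state (hypothetical)."""

    def __init__(self, total_steps):
        self.total_steps = total_steps
        self.step = 1                  # displayed step number
        self.schematic_open = False

    def next_step(self):
        self.step = min(self.step + 1, self.total_steps)

    def previous_step(self):
        self.step = max(self.step - 1, 1)

    def toggle_schematic(self):
        self.schematic_open = not self.schematic_open

def route_voice_command(command, panel):
    """Dispatch a recognized phrase to the matching panel action.
    Returns False for unrecognized phrases, which are ignored."""
    phrase = command.strip().lower()
    if phrase == "next":
        panel.next_step()
    elif phrase == "previous":
        panel.previous_step()
    elif phrase in ("open schematic", "hide schematic"):
        panel.toggle_schematic()
    else:
        return False
    return True

# Example session (step count is invented):
panel = InstructionPanel(total_steps=5)
route_voice_command("Next", panel)
route_voice_command("Open Schematic", panel)
```

Ignoring unrecognized phrases, rather than guessing a nearest match, keeps a misheard command from triggering an unintended step change during an EVA.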