UI/UX Developer & Application Developer

Project Description

NASA astronauts have been performing spacewalks in the same model of spacesuit since the early days of the Space Shuttle program. That was in the 1980s!


In the fall of 2018, NASA announced its first Spacesuit User Interface Technologies for Students (S.U.I.T.S.) Design Challenge. The challenge involved designing and developing an augmented reality interface that would ideally be built into the headpiece of the next-generation spacesuit. Our team, along with nine other schools across the nation, was selected to participate in the semester-long challenge.

The goal of this project was to design a user-friendly mixed reality experience that increases efficiency while performing a set of given tasks during an EVA. For a Mixed Reality (MR) Heads Up Display (HUD) to be effective, the team designed the interface to be simple and intuitive. To ensure safety and flexibility, the interactions are hands-free, with the option of using gesture controls. Voice commands show and hide elements of the UI on demand.
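The actual build was created with Unity, but the show/hide voice interaction can be sketched in a few lines of Python. All names here are hypothetical and purely illustrative; the real interface defines its own command phrases and elements.

```python
class HUD:
    """Minimal sketch: voice commands toggle the visibility of UI overlays."""

    def __init__(self):
        self.visible = set()  # names of currently visible UI elements

    def handle_voice_command(self, phrase):
        # Phrases follow a "show <element>" / "hide <element>" pattern.
        action, _, element = phrase.lower().partition(" ")
        if action == "show":
            self.visible.add(element)
        elif action == "hide":
            self.visible.discard(element)
        return element in self.visible  # is the element visible now?
```

The point of the design is that every overlay is opt-in: nothing obstructs the astronaut's view unless it was explicitly requested, and any element can be dismissed the same way.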

The user retains control over the user interface (UI) with user-initiated interactions that show and hide obstructing overlays. The user is prompted for interaction with the system through an intuitive interface that incorporates guided step-by-step instructions and a schematics overlay. Usability testing was performed to ensure that the design implementation meets our requirements, followed by iterations of development and design updates. The third round of testing at NASA Johnson also revealed some interesting features that could be added to the interface to enhance usability: voice-guided instructions alongside the written interface and animations, as well as audio cues for warning alerts.

Mixed reality: Your world is the canvas


Microsoft HoloLens is the first self-contained, holographic computer, enabling you to engage with your digital content and interact with holograms in the world around you.

The Design Process & Deliverables

Phase 1 Define

Created the initial proposal by brainstorming the product and how we would execute the project at the highest level with all necessary stakeholders.


Deliverable Proposal

Phase 2 Ideate, Prototype & User Test

Once selected by a committee, we were given documentation on current processes and feature requirements. With this information, the team refined assumptions and filled in the blanks by creating a user persona, an application flow, and a GUI design prototype.


Deliverable Preliminary Design Report

Phase 3 Design & Build

Code and design assets were assembled to create a product that follows the product design specifications.


Deliverable Build Created with Unity

Phase 4 Ideate, Analyze & Iterate

At Johnson Space Center we were able to test our build with personnel who support EVA operations. That allowed the team to generate data-driven product improvements by measuring and iterating on the product.


Deliverable Final Report

Final UI Design

The final designs were created based on the system requirements and the results of the usability evaluations. The interface consists of two main panels, with the EVA time as a fixed object available across panels. The EVA time is presented at the top right of the user’s point of view, showing how long the user has been conducting the EVA.


The two main panels, the Telemetry Panel and the Task Selection Panel, are visible when a user first opens the interface. On the Telemetry Panel, data sets are grouped into five categories: Battery, Oxygen, Water, Suit, and Environment. Data sets in an optimal state are colored blue. When a data set is not in an optimal state, it is highlighted and turns red. A fixed warning panel appears under the EVA time indicating which data set is not in an optimal state. Switches are grayed out when off and red when on, similar to warnings on car dashboards.
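The blue/red state logic above can be sketched as a small classification step. This is an illustrative Python sketch with made-up threshold values; the real Unity build defines its own optimal ranges for each data set.

```python
OPTIMAL = "blue"   # data set within its optimal range
WARNING = "red"    # data set outside its optimal range

def classify(value, low, high):
    """Return the display color for one telemetry value."""
    return OPTIMAL if low <= value <= high else WARNING

def warning_list(readings, ranges):
    """Names of data sets outside their optimal range, for the
    fixed warning panel shown under the EVA time."""
    return [name for name, value in readings.items()
            if classify(value, *ranges[name]) == WARNING]
```

Keeping the classification separate from the rendering means the same check drives both the red highlight on the panel and the entries on the warning panel, so the two can never disagree.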


On the Task Selection Panel are tabs indicating which tasks are available for the user to complete. Once the user selects the desired tab using the one-finger click gesture or says “Start [Name of Procedure],” the Instruction Panel appears.


On the Instruction Panel, the title includes the name of the procedure and pagination. The content of the panel includes the step number, instructions, an animation on how to complete the step, directional buttons, and a schematic button. Users can use voice commands such as “Next,” “Previous,” and “Open/Hide Schematic” to select the designated buttons.
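The navigation described above amounts to a tiny state machine: a current step index, clamped to the procedure's bounds, plus a schematic toggle. A minimal Python sketch, with illustrative names only (the build itself is Unity/C#):

```python
class InstructionPanel:
    """Sketch of step navigation driven by the voice commands
    "Next", "Previous", and "Open/Hide Schematic"."""

    def __init__(self, procedure, steps):
        self.procedure = procedure
        self.steps = steps           # list of instruction strings
        self.index = 0               # current step, 0-based
        self.schematic_open = False

    def title(self):
        # Procedure name plus pagination, e.g. "Procedure Name (2/5)"
        return f"{self.procedure} ({self.index + 1}/{len(self.steps)})"

    def command(self, phrase):
        if phrase == "Next":
            self.index = min(self.index + 1, len(self.steps) - 1)
        elif phrase == "Previous":
            self.index = max(self.index - 1, 0)
        elif phrase == "Open Schematic":
            self.schematic_open = True
        elif phrase == "Hide Schematic":
            self.schematic_open = False
        return self.title()
```

Clamping the index (rather than wrapping) means "Previous" on step one and "Next" on the last step are safe no-ops, which matters when voice recognition can misfire mid-procedure.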




Morgan H. McKie