
SLS Data Exchange (DEx)

Supporting the delivery of crucial data between NASA engineers

A few screenshots from SLS DEx.


This work contains a few space secrets (shh!), so I've redacted and abstracted a few things to tell the story.


Designing NASA's new Space Launch System (SLS) is no small task, and engineers and contractors from across the agency have to exchange and review all manner of models, drawings, and more with one another. Their process wasn't going as expected. Things weren't being delivered on time and it was difficult to get everyone to sign off, causing tension within the organization. I led the research, information architecture, and design to create DEx, a web application to address some of SLS' biggest challenges.

Since launch, we've halved the time it takes to complete the data exchange process.

Project details

My contribution

  • Product management
  • Information architecture
  • Interaction design
  • User research
  • Usability testing
  • Client relations
  • Facilitation and workshops


Since retiring the Space Shuttle, NASA has been hard at work on a new human spaceflight vehicle, the Space Launch System (SLS). To design and build the rocket, all manner of specialized engineers and contractors have to collaborate on, exchange, and review data artifacts with one another. These are things like environment models, trajectories, drawings, schematics, tables, and more. Each of these deliverables depends on one or more others, so it's important that data is completed and delivered on time. In addition, every deliverable needs to be signed off on by multiple parties.

To keep track of everything, the folks at SLS had come up with a process for requests, signatures, and deliveries. However, it was run by a single person inside an Excel spreadsheet of ever-increasing size and complexity. Nothing was working as expected. There were many bureaucratic silos and unclear requests. Things weren't getting delivered on time (if at all), and it took way too long to get everyone to sign off on things. While there was a process in theory, it wasn't really being followed, sowing discord, stress, and unhappiness throughout the SLS organization.

A screenshot of the spreadsheet SLS had been using to track everything.
SLS had been running everything from a single spreadsheet!

I led the user research, information architecture, and product design on SLS Data Exchange (DEx), a web application that addresses some of SLS' most critical issues.

Our goal was to make the process more efficient by breaking down key barriers, reducing stress for the organization.

The product: Mapping opportunities to features

Shared, web-based documents

Opportunity: Make it easier to locate and access material.

Solution: By providing all the data on a single, web-based platform rather than individual engineers' hard drives or scattered throughout email, we made it easier for people to find what they were looking for.

Three screenshots from our design system documentation.
Our Sketch library.


Opportunity: Reduce the time needed for sign off.

Solution: Web-based signatures that can be completed with the click of a button rather than running around searching for people to physically sign off on paper.

Linked information

Opportunity: Better communicate status and impacts.

Solution: By linking records together, users can easily see how each is related and know which items have an impact on others.
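The underlying idea can be sketched as a small dependency graph. This is a hypothetical illustration, not DEx's actual data model: the record names are made up, and the traversal simply walks the links to find everything downstream of a change.

```python
from collections import deque

# Hypothetical record links: each deliverable maps to the
# deliverables that depend on it (names are illustrative).
links = {
    "environment-model": ["trajectory"],
    "trajectory": ["loads-analysis", "flight-software-tables"],
    "loads-analysis": ["structural-drawings"],
    "flight-software-tables": [],
    "structural-drawings": [],
}

def downstream_impacts(record, links):
    """Breadth-first walk over the links to find every record
    affected, directly or indirectly, by a change to `record`."""
    impacted, queue = set(), deque(links.get(record, []))
    while queue:
        current = queue.popleft()
        if current not in impacted:
            impacted.add(current)
            queue.extend(links.get(current, []))
    return impacted

# A slip in the environment model ripples all the way down
# to the structural drawings.
print(sorted(downstream_impacts("environment-model", links)))
```

Surfacing that transitive chain in the UI is what lets an engineer see at a glance which items their late deliverable will hold up.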



Opportunity: Encourage collaboration.

Solution: Using comments, users can discuss an exchange in a single, public place so everyone involved in the process can see. This keeps people in the loop and reduces the load on people's mailboxes.


User interviews

After a kickoff with our main clients, we traveled to Marshall Space Flight Center (MSFC) in Huntsville, AL to do some interviews. I facilitated thirteen interviews with folks from different disciplines, elements, and roles in the process. Our research questions sought to uncover who was involved in the process, what it looked like from each participant's perspective, where the frustrations and pain points were, and anything that might already be working well.

Analyzing our notes

Rather than doing an affinity diagram, we tried using Airtable to document and classify our notes. We thought it would save us some time and allow us to quickly add data and make associations. My colleague Stephen and I combed through our raw notes and pulled out "nuggets" that could stand on their own. We tagged each note as we went in order to surface common trends. We moved quickly, and by the end a solid set of themes had emerged.

A screenshot of our Airtable instance, which we used to organize all our research notes.
We used Airtable to organize all our notes, and later to link all our data user stories.

Determining the most critical opportunities

We quantified our data in two ways to filter out the most important opportunities. The first was a simple bar chart showing how many times a theme was mentioned. Nothing was too surprising, but we now had evidence to back up some of the frustrations we heard about when the project began.

The second model we created was a "prevalence-severity matrix". I had used impact-effort matrices in past projects to help prioritize work, and wondered if we could use something similar to further narrow our opportunities. On this chart, we plotted how often a theme came up against how severe it was. The severity was, admittedly, a subjective measure based on the language our participants used and our previous domain knowledge. However, it offered a useful tool to get a sense of our top opportunities.
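The ranking logic behind a prevalence-severity matrix can be sketched in a few lines. The theme names and numbers below are invented for illustration; scoring by the product of the two axes is one common way to read off the "top-right corner" of such a matrix, not necessarily the exact method we used.

```python
# Illustrative themes with a mention count (prevalence) and a
# subjective 1-5 severity rating; all numbers are made up.
themes = [
    {"theme": "unclear requests",    "prevalence": 11, "severity": 4},
    {"theme": "late deliveries",     "prevalence": 9,  "severity": 5},
    {"theme": "slow signatures",     "prevalence": 8,  "severity": 4},
    {"theme": "scattered documents", "prevalence": 6,  "severity": 3},
]

# Rank by prevalence x severity: themes that are both common and
# painful float to the top of the opportunity list.
ranked = sorted(
    themes,
    key=lambda t: t["prevalence"] * t["severity"],
    reverse=True,
)

for t in ranked:
    print(t["theme"], t["prevalence"] * t["severity"])
```

Note how the product surfaces "late deliveries" above "unclear requests" even though the latter was mentioned more often, which is exactly the kind of judgment the matrix view makes visible.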

A bar chart showing how often each theme came up.
A matrix showing how often an issue came up versus how severe it was.
Two of the models we used to quantify our data.

Modeling the process

We wanted a clearer picture of the overall process, and wanted to be able to show our stakeholders where the biggest issues were happening. We pieced together everything we heard into a unified model and highlighted the breakdowns.

Our model of the original process, marking breakdowns.
Just one part of our model highlighting some of the breakdowns in the old process.

Workshopping project goals and design ideas

We traveled back out to Marshall, and I facilitated two workshops with our clients and a few other leaders from SLS. The first was to generate project goals, and the second was to explore design ideas.

For the goals workshop, we did a simple brainstorm on sticky notes. In the second exercise, we got them drawing! I wanted to make sure our findings and the goals of our stakeholders were aligned. If there were goals or design ideas that didn't map clearly to our findings, we could analyze them further. We also wanted to make the work participatory so folks felt included in the process. Making changes is a scary thing, so we worked hard to make sure everyone felt welcome in shaping the project.

Our affinity diagram of project goals created with our clients.
We brainstormed and grouped dozens of goals generated by our clients.

Roles and user stories

We divided our user community into a few main roles:

For each role, we wrote several user stories that combined everything we learned from our research and workshops. We used these user stories to scope the work for the first release.

An example user story.
Just one example user story we presented to our clients.

Information architecture

My team's flagship product, Mission Assurance System, is a customizable platform that fits into a lot of NASA use cases. For many projects, it's a way to rapidly meet our users' needs without spending too much time creating an entirely new piece of enterprise software. We decided that it would be a good fit for SLS as well.

Even though we were building using our platform, we still needed to define a logical information architecture. We drew content from all the artifacts we had collected (memos, the original Excel spreadsheet, and more) and our primary research in order to create our IA.

Our information architecture diagram.
Our information architecture diagram illustrating how we would structure the records in the system.

Creating a new process

In addition to creating an information architecture, we worked with our clients to draw out a new workflow. We defined what the responsibilities of each role would be in the new system.

A workflow for the new system showing the responsibilities of each role.
We created this workflow diagram to map out how each role would interact with the system.

Usability testing and validation

With our information architecture and new process workflow in hand, I created a prototype environment of our product to put in front of users. We took our third trip to Alabama to test and validate the interface. I facilitated seven sessions where we sought to validate some of our concepts and get some light usability feedback. We provided our participants a different set of scenarios depending on what role they played in the process.

What needed work

What went well


Reducing the number of days per cycle


Days saved per review

Since launch we've cut the time the average data exchange takes in half, from 30 days to just 15.


Days saved for all signatures

Within each cycle, we've seen the time it takes engineers to gather all their signatures fall from an average of eight days to just three.

Tremendous work! We might know how to do this by the time we fly.