Redesigning the Benefits Dashboard to Improve Usability

Client: Nebraska Department of Health & Human Services
My Role: UX Designer
Team: 1 UX Designer, 3 Developers, 1 Project Manager
Duration: Dec 2023 - Apr 2024 (4 months)

Project Context

The Benefits Dashboard is a web application developed by the Nebraska Department of Health and Human Services to help residents manage their federal and state benefits online.

Although the dashboard's design was nearly complete, stakeholders raised concerns about its usability. With the original designer leaving the team, I was brought in to reassess the user experience and implement any necessary enhancements without delaying the release timeline.

The Problem

With development on the Benefits Dashboard about to begin, stakeholders worried the existing design could frustrate users and make it harder for them to access their benefits smoothly.

Solution Preview

The video below shows the happy path a user might take when they access the Benefits Dashboard.

Users start by checking alerts for pending actions to avoid benefit delays, then review their benefit summary for an overview of active and pending programs.

Next, they can explore case details to verify associated people and information, confirm payment schedules and amounts, and review correspondence history and statuses.

Impact After a Year

  • 220,203 unique site visits
  • +20% applications processed
  • 100% released on time
  • $0 over budget
After the dashboard was released to production, I was able to obtain some metrics from stakeholders a year later. What stood out was the high traffic: of approximately 340,000 Nebraskans who receive state and federal benefits, 220,203 visited the dashboard. This means 65% of benefit recipients accessed their information through the platform.

Caseworkers also reported fewer support calls (though not formally measured), and the department processed 20% more applications compared to the previous year—roughly 32,754 more people receiving benefits faster.

Furthermore, the dashboard was released on time, required no additional enhancements, and stayed within budget, meeting the business's deadline and budget goals.

Designing With Constraints

Before diving into the redesign, I felt it was important to identify key constraints early on to understand how they would affect the project's outcome.
Upcoming Deadline
With a 4-month timeline, the redesign needed a quick turnaround.
No Major Visual Changes
Stakeholders had already approved some features, so a full redesign wasn't needed.
Strict Budget
Because of the proposal and budget approval process, major post-release enhancements were not an option.

Key Findings From Stakeholder Interviews

After understanding the constraints, my next step was to gather as much information as possible about the dashboard. Here were the key questions I asked stakeholders to get more insights.
❓ Why was the dashboard being built?
The Department receives many calls from users inquiring about their benefits. However, there is a limited number of caseworkers available to handle them. This has led to bottlenecks in processing new applications.
🎯 Who is the target audience?
Nebraska residents applying for benefits like healthcare or food assistance. This includes first-time applicants and returning users checking or updating their information.
⚠️ What are the challenges with the current solution?
Frustrating Interactions:
  • Stakeholders felt the UI was frustrating to use when trying to view details for multiple benefits.
  • They also had a hard time reading information from the data tables.
✅ What does success look like?
User Goal:
  • To easily check the status of their applications/benefits and complete any required steps.
Business Goal:
  • Streamline application processing to deliver benefits to residents faster.

Secondary Research Uncovered More Challenges

For secondary research, I reviewed earlier versions of the dashboard to understand its history and the reasoning behind certain design decisions. However, the last three versions showed only minor functional and visual updates.

When I followed up with stakeholders about this, I discovered that the designs had never been user tested. This was a significant gap, leaving critical questions about usability and user needs unanswered, and addressing it presented a major opportunity to improve both the design and user experience.

Outlining an Action Plan for the Solution

Based on the insights gathered, I wanted to create an action plan to streamline the design process. Since no user testing had been conducted, I began with a usability heuristic evaluation to identify major issues that a usability test would likely uncover. This approach allowed me to address obvious problems first, reserving user testing for solution refining later.
1. Usability Heuristic Evaluation: Conduct a heuristic evaluation to confirm stakeholder concerns and uncover any additional usability issues.

2. Design Solution: Design a solution addressing these concerns and present it to stakeholders for feedback and approval.

3. Test + Iterate: Conduct usability testing to refine the solution. Although not required, I felt it was essential to get unbiased feedback and identify any pain points that might have been overlooked.

4. Follow Up: Follow up post-release to gather operational metrics, assess improvements, and iterate as needed.

Creating a Component Library and Style Guide to Streamline Current and Future Projects

Before starting the heuristic evaluation, I reviewed the department’s existing applications to get a better sense of their overall branding and style. This uncovered significant inconsistencies in design across different applications.

Discussions with the engineering teams revealed that without general guidelines, maintaining a cohesive look was challenging since teams worked independently. To address this, I developed a style guide to help both developers and the business create a more consistent and unified experience.
Style Guide
While developing the style guide, I also noticed that a Figma component library was missing, even though design work had been in progress for a while. This created an opportunity to incorporate one that aligned with the engineering team’s React component library to improve consistency and efficiency.
Design System Component Library
Below are partial screenshots of selected elements from the component library.
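One common way to keep a Figma library and a React component library aligned is to share design tokens that both sides reference. The sketch below illustrates the idea in TypeScript; the token names and values are hypothetical, not the department's actual styles.

```typescript
// Hypothetical design tokens shared between the Figma library and the
// React components. Names and values are illustrative only.
export const tokens = {
  color: {
    primary: "#00567d", // illustrative brand blue
    alert: "#b32d2e",   // illustrative alert red
    surface: "#ffffff",
  },
  spacing: {
    sm: 8,
    md: 16,
    lg: 24,
  },
} as const;

// Helper that turns a spacing token into a CSS value, so components
// never hard-code pixel numbers.
export function space(key: keyof typeof tokens.spacing): string {
  return `${tokens.spacing[key]}px`;
}
```

Components would then import `tokens` and `space` instead of hard-coding values, so a change to the style guide propagates to every application that consumes the library.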

Main Usability Concerns and How They Were Resolved

While doing an initial review of the dashboard, I was able to identify several areas for improvement, with the biggest usability issue stemming from the mobile design being an afterthought. Since 60% of users accessed the application on mobile, improving the mobile experience became a top priority.

Below are some key usability issues I discovered.
Layout
Of the usability issues I found, my first focus was the layout, since redesigning it would have the biggest impact on mobile scalability. Here are the two options I explored.
New Global Menu
Moved 'Alerts' and 'Take Action on Your Case' into a new global navigation, making them accessible from anywhere once users are signed in. This improves navigational scalability as new features like 'Preferences' are introduced.

Also, Case Actions are links that can be accessed from the Landing Page, so they belong to a global menu rather than being limited to just the Benefits Dashboard.

Prioritizing Benefit Information Viewing
Expanded the Benefit Summary component and repositioned it to the top so users can get a high level overview of their programs before diving into details. Improves user focus.
👏 Pros: 
More streamlined viewing of information, scales well into mobile, improved information architecture for navigation.

👎 Cons: 
Significantly differs from the original layout, requires more development effort to implement the navigation bar consistently across applications.
Case Actions as a Side Menu
Redesigned 'Take Action on Your Case' as a side menu for improved discoverability and standardization.

Prioritizing Benefit Information Viewing
Expanded the Benefit Summary component and repositioned it to the top so users can get a high level overview of their programs before diving into details. Improves user focus.
👏 Pros: 
Less development effort required, slight improvement in usability.

👎 Cons: 
Doesn't scale as well once more features are added, alerts are not accessible when users navigate to other pages while logged in.

Stakeholders agreed that Layout 1 was better for long-term scalability; however, for the MVP, they went with Layout 2.

Usability
Below are some examples of the usability changes with before and after screenshots to demonstrate the improvements.

Content & Clarity

The dashboard’s copy was often too long, making it inefficient—especially on mobile, where space was limited. Below is an example of how I improved the copy to enhance clarity and usability.

New Pain Points Uncovered in User Testing

Taking all the usability issues I found in the heuristic evaluation, I designed a new solution and explained my design decisions to stakeholders. This made it easier for them to approve the design. Finally, it was time to user test the designs.

Since there were no resources to conduct formal testing, I took a scrappy approach and recruited three users through family and friends.

What Users Said

"What do these tab sections/columns mean?"
3 participants, while exploring benefit details

"There's a lot going on and I'm not sure where to start."
3 participants, while exploring benefit details

"When was this document request last updated?"
2 participants, while viewing correspondences


Refinements Driven by User Testing

When I presented my findings to stakeholders, they agreed that additional descriptions would be helpful, along with alerts to let users know when a request was last updated.
The dashboard shown in the training video is based on the MVP released in one of their test environments. Names and data are fictional.
A brief description of each tab, along with informational alerts like the Document Update Notice, was added to the dashboard.

Learnings

Navigating Complex Approval Processes
Navigating government approval processes can feel like a black box, with multiple teams across departments bringing their own legal requirements and biases. I learned that patience and active listening are key. By addressing concerns directly, often in smaller, more personal conversations rather than large meetings, and ensuring stakeholders felt heard—even if they didn’t agree with the final decision—I was able to foster collaboration and gain alignment.
The Value of Usability Testing
Even in the absence of a formal usability testing process, taking the initiative to conduct testing can make a significant impact. Usability testing provides concrete, user-centered evidence that resonates with stakeholders, even the most skeptical ones. I learned that the effort not only validates design decisions but also builds trust and credibility across teams.
Lack of a formal user testing process: The Department didn’t have a system in place for user testing, so I had to recruit users myself. This meant finding people, coordinating schedules, and making sure they felt comfortable sharing honest feedback. It took extra time and effort, but it was worth it to get the insights needed to improve the redesign.

Designing for ADA compliance as a legal requirement: With ADA compliance mandatory, we focused on ensuring our designs met accessibility standards. Tools like AXE were invaluable for checking color contrast ratios, legibility, and other key accessibility features. To support the dev team, I also created a guide for implementing keyboard tabbing, as it wasn't straightforward within the dashboard's features.
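Keyboard tabbing guidance for composite widgets like this often comes down to the "roving tabindex" pattern from the ARIA Authoring Practices: arrow keys move focus within the widget, while Tab leaves it. Below is a minimal, framework-agnostic sketch of the index arithmetic involved; the function name and behavior are illustrative, not taken from the actual guide.

```typescript
// Roving-tabindex sketch: given the currently focused item and a key press,
// return the index of the item that should receive focus next.
// Wraps at the ends, following common ARIA keyboard conventions.
type NavKey = "ArrowDown" | "ArrowUp" | "Home" | "End";

export function nextFocusIndex(
  current: number,
  itemCount: number,
  key: NavKey
): number {
  switch (key) {
    case "ArrowDown":
      return (current + 1) % itemCount;             // wrap last -> first
    case "ArrowUp":
      return (current - 1 + itemCount) % itemCount; // wrap first -> last
    case "Home":
      return 0;
    case "End":
      return itemCount - 1;
  }
}
```

In the DOM, the key handler would set `tabindex="0"` on the item at the returned index, `tabindex="-1"` on the others, and call `.focus()` on the new item, so that a single Tab press exits the widget cleanly.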

Design approval can be a 'black box': The approval process was often lengthy, involving multiple teams from different departments. To streamline it, I added documentation for each design decision, allowing solution architects to clearly explain the rationale behind the design and secure faster approval.


Instead, design reviews were handled by Business Analysts and the Operating Committee. The analysts made sure the program information was accurate, while the committee checked for compliance with state and federal policies, alignment with other program departments, and whether the interactions looked "right."