
National Science Foundation

A Case Study on Digital Accessibility Compliance

The Basics

During my time at the National Science Foundation (NSF), I worked on the agency's Proposal Management Service (PMS), a platform used by 50,000+ scientists and researchers to apply for grant funding.

 

My role in this project went beyond typical wireframing and prototyping. I led the integration of WCAG/Section 508 web accessibility standards into the design and development workflow, establishing a repeatable process for identifying, documenting, and resolving accessibility issues.

 

Ultimately, the success of this pilot is helping to inform a broader accessibility strategy that will be implemented across all NSF digital applications.

Image of a laptop displaying NSF's grant management homepage

My Role: Lead Accessibility Analyst; UX/UI Analyst

Project Timeline: 6 months

Tools: WAVE, NVDA, ChatGPT, Confluence

Other Team Members: Scrum Master/BA; Front-end developers; UI/UX analysts

Key Achievements

Identified 42 accessibility issues that were not detected by automated testing tools. Resolved 20 issues to date, including:

​

  • Enhanced browser title tags for better wayfinding and screen reader identification

  • Improved inline and page-level error messaging

  • Added descriptive table captions across the application

  • Streamlined navigation through complex, multi-page forms

Why Digital Accessibility Matters

In an increasingly digital world, equitable access to information is a fundamental right. According to the CDC, over 60 million U.S. adults live with a disability. Yet as of 2020, only about 2% of the top one million website home pages were free of detectable WCAG 2 failures (WebAIM Million Report).

​

As a federal agency, NSF is legally bound by Section 508 of the Rehabilitation Act, which incorporates WCAG 2.0 Level AA as its technical standard. By enabling screen reader support, improving keyboard navigation, and designing more thoughtful interactions, we empower every user to independently pursue research funding opportunities. This not only enhances usability but also promotes a more inclusive and equitable scientific community.

Establishing NSF's Accessibility Auditing Process

Before this initiative, accessibility checks at NSF were mostly limited to automated tools like SortSite and Google Lighthouse. While helpful, these tools often miss real-world usability issues that only manual testing can uncover.


1. Accessibility Audit Workflow

First, I conducted a first-pass audit of each page using WAVE, identifying basic markup issues such as color contrast, ARIA usage, alternative text on graphics, link behavior, and overall navigational structure.
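For illustration, two of the most common markup fixes a WAVE pass surfaces are missing alternative text and vague link text. A minimal before-and-after sketch (the file names, chart description, and link targets here are hypothetical, not actual PMS markup):

    <!-- Before: nothing for a screen reader to announce -->
    <img src="award-chart.png">
    <a href="/proposals/123">Click here</a>

    <!-- After: descriptive alt text and a self-describing link -->
    <img src="award-chart.png" alt="Bar chart of awards funded by directorate, FY 2023">
    <a href="/proposals/123">View proposal 123: Arctic Ice Core Study</a>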


I also manually tested each page with NVDA, a free screen reader for Windows, to evaluate the following (a sketch of two of these patterns appears after the list):

  • Reading order

  • Screen reader-only text

  • Table captions

  • Form legends and associated fieldsets

  • Hidden elements
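A minimal sketch of two of these patterns, using standard HTML techniques (the field names and the sr-only class are illustrative conventions, not NSF's actual markup):

    <!-- Fieldset/legend: NVDA announces "Project Duration" as group context
         for each radio button, and the radios are keyboard-navigable -->
    <fieldset>
      <legend>Project Duration</legend>
      <label><input type="radio" name="duration" value="12"> 12 months</label>
      <label><input type="radio" name="duration" value="24"> 24 months</label>
    </fieldset>

    <!-- Screen-reader-only text: read aloud by NVDA but hidden visually,
         typically via a CSS clip/position utility class -->
    <span class="sr-only">All currency amounts are in U.S. dollars.</span>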


2. Documentation & Backlog Management

Based on the audit, accessibility issues were documented directly in a Confluence-based backlog. Regular meetings were held with developers to prioritize items according to impact, time constraints, and technical feasibility. This ensured accessibility fixes were folded into ongoing functional enhancements as capacity allowed. Resolved issues were archived to track progress and maintain visibility.

Image of WAVE tool analyzing an NSF webpage
Image of Welcome to NVDA pop-up

Each issue included:

  • Description and accessibility area

  • JIRA ticket number

  • UX priority (assigned by me and 2 UX teammates)

  • Level of effort (LOE) from developers

  • Relevant screenshots

Image of a table listing accessibility issues by area, description, JIRA ticket, and UX priority

Accessibility Audit Results and Metrics

Over a 6-month pilot, I identified 42 accessibility issues that were not detected by automated testing tools. We prioritized and resolved 20 of the most impactful issues, including:

  • Enabling screen readers to announce both inline and page-level form errors

  • Ensuring radio buttons could be navigated using only the keyboard

  • Making browser title tags uniquely descriptive for multi-step forms (sketched below)

  • Adding table captions and ARIA labels to complex data tables
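As one concrete sketch of the title-tag fix, each step of a multi-page form gets a unique, self-locating browser title (the step names below are hypothetical):

    <!-- Before: every step of the form shared one generic title -->
    <title>Proposal Management Service</title>

    <!-- After: the title identifies the step for wayfinding and screen readers -->
    <title>Budget Justification (Step 3 of 5) - Proposal Management Service</title>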

​

These changes significantly improved the experience for screen reader users and reduced cognitive load for all users.

 

Page-Level Error Example

Before Audit:

  • Reads 'Alert' but not the content inside the alert message

  • Users cannot tell which errors are blocking submission

  • Reads out stray special characters and symbols

Image of NVDA speech viewer skipping alert banner content
Image of NVDA speech viewer reading alert banner content

After Audit:

  • Announces the error type and all content inside the alert

  • Announces when a link is embedded in the alert message (sketched below)
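The pattern behind this fix, sketched with a standard ARIA live region; role="alert" causes NVDA to announce the banner's full contents, including the embedded link, when it appears (the error text is illustrative, not the production markup):

    <!-- Announced in full when injected into the page -->
    <div role="alert">
      <h2>Your proposal could not be submitted</h2>
      <ul>
        <li>Budget Justification is missing.
            <a href="#budget-justification">Go to Budget Justification</a></li>
        <li>Project Summary exceeds 4,600 characters.</li>
      </ul>
    </div>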

Other Examples

Adding table captions

Image of NVDA speech viewer reading the submitted proposals table caption
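The underlying markup is a native caption element, which NVDA reads as the user enters the table (the rows below are hypothetical):

    <table>
      <caption>Submitted Proposals</caption>
      <thead>
        <tr>
          <th scope="col">Proposal ID</th>
          <th scope="col">Title</th>
          <th scope="col">Status</th>
        </tr>
      </thead>
      <tbody>
        <tr><td>2401234</td><td>Arctic Ice Core Study</td><td>Under Review</td></tr>
      </tbody>
    </table>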

Clarifying field-level errors

Image of NVDA speech viewer reading an inline error for special characters on a text field
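A sketch of the inline-error pattern, assuming the common aria-describedby/aria-invalid approach (the field and message text are illustrative):

    <label for="project-title">Project Title</label>
    <input type="text" id="project-title"
           aria-invalid="true" aria-describedby="project-title-error">
    <!-- NVDA reads this message along with the label when the field is focused -->
    <span id="project-title-error">
      Error: Special characters such as &amp; and % are not allowed.
    </span>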

Challenges and How I Solved Them

Every project comes with its own challenges, but this one presented some major hurdles that ultimately made me a better user advocate and problem solver.

​

1. Complexity of Existing UI Elements

Certain components, like dynamic tables and nested form fields, presented significant accessibility challenges. Several issues co-existed and overlapped, making it hard to pinpoint root causes.

How I Problem Solved

  • Took a W3C online accessibility course to better understand compliance benchmarks

  • Leveraged resources like ChatGPT, YouTube tutorials, and WCAG documentation

  • Ran tests with screen readers to evaluate practical implementation

2. Lack of Leadership Guidance

NSF had no established standards for integrating accessibility considerations into existing workflows, leaving each team to define its own approach.

How I Problem Solved

  • Adopted WCAG 2.1 AA as the baseline

  • Focused efforts on screen reader text, color contrast, and keyboard operability

  • Led ad-hoc meetings with front-end developers to prioritize "quick wins" using the Confluence tracker

3. Balancing Accessibility & Functionality

Time constraints meant accessibility activities often competed with functional work.

How I Problem Solved

  • Embedded accessibility into my day-to-day wireframing process

  • Maintained a living document via Confluence that tracked progress and ensured nothing was lost

Lessons Learned

Accessibility should be baked into the design process, not bolted on later
Retrofitting compliance is expensive and time-consuming. Building with accessibility in mind from the beginning ensures better outcomes for all users and avoids having to build things twice.

Automated tools are not enough
While tools like WAVE, SortSite, and Google Lighthouse are helpful, they can both miss critical issues and flag non-issues. Manual testing is essential to gauge real usability.

Accessibility is a deep, specialized skill
I learned just how much depth there is to this field. As the need for accessibility continues to grow, teams must be equipped with the knowledge and skills to integrate it into both design and development.

Final Thoughts

This pilot laid the groundwork for systemic accessibility improvements across NSF’s digital ecosystem. By combining thoughtful design, methodical testing, and collaborative implementation, we made real progress toward an inclusive, equitable web experience for the nation's research community. Digital accessibility isn’t just a box to check: it’s a commitment to equal opportunity, independent access, and universal design.
