How do you audit and test websites for accessibility issues?

Explore automated tools, manual testing, and user sessions with assistive tech for accessibility.
Learn a structured accessibility workflow: automated scans, manual audits, and usability testing with AT users.

Answer

I combine automated tools, manual reviews, and assistive tech testing. Automated scanners (axe, Lighthouse, WAVE) catch common WCAG violations. Manual testing checks keyboard navigation, color contrast, landmarks, and semantic HTML. I run screen readers (NVDA, JAWS, VoiceOver) to ensure usable flow, plus speech recognition and switch devices when possible. Finally, user testing with people with disabilities validates real-world accessibility. Issues are tracked, prioritized, and retested after fixes.

Long Answer

Auditing accessibility is not a one-step scan; it requires multiple complementary approaches. Automated tools catch patterns, manual checks validate usability, and user testing ensures inclusivity. My workflow layers these methods for coverage and accuracy.

1) Automated testing for fast detection

Automated scanners like axe DevTools, Lighthouse, WAVE, and Pa11y are my first pass. They detect missing alt text, ARIA misuse, unlabeled inputs, poor contrast, and invalid HTML. Integrated into CI/CD, they prevent regressions by failing builds when new violations appear. Still, these tools cover only ~30–40% of WCAG issues, so I treat them as triage, not full audits.
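
To show how these scanners slot into a CI pipeline, here is a minimal sketch using Playwright with the @axe-core/playwright package; the URL, tag filters, and test name are placeholders rather than a fixed setup.

  // a11y.spec.ts: a minimal sketch of an automated axe scan in CI.
  // Assumes @playwright/test and @axe-core/playwright are installed; the URL is a placeholder.
  import { test, expect } from '@playwright/test';
  import AxeBuilder from '@axe-core/playwright';

  test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
    await page.goto('https://example.com');

    // Run axe-core against the rendered page, limited to WCAG 2.x A/AA rules.
    const results = await new AxeBuilder({ page })
      .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
      .analyze();

    // Failing the build on any violation stops regressions from merging.
    expect(results.violations).toEqual([]);
  });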

2) Manual expert review

Human inspection finds nuanced issues. I test keyboard navigation to ensure tab order is logical, focus states are visible, and no keyboard traps block progression. I validate headings for hierarchy, ARIA landmarks for screen reader navigation, and form labels for clarity. I check color contrast with tools like Colour Contrast Analyser, including hover/focus states. Motion and animations are reviewed against WCAG 2.2.2 (Pause, Stop, Hide). I also test zoom/responsiveness at 200–400% to confirm reflow compliance.
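
During manual reviews I sometimes paste small helpers into the browser console to speed up the checks above; the sketch below shows two such helpers, with illustrative function names rather than any standard API.

  // audit-helpers.ts: console helpers that support a manual review (illustrative, not a standard API).
  export function logHeadingOutline(): void {
    // Print the heading hierarchy so skipped levels (e.g. h2 -> h4) stand out.
    document.querySelectorAll<HTMLElement>('h1, h2, h3, h4, h5, h6').forEach((h) => {
      const level = Number(h.tagName[1]);
      console.log(`${'  '.repeat(level - 1)}${h.tagName}: ${h.textContent?.trim()}`);
    });
  }

  export function logTabStops(): void {
    // List elements that participate in keyboard tabbing, in DOM order,
    // as a cross-check while walking the page with the Tab key.
    const selector = 'a[href], button, input, select, textarea, [tabindex]:not([tabindex="-1"])';
    document.querySelectorAll<HTMLElement>(selector).forEach((el, i) => {
      console.log(i + 1, el.tagName.toLowerCase(), el.getAttribute('aria-label') ?? el.textContent?.trim());
    });
  }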

3) Assistive technology testing

I run actual screen readers—NVDA and JAWS on Windows, VoiceOver on macOS/iOS, TalkBack on Android. I check that page titles, headings, regions, and links are announced meaningfully. For dynamic content, I validate ARIA live regions. I also test speech recognition (Dragon NaturallySpeaking) to confirm interactive controls respond to voice commands. For motor accessibility, I simulate switch control and keyboard-only users. This uncovers issues automation never catches.
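
When I validate ARIA live regions with a screen reader, the pattern under test usually looks something like the sketch below; the element, class name, and wording are assumptions for illustration.

  // live-region.ts: a sketch of the live-region pattern verified with screen readers.
  // The 'visually-hidden' class is assumed to exist as an off-screen utility style.
  const status = document.createElement('div');
  status.setAttribute('role', 'status');      // role="status" implies polite announcements
  status.setAttribute('aria-live', 'polite'); // announce without interrupting the user
  status.className = 'visually-hidden';
  document.body.appendChild(status);

  export function announce(message: string): void {
    // Clearing first, then setting on the next frame, helps repeated identical
    // messages re-announce in most screen reader and browser pairings.
    status.textContent = '';
    requestAnimationFrame(() => {
      status.textContent = message;
    });
  }

  // Example: announce('3 items added to your cart');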

4) User testing with people with disabilities

The most valuable insights come from real users with assistive tech. Collaborating with blind, low-vision, mobility-impaired, or neurodiverse testers uncovers barriers that no checklist covers. For example, screen reader users may find a workflow technically accessible but practically inefficient. Feedback loops from these sessions feed into design iterations.

5) Issue tracking and prioritization

All issues are logged with severity (blocker, major, minor) and mapped to WCAG levels (A, AA, AAA). I document reproduction steps, expected behavior, and suggested fixes. This ensures developers can remediate quickly. After fixes, regression testing repeats automated scans, manual audits, and AT checks.
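
As one possible way to keep findings consistent, the sketch below shows a lightweight record shape with severity and WCAG mapping; the field names and example values are my own convention, not a formal standard.

  // finding.ts: a sketch of one possible shape for logged findings.
  type Severity = 'blocker' | 'major' | 'minor';
  type WcagLevel = 'A' | 'AA' | 'AAA';

  interface AccessibilityFinding {
    id: string;
    title: string;
    severity: Severity;
    wcagCriterion: string; // e.g. "1.3.1 Info and Relationships"
    wcagLevel: WcagLevel;
    stepsToReproduce: string[];
    expectedBehavior: string;
    suggestedFix: string;
    status: 'open' | 'fixed' | 'retested';
  }

  // Hypothetical example entry for an unlabeled checkout field.
  const example: AccessibilityFinding = {
    id: 'A11Y-042',
    title: 'Checkout email field has no programmatic label',
    severity: 'blocker',
    wcagCriterion: '3.3.2 Labels or Instructions',
    wcagLevel: 'A',
    stepsToReproduce: ['Open /checkout', 'Tab to the email input', 'Listen with NVDA'],
    expectedBehavior: 'NVDA announces "Email, edit, required".',
    suggestedFix: 'Associate the visible label via for/id or aria-labelledby.',
    status: 'open',
  };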

6) Continuous monitoring

Accessibility is not a one-time effort. I integrate testing into CI/CD, set up scheduled audits, and train teams to code inclusively from the start. I use monitoring tools like Siteimprove for ongoing compliance tracking.
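
For scheduled audits, a small script against Pa11y's Node API can run nightly from any scheduler; the sketch below assumes Pa11y is installed and uses placeholder URLs.

  // scheduled-audit.ts: a sketch of a nightly scan using Pa11y's Node API.
  import pa11y from 'pa11y';

  const pages = ['https://example.com/', 'https://example.com/checkout']; // placeholders

  async function run(): Promise<void> {
    let total = 0;
    for (const url of pages) {
      // Pa11y defaults to the WCAG2AA standard; each issue carries a rule code,
      // a human-readable message, and a CSS selector for the offending node.
      const results = await pa11y(url);
      total += results.issues.length;
      for (const issue of results.issues) {
        console.log(`${url}\n  [${issue.code}] ${issue.message}\n  at ${issue.selector}`);
      }
    }
    // A non-zero exit code lets the scheduler mark the run as failed.
    if (total > 0) process.exit(1);
  }

  run();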

In essence, a solid audit blends automation, manual review, and lived experience. Automated scans find patterns, manual checks validate usability, and user testing ensures inclusivity for real people.

Table

Method          | Examples/Tools                      | Purpose                                  | Coverage
Automated Scans | axe, Lighthouse, WAVE, Pa11y        | Catch common WCAG issues early           | ~30–40% of issues
Manual Review   | Keyboard nav, contrast, ARIA checks | Validate usability and semantic markup   | Nuanced issues
Screen Readers  | NVDA, JAWS, VoiceOver, TalkBack     | Ensure logical flow, ARIA announcements  | Critical flows
Other AT        | Dragon, switch control              | Test voice/motor accessibility           | Broader access
User Testing    | Disabled testers + AT               | Real-world validation, efficiency        | Highest impact

Common Mistakes

  • Relying only on automated tools, assuming 100% coverage.
  • Ignoring keyboard-only navigation and focus order.
  • Misusing ARIA roles and attributes, leading to screen reader confusion.
  • Missing color contrast in hover/focus states or images with text.
  • Not testing with real assistive tech users.
  • Failing to document issues with clear reproduction steps.
  • Running one-time audits instead of continuous testing.
  • Overlooking mobile accessibility on iOS/Android.

Sample Answers

Junior:
“I run Lighthouse and axe to catch missing alt text and contrast errors. I also test keyboard navigation to make sure everything is reachable.”

Mid:
“I combine automated scans with manual checks: headings, ARIA landmarks, and form labels. I test with NVDA and VoiceOver to confirm screen reader flow. Issues are logged by severity and retested after fixes.”

Senior:
“My approach layers automation, manual review, and user testing. Automated tools flag common errors; manual checks validate focus order, reflow, and semantics. I run multiple screen readers plus speech/motor AT. Finally, I involve users with disabilities for real validation. Issues are prioritized by WCAG severity, tracked, and retested. Accessibility is built into CI/CD for continuous compliance.”

Evaluation Criteria

Look for layered answers: automation + manual + user testing. Strong candidates mention specific tools (axe, Lighthouse, NVDA, VoiceOver), manual methods (keyboard, color contrast, ARIA), and real user sessions. They should highlight prioritization, WCAG mapping, and continuous monitoring. Red flags: relying only on Lighthouse, ignoring assistive technologies, or treating accessibility as a one-off. The best answers show empathy, technical rigor, and integration into development pipelines.

Preparation Tips

  • Run axe and Lighthouse on a demo site.
  • Practice keyboard-only navigation and document focus traps.
  • Use Colour Contrast Analyser for text vs background.
  • Learn VoiceOver gestures on Mac/iOS and NVDA commands on Windows.
  • Test a form with screen readers to check labels and error announcements.
  • Try Dragon voice commands on interactive elements.
  • Partner with accessibility testers or organizations.
  • Map findings to WCAG 2.1 and document remediation steps.
  • Be ready with a 60-second explanation of your layered audit process.

Real-world Context

A university site passed Lighthouse but remained unusable for blind students—forms lacked proper labels. Manual screen reader testing revealed the issue. After fixes, form completion success rose 70%. A retailer ignored focus states; keyboard users couldn’t check out. Adding visible focus outlines and ARIA roles solved it. A healthcare portal missed color contrast in hover menus, failing older users. After adjustments, support tickets dropped. In all cases, layered audits—automation + manual + user testing—proved essential.

Key Takeaways

  • Use automated tools for quick coverage, but never rely solely on them.
  • Manual checks catch focus, semantics, and contrast issues.
  • Test with screen readers, speech, and switch devices.
  • Validate with real users with disabilities.
  • Integrate accessibility into CI/CD for continuous compliance.

Practice Exercise

Scenario:
You’re tasked with auditing an e-commerce site’s checkout flow for accessibility.

Tasks:

  1. Run axe and Lighthouse to flag initial WCAG issues.
  2. Test keyboard navigation: confirm tab order, focus states, and no traps (see the sketch after this list).
  3. Validate color contrast on all buttons and labels with a contrast analyser.
  4. Use NVDA or VoiceOver to complete checkout, ensuring form labels and error messages are announced.
  5. Test with Dragon speech commands and switch control for motor accessibility.
  6. Recruit a blind tester to attempt checkout; gather feedback.
  7. Document all issues with severity, WCAG references, and remediation suggestions.
  8. Retest after fixes, confirming improvements with automation + manual + user testing.
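
For task 2, a hedged starting point in Playwright might look like the sketch below; the selectors and expected tab order are placeholders for the site under audit, not a known checkout implementation.

  // checkout-keyboard.spec.ts: a sketch for task 2 (keyboard navigation).
  import { test, expect } from '@playwright/test';

  test('checkout controls are reachable in the expected tab order', async ({ page }) => {
    await page.goto('https://example.com/checkout'); // placeholder URL

    // Placeholder selectors: replace with the real controls, in visual order.
    const expectedOrder = ['#email', '#shipping-address', '#card-number', '#place-order'];

    for (const selector of expectedOrder) {
      await page.keyboard.press('Tab');
      // The newly focused element should be the next expected control.
      await expect(page.locator(selector)).toBeFocused();
    }
  });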

Deliverable:
A documented accessibility audit report showing automated results, manual checks, AT findings, and real user feedback, with prioritized remediation guidance.
