How do you maintain accessibility during continuous development?
Web Accessibility Specialist
Answer
Maintaining accessibility at scale requires shifting left and continuous validation. I embed a11y checks in CI/CD pipelines, enforce semantic HTML and ARIA patterns, and standardize components in design systems. Content authors use accessible templates and editors with linting. Regression tests run via axe-core, Lighthouse CI, and screen readers in QA. Governance includes accessibility champions, training, and audit dashboards, so new features meet WCAG and regressions are caught early.
Long Answer
Accessibility in large-scale environments is not a one-time audit but an ongoing governance practice across development, design, and content teams. My strategies integrate accessibility into every layer of the lifecycle—planning, coding, testing, deployment, and maintenance—so regressions are minimized and user experience remains inclusive.
1) Shift-left accessibility in development
I ensure accessibility is part of the earliest dev cycles. Static analysis tools such as eslint-plugin-jsx-a11y run in the pipeline, alongside page-level checkers like pa11y. Developers use semantic HTML first and reach for ARIA only when strictly necessary. Code review checklists include a11y items: alt text, keyboard navigation, color contrast, and focus order.
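As a concrete illustration, a minimal lint gate might look like the sketch below. It assumes a flat-config ESLint setup with eslint-plugin-jsx-a11y installed; the chosen rules and severities are examples, not a prescribed policy.

```ts
// eslint.config.js — minimal sketch of a shift-left a11y lint gate (flat config).
// Assumes eslint-plugin-jsx-a11y is installed; rule selection is illustrative.
import jsxA11y from "eslint-plugin-jsx-a11y";

export default [
  {
    files: ["**/*.{jsx,tsx}"],
    plugins: { "jsx-a11y": jsxA11y },
    rules: {
      "jsx-a11y/alt-text": "error",                     // images need alt text
      "jsx-a11y/anchor-is-valid": "error",              // no href="#" pseudo-links
      "jsx-a11y/label-has-associated-control": "error", // form fields need labels
      "jsx-a11y/no-autofocus": "warn",                  // autofocus disrupts keyboard/SR users
    },
  },
];
```

Running this in the same pipeline stage as type checks means accessibility feedback arrives before code review, not after.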
2) Design systems and component libraries
Reusable, pre-tested components enforce accessibility at scale. A design system with tokens for contrast, spacing, and motion ensures compliance with WCAG. Components (buttons, modals, dropdowns) ship with baked-in focus management, ARIA labeling, and reduced-motion variants. Teams inherit accessibility by default instead of reinventing it.
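One way to make that inheritance concrete is a shared motion-preference hook, sketched below assuming a React-based component library; the hook name is illustrative.

```ts
// usePrefersReducedMotion.ts — minimal sketch of a shared design-system hook.
// Components consult it before enabling non-essential animation.
import { useEffect, useState } from "react";

export function usePrefersReducedMotion(): boolean {
  const [reduced, setReduced] = useState(false);

  useEffect(() => {
    const query = window.matchMedia("(prefers-reduced-motion: reduce)");
    setReduced(query.matches); // initial value on mount

    const onChange = (event: MediaQueryListEvent) => setReduced(event.matches);
    query.addEventListener("change", onChange); // react to OS-level preference changes
    return () => query.removeEventListener("change", onChange);
  }, []);

  return reduced;
}
```

A carousel or modal built on the design system can skip auto-advancing or slide transitions whenever the hook returns true, so every team gets motion comfort without re-implementing the check.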
3) Continuous integration & automated testing
Accessibility tooling is integrated into CI/CD pipelines. I use axe-core and Lighthouse CI for automated checks on every pull request. Tests verify headings, labels, ARIA roles, contrast, and keyboard focus. Failures block merges. Snapshot tests catch regressions in semantic markup. Screen reader scripts (NVDA, VoiceOver) are run in QA cycles.
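A per-pull-request gate could look like the sketch below, assuming a Playwright test suite with @axe-core/playwright; the route and tag list are placeholders.

```ts
// a11y.spec.ts — sketch of an axe-core gate run on every pull request.
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("checkout page has no WCAG A/AA violations", async ({ page }) => {
  await page.goto("/checkout"); // hypothetical route in the preview deployment

  const results = await new AxeBuilder({ page })
    .withTags(["wcag2a", "wcag2aa"]) // limit the scan to WCAG A/AA rules
    .analyze();

  // An empty violations array is the merge condition; anything else fails the check.
  expect(results.violations).toEqual([]);
});
```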
4) Content workflows
Content editors often introduce accessibility risks. I implement CMS workflows with accessible templates, alt-text reminders, and Markdown/WYSIWYG editors that lint heading order and link labels. Training for editors ensures PDFs, captions, and transcripts meet standards.
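A pre-publish content lint might resemble the sketch below; the function and its checks are illustrative rather than a specific CMS plugin API.

```ts
// contentLint.ts — illustrative pre-publish checks a CMS workflow could run on authored HTML.
export function lintContent(doc: Document): string[] {
  const issues: string[] = [];

  // Every image needs an alt attribute (alt="" is reserved for decorative images).
  doc.querySelectorAll("img:not([alt])").forEach((img) => {
    issues.push(`Missing alt attribute: ${img.outerHTML.slice(0, 60)}`);
  });

  // Heading levels must not skip (e.g. an h2 followed directly by an h4).
  let previous = 0;
  doc.querySelectorAll("h1, h2, h3, h4, h5, h6").forEach((heading) => {
    const level = Number(heading.tagName[1]);
    if (previous && level > previous + 1) {
      issues.push(`Heading level skips from h${previous} to h${level}`);
    }
    previous = level;
  });

  // Link text must make sense out of context.
  doc.querySelectorAll("a").forEach((anchor) => {
    const text = anchor.textContent?.trim().toLowerCase() ?? "";
    if (text === "" || ["click here", "here", "read more"].includes(text)) {
      issues.push(`Vague or empty link text: "${text}"`);
    }
  });

  return issues;
}
```

Surfacing these issues in the editor before publish keeps the fix with the author instead of landing as a post-release bug.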
5) Governance and champions
Accessibility champions are embedded in each team to advocate and review. Dashboards track a11y regressions, passing vs failing checks, and coverage against WCAG criteria. Quarterly audits validate automated tooling against manual testing. Leadership receives reports tied to KPIs—e.g., % of templates passing audits.
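The data feeding those dashboards can stay simple; the per-release snapshot below is an illustrative sketch, not a fixed schema.

```ts
// a11yMetrics.ts — illustrative shape for dashboard data; field names are assumptions.
interface A11ySnapshot {
  release: string;                 // e.g. "2024.10.2"
  pagesScanned: number;            // pages covered by automated scans
  pagesPassing: number;            // pages with zero axe violations
  violationsByImpact: Record<"critical" | "serious" | "moderate" | "minor", number>;
}

export function passRate(snapshot: A11ySnapshot): number {
  return snapshot.pagesScanned === 0
    ? 0
    : (100 * snapshot.pagesPassing) / snapshot.pagesScanned;
}

export function regressionDelta(current: A11ySnapshot, previous: A11ySnapshot): number {
  // Positive value means the pass rate dropped since the previous release.
  return passRate(previous) - passRate(current);
}
```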
6) Monitoring and regression prevention
Synthetic tests run regularly to check color contrast, focus visibility, and ARIA coverage on live pages. Browser devtools audits (Lighthouse) and accessibility-tree inspection confirm correctness. Errors trigger alerts in the same channels as performance or uptime issues, treating accessibility as a first-class metric.
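A scheduled synthetic check might look like the sketch below, assuming pa11y as the scanner; the URLs and the alerting hook are placeholders.

```ts
// syntheticA11yCheck.ts — sketch of a recurring scan of live pages (assumes pa11y).
import pa11y from "pa11y";

const PAGES = ["https://example.com/", "https://example.com/pricing"]; // illustrative URLs

async function runSyntheticA11yCheck(): Promise<void> {
  for (const url of PAGES) {
    const result = await pa11y(url, { standard: "WCAG2AA" });

    if (result.issues.length > 0) {
      // In practice this would post to the same alerting channel as uptime/performance checks.
      console.error(`${url}: ${result.issues.length} accessibility issue(s)`);
      console.error(result.issues[0].message);
      process.exitCode = 1; // non-zero exit lets the scheduler flag the run as failed
    }
  }
}

runSyntheticA11yCheck().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});
```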
7) Inclusive content updates and releases
When shipping new features, I pair manual exploratory testing with users of assistive tech (screen readers, switch devices). Pre-release test scripts check not only compliance but usability—tab order, error feedback, motion comfort. This ensures accessibility goes beyond technical minimums.
8) Documentation and knowledge sharing
Playbooks guide developers and editors: how to write alt text, use headings, structure forms, and avoid motion triggers. A11y tickets are prioritized as bugs, not “nice-to-haves.” Teams learn from regressions through postmortems.
Summary: To maintain accessibility at scale, I embed automated checks, accessible components, editor workflows, champion networks, and governance dashboards. Accessibility becomes part of continuous delivery—not a bolt-on—so large sites remain inclusive during constant change.
Common Mistakes
Teams often treat accessibility as a one-off audit instead of a continuous practice. They rely only on automated tools, missing real usability problems like keyboard traps or unclear focus. Developers misuse ARIA to “patch” issues instead of using semantic HTML. Content teams forget alt text or headings, creating regressions. Design systems lack baked-in accessibility, so teams rebuild flawed components repeatedly. CI/CD pipelines skip a11y gates, making late fixes expensive. Training is ignored, so new hires repeat old mistakes. Another pitfall is tracking accessibility separately from quality metrics, so regressions aren’t visible to leadership. Finally, failing to test with assistive tech users leads to technically compliant but practically unusable experiences.
Sample Answers (Junior / Mid / Senior)
Junior:
“I rely on semantic HTML and test pages with a screen reader. I also run axe in the browser to catch missing alt text or labels before release. Content updates get reviewed for headings and links.”
Mid:
“I add a11y linting and axe-core tests into CI/CD so regressions fail fast. Our design system has accessible components with proper roles and focus. Content editors use templates that enforce alt text. We run QA with VoiceOver and NVDA before releases.”
Senior:
“I manage accessibility as governance: automated checks in pipelines, champion networks in teams, and dashboards tracking WCAG coverage. Our design system provides tokenized contrast and reduced-motion options. All features ship with reduced-motion support and AT-tested flows. We train editors, run quarterly audits, and tie results to business KPIs. Accessibility is continuous, not episodic.”
Evaluation Criteria
Interviewers expect layered strategies: shift-left dev practices, design systems with accessible components, CI/CD integration, content editor guardrails, manual + AT testing, and governance dashboards. Strong answers highlight both automation (axe-core, Lighthouse CI) and human validation (NVDA, VoiceOver, keyboard flows). They should emphasize preventing regressions via linting, tokens, and CI/CD gates. Senior candidates should mention embedding accessibility in design systems, cross-team champions, and leadership reporting. Weak answers only say “we check with Lighthouse” or “we test with screen readers sometimes.” A standout answer frames accessibility as a continuous quality practice, tied to metrics and culture, not an afterthought.
Preparation Tips
Set up a demo site and integrate ESLint a11y rules, axe-core, and Lighthouse CI into your pipeline. Build a small design system with accessible button and modal components using semantic roles and focus traps. Create a CMS template requiring alt text and heading hierarchy. Run manual QA: tab through flows, use VoiceOver/NVDA, and enable reduced motion. Write an audit report and link it to metrics like error rates and accessibility coverage. Prepare a 60–90 second pitch: automation + design system + editor workflows + AT testing + governance. Use a real example, such as reducing regressions by 40% after adding axe-core gates in CI.
Real-world Context
At a large publisher, accessibility issues spiked during weekly content updates. By adding CMS plugins that enforced alt text and heading rules, they cut regressions by 60%. Another SaaS company embedded axe-core in GitHub Actions; accessibility bugs dropped in staging by half, freeing QA. A fintech built an accessible design system with WCAG contrast tokens and focus traps; product teams shipped faster with fewer bugs. Quarterly audits at a healthcare platform revealed hidden regressions; dashboards showed leadership the ROI of accessibility investments. In all cases, continuous practices—design system tokens, CI/CD gates, editor training—proved essential for sustaining accessibility under constant releases.
Key Takeaways
- Shift-left with semantic HTML, linting, and ARIA best practices.
- Bake accessibility into design systems and tokens.
- Run CI/CD a11y gates with axe-core/Lighthouse CI.
- Enforce accessible content templates in CMS.
- Combine automation with AT testing and governance dashboards.
Practice Exercise
Scenario: You work on a large e-commerce site with frequent releases. Accessibility regressions have been slipping into production.
Tasks:
- Add ESLint a11y and axe-core integration in CI/CD. Fail builds on missing alt text, labels, or color contrast.
- Create a design-system button with semantic roles, focus management, and a reduced-motion option. Store contrast tokens.
- Add CMS plugin requiring alt text, heading order, and accessible link text before publishing.
- Run QA flows with NVDA and VoiceOver. Validate keyboard navigation, focus states, and error feedback in forms.
- Build a dashboard tracking accessibility coverage and regression trends over releases. Report to leadership.
- Train editors and devs: write a playbook for alt text, headings, ARIA usage.
- Simulate a regression: a new carousel without focus management. Catch it via CI/CD, fix via design system component, and retest.
Deliverable: A 90-second walkthrough explaining how automation, design systems, CMS guardrails, manual AT testing, and governance reduce regressions while sustaining accessibility in a high-change environment.

