How AccessPulse works
The scan pipeline
When you submit a URL, here is exactly what happens:
- Browser launch. We spin up an isolated Chromium instance in a sandboxed container. No state is shared between scans.
- Page load. Chromium navigates to your URL and waits for DOMContentLoaded, then pauses for 2 seconds to let client-side JavaScript render. This matters for SPAs — a static HTML scraper would miss dynamically rendered content.
- axe-core injection. We inject axe-core v4.10, the open-source accessibility testing engine created and maintained by Deque Systems. axe-core is used by Google Lighthouse, Microsoft Accessibility Insights, and most enterprise accessibility platforms.
- Rule execution. axe-core runs its full ruleset against the rendered DOM. We test against WCAG 2.0 A/AA, WCAG 2.1 A/AA, WCAG 2.2 AA, and additional best practices — more than 90 rules covering color contrast, ARIA usage, form labels, heading structure, keyboard navigation, and more.
- Results processing. We parse axe-core's output, compute a weighted accessibility score, and structure the violations by severity (critical, serious, moderate, minor).
- Cleanup. The browser instance is destroyed. We do not store page content, screenshots, or cookies — only the accessibility report.
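The results-processing step can be sketched as follows. This is an illustrative sketch, not AccessPulse's actual code — the function name and the trimmed `sample` payload are ours, though the `violations`/`impact`/`id` fields mirror axe-core's real output shape:

```python
from collections import defaultdict

# Severity levels axe-core reports, ordered from most to least severe.
SEVERITIES = ["critical", "serious", "moderate", "minor"]

def group_violations(axe_results: dict) -> dict:
    """Bucket axe-core violations by their 'impact' field."""
    buckets = defaultdict(list)
    for violation in axe_results.get("violations", []):
        impact = violation.get("impact") or "minor"
        buckets[impact].append(violation["id"])
    # Emit buckets in a fixed severity order for stable reports.
    return {sev: buckets[sev] for sev in SEVERITIES}

# A heavily trimmed example of axe-core's output shape.
sample = {
    "violations": [
        {"id": "label", "impact": "critical"},
        {"id": "color-contrast", "impact": "serious"},
        {"id": "heading-order", "impact": "moderate"},
    ]
}
print(group_violations(sample)["critical"])  # ['label']
```

Grouping by severity up front is what makes the weighted scoring step below it a simple sum.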
What we test
AccessPulse evaluates your page against WCAG 2.2 Level AA success criteria that can be verified programmatically. This includes:
- Perceivable: Images have alt text, color contrast meets 4.5:1 ratio for normal text and 3:1 for large text, text can be resized to 200%, content adapts to different viewports.
- Operable: All interactive elements are keyboard accessible, focus indicators are visible, no keyboard traps exist, page titles are descriptive.
- Understandable: Form inputs have labels, error messages are associated with their fields, language is declared in the HTML.
- Robust: ARIA roles and attributes are valid, IDs are unique, HTML parsing errors are minimized.
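To make one of these programmatic checks concrete, here is the WCAG contrast-ratio calculation behind the 4.5:1 and 3:1 thresholds. The formula (sRGB linearization, relative luminance, then a ratio of luminances) is taken straight from the WCAG definition; the helper names are ours, not axe-core's:

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light per WCAG's formula."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """WCAG relative luminance: weighted sum of linearized R, G, B."""
    r, g, b = (_linearize(ch) for ch in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white: the maximum possible contrast.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0

# Mid-gray (#777777) on white falls just short of 4.5:1 for normal text.
print(contrast_ratio((119, 119, 119), (255, 255, 255)) >= 4.5)  # False
```

This is the kind of check automation does well: the inputs are exact pixel values and the threshold is numeric, so there is no judgment call involved.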
What we can't test
Automated testing has real limits. According to Deque's own research, automated tools catch approximately 57% of WCAG violations. The remaining 43% require human judgment:
- Alt text quality: We can detect missing alt text, but not whether existing alt text accurately describes the image. "Image of product" passes automated checks but fails the actual WCAG requirement.
- Reading order: Screen readers follow DOM order, which may differ from visual order. Automated tools can flag some cases but cannot judge whether the reading experience makes sense.
- Cognitive load: WCAG 2.2 includes criteria for consistent navigation and predictable behavior. These require human evaluation.
- Video captions: We can detect the presence of track elements but not whether captions are accurate or synchronized.
- Custom widgets: Complex ARIA patterns (data grids, tree views, comboboxes) need manual keyboard testing to verify they work correctly with screen readers.
We are transparent about this because the alternative — claiming full compliance from automated scans — is how overlay vendors got fined by the FTC. AccessPulse handles the automated layer reliably. For the other 43%, we recommend pairing automated scans with periodic manual audits.
Scoring methodology
The AccessPulse score (0–100) is computed as a weighted ratio of passing checks to total checks:
score = passes / (passes + weighted_violations) * 100
Severity weights:
critical = 10x (e.g., missing form labels, no alt text)
serious = 5x (e.g., low color contrast)
moderate = 2x (e.g., heading order skipped)
minor = 1x (e.g., redundant ARIA role)

This means a single critical violation (like an unlabeled form on a login page) impacts the score more than several minor issues. We weight this way because critical violations directly block users from completing tasks.
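The formula and weights above can be sketched as a small function. The weights mirror the table exactly; the function name and the worked example are ours:

```python
# Severity weights from the table above.
WEIGHTS = {"critical": 10, "serious": 5, "moderate": 2, "minor": 1}

def accesspulse_score(passes: int, violations: dict[str, int]) -> float:
    """score = passes / (passes + weighted_violations) * 100"""
    weighted = sum(WEIGHTS[sev] * count for sev, count in violations.items())
    if passes + weighted == 0:
        return 100.0  # nothing tested, nothing failed
    return passes / (passes + weighted) * 100

# 40 passing checks, one critical and two minor violations:
# weighted_violations = 10*1 + 1*2 = 12, so score = 40 / 52 * 100 ≈ 76.9
print(round(accesspulse_score(40, {"critical": 1, "minor": 2}), 1))  # 76.9
```

Note how the single critical violation costs as much as ten minor ones: with the same 40 passes, twelve minor violations would produce the identical score.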
Infrastructure
- Scanner runtime: Isolated containers with headless Chromium. Each scan gets a fresh browser instance — no shared state, no cookies persisted.
- Data storage: Scan results are stored in a PostgreSQL database. We do not store page content, DOM snapshots, or screenshots — only the structured accessibility report.
- Security: URLs are validated against SSRF attacks (private IP ranges, metadata endpoints, non-HTTP protocols are blocked). Scan results are scoped to the requesting user.
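The SSRF checks described above might look roughly like this. This is a simplified sketch using Python's standard library, not AccessPulse's implementation — in particular, a production validator must also resolve the hostname via DNS and re-check the resulting IP (to block names that resolve to internal addresses), which this sketch omits:

```python
import ipaddress
from urllib.parse import urlparse

def is_safe_scan_url(url: str) -> bool:
    """Reject URLs that could let the scanner reach internal infrastructure."""
    parsed = urlparse(url)
    # Only plain HTTP(S) — no file://, gopher://, ftp://, etc.
    if parsed.scheme not in ("http", "https"):
        return False
    host = parsed.hostname
    if not host:
        return False
    # Block well-known metadata endpoints and loopback by name.
    if host in ("metadata.google.internal", "localhost"):
        return False
    # If the host is a literal IP, block private/reserved ranges.
    try:
        ip = ipaddress.ip_address(host)
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    except ValueError:
        pass  # hostname, not an IP literal; DNS re-check omitted in this sketch
    return True

print(is_safe_scan_url("https://example.com/page"))     # True
print(is_safe_scan_url("http://169.254.169.254/meta"))  # False: link-local metadata IP
print(is_safe_scan_url("file:///etc/passwd"))           # False: non-HTTP scheme
```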
Open source foundation
AccessPulse is built on axe-core (MPL-2.0 license), the same engine used by Google, Microsoft, and the US government for accessibility testing. We chose axe-core over alternatives like HTML_CodeSniffer (used by Pa11y) or WAVE because of its active maintenance, comprehensive rule coverage, and industry adoption. AccessPulse is not affiliated with or endorsed by Deque Systems.