Validation is the step where you confirm that each accessibility issue identified in your audit report has been properly fixed. You review the original issue, check the remediated code or content, and verify it now conforms to the relevant WCAG success criterion. Without validation, remediation is guesswork.
Most organizations receive an audit report, assign fixes to their development team, and then wonder what comes next. The answer is validation: a structured review of every fix against the original audit findings. This is the only way to close the loop between an audit and actual WCAG 2.1 AA or WCAG 2.2 AA conformance.
| Validation Factor | What to Know |
|---|---|
| Purpose | Confirm each remediated issue now meets the WCAG success criterion it originally failed |
| Who Validates | An auditor or qualified accessibility professional, not the developer who made the fix |
| When It Happens | After remediation is complete for a batch or full set of issues |
| What You Need | The original audit report with specific issues, locations, and WCAG criteria references |
| Outcome | Each issue marked as resolved, partially resolved, or still present |

What Does Validation Actually Look Like?
Validation applies the same methodology as the original audit, run in the opposite direction: instead of searching for new issues, the auditor returns to each issue documented in the report and evaluates whether the fix resolves it.
For example, if the audit identified a form input missing a visible label (WCAG 1.3.1), the auditor checks that input again. They confirm a label element is now programmatically associated with the field and that assistive technology announces it correctly. If the fix is correct, the issue is closed. If not, it goes back to the development queue with notes on what still needs attention.
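As an illustration of the mechanical half of that check, the sketch below uses Python's standard-library html.parser to flag inputs that have no associated label. The class and function names are hypothetical, and a script like this covers only the programmatic association; confirming that assistive technology actually announces the label is still a manual step.

```python
from html.parser import HTMLParser

class LabelAudit(HTMLParser):
    """Collect form inputs and the <label for="..."> associations on a page."""
    def __init__(self):
        super().__init__()
        self.input_ids = []      # id attribute of each visible input (None if absent)
        self.label_targets = set()  # ids referenced by <label for="...">

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("type") not in ("hidden", "submit", "button"):
            self.input_ids.append(attrs.get("id"))
        elif tag == "label" and "for" in attrs:
            self.label_targets.add(attrs["for"])

def unlabeled_inputs(html: str) -> list:
    """Return ids of inputs lacking an associated label (None for inputs with no id)."""
    audit = LabelAudit()
    audit.feed(html)
    return [i for i in audit.input_ids if i is None or i not in audit.label_targets]
```

Run against a page before and after remediation, the before version surfaces the unlabeled field and the after version comes back clean, which is the pass/fail signal a validator records for that issue.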
This process repeats for every issue in the report. A thorough validation covers each finding individually rather than spot-checking a handful.
Why a Separate Person Should Validate
The developer who writes the fix is too close to the code. They know what they intended. An independent auditor evaluates what actually rendered in the browser, in the DOM, and through a screen reader.
Accessible.org validation follows this principle. The auditor who conducted the original evaluation reviews the fixes, bringing the same methodology and assistive technology configuration used during the initial audit. This consistency matters because WCAG conformance depends on real-world behavior, not developer intent.
How to Prepare Your Fixes for Validation
Organization is everything here. Before submitting fixes for validation, structure your work so the auditor can move through it efficiently.
Reference the original issue ID. Every issue in a good audit report has a unique identifier. When your developer commits a fix, that commit or ticket should reference the same ID. This creates a direct line from the reported issue to the code change to the validation result.
Deploy fixes to a reviewable environment. The auditor needs to evaluate the live or staging version of the page, not review a pull request. Fixes have to be rendered and functional.
Batch your fixes strategically. Sending one fix at a time creates unnecessary overhead. Group fixes by page or by component type, then submit them as a batch. This lets the auditor evaluate related areas together.
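The batching step above can be sketched in a few lines. The issue-record shape here is hypothetical (the field names are illustrative, not a standard): issues marked fixed are grouped by page so each batch maps to one area of the site.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical issue records; in practice this data comes from your tracker.
issues = [
    {"id": "A11Y-014", "page": "/checkout", "status": "fixed"},
    {"id": "A11Y-003", "page": "/home", "status": "fixed"},
    {"id": "A11Y-021", "page": "/checkout", "status": "in progress"},
    {"id": "A11Y-007", "page": "/home", "status": "fixed"},
]

def validation_batches(issues):
    """Group fixed issues by page so the auditor reviews related areas together."""
    # groupby requires the input to be sorted by the grouping key
    fixed = sorted((i for i in issues if i["status"] == "fixed"),
                   key=itemgetter("page"))
    return {page: [i["id"] for i in batch]
            for page, batch in groupby(fixed, key=itemgetter("page"))}
```

Issues still in progress stay out of the batch, which keeps each validation submission limited to work that is actually ready for review.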
Tracking Issues Through Validation
A spreadsheet works for small projects. For anything with more than a few dozen issues across multiple pages, a tracking system is the better path. You upload audit report data, assign issues, track remediation status, and then mark items ready for validation, all in one place.
Whatever tool you use, each issue should carry a status that moves through a clear sequence: identified, assigned, in progress, fixed, and validated. No issue should sit in “fixed” indefinitely without validation confirming it.
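That status sequence can be made explicit as a small state machine. This is a hypothetical sketch, not any particular tracking tool's model; it also allows "fixed" to fall back to "in progress" when a fix fails validation.

```python
# Legal status transitions: identified -> assigned -> in progress -> fixed -> validated,
# with "fixed" permitted to fall back when validation rejects the fix.
TRANSITIONS = {
    "identified": {"assigned"},
    "assigned": {"in progress"},
    "in progress": {"fixed"},
    "fixed": {"validated", "in progress"},  # failed validation re-enters the queue
    "validated": set(),                     # terminal state
}

def advance(current: str, new: str) -> str:
    """Move an issue to a new status, rejecting transitions that skip a step."""
    if new not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current} -> {new}")
    return new
```

The useful property is the rejection path: nothing can jump straight from "identified" to "fixed", so no issue is ever marked done without passing through assignment and active work.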
What Happens When an Issue Fails Validation?
It goes back to the development team with specific feedback. A good validator does not mark an issue as “still failing” without context. They note exactly what remains wrong and how the current fix falls short of the WCAG criterion.
Partial fixes are common. A developer might add alt text to an image but use a description that does not convey the image’s function. The fix addressed the presence of the attribute but not the quality of the content. Validation catches this distinction.
When issues cycle back, they re-enter the remediation queue and go through validation again once updated. This loop continues until every issue is confirmed resolved.
How Long Does Validation Take?
It depends on the number of issues and how well the fixes were implemented. A report with 40 issues where developers followed clear remediation guidance might take a few hours to validate. A report with 200 issues and inconsistent fixes takes considerably longer.
Well-documented audit reports speed up validation significantly. When the original report includes screenshots, code snippets, WCAG criterion references, and recommended fixes, the auditor spends less time re-establishing context. Accessible.org audit reports are written to be clear and actionable for exactly this reason.
Does Validation Mean You Are Fully Conformant?
Validation confirms that identified issues have been fixed. It does not guarantee full WCAG conformance across your entire digital asset. New content added after the audit, pages not included in the original scope, or dynamic functionality that was not triggered during the evaluation could still contain issues.
After validation, many organizations conduct a follow-up accessibility audit on a broader scope or updated content. This creates a cycle: audit, remediate, validate, re-audit. Each cycle brings you closer to full conformance.
Can Scans Replace Validation?
No. Automated scans only flag approximately 25% of issues. Many of the issues in an audit report involve content meaning, interaction patterns, and screen reader behavior that scans cannot evaluate. Running a scan after remediation might show green checkmarks while significant issues remain unaddressed.
Scans can supplement validation by catching regressions in areas like color contrast or missing alt attributes. But they are a separate activity from validation. The two serve different purposes.
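To make the scan-versus-validation distinction concrete, here is a minimal missing-alt check built on Python's standard-library html.parser (the names are illustrative). Note what it cannot see: an empty or meaningless alt value passes this scan, and judging whether the text conveys the image's function is exactly what human validation adds.

```python
from html.parser import HTMLParser

class AltScan(HTMLParser):
    """Flag <img> tags that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.flagged = []  # src values of images with no alt attribute

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.flagged.append(attrs.get("src"))

def missing_alt(html: str) -> list:
    scan = AltScan()
    scan.feed(html)
    return scan.flagged
```

An image with alt="" sails through this check even when it is informative and needs a real description, which is the kind of green-checkmark false comfort the section above warns about.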
Do I Need the Same Auditor Who Did the Original Evaluation?
Not necessarily, but it helps. The original auditor already understands the scope, the digital asset’s architecture, and the specific context behind each issue. A different qualified auditor can validate, but they need access to the full audit report and enough time to familiarize themselves with the findings.
How Many Rounds of Validation Should I Expect?
Most projects go through one to three rounds. The first round catches incomplete or incorrect fixes. The second round typically resolves most remaining items. A third round is less common but happens on larger projects with complex interactive components. Planning for two rounds in your remediation workflow is a reasonable starting point.
Can I Validate Fixes Myself Without an Auditor?
If you have WCAG expertise and experience with assistive technology, you can perform internal checks. But for formal conformance documentation, independent validation by a qualified auditor carries more weight. Procurement teams, legal counsel, and compliance officers expect third-party verification. Self-assessed conformance claims do not hold the same credibility.
What Documentation Should I Keep After Validation?
Keep the original audit report, the validation results for each issue, and timestamps for when fixes were deployed and confirmed. This documentation supports ADA compliance efforts and can serve as evidence of due diligence. If your organization needs an ACR (Accessibility Conformance Report), the validation record feeds directly into completing that document against the VPAT (Voluntary Product Accessibility Template).
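One way to keep that record structured is a simple per-issue dataclass. The field names below are hypothetical, not a required format; the point is that each issue carries its WCAG criterion, both timestamps, and the validation outcome in one row.

```python
from dataclasses import dataclass, asdict

# Hypothetical record shape for one validated issue; field names are illustrative.
@dataclass
class ValidationRecord:
    issue_id: str
    wcag_criterion: str     # e.g. "1.3.1"
    fix_deployed_at: str    # ISO 8601 timestamp
    validated_at: str       # ISO 8601 timestamp
    result: str             # "resolved", "partially resolved", or "still present"

record = ValidationRecord(
    issue_id="A11Y-014",
    wcag_criterion="1.3.1",
    fix_deployed_at="2024-05-02T14:10:00+00:00",
    validated_at="2024-05-06T09:30:00+00:00",
    result="resolved",
)
```

Converting each record with asdict() makes it straightforward to export the whole history as JSON or CSV when a procurement team or legal counsel asks for the evidence trail.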
Validation is where remediation becomes conformance. Without it, fixes are assumptions. With it, every issue has a documented resolution tied to a specific WCAG criterion.
Contact Accessible.org to discuss validation as part of your accessibility audit and remediation project.