
Governmentwide Findings: Accessibility Conformance Testing and Technology Lifecycle


The Assessment included questions regarding outcome-based results to gauge whether policies, practices, and procedures (a dimension that received higher maturity governmentwide) translated into Section 508 conformant ICT. The average conformance index value for all reporting entities was 1.79 (out of 5), or Low, indicating that inputs are not translating into conformant ICT.

ICT Testing Outlook

As expected, the majority of reporting entities used a combination of automated and manual tools for ICT testing. While it is not feasible to test all ICT manually, strategic employment of automated tools coupled with manual testing allows reporting entities to achieve both a wide scope and a targeted depth of testing.

One question asked respondents what manual or hybrid testing methodology they used; 194 respondents reported using one or more of the manual or hybrid ICT accessibility test methodologies for web content shown in Table 6 below:

Table 6. Number of reporting entities using specified testing methodologies
Methodology | Reporting Entities Using Specified Methodology (of the 194 Reporting Entities)
Manual Testing with Guided Developer Tools | 119 reporting entities (61%)
Assistive Technology | 94 reporting entities (48%)
Manual Code Inspection | 79 reporting entities (41%)
Trusted Tester 5.x | 75 reporting entities (39%)
Reporting Entity-Specific Test Methodology | 56 reporting entities (29%)

Similarly, the majority of respondents (153 reporting entities or 61%) reported using at least one automated accessibility testing tool for comprehensive, large-scale monitoring of web content. Of those reporting entities, 103 (67%) responded that personnel who use the tool and interpret the results received training on the tool.
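
To make this concrete, the sketch below illustrates the kind of single rule an automated accessibility tool evaluates at scale. It is a minimal, illustrative example only (assuming the requests and beautifulsoup4 packages are available), not a depiction of any specific tool reporting entities use; real scanners apply hundreds of such rules across entire sites.

```python
# Minimal illustration of one automated accessibility check:
# flag <img> elements that lack an alt attribute (a common Section 508/WCAG issue).
# The URL below is a placeholder; real tools crawl whole sites and apply many rules.
import requests
from bs4 import BeautifulSoup

def find_images_missing_alt(url: str) -> list[str]:
    """Return the src of every <img> on the page that has no alt attribute."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    return [
        img.get("src", "<no src>")
        for img in soup.find_all("img")
        if img.get("alt") is None  # alt="" is a valid choice for decorative images
    ]

if __name__ == "__main__":
    for src in find_images_missing_alt("https://example.gov"):
        print(f"Missing alt text: {src}")
```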

Average percentage of respondents with automated testing tools by maturity category: Very Low: 30%, Low: 70%, Moderate: 68%, High: 92%, and Very High: 67%.
Figure 16. Average percentages of reporting entities with an automated testing tool by maturity brackets
Average percentage of respondents with automated testing tools by conformance category: Very Low: 43%, Low: 65%, Moderate: 89%, High: 72%, and Very High: 81%.
Figure 17. Average percentages of reporting entities with an automated testing tool by conformance brackets

In the preceding two figures, the average percentage of reporting entities that reported having automated testing tools was calculated across each bracket with respect to both maturity and conformance. That is, the percentages for each overall category sharing the same conformance bracket (e.g., Very Low-Very Low, Low-Very Low, or Moderate-Very Low) were averaged and then included in the chart.19 A similar trend was seen: generally, the higher the conformance or maturity, the higher the percentage of reporting entities with automated testing tools.
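
For illustration, the following sketch shows both aggregation methods in pandas: the per-category averaging used for the figures, and the pooled "absolute" method described in footnote 19. The DataFrame and its column names (maturity_bracket, conformance_bracket, has_automated_tool) are hypothetical stand-ins for the Assessment's actual fields.

```python
# Sketch of the two aggregation methods, using hypothetical column names.
import pandas as pd

df = pd.DataFrame({
    "maturity_bracket":    ["Very Low", "Very Low", "Low", "Low", "Moderate"],
    "conformance_bracket": ["Very Low", "Low", "Very Low", "Low", "Low"],
    "has_automated_tool":  [False, True, True, True, True],
})

# Figures 16-17: compute the percentage with a tool within each overall
# (maturity, conformance) category, then average those percentages across
# categories that share the same conformance bracket.
per_category = (
    df.groupby(["maturity_bracket", "conformance_bracket"])["has_automated_tool"]
      .mean()
      .reset_index()
)
figure_pct = (
    per_category.groupby("conformance_bracket")["has_automated_tool"].mean() * 100
)

# Footnote 19 alternative: pool every entity in a conformance bracket and
# divide those reporting a tool by the bracket's total.
absolute_pct = df.groupby("conformance_bracket")["has_automated_tool"].mean() * 100

print(figure_pct, absolute_pct, sep="\n\n")
```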

When asked how often reporting entities conduct comprehensive conformance validation testing for web content (internet and intranet) prior to deployment, almost half of all respondents (120 reporting entities or 48%) reported they sometimes or never conduct comprehensive manual tests on web content for Section 508 conformance. A similar percentage (113 reporting entities or 45%) reported they sometimes or never conduct comprehensive automated tests on web content for Section 508 conformance.

Additionally, when asked how often reporting entities conduct web content user testing with people with disabilities prior to deployment to address all applicable Section 508 standards, reporting entities overwhelmingly (217 respondents or 87%) reported they never or only sometimes conduct user testing with people with disabilities (see Figure 18 below).

A bar graph shows the percentage of responses for how often reporting entities conducted web content user testing with PWD prior to deployment (Q7). Forty-two percent (42%) of respondents selected 'never conducts user testing with PWD on web content for Section 508 conformance'; 45% selected 'sometimes conducts user testing with PWD on web content for Section 508 conformance, but generally on an ad hoc basis'; 6% selected 'regularly conducts user testing with PWD on web content for Section 508 conformance'; 4% selected 'frequently conducts user testing with PWD on web content for Section 508 conformance prior to deployment'; 2% selected 'comprehensive user testing with PWD is integrated into development and deployment, resulting in almost no accessibility issues deployed to production environments'; and 1% selected 'N/A - does not publish or maintain any web content'.
Figure 18. Percentage of responses for how often reporting entities conduct web content user testing with PWD prior to deployment (Q7)

Some reporting entities integrate accessibility throughout the technology lifecycle, which may partly explain the lower results for comprehensive testing prior to deployment. To gauge the level of integration, the Assessment included a question asking the extent to which ICT accessibility is integrated throughout a reporting entity’s technology development lifecycle activities:

  • Just over half of reporting entities (125 reporting entities or 51%) reported ICT accessibility requirements are regularly, frequently, or almost always integrated throughout technology lifecycle activities, leaving just under half of reporting entities unsure whether accessibility is included, never including it, or including it only on an ad hoc basis.

  • About half of all reporting entities (131 reporting entities or 52%) also reported that accessibility reviews are not formally integrated into the publication process, with reviews generally conducted on an ad hoc basis.

  • Half of all reporting entities reported they have a formal policy requiring inclusion of Section 508 requirements, and regularly, frequently, or almost always include Section 508 requirements in ICT governance processes.20

A bar graph shows responses indicating the extent to which ICT accessibility is integrated throughout technology development lifecycle activities (Q45): 13 respondents selected that ICT accessibility considerations are never included; 76 selected that ICT accessibility considerations are sometimes included; 66 selected that ICT accessibility requirements are regularly included; 29 selected that ICT accessibility requirements are frequently included, with usability, accessibility, and AT testing performed and post-implementation IT reviews conducted; 30 selected that ICT accessibility requirements are almost always included, with universal design best practices followed, accessible components re-used, and accessible concepts and solutions shared; and 35 selected N/A - does not have a formal technology development lifecycle.
Figure 19. Response count indicating the extent to which ICT accessibility is integrated throughout technology development lifecycle activities (Q45)

The Assessment included several questions specifically related to electronic documents and communications and found:

  • Just over half of reporting entities (142 reporting entities or 57%) reported they regularly, frequently, or almost always test electronic documents before posting, suggesting more bandwidth for comprehensive electronic document testing than for web testing.

  • Approximately 44% of reporting entities reported they have formal processes to ensure formal communications (internal, external, and in response to an emergency) are Section 508 conformant and they regularly, frequently, or almost always follow these processes.

About half of all reporting entities integrate Section 508 into the technology lifecycle to varying degrees, and thus may rely less on robust testing immediately prior to deployment. However, as the conformance metrics at the beginning of the Findings show, processes governmentwide are not robust enough to fully translate into highly conformant ICT. Future assessments will require a deeper dive into reporting entity testing practices to paint a more complete picture of Section 508 implementation throughout the technology lifecycle and to pinpoint successes and areas for improvement.

Taken together, while reporting entities reported having testing tools and testing methodologies, additional data points did not support full utilization for comprehensive testing, including integration into technology lifecycles and testing prior to deployment.

Conformance Relationships: Regression Deep Dive

Through regression analysis, electronic document conformance emerged as a critical dependent variable. Regression 29 delved into a more complex analysis by examining the predictors of Section 508 conformance of electronic documents (Q80). It considered three independent variables: the status of the Section 508 Program (Q22), Section 508 Program resources and staffing (Q29), and Section 508 awareness training (Q59).21 Most notably, it showed a positive association between the status of a Section 508 Program (Q22) and electronic document conformance (Q80): a one-point change in the status of the Section 508 Program (Q22) significantly predicted a 0.11 change in electronic document conformance (Q80). In contrast, a positive one-point change in Section 508 awareness training (Q59) significantly predicted a -0.042 change in electronic document conformance (Q80). Despite our initial expectation that increased training efforts would correspond to fewer accessibility issues within electronic documents, the empirical findings indicated a more nuanced relationship: there may be a limit to how much training can improve electronic document conformance, or reporting entities with more conformance issues may be the ones pushing for more awareness training. In this model, resources and staffing (Q29) was not a significant predictor of electronic document conformance (Q80).
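
For readers who want to reproduce this kind of analysis, the following is a minimal sketch of a Regression 29-style model using statsmodels. The input file and the column names (Q22, Q29, Q59, Q80, mirroring the question numbers) are assumptions for illustration, not the Assessment's actual data pipeline.

```python
# Sketch of an OLS model in the shape of Regression 29: electronic document
# conformance (Q80) regressed on program status (Q22), resources and
# staffing (Q29), and awareness training (Q59).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file of coded survey responses, one row per reporting entity.
responses = pd.read_csv("fy23_assessment_responses.csv")

model = smf.ols("Q80 ~ Q22 + Q29 + Q59", data=responses).fit()
print(model.summary())  # coefficients, p-values, and R-squared (about 0.12 per footnote 21)
```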

Regression 35 mirrored the structure of Regression 29, exchanging Section 508 awareness training (Q59) for ICT accessibility-related training (Q60) as an independent variable.22 Like Regression 29, it showed a positive association between Section 508 Program status (Q22) and electronic document conformance (Q80): a one-point change in the status of the Section 508 Program (Q22) significantly predicted a 0.12 change in electronic document conformance (Q80). In contrast, a positive one-point change in ICT accessibility-related training (Q60) significantly predicted a -0.057 change in electronic document conformance (Q80). Like Section 508 awareness training (Q59) in Regression 29, ICT accessibility-related training (Q60) was negatively correlated with electronic document conformance (Q80), and we speculate Regressions 29 and 35 share underlying reasons for this negative correlation. In this model as well, resources and staffing (Q29) was not a significant predictor of electronic document conformance (Q80).

Furthermore, equivalent regression analyses replacing electronic document conformance with intranet, public internet, and video conformance did not produce statistically meaningful results. The reasons behind this difference remain uncertain; improved data quality or year-to-year analysis may shed more light on this matter. For now, regression suggests electronic document conformance serves as a better indicator of Section 508 Program maturity.

Regression analysis also investigated the relationships between reporting entity size, as provided by publicly available datasets from OPM's FedScope, and Section 508 conformance of intranet web pages, public internet web pages, public electronic documents, and videos (Q61, Q71, Q78, Q79, Q80, Q81).23 The results consistently showed that reporting entity size, on its own, does not have a meaningful impact on ICT conformance. While size itself may not be a good indicator, a strong department-level or parent agency that offers resources to component reporting entities may contribute to higher conformance. Additionally, the parent-component dynamic has implications for size: we expect a department as a whole to be relatively large while its components individually are much smaller. For FY23, the criteria did not include tailored questions to pinpoint reporting entities that utilize parent agency-level resources, so no such correlation could be determined. We intend to hone questions in FY24 to find correlations between parent and component reporting entities.

(See FY23 Data Dictionary (XLSX))

Non-Conformance Tracking and Remediation

As demonstrated by low conformance governmentwide, non-conformant ICT is prevalent. To understand how reporting entities track and prioritize remediation efforts, the Assessment included several questions surrounding the methodologies used and found that most respondents did not report robust accessibility remediation tracking and prioritization processes:

  • Just over half of all respondents (128 reporting entities or 51%) reported they do not track non-conformant digital content, or they do track but only sometimes take action to remediate. Figure 20 shows a further breakdown of responses.

  • 79 respondents (40%) who engage in technology lifecycle activities said they do not identify or prioritize risk of Section 508 non-conformant ICT throughout the technology development lifecycle, or they only sometimes utilize a risk assessment.24

  • Specifically related to web remediation, 120 reporting entities (48%) said they never remediate Section 508 conformance issues after deployment, or they only sometimes do so mostly on an ad hoc basis.

A donut chart shows the percentages of responses for how non-conformant digital content is tracked and remediated (Q36): 25% of respondents selected that the reporting entity does not track non-conformant digital content; 27% selected that the reporting entity tracks but only sometimes takes action to remediate the content; 19% selected that the reporting entity tracks and regularly creates and takes action on a plan to remediate; 17% selected that the reporting entity tracks and frequently creates and takes action on a plan to remediate; and 13% selected that the reporting entity always or almost always creates conformant content that does not require remediation, or always or almost always takes action on a plan to remediate.
Figure 20. Percentages of responses for how non-conformant digital content is tracked and remediated (Q36)

Conclusion

While reporting entities showed moderate maturity in Policies, Procedures, and Practices (governmentwide average of 2.54 out of 5), efficacy is lacking, as conformance is relatively low at 1.79 out of 5. It may be that policies and procedures are not being thoroughly enforced or followed within reporting entities. Alternatively, if followed, reporting entities’ policies and procedures may not be effective or specific enough to ensure Section 508 conformant ICT is produced. Additionally, although respondents reported integrating accessibility into Technology Lifecycle Activities, implementing test methodologies and processes, and testing large swaths of ICT, the ICT that is deployed typically has Section 508 defects.

While reporting entities are testing ICT, they are not prioritizing remediation or following processes they already have in place. Some non-conformance issues may be possible to identify and overcome in the acquisitions and procurement process, for example, by holding vendors and contractors accountable for producing Section 508 conformant ICT. Some defects may be platform-related, such as when a small error in a template outside of a reporting entity’s control causes failures across all intranet pages. However, monitoring web content via automated tools should be coupled with a strategic manual testing and remediation plan to track, fix, and prevent the perpetuation of accessibility defects.


  19. Absolute values of reporting entities within each bracket (conformance or maturity) were also calculated, with similar results. That is, instead of averaging the percentage (%) of reporting entities with automated testing tools across overall categories with like conformance (or maturity) brackets, the total number of reporting entities that reported having access to automated testing tools and fell into a conformance bracket (i.e., Very Low Conformance or Low Conformance) was divided by the total number of reporting entities that fell within that bracket. Again, similar trends were observed.
  20. Governance processes include milestone reviews, publication and deployment decisions, and change control reviews.
  21. Taken together, the independent variables yielded the regression equation Q80 = 0.084 + 0.11(Q22) + 0.015(Q29) - 0.042(Q59). The overall relationship was highly statistically significant (***), and the independent variables (Q22, Q29, and Q59) could explain 12% of the differences in Q80. Both Q22 and Q59 significantly predicted Q80 (*** and **, respectively). Asterisks denote statistical significance at the following levels: *** 0.01 ≥ p-value; ** 0.05 ≥ p-value > 0.01; * 0.1 ≥ p-value > 0.05.
  22. The regression equation Q80 = 0.093 + 0.12(Q22) + 0.022(Q29) - 0.057(Q60) resembled that for Regression 29. Again, the overall relationship was highly statistically significant (***), and the independent variables (Q22, Q29, and Q60) could explain 12% of the differences in Q80. Both Q22 and Q60 significantly predicted Q80 (*** and **, respectively). Asterisks denote statistical significance at the same levels as in footnote 21.
  23. Regressions 37 to 42.
  24. 51 reporting entities (20%) noted they do not engage in technology lifecycle activities and were removed from the calculation of overall percentage.

Reviewed/Updated: December 2023
