Introduction
In today’s performance-focused environment, agency executives require performance metrics to monitor progress against agency goals and evaluate the effectiveness and efficiency of business processes. Agencies use workload measures when allocating and managing resources. Metrics and measures enhance Section 508 program processes, boost credibility, and can positively influence decisions about budgets, priorities, staffing, and program activities.
Best Practice Guidelines
Measures vs. Metrics
There is overlap between measures and metrics. Both can be qualitative or quantitative, but the distinction between them is important. Measures are concrete, usually capture one thing, and are quantitative in nature (e.g., "I have five apples"). Metrics describe a quality and require a measurement baseline ("I have five more apples than I did yesterday"). In a Section 508 program, measures are useful for demonstrating workloads and activity, while metrics are useful for evaluating compliance and process effectiveness and for measuring success against established objectives.
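As a minimal sketch of the distinction (in Python, with made-up numbers), a measure is a single concrete count, while a metric compares that count against a baseline:

```python
# Hypothetical quarterly counts -- all values are illustrative only.
compliant_pages = {"Q4 FY2010": 910, "Q1 FY2011": 870}
total_pages = {"Q4 FY2010": 1000, "Q1 FY2011": 1000}

# Measure: one concrete quantity for a single period.
q1_compliant = compliant_pages["Q1 FY2011"]  # 870 pages

# Metric: a quality judged against a baseline -- here, the change in
# compliance rate from the prior quarter.
def rate(quarter):
    return compliant_pages[quarter] / total_pages[quarter]

change = rate("Q1 FY2011") - rate("Q4 FY2010")  # a 4-point drop
```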
Three Basic Flavors
Measures and metrics can be useful for setting program priorities, allocating resources, and measuring performance. Keep them simple, consistent, easy to read, and tailored to the intended audience. Most Section 508 reporting falls into three basic areas: 1) Compliance Metrics, 2) Process Metrics, and 3) Workload Measures. Each type is identified in the examples below, but first, an explanation of each.
Compliance Metrics
Use compliance metrics to evaluate progress against the agency’s accessibility goals and objectives. Compliance metrics can measure periodic performance through activity reports (e.g., compliance results of applications tested each quarter) and through cumulative compliance metrics that evaluate progress against agency goals over time and give insight into the current level of risk exposure (e.g., what is the compliance status of my entire inventory of applications, and how is that changing over time?).
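The contrast between a periodic activity metric and a cumulative inventory metric can be sketched as follows (application names and results are hypothetical; a real inventory would track far more detail):

```python
# Maps each application to its most recent compliance result (True = compliant).
inventory = {}

def record_quarter(results):
    """Record one quarter's test results; return (activity, cumulative) rates."""
    inventory.update(results)
    activity = sum(results.values()) / len(results)        # this quarter only
    cumulative = sum(inventory.values()) / len(inventory)  # entire inventory
    return activity, cumulative

# Two hypothetical quarters; AppC is retested and fixed in the second.
q1_activity, q1_cumulative = record_quarter({"AppA": True, "AppB": True, "AppC": False})
q2_activity, q2_cumulative = record_quarter({"AppC": True, "AppD": False})
```

Note how the two rates diverge: the second quarter's activity rate is only 50%, while the cumulative inventory rate rises to 75% because AppC's fix is reflected in the overall inventory.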
Process Metrics
Use process metrics to evaluate business processes, establish process improvement goals, and measure progress against those goals. Consider applicable constraints and limitations when developing these reporting and analysis methods, but realize that process improvement is meaningless without an associated process metric. (For instance, suppose you change a process but do not realize the expected benefits. What is the cause: an ineffective process change, or a lack of process adoption?) Without process metrics in place, you have no visibility into the effectiveness of the changes you make to business processes. Use process metrics to evaluate the progress of agency components, discover areas of the agency that need more attention or resources, and recognize components that are effective or show impressive improvements over time.
Workload Measures
Use these transactional measures to demonstrate workloads, capacity, and resource utilization. This type of reporting may include the number of transactions performed, hours expended, requests for assistance, number of people trained, etc. Workload measures are most useful for demonstrating resource requirements, but they are also valuable when presented in combination with process or compliance metrics. For instance, you may want to evaluate the effectiveness of a training program for developers by correlating improvements in application testing results with the number of developers trained.
Use Objective-Oriented Performance Metrics
Decide in advance how your reporting will support your program and agency mission. Simply collecting information for later analysis is wasteful, and the resulting reports may not fulfill your purposes. Here are some example objective-oriented questions that can lead to productive metrics:
- What is the agency’s cumulative inventory of risk exposure?
- Are web page compliance numbers improving?
- Are Section 508 procurement processes effective and resulting in the insertion of strong language and evaluation practices into solicitations?
- Is the agency ensuring that accessibility is an evaluated requirement, giving vendors that are more compliant a competitive advantage?
- What agency components or divisions need more training, developer assistance, and scrutiny? Which are leading and deserve recognition?
Applicability
Develop metric and measurement practices by identifying the outcomes you want to support and then assessing your ability to collect supporting data. Methods and techniques are difficult to prescribe because so much depends on what you want to evaluate and what you can evaluate. The examples below may help you develop ideas for building an effective metrics and measures practice within your Section 508 program.
Collection Constraints
Agency structure and Section 508 program activities and priorities will influence what information is collected, how it is collected, the resulting reports, and the associated analysis. Business objectives must drive reporting methodologies, but your reporting will be constrained by some limiting factors:
- Difficulty and expense of collecting information
- Accuracy and timeliness of information
- Whether activities are performed under one, or several, program sponsors (i.e., centralization vs. decentralization)
- Simplicity and ease of interpreting reports
- Organizational culture, structure, and priorities
Compliance Metrics Examples
Use these examples to generate your own ideas on how to present performance metrics, the data you want to collect, and the benefits each metric or measure may offer. These examples predominantly break out by year or quarter, but alternative methods can prove effective too. For example, agencies with a distributed Section 508 program will benefit by breaking metrics out by agency component or division.
Agency Cumulative Web Compliance Inventory
This chart focuses on web content; similar reports can address Electronic Documents, Multimedia, Webcasts, Telecommunications, Software, and Hardware. It provides a snapshot of the agency’s cumulative compliance with Section 508 and demonstrates progress against the agency’s long-term improvement goals.
| | Q4 FY2010 | Q1 FY2011 | Q2 FY2011 | Q3 FY2011 |
|---|---|---|---|---|
| Internet Web Page Compliance | 91% | 87% *see note | 91% | 92% |
| Internet Applications Compliance | 93% | 93% | 94% | 94% |
| Intranet Applications | 76% | 76% | 79% | 82% |
| Number of Administrative Complaints Received (since 2005) | 46 | 49 | 49 | 50 |
| Number of Administrative Complaints Resolved (since 2005) | 41 | 42 | 44 | 47 |
| Outstanding Administrative Complaints (since 2005) | 5 | 7 | 5 | 3 |
*Sample Analysis: A more rigorous automated test program caused the drop in Q1. Q2 and Q3 show incremental improvements addressing the newly discovered non-compliant web content.
Quarterly Compliance Activity Report
Unlike the cumulative summary chart above, this chart measures activity performed only within the quarter. This more granular approach will help program managers use metrics to progressively monitor and improve programs over the short term. If aggregated by agency component or division, managers can identify groups that need more training and developer assistance.
| | Q4 FY2010 | Q1 FY2011 | Q2 FY2011 | Q3 FY2011 |
|---|---|---|---|---|
| Internet Web Page Compliance | 98% | 74% *see note | 95% | 97% |
| Internet Applications Compliance | 97% | 94% | 96% | 97% |
| Intranet Applications | 79% | 81% | 84% | 84% |
| Number of Administrative Complaints Received | 2 | 3 | 0 | 1 |
| Number of Administrative Complaints Resolved | 1 | 1 | 2 | 3 |
*Sample Analysis: A more rigorous automated test program caused the drop in Q1. Q2 and Q3 show significant incremental improvements addressing the newly discovered non-compliant web content.
Procurement Evaluation Determinations
This is a yearly cumulative chart of procurement determinations. It provides a landscape view of the IT market, the procurement activity within an agency, and an inventory of cumulative risk.
| Determination (cumulative since 2005) | FY 08 | FY 09 | FY 10 | FY 11 |
|---|---|---|---|---|
| Compliant | 343 | 796 | 1270 | 1671 |
| Partially Compliant | 541 | 1193 | 1855 | 2453 |
| Undue Burden Exception | 9 | 3 | 5 | 4 |
| National Security Exception | 0 | 0 | 0 | 0 |
| Not Commercially Available | 13 | 25 | 31 | 42 *see note |
| Service Space Exceptions | 2 | 7 | 4 | 2 |
*Sample Analysis: Solicitations qualified to exempt part or all of the Section 508 requirements are rising significantly as the agency adopts third-party collaboration tools and application architectures that are less compliant than previous technologies.
Process Metric Examples
Section 508 Requirements in Procurements
Periodic audit reports give a Section 508 office and management visibility into the effectiveness of programs, life cycle requirements, and business processes. This chart shows yearly results of a random sample of solicitations evaluated for the completeness of Section 508 requirements included in solicitations.
| | FY 08 | FY 09 | FY 10 | FY 11 |
|---|---|---|---|---|
| # Solicitations Sampled | 75 | 75 | 75 | 75 |
| % that state IT must be accessible | 78% | 81% | 84% | 89% |
| % indicating which Section 508 standards apply to the purchase | 72% | 73% | 77% | 71% |
| % that request accessibility information from vendors (e.g. VPATs and other clarifying materials) | 69% | 69% | 72% | 74% |
| % that state how vendor proposals are evaluated for Section 508 compliance, including acceptance criteria | 62% | 64% | 65% | 65% |
| % containing language giving the agency the option to perform hands-on testing to validate vendor 508 compliance claims | 57% | 59% | 59% | 61% |
| % requiring product replacements to be equally 508 compliant (or better) than the original product purchased | 29% | 27% | 38% *see note | 43% |
*Sample Analysis: The communications and training plan instituted by the Office of Procurement and the Section 508 office in FY 09 is resulting in a significant increase in Section 508 product replacement requirements being placed in agency solicitations.
508 Requirements in Development
This chart shows yearly results of a random sample of development projects. The chart evaluates how effectively the sampled projects followed agency procedures for incorporating Section 508 throughout the life cycle. This type of approach is applicable to any development activity, including development of web pages and applications, mobile applications, eLearning modules, videos, VOIP implementations, etc.
| | FY 08 | FY 09 | FY 10 | FY 11 |
|---|---|---|---|---|
| # Development Project Reviews Sampled | 75 | 75 | 75 | 75 |
| % which identified all applicable Section 508 requirements in planning documents | 78% | 81% | 79% | 82% |
| % which conducted a Section 508 compliance accessibility review in the design stage | 65% | 69% | 74% | 81% |
| % which were tested for Section 508 compliance prior to release | 89% | 91% | 93% | 97% |
Acceptance Determinations
This chart demonstrates a combination of compliance metrics and process metrics. The purpose of this chart is to demonstrate how non-compliance issues are addressed and any resulting trend associated with a recent process change (see the sample analysis).
| Final Acceptance Criteria Determination | Q4 FY2010 | Q1 FY2011 | Q2 FY2011 | Q3 FY2011 |
|---|---|---|---|---|
| Pass with no exceptions | 12 | 9 | 11 | 14 |
| Pass with minor exceptions | 5 | 2 | 4 | 8 |
| Pass with an approved follow-on remediation plan | 12 | 7 | 5 | 4 *see note |
| Acceptance delayed pending remediation | 1 | 5 | 3 | 2 |
| Qualified for an exception | 3 | 0 | 2 | 2 |
| Pass with major exceptions and an approved reasonable accommodation plan | 2 | 0 | 1 | 1 |
*Sample Analysis: We continue to pressure project teams to delay project acceptance rather than ‘Pass with an approved follow-on remediation plan,’ since we discovered project teams are recalcitrant about completing remediation in this category.
Agency Component Status and Comparison
This chart displays indicators of program maturity and correlating performance and outcomes. See the DHS example to get ideas on additional items you may want to track.
| Agency Component | Section 508 Plan? | 508 Coordinator? | # of FTEs and Contractors Dedicated | # of Applications in Production | # of Applications with Section 508 Defects | % of Applications with Section 508 Defects | Number of Complaints and Post Acceptance Issues |
|---|---|---|---|---|---|---|---|
| Div A | Yes | No | 2 | 34 | 6 | 18% | 2 |
| Div B | Yes | Yes | 5 | 67 | 4 | 6% | 0 |
| Div C | No | No | 1 | 18 | 11 | 61% | 5 *see note |
| Div D | Yes | Yes | 3 | 62 | 7 | 11% | 1 |
*Sample Analysis: Div C maintains fewer Section 508 resources (no 508 Coordinator and just one tester) because it manages fewer applications, but it represents 40% of the agency’s application risk exposure and the majority of formal and informal complaints.
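The figures in the component-comparison chart can be recomputed directly from the raw counts. The sketch below copies the data from the table and approximates each division's "risk exposure" as its share of all defective applications agency-wide:

```python
# Application counts copied from the component-comparison chart above.
divisions = {
    "Div A": {"apps": 34, "defective": 6},
    "Div B": {"apps": 67, "defective": 4},
    "Div C": {"apps": 18, "defective": 11},
    "Div D": {"apps": 62, "defective": 7},
}

total_defective = sum(d["defective"] for d in divisions.values())

# Percentage of each division's own applications with defects
# (matches the "% of Applications with Section 508 Defects" column).
defect_rate = {name: round(100 * d["defective"] / d["apps"])
               for name, d in divisions.items()}

# Each division's share of the agency-wide defect pool -- Div C alone
# carries roughly 40% of the agency's application risk exposure.
risk_share = {name: round(100 * d["defective"] / total_defective)
              for name, d in divisions.items()}
```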
Workload Measure Examples
Monthly Procurement Assistance Report
This chart tracks the resource requests and resources consumed by a Section 508 office in support of agency procurements. Use measures like these to demonstrate the level of effort a Section 508 office expends in supporting procurement and many other support functions: Developer Assistance, Application Testing, Web Page Testing, Document Testing, Document Remediation, Administrative Complaints, etc.
| Procurement Phase | Number of Requests | Hours Expended |
|---|---|---|
| RFP/SOW Requirements Assistance | 3 | 7 |
| VPAT Evaluation by Section 508 Office | 32 | 18 |
| Hands-on Testing by Section 508 Office Testers (new procurements) | 3 | 12 |
| Hands-on Testing by Section 508 Office Testers (contract substitutions) | 2 | 5 |
Number of Employees Trained
This simple transactional report shows the number of employees trained in Section 508-related courses. Use a similar chart to monitor and demonstrate the level of effort needed to support training, gauge agency exposure to 508 training, and determine where additional classes might be required. As with the chart above, areas that benefit from transactional reporting include Developer Assistance, Application Testing, Web Page Testing, Document Testing, Document Remediation, and Administrative Complaints.
| Training Class | Q4 FY2010 | Q1 FY2011 | Q2 FY2011 | Q3 FY2011 | Class Totals |
|---|---|---|---|---|---|
| Intro to 508 | 85 | 92 | 91 | 99 | 367 |
| Word Accessibility | 35 | 46 | 59 | 70 | 210 |
| PDF Accessibility | 25 | 21 | 32 | 38 | 116 |
| HTML Developer | 42 | 56 | 61 | 67 | 226 |
| Web Testing | 51 | 92 | 107 | 125 | 375 |
| Procurement | 18 | 38 | 41 | 47 | 144 |
| Totals | 256 | 345 | 391 | 446 | 1438 |
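Totals like those in the chart above are easy to derive, and cross-check, from the per-quarter counts; the sketch below uses the figures from the table:

```python
# Attendance counts per class, in quarter order Q4 FY2010 .. Q3 FY2011
# (copied from the training chart above).
trained = {
    "Intro to 508":       [85, 92, 91, 99],
    "Word Accessibility": [35, 46, 59, 70],
    "PDF Accessibility":  [25, 21, 32, 38],
    "HTML Developer":     [42, 56, 61, 67],
    "Web Testing":        [51, 92, 107, 125],
    "Procurement":        [18, 38, 41, 47],
}

class_totals = {name: sum(q) for name, q in trained.items()}  # row totals
quarter_totals = [sum(qs) for qs in zip(*trained.values())]   # column totals
grand_total = sum(class_totals.values())                      # 1438, as charted
```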