Testing Details and Workflow

DAO accessibility testing

  • Involves testing by individuals with disabilities on one or more platforms (Windows, macOS, iOS, Android) with a variety of assistive technologies.
  • The scope of testing is determined by the needs of the customer and the complexity of the system being tested.

Testing timeframes

Timelines may vary based on the length of our testing queue and availability of our testers. Full tests can take a few weeks to allow for testing and report writing.

Once testing is complete, the lab will provide a report documenting the findings. Please review our sample report for an idea of what a report document encompasses. If you are not a CU Boulder entity, please request access from AUL@Colorado.EDU. We can also customize our reporting to fit your needs (e.g., narrative report, spreadsheet, list of issues, video demonstration).

To request testing, please contact us at AUL@Colorado.EDU. Before contacting us, you can also review the usual workflow between the lab and the client (shown further down the page) so you have a good idea of what to expect.

Testing technology

The DAO testing staff includes individuals with disabilities who use different methods of access as well as different assistive technologies. The DAO keeps all of its systems up to date to ensure we’re testing with the most current technologies. If you need a test with older technology, please email us to check availability.

What we don't do

Before reviewing our testing workflow, it is important to note a few things that we will not be able to do:

  • Reveal the identity of testers.
  • Make decisions to launch or stop a service - we can only provide recommendations and evaluations about what we believe to be the best course of action.
  • Correct the application's source code to remediate the issues we encounter.

Testing workflow

All testing follows a standard workflow. Please review the following to see what happens before, during, and after testing.

    Testing process

    • You will meet with a DAO representative to answer initial questions about the project.
    • We will develop a specialized script for testers with language that makes sense to the tester (e.g. replacing visual-based instructions with navigation-based instructions).
      • We have a sample script available. If you are not a CU Boulder entity, please request access to the sample script from AUL@Colorado.EDU.
    • We will schedule testers for the test and assign tools.
    • Testers generally test independently, noting issues as they encounter them in the script and task list. If a test is particularly difficult or note-taking is complex, a group test is completed with a sighted observer taking notes and providing verbal support if needed.
    • You will have a representative on call to address unexpected problems (access denied, data reset, login expired, etc.).
    • We can update you on how the testing is going and share early impressions of the project's accessibility, if needed.
    • We will compile the test report and group problems by severity.
      • Severe issues represent items that create access barriers and need to be remediated.
      • Significant issues represent items that create a great deal of difficulty and should be remediated.
      • Minor issues represent items that are the lowest priority but would be good to remediate.
      • Usability issues represent items that can impact users of any ability.
    • We can relate all accessibility problems to the sections and subsections of the WCAG standards, if needed.
    • We will deliver the final report to the customer and the vendor when warranted.
    • Completed reports are accessible to anyone with a CU Boulder account on Office 365.
    • We can offer video recordings and live demos to showcase the issues.
    • We strongly recommend live demos, especially for service owners and developers with limited accessibility knowledge.
    • We can make recommendations on how to improve the usability of problematic elements.
    • We can provide additional testing, as needed, after the developer attempts to correct the reported issues.
    • When possible, we will work with vendors to help them improve their products holistically, not just at our own institution.
    • We would appreciate it if you completed our feedback form to tell us how we did and how we can improve.