Hearing on the Web Accessibility Directive completed

The public hearing on the implementing acts of the Web Accessibility Directive has now been completed. Funka responded to the hearing regarding both the monitoring methodology and the model accessibility statement.

Funka has worked extensively with this topic for the European Commission, leading the study on monitoring methodologies (SMART 2014/0061) and acting as pennholder for monitoring in the WADex SubGroup. We also support several individual member state governments with the national implementation of the directive.

Monitoring the accessibility of millions of websites, intranets, documents and apps is tricky. Doing it for real, covering all of them, would be extremely costly. If you monitor just a few websites in each country, the result will not be statistically valid, and the chance – or maybe risk? – of being monitored will be too small for the regulation to have the desired effect. With automatic tools, monitoring large numbers of websites is a lot cheaper, but the quality of the results is highly questionable and they become much more complex to interpret.

Proposed monitoring methodology

This is why both the SMART study and the WADex SubGroup have concluded that a combination of qualitative and quantitative tests is necessary. This is also what the draft implementing act proposes:

The qualitative in-depth testing of compliance covers all success criteria in the EN standard and will be performed on a smaller number of websites, intranets, documents and apps.

The quantitative simplified testing of non-compliance will be performed as cluster sampling on a larger number of websites. Please note that the result of this simplified testing says nothing about the accessibility of a website; it can only find examples of flaws.

Together, these methods have the potential to provide both details on the level of accessibility and high-level trends, supporting the member states in prioritising the issues or sectors where more focus or training is needed.

Other significant recommendations that I am happy to recognise in the draft are:

  • At least 20% of the websites selected for in-depth monitoring should be chosen after consultation with disabled persons' organisations. This makes it more probable that interfaces important to the target group are being monitored.
  • The monitoring is going to be performed annually. This is key to keeping the topic of accessibility on the agenda every time budgets are planned and decided.
  • In the simplified testing for non-compliance, the needs of all user groups covered in the EN standard must be tested. This is a very good way to make sure the simplified testing is at least basically relevant. Far too many of the automatic tools check a very limited part of accessibility for end users, focusing as they do on technical issues that are easy to test. This is also highlighted in the draft, which urges more innovation in automatic tools.
  • In the in-depth testing, stepwise processes and user interaction with forms, such as e-services, must be tested. This is an extremely critical quality criterion for making sure the monitoring is relevant for the services in use, and it also pushes the tool vendors to create new functionality.

The missing parts

Unfortunately, some crucial items have been taken out of the draft:

  • The draft says that member states might change the easy checks over time. But when performing simplified spot-checking for non-compliance, it is absolutely essential to alter the tests from one monitoring period to the next. If the same tests are performed each year, we know from experience that website owners and suppliers alike will focus only on these. The tests must be altered between monitoring periods, and preferably randomised. Only in this way does compliance with the full standard remain the goal, even when only small parts of the standard are tested.
  • Since the public sector bodies covered by the directive are going to publish a statement on their level of accessibility, they need to know how accessible their websites, intranets, documents and apps are. This generally means that they perform an evaluation, either internally or with external experts. If such an evaluation is done, the draft says that the monitoring body in each member state can use this evaluation as part of its monitoring. This makes sense; there is no need to perform the same monitoring twice.

But, there are several problems here:

    • For the monitoring body to be able to use the evaluation done at website owner level, it can’t be older than three years according to the draft. Three years! Wow. Can you even remember how your website looked three years ago?
    • The draft sets very few quality criteria for the evaluations done at website owner level. This will make it difficult to ensure they are comparable to the member state level monitoring.
    • In the draft model accessibility statement, the link to an evaluation is only optional, which means the website owner does not have to publish it. This is surprising, given that the directive explicitly states that the statement must be "detailed, comprehensible and clear" (article 7). The lack of transparency would be problematic from both an end user and a monitoring perspective.

Another difference from the recommendations is that the number of monitored objects is very low. This has been a constant discussion between those who want the monitoring to be an effective tool for implementation and those who claim that the monitoring exercise would become too expensive. The financial argument is usually presented as wanting to spend the money on increasing accessibility instead of on monitoring the (lack of) accessibility. I do support that view, and frankly I don't think the exact number of websites monitored is the crucial point here. Making sure the monitoring is of high quality, relevant, transparent and trustworthy is much more important. But the monitoring must make sense; if the sample gets too small, it won't show anything.

The chance – or risk? – of being monitored may well be a driving force for the website owners that still don’t see the positive effects of accessibility.

Susanna Laurin

Public feedback on the implementing acts under the Web Accessibility Directive

Funka's response to the implementing acts of the Web Accessibility Directive regarding monitoring methodology

Funka's response to the implementing acts of the Web Accessibility Directive regarding the model accessibility statement


Susanna Laurin

Title: Chief Research and Innovation Officer

susanna.laurin@funka.com

+46 8 555 770 61