
MAKING THE TREE FALL SOUND: REPORTING WEB ACCESSIBILITY WITH THE VAMOLÀ MONITOR

Matteo Battistelli1, Silvia Mirri2, Ludovico Antonio Muratori2, Paola Salomoni2, Simone Spagnoli1

1 Corso di Laurea in Scienze e Tecnologie Informatiche, Via Sacchi 3, Cesena - Italy

{matteo.battistelli4, s.spagnoli}@unibo.it

2 Dipartimento di Scienze dell’Informazione, Via Mura Anteo Zamboni 7, Bologna - Italy

{silvia.mirri, paola.salomoni, ludovico.muratori3}@unibo.it

Abstract – Stating that the Web is wide borders on banality, yet this very wideness deeply affects any attempt to control it, regulate it, or improve it on the strength of principles and best practices. This is the case of Web accessibility. Whenever Public institutions, according to the law in force, have to provide strategies for Web accessibility promotion and surveillance, they face a diverse, rapidly changing and virtually endless landscape of pages and services over the Net. This happens at every level: from regions to countries, from Public health services to universities. Reports coming from automatic, continuous monitoring of such a wide realm represent a suitable means to make intervention and prevention effective and efficient. This paper presents the VaMoLà project, which integrates a monitor and a validator of Web accessibility according to the Italian law. By using these tools, evaluations of two different samples of Italian Public administration Web sites were conducted, and their results are illustrated in the following.

1. – Introduction

Nowadays, the wideness and dynamism of the Web represent critical parameters whenever contents and services on the Net become the target of some control or whenever a regulation has to be applied to them. This is the case (among others) of Web accessibility compliance with the national laws some countries have enacted since the late 1990s. Even by restricting the assessment to a particular Public institution (such as a Regional government or a Public university), the number of Web pages and services to be evaluated can be prohibitive. The centralization of decision-making, control roles and prevention measures, which typically characterizes Public organizations, amplifies the impact of the amount, variety and variability of Web contents and services. As an example, let us consider the Italian scenario, where Regional governments are responsible for controlling and promoting Web accessibility over a very articulated domain of institutions (from municipalities to Public health). In order to face such a complex goal, both preventive and controlling activities have typically been issued. On one side, the training of Web authors and the spreading of universal design principles have been provided. On the other side, citizen feedback logging services and accessibility benchmark services have been set up. Making these policies effective and efficient is obviously another critical aspect of the big picture sketched above. Indeed, the use of applications customized for the national regulations would surely be the first step toward such effectiveness.


Tailoring automatable evaluations around law requirements would, in fact, guarantee savings of time and resources which would otherwise be wasted in pre-set single procedures. Moreover, the automatic monitoring of Web accessibility, provided it is performed regularly and according to some classifications (geographical, by role) of the controlled institutions, would represent a means to suitably focus every preventive or controlling intervention. In particular, whenever some report reveals the prevalence or the rise of a certain shortcoming, an ad hoc procedure of correction or (re)control could be "locally" applied to some subset of Public Web sites, without involving all of them. A general or particular worsening of one or more aspects of accessibility could be rapidly discovered, so that suitable training or advertising campaigns can be carried out, aimed at surveillance and fixing rather than at correcting any single case. Finally, differently from citizen feedback, automatic accessibility evaluation and monitoring warrants (at least partially) that there is no need of a listener "to make any falling tree sound", so that some barriers can be identified before users suffer them. In this paper we present some results from the VaMoLà Monitor, which has been designed and implemented to gather the accessibility status of Web sites over the wide and varied domains of Italian Public institutions, according to the Italian regulations [7]. Such an application is part of the VaMoLà project [9, 2], which was born from a collaboration between the Emilia-Romagna Region and the University of Bologna. VaMoLà (standing for the Italian "Accessibility Validator and Monitor") combines two different applications: an automated evaluation tool (the VaMoLà Validator [14]) and a tool which periodically controls and records the accessibility level of a predefined set of URLs (the VaMoLà Monitor [13]). In particular, the Validator allows the automatic evaluation of URLs: it is capable of controlling their compliance with all the Italian law requirements [6] (in the following, the Stanca Act requirements) and it also provides some guidelines for the subjective manual evaluation the regulations state. The Monitor periodically queries the Validator (on the basis of some suitable parameters and preferences) and reports the accessibility level of a given set of Web sites. It uses messages and errors from the Validator in order to compute some measures, on the strength of metrics [9, 2]. Both applications have been distributed through SourceForge under the EUPL license. Although only a subset of the Stanca Act requirements is fully automatable, since checks that typically require manual controls and human judgment cannot be performed, a meaningful and structured screening of accessibility levels and trends is produced by the VaMoLà Monitor. In particular, this paper details and discusses some preliminary reports, resulting from the evaluations of two different case studies: the University of Bologna and the Emilia-Romagna Region Web sites (on the whole 885 Web sites, with 43,563 Web pages). The remainder of this paper is organized as follows. Section 2 presents the main work related to the evaluation and monitoring of Web site accessibility. In Section 3 we describe the main issues of the VaMoLà project, illustrating details about the design and development of the Monitor application. Section 4 presents the two samples of evaluated Web sites.
Section 5 discusses evaluation and monitoring results. Section 6 concludes the paper.

2. – Related Work

During the last years, different national and international studies on Web accessibility have been conducted [3]. Some of them are devoted to identifying the accessibility level of different kinds of Web sites (e.g., governmental, Public administration, higher education) according to national or international guidelines. Some of these studies are based only on automatic tests, while others rely on both automatic and manual evaluations. In the following, we present three international works which include the evaluation of Public administration and governmental Web sites.


In 2006, the United Nations commissioned the "United Nations Global Audit of Web Accessibility" [10]. In such a study 100 Web site home pages were evaluated, in order to give an indication of the existing Web accessibility level. The home pages were chosen from twenty countries around the world, five for each country, and one of them was always from the Public administration (very often the main governmental Web site home page was selected for evaluation). The study was conducted by testing only the home page of each Web site, using both automated and manual techniques, against the WCAG 1.0 AAA conformance level [15]. The resulting report presents a detailed analysis for each country and shows that only 3% of the analysed Web sites (all from the European Public sector) were found to meet WCAG 1.0 level A, while none of the sites were WCAG 1.0 level AA or AAA compliant. The European Community commissioned the study "Assessment of the Status of eAccessibility in Europe" in 2007 [4]. The main aim of this work is to follow up on previous research and support the future development of European policy in the field of eAccessibility. The study presents a policy survey, a status measurement based on a set of key indicators, and the results of questionnaires sent to stakeholder groups (such as ICT industries, user organizations and Public procurement officials). The Measuring Progress of eAccessibility in Europe (MeAC) assessment covers a wide variety of ICT products, ranging from TV and telephony to computer hardware and software. The accessibility of Public and private sector Web sites is also part of the survey. Overall, 336 Public and private sector Web sites were evaluated by using a combination of automated and manual testing procedures to assess WCAG 1.0 level A. In particular, 25 pages were collected from each site with an automated retrieval process and then they were automatically and manually analyzed. Only 314 of the sites could be evaluated successfully; the remaining 6.5% produced no results due to problems during retrieval or testing. The outcome is summarized into four conformance classes:

- Pass: the Web site passes the tests for all applicable checkpoints, including a range of checkpoints to be evaluated manually.
- Limited Pass: the Web site passes all checkpoints that can be tested automatically.
- Marginal Fail: the Web site fails certain checkpoints, but the number of failed checkpoints or of failure instances is below specific quantitative thresholds.

- Fail: the Web site fails multiple checkpoints.
It is worth noting that only a very small percentage of Web sites were found to meet accepted international accessibility standards: 8.2% were accessible based on automated testing and just 2.6% when subjected to a more stringent follow-up manual testing. For government Web sites, the accessibility percentages were 12.5% and 5.3% for automated and manual testing, respectively. These results mean that only a small proportion of key Public Web sites (national government, national parliament, and key ministries such as social affairs, employment, health and education) meet the accessibility standards. In a few countries the majority of the tested Public Web sites met the standards, but in many countries none of them did. Another study about the accessibility monitoring of European Web sites was conducted by the European Internet Accessibility Observatory (EIAO) [3] in 2008. The EIAO is an implementation of the automated monitoring application scenario of UWEM [12]. UWEM stands for "Unified Web Evaluation Methodology" and it has been developed by a group of experts and institutions from several European countries. It addresses the methodological needs of various Web accessibility evaluation approaches, ranging from the in-depth evaluation of a single Web site to the large-scale monitoring of thousands of sites. UWEM describes methods for the collection of evaluation samples, test procedures according to WCAG 1.0 level AA, and several different reporting options. EIAO uses the most aggregated reporting format, the UWEM accessibility score card.


This study evaluated 3,232 Public European Web sites (with a stop criterion of evaluating at most 600 Web pages for each site). The system was actually able to download and evaluate 2,317 (71.9%) of the sites, due to problems such as unavailability of the site, robots restrictions, or wrong handling of redirection or character encoding. In [3] a comparison between the EIAO study results and the MeAC ones is presented: there is a positive trend in the overall level of accessibility of European governmental Web sites, even if it remains rather poor. It is worth mentioning that the EIAO sample size is much wider than the MeAC one. An evaluation of the accessibility level of the Web sites of the 100 largest municipalities in the U.S.A. was conducted in 2006 and is described in [5]. Such a study is based on Section 508 [11]. The results show that only some of the city Web sites had accessibility statements, and the overall compliance with Section 508 was low. In [8], a study about the accessibility of university Web sites is presented. In several countries, the accessibility of university Web sites is regulated, as well as that of governmental and Public administration ones. In particular, Kane and his colleagues present a multi-method analysis of the home pages of 100 top international universities, in order to assess the current state of university Web site accessibility. Each site was analysed for compliance with accessibility standards (WCAG 1.0), image accessibility, alternate-language and text-only content, and quality of Web accessibility statements. The primary goals of such a work were to identify common accessibility issues affecting university Web pages and to identify the groups of universities with the greatest accessibility issues. This study utilized both automated and manual tests. In particular, automated accessibility evaluation tools were used to measure compliance with accessibility standards and image accessibility. Each Web site was analysed using four automated evaluation tools: WebXACT/Bobby, Cynthia Says, Functional Accessibility Evaluator (FAE) and WebInSight. Manual tests were used to evaluate aspects of accessibility that cannot easily be tested automatically, such as the presence of accessibility policies, text-only versions of the home page, and alternate-language versions of the home page. The results of such a study are described in [8]: 36 Web sites contained no Priority 1 errors in either evaluation tool, while only 2 university home pages were shown to be free of all Priority 1, 2 and 3 errors. In such a study 40 European university home pages were analysed: the average number of Priority 1 errors per page was 0.53, while the average number of Priority 1, 2 and 3 errors per page was 4.45. Grouping university sites geographically, this is the second best result, after the 9 Australian and New Zealand university Web sites which were evaluated, while the evaluation of North American university home pages (37 Web sites were analysed) reports 0.55 Priority 1 errors and 4.81 Priority 1, 2 and 3 errors per page.

3. – The VaMoLà Project

The VaMoLà Project was born from a collaboration between the Emilia-Romagna Region and the University of Bologna [9, 2]. VaMoLà is the Italian acronym for "Accessibility Validator and Monitor". Its goals are the automatic evaluation of Web pages according to the 22 requirements [6] of the Italian regulations (also known as the Stanca Act [7]) and the monitoring of the conformity of Italian Public Administration Web sites to them. In order to meet these goals, two different applications were developed: an automated evaluation tool (the VaMoLà Validator [14]) and a tool which periodically controls and records the accessibility level of a predefined set of URLs (the VaMoLà Monitor [13]). The VaMoLà Validator (Figure 1) is open source software, based on AChecker [1], available both as a Web service and as a Web-based application. The Validator automatically checks a single URL and allows end-users to specify which Stanca Act requirements [6] the evaluation has to control. After the evaluation process, the system produces an XHTML page with an analysis of the actual and potential barriers it has found, as well as some guidelines for the subjective manual evaluation the Italian regulations state.


Details on the VaMoLà Validator design issues and implementation can be found in [9].

Figure 1. A screenshot of the VaMoLà Validator Home Page
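To make the interaction between the two tools more concrete, the following is a minimal PHP sketch of how a script could query the Validator Web service for one URL and count the failed checks. The endpoint URL, the parameter names (uri, guide) and the XML response layout are our assumptions for illustration, not necessarily the actual VaMoLà interface.

```php
<?php
// Hypothetical call to the Validator Web service. Endpoint, parameter names and
// response structure are assumptions for illustration, not the documented VaMoLà API.
$endpoint = 'http://validator.example.org/checkacc.php';   // assumed service URL
$query = http_build_query([
    'uri'   => 'http://www.comune.bologna.it/',            // page to evaluate
    'guide' => 'STANCA',                                    // assumed identifier for the Stanca Act checks
]);

$response = file_get_contents($endpoint . '?' . $query);
$xml = simplexml_load_string($response);

// Assumed response layout: one <result> element per executed check.
$errors = 0;
foreach ($xml->results->result as $result) {
    if ((string) $result->resultType === 'Error') {
        $errors++;
    }
}
echo "Failed checks: $errors\n";
```

A script of this kind, run once per URL, is all the Monitor needs in order to feed its periodic reports.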

3.1 – The VaMoLà Monitor

The VaMoLà Monitor (Figure 2) is an open source Web-based application [13]. Only authorized users can access it, in order to check the accessibility degree of a set of Public Administration Web sites. The VaMoLà Monitor records the results of evaluations (conducted by using the VaMoLà Validator) over time and shows comparisons between the current situation and previous ones. Such a system allows the automatic monitoring of a set of URLs and periodically creates reports, in order to constantly check the accessibility level of Web pages (one or more for each Web site).

Figure 2. A screenshot of the VaMoLà Monitor Home Page

The Emilia-Romagna Region is using such an application to regularly evaluate its own Web sites (695 sites) and to study how to correct errors and increase the accessibility level (thanks to the periodical reports the Monitor generates).


In order to better exploit the results of the monitoring of regional Web sites, the application also provides a navigation system based on geographical criteria.

3.2 – Monitor Design Issues

The Monitor is composed of two components: (i) a script devoted to storing data from periodical evaluations in a database and (ii) a Web interface which allows users to exploit and navigate accessibility reports. The Monitor has been developed in PHP and the scripts which interact with end-users generate XHTML 1.0 Strict Web pages. Data are stored in a MySQL database. Currently, both the Validator and the Monitor are hosted on a LAMP server at the University of Bologna.

3.2.1 Evaluations Storage

Evaluation results are collected by a script which periodically (once a night, or once a week) runs in the background and calls the Web service version of the Validator for each URL to check. Such a script stores the resulting data in the database. In particular, for each Web site, the following information is recorded:
- category: a Web site could be related to government, municipality, region, province, university, health board, etc.;
- municipality;
- province;
- region.
Such information allows searches and Web-based navigation based on geographical localization and Web site categories. A configuration file sets the level of the pages the Monitor has to evaluate (the first level is the home page, the second level is all the URLs directly linked from the home page, and so on). Through the configuration file it is also possible to set a maximum number of pages to evaluate for each single Web site. For each evaluation, the script saves into the database the following data:
- the evaluation date;
- the evaluated URL;
- the number of errors;
- the number of suggested manual controls (the Validator suggests some checks the user should do in order to complete the Web site evaluation);
- the number of potential errors (the Validator finds some patterns which could represent an error, but it is not able to automatically decide whether there is an error or not; the user should check manually);
- the total number of checks which were done for each kind of error.
The periodical evaluations make it possible to outline the accessibility level trend for each URL in the database.
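A minimal sketch of how such data could be organized is shown below; the table and column names are our own illustration and do not necessarily match the actual Monitor database.

```php
<?php
// Illustrative storage schema for the data listed above (names are assumptions,
// not the actual VaMoLà Monitor tables): one row per site, one row per
// nightly/weekly evaluation of a URL.
$pdo = new PDO('mysql:host=localhost;dbname=vamola_monitor', 'monitor', 'secret');

$pdo->exec("
    CREATE TABLE IF NOT EXISTS site (
        id           INT AUTO_INCREMENT PRIMARY KEY,
        url          VARCHAR(255) NOT NULL,
        category     VARCHAR(64),    -- government, municipality, region, province, university, health board, ...
        municipality VARCHAR(128),
        province     VARCHAR(128),
        region       VARCHAR(128)
    )");

$pdo->exec("
    CREATE TABLE IF NOT EXISTS evaluation (
        id               INT AUTO_INCREMENT PRIMARY KEY,
        site_id          INT NOT NULL,
        evaluated_url    VARCHAR(255) NOT NULL,
        evaluated_at     DATE NOT NULL,
        errors           INT NOT NULL,   -- failed checks
        manual_checks    INT NOT NULL,   -- suggested manual controls
        potential_errors INT NOT NULL,   -- patterns the Validator cannot decide on automatically
        total_checks     INT NOT NULL,   -- checks performed
        FOREIGN KEY (site_id) REFERENCES site(id)
    )");
```

With the evaluation date stored per row, the trend for each URL can be obtained simply by selecting its rows ordered by date.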

3.2.2 Reports and User Interface

Evaluation results and syntheses are available to end-users through a Web-based interface. The Monitor is a multi-user application, so users access it by logging in with username and password. End-users can create and exploit their accessibility reports according to some options they can set. A fundamental element in the Monitor interface is the Options Menu (see Figure 2), which allows users to set some characteristics (related to the Web sites to evaluate and to the evaluation itself) so as to customize reports. In particular, it is possible to select:
- which set of guidelines has to be taken into account in the evaluation (Stanca Act, WCAG 1.0, WCAG 2.0) and the priority level (A, AA, AAA);
- which requirements of the Stanca Act or which WCAG guidelines have to be included in the evaluation, in order to verify the conformity of the checked URL to them;


- the categories and the sub-categories of the Web sites that the end-users want to evaluate;
- a previous date, in order to compare the current accessibility level with the earlier one;
- the geographical area to monitor.
After setting these options, end-users can exploit evaluation results through two different forms of report: a graphical one and a detailed one. The graphical report is an SVG map of the selected geographical area (see Figure 2). Different colors in the map highlight different accessibility levels of the monitored Web sites: the lighter the color of an area on the map, the higher the level of accessibility of the Web sites in that area (the number of pages without errors). On the contrary, a dark color means a low percentage of Web sites with no errors according to the Stanca Act. A script based on a map "mouseover" event shows a tooltip with the percentage of Web sites without errors in an area. Besides such a graphical report, a set of textual tables is provided in order to show detailed evaluation results. Each table is related to a specific geographical area and it displays the total number of analyzed Web pages, the percentage of pages without errors, the total number of errors and the average number of errors per page. Further details can be shown in other tables, which provide the complete list of evaluated URLs, grouped by category or by analyzed requirements (Figure 3). This way, the single accessibility barriers raised by each Web site are highlighted, depending on the violated requirements.

Figure 3. A screenshot of the evaluation results in the VaMoLà Monitor
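The per-area figures shown on the SVG map and in the tables (percentage of pages without errors, average errors per page) can be computed with a simple aggregation over the stored evaluations. The sketch below reuses the illustrative schema introduced in Section 3.2.1; it is an example of the kind of query involved, not the Monitor's actual code.

```php
<?php
// Per-province report (same illustrative schema as above): percentage of evaluated
// pages with zero errors and average errors per page, for the latest evaluation date.
$pdo = new PDO('mysql:host=localhost;dbname=vamola_monitor', 'monitor', 'secret');

$stmt = $pdo->query("
    SELECT s.province,
           COUNT(*)                AS pages,
           100 * AVG(e.errors = 0) AS pct_pages_without_errors,
           AVG(e.errors)           AS avg_errors_per_page
    FROM evaluation e
    JOIN site s ON s.id = e.site_id
    WHERE e.evaluated_at = (SELECT MAX(evaluated_at) FROM evaluation)
    GROUP BY s.province
    ORDER BY pct_pages_without_errors DESC");

foreach ($stmt as $row) {
    printf("%-20s %6d pages  %5.1f%% error-free  %5.2f errors/page\n",
           $row['province'], $row['pages'],
           $row['pct_pages_without_errors'], $row['avg_errors_per_page']);
}
```

The same aggregation, grouped by region or by category instead of province, drives the map coloring and the detailed tables described above.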

4. – Case Studies

The Stanca Act [7] lists the entities (governmental or not) which have to provide accessible Web content. In particular, the Web sites of the following ones have to be compliant with the Law 4/2004 requirements:
- economic Public entities;
- private firms which provide Public services;
- aid and rehabilitation Public entities;
- transport and telecommunication firms with mainly Public funding;
- information services contracting firms.
Nowadays, there is no official register with the list of URLs which have to be compliant with the Stanca Act, neither from a national nor from a regional or municipal point of view. In order to monitor a set of URLs which would be statistically significant, a phase of Web site gathering and classification is necessary. We are still gathering such URLs, which represent the wide Italian Public Administration landscape.


A first and partial sample of Web sites (related to the whole Italian territory) was used during the development and testing phases of the VaMoLà project. Each site in this sample was classified on the basis of its category (Research Centre, Municipality, Province, Region, Government, Health board, University) and it was linked to a municipality, a province and a region. Unfortunately, the geographical distribution of the sample was not homogeneous enough. Moreover, the number of Web sites in the sample (about 1,500 URLs) was not sufficient, so it was not representative of the Italian Public Administration. In order to assess both the Monitor application features and the object of the monitoring (i.e., the case studies we have chosen), it has been necessary to extend and better define the set of evaluated Web sites. Obtaining an adequate sample which represents the Italian situation is a complex and time-consuming task, so we have decided to proceed by steps. Hence, thus far, we are still collecting Web sites related to single geographical areas or single categories. In the meanwhile, we can completely monitor different realms over the Italian territory, instead of evaluating an incomplete national situation. In this paper we present the Monitor results coming from two complete samples, referring to different contexts. In particular, we show the evaluations of the Emilia-Romagna Region Web sites and the University of Bologna ones. The main references for the collection of the Emilia-Romagna Region Web sites were the official portal of the Region, called ERMES (http://www.regione.emilia-romagna.it/), and the "Emilia-Romagna digitale" (in English, "Digital Emilia-Romagna") portal (http://www.regionedigitale.net). Starting from them, we gathered the URLs of Regional institutions and entities, of Public health services, and of mountain communities and municipality unions. Further data about the Municipalities and Provinces Web sites were found through http://www.regionedigitale.net. In order to collect the University of Bologna Web sites, the University Statute was taken into account. It presents a complete list of the faculties, the degree courses, the departments and all the other institutes related to the previously mentioned entities. We then also included in the sample the Web sites of libraries, specializing didactical institutions, and services for students and staff. After the URL collection phase, we had gathered 637 Web sites related to Emilia-Romagna Region entities (37,299 URLs on the whole) and 248 Web sites related to the University of Bologna (6,264 URLs on the whole). These two samples have been stored into the database and evaluated through the Monitor application. The evaluation results are shown in the following Section.

5. – Evaluation and Monitoring Results

In this Section we present the results of a monitoring session (conducted in January 2011) about the accessibility level of the two above-mentioned samples: the Emilia-Romagna Region and the University of Bologna Web sites. On the whole, this evaluation phase has involved 885 Web sites. The Monitor allows administrator users to specify the depth of the evaluation; in other words, it is possible to choose up to which level the Web pages of a site are verified. Level 0 corresponds to the home page; Level 1 corresponds to all the pages of the previous level plus the pages directly linked from the home page; Level 2 corresponds to all the pages of the previous level plus the pages directly linked from the Level 1 pages, and so on. Moreover, it is possible to set a maximum number of pages to evaluate for each level. In our test, we have decided to evaluate three levels of Web pages (Level 0, Level 1 and Level 2) and a limited number of pages for each Web site in the database. The final samples count 37,299 pages for the Emilia-Romagna Region and 6,264 pages for the University of Bologna. Unfortunately, some URLs were unreachable (4 URLs for the University of Bologna and 26 for the Emilia-Romagna Region) and so they were not included in the final reports shown in the following.
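The level definition above corresponds to a breadth-first visit of the link graph with a depth limit and a per-site cap on the number of pages. The following PHP sketch illustrates the idea; the function name and the naive link extraction are ours, for illustration only, and do not reproduce the Monitor's actual collection code.

```php
<?php
// Illustration of the level-based page selection described above: breadth-first
// visit from the home page, stopping at a given level and at a maximum number
// of pages per site. Link extraction is deliberately simplified.
function collectPages(string $homePage, int $maxLevel, int $maxPages): array
{
    $queue   = [[$homePage, 0]];
    $visited = [];

    while ($queue && count($visited) < $maxPages) {
        [$url, $level] = array_shift($queue);
        if (isset($visited[$url])) {
            continue;
        }
        $visited[$url] = true;

        if ($level >= $maxLevel) {
            continue;                      // do not follow links beyond the configured level
        }
        $html = @file_get_contents($url);
        if ($html === false) {
            continue;                      // unreachable URLs are simply skipped
        }
        // Naive extraction of absolute links (enough for the illustration).
        preg_match_all('/href="(https?:\/\/[^"]+)"/i', $html, $matches);
        foreach ($matches[1] as $link) {
            $queue[] = [$link, $level + 1];
        }
    }
    return array_keys($visited);
}

// Example: Level 0, 1 and 2 pages of a site, at most 100 pages.
$pages = collectPages('http://www.unibo.it/', 2, 100);
```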


As we hinted in the Introduction, reports from the Monitor can be used to issue interventions aimed at fixing or preventing specific critical situations. Table 1 lists the average failed checks per page for some institutions of the University of Bologna. The Web sites of each category are typically authored by the same managers, so they represent potential targets for any necessary policy. As we can see, the home pages of Libraries, Degree Courses and Schools (which are a sort of specializing didactical institution) exceed the 2% threshold in average failed checks per page. These three categories are potential targets on which to focus any prophylaxis to improve accessibility. Indeed, Schools and Degree Courses are known to be migrating under the portal Web authoring office, so the institution in charge of controlling Web accessibility can decide to concern itself just with Libraries. By going into the details of the accessibility reports of the Libraries Web sites, we can see further evidence of shortcomings.

Table 1. University of Bologna Web Sites Average Failed Checks/page report

Category            Nr. Web sites   Nr. Web pages   % Average Failed Checks/Page
                                                     Level 0        Level 2
University Portal          1              38           1.13           0.30
Libraries                  3              93           3.00           2.12
Research Centers          19             558           1.64           1.30
Degree Courses            75           1,973           2.07           1.55
Departments               71           1,728           1.20           0.73
Faculties                 23             318           1.64           0.30
Schools                    9             149           2.02           0.75
Services                  31           1,026           1.20           0.72
Structures                16             381           1.39           0.40

Table 2. University of Bologna Web Sites Average Failed Checks/page report (grouped by Stanca Act Requirements topics)

Category            % Average Failed Checks/page related to:
                    Standards   Multimedia   Presentational   Forms    Tables   Scripting
                                              aspects
University Portal     42.20        0.26          0.06          6.46     5.68      0.00
Libraries             99.40       32.63          0.51         31.25     0.01      0.00
Research Centers      88.12       19.38          0.48          6.95     1.05      0.01
Degree Courses        86.39        2.70          0.51          0.77    20.57      0.13
Departments           69.61        8.75          0.23          5.06     1.35      0.04
Faculties             31.68        0.22          0.11          5.60     7.33      0.00
Schools               65.12        7.69          0.34          5.06     2.33      0.02
Services              79.14        2.10          0.14         13.69     1.62      0.15
Structures            47.86        3.36          0.14          6.45     4.20      0.00


Table 3. Emilia-Romagna Region Web Sites Average Failed Checks/page report

Category                                      Nr. Web sites   Nr. Web pages   % Average Failed Checks/Page
                                                                               Level 0        Level 2
Agencies, Firms, Institutions, Societies            34             1,539         2.29           1.41
Associations, Consortiums, Confederations           37             1,112         2.62           1.90
Libraries, Museums                                  24             1,010         0.80           0.70
Chambers of Commerce                                 8             1,194         0.24           0.15
Governmental – Inter-municipal Associations          5               146         4.35           4.35
Governmental – Districts                             1                31         0.35           1.01
Governmental – Municipalities                      342            15,985         5.00           2.39
Governmental – Mountain Communities                  8               450         2.37           2.25
Governmental – Provinces                             8               650         2.26           1.67
Governmental – Regions                               1                38         1.12           0.56
Governmental – Municipalities Unions                12               700         1.59           0.25
Education                                           13               463         1.22           0.96
Promotion                                            6               359         1.41           0.92
Employment Agencies                                 13               500         1.38           1.15
Information Portals                                 42             2,846         1.51           1.06
Health Boards                                       68             3,245         0.71           0.37
Social Security, Civil Protection                    3               115         1.75           1.12
Sport                                                2                45         0.94           1.03
Transports                                          10               607         0.11           0.24

In particular, Table 2 shows a high percentage of average failed checks on requirements related to standards compliance, to alternatives for multimedia and non-textual elements, and to form labels. Moreover, Table 2 points out high percentages of failed checks for the Research Centers Web sites. At this point, once the necessary manual controls have been made, a focused first intervention on the Libraries and Research Centers Web sites is clearly advisable. Table 3 shows the wider and more varied scenario of the Regional institutions. Two categories show values of average failed checks per page that are above the most common figures: the municipalities and the inter-municipal associations. Since single municipalities share the authoring of the inter-municipal associations' sites, such results are quite expected and, actually, any intervention is to be applied to the former ones. It is worth noting that such Web sites amount to more than half of the whole Emilia-Romagna sample (637 sites), hence this result should be analyzed in depth in order to outline critical situations. Also in this case, we can descend into deeper and deeper details, up to obtaining reports such as the histogram in Figure 4. Here the abscissa reports the percentage ranges of average failed checks per page, while the ordinate reports the corresponding number of Web sites. The diagram shows that about one third of the Municipalities Web sites have a percentage of average failed checks over 2%. Moreover, further refinements show that about one tenth of them exceed 10% of average failed checks.


Once again, starting from these data and after the necessary manual controls, we can focus an intervention on a restricted number of institutions (with an obvious saving of time and resources).

Figure 4. Municipalities Web sites Histogram

Finally, let us consider Table 4, which reports the percentage of average failed checks per page for the Web sites of the ten main municipalities of the Emilia-Romagna Region. Here all the values are under the 1% threshold, as is to be expected from bigger institutions with good organization and sufficient funds.

Table 4. Municipalities Average Failed Checks/page report

Municipality        % Average Failed Checks/Page
Bologna                        0.65
Cesena                         0.22
Ferrara                        0.21
Forlì                          0.13
Modena                         0.93
Parma                          0.05
Piacenza                       0.09
Ravenna                        0.04
Reggio Emilia                  0.31
Rimini                         0.27

6. – Conclusion

The VaMoLà Monitor has been designed and developed to provide a synthesis of the accessibility screening of a wide, varying and diverse domain. Its aim is to support Public institutions in facing issues and shortcomings, in order to increase the accessibility level of their Web pages. A set of parameters is customizable according to end-users' expectations and preferences. The presented case studies are meant to show that the synthesis view provided by the Monitor represents a suitable cue for any targeted intervention, with the consequent saving of resources. We are working to set up wider monitoring experiments by including URLs of Italian Public institutions which have to be compliant with the Stanca Act requirements, in order to produce a more complete set of reports. The necessity of drawing paths to descend into deeper details, once one or more critical situations have been found, is feeding further research. Moreover, future work on the Monitor involves the generation of downloadable reports in different formats, such as CSV, Microsoft Excel and PDF.


Finally, studies about metrics and quantitative measurements have been conducted and are still ongoing.

Acknowledgement

The authors would like to thank Giovanni Grazia and Jacopo Deyla (from the Emilia-Romagna Region), Marina Vriz and Ennio Paiella (from ASPHI) and Gregory R. Gay (from ATRC, Faculty of Information, University of Toronto).

References

[1] ATRC, AChecker. Available from: http://www.atutor.ca/achecker/index.php, 2010.

[2] Battistelli, M., Mirri, S., Muratori, L.A., Salomoni, P., Spagnoli, S. Avoiding to dispense with accuracy: A method to make different DTDs documents comparable. In Proceedings of the 25th Symposium On Applied Computing (SAC 2010) (Sierre, Switzerland, March 22-26, 2010). ACM Press, 2010.

[3] Bühler, C., Heck, H., Nietzio, A., Goodwin Olsen, M., and Snaprud, M. Monitoring Accessibility of Governmental Web Sites in Europe. In Proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP 2008) (Linz, Austria, July 9-11, 2008). Springer, Lecture Notes in Computer Science, 2008, 410-417.

[4] Cullen, K., Kubitschke, L., and Meyer, I. Assessment of the status of eAccessibility in Europe. Available from: http://ec.europa.eu/information_society/activities/einclusion/library/studies/meac_study/index_en.htm, 2007.

[5] Evans-Cowley, J. The Accessibility of Municipal Government Websites. In Journal of E-Government, 2(2):75-90, 2006.

[6] Italian Parliament. Decreto Ministeriale 8 luglio 2005 - Allegato A: Verifica tecnica e requisiti tecnici di accessibilità delle applicazioni basate su tecnologie internet. Available from: http://www.pubbliaccesso.gov.it/normative/DM080705-A.htm, 2005.

[7] Italian Parliament. Law nr. 4 – 01/09/2004. Official Journal nr. 13 – 01/17/2004, January 2004.

[8] Kane, S.K., Shulman, J.A., Shockley, T.J., and Ladner, R.E. A Web Accessibility Report Card for Top International University Web Sites. In Proceedings of the W4A 2007.

[9] Mirri, S., Muratori, L., Roccetti, M., and Salomoni, P. Metrics for Accessibility: Experiences with the VaMoLà Project. In Proceedings of the 6th ACM International Cross-Disciplinary Conference on Web Accessibility (W4A 2009) (Madrid, Spain, April 2009). ACM Press, New York, 2009, pp. 142-145.

[10] Nomensa. United Nations global audit of Web accessibility. Available from: http://www.un.org/esa/socdev/enable/documents/fnomensarep.pdf, 2006.

[11] U.S. Rehabilitation Act Amendments. Section 508. Available from: http://www.webaim.org/standards/508/checklist, 1998.

[12] Web Accessibility Benchmarking Cluster. Unified Web Evaluation Methodology (UWEM 1.2 Tests), 2007. Available at: http://www.wabcluster.org/uwem1_2/UWEM_1_2_TESTS.pdf Last access on August/2008.

[13] VaMoLà Project. VaMoLà Monitor. Available at: http://sourceforge.net/projects/vamola-monitor/, 2010.

[14] VaMoLà Project. VaMoLà Validator. Available at: http://sourceforge.net/projects/vamola-validate/, 2010.

[15] World Wide Web Consortium. Web Content Accessibility Guidelines (WCAG) 1.0. Available at: http://www.w3.org/TR/WCAG10/, 1999.

[16] World Wide Web Consortium. Web Content Accessibility Guidelines (WCAG) 2.0. Available at: http://www.w3.org/TR/WCAG20/, 2008.