Aegis Project: Open accessibility everywhere
Edited by Christopher Douce, Wednesday, 21 July 2010, 18:20
I recently attended a public dissemination event that was held by the AEGIS project, hosted by the European headquarters of the company that developed the Blackberry, Research in Motion.
The Aegis project has a strapline containing three interesting keywords: groundwork, infrastructure and standards. When I heard about the project from a colleague, I was keen to learn what lay hidden underneath these words and how they connect to the subject of accessibility.
The Aegis project website states that it 'seeks to determine whether 3rd generation access techniques will provide a more accessible, more exploitable and deeply embeddable approach in mainstream ICT (desktop, rich Internet and mobile applications)' and goes on to say that the project will explore these issues through the development of an Open Accessibility Framework (or OAF). This framework, it is stated, 'provides embedded and built-in accessibility solutions, as well as toolkits for developers, for "engraving" accessibility in existing and emerging mass-market ICT-based products'. It goes on to state that the users of assistive technologies will be placed at the centre of the project.
The notion of 'generations' of access techniques is an interesting concept that immediately jumped out at me when reading this description (i.e. what is the third generation, and what happened to the other two?), but more on this a little later on.
Introductory presentations
The dissemination day began with a couple of contextualising presentations that outlined the importance of accessibility. A broad outline of the project was given by the project co-ordinator, who emphasised the point that the development of accessibility requires the co-operation of a large number of different stakeholders, ranging from expert users of assistive technology (AT) to tutors and developers.
There was a general call for those who are interested in the project to 'become involved' in some of the activities, particularly with regard to understanding different use cases and requirements. I'm sure the project co-ordinator will not be offended if I provide a link to the project contacts page.
AT Generations
The next presentation was made by Peter Korn of Sun Microsystems who began by emphasising the point that every hour (or was it every second?) hundreds of new web pages are created (I forget the exact figure he presented, but the number is a big one). He then went on to outline the three generations of assistive technologies.
The first generation of AT could be represented by the development of equipment such as the Optacon (wikipedia), an abbreviation for Optical to Tactile Converter. This was the first time I had heard of this device, and it represented the first 'take away' lesson of the day. The Wikipedia page looks to be a great summary of its development and its history.
One thing this first generation lacked was an explicit link to a personal computer. The development of the PC gave rise to a second generation of AT that served a wider group of potential users. This generation saw the emergence of specialist AT software vendors, such as companies that develop screen readers and screen magnifiers. Since computer operating systems continue to develop and hardware continues to change (in terms of increases in processing power), this places unique pressures on the users of assistive technology.
For some AT systems to operate successfully, developers have had to apply a number of clever tricks. Imagine, for example, a brand new application package, such as a word processing program, developed for the first generation of PCs.
The developers of such an application would not have been able to write code in a way that allowed elements of the display to be presented to users of assistive technology. One solution for AT vendors was to rely on tricks such as reading the 'video memory' and converting the on-screen display into a form that could be presented to users with visual impairments using synthetic speech.
The big problem with this second generation of AT is that when the underlying operating system of a computer changes, the 'back door' routes that assistive technologies use to gain access to information may become closed, making AT systems (and the underlying software) rather brittle. This, of course, leads to a potential increase in development cost and no end of end user frustration.
The second generation of AT is said to have existed from the late 1980s to the early 2000s. The third generation of AT aims to overcome these challenges, since operating systems and other applications have begun to provide a series of standardised Accessibility Application Programming Interfaces (AAPIs).
This means that different suppliers of assistive technology can write software that uses a consistent interface to find out what information could be presented to an end user. An assistive technology, such as a screen reader, can ask a word processor (or any other application) questions about what could be presented. An AAPI can be considered as a way for one system to ask questions about another.
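To make this idea a little more concrete, here is a minimal sketch of the kind of questions an assistive technology might ask. I have used the standard javax.accessibility package from the Java platform purely because Java features later in this post; the class name and the choice of a simple button are my own illustrative assumptions rather than anything shown on the day.

import javax.accessibility.Accessible;
import javax.accessibility.AccessibleContext;
import javax.swing.JButton;

public class AskQuestions {
    // Ask an accessible object what an AT could learn about it, then recurse
    // into any accessible children it exposes.
    static void describe(Accessible accessible, int depth) {
        AccessibleContext ctx = accessible.getAccessibleContext();
        String indent = "  ".repeat(depth);
        System.out.println(indent + "name:  " + ctx.getAccessibleName());
        System.out.println(indent + "role:  " + ctx.getAccessibleRole().toDisplayString());
        System.out.println(indent + "state: " + ctx.getAccessibleStateSet());
        for (int i = 0; i < ctx.getAccessibleChildrenCount(); i++) {
            describe(ctx.getAccessibleChild(i), depth + 1);
        }
    }

    public static void main(String[] args) {
        // A Swing button implements Accessible out of the box, so it can
        // already answer these questions.
        describe(new JButton("Save"), 0);
    }
}

The important point is not the detail of this particular API, but that the questions (name, role, state, children) are asked through one consistent interface rather than through application-specific back doors.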
Other presentations
Whilst an API can, in some respects, represent one type of standard, there is a whole series of other standards, particularly those from the International Organization for Standardization (ISO) and other standards bodies, that can provide useful guidance and assistance. A further presentation outlined the complex connections between standards bodies and underlined the connection to the development of systems and products for people with disabilities.
A number of presentations focussed on technology. One demonstration used a recent release of the OpenSolaris operating system (which makes use of the GNOME desktop system) to demonstrate how the Orca screen reader can be used in conjunction with application software such as OpenOffice.
With all software systems, there is often a great deal of 'magic' happening behind the scenes. To illustrate some of this magic (such as the AAPI being used to answer questions), a GNOME application called Accerciser was used. This can be viewed as a software developer utility: it is intended to help developers to 'check if an application is providing correct information to assistive technologies'.
OpenOffice can be extended (as far as I understand) using the Java programming language. Java can be considered as a whole software development framework and environment. It is, in essence, a virtual machine (or computer) running on a physical machine (the one that your operating system runs on).
One of the challenges that the developers of Java had to face was how to make its user interface components accessible to assistive technology. This is achieved using something called the Java Access Bridge. This software component, in essence, 'makes it possible for a Windows based Assistive Technology to get at and interact with the Java Accessibility API'.
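There is also the application developer's side of this conversation: the Java Accessibility API only has sensible answers to give if developers provide them. As a hedged sketch (the button and the icon file name are invented for illustration), an icon-only control can be given an accessible name and description so that a screen reader has something meaningful to announce.

import javax.swing.ImageIcon;
import javax.swing.JButton;

public class ProvideAnswers {
    public static void main(String[] args) {
        // An icon-only button carries no text, so by default an assistive
        // technology has nothing useful to read out.
        JButton printButton = new JButton(new ImageIcon("printer.png"));

        // Filling in the accessible name and description gives the
        // accessibility API (and, via the Access Bridge, the screen reader)
        // something sensible to report.
        printButton.getAccessibleContext().setAccessibleName("Print");
        printButton.getAccessibleContext()
                   .setAccessibleDescription("Prints the current document");
    }
}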
On the subject of Java, one technology that I had not heard of before is JavaFX. I understand this to be a Java-based language that has echoes of Adobe Flash and Microsoft Silverlight about it, but I haven't had much time to study it. The 'take home' message is that rich internet applications (RIA) need to be accessible too, and having a consistent way to interface with them is in keeping with the third generation approach to assistive technologies.
Another presentation made use of a Blackberry to demonstrate real-time texting and show the operation of an embedded screen reader. A point was made that the Blackberry makes extensive use of Java, which was not something that I was aware of. There was also a comment about the importance of long battery life, an issue that I have touched upon in an earlier post. I agree: there is nothing worse than having to search for power sockets, especially when you rely on technology. This is even more important if your technology is an assistive technology.
Towards the fourth generation
Gregg Vanderheiden gave a very interesting talk where he mentioned different strategies that could be applied to make systems accessible, such as making adaptations to an existing interface, providing a parallel interface (for example, you can carry out the same activities using a keyboard or a mouse), or providing an interface that allows people to 'plug in' or make use of their own assistive technology. One example of this might be to use a software interface through an API, or to use a hardware interface, such as a keyboard, through the use of a standard interface such as USB.
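As a small illustration of the 'parallel interface' idea, here is a hedged sketch in Java Swing in which the same 'save' action is reachable by mouse click, by a keyboard mnemonic and by a menu accelerator; the class and action names are my own inventions rather than anything demonstrated on the day.

import java.awt.event.InputEvent;
import java.awt.event.KeyEvent;
import javax.swing.JButton;
import javax.swing.JMenuItem;
import javax.swing.KeyStroke;

public class ParallelInterface {
    public static void main(String[] args) {
        // The 'save' action can be triggered with the mouse...
        JButton saveButton = new JButton("Save");
        // ...or from the keyboard via a mnemonic (Alt+S activates the button)...
        saveButton.setMnemonic(KeyEvent.VK_S);

        // ...or via a menu accelerator (Ctrl+S): a second, parallel route to
        // exactly the same activity.
        JMenuItem saveItem = new JMenuItem("Save");
        saveItem.setAccelerator(
                KeyStroke.getKeyStroke(KeyEvent.VK_S, InputEvent.CTRL_DOWN_MASK));
    }
}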
Gregg's talk made me think about an earlier question that I had asked during the day, namely 'what might constitute the fourth generation of assistive technologies?' In many respects this is an impossible question to answer, since we can only identify generations once they have passed. The present, and especially the future, will always remain perpetually (and often uncomfortably) fuzzy.
One thought that I had regarding this area firmly connects to the area of information pervasiveness and network ubiquity. Common household equipment such as central heating systems and washing machines often continue to remain resolutely unfathomable to many of us. I have heard researchers talking about the notion of 'networked homes', where it is possible to control your heating system (or any other device) through your computer.
I remember hearing a comment from a delegate who attended the Open University ALPE project workshop who said, 'the best assistive technologies are those that benefit everyone, regardless of disability, such as optical character recognition'. But what of a home of networked household goods which can potentially offer their own set of wireless accessible interfaces? What benefit can such products provide for users who do not have the immediate need for an accessible interface?
The answer could lie with increasing awareness of the subject of energy consumption and management. Washing machines, cookers and heating systems all consume energy. Exposing information about the energy consumption of different products could allow households to manage energy expenditure more effectively. In my view, the need for 'green' systems and devices may facilitate the development and introduction of products that could potentially contain lightweight, device-level accessibility APIs.
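Purely as a thought experiment (and not, as far as I know, anything that exists within the AEGIS project), a lightweight device-level interface of this kind might look something like the following sketch, in which every name is hypothetical.

// A purely hypothetical sketch of a lightweight, device-level interface that
// exposes both accessibility hooks and energy information over a home network.
public interface NetworkedAppliance {
    String getAccessibleName();          // e.g. "Washing machine"
    String describeCurrentState();       // e.g. "Rinse cycle, 12 minutes remaining"
    double getPowerConsumptionWatts();   // current draw, for energy management
    Iterable<String> listCommands();     // commands a remote (or assistive) UI could invoke
    void invokeCommand(String command);  // e.g. "pause" or "start eco programme"
}

The same interface that lets an energy dashboard ask 'how much power are you using?' would let a screen reader ask 'what are you and what are you doing?'.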
Further development directions
One of the most interesting themes of the day was the idea of the accessibility API that has made the third generation of assistive technologies what it is today. A minor comment that featured during the day was the question of whether we might be able to make our software development tools and environments accessible. Since accessibility and usability are intrinsically connected, a related question arises: are the current generation of accessibility APIs as usable as they could be?
Put another way, if the accessibility APIs themselves are not as usable as they could be, this might reduce the number of software developers who may make use of them, potentially reducing the accessibility of end user applications (and frustrating the users who wish to make use of assistive technologies).
Taking this point, we might ask, 'how could we test (or study) the accessibility of an API?' Thankfully, some work has already been carried out in this area and it seems to be a field of research that is becoming increasingly active. A quick search yields a blog post which contains a whole host of useful resources (I recommend the Google TechTalk that is mentioned). There is, of course, a presentation on this subject that I gave at an Open University conference about two years ago, entitled Connecting Accessibility APIs.
It strikes me that a useful piece of research would be to explore how to conduct studies that evaluate the usability of the various accessibility APIs and whether they might be improved in some way. We should do whatever we can to smooth the development path for developers. Useful tools, in the form of APIs, have the potential to facilitate the development of useful and accessible products.
And finally...
Towards the end of the day delegates were told about a site called RaisingTheFloor.net (RTF). RTF is described as a consortium of organizations, projects and individuals from around the world 'that is focused on ensuring that people experiencing disabilities, literacy problems, or the effects of aging are able to access and use all of the information, resources, services and communities available on or through the Web'. The RTF site provides a wealth of resources relating to different types of assistive technologies, projects and stakeholders.
We were also told about an initiative that is a part of Aegis, called the Open Accessibility Everywhere Group (OAEG). I anticipate that more information about OAEG will be available in due course.
I also heard about the BBC MyWebMyWay site. One of the challenges for all computer users is learning and knowing about the range of different ways in which your system can be configured and used. Sites like this are always a pleasure to discover.
Summary
It's great to go to project dissemination events. Not only do you learn about what a project aims to achieve, but the presentations can often inspire new thoughts and point you toward new (and interesting) directions. As well as learning about the Optacon (which I had never heard of before), I also enjoyed the description of the different generations of assistive technologies. It was also interesting to witness the various demonstrations and to be presented with a teasing display of the complexities that very often lie hidden within the operating system of your computer.
The presentations helped me to connect the notions of the accessibility API and pervasive computing. It also reminded me of some research themes that I still consider to be important, namely, the usability of accessibility APIs. In my opinion, all these themes represent interesting research directions which have the fundamental potential of enhancing the accessibility and usability of different types of technologies.
I wish the AEGIS project the best of luck and look forward to learning more about their research findings.
Acknowledgements
Thanks are extended to Wendy Porch who took the time to review an earlier draft of this post.