Blind Usability with NetBeans IDE at World Usability Day
Created by the Usability Professionals' Association, World Usability Day is an annual initiative that promotes good usability research, development and practices, to make sure that technology services and products are user friendly and make life easy.
In this interview, Andreas Stefik, a computer science professor and member of the NetBeans Dream Team, talks about his work around usability for visually impaired developers, as well as his team's participation in this year's World Usability Day which took place worldwide in November.
Please tell us about you and what you do.
My name is Andreas Stefik. I am an Assistant Professor of Computer Science at Southern Illinois University Edwardsville working on computer programming technology. The current focus of my lab is to work on how humans interact with programming languages and development environments, including how these tools are used by the blind and visually impaired.
Together with my colleagues (Chris Hundhausen at Washington State University and Derrick Smith at the University of Alabama in Huntsville), students, and other professionals, and with support from the National Science Foundation, we are working to research and create tools, programming languages, auditory debuggers, and an educational infrastructure for blind individuals to learn to program.
Most people do not realize that there are about four million blind or visually impaired people living in the United States, and that 61% of them are out of the workforce: an extraordinarily high figure, even accounting for the current recession. Instead of only making web browsers or other tools more usable for the blind, my team's aim is to also empower blind individuals to create their own technology, fostering creativity and providing opportunities for worthy, high-paying careers in the process.
How did you choose to base your work on the NetBeans IDE?
When I was in graduate school at Washington State University, I collaborated with developers at Microsoft on trying to make Visual Studio more blind-friendly. However, the reality was that since Visual Studio is a closed source tool I was constantly blocked by some aspect of the environment that I couldn't control. There are a number of open source development environments out there, but the NetBeans environment has a very strong user community, and that was extraordinarily helpful when I got started.
What was your contribution to World Usability Day (WUD) this year?
We participated in the broader World Usability Day in the St. Louis area at the science center. One of the local groups in the area knew about some of our work and had asked us to come demo and let folks give it a try.
During the event, people from around the St. Louis area came up to a series of booths focused on the usability of various products. For example, some booths tackled web usage and browsers; others, cellular phone usage (for example, changing the ringtone on four different devices); and so on. When a person arrived at a booth, they would do the activity and receive a stamp if they completed it, which they could use to get small prizes.
In our case, when people came to our booth, we had them interact with software written in our blind programming environment, without seeing the screen. One of the activities, for example, was a simple guessing game where the computer gives you instructions aurally (e.g., your number is too low/high) and you have to guess that number without seeing the screen. We didn't have people program blind, as this would have been pretty tough for this crowd, but hopefully it gave folks some small sense of how blind or visually impaired individuals interact with a computer.
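The booth activity can be sketched as a tiny console program. The feedback phrases are taken from the description above, but the class name, structure, and the choice to print (rather than speak) each phrase are illustrative assumptions, not the actual demo code; in the real activity, the phrases would be routed to a text-to-speech engine.

```java
import java.util.Random;
import java.util.Scanner;

// A minimal sketch of the guessing-game booth activity: the program picks
// a number and answers every guess with a spoken-style phrase, so a player
// never needs to see the screen.
public class GuessingGame {

    // Pure helper: turn a guess into the phrase a screen reader would speak.
    static String feedback(int guess, int secret) {
        if (guess < secret) return "Your number is too low.";
        if (guess > secret) return "Your number is too high.";
        return "Correct! You found the number.";
    }

    public static void main(String[] args) {
        int secret = 1 + new Random().nextInt(100);
        Scanner in = new Scanner(System.in);
        System.out.println("Guess a number between 1 and 100.");
        int guess;
        do {
            guess = in.nextInt();
            System.out.println(feedback(guess, secret)); // speak this line
        } while (guess != secret);
    }
}
```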
Do you work specifically on blind usability or other usability issues as well?
Blind usability is one of many projects I am working on. Probably the largest related goal is centered around the usability of computer programming languages. Essentially, my lab and I are studying the syntax and semantics of programming languages in an effort to make them more intuitive.
An example: most developers probably think words like "for" or "while" are perfectly reasonable choices for looping constructs in a programming language, but in a recent study of ours, novices rated words like "repeat" as approximately 673% more intuitive. In other words, modern programming languages are much less intuitive than some computer scientists think they are.
What are some of the issues that your work addresses for visually impaired developers?
The classic problem the blind face for almost any tool is probably the “Where am I?” problem, which basically suggests that blind developers have to continually determine their context of use. For example, if you are at line 257 in a document, what is the context of use of that line? What would you change if there is a bug? What code surrounds that line?
We try to tackle the issue in a number of ways, one of which is that we have integrated a talking debugger into the NetBeans IDE. If you use Visual Studio with a screen reader, for example, and "step over" a piece of code, Visual Studio will just say what key you pressed (for example, "F5"). In our tools, we give the blind developer context of use. We might tell the blind developer what the values of variables are as they go (for example, "a to 5"), what functions they've called (for example, "calling action main"), or details about the state of a program (for example, "loop iteration 4").
But a second problem is that, when you design software for the blind, you have to be very careful about which words or sounds are relayed to the user, or else you can accidentally make the tools very difficult to use. I spent a few years toying with auditory choices in my debugger, for example, but as I was working on these problems, I began to realize that while I was spending considerable effort making the audio output sensible, the actual programming languages we were having people use (for example, C) were pretty unintuitive. So I thought, "What if I just write my own programming language, and then run lots of formal experiments, using the evidence to improve how intuitive it is?" That goes back to the previous question. For example, in our language, you don't have "for" loops. Instead, you can say phrases like "repeat 10 times," the meaning of which is pretty obvious, even to non-programmers. The language also has much less syntax, which is desirable for the blind, as screen readers have to read those lines literally, character by character, which can be extraordinarily tedious.
The theme of WUD this year was "Communication." How does this fit in with your work?
For sighted users, communication can mean a number of things, from the color or shape of an icon, to the number of steps one has to complete to finish a project. With the blind, communicating is a more linear experience, and in tools this typically happens through text-to-speech. So, our goal here is to try to give an idea of how this demographic communicates with the computer, and more importantly, to let them experience that themselves.
What are the next steps for your project?
The next immediate goal for us is releasing version 1.5 of the Sodbeans project, probably in January 2011, which includes our programming language implementation and our accessibility tools embedded in NetBeans. Beyond that, the Washington State School for the Blind starts teaching formal classes in Sodbeans in February, so I'm sure we will be doing bug fixing and maintenance to support them. My team and I are pretty excited to get that going. Past that, we are focusing considerable effort on our programming language, conducting empirical studies to test how intuitive it is, to help us make it as obvious and easy to use as possible, both for blind and sighted individuals.
As for blind accessibility in NetBeans, one of our near-term goals is to give key presses and hotkeys more semantic meaning. For example, instead of CTRL+C causing a screen reader to say "control C," the environment should say what it actually accomplished (for example, "you copied the text 'Hello, world'"). For our talking debugger, we are also hoping to get better at auditory summarization of programs. Consider a blind user pressing the "continue" button in the debugger: given an arbitrarily long execution over any amount of code, what is the best way to translate all of that execution into an English summary? That, amongst many other issues, is what we are working on and looking forward to.
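As a rough illustration of the semantic-hotkey idea, a helper could build the announcement from what the action accomplished rather than from the keystroke itself. The helper name and its wiring into an editor are assumptions; the text above only describes the intended behavior.

```java
// Hedged sketch: announce the *effect* of a hotkey, not the key itself.
// In a real IDE this string would be handed to a screen reader after the
// copy action completes; here we only build and print it.
public class SemanticKeys {

    // Build the spoken message for a copy action (illustrative helper,
    // not an actual NetBeans accessibility API).
    static String copyAnnouncement(String selectedText) {
        return "You copied the text \"" + selectedText + "\".";
    }

    public static void main(String[] args) {
        // Instead of "control C", the user hears what happened:
        System.out.println(copyAnnouncement("Hello, world"));
    }
}
```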