Alistair D. N. Edwards
Personal Homepage
Alistair Edwards is a Senior Lecturer in Computer Science at the University of York. His research interests are in Human-Computer Interaction (HCI), specifically the use of novel modes in interaction (speech and non-speech sounds, haptics) and interfaces for people with particular needs (e.g. those with visual impairments, older people).
He is the author or co-author of over 70 refereed papers, 5 books and around 40 other publications, and the editor or co-editor of 3 further books.
He teaches modules related to HCI and to programming.
Outside work he likes flying, sailing and walking.
Malik, Sofianiza Abd, Edwards, Alistair D. N. (2010): Investigation of cultural dependency in mobile technology and older adults. In: Proceedings of ACM CHI 2010 Conference on Human Factors in Computing Systems, 2010. pp. 3835-3840. http://doi.acm.org/10.1145/1753846.1754065
Edwards, Alistair D. N. (2008): Keeping Up with Technology: Commentary on "Computers and People with Disabilities". In ACM Transactions on Accessible Computing, 1 (2) p. 8. http://doi.acm.org/10.1145/1408760.1408762
Hisham, Syariffanor, Edwards, Alistair D. N. (2007): Incorporating culture in user-interface: a case study of older adults in Malaysia. In: Proceedings of the Eighteenth ACM Conference on Hypertext and Hypermedia, 2007. pp. 145-146. http://doi.acm.org/10.1145/1286240.1286278
Holland, Simon, Day, Robert, Leplâtre, Grégory, Edwards, Alistair D. N. (2004): Mobile HCI and Sound. In: Brewster, Stephen A., Dunlop, Mark D. (eds.) Mobile Human-Computer Interaction - Mobile HCI 2004 - 6th International Symposium September 13-16, 2004, Glasgow, UK. pp. 527-528. http://link.springer.de/link/service/series/0558/bibs/3160/31600527.htm
Challis, Ben, Edwards, Alistair D. N. (2000): Weasel: A System for the Non-Visual Presentation of Music Notation. In: Proceedings of the 6th International Conference on Computers Helping People with Special Needs, ICCHP 2000, 2000.
Fricke, J., Edwards, Alistair D. N. (2000): Tactile Displays Based on Modulated Electromagnetic Radiation. In: Proceedings of the 6th International Conference on Computers Helping People with Special Needs, ICCHP 2000, 2000.
Challis, Ben, Edwards, Alistair D. N. (2000): Design Principles for Tactile Interaction. In: Proceedings of the First International Workshop on Haptic Human-Computer Interaction, 2000. pp. 17-24.
Edwards, Alistair D. N., Evreinov, Grigori E., Agranovski, A. V. (1999): Isomorphic Sonification of Spatial Relations. In: Bullinger, Hans-Jörg (eds.) HCI International 1999 - Proceedings of the 8th International Conference on Human-Computer Interaction August 22-26, 1999, Munich, Germany. pp. 526-530.
Stevens, Robert D., Edwards, Alistair D. N. (1994): Mathtalk: The Design of an Interface for Reading Algebra Using Speech. In: Zagler, Wolfgang L., Busby, Geoff, Wagner, Roland (eds.) ICCHP94 - Computers for Handicapped Persons - 4th International Conference September 14-16, 1994, Vienna, Austria. pp. 313-320.
20.7 Commentary by Alistair D. N. Edwards
Tactile interaction occupies an odd position within the discipline of interaction – as is reflected in this chapter. On the one hand it is associated with futuristic, exotic technology that is still far from mainstream use; on the other, it is an element of most conventional human-computer interaction almost without our realizing it. That is to say, the ubiquitous keyboard and mouse rely heavily on the haptic and proprioceptive senses in an unconscious way that we tend to take for granted. Thus, most people probably do not think that they engage in haptic interaction, but would acknowledge that it must be useful for those who lack other senses (notably sight), for whom tactile communication such as braille would seem invaluable.
It is to be hoped that this chapter will make readers much more aware of the truth of the situation and of the potential of tactile interaction. The keyboard and the mouse have both evolved over time, adapting to the physiological abilities of the user, but with minimal reference to the underlying biology. Of course, the keyboard is a classic example of non-optimal evolution: it is suggested (Noyes 1983) that the conventional qwerty layout was designed to slow typists down, to reduce consecutive-letter clashes on mechanical typewriters – and now we are stuck with it. On the one hand, it is fun to speculate how the keyboard might have been designed if its inventors had had the benefit of reading a chapter such as this; on the other, one has to note that even with this knowledge available, haptic interaction is clearly a long way from reaching its full potential.
We have physical devices, such as keyboards, mixing desks and lighting desks, which inherently give good haptic feedback and interaction, but in the digital, virtual world we are constrained by the technology. Is the investment (of time, money, space etc.) required to use a device such as that depicted in Figure 21.13 worth the benefit derived?
The tactile picture is a device which seems in some sense 'obvious', particularly to those of us who have sight: if you cannot see a picture, then you should be able to feel it. I have spent a lot of time and effort working with tactile diagrams, mainly within the University of York Centre for Tactile Images (sadly no longer in existence, for lack of financial viability). We produced a wide range of diagrams for different purposes, working in collaboration with a number of important clients (including The Deep, Hull; The Jorvik Centre; The National Trust; English Heritage and The National Railway Museum). We repeatedly found ourselves making the assertion (mentioned by Challis) that a good tactile diagram may look bad. That is to say, the haptic senses are very different from vision.
The immediate differences are obvious – when one stops to think about them. Tactual sensitivity is low. Challis mentions the two-point limen but refrains from giving a figure for it. This is probably wise, since it is controversial as a measure of tactual sensitivity, and it is hard to find agreement on average values. The point, though, is that the value is of the order of millimetres – much coarser than visual resolution. In Figure 21.5, Challis shows examples of patterns which could be produced on swell paper. In the visual representation they are all clearly distinct, but in experiments with swell paper samples (Magill 1999), we found that the average person can distinguish only three levels of pattern – smooth, medium and rough – regardless of the visual pattern. For instance, while the vertical and horizontal cross-hatching shown in Figure 21.5 are clearly different visually, they would probably be perceived as the same on tactual inspection.
Furthermore, in exploring a tactile diagram, the person is likely to use at most two fingertips, and more likely one. It is easy to characterize this as exploring through a tactile pinhole. Yet in practice the situation is likely to be even worse than that. Vision is an inherently integrative sense: the angle sensed by the eye is very small, but because the eye moves constantly and the brain integrates the information, we get a wide field of view which forms a picture that we perceive as coherent. The only people who need to be taught how to perceive visually are those who have lacked the sense of vision and have then had it restored (usually through surgery, as in the case of Virgil described by Sacks (2009)), whereas even proficient braille readers need to be taught how to use tactile pictures. Evidence from such cases of late-acquired sight suggests that people who mature without sight may never develop the ability to integrate fragments into a comprehensive picture (literally and metaphorically). In other words, a tactile picture may always be of limited value – even if it has been designed to make the most of the haptic senses.
A limitation of most tactile graphic technologies, such as swell paper, is that they are static; hence the idea of a tactile screen is attractive – and perennial. I have stopped paying much attention to announcements and papers by people who have had this idea yet again and believe they have found the one that will work. I would like to propose a moratorium on such publications until someone can present a device which works, is reliable and is affordable. In the absence of such a ban, I will impose my own by not reading them. I have no doubt that one day someone will invent the appropriate technology, but – for the reasons set out in this chapter – it is a hard goal to achieve. Even this chapter mentions another such attempt ('gel-based pixels that respond to heat'), but that is clearly still quite speculative, since no publication is even cited.
It is clear that the haptic sense becomes even more important in the absence of the dominating sense of vision. Braille is the richest form of tactile communication, yet its use by blind people is very low: it is estimated that fewer than 2% of blind people read braille (Bruce, McKennell et al. 1991). The reasons for the low take-up are unclear and probably complex, but the essential explanation must be that blind people do not perceive the benefits they would gain as worth the effort it would take to learn braille. This particularly applies to those who have had sight and lost it – the vast majority. For those of us who have sight, exclusion from (printed) literature seems a great loss, but for most of those who experience it, that loss is not sufficient motivation to overcome the difficulty of learning to read tactually.
Yet it is not sheer laziness on the part of sighted people that means they do not make as much use of their non-visual senses as they might. In radical experiments, Pascual-Leone, Theoret et al. (2004) blindfolded sighted participants 24 hours a day for several days and gave them daily lessons in reading braille. The participants made good progress in learning braille, and brain scans showed that areas of the cortex normally associated with visual processing were being reassigned to processing tactile information. A control group of non-blindfolded sighted participants made minimal progress in learning braille and showed no such brain adaptation. In other words, it seems not only that braille is likely to be useful for blind people, but that they are the only people likely to be able to use it.
To summarize: the haptic senses are used to a greater extent in conventional interaction than most people realize. They could be used more, and to better effect, if designers were better informed about them. Virtual haptics are being used increasingly, and this trend will accelerate as the technology develops, but there is still something to be said for well-designed, old-fashioned physical interactors (knobs, switches and the like). Ultimately, though, there is a limit to the use that can be made of them, given the physiological limitations of the haptic senses.