Evangelos Mitsopoulos
Research Interests


Note

Evangelos Mitsopoulos was a research student and Research Associate at the University of York from 1995 to 2000. He is currently undertaking his national service in the Greek Army and hence is not presently research active. These pages, recording his research, are maintained by Alistair Edwards, his former supervisor (alistair@cs.york.ac.uk). They have been edited slightly as appropriate (e.g. deletion of out-of-date email addresses) and the HTML has been updated.


My main research interest is the design of fast-rate auditory (non-verbal) presentations. Due to the complexity of the issues involved in auditory perception, the design of auditory presentations is particularly difficult and far from intuitive. For this reason, I am trying to derive a design methodology that will render this task somewhat easier. Some parts of the methodology are already in place (Mitsopoulos et al, 1997). A draft of the overall framework of the methodology is also available (Mitsopoulos, 1997). The complete specification of the methodology and some related designer's tools will form the core of my PhD thesis (which is not available yet).

Auditory interfaces need not be command-based. It has been shown that blind users can cope with graphical user interfaces when these are properly adapted (Edwards, 1989, Edwards, 1993 and GUIB, 1995). Thus, I am also interested in the design of haptic interaction devices for non-visual interfaces. Most interaction devices become inefficient in the absence of visual information and feedback, so it is necessary to investigate alternative ways to provide the spatial information required for navigation tasks. One possible candidate is the haptic modality. Taxonomies of haptic sub-modalities (Loomis, 1986 and Millar, 1994) are necessary to reason about what spatial information might be conveyed by a particular interaction device.

Another major issue in interactive auditory interfaces is that of multi-modal integration (typically auditory-tactile). The task of manipulating the interaction device must be transparent, i.e. it should not interfere with the primary auditory task. At the same time, complementary information from both sensory channels must blend into a single coherent percept. Tackling these issues should rely on the application of relevant psychological models rather than 'intuitive' designs, since the latter have quite often proven to be of low usability. But which models are out there?

Inevitably, my research interests also include those of my group. All aspects relevant to blind users are within our main research interests, but we are looking into applications of IT to other types of disability too. If you find all this interesting, why not check out alternative interfaces, our group's web page?


References




Last modified: 6/12/01