Alistair Edwards' Proposed projects 1999-2000

Legend

IT
Suitable for ITBML third-year students.
CS
Suitable as a third-year Computer Science project.
IP
Suitable for MSc(IP) students - usually involves HCI evaluation work.
4Y
Suitable as a fourth-year, MEng project.

ADNE1: Shopmobility York [IT]

Shopmobility York is an organization with the aim of improving access to the city (not just the shops - but also the sights and even nightclubs) for people with disabilities. They do this by lending Shopmobiles and powered wheelchairs to suitable users. They wish to create a presence on the Internet and that would be the aim of this project. However, the project would not be simply about designing Web pages. There is a complex interaction with commercial sponsors and the local authority and this would have to be reflected in any Internet site.

The project would therefore be more about developing a plan for the site. The requirements of the various stakeholders (users, Shopmobility, the sponsors etc.) would have to be ascertained. The project should be set in the context of a marketing plan, for which the web site would be one component.

There will be questions to be addressed regarding the accessibility of the web site - both to people with disabilities and to visitors who may not speak English. The site will have to be designed and implemented, and ideally it should also be evaluated.

Examples of other Shopmobility Web sites can be found at: Nottingham and Wigan.

The student will be expected to produce a web page describing the project.

ADNE2: Do people use icons? [IP]

In a previous project (Conlon, 1998), Ann Conlon came to the conclusion that the icons used so widely in modern interfaces were of little more than cosmetic value. She investigated them by getting both novices and experts to test the recognizability and memorability of real icons, and she found that they did not score well on either dimension.

This would suggest that people are unlikely to use such icons. This does not prevent them from using iconic interfaces, though, because there is generally a textual alternative available. That is to say that either there is an equivalent menu entry, or the user can access a textual label for the icon (for instance, by letting the cursor dwell over the icon).

The fact that Conlon's tests showed similar results for novices and experts was not entirely unexpected, given the similar findings in the well-known earlier study by Mayes et al. (1988).

The objective of this project will be to look for corroboration of Conlon's results, but by taking a different approach. Testing will be carried out whereby users are observed (using video recordings and an event-recording tool) interacting with an iconic interface, and data are collected on the number of times a textual alternative is used - and how long it takes to do so.
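The kind of analysis involved can be sketched as follows. The event names and log format below are invented for illustration; a real event-recording tool would define its own vocabulary, and the project would determine which events count as use of a textual alternative.

```python
# Hypothetical event records, as an event-recording tool might log them:
# (time in seconds, event name). All names here are illustrative only.
log = [
    (0.0, "session_start"),
    (2.1, "icon_click"),
    (5.4, "tooltip_shown"),    # user let the cursor dwell over an icon
    (7.0, "icon_click"),
    (9.8, "menu_open"),        # user fell back to the equivalent menu entry
    (11.2, "menu_select"),
    (14.0, "icon_click"),
]

# Events that indicate the user resorted to a textual alternative
TEXTUAL_EVENTS = {"tooltip_shown", "menu_open", "menu_select"}

def summarise(log):
    """Count direct icon uses versus uses of a textual alternative."""
    icon_uses = sum(1 for _, event in log if event == "icon_click")
    textual_uses = sum(1 for _, event in log if event in TEXTUAL_EVENTS)
    return icon_uses, textual_uses

icon_uses, textual_uses = summarise(log)
print(icon_uses, textual_uses)  # 3 direct icon uses, 3 textual fallbacks
```

Timestamps would additionally allow the time cost of each textual fallback to be measured, as the project brief requires.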

The student will be expected to produce a web page describing the project.

ADNE3: Development of a 'musical quotient' test [IP, 4Y]

An increasing number of human-computer interfaces use non-speech sounds. The research upon which their design depends often relies upon subjective user testing. One of the variables that has to be taken into account is the musical ability of the test subjects. For instance it would not be a good idea to develop an interface that can only be used effectively by people with musical training.

One approach that has been applied is to classify test participants as either 'musical' or 'non-musical' and then look for significant differences between the results for the two groups. There are several problems with this approach, though. First, how to identify the members of the two groups? For instance, is a person who plays piano by ear - but who cannot read a note - a musician? Also, the results have been equivocal. Stevens (1996) found a significant difference between musicians and non-musicians, while Brewster (1994) (using the same definition of musician) found no difference.

The proposed alternative approach is to develop a standardized test of musical ability, analogous to the standardized tests of intelligence that IQ tests provide. Then all test participants could be given an 'MQ' test, and their results in any evaluation tested for correlation with their MQ score (just as tests of, say, problem-solving skills might be tested for a relationship to IQ).
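The analysis this enables is a standard product-moment correlation between MQ score and evaluation performance. The sketch below implements the textbook formula; the MQ and recognition-score data are invented purely for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented illustrative data: participants' MQ scores against their
# scores on some auditory-interface evaluation task.
mq_scores = [85, 92, 100, 108, 115, 123]
task_scores = [40, 45, 52, 58, 61, 70]
r = pearson_r(mq_scores, task_scores)
print(round(r, 3))  # close to 1.0 for this strongly related toy data
```

With real data one would also test the significance of r; Robson (1994) covers the relevant experimental and statistical methods.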

Research students Ben Challis and John Hankinson have devised a suite of proposed tests (unpublished). The objective of this project would be to implement (some of) them in a form for automatic presentation by computer.

The project will require programming ability (probably in C) and some knowledge of experimental method. A background in music and music technology (MIDI) would be an advantage. If undertaken as a fourth-year project, then collaboration with the Psychology and/or Music Departments would be required.

The student will be expected to produce a web page describing the project.

The most complete descriptions of the research behind auditory interfaces can be found in the proceedings of the ICAD conferences (International Conference on Auditory Display).

ADNE4: Design guidelines for animated icons [IP]

Ann Conlon, in a previous project (Conlon, 1998) came to the conclusion that static icons are much less effective than interface designers seem to believe. However, she thought that there might be rather more justification for the use of animated icons (sometimes referred to as moving icons, or even micons). Certainly, they appear in increasing numbers in software, but are designers using them just because they can, because modern displays will support them and processors have cycles to spare, or are there good reasons to use them?

It would seem that there is information that can be presented well by a dynamic icon. For instance, the Macintosh uses a 'watch' cursor to denote that the system is busy. Such busy signals are important because they indicate the difference between a system which is active and one which is dead. The original watch was static, though, which meant that it was often impossible to tell the difference between a busy system and one which had crashed after the watch had been displayed. Some programs use a version of the watch with moving hands, so that if the hands stop, the system must have stopped too.

The first phase of this project would be to survey the literature (as it is with most projects). The suspicion is that few guidelines already exist on the use of animated icons (Conlon found a lot of literature on static icons - but most of it was contradictory and hence of little practical use). If that is the case, then designers are probably basing their use of animated icons on ad hoc intuition.

Should any guidelines already exist, then they can be tested by a user-centred experiment. If they do not exist, then they should be derived - based on a similar experiment. As a starting point, the following properties of animated icons are proposed:

Dynamism
An animation can convey dynamic information, i.e. the current state of a continuing process, such as a file transfer.
'Liveness'
As mentioned above, an animation may be used to signal that a process has not died.
Attention seeking
Visual attention will be drawn to an animation.

There are probably many more similar properties which need to be identified and tested.

The student will be expected to produce a web page describing the project.

ADNE5: Tactile interaction [IP]

Computer interfaces are visually dominated; they make very little use of the other senses, including the tactile sense. On the one hand, the tactile sense can be quite powerful and discriminating (imagine feeling all the different textures on a collage) but on the other hand, the technology is limited. Apart from braille displays, there are no effective dynamic tactile displays that can be controlled by computer (but see also ADNE8 for an example of an experimental device).

Most current developments in tactile interaction therefore depend on static overlays on touch-sensitive pads. Quite rich overlays can be produced using appropriate technology (such as thermoform machines), but these require hand-made master moulds and are not simple to produce. Much simpler to produce are diagrams on so-called swell paper, which is essentially a printing technology. However, the range of textures that can be produced in this way is limited. The objective of this project would be to investigate the limitations and possibilities of this technology.

There is reason to believe (Challis, 1998, Burdea, 1996) that the number of textures that can be distinguished by the average user is quite small. The idea would be to explore the limitations of simple textures and to see whether they can be extended by systematic variation.

Specifically, linear textures of varying separations, as illustrated below, can be used, but the number that can be distinguished and labelled is probably quite low.

Sample vertical grid patterns

However, some variation might be achieved by using different orientations:

Textures with a horizontal orientation

Texture with a diagonal orientation

Experiments will have to be conducted to establish how many such textures can be easily distinguished, then a tactile interface will be implemented to test the practical application. The exact design and type of interface will depend on the results of the first part, on how many textures can be used.
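Generating the experimental stimuli systematically should be straightforward. As a rough illustration, the sketch below builds simple binary line grids parameterized by spacing and orientation, of the kind that could be rendered to swell paper; the grid size, spacings and character-based rendering are arbitrary assumptions for display here.

```python
def linear_texture(size, spacing, orientation):
    """Build a size x size grid of '#' (raised line) and '.' (blank),
    with lines every `spacing` cells in the given orientation
    ('vertical', 'horizontal' or 'diagonal')."""
    grid = []
    for y in range(size):
        row = []
        for x in range(size):
            if orientation == "vertical":
                on = x % spacing == 0
            elif orientation == "horizontal":
                on = y % spacing == 0
            else:  # diagonal
                on = (x + y) % spacing == 0
            row.append("#" if on else ".")
        grid.append("".join(row))
    return grid

# Print one candidate stimulus: diagonal lines, four cells apart
for line in linear_texture(8, 4, "diagonal"):
    print(line)
```

Sweeping `spacing` and `orientation` over a range of values would yield the full stimulus set for the distinguishability experiments.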

The student will be expected to produce a web page describing the project.

ADNE6: Two-dimensional sound screen [IP]

The ability to artificially spatialize sounds in three-dimensional space offers new possibilities for human-computer multimedia interaction (e.g. Crispien and Felbaum, 1995). A variety of techniques have been developed for processing sounds in this way (Begault, 1994; Keating, 1996). However, they are not always as useful as might first seem (Crispien, op. cit.) for a variety of reasons. A simple, low-technology alternative is to use a set of speakers and to simply vary the balance between them (Crispien and Petrie, 1993). The Department has acquired a circuit board which will perform this simple processing and the objective of this project would be to evaluate it.
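The balance-based processing the board performs amounts to amplitude panning. As a minimal sketch of the principle - assuming a constant-power law across one stereo pair, which may well differ from the board's actual behaviour - the apparent position of a sound is set by splitting its gain between two speakers:

```python
import math

def pan_gains(position):
    """Constant-power gains for a source at `position` in [-1, 1]
    (-1 = hard left, +1 = hard right) across a pair of speakers."""
    theta = (position + 1) * math.pi / 4  # map position to [0, pi/2]
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

left, right = pan_gains(0.0)  # a source at the centre
# At the centre both gains are cos(pi/4) = sin(pi/4), about 0.707,
# and total power left**2 + right**2 stays 1 at every position,
# so perceived loudness does not change as the source moves.
```

For a two-dimensional screen the same idea extends to panning between adjacent speakers in each row and between rows.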

An initial attempt has been made to do this as a student project by John McLean (1998). His results were somewhat equivocal, but given that preliminary study, it should be possible to see what lines of investigation are likely to be fruitful. For a start, using a greater number of test subjects would improve the chances of generating significant results. Also it might be better to concentrate on the rendering of large shapes (which fill the screen) but to investigate the optimal form of presentation (e.g. speed, marking of corners etc.). Furthermore, the difference between recognizing arbitrary shapes and those from a known set seemed significant.

The project would involve the analysis of McLean's results to see where the system is most likely to be effective followed by the design and execution of an experiment to test these hypotheses.

The student will be expected to produce a web page describing the project.

ADNE7: Direct combination: A new paradigm for interaction [IP]

Many aspects of the modern computer interface have become conventional and trace their roots back to the very early graphical user interfaces (e.g. Smith et al., 1982), so there is scope for improvement in interaction design through radical reappraisal of many of the forms of interaction. For instance, the 'object-verb' order is almost ubiquitous (select the item, then select the command to be performed on that item). Holland and Oppenheim (1999) have proposed new styles of interaction, based on what they call 'direct combination'. One aspect of this is to replace the object-verb specification with what might be described as 'object-object'. That is to say that two objects are brought together and combine in some appropriate manner. There are examples in conventional interfaces: for instance, combine the icon for a file with the Wastebasket icon and the result is that the former is put into the Wastebasket.

Holland proposes an implementation based on graphical interaction, dragging icons on an interface and possibly involving other visual elements such as magic lenses. However, before such an implementation is attempted, it would be useful to assess how natural people find this direct combination style of interaction, whether they can get used to it as well as they can the object-verb style, for instance.

That, therefore, would be the purpose of this project. A suitable small-scale domain would be devised within which objects can be combined, and experiments would then be carried out to find out how easy and natural people find the interaction. It would be simpler to base this on a textual interface; while this loses the directness of the graphical interaction, it would still test the viability of the paradigm at a more conceptual level.
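At its core, a textual direct-combination prototype can be little more than a lookup over unordered pairs of objects. The sketch below is one hypothetical way to structure it; the objects and combination rules are invented for illustration and not taken from Holland and Oppenheim's design.

```python
# Rules are keyed on *unordered* pairs: naming the objects in either
# order produces the same combination, unlike the object-verb style.
rules = {
    frozenset({"file", "wastebasket"}): "file is moved into the wastebasket",
    frozenset({"file", "printer"}): "file is printed",
    frozenset({"file", "folder"}): "file is filed in the folder",
}

def combine(a, b):
    """Combine two named objects according to the rule table."""
    rule = rules.get(frozenset({a, b}))
    return rule if rule else f"no meaningful combination of {a} and {b}"

print(combine("wastebasket", "file"))  # same result in either order
print(combine("file", "clock"))        # an undefined combination
```

Experiments could then measure how readily users predict, learn and remember what each pairing does, compared with an equivalent object-verb command set.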

The student will be expected to produce a web page describing the project.

ADNE8: 'Texton' evaluation [IP]

Although many modern computer interfaces are described as 'multi-media' one medium which is rarely used is tactile communication. There are many reasons for this, often technological ones. One flexible, inexpensive tactile output device, known as the Texton, has been developed by Evreinov and colleagues in Russia (Challis et al, 1998), but it has never been evaluated. The objective of this project would be to carry out such an evaluation.

The Texton element consists essentially of a spring of rectangular cross-section that can be compressed by two electromagnets. It is capable of generating 10 different textures by varying the degree of compression, either statically or dynamically. It is necessary to test how reliably these can be distinguished, and hence to assess how useful a device based on such elements might be.

The project would involve the design and execution of an experimental evaluation (Robson, 1994).

The student will be expected to produce a web page describing the project.

ADNE9: Tactile joystick evaluation [IP]

The Department has acquired a prototype joystick with tactile feedback. A pantograph-style device is attached to the wrist, such that the lever rests on the back of the hand. The user moves the lever with the other hand, receiving feedback about the position of the end of the lever and hence the position of the cursor on the screen.

The device has been designed as an alternative to the mouse for blind computer users (Edwards, 1987; Edwards, 1989), to be used in conjunction with other non-visual feedback. It needs to be evaluated. Does it work, does the extra channel of information enhance the interaction, making it quicker or less error-prone? What applications is it useful for?

Picture of the tactile joystick

The project would involve designing and carrying out some form of controlled test. Robson (1994) describes the kinds of techniques required, while Stevens and Edwards (1996) explains some of the difficulties in doing this kind of study. This project might be carried out in conjunction with ADNE6.

The student will be expected to produce a web page describing the project.

ADNE10: Listening to two things at once [IP, 4Y]

One of the powerful aspects of vision is the ability to view more than one thing at a time, or, more accurately, the facility to switch quickly from one to another. That is one of the reasons we have computer monitors with large screens and window-based interfaces; it is valuable to be able to compare the contents of different windows. The non-visual senses are not as good at such pseudo-parallel processing. This is one of the major problems in adapting computer interfaces for use by blind people (Edwards, 1988; Mynatt, 1994; Mynatt, 1997).

The objective of this project would be to look at the feasibility of implementing parallel inspection of auditory data in an application in which it is particularly important. The application is biological, that of comparing DNA sequences. The degree of homology between two sequences (of amino acids or nucleic acids) is a strong clue to the function of that sequence. Pairwise or multiple alignments are made to exhibit these homologies: Identical or functionally similar residues are juxtaposed, with gaps inserted to optimize the alignment. A biologist looking at such an alignment can spot important regions of the sequence.
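The basic quantity a biologist reads off such an alignment, the degree of identity between the sequences, is simple to compute. A minimal sketch over a toy, invented alignment (with '-' marking the inserted gaps):

```python
def percent_identity(seq_a, seq_b):
    """Percentage of aligned positions holding identical residues.
    The sequences are assumed to be pre-aligned and equal in length,
    with '-' characters marking gaps inserted by the alignment."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    matches = sum(a == b and a != "-" for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Invented toy alignment of two short nucleotide sequences
a = "ACG-TTAC"
b = "ACGATTGC"
print(percent_identity(a, b))  # 75.0
```

The harder, and more interesting, question for the project is how to render such position-by-position comparisons audible in two parallel streams, rather than how to compute them.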

There is a considerable literature on attention which has investigated people's ability to extract information from a second speech stream (e.g. Baddeley and Weiskrantz, 1993 and Moray, 1969), but these tend to concentrate on a situation in which the second stream is unattended. That is to say that the subjects are trying to listen to one stream not both of them.

An obvious approach would be to use two streams of synthetic speech (Edwards, 1991), but there are a number of questions to be tackled as to how best to do this. For instance, should headphones or stereo speakers be used - with one voice per channel? Should different voices be used, if so, which are best? There may also be a role for use of non-speech sounds (Hereford and Winn, 1994; Brewster, 1994).

The project would involve software implementation and user testing. If undertaken as a fourth-year project, collaboration with the Psychology Department would be required.

The student will be expected to produce a web page describing the project.

ADNE11: Web browsers for blind people [IT, CS, IP]

The web is becoming an increasingly important source of information. To be excluded from it - for any reason - is thus becoming a handicap. One cause of exclusion is blindness, given the (growing) degree of visual orientation of web pages (Edwards & Stevens, 1997). Ruth Hayward has already investigated this problem as a student project (Hayward, 1997) but since then there has been a developing trend in access tools. The common idea is that web pages need to be re-arranged to make them suitable for speech-based presentation.

The first objective of this project would be to evaluate and compare two implementations of this idea. Brookestalk (Zajicek & Powell, 1996; Zajicek & Powell, 1997; Zajicek, Powell, Reeves and Griffiths, 1998) is a browser which rearranges the pages it displays, while Betsie is a Perl script which rearranges pages for viewing in a conventional browser. The evaluation should highlight the successes and failures of these designs, which will lead on to the second phase of the project. Betsie is open source software, so its sources are available and enhancements are solicited; the student could therefore implement any improvements suggested by the evaluation.
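Betsie itself is written in Perl. Purely to illustrate the underlying idea of re-arranging a page for speech-based presentation - pulling out the navigational skeleton (headings and links) so it can be spoken before, or instead of, the visual layout - here is a minimal sketch using Python's standard HTML parser. The page content and class name are invented for the example.

```python
from html.parser import HTMLParser

class Outline(HTMLParser):
    """Collect heading and link texts from a page, in document order,
    so they can be presented as a spoken summary."""
    def __init__(self):
        super().__init__()
        self.items = []
        self._context = None  # 'heading', 'link', or None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._context = "heading"
        elif tag == "a":
            self._context = "link"

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3", "a"):
            self._context = None

    def handle_data(self, data):
        if self._context and data.strip():
            self.items.append((self._context, data.strip()))

page = "<h1>News</h1><p>Intro text.</p><a href='/a'>First story</a>"
outline = Outline()
outline.feed(page)
print(outline.items)  # [('heading', 'News'), ('link', 'First story')]
```

A real access tool must of course do far more (tables, frames, images with and without alternative text), which is exactly where the evaluation should direct the implementation effort.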

The project will involve the design and implementation of a human factors experiment, as well as Perl programming.

The student will be expected to produce a web page describing the project.

ADNE12: Testing a design methodology for non-visual interaction [CS, IP]

Research student Evangelos Mitsopoulos has devised a methodology for the design of non-visual interactive devices (Mitsopoulos & Edwards, 1997; Mitsopoulos & Edwards, 1998). The methodology separates the design process into three levels.

Mitsopoulos is working on applying it to the design of non-visual alternatives to graphical user interface widgets, but the objective of this project would be to apply the same methodology to a slightly different problem.

Tables are a useful visual representation of information but it is very difficult to provide an equally powerful and simple non-visual representation for blind people. This has already been tackled in previous student projects (Bufton 1991; Sinclare 1999), but this project would be different in that it would apply Mitsopoulos' methodology. This would be a test of the methodology, but would also show whether its application leads to a better design than previous approaches.

The student will be expected to produce a web page describing the project.

References

Aldrich, F. (1997). A case study of NeuroPage: A reminder system for memory-disabled people. in Computers in the Service of Mankind: Helping the Disabled, (London), IEE. Digest number 97/117 pp. 10/1-10/3. (I have a copy)

Baddeley, A. D. and Weiskrantz, L., (Ed.) (1993). Attention: selection, awareness, and control: A tribute to Donald Broadbent. Oxford, Clarendon Press.

Begault, D. R. (1994). 3-D Sound for Virtual Reality and Multimedia. Boston: Academic Press.

Brewster, S. A. (1994). Providing a structured method for integrating non-speech audio into human-computer interfaces. DPhil Thesis, University of York.

Bufton, S. (1991). Reading text tables for blind people, Department of Computer Science, University of York, Final-year Project Report

Burdea, G. C. (1996). Force and Touch Feedback for Virtual Reality. New York: Wiley.

Challis, B. (1998). Establishing design principles for the integration of audio-tactile communication in the human-computer interface. Department of Computer Science, Thesis Proposal, University of York.

Challis, B., Hankinson, J., Evreinova, T. and Evreinov, G. (1998). Alternative textured display. in A. D. N. Edwards, A. Arato and W. L. Zagler (ed.), Computers and Assistive Technology, ICCHP '98: Proceedings of the XV IFIP World Computer Congress, (Vienna & Budapest), Austrian Computer Society. pp. 37-48.

Conlon, A. (1998). Icons at the interface: Great expectations. Department of Computer Science MSc (IP) Project report, University of York.

Crispien, K. and Felbaum, K. (1995). Use of acoustic information in screen reader programs for blind computer users: Results from the Tide project GUIB. in I. P. Porrero and R. P. de la Bellacasa (ed.), The European Context for Assistive Technology: Proceedings of the Second Tide Congress, (Paris), IOS Press. pp. 306-311.

Crispien, K. and Petrie, H. (1993). Providing access to GUIs for blind people. in Proceedings of the 19th Convention of the Audio Engineering Society. (I have a copy)

Edwards, A. D. N. (1987). Adapting user interfaces for visually disabled users. Unpublished PhD Thesis, Open University.

Edwards, A. D. N. (1988). The design of auditory interfaces for visually disabled users. in E. Soloway, D. Frye and S. B. Sheppard (ed.), Human Factors in Computing Systems: Proceedings of Chi '88, (Washington), pp. 83-88.

Edwards, A. D. N. (1989). Soundtrack: An auditory interface for blind users. Human Computer Interaction 4(1): pp. 45-66.

Edwards, A. D. N. (1991). Speech Synthesis: Technology for disabled people. London: Paul Chapman.

Edwards, A. D. N. and Stevens, R. D. (1997). Visual dominance and the World-Wide Web. in Proceedings of the Sixth International World Wide Web Conference (CD-Rom), (Santa Clara, California), Stanford University.

Hayward, R. (1997). Accessibility issues of Web browsers for blind people, Department of Computer Science, University of York, Final-year Project Report

Hereford, J. and Winn, W. (1994). Non-speech sound in human-computer interaction: A review and design guidelines. Journal of Educational Computing Research 11(3): pp. 211-233.

Holland, S. and Oppenheim, D. (1999). Direct combination. in Proceedings of CHI '99, (Accepted - I have a pre-print)

Keating, D. A. (1996). The generation of virtual acoustic environments for blind people. in P. M. Sharkey (ed.), Proceedings of the First European Conference on Disability, Virtual Reality and Associated Technologies, (Maidenhead), University of Reading. pp. 201-208.

McLean, J. (1998). Evaluation and user testing of a simple two dimensional screen reader system, University of York, Department of Computer Science, MSc (IP) Project Report .

Mayes, J. T., Draper, S. W., McGregor, A. M. and Oatley, K. (1988) Information flow in a user interface: The effect of experience and context on the recall of MacWrite screens. in D. M. Jones and R. Winder (ed.), People and Computers IV: Proceedings of HCI '88, (Manchester), Cambridge University Press. pp. 275-289.

Mitsopoulos, E. N. and Edwards, A. D. N. (1997) Auditory Scene Analysis as the basis for designing auditory widgets, Proceedings of the Fourth International Conference on Audio Display (ICAD '97), Palo Alto: Xerox, pp. 13-18.

Mitsopoulos, E. N. and Edwards, A. D. N. (1998). A Principled Methodology for the Specification and Design of Non-Visual Widgets. in S. A. Brewster and A. D. N. Edwards (eds.), Proceedings of ICAD '98 (International Conference on Auditory Display), (Glasgow), British Computer Society.

Moray, N. (1969). Attention: Selective Processes In Vision And Hearing. London: Hutchinson Educational.

Mynatt, E. (1994). Auditory presentation of graphical user interfaces. in G. Kramer (ed.), Auditory Display: Sonification, Audification and Auditory Interfaces (Proceedings of ICAD '94), (Santa Fe), Addison Wesley. pp. 533-556.

Mynatt, E. D. (1997). Transforming graphical interfaces into auditory interfaces for blind users. Human-Computer Interaction 12(1 & 2): pp. 7-46.

Robson, C. (1994). Experiment, design and statistics in psychology. 3rd Edition, London: Penguin Books Ltd.

Schwartz, R. L. (1993). Learning Perl. Sebastopol, California: O'Reilly.

Sinclare, J. (1999), In preparation. Final-year project report, Department of Computer Science, University of York.

Smith, D. C., Irby, C., Kimball, R. and Harslem, E. (1982). The STAR user interface: An overview. The National Computer Conference: pp. 515-528.

Spooner, R. I. W. (1996). A computerised writing aid for dyslexic people, University of York, Department of Computer Science, Research Student Thesis Proposal.

Stevens, R. (1996). Principles for the design of auditory interfaces to present complex information to blind computer users. DPhil Thesis, University of York, UK.

Stevens, R. D. and Edwards, A. D. N. (1996). An approach to the evaluation of assistive technology. in Proceedings of Assets '96, (Vancouver), ACM. pp. 64-71.

Wilson, B. A., Evans, J. J., Emslie, H. and Malinek, V. (1997). Evaluation of NeuroPage: A new memory aid. Journal of Neurology, Neurosurgery and Psychiatry 63: pp. 113-115.

Zajicek, M. and Powell, C. (1996). Building a conceptual model of the World Wide Web for visually impaired users. in Proceedings of Ergonomics 96.

Zajicek, M. and Powell, C. (1997). Enabling visually impaired people to use the Internet. in Computers in the Service of Mankind: Helping the Disabled, (London), IEE. Digest number 97/117 pp. 11/1-11/3

Zajicek, M., Powell, C., Reeves, C. and Griffiths, J. (1998). Web browsing for the visually impaired. in A. D. N. Edwards, A. Arato and W. L. Zagler (ed.), Computers and Assistive Technology, ICCHP '98: Proceedings of the XV IFIP World Computer Congress, (Vienna & Budapest), Austrian Computer Society. pp. 161-169.

 



This page maintained by Alistair Edwards alistair@cs.york.ac.uk

http://www.cs.york.ac.uk/~alistair/projects/projects.html

30th November 1999