HCI Research Group Seminars

News about seminars (and other activities of the HCI Group) is distributed via the york-hci mailing list. The list is maintained by Jisc, and you may subscribe yourself to it.

Seminars usually take place on Thursdays, 13.30-14.30 in CSE/082 – unless otherwise specified.

Information for speakers is available.



Summer Term 2017

Date | Speaker | Room | Title (links to abstracts below)
20 April | Sam Simpson | CSE/082 | Postponed: How do experienced mindfulness practitioners develop a mindful life and what does this tell us about playing computer games mindfully?
27 April | Chris Power | CSE/082 | The Science of Getting Stuck: Understanding Perceived Uncertainty in Interactive Systems
4 May | | LMB/036 |
11 May | | CSE/082 |
Wednesday 17 May | Helen Petrie | CSE/082 | Departmental seminar: Usable security: a view from HCI
18 May | Tobias Mülling (University of Brighton) | CSE/082 | Embracing the Gesture-Driven Interface: challenges for the design of graphical mid-air interfaces using a gestural approach
Wednesday 24 May | Alistair Edwards | CSE/082 | Departmental seminar: The SRC Common Base Policy
25 May | Andrew Lewis | CSE/082 | Introduction to the Mobile ESM application (provisional title)
1 June | Guido Gybels (consultant in accessibility and usability) | CSE/082 | (To be announced)
8 June | Aaron Quigley (University of St Andrews) | CSE/082 | Ubiquitous User Interfaces
15 June | John Mateer (TFTV) | CSE/082 | Directing for Cinematic Virtual Reality: the relationship between 'presence', 'transportation' and 'suspension of disbelief'
22 June | David Zendle | CSE/082 | (TBA - but something on juicy feedback)

27 April
Chris Power: The Science of Getting Stuck: Understanding Perceived Uncertainty in Interactive Systems

Imagine a gamer, trying to jump over a chasm for the twentieth time, wondering if they are doing something wrong, or if the game is just too hard for them. Picture a family historian navigating through 300 pages of search results to discover a long-lost aunt, but unsure which poorly labelled link will lead to her place of birth. Envision financial analysts who want insights about their business, but cannot orient themselves in multi-dimensional data because the system does not react the way they expect. Finally, remember your own experiences, when you were hopelessly lost on a website, unable to find that form or policy you needed, even though you were sure you had found it before.

All of these scenarios are examples of users experiencing uncertainty in interactive systems. This uncertainty leads to users getting "stuck", unable to progress in their tasks. Some of this uncertainty is unavoidable, caused by what we are trying to do, such as solving hard problems or playing a game. In other cases, the uncertainty is unnecessary, caused by the design and feedback of the interactive system.

In this talk I will discuss some of the ongoing work by our PhD and MSc students on untangling this problem.

18 May
Tobias Mülling: Embracing the Gesture-Driven Interface: challenges for the design of graphical mid-air interfaces using a gestural approach

Mid-air interaction has been investigated for many years, and with the launch of affordable sensors such as the Microsoft Kinect (Microsoft Corporation), Leap Motion (Leap Motion, Inc.) and Myo Armband (Thalmic Labs Inc.), this type of interaction has become more popular. However, graphical interfaces for mid-air interaction have been designed in two dominant styles: cursor-based, where the user's hand replicates mouse movement, copying the WIMP interaction pattern (Windows, Icons, Menus, and Pointing); and gesture-library, where different gestures are assigned to the functionalities of a system, generating cognitive overload because users must recall gestures rather than recognise them. A gestural approach based on manipulation offers an alternative to these interaction styles, with a focus on enhancing the user experience in 2D Design Patterns. Taking a practice-based research approach, this talk presents the design space of a gestural approach based on manipulation, with challenges and strategies that can be used in the design of graphical interfaces for mid-air interaction. A series of experiments will be presented to exemplify the use of visual elements and mid-air gestures in an attempt to create gesture-driven interfaces with a satisfactory user experience.
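
As a rough illustration of the contrast between these styles, here is a minimal sketch in Python. It is not drawn from the talk itself: the Hand record and its normalised palm coordinates are hypothetical stand-ins for the output of a tracking SDK such as those for the Kinect or Leap Motion, and the grab flag stands in for whatever gesture recogniser the system uses.

    from dataclasses import dataclass

    SCREEN_W, SCREEN_H = 1920, 1080

    @dataclass
    class Hand:
        x: float           # palm position, normalised to [0, 1] by the tracker
        y: float
        is_grabbing: bool  # a simple pinch/grab gesture flag

    def cursor_position(hand):
        # Cursor-based style: the hand replicates the mouse, reproducing
        # the WIMP pointing pattern in mid-air.
        return int(hand.x * SCREEN_W), int(hand.y * SCREEN_H)

    def manipulate(hand, obj_pos):
        # Manipulation style: while a grab gesture is held, the object is
        # coupled directly to the hand, so the user recognises a physical
        # action rather than recalling an item from a gesture library.
        return cursor_position(hand) if hand.is_grabbing else obj_pos

    # An open hand leaves the object in place; a grab moves it with the hand.
    obj = (400, 300)
    obj = manipulate(Hand(0.5, 0.5, False), obj)  # stays at (400, 300)
    obj = manipulate(Hand(0.5, 0.5, True), obj)   # moves to (960, 540)

In the manipulation style the object follows the hand only while it is held, so no abstract gesture vocabulary needs to be memorised.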

8 June
Aaron Quigley: Ubiquitous User Interfaces

Displays are all around us, on and around our body, fixed and mobile, bleeding into the very fabric of our day-to-day lives. They come in many forms, such as smart watches, head-mounted displays and tablets, and as fixed, mobile, ambient and public displays. However, we know more about the displays connected to our devices than they know about us. Displays, and the devices they are connected to, are largely ignorant of the context in which they sit, including physiological, environmental and computational state. They do not know about the physiological differences between people, the environments they are being used in, or whether they are being used by one person or by several.

In this talk I review a number of aspects of displays in terms of how we can model, measure, predict and adapt the ways people use displays in a myriad of settings. With modelling we seek to represent the physiological differences between people and use the models to adapt and personalise designs and user interfaces. With measurement and prediction we seek to employ various computer vision and depth-sensing techniques to better understand how displays are used. And with adaptation we aim to explore subtle techniques and means to support diverging input and output fidelities of display devices. This talk draws on a number of studies from work published in UIST, CHI, MobileHCI, IUI, AVI and UMAP.

Biography

Professor Aaron Quigley is the Chair of Human Computer Interaction in the School of Computer Science at the University of St Andrews, UK. Aaron is one of the ACM Future of Computing Academy convenors, ACM SIGCHI Vice President for Conferences, and program co-chair for the ACM IUI 2018 conference in Tokyo, Japan. His research interests include surface and multi-display computing, human computer interaction, pervasive and ubiquitous computing, and information visualisation. He has published over 160 internationally peer-reviewed publications, including edited volumes, journal papers, book chapters, and conference and workshop papers, and holds 3 patents. In addition, he has served on over 80 program committees and has been involved in chairing roles for over 20 international conferences and workshops, including UIST, ITS, CHI, Pervasive, UbiComp, Tabletop, LoCA, UM, I-HCI, BCS HCI and MobileHCI.

15 June
John Mateer: Directing for Cinematic Virtual Reality: the relationship between 'presence', 'transportation' and 'suspension of disbelief'

The emerging medium of 'Cinematic Virtual Reality' (CVR) features media fidelity that approaches what is found in feature film. Unlike traditional VR, CVR limits the level of control users have within the environment to choosing viewpoints rather than interacting with the world itself. This means that CVR production arguably represents a new type of filmmaking. 'Suspension of disbelief' represents the level of immersion audiences experience when watching a film. Likewise, 'presence' refers to a similar experiential measure in Virtual Reality though it is considered slightly differently. This talk considers the use of 'transportation theory' as a bridge between these constructs to enable established film directing methods to be more readily transferred to Virtual Reality and, specifically, Cinematic VR production.

Archive of previous seminars

