Human Handheld-Device Interaction

An Adaptive User Interface

Abstract

The move to smaller, lighter and more powerful mobile handheld devices, whether PDAs or smartphones, appears to be a trend that is gathering speed. With numerous embedded technologies and wireless connectivity, this shift opens up opportunities to make daily activities both more efficient and more engaging. Despite these advancing possibilities, the shrinking size and mobile use of such devices impose challenges on both the technical and the usability aspects of the devices and their applications. An adaptive user interface, able to autonomously adjust its display and available actions to the current goals, context and emotions of its user, offers solutions for the limited input options, the constraints on output presentation, and the user requirements that arise from mobility and attention shifting in human handheld-device interaction.

The present work takes preliminary steps toward a framework for the rapid construction of adaptive user interfaces on handheld devices that are multimodal, context-aware and affective. The framework consists of predefined modules that can work in isolation but can also be connected in an ad hoc way as part of the framework. The modules deal with human handheld-device interaction, the interpretation of the user's actions, knowledge structure and management, the selection of appropriate responses, and the presentation of feedback.

Human language and visual perception models have been studied to formulate concepts and ideas as both text and visual language-based messages. An adaptive circular on-screen keyboard and visual language-based interfaces are proposed as alternative input options for fast interaction. In particular, sentences in the visual language can be constructed from spatial arrangements of visual symbols such as icons, lines, arrows and ellipses (see the first sketch below). Because icons can cross language barriers, interaction in the visual language is suitable for language-independent contexts. Personalized predictive and language-based features have been added to accelerate both input methods.

An ontology has been chosen to represent knowledge of the user, the task and the world. The modeling and structure of the knowledge representation are designed to share common semantics, to integrate inter-module communication, and to fulfil the context-awareness requirement. This enables the framework to be developed into widespread applications across different domains.

Context awareness is approached by interpreting both the verbal and the non-verbal aspects of user inputs to update the system's beliefs about the user, the task and the world. Methods and techniques have been developed to fuse multiple input modalities, carrying multiple messages from multiple users, into a coherent and context-dependent interpretation. A simple approach to emotion analysis interprets the non-verbal aspect of the inputs: keywords are spotted and the emotional state is categorized into a valence orientation with an intensity (see the second sketch below). Because unmatched words are simply ignored, the approach suits input recognition with high uncertainty.

Template-based interaction management and output generation methods have been developed. The templates link directly to concepts in the ontology-based knowledge representation and thus share common semantics with the other modules in the framework (see the third sketch below). This allows larger-scale systems to be developed with consistent and easily verified knowledge repositories.
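To make the visual-language idea concrete, the following is a minimal sketch of how such a message might be represented; the names (Icon, Arrow, verbalize) and the example reading are illustrative assumptions, not the thesis's actual design.

```python
from dataclasses import dataclass

# Hypothetical structures for a visual-language message: icons placed on a
# 2-D canvas, connected by arrows that express relations between them.
@dataclass
class Icon:
    concept: str          # ontology concept the icon denotes, e.g. "fire"
    x: float              # position on the canvas
    y: float

@dataclass
class Arrow:
    source: Icon          # an arrow relates two icons directionally,
    target: Icon          # e.g. fire -> warehouse reads "fire at warehouse"
    label: str = "at"

def verbalize(arrow: Arrow) -> str:
    """Render one icon-arrow-icon arrangement as a simple text sentence."""
    return f"{arrow.source.concept} {arrow.label} {arrow.target.concept}"

if __name__ == "__main__":
    fire = Icon("fire", 10, 20)
    warehouse = Icon("warehouse", 40, 20)
    print(verbalize(Arrow(fire, warehouse)))   # -> "fire at warehouse"
```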
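The keyword-spotting emotion analysis described above could, under an assumed lexicon and an assumed aggregation rule, look roughly as follows; the entries below are made up for illustration.

```python
import re

# Each lexicon entry maps a keyword to a valence orientation (sign) and an
# intensity (magnitude). This lexicon is an example, not the thesis's own.
EMOTION_LEXICON = {
    "help":     (-1, 0.8),   # (valence, intensity)
    "trapped":  (-1, 0.9),
    "safe":     (+1, 0.6),
    "rescued":  (+1, 0.8),
}

def analyze_emotion(message: str) -> tuple[int, float]:
    """Return (valence, intensity); valence 0 means neutral/unknown.

    Spotting whole keywords keeps the method tolerant of noisy input:
    misrecognized words simply fail to match and are ignored.
    """
    words = re.findall(r"[a-z']+", message.lower())
    hits = [EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON]
    if not hits:
        return 0, 0.0
    valence = 1 if sum(v for v, _ in hits) >= 0 else -1
    intensity = max(i for _, i in hits)    # strongest cue dominates
    return valence, intensity

print(analyze_emotion("We are trapped, please send help"))  # (-1, 0.9)
```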
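Template-based output generation with a direct link to ontology concepts might be sketched as below; the concept names, template syntax and validation rule are assumptions for illustration.

```python
# Every template slot names a concept from the ontology, so the generation
# module shares one vocabulary with the rest of the framework.
ONTOLOGY = {
    "Incident": {"is_a": "Event"},
    "Location": {"is_a": "Place"},
}

TEMPLATES = {
    "confirm_report": "A {Incident} has been reported near {Location}.",
}

def generate(template_id: str, bindings: dict[str, str]) -> str:
    """Fill a template; reject slots that are not ontology concepts, which
    keeps the knowledge repositories consistent and easy to verify."""
    for concept in bindings:
        if concept not in ONTOLOGY:
            raise KeyError(f"unknown ontology concept: {concept}")
    return TEMPLATES[template_id].format(**bindings)

print(generate("confirm_report",
               {"Incident": "fire", "Location": "the station"}))
```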
A multimodal, multi-user and multi-device communication system in the field of crisis management has been built on the framework as a proof of the proposed concepts. The system comprises a selection of the framework's modules for reporting and collaborating on observations using handheld devices over mobile ad hoc network-based communication. It supports communication through a combination of text, visual language and graphics. The system is able to interpret user messages, construct knowledge of the user, the task and the world, and develop a crisis scenario. User tests assessed whether users are capable of expressing their messages with the provided modalities; the tests also addressed usability issues in interacting with an adaptive user interface on handheld devices. The experimental results indicate that the adaptive user interface can support communication between users and between users and their handheld devices. Moreover, an explorative study within this research has generated knowledge regarding user requirements (their technical, social and usability aspects) for adaptive user interfaces and, more generally, for human handheld-device interaction. The rationale behind our approaches, designs, empirical evaluations and implications for research on the framework for an adaptive user interface on handheld devices is also described in this thesis.
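As a rough illustration of how interpreted reports from multiple users might be fused into one shared crisis scenario, here is a minimal sketch; the Report structure and the keep-the-most-confident fusion rule are assumptions for illustration only.

```python
from dataclasses import dataclass

# Hypothetical sketch: fuse reports from several users into one shared
# crisis scenario, keeping the most confident observation per
# (incident, location) pair.
@dataclass
class Report:
    user: str
    incident: str      # ontology concept, e.g. "fire"
    location: str      # ontology concept, e.g. "warehouse"
    confidence: float  # recognition confidence of the interpreted message

def fuse(reports: list[Report]) -> dict[tuple[str, str], Report]:
    scenario: dict[tuple[str, str], Report] = {}
    for r in reports:
        key = (r.incident, r.location)
        if key not in scenario or r.confidence > scenario[key].confidence:
            scenario[key] = r
    return scenario

reports = [
    Report("user-1", "fire", "warehouse", 0.7),
    Report("user-2", "fire", "warehouse", 0.9),   # corroborates user-1
    Report("user-3", "smoke", "station", 0.6),
]
for (incident, location), r in fuse(reports).items():
    print(f"{incident} at {location} "
          f"(reported by {r.user}, conf {r.confidence})")
```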
