
Showing papers presented at "British HCI Conference in 2017"


Proceedings ArticleDOI
08 Jun 2017
TL;DR: The aim of this paper is to outline the design of a chatbot to be used within mental health counselling, able to provide initial counselling and direct users to the correct services or self-help information.
Abstract: The aim of this paper is to outline the design of a chatbot to be used within mental health counselling. Mental health problems are one of the main causes of the burden of disease worldwide. In the UK, mental health contributes 28% of the total burden of disease, compared with 16% each for cancer and heart disease. Stress, anxiety or depression accounted for 15.8 million days of sickness absence across the UK in 2016. By 2020, the gap between the demand for mental health care and the resources the National Health Service (NHS) can provide is likely to widen, so providers increasingly need to find more cost-effective ways to deliver mental health care. Digital interventions have been created to help with issues such as anxiety, stress and depression. Chatbots can be incorporated into digital interventions or used as standalone interventions. Chatbots can offer users a more interactive way to receive information, complete diagnostic tools, or even receive counselling. A demo chatbot was created using interactive emojis and GIFs to improve the user experience when searching for online self-help tips. This chatbot will be further developed and incorporated into a full web-based programme for mental health in the workplace. It is envisaged that the chatbot will be able to provide initial counselling and direct users to the correct services or self-help information.
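The abstract does not describe the demo chatbot's internals. A minimal, purely illustrative sketch of the kind of keyword triage such a chatbot might use to route users toward self-help tips or signposting (all keywords, replies and the function name are hypothetical, not taken from the paper) could look like:

```python
def route_message(message: str) -> str:
    """Map a user's message to a self-help tip or a signposting reply.

    Illustrative rule-based triage only; a real counselling chatbot
    would need far more careful clinical design and safeguarding.
    """
    text = message.lower()
    # Keyword triage: point the user at relevant self-help material.
    triage = {
        "stress": "Here is a quick stress self-help tip: try a short breathing exercise.",
        "anxiety": "Here are some grounding exercises that can help with anxiety.",
        "depression": "Here are low-mood resources, and how to reach NHS services.",
    }
    for keyword, reply in triage.items():
        if keyword in text:
            return reply
    # Fall back to a general prompt when no keyword matches.
    return "I can help with stress, anxiety or low mood. What's on your mind?"
```

A fuller version would lead users from these replies into the relevant services, as the paper envisages.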

96 citations


Proceedings ArticleDOI
George Moore1
03 Jul 2017
TL;DR: In this paper, it is proposed that in the near future cars could detect and publicly display their drivers' emotions in real-time, by making use of an affect engine along with colour changing paint.
Abstract: It is postulated that in the near future cars could detect and publicly display their drivers' emotions in real time, by making use of an 'Affect Engine' along with colour-changing paint. Moreover, using an implicit human-in-the-loop approach and emerging communications standards, drivers' perceptions and emotional responses could form an integral part of wider affect-based ad hoc vehicle-to-vehicle networks, allowing vehicles to participate in social emotional displays that interact with transport infrastructure to help advantage, and so encourage, positive emotional states and interactions.
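The paper does not specify how detected emotions would map to the colour-changing paint. As a purely hypothetical sketch of that mapping step (the emotion labels and colours below are invented for illustration), the display side could be as simple as a lookup:

```python
def paint_colour(emotion: str) -> str:
    """Map an inferred driver emotion to a paint display colour (RGB hex).

    Hypothetical palette; the paper's 'Affect Engine' is not specified.
    """
    palette = {
        "calm": "#4CAF50",    # green
        "joy": "#FFC107",     # amber
        "anger": "#F44336",   # red
        "stress": "#9C27B0",  # purple
    }
    return palette.get(emotion, "#FFFFFF")  # neutral white when unknown
```

In the envisaged system this output would feed the colour-changing paint, and the same emotion label could be broadcast over the vehicle-to-vehicle network.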

1 citation


Proceedings ArticleDOI
08 Jun 2017
TL;DR: This paper presents a case study and a working prototype to test the utility of interactive medical device instructions accessed by a QR code attached to the medical device.
Abstract: Usability is an increasingly important factor within the field of healthcare and medical device development. One of the main issues with the usability of medical devices is their complex nature. Therefore, it is vital that comprehensive and clear instructions are provided to aid in the operation of these devices. While paper-based instructions are commonly provided, they have many disadvantages which can be addressed by interactive digital instructions. Moreover, in an era of pervasive computing, it is important to provide these instructions at the point of need. This can be done using a Quick Response (QR) code and a smartphone, which allow interactive instructions to be instantly accessible. This paper presents a case study and a working prototype to test the utility of interactive medical device instructions accessed by a QR code attached to the medical device.
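The paper does not describe its URL scheme, but the core idea of a QR code attached to a device resolving to that device's interactive instructions can be sketched as building a per-device URL for the code to encode (the base URL, parameter names and device ID below are assumptions for illustration):

```python
from urllib.parse import urlencode

# Hypothetical base URL for the hosted interactive instructions.
BASE_URL = "https://example.org/instructions"

def instruction_url(device_id: str, step: int = 1) -> str:
    """Build the URL a device's QR code would encode, resolving to
    interactive instructions for that specific device at the point of need."""
    query = urlencode({"device": device_id, "step": step})
    return f"{BASE_URL}?{query}"
```

The resulting string would then be rendered as a QR image (e.g. with a library such as `qrcode`) and printed on a label attached to the device; scanning it with a smartphone opens the instructions directly.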

1 citation


Proceedings Article
20 Jun 2017
TL;DR: This research project focuses on exploiting vision-based input modalities to perceive and infer a user's emotional and cognitive states in order to drive adaptation of the graphical user interface, or indeed of the user interaction.
Abstract: The advancement of technology and computer systems requires continuous enhancement by means of the interaction between users and computers. Subsequently, the availability of unobtrusive visualised input modalities, such as eye trackers and RGB cameras has enabled the viable detection of users’ emotions and cognitive states. However, how to utilise such pervasive input modalities to enable a computer to actively interact with a user based on their current emotional and cognitive states is a challenging problem. This paper presents a research study that is currently taking place at Ulster University, which seeks to investigate creating a user model that facilitates future creation of Adaptive Human Computer Interaction. In other words, this research project focuses on exploiting visual-based input modalities to perceive and infer a user’s emotional and cognitive states in order to predicate adaptation of the graphical user interface or indeed the user interaction.