Author

Johan Kildal

Other affiliations: University of Glasgow
Bio: Johan Kildal is an academic researcher from Nokia. The author has contributed to research in topics: Human–robot interaction & User experience design. The author has an h-index of 17 and has co-authored 59 publications receiving 1091 citations. Previous affiliations of Johan Kildal include University of Glasgow.


Papers
Proceedings ArticleDOI
18 Apr 2015
TL;DR: This article goes beyond the focused research questions addressed so far by delineating the research area, synthesizing its open challenges and laying out a research agenda.
Abstract: Physical representations of data have existed for thousands of years. Yet it is now that advances in digital fabrication, actuated tangible interfaces, and shape-changing displays are spurring an emerging area of research that we call Data Physicalization. It aims to help people explore, understand, and communicate data using computer-supported physical data representations. We call these representations physicalizations, analogously to visualizations -- their purely visual counterpart. In this article, we go beyond the focused research questions addressed so far by delineating the research area, synthesizing its open challenges and laying out a research agenda.

370 citations

Journal ArticleDOI
TL;DR: The experiments conducted and results achieved by the authors in the context of the FourByThree project are described, aiming to measure workers' trust in fenceless human–robot collaboration in industrial robotic applications, as well as to gauge the acceptance of different interaction mechanisms between robots and human beings.
Abstract: Human–robot collaboration is a key factor for the development of factories of the future, a space in which humans and robots can work and carry out tasks together. Safety is one of the most critica...

146 citations

Patent
Johan Kildal
20 Apr 2011
TL;DR: In this article, a method for providing tactile feedback in response to a user input is described, where force sensing information, associated with force applied to an input surface by an input object and detected by the force sensor (250), is obtained, and a tactile output actuator (240) is controlled to produce tactile output imitating the physical sensation associated with displacement of the input surface.
Abstract: In accordance with an example embodiment of the present invention, a method is provided for providing tactile feedback in response to a user input. Force sensing information, associated with force applied to an input surface by an input object and detected by the force sensor (250), is obtained, and a tactile output actuator (240) is controlled, on the basis of the force sensing information, to produce tactile output imitating the physical sensation associated with displacement of the input surface.

77 citations
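The patent itself publishes no code, but the flow it describes (force sensing on a rigid input surface driving a tactile actuator so the surface feels as if it displaces) can be sketched roughly as below. All class, method, and parameter names are hypothetical illustrations, not the patented implementation.

```python
# Minimal sketch (hypothetical names) of the control loop the patent describes:
# force readings from an input surface drive a tactile actuator so that the
# output imitates the feel of the surface physically displacing.

class TactileFeedbackController:
    def __init__(self, force_sensor, tactile_actuator, max_force_n=5.0):
        self.force_sensor = force_sensor          # e.g. element (250) in the patent figures
        self.tactile_actuator = tactile_actuator  # e.g. element (240)
        self.max_force_n = max_force_n            # force at which "full travel" is simulated

    def update(self):
        """Read the current force and drive the actuator accordingly."""
        force = self.force_sensor.read_newtons()
        # Map force to a simulated displacement (0..1); a real device would use
        # a tuned, possibly non-linear profile rather than this linear ramp.
        displacement = min(force / self.max_force_n, 1.0)
        # Drive the actuator with an intensity proportional to the simulated
        # displacement so the rigid surface feels as if it were giving way.
        self.tactile_actuator.set_intensity(displacement)
```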

Proceedings ArticleDOI
05 May 2012
TL;DR: This paper proposes the topics that a research agenda should cover, describes the research methodology, and presents an initial set of design guidelines that future research will develop further.
Abstract: We introduce the user-centered research that we are conducting using functional deformable research prototypes. This work has recently crystallized in the demonstration of the Nokia Kinetic Device (figure 1). In the large design space that opens before us around deformable user interfaces (DUIs), we have chosen to focus on mobile personal interfaces. We aim to investigate how human factors should influence the transition from rigid to deformable hardware. In this paper, we propose the topics that a research agenda should cover, and we discuss our research methodology. We also describe the functional deformable research prototype (called Kinetic DUI-RP) that we are using to conduct our research. Finally, we present an initial set of design guidelines that future research will develop further.

67 citations

Proceedings ArticleDOI
Johan Kildal
08 Nov 2010
TL;DR: This paper reports a new intramodal haptic illusion that could be used to augment touch interaction with mobile devices, transcending the rigid two-dimensional tangible surface (touch display) currently found on them.
Abstract: This paper reports a new intramodal haptic illusion. This illusion involves a person pressing on a rigid surface and perceiving that the surface is compliant, i.e. perceiving that the contact point displaces into the surface. The design process, method and conditions used to create this illusion are described in detail. A user study is also reported in which all participants using variants of the basic method experienced the illusion, demonstrating the effectiveness of the method. This study also offers an initial indication of the mechanical dimensions of illusory compliance that could be manipulated by varying the stimuli presented to the users. This method could be used to augment touch interaction with mobile devices, transcending the rigid two-dimensional tangible surface (touch display) currently found on them.

59 citations


Cited by
Proceedings ArticleDOI
27 Apr 2013
TL;DR: A study exploring participants' verbalizations of their tactile experiences across two modulated tactile stimuli related to two important mechanoreceptors in the human hand proposes 14 categories for a human-experiential vocabulary based on the categorization of the findings.
Abstract: A common problem with designing and developing applications with tactile interfaces is the lack of a vocabulary that allows one to describe or communicate about haptics. Here we present the findings from a study exploring participants' verbalizations of their tactile experiences across two modulated tactile stimuli (16Hz and 250Hz) related to two important mechanoreceptors in the human hand. The study, with 14 participants, applied the explicitation interview technique to capture detailed descriptions of the diachronic and synchronic structure of tactile experiences. We propose 14 categories for a human-experiential vocabulary based on the categorization of the findings and tie them back to neurophysiological and psychophysical data on the human hand. We finally discuss design opportunities created through this experiential understanding in relation to the two mechanoreceptors.

1,602 citations
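As a rough illustration of the kind of stimuli referred to above, the snippet below synthesizes sinusoidal drive signals at 16 Hz and 250 Hz. The sample rate, duration, and absence of modulation are assumptions made for illustration, not the study's actual stimulus parameters.

```python
# Illustrative sketch of synthesizing sinusoidal vibrotactile drive signals such
# as the 16 Hz and 250 Hz stimuli mentioned above; parameters are assumptions.
import numpy as np

def tactile_stimulus(carrier_hz, duration_s=1.0, sample_rate=8000):
    """Return a sine-wave drive signal for a vibrotactile actuator."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    return np.sin(2 * np.pi * carrier_hz * t)

low = tactile_stimulus(16)    # low-frequency channel of the human hand
high = tactile_stimulus(250)  # Pacinian channel, most sensitive near 250 Hz
```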

Journal ArticleDOI
TL;DR: This entry refers to the book The Body in the Mind: The Bodily Basis of Meaning, Imagination, and Reason.

863 citations

Proceedings ArticleDOI
21 Apr 2018
TL;DR: This work investigates how HCI researchers can help to develop accountable systems by performing a literature analysis of 289 core papers on explanations and explainable systems, as well as 12,412 citing papers.
Abstract: Advances in artificial intelligence, sensors and big data management have far-reaching societal impacts. As these systems augment our everyday lives, it becomes increasingly important for people to understand them and remain in control. We investigate how HCI researchers can help to develop accountable systems by performing a literature analysis of 289 core papers on explanations and explainable systems, as well as 12,412 citing papers. Using topic modeling, co-occurrence and network analysis, we mapped the research space from diverse domains, such as algorithmic accountability, interpretable machine learning, context-awareness, cognitive psychology, and software learnability. We reveal fading and burgeoning trends in explainable systems, and identify domains that are closely connected or mostly isolated. The time is ripe for the HCI community to ensure that the powerful new autonomous systems have intelligible interfaces built-in. From our results, we propose several implications and directions for future research towards this goal.

539 citations
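The authors' exact analysis pipeline is not reproduced here, but the topic-modeling step the abstract mentions can be sketched with a generic bag-of-words plus LDA approach. The toy corpus, the choice of scikit-learn, and the parameters below are illustrative assumptions, not the paper's setup.

```python
# Simplified sketch of topic modeling over a corpus of paper abstracts,
# in the spirit of the literature analysis described above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "explanations improve trust in intelligent systems",
    "interpretable machine learning models for accountability",
    "context-aware systems that explain their behaviour to users",
]

# Bag-of-words representation of the corpus.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(abstracts)

# Fit a small LDA model; a real analysis would use far more documents and topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per topic to inspect the themes the model found.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {topic_idx}: {', '.join(top_terms)}")
```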

Patent
Daehwan Kim, Yoonki Hong
22 Apr 2013
TL;DR: In this patent, a mobile terminal and a control method thereof, which can obtain an image, are provided: a camera unit obtains image information, a pictogram extraction unit extracts pictograms from it, and a controller detects related information and displays it overlapping the obtained image.
Abstract: A mobile terminal and a control method thereof, which can obtain an image, are provided. A mobile terminal (100) includes a camera unit (121), a pictogram extraction unit (182) and a controller (180). The camera unit (121) obtains image information corresponding to at least one of a still image and a moving image. The pictogram extraction unit (182) extracts at least one pictogram from the obtained image information. The controller (180) detects information related to the extracted pictogram, and displays the detected information to be overlapped with the obtained image information. In the mobile terminal, the controller (180) includes, as the detected information, at least one of previously recorded information and currently searched information related to the extracted pictogram.

482 citations
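The patent describes an architecture rather than code, but the component flow it claims (camera unit, pictogram extraction unit, controller overlaying detected information) can be sketched as follows. The class and method names are hypothetical and only illustrate the described data flow.

```python
# Minimal sketch (hypothetical classes and method names) of the component flow
# the patent describes: camera -> pictogram extraction -> information lookup ->
# overlay on the captured image. This is illustrative, not the patented code.

class PictogramPipeline:
    def __init__(self, camera, extractor, info_source, display):
        self.camera = camera            # camera unit (121)
        self.extractor = extractor      # pictogram extraction unit (182)
        self.info_source = info_source  # previously recorded or searched information
        self.display = display

    def process_frame(self):
        image = self.camera.capture()                     # still image or video frame
        pictograms = self.extractor.extract(image)        # detect pictograms in the image
        for pictogram in pictograms:
            info = self.info_source.lookup(pictogram)     # find related information
            self.display.overlay(image, pictogram, info)  # show it overlapping the image
```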