
Showing papers by Jodi Forlizzi published in 2018


Proceedings ArticleDOI
08 Jun 2018
TL;DR: Two complementary studies investigating the experience of households living with a conversational agent over an extended period of time find interesting behaviors around purchasing and acclimating to Alexa, in the number and physical placement of devices, and in daily use patterns.
Abstract: In-home, place-based, conversational agents have exploded in popularity over the past three years. In particular, Amazon's conversational agent, Alexa, now dominates the market and is in millions of homes. This paper presents two complementary studies investigating the experience of households living with a conversational agent over an extended period of time. First, we gathered the history logs of 75 Alexa participants and quantitatively analyzed over 278,000 commands. Second, we performed seven in-home, contextual interviews of Alexa owners focusing on how their household interacts with Alexa. Our findings give the first glimpse of how households integrate Alexa into their lives. We found interesting behaviors around purchasing and acclimating to Alexa, in the number and physical placement of devices, and in daily use patterns. Participants also uniformly described interactions between children and Alexa. We conclude with suggestions for future improvement for intelligent conversational agents.
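
As a rough, hypothetical illustration of the kind of log aggregation such a quantitative analysis might involve (the CSV layout, column names, and household_id field below are assumptions; the paper does not publish its analysis pipeline), a minimal sketch in Python:

import pandas as pd

def daily_use_patterns(log_csv: str) -> pd.DataFrame:
    """Count commands per household per calendar day from an exported history log."""
    logs = pd.read_csv(log_csv, parse_dates=["timestamp"])
    logs["date"] = logs["timestamp"].dt.date
    return (logs.groupby(["household_id", "date"])
                .size()
                .reset_index(name="commands"))

# e.g., summarize each household's daily command counts:
# patterns = daily_use_patterns("alexa_history.csv")
# print(patterns.groupby("household_id")["commands"].describe())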

256 citations


Proceedings ArticleDOI
08 Jun 2018
TL;DR: Designers appeared to be the most successful when they engaged in ongoing collaboration with data scientists to help envision what to make and when they embraced a data-centric culture.
Abstract: Machine learning (ML) plays an increasingly important role in improving a user's experience. However, most UX practitioners face challenges in understanding ML's capabilities or envisioning what it might be. We interviewed 13 designers who had many years of experience designing the UX of ML-enhanced products and services. We probed them to characterize their practices. They shared they do not view themselves as ML experts, nor do they think learning more about ML would make them better designers. Instead, our participants appeared to be the most successful when they engaged in ongoing collaboration with data scientists to help envision what to make and when they embraced a data-centric culture. We discuss the implications of these findings in terms of UX education and as opportunities for additional design research in support of UX designers working with ML.

133 citations


Journal ArticleDOI
TL;DR: A set of prominent designers embarked on a research journey to explore aesthetics in movement-based design and unpack one of the design sensitivities unique to their practice: a strong first-person perspective.
Abstract: A set of prominent designers embarked on a research journey to explore aesthetics in movement-based design. Here we unpack one of the design sensitivities unique to our practice: a strong first-person perspective.

121 citations


Proceedings ArticleDOI
23 Sep 2018
TL;DR: This paper is the first to surface AV-related interview data from pedestrians in a natural, real-world setting and finds an inherent relationship between favorable perceptions of technology and feelings of trust toward AVs.
Abstract: Autonomous vehicles have been in development for nearly thirty years and recently have begun to operate in real-world, uncontrolled settings. With such advances, more widespread research and evaluation of human interaction with autonomous vehicles (AVs) is necessary. Here, we present an interview study of 32 pedestrians who have interacted with Uber AVs. Our findings are focused on understanding and trust of AVs, perceptions of AVs and artificial intelligence, and how the perception of a brand affects these constructs. We found an inherent relationship between favorable perceptions of technology and feelings of trust toward AVs. Trust in AVs was also influenced by a favorable interpretation of the company's brand and facilitated by knowledge about what AV technology is and how it might fit into everyday life. To our knowledge, this paper is the first to surface AV-related interview data from pedestrians in a natural, real-world setting.

60 citations


Journal ArticleDOI
TL;DR: In the early days of HCI, the discipline focused on human factors in the development of computer interfaces for operators, but later on, HCI evolved to consider user-centered design, as computers moved out of operations centers and into general use by the workforce.
Abstract: If I could wave a magic wand, I would use it to make the HCI community move beyond user-centered design to a notion of stakeholder-centered design [1]. At the extreme, I think user-centered design is dead; when I'm feeling less extreme, I tell my students, my collaborators, and companies I consult with that we are no longer designing one thing for one person. Instead, we are doing stakeholder-centered design, which takes into account the notion of different entities interacting with and through products, services, and systems to achieve a desired outcome. While there are many histories of HCI, only a handful foreshadow the current shift in our discipline. One account presents three distinct paradigms within HCI: human factors, cognitive, and phenomenological or situated [2]. Another describes four distinct HCI foci based on the kinds of things made by practitioners: interfaces for operators, software interfaces people can use, software that improves task performance for workers, and devices and software people use to construct personal experiences [3]. Both accounts point to changes in technology and society as drivers of the paradigm shifts. They can be simplified into three lenses that I feel shape the discipline of HCI: human factors, user-centered design, and user experience design. In the early days of HCI, our discipline focused on human factors in the development of computer interfaces for operators. The goal was to make interfaces that did not exceed anyone's physiological or cognitive abilities. In many cases, operators were designing interfaces for themselves. Later on, HCI evolved to consider user-centered design, as computers moved out of operations centers and into general use by the workforce. The focus broadened beyond usability and effectiveness to include entertainment and engagement, among other things. HCI evolved again to focus on experience design, adding this lens to the previous lenses of human factors and user-centered design. This lens focused on deeply understanding the needs of users in all areas of life [4]. Since that time, technology and society have evolved drastically. Let's take a look at the major advances in technology over the past several decades. Our discipline has seen the development of the smartphone, wearable devices, ubiquitous and ...

44 citations


Journal ArticleDOI
16 Nov 2018
TL;DR: In this paper, the authors propose a formalism that enables a robot to decide optimally between taking a physical action toward task completion and issuing an utterance to the human teammate.
Abstract: Human collaborators effectively coordinate their actions through both verbal and non-verbal communication. We believe that the same should hold for human-robot teams. We propose a formalism that enables a robot to decide optimally between taking a physical action toward task completion and issuing an utterance to the human teammate. We focus on two types of utterances: verbal commands, where the robot asks the human to take a physical action, and state-conveying actions, where the robot informs the human about its internal state, which captures the information that the robot uses in its decision making. Human subject experiments show that enabling the robot to issue verbal commands is the most effective form of communicating objectives, while retaining user trust in the robot. Communicating information about the robot's state should be done judiciously, since many participants questioned the truthfulness of the robot's statements when the robot did not provide sufficient explanation about its actions.
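
The abstract frames the robot's choice as an optimization over physical actions and utterances. As a minimal, hypothetical sketch of that kind of decision step (the option names, reward numbers, and expected_value field are illustrative assumptions, not the paper's actual formalism):

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    kind: str               # "physical", "verbal_command", or "state_conveying"
    expected_value: float   # estimated contribution to task completion

def choose_next_step(options: list[Option]) -> Option:
    """Pick the physical action or utterance with the highest expected value."""
    return max(options, key=lambda o: o.expected_value)

options = [
    Option("place part on fixture", "physical", 0.40),
    Option("ask teammate to hand over the wrench", "verbal_command", 0.55),
    Option("explain that the left bin is blocked", "state_conveying", 0.30),
]
print(choose_next_step(options).name)  # prints: ask teammate to hand over the wrench

In this toy model the trade-off the paper studies (act vs. command vs. explain) reduces to comparing scalar expected values; the paper's formalism additionally accounts for the human teammate's response and trust, which this sketch omits.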

33 citations


Journal ArticleDOI
09 May 2018
TL;DR: Wizard of Oz and small-scale scenarios were found fruitful as a collaboration basis for multidisciplinary teams, establishing a united understanding of the problem at hand.
Abstract: The introduction of autonomous vehicles will reshape the many social interactions that are part of traffic today. In order for autonomous vehicles to become successfully integrated, the social interactions surrounding them need to be purposefully designed. To ensure success and save development effort, design methods that explore social aspects in early design phases are needed to provide conceptual directions before committing to concrete solutions. This paper contributes an exploration of methods for addressing the social aspects of autonomous vehicles in three key areas: the vehicle as a social entity in traffic, co-experience within the vehicle, and the user-vehicle relationship. The methods explored include Wizard of Oz, small-scale scenarios, design metaphors, enactment, and peer-to-peer interviews. These were applied in a workshop setting with 18 participants from academia and industry. The methods provided interesting design seeds, though with differing effectiveness. The most promising methods enabled flexible idea exploration in a contextualized and concrete manner, using tangible objects and enactment to stage future use situations. Further, combinations of methods that enable a shift between social perspectives were preferred. Wizard of Oz and small-scale scenarios proved fruitful as a collaboration basis for multidisciplinary teams, establishing a united understanding of the problem at hand.

30 citations


Proceedings ArticleDOI
01 Aug 2018
TL;DR: The more lifelike the design of the eyes, the higher the robot was rated on personable qualities and the more suitable it was perceived to be for the home; eye design did not affect how professional or how suitable for the office the robot was perceived to be.
Abstract: Engagement with social robots is influenced by their appearance and shape. While robots are designed with various features, almost all designs have some form of eyes. In this paper, we evaluate eye design variations for tabletop robots in a lab study, with the goal of learning how they influence participants' perception of the robots' personality and functionality. This evaluation is conducted with non-working “paper prototypes”, a common design methodology which enables quick evaluation of a variety of designs. By comparing sixteen eye designs we found: (1) The more lifelike the design of the eyes was, the higher the robot was rated on personable qualities, and the more suitable it was perceived to be for the home; (2) Eye design did not affect how professional and how suitable for the office the robot was perceived to be. We suggest that designers can use paper prototypes as a design methodology to quickly evaluate variations of a particular feature for social robots.

16 citations


Proceedings ArticleDOI
30 May 2018
TL;DR: This day-long workshop seeks to look at exemplar CDR and CD case studies, to develop methods for describing, evaluating, replicating, and making use of knowledge outcomes from these two forms of design research.
Abstract: Over the last two decades, constructive design research (CDR) - more commonly called Research through Design within HCI - has become an accepted mode of scholarly inquiry within the design research community. It has been described as having three distinct genres: lab, field, and showroom. The lab and field genres typically take a pragmatic stance and typically propose a preferred future. Research done following the showroom approach - more commonly known as critical design (CD), speculative design, or design fictions - typically offers a polemic and a critique of the current state embodied in an artifact. Recently, we have observed a growing conflict within the design research community between pragmatic and critical design researchers [4]. To help reduce this conflict, we called for a divorce between CD and pragmatic CDR, advocating that each approach has its own merits and should be evaluated on its own account. Other design researchers have pushed back on this stance, seeking to create some middle ground to connect these two types of research. In this day-long workshop, we seek to look at exemplar CDR and CD case studies, to develop methods for describing, evaluating, replicating, and making use of knowledge outcomes from these two forms of design research.

13 citations


Proceedings ArticleDOI
20 Apr 2018
TL;DR: Panelists will engage the audience in a structured discussion of where current research meets industry demands and the philosophical-to-technical challenges facing the successful integration of human-robot teaming.
Abstract: There is sustained and growing interest in human-robot teaming across academia and industry. Many critical questions remain as to how to foster flexible, effective teaming that allows humans and robots to work closely together. This panel will bring together experts on human-robot interaction (HRI) across academia and industry to discuss and debate those critical challenges. Panelists will engage the audience in a structured discussion of where current research meets industry demands and of the philosophical-to-technical challenges facing the successful integration of human-robot teaming.

9 citations


Proceedings ArticleDOI
08 Jun 2018
TL;DR: This is an interesting era for design in terms of the diverse ways that designers need to work: they must account for both the personal and adaptive and the global and scaled in how they think, design, and take action.
Abstract: This is an interesting era for design in terms of the diverse ways that designers need to work. The advent of cloud computing and the sensors in the mobile phones we carry means that a vast amount of data is collected about what people do. This creates a vast opportunity for designers to leverage that data to create products, services, and systems that are tailored to individual needs. At the same time, designers at Google are designing products and services that millions of people use every day. This sets up an interesting dichotomy for today's designers, who need to take into account both the personal and adaptive and the global and scaled in how they think, design, and take action.

Proceedings ArticleDOI
30 Nov 2018
TL;DR: The study showed that the algorithms can detect a person's mouth posture in near real-time during a robot-assisted meal in a social setting and classify whether the mouth is open or closed.
Abstract: Automatic mouth detection can assist in controlling a robotic self-feeding system for individuals with disabilities. To address this need, we developed and evaluated algorithms that: 1) detect and track the mouth of an individual in real-time, and 2) classify whether the mouth is open or closed. A k-nearest neighbors (KNN) classifier was used to recognize the mouth's posture. The KNN algorithm classified image frames using features extracted by four methods: histogram of oriented gradients (HOG), the Harris-Stephens algorithm, maximally stable extremal regions (MSER), and local binary patterns (LBP). The results of this study indicated high classification accuracy (~87%) using 10-fold cross-validation for three participants without disability. The study showed that the algorithms can detect a person's mouth posture in near real-time (<1 s) during a robot-assisted meal in a social setting.
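
As a minimal sketch of one of the feature/classifier combinations named above (HOG features fed to a KNN classifier, evaluated with 10-fold cross-validation): the crop size, the choice of k, and the data loading are assumptions, and the paper's other feature extractors (Harris-Stephens, MSER, LBP) are omitted.

import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def hog_features(gray_crops, size=(64, 64)):
    """Extract a HOG descriptor from each grayscale mouth-region crop."""
    return np.array([hog(resize(img, size)) for img in gray_crops])

def open_closed_accuracy(gray_crops, labels, k=5):
    """Return 10-fold cross-validated accuracy for open (1) vs. closed (0) mouths."""
    X = hog_features(gray_crops)
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X, labels, cv=10).mean()

# Usage, given cropped mouth images and 0/1 labels:
# acc = open_closed_accuracy(mouth_crops, labels)
# print(f"10-fold CV accuracy: {acc:.2%}")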