Showing papers by "Richard Harper published in 2013"


Journal ArticleDOI
17 May 2013-Science
TL;DR: This work describes problems with the adoption and use of scientific software and reveals key insights and best practices for how to develop, standardize, and implement software.
Abstract: Software pervades every domain of science (1–3), perhaps nowhere more decisively than in modeling. In key scientific areas of great societal importance, models and the software that implement them define both how science is done and what science is done (4, 5). Across all science, this dependence has led to concerns around the need for open access to software (6, 7), centered on the reproducibility of research (1, 8–10). From fields such as high-performance computing, we learn key insights and best practices for how to develop, standardize, and implement software (11). Open and systematic approaches to the development of software are essential for all sciences. But for many scientists this is not sufficient. We describe problems with the adoption and use of scientific software.

186 citations


Journal ArticleDOI
Kenton O'Hara, Richard Harper, Helena M. Mentis, Abigail Sellen, Alex S. Taylor
TL;DR: Norman's critique is indicative of the issue that while using the word natural might have become natural, it is coming at a cost, and there is a need to understand the key assumptions implicit within it and how these frame approaches to design and engineering in particular ways.
Abstract: Norman's critique is indicative of the issue that while using the word natural might have become natural, it is coming at a cost. In other words, precisely because the notion of naturalness has become so commonplace in the scientific lexicon of HCI, it is becoming increasingly important that there is a critical examination of the conceptual work being performed when the word is used. There is a need to understand the key assumptions implicit within it and how these frame approaches to design and engineering in particular ways. A second significant element of this perspective comes from Wittgenstein, and his claim that, through action, people create shared meanings with others, and these shared meanings are the essential common ground that enables individual perception to be cohered into socially organized, understood, and coordinated experiences.

125 citations


Journal ArticleDOI
TL;DR: Drawing on anthropological approaches to co-tellership (Ochs & Capps, 2001) and the notion of 'shared stories' (Georgakopoulou, 2007), the authors trace how storyworlds are co-constructed by multiple narrators via the communicative affordances which have developed in the Facebook status update: namely, the practices of commenting, liking, linking, tagging, photo-sharing, and marking geographical location.
Abstract: This article addresses the emergence of networked narration found in Facebook updates. Drawing on anthropological approaches to co-tellership (Ochs & Capps, 2001), we trace how storyworlds are co-constructed by multiple narrators via the communicative affordances which have developed in the Facebook status update: namely, the practices of commenting, liking, linking, tagging, photo-sharing, and marking geographical location. Our longitudinal analysis of 1800 updates elicited from 60 participants over a period of four years suggests that the rise of what we call a ‘networked narrative’ allows individuals to participate collectively in the construction of ‘shared stories’ (Georgakopoulou, 2007), and through this process for narrators to co-construct their social identities through their interactions with others. We argue that the distribution of storytelling as it takes place on Facebook may be found in other online and offline contexts, and challenges earlier, linear models of narrative form that have dominated discourse-analytic and literary-critical narratology.

44 citations


Proceedings ArticleDOI
23 Feb 2013
TL;DR: The character of the experience is discussed as one that entails users reveling in the absurdity of movement required by the Kinect sensor, and the 'third-space' defined by Kinect-based gestural interaction is likened to that of Bakhtin's mocking gaze in the context of carnivals.
Abstract: As the Kinect sensor is being extended from gaming to other applications and contexts, we critically examine the nature of the experience garnered through its current use in gaming in the home setting. Through an exploratory study of family experiences with Kinect in gaming, we discuss the character of the experience as one that entails users reveling in the absurdity of movement required by the Kinect sensor. Through this analysis, we liken the 'third-space' defined by Kinect-based gestural interaction to that of Bakhtin's mocking gaze in the context of carnivals. This is followed by remarks on the implications this re-specification of understanding Kinect-enabled interaction has for the term 'natural' and, relatedly, the emphasis on the 'user' as 'the controller' in HCI. Remarks will be made on the implications of this for the application of the Kinect sensor to distributed gaming and other non-gaming interaction spaces in the home.

33 citations


Proceedings ArticleDOI
27 Apr 2013
TL;DR: This paper asks what it means to design domestic web-connected technologies, placing the aesthetic and material properties intrinsic to the home and home life at the centre of the design exploration, and presents three concepts that were selected and prototyped from a broader process of research-through-design.
Abstract: Web-based technologies are often built to capitalize on the flexibility and fluidity that is supported by the internet, with the value of 'access anywhere' underpinning a blurring of boundaries across home and work. Yet the home is well known in HCI to have a unique set of qualities that can usefully be drawn upon when designing to support domestic life. In this paper we ask what it means to design domestic web-connected technologies, placing the aesthetic and material properties intrinsic to the home and home life at the centre of our design exploration. We present three concepts that were selected and prototyped from a broader process of research-through-design: Tokens of Search provides tangible handles to web resources; Hole in Space connects the home intimately to a remote place; and Manhattan enables the tangible exploration of events in the community, putting the home at the centre. Discussions in the paper consider not only how aesthetics is articulated in the material and digital properties of the artefacts, but also how a consideration of the properties of the home can create a potentially new design space to explore.

32 citations


Proceedings ArticleDOI
23 Feb 2013
TL;DR: It is suggested that files continue to act as a cohering concept, something like a 'boundary object' between computer engineers and users, but the effectiveness of this boundary object is now waning and new abstractions are needed.
Abstract: For over 40 years the notion of the file, as devised by pioneers in the field of computing, has been the subject of much contention. Some have wanted to abandon the term altogether on the grounds that metaphors about files can confuse users and designers alike. More recently, the emergence of the 'cloud' has led some to suggest that the term is simply obsolescent. In this paper we want to suggest that, despite all these conceptual debates and changes in technology, the term file remains central to systems architectures and to the concerns of users. Notwithstanding profound changes in what users do and technologies afford, we suggest that files continue to act as a cohering concept, something like a 'boundary object' between computer engineers and users. However, the effectiveness of this boundary object is now waning. There are increasing signs of slippage and muddle. Instead of throwing away the notion altogether, we propose that the definition and use of files as a boundary object be reconstituted. New abstractions are needed, ones which reflect what users seek to do with their digital data, and which allow engineers to solve the networking, storage and data management problems that ensue when files move from the PC on to the networked world of today.

23 citations


Proceedings ArticleDOI
03 Jul 2013
TL;DR: Clinical testing conducted to evaluate the accuracy of Aingeal, a wireless in-hospital patient monitor, in measuring respiration rate via impedance pneumography demonstrates that the Aingeal device's performance in measuring respiration rate is comparable to that of a well-accepted and widely used alternative method.
Abstract: This paper presents clinical testing conducted to evaluate the accuracy of Aingeal, a wireless in-hospital patient monitor, in measuring respiration rate via impedance pneumography. Healthy volunteers were invited to simultaneously wear a CE Marked Aingeal vital signs monitor and a capnograph, the current gold standard in respiration rate measurement. During the test, participants were asked to undergo a series of defined breathing protocols which included normal breathing, paced breathing between 8-23 breaths per minute (bpm) and a recovery period following moderate exercise. Statistical analysis of the data collected shows a mean difference of -0.73, a standard deviation of 1.61, limits of agreement of -3.88 and +2.42 bpm and a P-value of 0.22. This testing demonstrates comparable performance of the Aingeal device in measuring respiration rate with a well-accepted and widely used alternative method.
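The limits of agreement quoted above are consistent with a standard Bland–Altman calculation (bias ± 1.96 × SD: -0.73 ± 1.96 × 1.61 ≈ -3.89 and +2.43 bpm, matching the reported -3.88 and +2.42 up to rounding). As an illustrative sketch only, the Python snippet below performs that calculation on hypothetical paired readings; the data and function name are not from the paper.

import numpy as np

def bland_altman_limits(device_a, device_b):
    # Paired readings from the two devices (hypothetical values, in bpm).
    a = np.asarray(device_a, dtype=float)
    b = np.asarray(device_b, dtype=float)
    diffs = a - b                  # per-measurement differences
    bias = diffs.mean()            # mean difference ("bias")
    sd = diffs.std(ddof=1)         # sample standard deviation of the differences
    # 95% limits of agreement: bias +/- 1.96 * SD
    return bias, sd, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired respiration-rate readings (breaths per minute)
aingeal_bpm    = [12.0, 14.5, 16.0, 22.5, 9.0, 18.5]
capnograph_bpm = [13.0, 15.0, 16.5, 23.0, 10.5, 18.0]
print(bland_altman_limits(aingeal_bpm, capnograph_bpm))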

17 citations


Patent
18 Jan 2013
TL;DR: In this article, the authors present a system for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object.
Abstract: Methods and system for controlling a computing-based device using both input received from a traditional input device (e.g. keyboard) and hand gestures made on or near a reference object (e.g. keyboard). In some examples, the hand gestures may comprise one or more hand touch gestures and/or one or more hand air gestures.
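As an illustration only (not the patented method), the sketch below shows one way a device might route events from a conventional keyboard alongside touch and in-air hand gestures made near it; all class names and event names are hypothetical.

from dataclasses import dataclass

@dataclass
class KeyEvent:
    key: str        # e.g. a key chord from the traditional input device

@dataclass
class GestureEvent:
    kind: str       # "touch" (on the reference object) or "air" (near it)
    name: str       # e.g. "swipe_left", "pinch"

def dispatch(event):
    # Route both kinds of input to device commands; a real system would
    # fuse the two streams (e.g. a gesture modifying the last key chord).
    if isinstance(event, KeyEvent):
        return f"keyboard command: {event.key}"
    if isinstance(event, GestureEvent):
        return f"{event.kind} gesture command: {event.name}"
    raise TypeError("unknown input event")

for e in (KeyEvent("ctrl+c"), GestureEvent("air", "swipe_left")):
    print(dispatch(e))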

13 citations


Book ChapterDOI
01 Sep 2013
TL;DR: A study of the felt-life of engineers, suggesting it consists of a form of digital dwelling that is profoundly emotional though played out in reference to ideas of moral propriety and ethics; this view is contrasted with process and 'scientific' approaches to the human agent in SE and with the more humanistic studies of SE reasoning common in CSCW.
Abstract: The organizational and social aspects of software engineering (SE) are now increasingly well investigated. This paper proposes that there are a number of approaches taken in research that can be distinguished not by their method or topic but by the different views they construct of the human agent acting in SE. These views have implications for the pragmatic outcome of the research, such as whether systems design suggestions are made, proposals offered for the development of practical reasoning tools, or claims made about the effect of Social Network Systems on engineers' sociability. This paper suggests that these studies tend to underemphasize the felt-life of engineers, a felt-life that is profoundly emotional though played out in reference to ideas of moral propriety and ethics. This paper will present a study of this felt-life, suggesting it consists of a form of digital dwelling. The perspective this view affords is contrasted with process and 'scientific' approaches to the human agent in SE, and with the more humanistic studies of SE reasoning common in CSCW.

11 citations


Journal ArticleDOI
19 Jul 2013-Science
TL;DR: The feasibility of the recommendation to both peer-review computer code and release it is questioned, and an alternative is proffered: postpublication community review and stronger procedures and facilities for dealing with corrections and retractions of published results.
Abstract: Sliz and Morin question the feasibility of our recommendation to both peer-review computer code and release it, and they proffer an alternative: postpublication community review and stronger procedures and facilities for dealing with corrections and retractions of published results. These are not incompatible. Encouraging the broader scientific community to inspect computer code postpublication would help in identifying scientific errors currently unnoticed in the scientific literature. Improving the process of corrections and retractions would have positive benefits far beyond this issue. However, neither negates the need for prepublication review of code. The scientific publishing process relies on prepublication peer review as a filter for robust results. This is so because, regardless of the strength of processes for dealing with corrections and retractions, putting "the genie back in the bottle" is always going to be a difficult task after a result has been reported in the literature. At a minimum, code needs to be available to reviewers should they choose to scrutinize it. Moreover, prepublication review of code need not necessarily rely on the current review system. Just as English-language editing services have emerged to ensure a minimum standard of accessibility of articles in many major journals, so might software-reviewing services provide a stamp of approval that code actually implements the algorithm reported in a paper. Indeed, in the commercial sector, software escrow providers routinely provide full verification services to companies purchasing (or investing in) business-critical software (e.g., (1)), and the approaches used by such companies might provide pointers for a new model for academic software verification services. Of course, verification of software is just the first essential step in the process, with by far the more challenging issue being software validation. Addressing this issue, together with the equally pressing issue of uncertainty quantification in complex [computational] models, has been the focus of intensive research efforts in other scientific disciplines (2). These efforts might provide a good starting point for equivalent efforts in the life sciences. 1. Iron Mountain, "How verification services fortify your software escrow solution" (Iron Mountain, 2011). 2. National Academies Press, Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification (National Academies Press, Washington, DC, 2012).

Journal ArticleDOI
Richard Harper
TL;DR: In this article, a key theme in mobile communications research is the idea that people are suffering from communications overload, and the difficulties that will need to be addressed when this occurs are noted.
Abstract: A key theme in mobile communications research is the idea that people are suffering from communications overload. This essay remarks on what that term might mean and how it ought to be addressed when viewed from the perspective of mobile phone research. It will argue that its use in everyday life is rich and complex, and that this use ought to be a research topic in its own right. Some of the difficulties that will need to be addressed when this occurs are noted.

01 Jun 2013
TL;DR: In this article, the authors explore the impact of the internet on the density and duration of friendship and conclude that the internet is likely to increase the stability of social ties, affirming the importance of examining the dynamics of social processes, and the usefulness of agent-based modelling as a technique for investigating processual phenomena.
Abstract: This paper reports the use of agent-based modelling to explore the impact of the internet on the density and duration of friendship. It uses data from the pre-internet era to validate a model that is then used to compare the dynamic process of friendship with and without the internet. It concludes that the internet is likely to increase the stability of social ties. The paper also affirms the importance of examining the dynamics of social processes, and the usefulness of agent-based modelling as a technique for investigating processual phenomena.
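The paper's actual model is not reproduced in the abstract; purely as an illustrative sketch of the agent-based approach it describes, the toy model below forms ties on random encounters and lets them decay, with an 'internet' condition that halves the decay rate so that ties persist longer. All parameters, names, and values are hypothetical.

import random

def simulate_friendships(n_agents=100, steps=2000, p_meet=0.2,
                         p_decay=0.02, internet=False, seed=1):
    # Toy agent-based model: ties form when random pairs of agents meet,
    # and each existing tie may lapse at every step; the 'internet'
    # condition halves the decay rate, so ties are more stable.
    rng = random.Random(seed)
    decay = p_decay * (0.5 if internet else 1.0)
    ties = set()
    for _ in range(steps):
        a, b = rng.sample(range(n_agents), 2)   # a random encounter
        if rng.random() < p_meet:
            ties.add(frozenset((a, b)))         # tie formation
        ties = {t for t in ties if rng.random() >= decay}  # tie decay
    return len(ties)

print("without internet:", simulate_friendships(internet=False))
print("with internet:   ", simulate_friendships(internet=True))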

01 Feb 2013
TL;DR: This paper analyzes the I/O and network behavior of a large class of home, personal and enterprise applications to find that users and application developers increasingly have to deal with a de facto distributed system of specialized storage containers/file systems, each exposing complex data structures, and each having different naming and metadata conventions.
Abstract: This paper analyzes the I/O and network behavior of a large class of home, personal and enterprise applications. Through user studies and measurements, we find that users and application developers increasingly have to deal with a de facto distributed system of specialized storage containers/file systems, each exposing complex data structures, and each having different naming and metadata conventions, caching and prefetching strategies and transactional properties. Two broad dichotomies emerge from this. First, there is tension between the traditional local file system and cloud storage containers. Local file systems have high performance, but they lack support for rich data structures, like graphs, that other storage containers provide. Second, distinct cloud storage containers provide different operational semantics and data structures. Transferring data between these containers is often lossy leading to added data management complexity for users and developers. We believe our analysis directly impacts the way users understand their data, designers build and evaluate the success of future storage systems and application developers program to APIs provided by the storage systems.
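To make the abstract's point concrete, here is a minimal sketch (assumed names and semantics, not an API from the paper) of the kind of 'de facto distributed system' it describes: the same user data handled by a local file system and by a cloud-style container with different naming and metadata conventions, so that moving data between them loses structure.

import json
import os
import time

class LocalFiles:
    # Local file system: fast, path-named, but stores flat byte streams.
    def __init__(self, root):
        self.root = root
    def put(self, name, data: bytes):
        path = os.path.join(self.root, name)
        with open(path, "wb") as f:
            f.write(data)
        return path                              # naming convention: filesystem path
    def metadata(self, name):
        st = os.stat(os.path.join(self.root, name))
        return {"size": st.st_size, "mtime": st.st_mtime}

class CloudContainer:
    # Hypothetical cloud container: structured objects with its own naming
    # (opaque string keys) and metadata (tags, revisions) conventions.
    def __init__(self):
        self.objects = {}
    def put(self, key, obj: dict, tags=None):
        self.objects[key] = {"body": obj, "tags": tags or [],
                             "revision": time.time()}
        return key                               # naming convention: opaque key
    def metadata(self, key):
        rec = self.objects[key]
        return {"tags": rec["tags"], "revision": rec["revision"]}

# Transfer between containers is lossy: the graph-like links and tags survive
# only as serialized bytes once the object lands on the local file system.
cloud = CloudContainer()
cloud.put("note/42", {"text": "hello", "links": ["note/7"]}, tags=["draft"])
local = LocalFiles(".")
local.put("note-42.json", json.dumps(cloud.objects["note/42"]["body"]).encode())
print(cloud.metadata("note/42"), local.metadata("note-42.json"))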