
Showing papers in "Communications of the ACM" in 2015


Journal ArticleDOI
TL;DR: Accelerating scientific discovery and engineering innovation requires unifying the traditionally separated worlds of high-performance computing and big data analytics.
Abstract: Scientific discovery and engineering innovation requires unifying traditionally separated high-performance computing and big data analytics.

373 citations


Journal ArticleDOI
TL;DR: AI has seen great advances of many kinds recently, but there is one critical area where progress has been extremely slow: ordinary commonsense.
Abstract: AI has seen great advances of many kinds recently, but there is one critical area where progress has been extremely slow: ordinary commonsense.

362 citations


Journal ArticleDOI
TL;DR: Engineers use TLA+ to prevent serious but subtle bugs from reaching production.
Abstract: Engineers use TLA+ to prevent serious but subtle bugs from reaching production.

283 citations


Journal ArticleDOI
TL;DR: Theory on passwords has lagged practice, where large providers use back-end smarts to survive with imperfect technology.
Abstract: Theory on passwords has lagged practice, where large providers use back-end smarts to survive with imperfect technology.

213 citations


Journal ArticleDOI
TL;DR: Static program analysis is a key component of many software development tools, including compilers, development environments, and verification tools; such analyses are often expected to be sound, meaning their results model all possible executions of the program under analysis.
Abstract: Static program analysis is a key component of many software development tools, including compilers, development environments, and verification tools. Practical applications of static analysis have grown in recent years to include tools by companies such as Coverity, Fortify, GrammaTech, IBM, and others. Analyses are often expected to be sound in that their result models all possible executions of the program under analysis. Soundness implies that the analysis computes an overapproximation in order to stay tractable; the analysis result will also model behaviors that do not actually occur in any program execution. The precision of an analysis is the degree to which it avoids such spurious results. Users expect analyses to be sound as a matter of course, and desire analyses to be as precise as possible, while being able to scale to large programs.
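For illustration only, here is a minimal sketch (not taken from the article, and unrelated to the commercial tools it names) of what "sound but imprecise" means: a toy sign analysis whose abstract addition over-approximates, so its answer covers every real execution but also includes spurious signs.

```python
# A minimal sketch of a sound-but-imprecise analysis: a sign analysis over
# the abstract domain of subsets of {'-', '0', '+'}.
NEG, ZERO, POS = '-', '0', '+'

def sign_of(n):
    """Abstraction of a concrete integer."""
    return frozenset({NEG if n < 0 else ZERO if n == 0 else POS})

def abstract_add(a, b):
    """Over-approximate the signs that x + y can take, given the signs of x and y."""
    out = set()
    for x in a:
        for y in b:
            if x == ZERO:
                out.add(y)
            elif y == ZERO:
                out.add(x)
            elif x == y:
                out.add(x)                   # (+)+(+) = +, (-)+(-) = -
            else:
                out |= {NEG, ZERO, POS}      # (+)+(-) is unknown: over-approximate
    return frozenset(out)

# Concretely 5 + (-5) == 0, but the analysis answers {'-', '0', '+'}:
# sound (the true sign '0' is included) yet imprecise (spurious '-' and '+').
print(sorted(abstract_add(sign_of(5), sign_of(-5))))
```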

196 citations


Journal ArticleDOI
TL;DR: From theoretical possibility to near practicality.
Abstract: From theoretical possibility to near practicality.

178 citations


Journal ArticleDOI
TL;DR: Connecting mathematical logic and computation, it ensures that some aspects of programming are absolute.
Abstract: Connecting mathematical logic and computation, it ensures that some aspects of programming are absolute.

172 citations


Journal ArticleDOI
TL;DR: This framework addresses the environmental dimension of software performance, as applied here by a paper mill and a car-sharing service.
Abstract: This framework addresses the environmental dimension of software performance, as applied here by a paper mill and a car-sharing service.

145 citations


Journal ArticleDOI
TL;DR: Implantable devices, often dependent on software, save countless lives, but how secure are they?
Abstract: Implantable devices, often dependent on software, save countless lives. But how secure are they?

145 citations


Journal ArticleDOI
TL;DR: Inductive programming can liberate users from performing tedious and repetitive tasks.
Abstract: © Gulwani, S. et al. | ACM 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in Communications of the ACM, http://dx.doi.org/10.1145/2736282

143 citations


Journal ArticleDOI
TL;DR: This paper shows state-of-the-art edge-aware processing using standard Laplacian pyramids, and proposes a set of image filters to achieve edge-preserving smoothing, detail enhancement, tone mapping, and inverse tone mapping.
Abstract: The Laplacian pyramid is ubiquitous for decomposing images into multiple scales and is widely used for image analysis. However, because it is constructed with spatially invariant Gaussian kernels, the Laplacian pyramid is widely believed to be ill-suited for representing edges, as well as for edge-aware operations such as edge-preserving smoothing and tone mapping. To tackle these tasks, a wealth of alternative techniques and representations have been proposed, for example, anisotropic diffusion, neighborhood filtering, and specialized wavelet bases. While these methods have demonstrated successful results, they come at the price of additional complexity, often accompanied by higher computational cost or the need to postprocess the generated results. In this paper, we show state-of-the-art edge-aware processing using standard Laplacian pyramids. We characterize edges with a simple threshold on pixel values that allow us to differentiate large-scale edges from small-scale details. Building upon this result, we propose a set of image filters to achieve edge-preserving smoothing, detail enhancement, tone mapping, and inverse tone mapping. The advantage of our approach is its simplicity and flexibility, relying only on simple point-wise nonlinearities and small Gaussian convolutions; no optimization or postprocessing is required. As we demonstrate, our method produces consistently high-quality results, without degrading edges or introducing halos.
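As a rough, hedged illustration of applying a simple point-wise nonlinearity to Laplacian coefficients, the sketch below uses a Laplacian stack (no subsampling) rather than a true pyramid to stay short; the threshold and gain values are arbitrary placeholders, not the paper's remapping functions.

```python
# Simplified sketch of threshold-based detail manipulation on a Laplacian
# decomposition (a stack rather than a subsampled pyramid, to keep it short).
# Assumes a grayscale float image in [0, 1]; threshold/gain are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def laplacian_stack(img, sigmas=(1, 2, 4, 8)):
    blurred = [img] + [gaussian_filter(img, s) for s in sigmas]
    levels = [a - b for a, b in zip(blurred[:-1], blurred[1:])]  # band-pass levels
    return levels, blurred[-1]                                   # plus low-pass residual

def enhance_details(img, threshold=0.05, gain=2.0):
    levels, residual = laplacian_stack(img)
    out = residual
    for lap in levels:
        is_detail = np.abs(lap) < threshold                # small coefficients = fine detail
        out = out + np.where(is_detail, gain * lap, lap)   # boost detail, keep large edges intact
    return np.clip(out, 0.0, 1.0)

# Example on a synthetic image: a sharp step edge plus faint texture.
img = np.tile(np.linspace(0, 1, 128), (128, 1))
img[:, 64:] += 0.3
img += 0.02 * np.random.default_rng(0).standard_normal(img.shape)
result = enhance_details(np.clip(img, 0, 1))
```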

Journal ArticleDOI
TL;DR: By the end of 2013, about five years after its initial launch, Bitcoin has exceeded everyone’s expectations as its value rose beyond the $1,000 mark, making laszlo's spent bitcoins worth millions of dollars.
Abstract: “I just want to report that I successfully traded 10,000 bitcoins for pizza,” wrote user laszlo on the Bitcoin forums in May 2010, reporting on what has been recognized as the first item in history to be purchased with bitcoins. By the end of 2013, about five years after its initial launch, Bitcoin had exceeded everyone’s expectations as its value rose beyond the $1,000 mark, making laszlo’s spent bitcoins worth millions of dollars. This meteoric rise in value has fueled many stories in the popular press and has turned a group of early enthusiasts into millionaires. Stories of Bitcoin’s mysterious creator, Satoshi Nakamoto, and of illegal markets hidden in the darknet have added to the hype. But what is Bitcoin’s innovation? Is the buzz surrounding the new cryptocurrency justified, or will it turn out to be a modern tulip mania? To truly evaluate Bitcoin’s novelty, its potential impact, and the challenges it faces, we must look past the hype and delve deeper into the details of the protocol. Bitcoin, a peer-to-peer digital cryptocurrency launched in 2009, has been slowly growing. Nakamoto described the protocol in a white paper published in late 2008 and released the software as an open source project, which has since been maintained by a large number of developers, most of them volunteers. Bitcoin’s network and its surrounding ecosystem have grown quite substantially since its initial release. Its dollar value, which most will admit is largely based on speculation on its future worth, has been extremely volatile. The currency has gone through several hype-driven bubbles and subsequent devaluations, attaining higher values each time. Bitcoin’s promise is mainly a result of the combination of features it bundles together: it is a purely digital currency allowing payments to be sent almost instantly over the Internet with extremely low fees. Like cash, it is nearly anonymous, and transactions are effectively irreversible once they are committed.
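To give one concrete flavor of "the details of the protocol," the sketch below shows a toy version of Bitcoin-style proof-of-work mining. It is an illustration only: real block headers have a fixed 80-byte binary layout and a compact difficulty encoding, neither of which is modeled here.

```python
# Toy proof-of-work sketch (illustrative only; not the real block format).
import hashlib, json, time

def block_hash(header: dict) -> str:
    data = json.dumps(header, sort_keys=True).encode()
    return hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()   # double SHA-256

def mine(prev_hash: str, merkle_root: str, difficulty_zeros: int = 4) -> dict:
    header = {"prev": prev_hash, "merkle_root": merkle_root,
              "time": int(time.time()), "nonce": 0}
    target = "0" * difficulty_zeros
    while not block_hash(header).startswith(target):
        header["nonce"] += 1   # the brute-force search is what makes rewriting history costly
    return header

block = mine("00" * 32, "ab" * 32)
print(block["nonce"], block_hash(block))
```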

Journal ArticleDOI
TL;DR: Formal executable models enable systematic evaluation of system designs prior to implementation and deployment.
Abstract: Formal executable models enable systematic evaluation of system designs prior to implementation and deployment.

Journal ArticleDOI
TL;DR: This paper presents Soylent, a word processing interface that enables writers to call on Mechanical Turk workers to shorten, proofread, and otherwise edit parts of their documents on demand, and the Find-Fix-Verify crowd programming pattern, which splits tasks into a series of generation and review stages.
Abstract: This paper introduces architectural and interaction patterns for integrating crowdsourced human contributions directly into user interfaces. We focus on writing and editing, complex endeavors that span many levels of conceptual and pragmatic activity. Authoring tools offer help with pragmatics, but for higher-level help, writers commonly turn to other people. We thus present Soylent, a word processing interface that enables writers to call on Mechanical Turk workers to shorten, proofread, and otherwise edit parts of their documents on demand. To improve worker quality, we introduce the Find-Fix-Verify crowd programming pattern, which splits tasks into a series of generation and review stages. Evaluation studies demonstrate the feasibility of crowdsourced editing and investigate questions of reliability, cost, wait time, and work time for edits.
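A compressed sketch of the Find-Fix-Verify pattern is shown below. The ask_workers callable is a hypothetical stand-in for posting a task to a crowd platform; it is not Soylent's actual interface, and the 20% agreement cutoff is an illustrative choice.

```python
# Sketch of Find-Fix-Verify: independent Find, Fix, and Verify worker pools.
# ask_workers(prompt, n) -> list of n answers is a hypothetical crowd API.
from collections import Counter

def find_fix_verify(paragraph, ask_workers, n_find=10, n_fix=5, n_verify=5):
    # Find: workers flag spans needing an edit; keep spans flagged by >= 20% of them.
    flagged = ask_workers("Mark a span that needs editing:\n" + paragraph, n_find)
    spans = [s for s, c in Counter(flagged).items() if c >= 0.2 * n_find]
    for span in spans:
        # Fix: a separate pool proposes candidate rewrites of each flagged span.
        fixes = ask_workers("Rewrite this span: " + span, n_fix)
        # Verify: a third pool votes on the candidates; the majority answer wins.
        votes = ask_workers("Pick the best rewrite: " + " | ".join(fixes), n_verify)
        paragraph = paragraph.replace(span, Counter(votes).most_common(1)[0][0])
    return paragraph

# Toy run with a canned "crowd" so the sketch executes end to end.
canned = iter([["is very unique"] * 10, ["is unique"] * 5, ["is unique"] * 5])
print(find_fix_verify("This sentence is very unique.", lambda prompt, n: next(canned)))
```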

Journal ArticleDOI
TL;DR: The Convolution Engine is presented: a programmable processor specialized for the convolution-like data-flow prevalent in computational photography, computer vision, and video processing. It achieves energy efficiency by capturing data-reuse patterns, eliminating data transfer overheads, and enabling a large number of operations per memory access.
Abstract: General-purpose processors, while tremendously versatile, pay a huge cost for their flexibility by wasting over 99% of the energy in programmability overheads. We observe that reducing this waste requires tuning data storage and compute structures and their connectivity to the data-flow and data-locality patterns in the algorithms. Hence, by backing off from full programmability and instead targeting key data-flow patterns used in a domain, we can create efficient engines that can be programmed and reused across a wide range of applications within that domain. We present the Convolution Engine (CE), a programmable processor specialized for the convolution-like data-flow prevalent in computational photography, computer vision, and video processing. The CE achieves energy efficiency by capturing data-reuse patterns, eliminating data transfer overheads, and enabling a large number of operations per memory access. We demonstrate that the CE is within a factor of 2-3× of the energy and area efficiency of custom units optimized for a single kernel. The CE improves energy and area efficiency by 8-15× over data-parallel Single Instruction Multiple Data (SIMD) engines for most image processing applications.
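The energy argument hinges on how much arithmetic a convolution can do per value fetched from memory. The sketch below is purely illustrative and unrelated to the CE's actual ISA: it counts multiply-accumulates per load for a 1-D filter whose window lives in a small shift register.

```python
# Illustrative count of arithmetic per memory access in a convolution-style
# data-flow: each input is loaded once and reused k times from a shift register.
from collections import deque

def fir_with_shift_register(signal, taps):
    k = len(taps)
    window = deque([0.0] * k, maxlen=k)       # models a k-entry local register file
    loads = macs = 0
    out = []
    for x in signal:                          # one "memory" load per input sample
        loads += 1
        window.append(x)
        acc = 0.0
        for w, v in zip(taps, window):        # k multiply-accumulates per load
            acc += w * v
            macs += 1
        out.append(acc)
    print(f"{macs} MACs / {loads} loads = {macs / loads:.0f} ops per memory access")
    return out

fir_with_shift_register([1, 2, 3, 4, 5, 6, 7, 8], taps=[0.25, 0.5, 0.25])
```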

Journal ArticleDOI
TL;DR: The authors present a domain-specific verification tool for packet-processing software and apply it to Click, verifying that a network device behaves and performs as expected, for example, does not crash or enter an infinite loop.
Abstract: The industry is in the mood for programmable networks, where an operator can dynamically deploy network functions on network devices, akin to how one deploys virtual machines on physical machines in a cloud environment. Such flexibility brings along the threat of unpredictable behavior and performance. What are the minimum restrictions that we need to impose on network functionality such that we are able to verify that a network device behaves and performs as expected, for example, does not crash or enter an infinite loop? We present the result of working iteratively on two tasks: designing a domain-specific verification tool for packet-processing software, while trying to identify a minimal set of restrictions that packet-processing software must satisfy in order to be verification-friendly. Our main insight is that packet-processing software is a good candidate for domain-specific verification, for example, because it typically consists of distinct pieces of code that share limited mutable state; we can leverage this and other properties to sidestep fundamental verification challenges. We apply our ideas on Click packet-processing software; we perform complete and sound verification of an IP router and two simple middleboxes within tens of minutes, whereas a state-of-the-art general-purpose tool fails to complete the same task within several hours.
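The property being leveraged, distinct pieces of code that share limited mutable state, can be pictured with the toy pipeline below. This is not Click's C++ API, just a hypothetical illustration of packet-processing elements whose only mutable state is local to each element.

```python
# Toy packet-processing pipeline: each element keeps only local state,
# so each can be reasoned about (and verified) largely in isolation.
class DecTTL:
    def process(self, pkt):
        pkt["ttl"] -= 1
        return pkt if pkt["ttl"] > 0 else None     # drop expired packets

class CountPackets:
    def __init__(self):
        self.seen = 0                              # the element's only mutable state
    def process(self, pkt):
        self.seen += 1
        return pkt

def run_pipeline(elements, packets):
    delivered = []
    for pkt in packets:
        for element in elements:
            pkt = element.process(pkt)
            if pkt is None:
                break                              # packet was dropped mid-pipeline
        else:
            delivered.append(pkt)
    return delivered

counter = CountPackets()
print(run_pipeline([DecTTL(), counter], [{"ttl": 3}, {"ttl": 1}]), counter.seen)
```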

Journal ArticleDOI
TL;DR: It is shown that geographically representative image elements can be discovered automatically from Google Street View imagery in a discriminative manner and it is demonstrated that these elements are visually interpretable and perceptually geo-informative.
Abstract: Given a large repository of geo-tagged imagery, we seek to automatically find visual elements, for example windows, balconies, and street signs, that are most distinctive for a certain geo-spatial area, for example the city of Paris. This is a tremendously difficult task as the visual features distinguishing architectural elements of different places can be very subtle. In addition, we face a hard search problem: given all possible patches in all images, which of them are both frequently occurring and geographically informative? To address these issues, we propose to use a discriminative clustering approach able to take into account the weak geographic supervision. We show that geographically representative image elements can be discovered automatically from Google Street View imagery in a discriminative manner. We demonstrate that these elements are visually interpretable and perceptually geo-informative. The discovered visual elements can also support a variety of computational geography tasks, such as mapping architectural correspondences and influences within and across cities, finding representative elements at different geo-spatial scales, and geographically informed image retrieval.
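A heavily condensed sketch of the weakly supervised, discriminative mining step is given below. It assumes pre-computed patch descriptors (e.g., HOG vectors) and uses scikit-learn's LinearSVC; the actual method alternates clustering and detector training with cross-validation, which this sketch omits.

```python
# Condensed sketch of discriminative element mining with weak (city-level)
# supervision; paris_patches / elsewhere_patches are (n, d) descriptor arrays.
import numpy as np
from sklearn.svm import LinearSVC

def mine_discriminative_patches(paris_patches, elsewhere_patches, rounds=3, top_k=50):
    members = np.arange(len(paris_patches))          # start from all Paris patches
    clf = LinearSVC(C=0.1)
    for _ in range(rounds):
        X = np.vstack([paris_patches[members], elsewhere_patches])
        y = np.r_[np.ones(len(members)), np.zeros(len(elsewhere_patches))]
        clf.fit(X, y)                                # detector: Paris vs. everywhere else
        scores = clf.decision_function(paris_patches)
        members = np.argsort(-scores)[:top_k]        # keep the most geo-informative patches
    return members, clf

# Toy run on random descriptors, just to show the mechanics.
rng = np.random.default_rng(0)
idx, detector = mine_discriminative_patches(rng.normal(0.5, 1, (200, 64)),
                                            rng.normal(0.0, 1, (400, 64)))
```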

Journal ArticleDOI
TL;DR: Developers first need compelling incentives and committed management.
Abstract: Developers first need compelling incentives and committed management.

Journal ArticleDOI
TL;DR: A deep, fine-grain analysis of rhetorical structure highlights crucial sentiment-carrying text segments.
Abstract: A deep, fine-grain analysis of rhetorical structure highlights crucial sentiment-carrying text segments.

Journal ArticleDOI
TL;DR: Open-universe probability models show merit in unifying efforts.
Abstract: Open-universe probability models show merit in unifying efforts.

Journal ArticleDOI
TL;DR: Knowing where you are in space and time promises a deeper understanding of neighbors, ecosystems, and the environment.
Abstract: Knowing where you are in space and time promises a deeper understanding of neighbors, ecosystems, and the environment.

Journal ArticleDOI
TL;DR: A revealing picture of how personal health information searches become the property of private corporations.
Abstract: A revealing picture of how personal health information searches become the property of private corporations.

Journal ArticleDOI
TL;DR: It is demonstrated that one can contain the otherwise uncontrolled growth of the Ninja gap and offer a more stable and predictable performance growth over future architectures, offering strong evidence that radical language changes are not required.
Abstract: Current processor trends of integrating more cores with wider SIMD units, along with a deeper and complex memory hierarchy, have made it increasingly more challenging to extract performance from applications. It is believed by some that traditional approaches to programming do not apply to these modern processors and hence radical new languages must be discovered. In this paper, we question this thinking and offer evidence in support of traditional programming methods and the performance-vs-programming effort effectiveness of common multi-core processors and upcoming many-core architectures in delivering significant speedup, and close-to-optimal performance for commonly used parallel computing workloads. We first quantify the extent of the "Ninja gap", which is the performance gap between naively written C/C++ code that is parallelism unaware (often serial) and best-optimized code on modern multi-/many-core processors. Using a set of representative throughput computing benchmarks, we show that there is an average Ninja gap of 24X (up to 53X) for a recent 6-core Intel® Core™ i7 X980 Westmere CPU, and that this gap if left unaddressed will inevitably increase. We show how a set of well-known algorithmic changes coupled with advancements in modern compiler technology can bring down the Ninja gap to an average of just 1.3X. These changes typically require low programming effort, as compared to the very high effort in producing Ninja code. We also discuss hardware support for programmability that can reduce the impact of these changes and even further increase programmer productivity. We show equally encouraging results for the upcoming Intel® Many Integrated Core architecture (Intel® MIC) which has more cores and wider SIMD. We thus demonstrate that we can contain the otherwise uncontrolled growth of the Ninja gap and offer a more stable and predictable performance growth over future architectures, offering strong evidence that radical language changes are not required.

Journal ArticleDOI
TL;DR: This paper presents the design and roadmap of NoDB, a new paradigm in database systems that requires no data loading while still maintaining the full feature set of a modern database system, bringing an unprecedented positive effect in usability and performance.
Abstract: As data collections become larger and larger, users are faced with increasing bottlenecks in their data analysis. More data means more time to prepare and to load the data into the database before executing the desired queries. Many applications already avoid using database systems, for example, scientific data analysis and social networks, due to the complexity and the increased data-to-query time, that is, the time between getting the data and retrieving its first useful results. For many applications data collections keep growing fast, even on a daily basis, and this data deluge will only increase in the future, where it is expected to have much more data than what we can move or store, let alone analyze. We here present the design and roadmap of a new paradigm in database systems, called NoDB, which does not require data loading while still maintaining the whole feature set of a modern database system. In particular, we show how to make raw data files a first-class citizen, fully integrated with the query engine. Through our design and lessons learned by implementing the NoDB philosophy over a modern database management system (DBMS), we discuss the fundamental limitations as well as the strong opportunities that such a research path brings. We identify performance bottlenecks specific for in situ processing, namely the repeated parsing and tokenizing overhead and the expensive data type conversion. To address these problems, we introduce an adaptive indexing mechanism that maintains positional information to provide efficient access to raw data files, together with a flexible caching structure. We conclude that NoDB systems are feasible to design and implement over modern DBMS, bringing an unprecedented positive effect in usability and performance.
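As a toy illustration of the in-situ idea (not the NoDB implementation; the file and column references below are hypothetical), the sketch queries a CSV file in place and builds a positional map of row byte-offsets as a side effect of the first scan, so later point accesses can seek directly instead of re-parsing from the top.

```python
# Toy in-situ query sketch: no load step; a positional map of byte offsets is
# built lazily during the first scan and reused by later point accesses.
import csv

class RawCsvTable:
    def __init__(self, path):
        self.path = path
        self.offsets = []                    # positional map: row number -> byte offset

    def scan(self):
        """Full scan over the raw file; populates the positional map as it goes."""
        self.offsets = []
        with open(self.path, "rb") as f:
            while True:
                offset = f.tell()
                line = f.readline()
                if not line:
                    break
                self.offsets.append(offset)
                yield next(csv.reader([line.decode()]))

    def fetch(self, row_no):
        """Point access after the first scan: seek directly, parse one line."""
        with open(self.path, "rb") as f:
            f.seek(self.offsets[row_no])
            return next(csv.reader([f.readline().decode()]))

# Hypothetical usage:
# table = RawCsvTable("measurements.csv")
# hot_rows = [row for row in table.scan() if float(row[2]) > 10.0]
# print(table.fetch(42))    # served via the positional map, no re-parsing
```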

Journal ArticleDOI
TL;DR: The connection between online communication and psychological well-being depends on whom you are communicating with.
Abstract: The connection between online communication and psychological well-being depends on whom you are communicating with.

Journal ArticleDOI
TL;DR: Everyone should be able to manage their personal data with a personal information management system.
Abstract: Everyone should be able to manage their personal data with a personal information management system.

Journal ArticleDOI
TL;DR: Research, leadership, and communication about AI futures.
Abstract: Research, leadership, and communication about AI futures.

Journal ArticleDOI
TL;DR: Business leaders may bemoan the burdens of governing IT, but the alternative could be much worse.
Abstract: Business leaders may bemoan the burdens of governing IT, but the alternative could be much worse.

Journal ArticleDOI
TL;DR: Sharing experiences running artifact evaluation committees for five major conferences.
Abstract: Sharing experiences running artifact evaluation committees for five major conferences.

Journal ArticleDOI
TL;DR: The Quipper language offers a unified general-purpose programming framework for quantum computation.
Abstract: The Quipper language offers a unified general-purpose programming framework for quantum computation.