
Showing papers in "Communications of The ACM in 2011"


Journal ArticleDOI
TL;DR: The brain's electrical signals enable people without muscle control to physically interact with the world.
Abstract: The brain's electrical signals enable people without muscle control to physically interact with the world.

2,361 citations


Journal ArticleDOI
TL;DR: A system that can match and reconstruct 3D scenes from extremely large collections of photographs, such as those found by searching for a given city on Internet photo-sharing sites, designed to scale gracefully with both the size of the problem and the amount of available computation.
Abstract: We present a system that can reconstruct 3D geometry from large, unorganized collections of photographs such as those found by searching for a given city (e.g., Rome) on Internet photo-sharing sites. Our system is built on a set of new, distributed computer vision algorithms for image matching and 3D reconstruction, designed to maximize parallelism at each stage of the pipeline and to scale gracefully with both the size of the problem and the amount of available computation. Our experimental results demonstrate that it is now possible to reconstruct city-scale image collections with more than a hundred thousand images in less than a day.
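
The pipeline's first stage is, at heart, embarrassingly parallel pairwise image matching, which is what lets the system scale with available computation. A toy sketch of that structure in Python, with a hypothetical match_features stand-in for a real feature matcher (the authors' system is far more elaborate, with candidate-pair selection and distributed workers):

```python
# Toy sketch of the parallel image-matching stage: candidate pairs are
# matched independently across workers, and surviving pairs form the match
# graph that 3D reconstruction consumes. `match_features` is a hypothetical
# stand-in for a real matcher (e.g., local features + geometric verification).
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def match_features(pair):
    img_a, img_b = pair
    # ... extract and compare local features; in this toy version every
    # pair "matches", so the placeholder simply returns it ...
    return pair

def build_match_graph(images, workers=8):
    pairs = list(combinations(images, 2))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return [m for m in pool.map(match_features, pairs) if m is not None]

if __name__ == "__main__":
    print(build_match_graph(["rome_001.jpg", "rome_002.jpg", "rome_003.jpg"]))
```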

1,307 citations


Journal ArticleDOI
TL;DR: The practice of crowdsourcing is transforming the Web and giving rise to a new field of inquiry.

1,165 citations


Journal ArticleDOI
TL;DR: VL2 is a practical network architecture that scales to support huge data centers with uniform high capacity between servers, performance isolation between services, and Ethernet layer-2 semantics and can be deployed today, and a working prototype is built.
Abstract: To be agile and cost effective, data centers must allow dynamic resource allocation across large server pools. In particular, the data center network should provide a simple flat abstraction: it should be able to take any set of servers anywhere in the data center and give them the illusion that they are plugged into a physically separate, noninterfering Ethernet switch with as many ports as the service needs. To meet this goal, we present VL2, a practical network architecture that scales to support huge data centers with uniform high capacity between servers, performance isolation between services, and Ethernet layer-2 semantics. VL2 uses (1) flat addressing to allow service instances to be placed anywhere in the network, (2) Valiant Load Balancing to spread traffic uniformly across network paths, and (3) end system--based address resolution to scale to large server pools without introducing complexity to the network control plane. VL2's design is driven by detailed measurements of traffic and fault data from a large operational cloud service provider. VL2's implementation leverages proven network technologies, already available at low cost in high-speed hardware implementations, to build a scalable and reliable network architecture. As a result, VL2 networks can be deployed today, and we have built a working prototype. We evaluate the merits of the VL2 design using measurement, analysis, and experiments. Our VL2 prototype shuffles 2.7 TB of data among 75 servers in 395 s---sustaining a rate that is 94% of the maximum possible.
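
The load-spreading idea at VL2's heart is easy to sketch: each flow is bounced off an intermediate switch chosen at random, so any traffic matrix is spread roughly uniformly across all core paths. A toy rendering in Python with hypothetical switch names (VL2's real topology, addressing, and hashing details differ):

```python
import random

# Toy Valiant Load Balancing: route each flow via a random intermediate
# switch. Switch names are illustrative, not VL2's actual Clos topology.
INTERMEDIATE_SWITCHES = ["int-1", "int-2", "int-3", "int-4"]

def vlb_path(src_tor: str, dst_tor: str, flow_id: int) -> list[str]:
    # Seed on the flow id so every packet of a flow takes the same path
    # (avoiding reordering) while distinct flows spread uniformly.
    rng = random.Random(flow_id)
    via = rng.choice(INTERMEDIATE_SWITCHES)
    return [src_tor, via, dst_tor]

print(vlb_path("tor-7", "tor-12", flow_id=42))  # e.g. ['tor-7', 'int-4', 'tor-12']
```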

981 citations


Journal ArticleDOI
TL;DR: Energy efficiency is the new fundamental limiter of processor performance, way beyond numbers of processors.
Abstract: Energy efficiency is the new fundamental limiter of processor performance, way beyond numbers of processors.

920 citations


Journal ArticleDOI
TL;DR: The motivation and key concepts behind answer set programming---a promising approach to declarative problem solving.
Abstract: The motivation and key concepts behind answer set programming---a promising approach to declarative problem solving.

911 citations


Journal ArticleDOI
TL;DR: BI technologies are essential to running today's businesses, and this technology is going through sea changes.
Abstract: BI technologies are essential to running today's businesses and this technology is going through sea changes.

830 citations


Journal ArticleDOI
Cynthia Dwork
TL;DR: The problem of statistical disclosure control, revealing accurate statistics about a set of respondents while preserving the privacy of individuals, has a venerable history, with an extensive literature spanning statistics, theoretical computer science, security, databases, and cryptography as discussed by the authors.
Abstract: In the information realm, loss of privacy is usually associated with failure to control access to information, to control the flow of information, or to control the purposes for which information is employed. Differential privacy arose in a context in which ensuring privacy is a challenge even if all these control problems are solved: privacy-preserving statistical analysis of data. The problem of statistical disclosure control – revealing accurate statistics about a set of respondents while preserving the privacy of individuals – has a venerable history, with an extensive literature spanning statistics, theoretical computer science, security, databases, and cryptography (see, for example, the excellent survey [1], the discussion of related work in [2] and the Journal of Official Statistics 9 (2), dedicated to confidentiality and disclosure control). This long history is a testament to the importance of the problem. Statistical databases can be of enormous social value; they are used for apportioning resources, evaluating medical therapies, understanding the spread of disease, improving economic utility, and informing us about ourselves as a species. The data may be obtained in diverse ways. Some data, such as census, tax, and other sorts of official data, are compelled; others are collected opportunistically, for example, from traffic on the internet, transactions on Amazon, and search engine query logs; other data are provided altruistically, by respondents who hope that sharing their information will help others to avoid a specific misfortune, or more generally, to increase the public good. Altruistic data donors are typically promised their individual data will be kept confidential – in short, they are promised “privacy.” Similarly, medical data and legally compelled data, such as census and tax return data, have legal privacy mandates. In our view, ethics demand that opportunistically obtained data should be treated no differently, especially when there is no reasonable alternative to engaging in the actions that generate the data in question. The problems remain: even if data encryption, key management, access control, and the motives of the data curator…
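
The basic primitive behind differential privacy can be made concrete with a small example. Below is a minimal sketch of the Laplace mechanism in Python; the dataset, query, and parameter values are illustrative only.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a differentially private estimate of a numeric query answer.

    Adding Laplace noise with scale sensitivity/epsilon yields
    epsilon-differential privacy, where `sensitivity` bounds how much the
    answer can change between databases differing in one respondent.
    """
    return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query (sensitivity 1) over a toy dataset.
ages = [34, 45, 29, 61, 50]
true_count = sum(1 for a in ages if a > 40)  # 3
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"true={true_count}, private~{private_count:.2f}")
```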

711 citations


Journal ArticleDOI
TL;DR: Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing.
Abstract: Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing.

641 citations


Journal ArticleDOI
TL;DR: Web apps are cheaper to develop and deploy than native apps, but can they match the native user experience?
Abstract: Web apps are cheaper to develop and deploy than native apps, but can they match the native user experience?

548 citations


Journal ArticleDOI
TL;DR: HiStar is a new operating system designed to minimize the amount of code that must be trusted, which allows users to specify precise data security policies without unduly limiting the structure of applications.
Abstract: HiStar is a new operating system designed to minimize the amount of code that must be trusted. HiStar provides strict information flow control, which allows users to specify precise data security policies without unduly limiting the structure of applications. HiStar's security features make it possible to implement a Unix-like environment with acceptable performance almost entirely in an untrusted user-level library. The system has no notion of superuser and no fully trusted code other than the kernel. HiStar's features permit several novel applications, including privacy-preserving, untrusted virus scanners and a dynamic Web server with only a few thousand lines of trusted code.
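
The flavor of HiStar's information flow control can be suggested with a toy taint-set check: data may flow from one object to another only if the destination carries at least the source's taint. This Python sketch is a deliberate simplification; HiStar's actual label algebra (levels and categories) is richer.

```python
# Simplified label check in the spirit of information flow control: a flow
# is allowed only if the sink is at least as tainted as the source.
def can_flow(src_taints: set[str], dst_taints: set[str]) -> bool:
    return src_taints <= dst_taints

scanner = {"user-files"}   # the untrusted virus scanner has read private data
network = set()            # an untainted public sink

print(can_flow(scanner, network))         # False: the scanner cannot leak
print(can_flow(scanner, {"user-files"}))  # True: flow stays within the taint
```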

Journal ArticleDOI
TL;DR: Checking the satisfiability of logical formulas, SMT solvers scale orders of magnitude beyond custom ad hoc solvers.
Abstract: Checking the satisfiability of logical formulas, SMT solvers scale orders of magnitude beyond custom ad hoc solvers.
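
As a concrete taste of what an SMT solver does, here is a minimal example using the Python bindings of Z3, one widely used SMT solver (the choice of solver and API is ours, not prescribed by the article): the solver decides satisfiability of a formula over the theory of linear integer arithmetic and produces a model.

```python
# Requires the z3-solver package (pip install z3-solver).
from z3 import Ints, Solver, sat

x, y = Ints("x y")
s = Solver()
s.add(x > 0, y > 0, x + 2 * y == 7, x < y)  # linear integer arithmetic
if s.check() == sat:
    print(s.model())  # e.g. [y = 3, x = 1]
else:
    print("unsatisfiable")
```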

Journal ArticleDOI
TL;DR: In this article, the authors describe a family of attacks such that even from a single anonymized copy of a social network, it is possible for an adversary to learn whether edges exist or not between specific targeted pairs of nodes.
Abstract: In a social network, nodes correspond to people or other social entities, and edges correspond to social links between them. In an effort to preserve privacy, the practice of anonymization replaces names with meaningless unique identifiers. We describe a family of attacks such that even from a single anonymized copy of a social network, it is possible for an adversary to learn whether edges exist or not between specific targeted pairs of nodes.

Journal ArticleDOI
TL;DR: Sora combines the performance and fidelity of hardware SDR platforms with the programmability and flexibility of general-purpose processor (GPP) SDR platforms to address the challenges of using PC architectures for high-speed SDR.
Abstract: This paper presents Sora, a fully programmable software radio platform on commodity PC architectures. Sora combines the performance and fidelity of hardware software-defined radio (SDR) platforms with the programmability and flexibility of general-purpose processor (GPP) SDR platforms. Sora uses both hardware and software techniques to address the challenges of using PC architectures for high-speed SDR. The Sora hardware components consist of a radio front-end for reception and transmission, and a radio control board for high-throughput, low-latency data transfer between radio and host memories. Sora makes extensive use of features of contemporary processor architectures to accelerate wireless protocol processing and satisfy protocol timing requirements, including using dedicated CPU cores, large low-latency caches to store lookup tables, and SIMD processor extensions for highly efficient physical layer processing on GPPs. Using the Sora platform, we have developed a few demonstration wireless systems, including SoftWiFi, an 802.11a/b/g implementation that seamlessly interoperates with commercial 802.11 NICs at all modulation rates, and SoftLTE, a 3GPP LTE uplink PHY implementation that supports up to 43.8Mbps data rate.
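
One of the software techniques mentioned, large low-latency lookup tables, can be illustrated in miniature: precompute an operation's results once so the hot path is a cache-friendly table read instead of per-bit computation. The parity table below is our toy illustration, not Sora's code.

```python
# Precompute the parity of every byte once, off the hot path. Parity of
# masked shift registers is the inner loop of convolutional encoding in 802.11.
PARITY = [bin(b).count("1") & 1 for b in range(256)]

def parity16(word: int) -> int:
    # Hot path: two table reads and an XOR instead of a 16-iteration loop.
    return PARITY[word & 0xFF] ^ PARITY[(word >> 8) & 0xFF]

assert parity16(0b1011_0000_0000_0000) == 1  # three bits set -> odd parity
```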

Journal ArticleDOI
TL;DR: The convolutional deep belief network is presented, a hierarchical generative model that scales to realistic image sizes and is translation-invariant and supports efficient bottom-up and top-down probabilistic inference.
Abstract: There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks (DBNs); however, scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the convolutional deep belief network, a hierarchical generative model that scales to realistic image sizes. This model is translation-invariant and supports efficient bottom-up and top-down probabilistic inference. Key to our approach is probabilistic max-pooling, a novel technique that shrinks the representations of higher layers in a probabilistically sound way. Our experiments show that the algorithm learns useful high-level visual features, such as object parts, from unlabeled images of objects and natural scenes. We demonstrate excellent performance on several visual recognition tasks and show that our model can perform hierarchical (bottom-up and top-down) inference over full-sized images.
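
Our reading of probabilistic max-pooling, simplified to a single 2x2 block: the block's detection units compete through a softmax that includes an "all off" outcome, so at most one unit fires, and the pooling unit is on exactly when one does. A rough NumPy sketch under those assumptions (notation and block size are ours):

```python
import numpy as np

def prob_max_pool_block(inputs, rng=None):
    """Sample one pooling block under probabilistic max-pooling.

    `inputs` holds bottom-up activations I(h_i) for the block's detection
    units; P(h_i = 1) = exp(I(h_i)) / (1 + sum_j exp(I(h_j))), with the
    leftover probability mass going to "all units off".
    """
    if rng is None:
        rng = np.random.default_rng()
    logits = np.append(inputs.ravel(), 0.0)   # last slot = "all off"
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    choice = rng.choice(len(probs), p=probs)
    block = np.zeros(inputs.size)
    if choice < inputs.size:                  # one detection unit fired
        block[choice] = 1.0
    return block.reshape(inputs.shape), float(choice < inputs.size)

block, pool_on = prob_max_pool_block(np.array([[0.5, 2.0], [-1.0, 0.3]]))
print(block, pool_on)
```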

Journal ArticleDOI
TL;DR: Mobile advertising will become more pervasive and profitable, but not before key technical and business challenges are addressed.
Abstract: Mobile advertising will become more pervasive and profitable, but not before addressing key technical and business challenges.

Journal ArticleDOI
TL;DR: Methods for evaluating and effectively managing the security behavior of employees are described.
Abstract: Methods for evaluating and effectively managing the security behavior of employees.

Journal ArticleDOI
TL;DR: This research unites neuroscience, supercomputing, and nanotechnology to discover, demonstrate, and deliver the brain's core algorithms.
Abstract: Unite neuroscience, supercomputing, and nanotechnology to discover, demonstrate, and deliver the brain's core algorithms.

Journal ArticleDOI
TL;DR: The state-of-the-art in nano-machines, including architectural aspects, expected features of future nano-machines, and current developments are presented for a better understanding of the nanonetwork scenarios.
Abstract: Nanotechnology is enabling the development of devices at a scale ranging from one to a few hundred nanometers. Nanonetworks, i.e., the interconnection of nano-scale devices, are expected to expand the capabilities of single nano-machines by allowing them to cooperate and share information. Traditional communication technologies are not directly suitable for nanonetworks mainly due to the size and power consumption of existing transmitters, receivers and additional processing components. All these define a new communication paradigm that demands novel solutions such as nano-transceivers, channel models for the nano-scale, and protocols and architectures for nanonetworks. In this talk, first the state-of-the-art in nano-machines, including architectural aspects, expected features of future nano-machines, and current developments are presented for a better understanding of the nanonetwork scenarios. Moreover, nanonetworks features and components are explained and compared with traditional communication networks. Novel nano-antennas based on nano-materials as well as the terahertz band are investigated for electromagnetic communication in nanonetworks. Furthermore, molecular communication mechanisms are presented for short-range networking based on ion signaling and molecular motors, for medium-range networking based on flagellated bacteria and nanorods, as well as for long-range networking based on pheromones and capillaries. Finally, open research challenges such as the development of network components, molecular communication theory, and new architectures and protocols, which need to be solved in order to pave the way for the development and deployment of nanonetworks within the next couple of decades are presented.

Journal ArticleDOI
TL;DR: SLAM is a program-analysis engine used to check if clients of an API follow the API's stateful usage rules.
Abstract: SLAM is a program-analysis engine used to check if clients of an API follow the API's stateful usage rules.
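
The kind of rule SLAM checks can be pictured as a finite-state automaton over API calls; a violation is any call sequence the automaton rejects. The sketch below is a toy runtime monitor for a hypothetical lock API; SLAM itself establishes such properties statically, over all execution paths.

```python
# Toy finite-state monitor for a stateful usage rule: never acquire a held
# lock, never release a free one. Names and the rule are illustrative.
RULE = {
    ("unlocked", "acquire"): "locked",
    ("locked", "release"): "unlocked",
}

def check_trace(calls):
    state = "unlocked"
    for i, call in enumerate(calls):
        state = RULE.get((state, call))
        if state is None:
            return f"violation at call {i}: unexpected {call!r}"
    return "trace obeys the usage rule"

print(check_trace(["acquire", "release", "acquire", "release"]))  # obeys
print(check_trace(["acquire", "acquire"]))                        # violation
```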

Journal ArticleDOI
TL;DR: The roots of Google's PageRank can be traced back to several early, and equally remarkable, ranking techniques.
Abstract: The roots of Google's PageRank can be traced back to several early, and equally remarkable, ranking techniques.
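
For reference, the modern computation whose lineage the article traces can be written as a short power iteration; this is the textbook formulation, not code from the article.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-9):
    """Power iteration for PageRank. adj[i, j] = 1 if page i links to page j.

    Dangling pages (no out-links) are treated as linking to every page,
    a common convention.
    """
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    M = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg, 1)[:, None],
                 1.0 / n).T                      # column-stochastic transitions
    r = np.full(n, 1.0 / n)
    while True:
        r_next = (1 - damping) / n + damping * (M @ r)
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# Four-page toy web: 0->1, 0->2, 1->2, 2->0, 3->2; page 2 ranks highest.
A = np.array([[0, 1, 1, 0], [0, 0, 1, 0], [1, 0, 0, 0], [0, 0, 1, 0]], float)
print(pagerank(A))
```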

Journal ArticleDOI
TL;DR: In this article, the authors explore the factors that may lead to the inability of professionals to adapt or cope with emerging IS in a healthy manner.
Abstract: Exploring the factors that may lead to the inability of professionals to adapt or cope with emerging IS in a healthy manner.

Journal ArticleDOI
TL;DR: Privacy and confidentiality issues in cloud-based conference management systems reflect more universal themes.
Abstract: Privacy and confidentiality issues in cloud-based conference management systems reflect more universal themes.

Journal ArticleDOI
TL;DR: Effective countermeasures depend on first understanding how users naturally fall victim to fraudsters.
Abstract: Effective countermeasures depend on first understanding how users naturally fall victim to fraudsters.

Journal ArticleDOI
TL;DR: With scalable high-performance storage entirely in DRAM, RAMCloud will enable a new breed of data-intensive applications.
Abstract: With scalable high-performance storage entirely in DRAM, RAMCloud will enable a new breed of data-intensive applications.

Journal ArticleDOI
TL;DR: Can a programming language really help programmers write better programs?
Abstract: Spec# is a programming system that puts specifications in the hands of programmers and includes tools that use them. The system includes an object-oriented programming language with specification constructs, a compiler that emits executable code and run-time checks for specifications, a programming methodology that gives rules for structuring programs and for using specifications, and a static program verifier that attempts to mathematically prove the correctness of programs. This paper reflects on the six-year experience of building and using Spec#, the scientific contributions of the project, remaining challenges for tools that seek to establish program correctness, and prospects of incorporating program verification into everyday software engineering.

0. INTRODUCTION: THE SPEC# VISION

Software engineering is the process of authoring software that is to fulfill some worldly needs. It is an expensive endeavor that presents difficulties at all levels. At the top level, the gathering of requirements for the software is usually an exploratory and iterative process. Any change in requirements ripples through all parts of the software artifact and is further complicated by having to maintain previous versions of the software. At the level of the program itself, the interaction between program modules requires an understanding of what is expected and what can be assumed by the interacting modules. At the level of each module, one problem is to maintain the consistency of data structures, let alone remember what it means for a particular data structure to be consistent. At the level of individual operations, the algorithmic details can be tricky to get right.

Two common themes among all of these difficulties are the problem of having useful and accurate documentation and the problem of making sure that programs adhere to documented behavior and do not misuse features of the programming language. Spec# (pronounced "speck sharp") is a research project aimed at addressing these two problems. To combat the first problem, Spec# takes the well-known approach of providing contracts, specification constructs that document behavior. To combat the second problem, Spec# adds automatic tool support, which includes an automatic program verifier.

The Spec# project set out to explore the programmer experience of using specifications all the time and receiving benefit from them. The tool support is intended not just to help ensure program correctness but also, importantly, to lure programmers into recording their design decisions in specifications, knowing that the specifications will not just rot as stale comments. In this paper, we describe the Spec# programming system, along with our initial goals for the project, what we have done with it, how it has already had some impact, and how we now, in retrospect, view some of our design decisions.

1. SONGS OF INNOCENCE

We started the Spec# project in 2003 as an attempt to build a comprehensive program verification system [4]. Our dream was to build a real system that real programmers can use on real programs to do real verification, a system that "the programming masses" could use in their everyday work. We wanted to explore and push the boundaries of specification and verification technology to get closer to realizing these aspirations.

Let us consider in more detail the lay of the land at the time we started the project. Program verification was already several decades old, starting with some formal underpinnings of program semantics and techniques for proving program correctness [13]. Supported by mechanical proof assistants, some early program verifiers were the GYPSY system with the Boyer-Moore prover [0] and the Stanford Pascal Verifier [18]. Later systems, which are still used today, include full-featured proof assistants like PVS [23] and Isabelle/HOL [22]. Another approach that uses verification technology is that of extended static checkers like ESC/Modula-3 [9] and ESC/Java [12]. These tools have been more closely integrated into existing programming languages, and they value automation over expressivity or over mathematical guarantees like finding all errors in a program. The automation is enabled by a breed of combined decision procedures that today is known as Satisfiability Modulo Theories (SMT) solvers [8]. To make them easier and more cost-effective to use, extended static checkers were intentionally designed to be unsound; that is, they may miss certain errors.

Dynamic checking of specifications has always been done by Eiffel [20], which also pioneered the inclusion of specifications in an object-oriented language. The tool suite for the Java Modeling Language (JML) [16] included a facility for dynamic checking of specifications [5]. Eiffel influenced JML, and the strong influence of both of these on Spec# is…
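
The contract idea at Spec#'s core can be shown in miniature: pre- and postconditions attached to a routine and checked mechanically. The Python sketch below only checks contracts at run time, whereas Spec# embeds them in a C#-based language and also verifies them statically; the decorator and example are illustrative, not Spec# syntax.

```python
# Runtime analogue of Spec#-style contracts: a precondition decorator plus
# an in-body postcondition assertion. Spec# checks such contracts both at
# run time and, via its verifier, statically.
def requires(pred, msg="precondition violated"):
    def deco(fn):
        def wrapper(*args, **kwargs):
            assert pred(*args, **kwargs), msg
            return fn(*args, **kwargs)
        return wrapper
    return deco

@requires(lambda x: x >= 0, "x must be non-negative")
def isqrt(x: int) -> int:
    r = int(x ** 0.5)
    assert r * r <= x < (r + 1) * (r + 1), "postcondition violated"
    return r

print(isqrt(10))  # 3; isqrt(-1) would fail the precondition at run time
```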

Journal ArticleDOI
TL;DR: How computer scientists can empower journalists, democracy's watchdogs, in the production of news in the public interest is explored.
Abstract: How computer scientists can empower journalists, democracy's watchdogs, in the production of news in the public interest.

Journal ArticleDOI
Nir Shavit
TL;DR: The advent of multicore processors as the standard computing platform will force major changes in software design.
Abstract: The advent of multicore processors as the standard computing platform will force major changes in software design.

Journal ArticleDOI
TL;DR: This paper explains how to identify, instantiate, and evaluate domain-specific design principles for creating more effective visualizations.
Abstract: How to identify, instantiate, and evaluate domain-specific design principles for creating more effective visualizations.

Journal ArticleDOI
TL;DR: Focusing on socio-technical design with values as a critical component in the design process.
Abstract: Focusing on socio-technical design with values as a critical component in the design process.