
Showing papers by "Carnegie Mellon University" published in 2008


Journal ArticleDOI
TL;DR: The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) at CERN was designed to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1.
Abstract: The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudo-rapidity coverage to high values (|η| <= 5) assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12500 t.

5,193 citations


Journal ArticleDOI
TL;DR: It is explained how, in principle, early warning systems could be established to detect the proximity of some tipping points, and potential policy-relevant tipping elements in the climate system under anthropogenic forcing are critically evaluated.
Abstract: The term "tipping point" commonly refers to a critical threshold at which a tiny perturbation can qualitatively alter the state or development of a system. Here we introduce the term "tipping element" to describe large-scale components of the Earth system that may pass a tipping point. We critically evaluate potential policy-relevant tipping elements in the climate system under anthropogenic forcing, drawing on the pertinent literature and a recent international workshop to compile a short list, and we assess where their tipping points lie. An expert elicitation is used to help rank their sensitivity to global warming and the uncertainty about the underlying physical mechanisms. Then we explain how, in principle, early warning systems could be established to detect the proximity of some tipping points.

2,660 citations


Proceedings Article
13 Jul 2008
TL;DR: A probabilistic approach based on the principle of maximum entropy is developed that provides a well-defined, globally normalized distribution over decision sequences while providing the same performance guarantees as existing methods.
Abstract: Recent research has shown the benefit of framing problems of imitation learning as solutions to Markov Decision Problems. This approach reduces learning to the problem of recovering a utility function that makes the behavior induced by a near-optimal policy closely mimic demonstrated behavior. In this work, we develop a probabilistic approach based on the principle of maximum entropy. Our approach provides a well-defined, globally normalized distribution over decision sequences, while providing the same performance guarantees as existing methods. We develop our technique in the context of modeling real-world navigation and driving behaviors where collected data is inherently noisy and imperfect. Our probabilistic approach enables modeling of route preferences as well as a powerful new approach to inferring destinations and routes based on partial trajectories.
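The core computation behind this maximum-entropy formulation can be illustrated with a toy example. The sketch below (a hypothetical four-step decision problem with invented demonstrations, not the paper's code or data) weights each decision sequence by exp(theta · features) under a single global normalization and fits theta by matching the model's feature expectations to the demonstrated ones, which is the gradient of the log-likelihood.

```python
# Minimal sketch of the maximum-entropy idea on a toy problem (all values hypothetical):
# trajectories tau are weighted P(tau) ∝ exp(theta . f(tau)), and theta is fit so that
# model feature expectations match the demonstrations.
import itertools
import numpy as np

# Toy deterministic "grid walk": 4 steps, each step moves right (0) or up (1).
# Feature vector of a trajectory = (#right, #up).
def features(traj):
    return np.array([traj.count(0), traj.count(1)], dtype=float)

all_trajs = list(itertools.product([0, 1], repeat=4))          # enumerate decision sequences
demo_trajs = [(0, 0, 1, 0), (0, 1, 0, 0), (0, 0, 0, 1)]        # hypothetical demonstrations
f_demo = np.mean([features(t) for t in demo_trajs], axis=0)    # empirical feature expectation

theta = np.zeros(2)
for _ in range(200):                                           # gradient ascent on log-likelihood
    scores = np.array([theta @ features(t) for t in all_trajs])
    p = np.exp(scores - scores.max())
    p /= p.sum()                                               # globally normalized over sequences
    f_model = sum(pi * features(t) for pi, t in zip(p, all_trajs))
    theta += 0.1 * (f_demo - f_model)                          # gradient = empirical - expected features

print("learned reward weights:", theta)
```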

2,479 citations


Journal ArticleDOI
TL;DR: An unusual single crystal structure of a 25-gold-atom cluster protected by eighteen phenylethanethiol ligands is reported, which violates the empirical golden rule "cluster of clusters", and is in good correspondence with time-dependent density functional theory calculations for the observed structure.
Abstract: The total structure determination of thiol-protected Au clusters has long been a major issue in cluster research. Herein, we report an unusual single crystal structure of a 25-gold-atom cluster (1.27 nm diameter, surface-to-surface distance) protected by eighteen phenylethanethiol ligands. The Au25 cluster features a centered icosahedral Au13 core capped by twelve gold atoms that are situated in six pairs around the three mutually perpendicular 2-fold axes of the icosahedron. The thiolate ligands bind to the Au25 core in an exclusive bridging mode. This highly symmetric structure is distinctly different from recent predictions of density functional theory, and it also violates the empirical golden rule—“cluster of clusters”, which would predict a biicosahedral structure via vertex sharing of two icosahedral M13 building blocks as previously established in various 25-atom metal clusters protected by phosphine ligands. These results point to the importance of the ligand−gold core interactions. The Au25(SR)1...

1,905 citations


Journal ArticleDOI
TL;DR: This paper considers the problem of secret communication between two nodes, over a fading wireless medium, in the presence of a passive eavesdropper, and assumes that the transmitter and its helpers (amplifying relays) have more antennas than the eavesdroppers.
Abstract: The broadcast nature of the wireless medium makes the communication over this medium vulnerable to eavesdropping. This paper considers the problem of secret communication between two nodes, over a fading wireless medium, in the presence of a passive eavesdropper. The assumption used is that the transmitter and its helpers (amplifying relays) have more antennas than the eavesdropper. The transmitter ensures secrecy of communication by utilizing some of the available power to produce 'artificial noise', such that only the eavesdropper's channel is degraded. Two scenarios are considered, one where the transmitter has multiple transmit antennas, and the other where amplifying relays simulate the effect of multiple antennas. The channel state information (CSI) is assumed to be publicly known, and hence, the secrecy of communication is independent of the secrecy of CSI.
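The artificial-noise idea is concrete enough to sketch. Assuming a toy setup with made-up channel vectors and a single-antenna eavesdropper (an illustration of the principle, not the paper's full scheme), the transmitter beamforms the data symbol along the legitimate channel h and places noise in the null space of h, so the intended receiver is unaffected while any other channel is degraded.

```python
# Toy sketch (assumed values, not the paper's code): a multi-antenna transmitter sends
# the data symbol along the legitimate channel h and spreads "artificial noise" in the
# null space of h, so only the eavesdropper's channel g is degraded.
import numpy as np

rng = np.random.default_rng(0)
Nt = 4                                                        # transmit antennas
h = rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)    # legitimate channel (CSI publicly known)
g = rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)    # eavesdropper channel

# Orthonormal basis of the null space of h (columns orthogonal to h).
_, _, Vh = np.linalg.svd(h[None, :])
Z = Vh.conj().T[:, 1:]                                        # Nt x (Nt-1) null-space basis

s = 1.0 + 0.0j                                                # information symbol
v = rng.standard_normal(Nt - 1) + 1j * rng.standard_normal(Nt - 1)  # artificial noise
x = (h.conj() / np.linalg.norm(h)) * s + Z @ v                # transmitted vector

print("receiver noise power:", abs(h @ (Z @ v)) ** 2)         # ~0: noise lies in null(h)
print("eavesdropper noise power:", abs(g @ (Z @ v)) ** 2)     # generally > 0
```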

1,846 citations


Journal ArticleDOI
19 Jun 2008-Nature
TL;DR: A system that permits embodied prosthetic control is described, in which monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task; this demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
Abstract: Brain-machine interfaces have mostly been used previously to move cursors on computer displays. Now experiments on macaque monkeys show that brain activity signals can control a multi-jointed prosthetic device in real time. The monkeys used motor cortical activity to control a human-like prosthetic arm in a self-feeding task, with a greater sophistication of control than previously possible. This work could be important for the development of more practical neuro-prosthetic devices in the future. A system where monkeys use their motor cortical activity to control a robotic arm in a real-time self-feeding task, showing a significantly greater sophistication of control than in previous studies, is demonstrated. Arm movement is well represented in populations of neurons recorded from the motor cortex [1-7]. Cortical activity patterns have been used in the new field of brain–machine interfaces [8-11] to show how cursors on computer displays can be moved in two- and three-dimensional space [12-22]. Although the ability to move a cursor can be useful in its own right, this technology could be applied to restore arm and hand function for amputees and paralysed persons. However, the use of cortical signals to control a multi-jointed prosthetic device for direct real-time interaction with the physical environment (‘embodiment’) has not been demonstrated. Here we describe a system that permits embodied prosthetic control; we show how monkeys (Macaca mulatta) use their motor cortical activity to control a mechanized arm replica in a self-feeding task. In addition to the three dimensions of movement, the subjects’ cortical signals also proportionally controlled a gripper on the end of the arm. Owing to the physical interaction between the monkey, the robotic arm and objects in the workspace, this new task presented a higher level of difficulty than previous virtual (cursor-control) experiments. Apart from an example of simple one-dimensional control [23], previous experiments have lacked physical interaction even in cases where a robotic arm [16, 19, 24] or hand [20] was included in the control loop, because the subjects did not use it to interact with physical objects, an interaction that cannot be fully simulated. This demonstration of multi-degree-of-freedom embodied prosthetic control paves the way towards the development of dexterous prosthetic devices that could ultimately achieve arm and hand function at a near-natural level.
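The abstract does not spell out the decoding algorithm, so the following is purely illustrative: a population-vector style linear readout, a common choice in this literature, in which each recorded neuron's rate modulation votes along a preferred direction and the votes are averaged into a velocity-plus-gripper command. All array sizes and values below are hypothetical.

```python
# Hedged illustration only (not the paper's decoder): a population-vector style readout
# mapping firing rates to a 4-D command (vx, vy, vz, gripper velocity).
import numpy as np

rng = np.random.default_rng(1)
n_neurons = 100
# Preferred directions (unit vectors over x, y, z, grip), e.g. fit from calibration data.
pd = rng.standard_normal((n_neurons, 4))
pd /= np.linalg.norm(pd, axis=1, keepdims=True)

def decode(rates, baseline):
    """Map a vector of firing rates to a 4-D command."""
    modulation = rates - baseline               # rate above/below each cell's baseline
    return modulation @ pd / n_neurons          # weighted sum of preferred directions

rates = rng.poisson(10, size=n_neurons).astype(float)   # one time bin of spike counts (toy data)
baseline = np.full(n_neurons, 10.0)
print("decoded command:", decode(rates, baseline))
```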

1,579 citations


Journal ArticleDOI
TL;DR: This review describes recent developments of microgel/nanogel particles as drug delivery carriers for biological and biomedical applications, highlighting properties such as stability for prolonged circulation in the bloodstream, novel functionality for further bioconjugation, and biodegradability for sustained release of drugs over a desired period of time.

1,444 citations


Journal ArticleDOI
TL;DR: This paper presents a reconciliation of three distinct ways in which the research literature has defined overconfidence: (a) overestimation of one's actual performance, (b) overplacement of one's performance relative to others, and (c) excessive precision in one's beliefs.
Abstract: The authors present a reconciliation of 3 distinct ways in which the research literature has defined overconfidence: (a) overestimation of one's actual performance, (b) overplacement of one's performance relative to others, and (c) excessive precision in one's beliefs. Experimental evidence shows that reversals of the first 2 (apparent underconfidence), when they occur, tend to be on different types of tasks. On difficult tasks, people overestimate their actual performances but also mistakenly believe that they are worse than others; on easy tasks, people underestimate their actual performances but mistakenly believe they are better than others. The authors offer a straightforward theory that can explain these inconsistencies. Overprecision appears to be more persistent than either of the other 2 types of overconfidence, but its presence reduces the magnitude of both overestimation and overplacement.

1,282 citations


Journal ArticleDOI
30 May 2008-Science
TL;DR: A computational model is presented that predicts the functional magnetic resonance imaging (fMRI) neural activation associated with words for which fMRI data are not yet available, trained with a combination of data from a trillion-word text corpus and observed fMRI data associated with viewing several dozen concrete nouns.
Abstract: The question of how the human brain represents conceptual knowledge has been debated in many scientific fields. Brain imaging studies have shown that different spatial patterns of neural activation are associated with thinking about different semantic categories of pictures and words (for example, tools, buildings, and animals). We present a computational model that predicts the functional magnetic resonance imaging (fMRI) neural activation associated with words for which fMRI data are not yet available. This model is trained with a combination of data from a trillion-word text corpus and observed fMRI data associated with viewing several dozen concrete nouns. Once trained, the model predicts fMRI activation for thousands of other concrete nouns in the text corpus, with highly significant accuracies over the 60 nouns for which we currently have fMRI data.
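The predictive model can be caricatured as a linear map from corpus-derived word features to voxel activations. The sketch below uses random arrays and ridge regression in place of the paper's actual features and training procedure, but it shows the key property: once per-voxel weights are learned, activation can be predicted for words with no observed fMRI data.

```python
# Minimal sketch (toy data, ridge regression standing in for the paper's exact training):
# corpus co-occurrence features per word are linearly mapped to per-voxel activation.
import numpy as np

rng = np.random.default_rng(2)
n_words, n_features, n_voxels = 60, 25, 500      # arbitrary toy sizes
X = rng.standard_normal((n_words, n_features))   # corpus-derived features for each training word
Y = rng.standard_normal((n_words, n_voxels))     # observed fMRI activation per word (toy values)

lam = 1.0                                        # ridge penalty
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)   # one weight vector per voxel

x_new = rng.standard_normal(n_features)          # features of a word never scanned
predicted_activation = x_new @ W                 # predicted image: one value per voxel
print(predicted_activation.shape)                # (500,)
```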

1,204 citations


Journal IssueDOI
TL;DR: Boss is an autonomous vehicle that uses on-board sensors to track other vehicles, detect static obstacles, and localize itself relative to a road model; it was developed using a spiral system development process with a heavy emphasis on regular, regressive system testing.
Abstract: Boss is an autonomous vehicle that uses on-board sensors (global positioning system, lasers, radars, and cameras) to track other vehicles, detect static obstacles, and localize itself relative to a road model. A three-layer planning system combines mission, behavioral, and motion planning to drive in urban environments. The mission planning layer considers which street to take to achieve a mission goal. The behavioral layer determines when to change lanes and precedence at intersections and performs error recovery maneuvers. The motion planning layer selects actions to avoid obstacles while making progress toward local goals. The system was developed from the ground up to address the requirements of the DARPA Urban Challenge using a spiral system development process with a heavy emphasis on regular, regressive system testing. During the National Qualification Event and the 85-km Urban Challenge Final Event, Boss demonstrated some of its capabilities, qualifying first and winning the challenge. © 2008 Wiley Periodicals, Inc.

1,201 citations


Proceedings ArticleDOI
24 Aug 2008
TL;DR: This model generalizes several existing matrix factorization methods, and therefore yields new large-scale optimization algorithms for these problems, which can handle any pairwise relational schema and a wide variety of error models.
Abstract: Relational learning is concerned with predicting unknown values of a relation, given a database of entities and observed relations among entities. An example of relational learning is movie rating prediction, where entities could include users, movies, genres, and actors. Relations encode users' ratings of movies, movies' genres, and actors' roles in movies. A common prediction technique given one pairwise relation, for example a #users x #movies ratings matrix, is low-rank matrix factorization. In domains with multiple relations, represented as multiple matrices, we may improve predictive accuracy by exploiting information from one relation while predicting another. To this end, we propose a collective matrix factorization model: we simultaneously factor several matrices, sharing parameters among factors when an entity participates in multiple relations. Each relation can have a different value type and error distribution; so, we allow nonlinear relationships between the parameters and outputs, using Bregman divergences to measure error. We extend standard alternating projection algorithms to our model, and derive an efficient Newton update for the projection. Furthermore, we propose stochastic optimization methods to deal with large, sparse matrices. Our model generalizes several existing matrix factorization methods, and therefore yields new large-scale optimization algorithms for these problems. Our model can handle any pairwise relational schema and a wide variety of error models. We demonstrate its efficiency, as well as the benefit of sharing parameters among relations.
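A minimal instance of collective matrix factorization with squared loss (the simplest Bregman divergence) can be written in a few lines. The sketch below uses toy random matrices and plain alternating least squares rather than the paper's Newton projections or stochastic optimization; the essential point is that the movie factor V is shared by, and updated from, both relations.

```python
# Toy sketch of collective matrix factorization with squared loss: the ratings matrix
# X ≈ U V^T and the movie-genre matrix Y ≈ V Z^T share the movie factor V.
import numpy as np

rng = np.random.default_rng(3)
n_users, n_movies, n_genres, k = 50, 40, 10, 5
X = rng.standard_normal((n_users, n_movies))     # users x movies (e.g. ratings)
Y = rng.standard_normal((n_movies, n_genres))    # movies x genres

U = rng.standard_normal((n_users, k))
V = rng.standard_normal((n_movies, k))
Z = rng.standard_normal((n_genres, k))
lam = 0.1

def ls(A, B, lam):
    """Solve min_W ||A - W B^T||^2 + lam ||W||^2."""
    return A @ B @ np.linalg.inv(B.T @ B + lam * np.eye(B.shape[1]))

for _ in range(30):
    U = ls(X, V, lam)                            # update user factors from X
    Z = ls(Y.T, V, lam)                          # update genre factors from Y
    # V appears in both relations, so both contribute to its update.
    V = np.hstack([X.T, Y]) @ np.vstack([U, Z]) @ np.linalg.inv(
        U.T @ U + Z.T @ Z + lam * np.eye(k))

print("reconstruction errors:",
      np.linalg.norm(X - U @ V.T), np.linalg.norm(Y - V @ Z.T))
```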

Proceedings Article
01 Sep 2008
TL;DR: The CMU Multi-PIE database contains 337 subjects, imaged under 15 view points and 19 illumination conditions in up to four recording sessions; it addresses the shortcomings of the earlier PIE database, which had a limited number of subjects, a single recording session, and only a few expressions captured.
Abstract: A close relationship exists between the advancement of face recognition algorithms and the availability of face databases that vary, in a controlled manner, the factors affecting facial appearance. The CMU PIE database has been very influential in advancing research in face recognition across pose and illumination. Despite its success the PIE database has several shortcomings: a limited number of subjects, a single recording session and only a few expressions captured. To address these issues we collected the CMU Multi-PIE database. It contains 337 subjects, imaged under 15 view points and 19 illumination conditions in up to four recording sessions. In this paper we introduce the database and describe the recording procedure. We furthermore present results from baseline experiments using PCA and LDA classifiers to highlight similarities and differences between PIE and Multi-PIE.
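As an illustration of the kind of baseline reported, the sketch below runs an eigenface-style PCA projection followed by nearest-neighbour classification on synthetic stand-in data (real Multi-PIE images would be loaded instead); replacing PCA with scikit-learn's LinearDiscriminantAnalysis gives the LDA counterpart. Sizes and the train/test split are arbitrary assumptions.

```python
# Illustration only (synthetic arrays stand in for Multi-PIE images): a PCA-based
# face recognition baseline, i.e. project images into an eigenspace and classify
# by nearest neighbour.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
n_subjects, imgs_per_subject, n_pixels = 20, 10, 32 * 32
X = rng.standard_normal((n_subjects * imgs_per_subject, n_pixels))   # flattened face images (toy)
y = np.repeat(np.arange(n_subjects), imgs_per_subject)               # subject identity labels

model = make_pipeline(PCA(n_components=40), KNeighborsClassifier(n_neighbors=1))
model.fit(X[::2], y[::2])                            # train on half the images
print("accuracy:", model.score(X[1::2], y[1::2]))    # test on the other half
```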

Journal ArticleDOI
12 Sep 2008-Science
TL;DR: This research explored whether human effort can be channeled into a useful purpose: helping to digitize old printed material by asking users to decipher scanned words from books that computerized optical character recognition failed to recognize.
Abstract: CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart) are widespread security measures on the World Wide Web that prevent automated programs from abusing online services. They do so by asking humans to perform a task that computers cannot yet perform, such as deciphering distorted characters. Our research explored whether such human effort can be channeled into a useful purpose: helping to digitize old printed material by asking users to decipher scanned words from books that computerized optical character recognition failed to recognize. We showed that this method can transcribe text with a word accuracy exceeding 99%, matching the guarantee of professional human transcribers. Our apparatus is deployed in more than 40,000 Web sites and has transcribed over 440 million words.

Journal ArticleDOI
TL;DR: Data generated as a side effect of game play also solves computational problems and trains AI algorithms.
Abstract: Data generated as a side effect of game play also solves computational problems and trains AI algorithms.

Journal ArticleDOI
TL;DR: It is found that for most specifications product substitutability does influence the equilibrium distribution structure in a duopoly where each manufacturer distributes its goods through a single exclusive retailer, which may be either a franchised outlet or a factory store.
Abstract: This paper investigates the effect of product substitutability on Nash equilibrium distribution structures in a duopoly where each manufacturer distributes its goods through a single exclusive retailer, which may be either a franchised outlet or a factory store. Static linear demand and cost functions are assumed, and a number of rules about players' expectations of competitors' behavior are examined. It is found that for most specifications product substitutability does influence the equilibrium distribution structure. For low degrees of substitutability, each manufacturer will distribute its product through a company store; for more highly competitive goods, manufacturers will be more likely to use a decentralized distribution system. This article was originally published in Marketing Science, Volume 2, Issue 2, pages 161--191, in 1983.

Journal ArticleDOI
TL;DR: It is suggested that a dietary shift can be a more effective means of lowering an average household's food-related climate footprint than "buying local", achieving a larger GHG reduction than sourcing all food locally.
Abstract: Despite significant recent public concern and media attention to the environmental impacts of food, few studies in the United States have systematically compared the life-cycle greenhouse gas (GHG) emissions associated with food production against long-distance distribution, aka “food-miles.” We find that although food is transported long distances in general (1640 km delivery and 6760 km life-cycle supply chain on average) the GHG emissions associated with food are dominated by the production phase, contributing 83% of the average U.S. household’s 8.1 t CO2e/yr footprint for food consumption. Transportation as a whole represents only 11% of life-cycle GHG emissions, and final delivery from producer to retail contributes only 4%. Different food groups exhibit a large range in GHG-intensity; on average, red meat is around 150% more GHG-intensive than chicken or fish. Thus, we suggest that dietary shift can be a more effective means of lowering an average household’s food-related climate footprint than “buy...
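The abstract's percentages translate into absolute numbers with simple arithmetic; the short calculation below uses only the figures quoted above and is merely a back-of-the-envelope illustration of why eliminating "food-miles" moves the total so little.

```python
# Back-of-the-envelope arithmetic using only the averages quoted in the abstract.
household_food_footprint = 8.1   # t CO2e per year, average U.S. household
production_share = 0.83          # production phase
transport_share = 0.11           # all transportation
delivery_share = 0.04            # final producer-to-retail delivery ("food-miles")

print("production:    ", round(household_food_footprint * production_share, 2), "t CO2e/yr")
print("all transport: ", round(household_food_footprint * transport_share, 2), "t CO2e/yr")
print("final delivery:", round(household_food_footprint * delivery_share, 2), "t CO2e/yr")
# Even eliminating final delivery entirely saves only ~0.3 t CO2e/yr, which is why the
# authors argue a dietary shift can outweigh "buying local".
```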

Journal ArticleDOI
TL;DR: In thinking about computing, the authors need to be attuned to the three drivers of their field (science, technology, and society) and to revisit the most basic scientific questions of computing.
Abstract: Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

Journal ArticleDOI
TL;DR: A detailed review of the physical properties of molecular brushes can be found in this article, with particular focus on synthesis via controlled radical polymerization techniques, for which the authors present several preparation strategies.

Proceedings ArticleDOI
21 Apr 2008
TL;DR: It is found that a generative model, in which new edges are added via an iterative "forest fire" burning process, is able to produce graphs exhibiting a network community structure similar to that observed in nearly every network dataset examined.
Abstract: A large body of work has been devoted to identifying community structure in networks. A community is often thought of as a set of nodes that has more connections between its members than to the remainder of the network. In this paper, we characterize as a function of size the statistical and structural properties of such sets of nodes. We define the network community profile plot, which characterizes the "best" possible community - according to the conductance measure - over a wide range of size scales, and we study over 70 large sparse real-world networks taken from a wide range of application domains. Our results suggest a significantly more refined picture of community structure in large real-world networks than has been appreciated previously. Our most striking finding is that in nearly every network dataset we examined, we observe tight but almost trivial communities at very small scales, and at larger size scales, the best possible communities gradually "blend in" with the rest of the network and thus become less "community-like." This behavior is not explained, even at a qualitative level, by any of the commonly-used network generation models. Moreover, this behavior is exactly the opposite of what one would expect based on experience with and intuition from expander graphs, from graphs that are well-embeddable in a low-dimensional structure, and from small social networks that have served as testbeds of community detection algorithms. We have found, however, that a generative model, in which new edges are added via an iterative "forest fire" burning process, is able to produce graphs exhibiting a network community structure similar to our observations.
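The conductance measure underlying the network community profile plot is straightforward to compute. The sketch below scores one hypothetical candidate community in a small toy graph; the paper's plot comes from sweeping the best-conductance set over all sizes in much larger networks.

```python
# Conductance of a node set S = (# edges leaving S) / min(volume(S), volume(rest)),
# the measure behind the network community profile (NCP) plot. Candidate set is hypothetical.
import networkx as nx

def conductance(G, S):
    S = set(S)
    cut = sum(1 for u, v in G.edges() if (u in S) != (v in S))   # edges crossing the boundary
    vol_S = sum(d for n, d in G.degree() if n in S)
    vol_rest = sum(d for n, d in G.degree() if n not in S)
    return cut / min(vol_S, vol_rest)

G = nx.karate_club_graph()                        # classic small social network
candidate = [0, 1, 2, 3, 7, 13]                   # a hypothetical candidate community
print("conductance:", conductance(G, candidate))  # lower = more "community-like"
```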

Proceedings ArticleDOI
23 Jun 2008
TL;DR: This paper proposes a simple algorithm for estimating a distribution over geographic locations from a single image using a purely data-driven scene matching approach and shows that geolocation estimates can provide the basis for numerous other image understanding tasks such as population density estimation, land cover estimation or urban/rural classification.
Abstract: Estimating geographic information from an image is an excellent, difficult high-level computer vision problem whose time has come. The emergence of vast amounts of geographically-calibrated image data is a great reason for computer vision to start looking globally - on the scale of the entire planet! In this paper, we propose a simple algorithm for estimating a distribution over geographic locations from a single image using a purely data-driven scene matching approach. For this task, we leverage a dataset of over 6 million GPS-tagged images from the Internet. We represent the estimated image location as a probability distribution over the Earth's surface. We quantitatively evaluate our approach in several geolocation tasks and demonstrate encouraging performance (up to 30 times better than chance). We show that geolocation estimates can provide the basis for numerous other image understanding tasks such as population density estimation, land cover estimation or urban/rural classification.
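The data-driven matching step can be sketched as a nearest-neighbour lookup. Below, random vectors stand in for the scene descriptors and for the 6-million-image GPS-tagged corpus; the query's k nearest matches supply GPS tags that serve as samples from the estimated location distribution. Everything here is a toy stand-in for the paper's actual features and data.

```python
# Toy sketch of data-driven geolocation: find the k nearest database images to the
# query and treat their GPS tags as samples from the estimated location distribution.
import numpy as np

rng = np.random.default_rng(5)
n_db, d, k = 10_000, 128, 120
db_features = rng.standard_normal((n_db, d))                 # scene descriptors of GPS-tagged images
db_gps = np.column_stack([rng.uniform(-90, 90, n_db),        # latitude
                          rng.uniform(-180, 180, n_db)])     # longitude

query = rng.standard_normal(d)                               # descriptor of the query photo
dists = np.linalg.norm(db_features - query, axis=1)
nearest = np.argsort(dists)[:k]                              # k nearest scene matches

location_samples = db_gps[nearest]                           # samples from the location estimate
print("mean of matched locations (lat, lon):", location_samples.mean(axis=0))
```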

Journal ArticleDOI
TL;DR: A class of hybrid algorithms, of which branch-and-bound and polyhedral outer approximation are the two extreme cases, is proposed and implemented, and computational results that demonstrate the effectiveness of this framework are reported.

Journal ArticleDOI
10 Dec 2008-JAMA
TL;DR: The use of economic incentives produced significant weight loss during the 16 weeks of intervention that was not fully sustained, and incentive participants weighed significantly less at 7 months than at the study start.
Abstract: Context Identifying effective obesity treatment is both a clinical challenge and a public health priority due to the health consequences of obesity. Objective To determine whether common decision errors identified by behavioral economists such as prospect theory, loss aversion, and regret could be used to design an effective weight loss intervention. Design, Setting, and Participants Fifty-seven healthy participants aged 30-70 years with a body mass index of 30-40 were randomized to 3 weight loss plans: monthly weigh-ins, a lottery incentive program, or a deposit contract that allowed for participant matching, with a weight loss goal of 1 lb (0.45 kg) a week for 16 weeks. Participants were recruited May-August 2007 at the Philadelphia VA Medical Center in Pennsylvania and were followed up through June 2008. Main Outcome Measures Weight loss after 16 weeks. Results The incentive groups lost significantly more weight than the control group (mean, 3.9 lb). Compared with the control group, the lottery group lost a mean of 13.1 lb (95% confidence interval [CI] of the difference in means, 1.95-16.40; P=.02) and the deposit contract group lost a mean of 14.0 lb (95% CI of the difference in means, 3.69-16.43; P =.006). About half of those in both incentive groups met the 16-lb target weight loss: 47.4% (95% CI, 24.5%-71.1%) in the deposit contract group and 52.6% (95% CI, 28.9%-75.6%) in the lottery group, whereas 10.5% (95% CI, 1.3%- 33.1%; P = .01) in the control group met the 16-lb target. Although the net weight loss between enrollment in the study and at the end of 7 months was larger in the incentive groups (9.2 lb; t = 1.21; 95% CI, −3.20 to 12.66; P = .23, in the lottery group and 6.2 lb; t = 0.52; 95% CI, −5.17 to 8.75; P = .61 in the deposit contract group) than in the control group (4.4 lb), these differences were not statistically significant. However, incentive participants weighed significantly less at 7 months than at the study start (P = .01 for the lottery group; P = .03 for the deposit contract group) whereas controls did not. Conclusions The use of economic incentives produced significant weight loss during the 16 weeks of intervention that was not fully sustained. The longer-term use of incentives should be evaluated. Trial Registration clinicaltrials.gov Identifier: NCT00520611

Journal ArticleDOI
TL;DR: A fully-coupled monolithic formulation of the fluid-structure interaction of an incompressible fluid on a moving domain with a nonlinear hyperelastic solid is presented.
Abstract: We present a fully-coupled monolithic formulation of the fluid-structure interaction of an incompressible fluid on a moving domain with a nonlinear hyperelastic solid. The arbitrary Lagrangian–Eulerian description is utilized for the fluid subdomain and the Lagrangian description is utilized for the solid subdomain. Particular attention is paid to the derivation of various forms of the conservation equations; the conservation properties of the semi-discrete and fully discretized systems; a unified presentation of the generalized-α time integration method for fluid-structure interaction; and the derivation of the tangent matrix, including the calculation of shape derivatives. A NURBS-based isogeometric analysis methodology is used for the spatial discretization and three numerical examples are presented which demonstrate the good behavior of the methodology.

Proceedings ArticleDOI
24 Aug 2008
TL;DR: A complete model of network evolution is presented, in which nodes arrive at a prespecified rate and select their lifetimes; the combination of the gap distribution with the node lifetime leads to a power law out-degree distribution that accurately reflects the true network in all four cases.
Abstract: We present a detailed study of network evolution by analyzing four large online social networks with full temporal information about node and edge arrivals. For the first time at such a large scale, we study individual node arrival and edge creation processes that collectively lead to macroscopic properties of networks. Using a methodology based on the maximum-likelihood principle, we investigate a wide variety of network formation strategies, and show that edge locality plays a critical role in evolution of networks. Our findings supplement earlier network models based on the inherently non-local preferential attachment.Based on our observations, we develop a complete model of network evolution, where nodes arrive at a prespecified rate and select their lifetimes. Each node then independently initiates edges according to a "gap" process, selecting a destination for each edge according to a simple triangle-closing model free of any parameters. We show analytically that the combination of the gap distribution with the node lifetime leads to a power law out-degree distribution that accurately reflects the true network in all four cases. Finally, we give model parameter settings that allow automatic evolution and generation of realistic synthetic networks of arbitrary scale.
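A heavily simplified version of the generative process described above can be sketched as follows. Parameter choices and distributions here are arbitrary stand-ins (the paper fits lifetime and gap distributions to data, which this sketch omits); the point is the structure: a node arrives, draws a budget of edges, attaches its first edge by degree, and closes triangles for the rest.

```python
# Simplified, hedged sketch of the generative process (arbitrary parameters, not the
# paper's fitted values): node arrival, an edge budget standing in for the lifetime/gap
# machinery, a first degree-proportional edge, then triangle-closing edges.
import random
import networkx as nx

random.seed(6)
G = nx.Graph()
G.add_edge(0, 1)                                     # seed graph

for new_node in range(2, 500):                       # nodes arrive one at a time
    budget = int(random.expovariate(1 / 3.0))        # edges this node will create (simplification)
    # First edge: attach preferentially by degree.
    nodes, degs = zip(*G.degree())
    G.add_edge(new_node, random.choices(nodes, weights=degs)[0])
    while budget > 0:
        nbrs = list(G.neighbors(new_node))
        two_hop = {w for u in nbrs for w in G.neighbors(u)} - set(nbrs) - {new_node}
        if not two_hop:
            break
        G.add_edge(new_node, random.choice(sorted(two_hop)))   # triangle-closing step
        budget -= 1

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```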

Journal ArticleDOI
TL;DR: A general epidemic threshold condition is proposed for the NLDS system: it is proved that the epidemic threshold for a network is exactly the inverse of the largest eigenvalue of its adjacency matrix, and it is shown that below the epidemic threshold, infections die out at an exponential rate.
Abstract: How will a virus propagate in a real network? How long does it take to disinfect a network given particular values of infection rate and virus death rate? What is the single best node to immunize? Answering these questions is essential for devising network-wide strategies to counter viruses. In addition, viral propagation is very similar in principle to the spread of rumors, information, and "fads," implying that the solutions for viral propagation would also offer insights into these other problem settings. We answer these questions by developing a nonlinear dynamical system (NLDS) that accurately models viral propagation in any arbitrary network, including real and synthesized network graphs. We propose a general epidemic threshold condition for the NLDS system: we prove that the epidemic threshold for a network is exactly the inverse of the largest eigenvalue of its adjacency matrix. Finally, we show that below the epidemic threshold, infections die out at an exponential rate. Our epidemic threshold model subsumes many known thresholds for special-case graphs (e.g., Erdos-Renyi, BA power-law, homogeneous). We demonstrate the predictive power of our model with extensive experiments on real and synthesized graphs, and show that our threshold condition holds for arbitrary graphs. Finally, we show how to utilize our threshold condition for practical uses: It can dictate which nodes to immunize; it can assess the effects of a throttling policy; it can help us design network topologies so that they are more resistant to viruses.
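The threshold condition itself is easy to check numerically: an epidemic with infection rate beta and cure rate delta dies out when beta/delta is below the inverse of the adjacency matrix's largest eigenvalue. The sketch below evaluates that condition on an arbitrary random graph with hypothetical rates.

```python
# Evaluate the eigenvalue-based epidemic threshold on an arbitrary graph:
# the epidemic dies out when beta / delta < 1 / lambda_max(A).
import numpy as np
import networkx as nx

G = nx.erdos_renyi_graph(200, 0.05, seed=7)          # any graph; the bound is graph-specific
A = nx.to_numpy_array(G)
lambda_max = max(np.linalg.eigvalsh(A))              # largest eigenvalue of the symmetric adjacency matrix

beta, delta = 0.02, 0.5                              # hypothetical infection and cure rates
threshold = 1.0 / lambda_max
print(f"effective strength beta/delta = {beta/delta:.3f}, threshold = {threshold:.3f}")
print("epidemic dies out" if beta / delta < threshold else "epidemic may persist")
```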

Book ChapterDOI
16 Dec 2008
TL;DR: An overview is given of the BitBlaze project, a new approach to computer security via binary analysis that focuses on building a unified binary analysis platform and using it to provide novel solutions to a broad spectrum of different security problems.
Abstract: In this paper, we give an overview of the BitBlaze project, a new approach to computer security via binary analysis. In particular, BitBlaze focuses on building a unified binary analysis platform and using it to provide novel solutions to a broad spectrum of different security problems. The binary analysis platform is designed to enable accurate analysis, provide an extensible architecture, and combines static and dynamic analysis as well as program verification techniques to satisfy the common needs of security applications. By extracting security-related properties from binary programs directly, BitBlaze enables a principled, root-cause based approach to computer security, offering novel and effective solutions, as demonstrated with over a dozen different security applications.

Proceedings ArticleDOI
01 Apr 2008
TL;DR: Flicker is presented, an infrastructure for executing security-sensitive code in complete isolation while trusting as few as 250 lines of additional code, and can provide meaningful, fine-grained attestation of the code executed (as well as its inputs and outputs) to a remote party.
Abstract: We present Flicker, an infrastructure for executing security-sensitive code in complete isolation while trusting as few as 250 lines of additional code. Flicker can also provide meaningful, fine-grained attestation of the code executed (as well as its inputs and outputs) to a remote party. Flicker guarantees these properties even if the BIOS, OS and DMA-enabled devices are all malicious. Flicker leverages new commodity processors from AMD and Intel and does not require a new OS or VMM. We demonstrate a full implementation of Flicker on an AMD platform and describe our development environment for simplifying the construction of Flicker-enabled code.

Proceedings ArticleDOI
08 Nov 2008
TL;DR: Examination of how the number of editors in Wikipedia and the coordination methods they use affect article quality demonstrated the critical importance of coordination in effectively harnessing the "wisdom of the crowd" in online production environments.
Abstract: Wikipedia's success is often attributed to the large numbers of contributors who improve the accuracy, completeness and clarity of articles while reducing bias. However, because of the coordination needed to write an article collaboratively, adding contributors is costly. We examined how the number of editors in Wikipedia and the coordination methods they use affect article quality. We distinguish between explicit coordination, in which editors plan the article through communication, and implicit coordination, in which a subset of editors structure the work by doing the majority of it. Adding more editors to an article improved article quality only when they used appropriate coordination techniques and was harmful when they did not. Implicit coordination through concentrating the work was more helpful when many editors contributed, but explicit coordination through communication was not. Both types of coordination improved quality more when an article was in a formative stage. These results demonstrate the critical importance of coordination in effectively harnessing the "wisdom of the crowd" in online production environments.

Journal ArticleDOI
TL;DR: Intensive study has revealed that the regioregular polymerization of 3-alkylthiophenes proceeds by a chain-growth mechanism and can be made into a living system, which enables precise control of the molecular weight and facile end-group functionalization of the polymer chains, leading to tailor-made regioregular polythiophene for specific applications.
Abstract: Regioregular poly(3-alkylthiophene)s (rrP3ATs) are an important class of π-conjugated polymers that can be used in plastic electronic devices such as solar cells and field-effect transistors. rrP3ATs can be ordered in three dimensions: conformational ordering along the backbone, π-stacking of flat polymer chains, and lamellar stacking between chains. All of these features lead to the excellent electrical properties of these materials. Creative molecular design and advanced synthesis are critical in controlling the properties of the materials as well as their device performance. This Account reports the advances in molecular design of new functional polythiophenes as well as the associated polymerization methods. Many functionalized regioregular polythiophenes have been designed and synthesized and show fascinating properties such as high conductivity, mobility, chemosensitivity, liquid crystallinity, or chirality. The methods for the synthesis of rrP3ATs are also applicable to other functional side chains...

Journal ArticleDOI
TL;DR: The PSQI is more closely related to psychological symptom ratings and sleep diary measures than the ESS, and these instruments are not likely to be useful as screening measures for polysomnographic sleep abnormalities.
Abstract: Study Objectives: 1) To characterize PSQI and ESS scores, and their relationship to each other, in an adult community sample; 2) To determine whether PSQI and ESS scores, in combination with each other, were associated with distinct demographic, clinical, and sleep characteristics.