
Showing papers by "University of Illinois at Urbana–Champaign" published in 2010


Journal ArticleDOI
TL;DR: This paper presents a new approach to single-image superresolution, based upon sparse signal representation, which generates high-resolution images that are competitive or even superior in quality to images produced by other similar SR methods.
Abstract: This paper presents a new approach to single-image superresolution, based upon sparse signal representation. Research on image statistics suggests that image patches can be well-represented as a sparse linear combination of elements from an appropriately chosen over-complete dictionary. Inspired by this observation, we seek a sparse representation for each patch of the low-resolution input, and then use the coefficients of this representation to generate the high-resolution output. Theoretical results from compressed sensing suggest that under mild conditions, the sparse representation can be correctly recovered from the downsampled signals. By jointly training two dictionaries for the low- and high-resolution image patches, we can enforce the similarity of sparse representations between the low-resolution and high-resolution image patch pair with respect to their own dictionaries. Therefore, the sparse representation of a low-resolution image patch can be applied with the high-resolution image patch dictionary to generate a high-resolution image patch. The learned dictionary pair is a more compact representation of the patch pairs, compared to previous approaches, which simply sample a large number of image patch pairs, reducing the computational cost substantially. The effectiveness of such a sparsity prior is demonstrated for both general image super-resolution (SR) and the special case of face hallucination. In both cases, our algorithm generates high-resolution images that are competitive or even superior in quality to images produced by other similar SR methods. In addition, the local sparse modeling of our approach is naturally robust to noise, and therefore the proposed algorithm can handle SR with noisy inputs in a more unified framework.
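
For readers who want to experiment, the patch-level reconstruction sketched in the abstract can be prototyped in a few lines. The snippet below is only a hedged illustration, not the authors' released code: it assumes pre-trained low- and high-resolution dictionaries (D_low, D_high, names ours) and uses an off-the-shelf Lasso solver for the sparse code; in the paper the dictionaries are trained jointly, whereas here they are random placeholders just to show the data flow.

    # Hedged sketch of sparse-coding super-resolution at the patch level.
    # D_low / D_high stand in for jointly trained dictionaries; here they are
    # random placeholders so the example runs end to end.
    import numpy as np
    from sklearn.linear_model import Lasso

    def sr_patch(y_low, D_low, D_high, lam=0.1):
        """Recover a sparse code for a low-res patch, then synthesize the high-res patch."""
        # min_alpha ||D_low @ alpha - y_low||^2 + lam * ||alpha||_1
        solver = Lasso(alpha=lam, fit_intercept=False, max_iter=10000)
        solver.fit(D_low, y_low)
        alpha = solver.coef_          # sparse representation of the low-res patch
        return D_high @ alpha         # high-res patch synthesized with the same code

    rng = np.random.default_rng(0)
    D_low = rng.standard_normal((25, 512))    # 5x5 low-res patch features, 512 atoms
    D_high = rng.standard_normal((81, 512))   # 9x9 high-res patches, same atoms
    x_high = sr_patch(rng.standard_normal(25), D_low, D_high)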

4,958 citations


Journal ArticleDOI
26 Mar 2010 - Science
TL;DR: Inorganic and organic electronic materials in microstructured and nanostructured forms, intimately integrated with elastomeric substrates, offer particularly attractive characteristics, with realistic pathways to sophisticated embodiments; applications in systems ranging from electronic eyeball cameras to deformable light-emitting displays are described.
Abstract: Recent advances in mechanics and materials provide routes to integrated circuits that can offer the electrical properties of conventional, rigid wafer-based technologies but with the ability to be stretched, compressed, twisted, bent, and deformed into arbitrary shapes. Inorganic and organic electronic materials in microstructured and nanostructured forms, intimately integrated with elastomeric substrates, offer particularly attractive characteristics, with realistic pathways to sophisticated embodiments. Here, we review these strategies and describe applications of them in systems ranging from electronic eyeball cameras to deformable light-emitting displays. We conclude with some perspectives on routes to commercialization, new device opportunities, and remaining challenges for research.

4,127 citations


Proceedings ArticleDOI
13 Jun 2010
TL;DR: This paper presents a simple but effective coding scheme called Locality-constrained Linear Coding (LLC) in place of the VQ coding in traditional SPM; LLC uses locality constraints to project each descriptor into its local-coordinate system, and the projected coordinates are integrated by max pooling to generate the final representation.
Abstract: The traditional SPM approach based on bag-of-features (BoF) requires nonlinear classifiers to achieve good image classification performance. This paper presents a simple but effective coding scheme called Locality-constrained Linear Coding (LLC) in place of the VQ coding in traditional SPM. LLC utilizes the locality constraints to project each descriptor into its local-coordinate system, and the projected coordinates are integrated by max pooling to generate the final representation. With a linear classifier, the proposed approach performs remarkably better than the traditional nonlinear SPM, achieving state-of-the-art performance on several benchmarks. Compared with the sparse coding strategy [22], the objective function used by LLC has an analytical solution. In addition, the paper proposes a fast approximated LLC method by first performing a K-nearest-neighbor search and then solving a constrained least square fitting problem, bearing computational complexity of O(M + K^2). Hence even with very large codebooks, our system can still process multiple frames per second. This efficiency significantly adds to the practical values of LLC for real applications.
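
The approximated LLC step lends itself to a compact sketch. The numpy code below is our own hedged reading of the abstract (K-nearest-neighbor search followed by a small constrained least-squares fit, then max pooling); the regularization constant eps and all variable names are illustrative rather than taken from the paper.

    # Hedged numpy sketch of approximated locality-constrained linear coding.
    import numpy as np

    def llc_code(x, codebook, K=5, eps=1e-4):
        """Encode a descriptor x (D,) against a codebook (M, D); returns a length-M code."""
        # 1) K-nearest-neighbor search in the codebook
        idx = np.argsort(np.sum((codebook - x) ** 2, axis=1))[:K]
        B = codebook[idx]                                  # (K, D) local bases
        # 2) constrained least squares: min ||x - c^T B||^2  subject to  sum(c) = 1
        z = B - x                                          # shift local bases to the descriptor
        C = z @ z.T + eps * np.trace(z @ z.T) * np.eye(K)  # regularized local covariance
        c = np.linalg.solve(C, np.ones(K))
        c /= c.sum()
        code = np.zeros(len(codebook))
        code[idx] = c
        return code

    def llc_image_representation(descriptors, codebook, K=5):
        # 3) max pooling of the codes over all descriptors in the image (or spatial cell)
        return np.max([llc_code(x, codebook, K) for x in descriptors], axis=0)

The per-descriptor cost is dominated by the nearest-neighbor search plus the small K x K solve, consistent with the O(M + K^2) complexity quoted above.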

3,307 citations


Journal ArticleDOI
TL;DR: The Flourishing Scale as mentioned in this paper is a summary measure of the respondent's self-perceived success in important areas such as relationships, self-esteem, purpose, and optimism.
Abstract: Measures of well-being were created to assess psychological flourishing and feelings—positive feelings, negative feelings, and the difference between the two. The scales were evaluated in a sample of 689 college students from six locations. The Flourishing Scale is a brief 8-item summary measure of the respondent’s self-perceived success in important areas such as relationships, self-esteem, purpose, and optimism. The scale provides a single psychological well-being score. The measure has good psychometric properties, and is strongly associated with other psychological well-being scales. The Scale of Positive and Negative Experience produces a score for positive feelings (6 items), a score for negative feelings (6 items), and the two can be combined to create a balance score. This 12-item brief scale has a number of desirable features compared to earlier measures of positive and negative emotions. In particular, the scale assesses with a few items a broad range of negative and positive experiences and feelings, not just those of a certain type, and is based on the amount of time the feelings were experienced during the past 4 weeks. The scale converges well with measures of emotions and affective well-being.

2,860 citations


Journal ArticleDOI
Koji Nakamura, K. Hagiwara, Ken Ichi Hikasa, Hitoshi Murayama, +180 more (92 institutions)
TL;DR: In this article, a biennial review summarizes much of particle physics. Using data from previous editions, plus 2158 new measurements from 551 papers, the authors list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons.
Abstract: This biennial Review summarizes much of particle physics. Using data from previous editions, plus 2158 new measurements from 551 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We also summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. Among the 108 reviews are many that are new or heavily revised including those on neutrino mass, mixing, and oscillations, QCD, top quark, CKM quark-mixing matrix, V-ud & V-us, V-cb & V-ub, fragmentation functions, particle detectors for accelerator and non-accelerator physics, magnetic monopoles, cosmological parameters, and big bang cosmology.

2,788 citations


Proceedings Article
11 Jul 2010
TL;DR: This work evaluates Brown clusters, Collobert and Weston (2008) embeddings, and HLBL (Mnih & Hinton, 2009) embeddings of words on both NER and chunking, and finds that each of the three word representations improves the accuracy of these baselines.
Abstract: If we take an existing supervised NLP system, a simple and general way to improve accuracy is to use unsupervised word representations as extra word features. We evaluate Brown clusters, Collobert and Weston (2008) embeddings, and HLBL (Mnih & Hinton, 2009) embeddings of words on both NER and chunking. We use near state-of-the-art supervised baselines, and find that each of the three word representations improves the accuracy of these baselines. We find further improvements by combining different word representations. You can download our word features, for off-the-shelf use in existing NLP systems, as well as our code, here: http://metaoptimize.com/projects/wordreprs/
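
A hedged sketch of the basic recipe (plug unsupervised word representations in as extra features for an existing tagger) is given below; the embedding table and Brown-cluster map stand in for the downloads at the URL above, and the feature names are our own.

    # Illustrative only: augmenting per-token features with word-representation
    # features before handing them to any off-the-shelf NER/chunking tagger.
    import numpy as np

    def token_features(tokens, i, embeddings, brown_clusters, prefix_lens=(4, 6, 10)):
        w = tokens[i]
        feats = {"word=" + w.lower(): 1.0, "is_cap": float(w[:1].isupper())}
        # Brown clusters: bit-string path prefixes of several lengths as binary features
        path = brown_clusters.get(w.lower())
        if path is not None:
            for p in prefix_lens:
                feats["brown[:%d]=%s" % (p, path[:p])] = 1.0
        # Dense embedding (e.g. C&W or HLBL): one real-valued feature per dimension
        for j, v in enumerate(embeddings.get(w.lower(), np.zeros(50))):
            feats["emb_%d" % j] = float(v)
        return feats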

2,243 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a new dataset of gridded emissions covering the historical period (1850-2000) in decadal increments at a horizontal resolution of 0.5° in latitude and longitude.
Abstract: We present and discuss a new dataset of gridded emissions covering the historical period (1850–2000) in decadal increments at a horizontal resolution of 0.5° in latitude and longitude. The primary purpose of this inventory is to provide consistent gridded emissions of reactive gases and aerosols for use in chemistry model simulations needed by climate models for the Climate Model Intercomparison Program #5 (CMIP5) in support of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). Our best estimate for the year 2000 inventory represents a combination of existing regional and global inventories to capture the best information available at this point; 40 regions and 12 sectors are used to combine the various sources. The historical reconstruction of each emitted compound, for each region and sector, is then forced to agree with our 2000 estimate, ensuring continuity between past and 2000 emissions. Simulations from two chemistry-climate models are used to test the ability of the emission dataset described here to capture long-term changes in atmospheric ozone, carbon monoxide and aerosol distributions. The simulated long-term change in Northern mid-latitude surface and mid-troposphere ozone is not quite as rapid as observed. However, stations outside this latitude band show much better agreement in both present-day values and long-term trends. The model simulations indicate that the concentration of carbon monoxide is underestimated at the Mace Head station; however, the long-term trend over the limited observational period seems to be reasonably well captured. The simulated sulfate and black carbon deposition over Greenland is in very good agreement with the ice-core observations spanning the simulation period. Finally, aerosol optical depth and additional aerosol diagnostics are shown to be in good agreement with previously published estimates and observations.

1,953 citations


Journal ArticleDOI
TL;DR: In this paper, the authors address the strengths and weaknesses of the main available measures of political regime and extend the dichotomous regime classification first introduced in Alvarez et al. (Stud. Comp. Int. Dev. 31(2):3-36, 1996).
Abstract: We address the strengths and weaknesses of the main available measures of political regime and extend the dichotomous regime classification first introduced in Alvarez et al. (Stud. Comp. Int. Dev. 31(2):3–36, 1996). This extension focuses on how incumbents are removed from office. We argue that differences across regime measures must be taken seriously and that they should be evaluated in terms of whether they (1) serve to address important research questions, (2) can be interpreted meaningfully, and (3) are reproducible. We argue that existing measures of democracy are not interchangeable and that the choice of measure should be guided by its theoretical and empirical underpinnings. We show that the choice of regime measure matters by replicating studies published in leading journals.

1,922 citations


Journal ArticleDOI
TL;DR: It is shown that there is a fundamental tradeoff between storage and repair bandwidth, which is theoretically characterized using flow arguments on an appropriately constructed graph, and regenerating codes are introduced that can achieve any point in this optimal tradeoff.
Abstract: Distributed storage systems provide reliable access to data through redundancy spread over individually unreliable nodes. Application scenarios include data centers, peer-to-peer storage systems, and storage in wireless networks. Storing data using an erasure code, in fragments spread across nodes, requires less redundancy than simple replication for the same level of reliability. However, since fragments must be periodically replaced as nodes fail, a key question is how to generate encoded fragments in a distributed way while transferring as little data as possible across the network. For an erasure coded system, a common practice to repair from a single node failure is for a new node to reconstruct the whole encoded data object to generate just one encoded block. We show that this procedure is sub-optimal. We introduce the notion of regenerating codes, which allow a new node to communicate functions of the stored data from the surviving nodes. We show that regenerating codes can significantly reduce the repair bandwidth. Further, we show that there is a fundamental tradeoff between storage and repair bandwidth which we theoretically characterize using flow arguments on an appropriately constructed graph. By invoking constructive results in network coding, we introduce regenerating codes that can achieve any point in this optimal tradeoff.
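
For orientation only (our notation, not reproduced from the paper), the two extreme points of this tradeoff are commonly quoted as follows for a file of size M stored on n nodes so that any k suffice for reconstruction, with a failed node repaired by contacting d surviving helpers; alpha is the storage per node and gamma the total repair bandwidth:

    (\alpha_{\mathrm{MSR}},\ \gamma_{\mathrm{MSR}}) = \left(\frac{M}{k},\ \frac{M}{k}\cdot\frac{d}{d-k+1}\right),
    \qquad
    (\alpha_{\mathrm{MBR}},\ \gamma_{\mathrm{MBR}}) = \left(\frac{2Md}{k(2d-k+1)},\ \frac{2Md}{k(2d-k+1)}\right).

Minimum-storage regenerating (MSR) codes keep storage at the MDS minimum M/k while still repairing with far less traffic than whole-object reconstruction; minimum-bandwidth regenerating (MBR) codes trade a little extra storage for the smallest possible repair bandwidth.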

1,919 citations


Journal ArticleDOI
TL;DR: Evidence is provided for a role for the long nuclear-retained regulatory RNA MALAT1 in alternative splicing (AS) regulation, and more broadly for a role for nuclear-retained RNAs (nrRNAs) in the regulation of gene expression; the findings suggest that MALAT1 regulates AS by modulating the levels of active SR proteins.

1,893 citations


Journal ArticleDOI
29 Apr 2010
TL;DR: This review paper highlights a few representative examples of how the interaction between sparse signal representation and computer vision can enrich both fields, and raises a number of open questions for further study.
Abstract: Techniques from sparse signal representation are beginning to see significant impact in computer vision, often on nontraditional applications where the goal is not just to obtain a compact high-fidelity representation of the observed signal, but also to extract semantic information. The choice of dictionary plays a key role in bridging this gap: unconventional dictionaries consisting of, or learned from, the training samples themselves provide the key to obtaining state-of-the-art results and to attaching semantic meaning to sparse signal representations. Understanding the good performance of such unconventional dictionaries in turn demands new algorithmic and analytical techniques. This review paper highlights a few representative examples of how the interaction between sparse signal representation and computer vision can enrich both fields, and raises a number of open questions for further study.

Journal ArticleDOI
TL;DR: In this article, the authors present a distributed algorithm that can be used by multiple agents to align their estimates with a particular value over a network with time-varying connectivity.
Abstract: We present distributed algorithms that can be used by multiple agents to align their estimates with a particular value over a network with time-varying connectivity. Our framework is general in that this value can represent a consensus value among multiple agents or an optimal solution of an optimization problem, where the global objective function is a combination of local agent objective functions. Our main focus is on constrained problems where the estimates of each agent are restricted to lie in different convex sets. To highlight the effects of constraints, we first consider a constrained consensus problem and present a distributed "projected consensus algorithm" in which agents combine their local averaging operation with projection on their individual constraint sets. This algorithm can be viewed as a version of an alternating projection method with weights that are varying over time and across agents. We establish convergence and convergence rate results for the projected consensus algorithm. We next study a constrained optimization problem for optimizing the sum of local objective functions of the agents subject to the intersection of their local constraint sets. We present a distributed "projected subgradient algorithm" which involves each agent performing a local averaging operation, taking a subgradient step to minimize its own objective function, and projecting on its constraint set. We show that, with an appropriately selected stepsize rule, the agent estimates generated by this algorithm converge to the same optimal solution for the cases when the weights are constant and equal, and when the weights are time-varying but all agents have the same constraint set.
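
As a concrete toy version of the projected subgradient scheme described above, the following sketch (our own, with illustrative weights, step sizes, and a one-dimensional example problem) shows the mix/step/project structure of each iteration:

    # Toy sketch of the distributed projected subgradient iteration (mix with
    # neighbors, take a local subgradient step, project onto the local set).
    # The weight matrix, step-size rule, and example problem are our own choices.
    import numpy as np

    def projected_subgradient(x, W, subgrads, projections, step, iters=500):
        """x: (n_agents,) initial estimates; W: doubly stochastic mixing matrix."""
        for t in range(iters):
            mixed = W @ x                                              # local averaging
            grads = np.array([g(v) for g, v in zip(subgrads, mixed)])  # local subgradients
            x = np.array([P(v - step(t) * gv)
                          for P, v, gv in zip(projections, mixed, grads)])  # step + project
        return x

    # example: 3 agents minimizing sum_i (x - a_i)^2 over a common interval [0.5, 1.5]
    a = np.array([0.0, 1.0, 2.0])
    W = np.full((3, 3), 1.0 / 3.0)
    subgrads = [lambda v, ai=ai: 2.0 * (v - ai) for ai in a]
    projections = [lambda v: float(np.clip(v, 0.5, 1.5))] * 3
    x = projected_subgradient(np.zeros(3), W, subgrads, projections, step=lambda t: 1.0 / (t + 1))
    # all agents approach the constrained optimum x* = 1.0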

Proceedings ArticleDOI
25 Jul 2010
TL;DR: The results from extensive simulations demonstrate that the proposed algorithm is currently the best scalable solution to the influence maximization problem, significantly outperforming all other scalable heuristics with as much as a 100%–260% increase in influence spread.
Abstract: Influence maximization, defined by Kempe, Kleinberg, and Tardos (2003), is the problem of finding a small set of seed nodes in a social network that maximizes the spread of influence under certain influence cascade models. The scalability of influence maximization is a key factor for enabling prevalent viral marketing in large-scale online social networks. Prior solutions, such as the greedy algorithm of Kempe et al. (2003) and its improvements are slow and not scalable, while other heuristic algorithms do not provide consistently good performance on influence spreads. In this paper, we design a new heuristic algorithm that is easily scalable to millions of nodes and edges in our experiments. Our algorithm has a simple tunable parameter for users to control the balance between the running time and the influence spread of the algorithm. Our results from extensive simulations on several real-world and synthetic networks demonstrate that our algorithm is currently the best scalable solution to the influence maximization problem: (a) our algorithm scales beyond million-sized graphs where the greedy algorithm becomes infeasible, and (b) in all size ranges, our algorithm performs consistently well in influence spread --- it is always among the best algorithms, and in most cases it significantly outperforms all other scalable heuristics to as much as 100%--260% increase in influence spread.
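
For reference, the greedy baseline of Kempe et al. that the paper improves upon can be sketched in a few lines (Monte Carlo estimation of spread under the independent cascade model). This is emphatically not the paper's scalable heuristic; the graph format and parameters below are illustrative.

    # Sketch of the Kempe-et-al.-style greedy baseline (Monte Carlo estimation
    # under the independent cascade model), NOT the paper's scalable heuristic.
    import random

    def ic_spread(graph, seeds, p=0.01, trials=1000):
        """Estimate the expected influence spread of `seeds` under the independent cascade model."""
        total = 0
        for _ in range(trials):
            active, frontier = set(seeds), list(seeds)
            while frontier:
                u = frontier.pop()
                for v in graph.get(u, []):
                    if v not in active and random.random() < p:
                        active.add(v)
                        frontier.append(v)
            total += len(active)
        return total / trials

    def greedy_seeds(graph, k, p=0.01, trials=1000):
        seeds = []
        for _ in range(k):
            best = max((v for v in graph if v not in seeds),
                       key=lambda v: ic_spread(graph, seeds + [v], p, trials))
            seeds.append(best)
        return seeds

    # toy usage: graph as an adjacency dict {node: [out-neighbors]}
    toy = {0: [1, 2], 1: [2, 3], 2: [3], 3: [0]}
    print(greedy_seeds(toy, k=2, p=0.5, trials=200))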

Proceedings ArticleDOI
13 Jun 2010
TL;DR: The design, construction and verification of cyber-physical systems pose a multitude of technical challenges that must be addressed by a cross-disciplinary community of researchers and educators.
Abstract: Cyber-physical systems (CPS) are physical and engineered systems whose operations are monitored, coordinated, controlled and integrated by a computing and communication core. Just as the internet transformed how humans interact with one another, cyber-physical systems will transform how we interact with the physical world around us. Many grand challenges await in the economically vital domains of transportation, health-care, manufacturing, agriculture, energy, defense, aerospace and buildings. The design, construction and verification of cyber-physical systems pose a multitude of technical challenges that must be addressed by a cross-disciplinary community of researchers and educators.

Journal ArticleDOI
TL;DR: A materials strategy is described for a type of bio-interfaced system that relies on ultrathin electronics supported by bioresorbable substrates of silk fibroin, providing new capabilities for implantable and surgical devices.
Abstract: Electronics that are capable of intimate, non-invasive integration with the soft, curvilinear surfaces of biological tissues offer important opportunities for diagnosing and treating disease and for improving brain/machine interfaces. This article describes a material strategy for a type of bio-interfaced system that relies on ultrathin electronics supported by bioresorbable substrates of silk fibroin. Mounting such devices on tissue and then allowing the silk to dissolve and resorb initiates a spontaneous, conformal wrapping process driven by capillary forces at the biotic/abiotic interface. Specialized mesh designs and ultrathin forms for the electronics ensure minimal stresses on the tissue and highly conformal coverage, even for complex curvilinear surfaces, as confirmed by experimental and theoretical studies. In vivo, neural mapping experiments on feline animal models illustrate one mode of use for this class of technology. These concepts provide new capabilities for implantable and surgical devices. Electronics that are capable of intimate integration with the surfaces of biological tissues create opportunities for improving animal/machine interfaces. A bio-interfaced system of ultrathin electronics supported by bioresorbable silk-fibroin substrates is now presented. Mounting such devices on tissue and then allowing the silk to dissolve initiates a conformal wrapping process that is driven by capillary forces.

Journal ArticleDOI
TL;DR: The fundamental principles of both synthetic methods and recent developments in the applications of ultrasound in nanostructured materials synthesis are summarized.
Abstract: Recent advances in nanostructured materials have been led by the development of new synthetic methods that provide control over size, morphology, and nano/microstructure. The utilization of high intensity ultrasound offers a facile, versatile synthetic tool for nanostructured materials that are often unavailable by conventional methods. The primary physical phenomena associated with ultrasound that are relevant to materials synthesis are cavitation and nebulization. Acoustic cavitation (the formation, growth, and implosive collapse of bubbles in a liquid) creates extreme conditions inside the collapsing bubble and serves as the origin of most sonochemical phenomena in liquids or liquid-solid slurries. Nebulization (the creation of mist from ultrasound passing through a liquid and impinging on a liquid-gas interface) is the basis for ultrasonic spray pyrolysis (USP) with subsequent reactions occurring in the heated droplets of the mist. In both cases, we have examples of phase-separated attoliter microreactors: for sonochemistry, it is a hot gas inside bubbles isolated from one another in a liquid, while for USP it is hot droplets isolated from one another in a gas. Cavitation-induced sonochemistry provides a unique interaction between energy and matter, with hot spots inside the bubbles of approximately 5000 K, pressures of approximately 1000 bar, and heating and cooling rates of >10^10 K s^-1; these extraordinary conditions permit access to a range of chemical reaction space normally not accessible, which allows for the synthesis of a wide variety of unusual nanostructured materials. Complementary to cavitational chemistry, the microdroplet reactors created by USP facilitate the formation of a wide range of nanocomposites. In this review, we summarize the fundamental principles of both synthetic methods and recent developments in the applications of ultrasound in nanostructured materials synthesis.

Journal ArticleDOI
TL;DR: This review examines inefficiencies in photosynthetic energy transduction in crops, from light interception to carbohydrate synthesis, and how classical breeding, systems biology, and synthetic biology are providing new opportunities to develop more productive germplasm, with the potential to more than double the yield potential of major crops.
Abstract: Increasing the yield potential of the major food grain crops has contributed very significantly to a rising food supply over the past 50 years, which has until recently more than kept pace with rising global demand. Whereas improved photosynthetic efficiency has played only a minor role in the remarkable increases in productivity achieved in the last half century, further increases in yield potential will rely in large part on improved photosynthesis. Here we examine inefficiencies in photosynthetic energy transduction in crops from light interception to carbohydrate synthesis, and how classical breeding, systems biology, and synthetic biology are providing new opportunities to develop more productive germplasm. Near-term opportunities include improving the display of leaves in crop canopies to avoid light saturation of individual leaves and further investigation of a photorespiratory bypass that has already improved the productivity of model species. Longer-term opportunities include engineering into plants carboxylases that are better adapted to current and forthcoming CO2 concentrations, and the use of modeling to guide molecular optimization of resource investment among the components of the photosynthetic apparatus, to maximize carbon gain without increasing crop inputs. Collectively, these changes have the potential to more than double the yield potential of our major crops.

Journal ArticleDOI
TL;DR: Recent results that address the toxicity of gold nanoparticles both in vitro and in vivo are discussed, and some experimental recommendations for future research at the interface of nanotechnology and biological systems are provided.
Abstract: Gold nanoparticles have attracted enormous scientific and technological interest due to their ease of synthesis, chemical stability, and unique optical properties. Proof-of-concept studies demonstrate their biomedical applications in chemical sensing, biological imaging, drug delivery, and cancer treatment. Knowledge about their potential toxicity and health impact is essential before these nanomaterials can be used in real clinical settings. Furthermore, the underlying interactions of these nanomaterials with physiological fluids are a key feature of understanding their biological impact, and these interactions can perhaps be exploited to mitigate unwanted toxic effects. In this Perspective we discuss recent results that address the toxicity of gold nanoparticles both in vitro and in vivo, and we provide some experimental recommendations for future research at the interface of nanotechnology and biological systems.

Journal ArticleDOI
TL;DR: The authors survey 1,050 chief financial officers (CFOs) in the U.S., Europe, and Asia to assess whether their firms are credit constrained during the global financial crisis of 2008 and find that constrained firms planned deeper cuts in tech spending, employment, and capital spending.

Journal ArticleDOI
08 Jul 2010 - Nature
TL;DR: The development of a calibrated biosensor that measures forces across specific proteins in cells with piconewton (pN) sensitivity reveals that focal adhesion (FA) stabilization under force requires both vinculin recruitment and force transmission, and that, surprisingly, these processes can be controlled independently.
Abstract: Mechanical forces are central to developmental, physiological and pathological processes. However, limited understanding of force transmission within sub-cellular structures is a major obstacle to unravelling molecular mechanisms. Here we describe the development of a calibrated biosensor that measures forces across specific proteins in cells with piconewton (pN) sensitivity, as demonstrated by single molecule fluorescence force spectroscopy. The method is applied to vinculin, a protein that connects integrins to actin filaments and whose recruitment to focal adhesions (FAs) is force-dependent. We show that tension across vinculin in stable FAs is approximately 2.5 pN and that vinculin recruitment to FAs and force transmission across vinculin are regulated separately. Highest tension across vinculin is associated with adhesion assembly and enlargement. Conversely, vinculin is under low force in disassembling or sliding FAs at the trailing edge of migrating cells. Furthermore, vinculin is required for stabilizing adhesions under force. Together, these data reveal that FA stabilization under force requires both vinculin recruitment and force transmission, and that, surprisingly, these processes can be controlled independently.

Journal ArticleDOI
TL;DR: The FASTQ format is defined, covering the original Sanger standard, the Solexa/Illumina variants and conversion between them, based on publicly available information such as the MAQ documentation and conventions recently agreed by the Open Bioinformatics Foundation projects Biopython, BioPerl, BioRuby, BioJava and EMBOSS.
Abstract: FASTQ has emerged as a common file format for sharing sequencing read data combining both the sequence and an associated per base quality score, despite lacking any formal definition to date, and existing in at least three incompatible variants. This article defines the FASTQ format, covering the original Sanger standard, the Solexa/Illumina variants and conversion between them, based on publicly available information such as the MAQ documentation and conventions recently agreed by the Open Bioinformatics Foundation projects Biopython, BioPerl, BioRuby, BioJava and EMBOSS. Being an open access publication, it is hoped that this description, with the example files provided as Supplementary Data, will serve in future as a reference for this important file format.
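
As a small worked example of the conventions the paper documents (Sanger = PHRED scores at ASCII offset 33, Solexa = log-odds scores at offset 64, Illumina 1.3+ = PHRED scores at offset 64), here is a hedged sketch of re-encoding quality strings in the Sanger convention; the function names are ours, and production code would more naturally use Biopython's SeqIO, which implements these conversions.

    # Hedged sketch of quality-score conversion between FASTQ variants.
    # Sanger/PHRED:  Q = -10*log10(p),       ASCII offset 33.
    # Solexa:        Q = -10*log10(p/(1-p)), ASCII offset 64.
    # Illumina 1.3+: PHRED scores,           ASCII offset 64.
    from math import log10

    def solexa_to_phred(q_solexa):
        return 10.0 * log10(10.0 ** (q_solexa / 10.0) + 1.0)

    def solexa_quals_to_sanger(qual_str):
        """Re-encode a Solexa-style quality string (offset 64) in the Sanger convention (offset 33)."""
        return "".join(chr(int(round(solexa_to_phred(ord(c) - 64))) + 33) for c in qual_str)

    def illumina13_quals_to_sanger(qual_str):
        """Illumina 1.3+ already stores PHRED scores; only the ASCII offset changes."""
        return "".join(chr(ord(c) - 64 + 33) for c in qual_str)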

Journal ArticleDOI
Stephen Richards, Richard A. Gibbs, Nicole M. Gerardo, Nancy A. Moran, +220 more (58 institutions)
TL;DR: The genome of the pea aphid shows remarkable levels of gene duplication and equally remarkable gene absences that shed light on aspects of aphid biology, most especially its symbiosis with Buchnera.
Abstract: Aphids are important agricultural pests and also biological models for studies of insect-plant interactions, symbiosis, virus vectoring, and the developmental causes of extreme phenotypic plasticity. Here we present the 464 Mb draft genome assembly of the pea aphid Acyrthosiphon pisum. This first published whole genome sequence of a basal hemimetabolous insect provides an outgroup to the multiple published genomes of holometabolous insects. Pea aphids are host-plant specialists, they can reproduce both sexually and asexually, and they have coevolved with an obligate bacterial symbiont. Here we highlight findings from whole genome analysis that may be related to these unusual biological features. These findings include discovery of extensive gene duplication in more than 2000 gene families as well as loss of evolutionarily conserved genes. Gene family expansions relative to other published genomes include genes involved in chromatin modification, miRNA synthesis, and sugar transport. Gene losses include genes central to the IMD immune pathway, selenoprotein utilization, purine salvage, and the entire urea cycle. The pea aphid genome reveals that only a limited number of genes have been acquired from bacteria; thus the reduced gene count of Buchnera does not reflect gene transfer to the host genome. The inventory of metabolic genes in the pea aphid genome suggests that there is extensive metabolite exchange between the aphid and Buchnera, including sharing of amino acid biosynthesis between the aphid and Buchnera. The pea aphid genome provides a foundation for post-genomic studies of fundamental biological questions and applied agricultural problems.

Journal ArticleDOI
TL;DR: The authors specify a progressive (cascading) pattern among ability-based EI facets, in which emotion perception must causally precede emotion understanding, which in turn precedes conscious emotion regulation and job performance.
Abstract: Research and valid practice in emotional intelligence (EI) have been impeded by lack of theoretical clarity regarding (a) the relative roles of emotion perception, emotion understanding, and emotion regulation facets in explaining job performance; (b) conceptual redundancy of EI with cognitive intelligence and Big Five personality; and (c) application of the EI label to 2 distinct sets of constructs (i.e., ability-based EI and mixed-based EI). In the current article, the authors propose and then test a theoretical model that integrates these factors. They specify a progressive (cascading) pattern among ability-based EI facets, in which emotion perception must causally precede emotion understanding, which in turn precedes conscious emotion regulation and job performance. The sequential elements in this progressive model are believed to selectively reflect Conscientiousness, cognitive ability, and Neuroticism, respectively. "Mixed-based" measures of EI are expected to explain variance in job performance beyond cognitive ability and personality. The cascading model of EI is empirically confirmed via meta-analytic data, although relationships between ability-based EI and job performance are shown to be inconsistent (i.e., EI positively predicts performance for high emotional labor jobs and negatively predicts performance for low emotional labor jobs). Gender and race differences in EI are also meta-analyzed. Implications for linking the EI fad in personnel selection to established psychological theory are discussed.

Book ChapterDOI
05 Sep 2010
TL;DR: A system is described that can compute a score linking an image to a sentence, which can be used to attach a descriptive sentence to a given image or to obtain images that illustrate a given sentence.
Abstract: Humans can prepare concise descriptions of pictures, focusing on what they find important. We demonstrate that automatic methods can do so too. We describe a system that can compute a score linking an image to a sentence. This score can be used to attach a descriptive sentence to a given image, or to obtain images that illustrate a given sentence. The score is obtained by comparing an estimate of meaning obtained from the image to one obtained from the sentence. Each estimate of meaning comes from a discriminative procedure that is learned using data. We evaluate on a novel dataset consisting of human-annotated images. While our underlying estimate of meaning is impoverished, it is sufficient to produce very good quantitative results, evaluated with a novel score that can account for synecdoche.

Journal ArticleDOI
TL;DR: The OpenCL standard offers a common API for program execution on systems composed of different types of computational devices such as multicore CPUs, GPUs, or other accelerators as mentioned in this paper, such as accelerators.
Abstract: The OpenCL standard offers a common API for program execution on systems composed of different types of computational devices such as multicore CPUs, GPUs, or other accelerators.
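
To make the "common API" point concrete, the sketch below runs the same vector-addition kernel on whatever OpenCL device is available (CPU, GPU, or accelerator). It uses the pyopencl bindings rather than the C API discussed in the article, and the kernel and buffer names are our own.

    # Hedged illustration: one host program, any OpenCL device.
    import numpy as np
    import pyopencl as cl

    a = np.arange(16, dtype=np.float32)
    b = np.arange(16, dtype=np.float32)

    ctx = cl.create_some_context()        # selects an available CPU, GPU, or accelerator
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    program = cl.Program(ctx, """
    __kernel void vec_add(__global const float *a, __global const float *b, __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    program.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)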

Journal ArticleDOI
TL;DR: In this article, the authors highlight recent efforts and opportunities in the heterogeneous electrochemical conversion of carbon dioxide to help address the global issues of climate change and sustainable energy production, and highlight the potential of electrochemical reduction of CO2 to produce a variety of organic compounds such as formic acid, carbon monoxide, methane, and ethylene with high current efficiency.
Abstract: This Perspective highlights recent efforts and opportunities in the heterogeneous electrochemical conversion of carbon dioxide to help address the global issues of climate change and sustainable energy production. Recent research has shown that the electrochemical reduction of CO2 can produce a variety of organic compounds such as formic acid, carbon monoxide, methane, and ethylene with high current efficiency. These products can be used as feedstocks for chemical synthesis or converted into hydrocarbon fuels. This process is of interest (i) for the recycling of CO2 as an energy carrier, thereby reducing its accumulation in the atmosphere, (ii) for the production of renewable hydrocarbon fuels from CO2, water, and renewable electricity for use as transportation fuels, and (iii) as a convenient means of storing electrical energy in chemical form to level the electrical output from intermittent energy sources such as wind and solar. Accomplishments to date in this field of study have been encouraging, yet s...

Journal ArticleDOI
13 Aug 2010 - Science
TL;DR: In 2008, the world produced approximately 87 gigaliters of liquid biofuels, which is roughly equal to the volume of liquid fuel consumed by Germany that year; however, all of this biofuel was produced from crops developed for food production, raising concerns about the net energy and greenhouse gas effects.
Abstract: In 2008, the world produced approximately 87 gigaliters of liquid biofuels, which is roughly equal to the volume of liquid fuel consumed by Germany that year. Essentially, all of this biofuel was produced from crops developed for food production, raising concerns about the net energy and greenhouse gas effects and potential competition between use of land for production of fuels, food, animal feed, fiber, and ecosystem services. The pending implementation of improved technologies to more effectively convert the nonedible parts of plants (lignocellulose) to liquid fuels opens diverse options to use biofuel feedstocks that reach beyond current crops and the land currently used for food and feed. However, there has been relatively little discussion of what types of plants may be useful as bioenergy crops.

Journal ArticleDOI
TL;DR: Self-healing polymers and fiber-reinforced polymer composites possess the ability to heal in response to damage wherever and whenever it occurs in the material as mentioned in this paper, which is a remarkable property.
Abstract: Self-healing polymers and fiber-reinforced polymer composites possess the ability to heal in response to damage wherever and whenever it occurs in the material. This phenomenal material behavior is...

Journal ArticleDOI
TL;DR: A survey of self-healing polymers can be found in this article, where the authors review the major successful autonomic repairing mechanisms developed over the last decade and discuss several issues related to transferring these self-healing technologies from the laboratory to real applications, such as virgin polymer property changes as a result of the added healing functionality.
Abstract: Inspired by the unique and efficient wound healing processes in biological systems, several approaches to develop synthetic polymers that can repair themselves with complete, or nearly complete, autonomy have recently been developed. This review aims to survey the rapidly expanding field of self-healing polymers by reviewing the major successful autonomic repairing mechanisms developed over the last decade. Additionally, we discuss several issues related to transferring these self-healing technologies from the laboratory to real applications, such as virgin polymer property changes as a result of the added healing functionality, healing in thin films v. bulk polymers, and healing in the presence of structural reinforcements.

Journal ArticleDOI
TL;DR: This study addresses the extent to which insecure and disorganized attachments increase risk for externalizing problems using meta-analysis and discusses the potential significance of attachment for mental health.
Abstract: This study addresses the extent to which insecure and disorganized attachments increase risk for externalizing problems using meta-analysis. From 69 samples (N = 5,947), the association between insecurity and externalizing problems was significant, d = 0.31 (95% CI: 0.23, 0.40). Larger effects were found for boys (d = 0.35), clinical samples (d = 0.49), and from observation-based outcome assessments (d = 0.58). Larger effects were found for attachment assessments other than the Strange Situation. Overall, disorganized children appeared at elevated risk (d = 0.34, 95% CI: 0.18, 0.50), with weaker effects for avoidance (d = 0.12, 95% CI: 0.03, 0.21) and resistance (d = 0.11, 95% CI: −0.04, 0.26). The results are discussed in terms of the potential significance of attachment for mental health.