
Showing papers by "Arizona State University" published in 2010


Journal ArticleDOI
TL;DR: In this paper, a new general class of local indicators of spatial association (LISA) is proposed, which allows for the decomposition of global indicators, such as Moran's I, into the contribution of each observation.
Abstract: The capabilities for visualization, rapid data retrieval, and manipulation in geographic information systems (GIS) have created the need for new techniques of exploratory data analysis that focus on the “spatial” aspects of the data. The identification of local patterns of spatial association is an important concern in this respect. In this paper, I outline a new general class of local indicators of spatial association (LISA) and show how they allow for the decomposition of global indicators, such as Moran's I, into the contribution of each observation. The LISA statistics serve two purposes. On one hand, they may be interpreted as indicators of local pockets of nonstationarity, or hot spots, similar to the Gi and G*i statistics of Getis and Ord (1992). On the other hand, they may be used to assess the influence of individual locations on the magnitude of the global statistic and to identify “outliers,” as in Anselin's Moran scatterplot (1993a). An initial evaluation of the properties of a LISA statistic is carried out for the local Moran, which is applied in a study of the spatial pattern of conflict for African countries and in a number of Monte Carlo simulations.
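As an illustration, here is a minimal numpy sketch of the local Moran statistic described above, using a toy attribute vector and an arbitrary binary weights matrix (both made up for the example). The final assertion checks the LISA decomposition property mentioned in the abstract: the local values sum to a multiple of the global Moran's I. Inference (e.g., conditional permutation) is omitted.

```python
import numpy as np

def local_moran(x, W):
    """Local Moran's I_i for each observation (illustrative sketch).

    x : (n,) attribute values; W : (n, n) spatial weights matrix.
    """
    z = x - x.mean()
    m2 = (z ** 2).sum() / len(x)          # second moment of the deviations
    return (z / m2) * (W @ z)             # I_i = (z_i / m2) * sum_j w_ij z_j

def global_moran(x, W):
    z = x - x.mean()
    s0 = W.sum()
    return (len(x) / s0) * (z @ W @ z) / (z @ z)

# Toy example: 5 regions with an arbitrary binary weights matrix.
rng = np.random.default_rng(0)
x = rng.normal(size=5)
W = (rng.random((5, 5)) < 0.4).astype(float)
np.fill_diagonal(W, 0.0)

Ii = local_moran(x, W)
# LISA property from the abstract: the local statistics sum to a
# multiple (here S0, the sum of all weights) of global Moran's I.
assert np.allclose(Ii.sum(), W.sum() * global_moran(x, W))
```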

8,933 citations


Proceedings Article
21 Jun 2010
TL;DR: A novel 3D CNN model for action recognition that extracts features from both the spatial and the temporal dimensions by performing 3D convolutions, thereby capturing the motion information encoded in multiple adjacent frames.
Abstract: We consider the fully automated recognition of actions in uncontrolled environments. Most existing work relies on domain knowledge to construct complex handcrafted features from inputs. In addition, the environments are usually assumed to be controlled. Convolutional neural networks (CNNs) are a type of deep model that can act directly on the raw inputs, thus automating the process of feature construction. However, such models are currently limited to handling 2D inputs. In this paper, we develop a novel 3D CNN model for action recognition. This model extracts features from both spatial and temporal dimensions by performing 3D convolutions, thereby capturing the motion information encoded in multiple adjacent frames. The developed model generates multiple channels of information from the input frames, and the final feature representation is obtained by combining information from all channels. We apply the developed model to recognize human actions in real-world environments, and it achieves superior performance without relying on handcrafted features.
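For orientation, a minimal PyTorch sketch of the core idea, a 3D convolution applied over a stack of adjacent frames so that features span both space and time. The frame count, image size, layer widths, and class count are placeholders and do not reproduce the paper's architecture.

```python
import torch
import torch.nn as nn

# Minimal 3D-convolution sketch: the input is a clip of 7 grayscale frames
# of size 60x40 (arbitrary sizes), i.e. (batch, channels, frames, H, W).
clip = torch.randn(2, 1, 7, 60, 40)

model = nn.Sequential(
    nn.Conv3d(1, 16, kernel_size=(3, 7, 7)),   # convolve over time and space
    nn.ReLU(),
    nn.MaxPool3d(kernel_size=(1, 2, 2)),       # pool spatially only
    nn.Conv3d(16, 32, kernel_size=(3, 5, 5)),
    nn.ReLU(),
    nn.AdaptiveAvgPool3d(1),                   # collapse the remaining dims
    nn.Flatten(),
    nn.Linear(32, 6),                          # e.g. 6 action classes
)
logits = model(clip)                           # shape: (2, 6)
```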

4,087 citations


Book
23 Apr 2010
TL;DR: This chapter discusses how to improve the accuracy of Maximum Likelihood Analyses by extending EM to Multivariate Data, and the role of First Derivatives in this process.
Abstract: Part 1. An Introduction to Missing Data. 1.1 Introduction. 1.2 Chapter Overview. 1.3 Missing Data Patterns. 1.4 A Conceptual Overview of Missing Data Theory. 1.5 A More Formal Description of Missing Data Theory. 1.6 Why Is the Missing Data Mechanism Important? 1.7 How Plausible Is the Missing at Random Mechanism? 1.8 An Inclusive Analysis Strategy. 1.9 Testing the Missing Completely at Random Mechanism. 1.10 Planned Missing Data Designs. 1.11 The Three-Form Design. 1.12 Planned Missing Data for Longitudinal Designs. 1.13 Conducting Power Analyses for Planned Missing Data Designs. 1.14 Data Analysis Example. 1.15 Summary. 1.16 Recommended Readings.
Part 2. Traditional Methods for Dealing with Missing Data. 2.1 Chapter Overview. 2.2 An Overview of Deletion Methods. 2.3 Listwise Deletion. 2.4 Pairwise Deletion. 2.5 An Overview of Single Imputation Techniques. 2.6 Arithmetic Mean Imputation. 2.7 Regression Imputation. 2.8 Stochastic Regression Imputation. 2.9 Hot-Deck Imputation. 2.10 Similar Response Pattern Imputation. 2.11 Averaging the Available Items. 2.12 Last Observation Carried Forward. 2.13 An Illustrative Simulation Study. 2.14 Summary. 2.15 Recommended Readings.
Part 3. An Introduction to Maximum Likelihood Estimation. 3.1 Chapter Overview. 3.2 The Univariate Normal Distribution. 3.3 The Sample Likelihood. 3.4 The Log-Likelihood. 3.5 Estimating Unknown Parameters. 3.6 The Role of First Derivatives. 3.7 Estimating Standard Errors. 3.8 Maximum Likelihood Estimation with Multivariate Normal Data. 3.9 A Bivariate Analysis Example. 3.10 Iterative Optimization Algorithms. 3.11 Significance Testing Using the Wald Statistic. 3.12 The Likelihood Ratio Test Statistic. 3.13 Should I Use the Wald Test or the Likelihood Ratio Statistic? 3.14 Data Analysis Example 1. 3.15 Data Analysis Example 2. 3.16 Summary. 3.17 Recommended Readings.
Part 4. Maximum Likelihood Missing Data Handling. 4.1 Chapter Overview. 4.2 The Missing Data Log-Likelihood. 4.3 How Do the Incomplete Data Records Improve Estimation? 4.4 An Illustrative Computer Simulation Study. 4.5 Estimating Standard Errors with Missing Data. 4.6 Observed Versus Expected Information. 4.7 A Bivariate Analysis Example. 4.8 An Illustrative Computer Simulation Study. 4.9 An Overview of the EM Algorithm. 4.10 A Detailed Description of the EM Algorithm. 4.11 A Bivariate Analysis Example. 4.12 Extending EM to Multivariate Data. 4.13 Maximum Likelihood Software Options. 4.14 Data Analysis Example 1. 4.15 Data Analysis Example 2. 4.16 Data Analysis Example 3. 4.17 Data Analysis Example 4. 4.18 Data Analysis Example 5. 4.19 Summary. 4.20 Recommended Readings.
Part 5. Improving the Accuracy of Maximum Likelihood Analyses. 5.1 Chapter Overview. 5.2 The Rationale for an Inclusive Analysis Strategy. 5.3 An Illustrative Computer Simulation Study. 5.4 Identifying a Set of Auxiliary Variables. 5.5 Incorporating Auxiliary Variables Into a Maximum Likelihood Analysis. 5.6 The Saturated Correlates Model. 5.7 The Impact of Non-Normal Data. 5.8 Robust Standard Errors. 5.9 Bootstrap Standard Errors. 5.10 The Rescaled Likelihood Ratio Test. 5.11 Bootstrapping the Likelihood Ratio Statistic. 5.12 Data Analysis Example 1. 5.13 Data Analysis Example 2. 5.14 Data Analysis Example 3. 5.15 Summary. 5.16 Recommended Readings.
Part 6. An Introduction to Bayesian Estimation. 6.1 Chapter Overview. 6.2 What Makes Bayesian Statistics Different? 6.3 A Conceptual Overview of Bayesian Estimation. 6.4 Bayes' Theorem. 6.5 An Analysis Example. 6.6 How Does Bayesian Estimation Apply to Multiple Imputation? 6.7 The Posterior Distribution of the Mean. 6.8 The Posterior Distribution of the Variance. 6.9 The Posterior Distribution of a Covariance Matrix. 6.10 Summary. 6.11 Recommended Readings.
Part 7. The Imputation Phase of Multiple Imputation. 7.1 Chapter Overview. 7.2 A Conceptual Description of the Imputation Phase. 7.3 A Bayesian Description of the Imputation Phase. 7.4 A Bivariate Analysis Example. 7.5 Data Augmentation with Multivariate Data. 7.6 Selecting Variables for Imputation. 7.7 The Meaning of Convergence. 7.8 Convergence Diagnostics. 7.9 Time-Series Plots. 7.10 Autocorrelation Function Plots. 7.11 Assessing Convergence from Alternate Starting Values. 7.12 Convergence Problems. 7.13 Generating the Final Set of Imputations. 7.14 How Many Data Sets Are Needed? 7.15 Summary. 7.16 Recommended Readings.
Part 8. The Analysis and Pooling Phases of Multiple Imputation. 8.1 Chapter Overview. 8.2 The Analysis Phase. 8.3 Combining Parameter Estimates in the Pooling Phase. 8.4 Transforming Parameter Estimates Prior to Combining. 8.5 Pooling Standard Errors. 8.6 The Fraction of Missing Information and the Relative Increase in Variance. 8.7 When Is Multiple Imputation Comparable to Maximum Likelihood? 8.8 An Illustrative Computer Simulation Study. 8.9 Significance Testing Using the t Statistic. 8.10 An Overview of Multiparameter Significance Tests. 8.11 Testing Multiple Parameters Using the D1 Statistic. 8.12 Testing Multiple Parameters by Combining Wald Tests. 8.13 Testing Multiple Parameters by Combining Likelihood Ratio Statistics. 8.14 Data Analysis Example 1. 8.15 Data Analysis Example 2. 8.16 Data Analysis Example 3. 8.17 Summary. 8.18 Recommended Readings.
Part 9. Practical Issues in Multiple Imputation. 9.1 Chapter Overview. 9.2 Dealing with Convergence Problems. 9.3 Dealing with Non-Normal Data. 9.4 To Round or Not to Round? 9.5 Preserving Interaction Effects. 9.6 Imputing Multiple-Item Questionnaires. 9.7 Alternate Imputation Algorithms. 9.8 Multiple Imputation Software Options. 9.9 Data Analysis Example 1. 9.10 Data Analysis Example 2. 9.11 Summary. 9.12 Recommended Readings.
Part 10. Models for Missing Not at Random Data. 10.1 Chapter Overview. 10.2 An Ad Hoc Approach to Dealing with MNAR Data. 10.3 The Theoretical Rationale for MNAR Models. 10.4 The Classic Selection Model. 10.5 Estimating the Selection Model. 10.6 Limitations of the Selection Model. 10.7 An Illustrative Analysis. 10.8 The Pattern Mixture Model. 10.9 Limitations of the Pattern Mixture Model. 10.10 An Overview of the Longitudinal Growth Model. 10.11 A Longitudinal Selection Model. 10.12 Random Coefficient Selection Models. 10.13 Pattern Mixture Models for Longitudinal Analyses. 10.14 Identification Strategies for Longitudinal Pattern Mixture Models. 10.15 Delta Method Standard Errors. 10.16 Overview of the Data Analysis Examples. 10.17 Data Analysis Example 1. 10.18 Data Analysis Example 2. 10.19 Data Analysis Example 3. 10.20 Data Analysis Example 4. 10.21 Summary. 10.22 Recommended Readings.
Part 11. Wrapping Things Up: Some Final Practical Considerations. 11.1 Chapter Overview. 11.2 Maximum Likelihood Software Options. 11.3 Multiple Imputation Software Options. 11.4 Choosing between Maximum Likelihood and Multiple Imputation. 11.5 Reporting the Results from a Missing Data Analysis. 11.6 Final Thoughts. 11.7 Recommended Readings.
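As a companion to the EM material in Part 4, here is a minimal sketch of the EM algorithm for the mean vector and covariance matrix of a bivariate normal variable when one variable has values missing at random. The data are synthetic and the variable names illustrative; this is not code from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy bivariate data with y2 missing for some cases (MAR given y1).
n = 500
y1 = rng.normal(10, 2, n)
y2 = 0.5 * y1 + rng.normal(0, 1, n)
miss = (y1 < 9) & (rng.random(n) < 0.6)       # missingness depends on y1 only
y2_obs = np.where(miss, np.nan, y2)

# EM for the mean vector and covariance matrix of (y1, y2).
mu = np.array([y1.mean(), np.nanmean(y2_obs)])
cov = np.cov(y1[~miss], y2_obs[~miss])         # complete-case start values

for _ in range(200):
    # E-step: expected y2 and y2^2 given y1 for the incomplete records.
    beta = cov[0, 1] / cov[0, 0]
    resid_var = cov[1, 1] - cov[0, 1] ** 2 / cov[0, 0]
    e_y2 = np.where(miss, mu[1] + beta * (y1 - mu[0]), y2_obs)
    e_y2sq = np.where(miss, e_y2 ** 2 + resid_var, y2_obs ** 2)

    # M-step: update mean and covariance from the expected sufficient stats.
    mu = np.array([y1.mean(), e_y2.mean()])
    cov = np.array([
        [np.mean(y1 ** 2) - mu[0] ** 2,      np.mean(y1 * e_y2) - mu[0] * mu[1]],
        [np.mean(y1 * e_y2) - mu[0] * mu[1], np.mean(e_y2sq) - mu[1] ** 2],
    ])

print(mu, cov)   # ML estimates that use every observed value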

3,910 citations


Journal ArticleDOI
TL;DR: This work presents an integrative 2-level MSEM mathematical framework that subsumes new and existing multilevel mediation approaches as special cases and uses several applied examples to illustrate the flexibility of this framework.
Abstract: Several methods for testing mediation hypotheses with 2-level nested data have been proposed by researchers using a multilevel modeling (MLM) paradigm. However, these MLM approaches do not accommodate mediation pathways with Level-2 outcomes and may produce conflated estimates of between- and within-level components of indirect effects. Moreover, these methods have each appeared in isolation, so a unified framework that integrates the existing methods, as well as new multilevel mediation models, is lacking. Here we show that a multilevel structural equation modeling (MSEM) paradigm can overcome these 2 limitations of mediation analysis with MLM. We present an integrative 2-level MSEM mathematical framework that subsumes new and existing multilevel mediation approaches as special cases. We use several applied examples and accompanying software code to illustrate the flexibility of this framework and to show that different substantive conclusions can be drawn using MSEM versus MLM.
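A rough sketch of the conflation issue the abstract raises, using the centering-based MLM approximation (cluster means carry the between part, deviations from the cluster mean carry the within part) rather than the paper's MSEM. The data, variable names, and statsmodels MixedLM calls are all illustrative; the paper's point is that MSEM with latent group means avoids the bias that observed cluster means introduce, so this is only the contrast case.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative 1-1-1 mediation data: x -> m -> y within clusters (e.g. teams).
rng = np.random.default_rng(2)
n_clusters, n_per = 100, 10
g = np.repeat(np.arange(n_clusters), n_per)
x = rng.normal(size=g.size) + rng.normal(size=n_clusters)[g]   # within + between variance
m = 0.5 * x + rng.normal(size=g.size) + rng.normal(size=n_clusters)[g]
y = 0.4 * m + rng.normal(size=g.size)
d = pd.DataFrame({"g": g, "x": x, "m": m, "y": y})

# Split each predictor into a cluster mean (between) and a deviation (within)
# so the a and b paths are not conflated across levels.
for v in ("x", "m"):
    d[f"{v}_b"] = d.groupby("g")[v].transform("mean")
    d[f"{v}_w"] = d[v] - d[f"{v}_b"]

a_fit = smf.mixedlm("m ~ x_w + x_b", d, groups=d["g"]).fit()
b_fit = smf.mixedlm("y ~ m_w + m_b + x_w + x_b", d, groups=d["g"]).fit()

ind_within = a_fit.params["x_w"] * b_fit.params["m_w"]     # within-level indirect effect
ind_between = a_fit.params["x_b"] * b_fit.params["m_b"]    # between-level indirect effect
print(ind_within, ind_between)
```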

2,595 citations


Journal ArticleDOI
TL;DR: This Progress Report provides an update on recent developments in inkjet printing technology and its applications, which include organic thin-film transistors, light-emitting diodes, solar cells, conductive structures, memory devices, sensors, and biological/pharmaceutical tasks.
Abstract: In this Progress Report we provide an update on recent developments in inkjet printing technology and its applications, which include organic thin-film transistors, light-emitting diodes, solar cells, conductive structures, memory devices, sensors, and biological/pharmaceutical tasks. Various classes of materials and device types are in turn examined and an opinion is offered about the nature of the progress that has been achieved.

2,019 citations


Book
01 Jul 2010
TL;DR: This chapter discusses the history and Ecological Basis of Species' Distribution Modeling, and the design and implementation of species' distribution models.
Abstract: Maps of species' distributions or habitat suitability are required for many aspects of environmental research, resource management and conservation planning. These include biodiversity assessment, reserve design, habitat management and restoration, species and habitat conservation plans and predicting the effects of environmental change on species and ecosystems. The proliferation of methods and uncertainty regarding their effectiveness can be daunting to researchers, resource managers and conservation planners alike. Franklin summarises the methods used in species distribution modeling (also called niche modeling) and presents a framework for spatial prediction of species distributions based on the attributes (space, time, scale) of the data and questions being asked. The framework links theoretical ecological models of species distributions to spatial data on species and environment, and statistical models used for spatial prediction. The book provides practical guidelines for students, researchers and practitioners in a broad range of environmental sciences, including ecology, geography, conservation biology, and natural resources management.
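A minimal sketch of the kind of statistical species distribution model the book surveys: presence/absence records regressed on environmental predictors with a logistic model, then projected across an environmental grid. All data and predictor names here are synthetic and purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic presence/absence records with two environmental predictors.
rng = np.random.default_rng(3)
n = 400
temp = rng.uniform(0, 30, n)             # mean annual temperature (deg C)
precip = rng.uniform(200, 2000, n)       # annual precipitation (mm)
logit = -6 + 0.3 * temp + 0.002 * precip
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temp, precip])
sdm = LogisticRegression(max_iter=1000).fit(X, presence)

# Predict habitat suitability over a coarse environmental grid.
tt, pp = np.meshgrid(np.linspace(0, 30, 50), np.linspace(200, 2000, 50))
grid = np.column_stack([tt.ravel(), pp.ravel()])
suitability = sdm.predict_proba(grid)[:, 1].reshape(tt.shape)
```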

1,944 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that instead of focusing only on global efforts (which are indeed a necessary part of the longterm solution), it is better to encourage polycentric efforts to reduce the risks associated with the emission of greenhouse gases.
Abstract: The 20th anniversary issue of Global Environmental Change provides an important opportunity to examine the core questions involved in addressing "global environmental" problems, especially those related to climate change. Climate change is a global collective-action problem since all of us face the likelihood of extremely adverse outcomes that could be reduced if many participants take expensive actions. Conventional collective-action theory predicts that these problems will not be solved unless an external authority determines appropriate actions to be taken, monitors behavior, and imposes sanctions. Debate about global efforts to solve climate-change problems, however, has not yet led to an effective global treaty. Fortunately, many activities can be undertaken by multiple units at diverse scales that cumulatively make a difference. I argue that instead of focusing only on global efforts (which are indeed a necessary part of the long-term solution), it is better to encourage polycentric efforts to reduce the risks associated with the emission of greenhouse gases. Polycentric approaches facilitate achieving benefits at multiple scales as well as experimentation and learning from experience with diverse policies.

1,644 citations


Journal ArticleDOI
TL;DR: Given the significant, sustained growth in services experienced worldwide, Arizona State University's Center for Services Leadership embarked on an 18-month effort to identify and articulate a set of services leadership principles as mentioned in this paper.
Abstract: Given the significant, sustained growth in services experienced worldwide, Arizona State University’s Center for Services Leadership embarked on an 18-month effort to identify and articulate a set ...

1,513 citations


Journal ArticleDOI
10 Dec 2010-Science
TL;DR: Though the threat of extinction is increasing, overall declines would have been worse in the absence of conservation, and current conservation efforts remain insufficient to offset the main drivers of biodiversity loss in these groups.
Abstract: Using data for 25,780 species categorized on the International Union for Conservation of Nature Red List, we present an assessment of the status of the world's vertebrates. One-fifth of species are classified as Threatened, and we show that this figure is increasing: On average, 52 species of mammals, birds, and amphibians move one category closer to extinction each year. However, this overall pattern conceals the impact of conservation successes, and we show that the rate of deterioration would have been at least one-fifth again as much in the absence of these. Nonetheless, current conservation efforts remain insufficient to offset the main drivers of biodiversity loss in these groups: agricultural expansion, logging, overexploitation, and invasive alien species.

1,333 citations


Proceedings ArticleDOI
13 Jun 2010
TL;DR: The proposed method to learn an over-complete dictionary is based on extending the K-SVD algorithm by incorporating the classification error into the objective function, thus allowing the performance of a linear classifier and the representational power of the dictionary to be considered at the same time by the same optimization procedure.
Abstract: In a sparse-representation-based face recognition scheme, the desired dictionary should have good representational power (i.e., being able to span the subspace of all faces) while supporting optimal discrimination of the classes (i.e., different human subjects). We propose a method to learn an over-complete dictionary that attempts to simultaneously achieve the above two goals. The proposed method, discriminative K-SVD (D-KSVD), is based on extending the K-SVD algorithm by incorporating the classification error into the objective function, thus allowing the performance of a linear classifier and the representational power of the dictionary to be considered at the same time by the same optimization procedure. The D-KSVD algorithm finds the dictionary and solves for the classifier using a procedure derived from the K-SVD algorithm, which has proven efficiency and performance. This is in contrast to most existing work that relies on iteratively solving sub-problems with the hope of achieving the global optimum through iterative approximation. We evaluate the proposed method using two commonly-used face databases, the Extended YaleB database and the AR database, with detailed comparison to 3 alternative approaches, including the leading state-of-the-art in the literature. The experiments show that the proposed method outperforms these competing methods in most cases. Further, using the Fisher criterion and dictionary incoherence, we also show that the learned dictionary and the corresponding classifier are indeed better-posed to support sparse-representation-based recognition.
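A hedged sketch of the stacking idea described above, with scikit-learn's DictionaryLearning standing in for K-SVD (it is not the authors' algorithm): the one-hot label matrix is appended below the signals so the learned atoms split into a signal dictionary D and a linear classifier W. The sizes, gamma, and sparsity level are placeholders, and the renormalization of D (with corresponding rescaling of W) used in the actual method is omitted.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(4)

# Toy "face" data: 200 training signals of dimension 64, 4 classes.
n, d, n_classes, n_atoms, sparsity, gamma = 200, 64, 4, 40, 5, 1.0
Y = rng.normal(size=(d, n))                       # signals as columns
labels = rng.integers(0, n_classes, n)
H = np.zeros((n_classes, n))                      # one-hot label matrix
H[labels, np.arange(n)] = 1.0

# D-KSVD idea: learn a joint dictionary over the stacked matrix
# [Y; sqrt(gamma) * H], so that its atoms split into [D; W], i.e. a signal
# dictionary and a linear classifier produced by the same optimization.
stacked = np.vstack([Y, np.sqrt(gamma) * H])      # shape (d + n_classes, n)
learner = DictionaryLearning(n_components=n_atoms,
                             transform_algorithm="omp",
                             transform_n_nonzero_coefs=sparsity,
                             max_iter=20, random_state=0)
learner.fit(stacked.T)                            # sklearn expects samples as rows
joint = learner.components_.T                     # shape (d + n_classes, n_atoms)
D, W = joint[:d], joint[d:]

# Classify a test signal: sparse-code it over D, then score with W.
y_test = Y[:, 0]
code = orthogonal_mp(D, y_test, n_nonzero_coefs=sparsity)
predicted_class = int(np.argmax(W @ code))
```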

1,331 citations


Journal ArticleDOI
01 Nov 2010
TL;DR: This article explores the roadblocks and solutions to providing a trustworthy cloud computing environment and suggests a number of approaches that could be considered.
Abstract: Cloud computing is an evolving paradigm with tremendous momentum, but its unique aspects exacerbate security and privacy challenges. This article explores the roadblocks and solutions to providing a trustworthy cloud computing environment.

Journal ArticleDOI
TL;DR: The theoretical underpinnings of missing data analyses are explained, an overview of traditional missing data techniques is given, and accessible descriptions of maximum likelihood and multiple imputation are provided.

Posted Content
TL;DR: In this article, a large sample of U.S. firms for the period 1995-2008 was used to show that corporate tax avoidance is positively associated with firm-specific stock price crash risk, which is consistent with the following view: tax avoidance facilitates managerial rent extraction and bad news hoarding activities for extended periods by providing tools, masks, and justifications for these opportunistic behaviors.
Abstract: Using a large sample of U.S. firms for the period 1995-2008, we provide strong and robust evidence that corporate tax avoidance is positively associated with firm-specific stock price crash risk. This finding is consistent with the following view: Tax avoidance facilitates managerial rent extraction and bad news hoarding activities for extended periods by providing tools, masks, and justifications for these opportunistic behaviors. The hoarding and accumulation of bad news for extended periods lead to stock price crashes when the accumulated hidden bad news crosses a tipping point, and thus comes out all at once. Moreover, we show that the positive relation between tax avoidance and crash risk is attenuated when firms have strong external monitoring mechanisms such as high institutional ownership, high analyst coverage, and greater takeover threat from corporate control markets.

Journal ArticleDOI
TL;DR: In this article, the authors discuss the emerging research concerned with sustainable development and entrepreneurship, which is the focus of this special issue of the Journal of Business Venturing (JBEV).

Journal ArticleDOI
TL;DR: In this review, the distinction between effortful self-regulatory processes and those that are somewhat less voluntary is discussed, and literature on the former capacities is reviewed.
Abstract: The development of children's emotion-related self-regulation appears to be related to, and likely involved in, many aspects of children's development. In this review, the distinction between effortful self-regulatory processes and those that are somewhat less voluntary is discussed, and literature on the former capacities is reviewed. Emotion-related self-regulation develops rapidly in the early years of life and improves more slowly into adulthood. Individual differences in children's self-regulation are fairly stable after the first year or two of life. Such individual differences are inversely related to at least some types of externalizing problems. Findings for internalizing problems are less consistent and robust, although emotion-related self-regulation appears to be inversely related to internalizing problems after the early years. Self-regulatory capacities have been related to both genetic and environmental factors and their interaction. Some interventions designed to foster self-regulation and...

Journal ArticleDOI
13 May 2010-Nature
TL;DR: It is demonstrated that previously developed random walkers show elementary robotic behaviour when interacting with a precisely defined environment and single-molecule microscopy observations confirm that such walkers achieve directional movement by sensing and modifying tracks of substrate molecules laid out on a two-dimensional DNA origami landscape.
Abstract: Traditional robots rely for their function on computing, to store internal representations of their goals and environment and to coordinate sensing and any actuation of components required in response. Moving robotics to the single-molecule level is possible in principle, but requires facing the limited ability of individual molecules to store complex information and programs. One strategy to overcome this problem is to use systems that can obtain complex behaviour from the interaction of simple robots with their environment. A first step in this direction was the development of DNA walkers, which have developed from being non-autonomous, to being capable of directed but brief motion on one-dimensional tracks. Here we demonstrate that previously developed random walkers—so-called molecular spiders that comprise a streptavidin molecule as an inert ‘body’ and three deoxyribozymes as catalytic ‘legs’—show elementary robotic behaviour when interacting with a precisely defined environment. Single-molecule microscopy observations confirm that such walkers achieve directional movement by sensing and modifying tracks of substrate molecules laid out on a two-dimensional DNA origami landscape. When using appropriately designed DNA origami, the molecular spiders autonomously carry out sequences of actions such as ‘start’, ‘follow’, ‘turn’ and ‘stop’. We anticipate that this strategy will result in more complex robotic behaviour at the molecular level if additional control mechanisms are incorporated. One example might be interactions between multiple molecular robots leading to collective behaviour; another might be the ability to read and transform secondary cues on the DNA origami landscape as a means of implementing Turing-universal algorithmic behaviour.

Journal ArticleDOI
TL;DR: In this article, the authors conducted a pilot test of the PsyCap intervention (PCI) model with a randomized control group design, and conducted a follow-up study with a cross section of practicing managers to determine if following the training guidelines of the PCI caused the participants' performance to improve.
Abstract: Recently, theory and research have supported psychological capital (PsyCap) as an emerging core construct linked to positive outcomes at the individual and organizational level. However, to date, little attention has been given to PsyCap development through training interventions; nor have there been attempts to determine empirically if such PsyCap development has a causal impact on participants' performance. To fill these gaps we first conducted a pilot test of the PsyCap intervention (PCI) model with a randomized control group design. Next, we conducted a follow-up study with a cross section of practicing managers to determine if following the training guidelines of the PCI caused the participants' performance to improve. Results provide beginning empirical evidence that short training interventions such as PCI not only may be used to develop participants' psychological capital, but can also lead to an improvement in their on-the-job performance. The implications these findings have for human resource development and performance management conclude the article.

Journal ArticleDOI
15 Jan 2010-Science
TL;DR: Key findings include the identification of a functional DNA methylation tool kit; hymenopteran-specific genes including diverse venoms; lateral gene transfers among Pox viruses, Wolbachia, and Nasonia; and the rapid evolution of genes involved in nuclear-mitochondrial interactions that are implicated in speciation.
Abstract: We report here genome sequences and comparative analyses of three closely related parasitoid wasps: Nasonia vitripennis, N. giraulti, and N. longicornis. Parasitoids are important regulators of arthropod populations, including major agricultural pests and disease vectors, and Nasonia is an emerging genetic model, particularly for evolutionary and developmental genetics. Key findings include the identification of a functional DNA methylation tool kit; hymenopteran-specific genes including diverse venoms; lateral gene transfers among Pox viruses, Wolbachia, and Nasonia; and the rapid evolution of genes involved in nuclear-mitochondrial interactions that are implicated in speciation. Newly developed genome resources advance Nasonia for genetic research, accelerate mapping and cloning of quantitative trait loci, and will ultimately provide tools and knowledge for further increasing the utility of parasitoids as pest insect-control agents.

Journal ArticleDOI
TL;DR: This work revisits the idea of a motivational hierarchy in light of theoretical developments at the interface of evolutionary biology, anthropology, and psychology and proposes a renovated hierarchy of fundamental motives that serves as both an integrative framework and a generative foundation for future empirical research.
Abstract: Maslow’s pyramid of human needs, proposed in 1943, has been one of the most cognitively contagious ideas in the behavioral sciences. Anticipating later evolutionary views of human motivation and cognition, Maslow viewed human motives as based in innate and universal predispositions. We revisit the idea of a motivational hierarchy in light of theoretical developments at the interface of evolutionary biology, anthropology, and psychology. After considering motives at three different levels of analysis, we argue that the basic foundational structure of the pyramid is worth preserving, but that it should be buttressed with a few architectural extensions. By adding a contemporary design feature, connections between fundamental motives and immediate situational threats and opportunities should be highlighted. By incorporating a classical element, these connections can be strengthened by anchoring the hierarchy of human motives more firmly in the bedrock of modern evolutionary theory. We propose a renovated hierarchy of fundamental motives that serves as both an integrative framework and a generative foundation for future empirical research.

Journal ArticleDOI
TL;DR: Results revealed that commitment to the supervisor, self-efficacy, procedural justice climate, and service climate partially mediated the relationship between servant leadership and organizational citizenship behavior.
Abstract: This study tests the influence of servant leadership on 2 group climates, employee attitudes, and organizational citizenship behavior. Results from a sample of 815 employees and 123 immediate supervisors revealed that commitment to the supervisor, self-efficacy, procedural justice climate, and service climate partially mediated the relationship between servant leadership and organizational citizenship behavior. Cross-level interaction results revealed that procedural justice climate and positive service climate amplified the influence of commitment to the supervisor on organizational citizenship behavior. Implications of these results for theory and practice and directions for future research are discussed.

Journal ArticleDOI
TL;DR: In this paper, the experimental results on solar collectors based on nanofluids made from a variety of nanoparticles (carbon nanotubes, graphite, and silver) were reported.
Abstract: Solar energy is one of the best sources of renewable energy with minimal environmental impact. Direct absorption solar collectors have been proposed for a variety of applications such as water heating; however the efficiency of these collectors is limited by the absorption properties of the working fluid, which is very poor for typical fluids used in solar collectors. It has been shown that mixing nanoparticles in a liquid (nanofluid) has a dramatic effect on the liquid thermophysical properties such as thermal conductivity. Nanoparticles also offer the potential of improving the radiative properties of liquids, leading to an increase in the efficiency of direct absorption solar collectors. Here we report on the experimental results on solar collectors based on nanofluids made from a variety of nanoparticles (carbon nanotubes, graphite, and silver). We demonstrate efficiency improvements of up to 5% in solar thermal collectors by utilizing nanofluids as the absorption mechanism. In addition the experiment...
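For orientation, the standard instantaneous-efficiency energy balance for a collector, sketched with made-up numbers (this is not the paper's apparatus or data; only the ~5% gain quoted in the abstract comes from the source).

```python
# Instantaneous collector efficiency from a simple energy balance:
#   eta = m_dot * c_p * (T_out - T_in) / (G * A)
m_dot = 0.01               # working-fluid mass flow rate, kg/s (illustrative)
c_p = 4186.0               # specific heat of a water-based fluid, J/(kg K)
t_in, t_out = 25.0, 32.0   # inlet / outlet temperatures, deg C
G = 1000.0                 # incident solar irradiance, W/m^2
A = 0.5                    # collector aperture area, m^2

eta = m_dot * c_p * (t_out - t_in) / (G * A)
print(f"collector efficiency: {eta:.2%}")   # the abstract reports nanofluids
                                            # raising efficiency by up to 5%
```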

Journal ArticleDOI
10 Jun 2010-Neuron
TL;DR: It is reported that ultrasound triggers TTX-sensitive neuronal activity in the absence of a rise in brain temperature (<0.01 degrees C) and has a lateral spatial resolution of approximately 2 mm and does not require exogenous factors or surgical invasion.

Journal ArticleDOI
TL;DR: The Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO) as discussed by the authors, and the primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration.
Abstract: The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Journal ArticleDOI
TL;DR: In this paper, several diagnostics for the assessment of model misspecification due to spatial dependence and spatial heterogeneity are developed as an application of the Lagrange Multiplier principle.
Abstract: Several diagnostics for the assessment of model misspecification due to spatial dependence and spatial heterogeneity are developed as an application of the Lagrange Multiplier principle. The starting point is a general model which incorporates spatially lagged dependent variables, spatial residual autocorrelation and heteroskedasticity. Particular attention is given to tests for spatial residual autocorrelation in the presence of spatially lagged dependent variables and in the presence of heteroskedasticity. The tests are formally derived and illustrated in a number of simple empirical examples.
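A numpy sketch of the two core diagnostics described above in their standard Lagrange Multiplier forms (LM-error and LM-lag against an OLS baseline), each referred to a chi-squared distribution with one degree of freedom. This is a minimal illustration of the textbook formulas, not the author's code, and it omits the robust and heteroskedasticity variants.

```python
import numpy as np
from scipy.stats import chi2

def lm_spatial_diagnostics(y, X, W):
    """LM tests for spatial error and spatial lag dependence in an OLS model.

    y : (n,) response; X : (n, k) design matrix including a constant;
    W : (n, n) spatial weights matrix.
    """
    n, k = X.shape
    b = np.linalg.solve(X.T @ X, X.T @ y)          # OLS estimates
    e = y - X @ b                                  # residuals
    s2 = e @ e / n                                 # ML error variance
    T = np.trace(W.T @ W + W @ W)

    lm_err = (e @ W @ e / s2) ** 2 / T             # H0: no error autocorrelation

    M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
    Wxb = W @ X @ b
    D = (Wxb @ M @ Wxb) / s2 + T
    lm_lag = (e @ W @ y / s2) ** 2 / D             # H0: no spatially lagged y

    pval = lambda stat: chi2.sf(stat, df=1)        # both are chi-squared(1)
    return {"LM_error": (lm_err, pval(lm_err)),
            "LM_lag": (lm_lag, pval(lm_lag))}
```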

Journal ArticleDOI
TL;DR: In this article, the authors compare the properties of Moran's I and Lagrange multiplier tests for spatial dependence, that is, for both spatial error autocorrelation and for a spatially lagged dependent variable.
Abstract: Based on a large number of Monte Carlo simulation experiments on a regular lattice, we compare the properties of Moran's I and Lagrange multiplier tests for spatial dependence, that is, for both spatial error autocorrelation and for a spatially lagged dependent variable. We consider both bias and power of the tests for six sample sizes, ranging from twenty-five to 225 observations, for different structures of the spatial weights matrix, for several underlying error distributions, for misspecified weights matrices, and for the situation where boundary effects are present. The results provide an indication of the sample sizes for which the asymptotic properties of the tests can be considered to hold. They also illustrate the power of the Lagrange multiplier tests to distinguish between substantive spatial dependence (spatial lag) and spatial dependence as a nuisance (error autocorrelation).
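In the same spirit, a tiny Monte Carlo sketch that checks the empirical size of the standard LM-error statistic on a small regular lattice under the null of no spatial dependence. The lattice size, replication count, and data-generating values are arbitrary; this is not the paper's experimental design.

```python
import numpy as np
from scipy.stats import chi2

def rook_weights(side):
    """Row-standardized rook-contiguity weights for a side x side lattice."""
    n = side * side
    W = np.zeros((n, n))
    for r in range(side):
        for c in range(side):
            i = r * side + c
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < side and 0 <= cc < side:
                    W[i, rr * side + cc] = 1.0
    return W / W.sum(axis=1, keepdims=True)

def lm_error(y, X, W):
    n = len(y)
    e = y - X @ np.linalg.solve(X.T @ X, X.T @ y)  # OLS residuals
    s2 = e @ e / n
    T = np.trace(W.T @ W + W @ W)
    return (e @ W @ e / s2) ** 2 / T

rng = np.random.default_rng(5)
side, reps = 7, 1000                               # 49 observations per replication
W = rook_weights(side)
X = np.column_stack([np.ones(side * side), rng.normal(size=side * side)])
beta = np.array([1.0, 0.5])
crit = chi2.ppf(0.95, df=1)

rejections = sum(lm_error(X @ beta + rng.normal(size=side * side), X, W) > crit
                 for _ in range(reps))
print("empirical size at nominal 5%:", rejections / reps)
```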

Journal ArticleDOI
TL;DR: Given emerging knowledge of how plant nutrients respond to environmental variables and are connected to size, the effects of global change factors can be better understood.
Abstract: Biological stoichiometry theory considers the balance of multiple chemical elements in living systems, whereas metabolic scaling theory considers how size affects metabolic properties from cells to ecosystems. We review recent developments integrating biological stoichiometry and metabolic scaling theories in the context of plant ecology and global change. Although vascular plants exhibit wide variation in foliar carbon:nitrogen:phosphorus ratios, they exhibit a higher degree of 'stoichiometric homeostasis' than previously appreciated. Thus, terrestrial carbon:nitrogen:phosphorus stoichiometry will reflect the effects of adjustment to local growth conditions as well as species' replacements. Plant stoichiometry exhibits size scaling, as foliar nutrient concentration decreases with increasing plant size, especially for phosphorus. Thus, small plants have lower nitrogen:phosphorus ratios. Furthermore, foliar nutrient concentration is reflected in other tissues (root, reproductive, support), permitting the development of empirical models of production that scale from tissue to whole-plant levels. Plant stoichiometry exhibits large-scale macroecological patterns, including stronger latitudinal trends and environmental correlations for phosphorus concentration (relative to nitrogen) and a positive correlation between nutrient concentrations and geographic range size. Given this emerging knowledge of how plant nutrients respond to environmental variables and are connected to size, the effects of global change factors (such as carbon dioxide, temperature, nitrogen deposition) can be better understood.
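A small sketch of how a size-scaling relationship of the kind described (nutrient concentration declining with plant size) can be quantified as a power law fitted on log-log axes. The data and the exponent are synthetic and purely illustrative.

```python
import numpy as np

# Illustrative power-law scaling fit, nutrient ~ a * mass^b, on log-log axes
# (synthetic data; the exponent is made up for the example).
rng = np.random.default_rng(6)
mass = 10 ** rng.uniform(-1, 3, 200)                             # plant mass, g
p_conc = 8.0 * mass ** -0.15 * np.exp(rng.normal(0, 0.1, 200))   # foliar P, mg/g

slope, intercept = np.polyfit(np.log10(mass), np.log10(p_conc), 1)
print(f"fitted scaling exponent b = {slope:.2f} (true value here: -0.15)")
```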

Journal ArticleDOI
TL;DR: It is suggested that the critical threshold for N-induced species loss in mature Eurasian grasslands is below 1.75 g N m⁻² yr⁻¹, and that changes in aboveground biomass, species richness, and plant functional group composition in both mature and degraded ecosystems saturate at N addition rates of approximately 10.5 g N m⁻² yr⁻¹.
Abstract: Nitrogen (N) deposition is widely considered an environmental problem that leads to biodiversity loss and reduced ecosystem resilience, but N fertilization has also been used as a management tool for enhancing primary production and ground cover, thereby promoting the restoration of degraded lands. However, empirical evaluation of these contrasting impacts is lacking. We tested the dual effects of N enrichment on biodiversity and ecosystem functioning at different organizational levels (i.e., plant species, functional groups, and community) by adding N at 0, 1.75, 5.25, 10.5, 17.5, and 28.0 g N m⁻² yr⁻¹ for four years in two contrasting field sites in Inner Mongolia: an undisturbed mature grassland and a nearby degraded grassland of the same type. N addition had both quantitatively and qualitatively different effects on the two communities. In the mature community, N addition led to a large reduction in species richness, accompanied by increased dominance of early successional annuals and loss of perennial grasses and forbs at all N input rates. In the degraded community, however, N addition increased the productivity and dominance of perennial rhizomatous grasses, with only a slight reduction in species richness and no significant change in annual abundance. The mature grassland was much more sensitive to N-induced changes in community structure, likely as a result of higher soil moisture accentuating limitation by N alone. Our findings suggest that the critical threshold for N-induced species loss in mature Eurasian grasslands is below 1.75 g N m⁻² yr⁻¹, and that changes in aboveground biomass, species richness, and plant functional group composition in both mature and degraded ecosystems saturate at N addition rates of approximately 10.5 g N m⁻² yr⁻¹. This work highlights the tradeoffs that exist in assessing the total impact of N deposition on ecosystem function.

Journal ArticleDOI
TL;DR: The various kinds of symbols used to characterize the topology of vertices in 3-periodic nets, tiles and polyhedra, and symbols for tilings are reviewed, making a recommendation for uniform nomenclature where there is some confusion and misapplication of terminology.
Abstract: We review the various kinds of symbols used to characterize the topology of vertices in 3-periodic nets, tiles and polyhedra, and symbols for tilings, making a recommendation for uniform nomenclature where there is some confusion and misapplication of terminology.

Journal ArticleDOI
TL;DR: Ongoing efforts to steer human society toward resource conservation and sustainable consumption are discussed, including the concept of the 5 Rs (reduce, reuse, recycle, rethink, restrain) for minimizing pre- and postnatal exposures to potentially harmful components of plastics.
Abstract: By 2010, the worldwide annual production of plastics will surpass 300 million tons. Plastics are indispensable materials in modern society, and many products manufactured from plastics are a boon to public health (e.g., disposable syringes, intravenous bags). However, plastics also pose health risks. Of principal concern are endocrine-disrupting properties, as triggered for example by bisphenol A and di-(2-ethylhexyl) phthalate (DEHP). Opinions on the safety of plastics vary widely, and despite more than five decades of research, scientific consensus on product safety is still elusive. This literature review summarizes information from more than 120 peer-reviewed publications on health effects of plastics and plasticizers in lab animals and humans. It examines problematic exposures of susceptible populations and also briefly summarizes adverse environmental impacts from plastic pollution. Ongoing efforts to steer human society toward resource conservation and sustainable consumption are discussed, includi...

Journal ArticleDOI
TL;DR: Modules for Experiments in Stellar Astrophysics (MESA) as mentioned in this paper is a suite of open source libraries for a wide range of applications in computational stellar astrophysics, including advanced evolutionary phases.
Abstract: Stellar physics and evolution calculations enable a broad range of research in astrophysics. Modules for Experiments in Stellar Astrophysics (MESA) is a suite of open source libraries for a wide range of applications in computational stellar astrophysics. A newly designed 1-D stellar evolution module, MESA star, combines many of the numerical and physics modules for simulations of a wide range of stellar evolution scenarios ranging from very-low mass to massive stars, including advanced evolutionary phases. MESA star solves the fully coupled structure and composition equations simultaneously. It uses adaptive mesh refinement and sophisticated timestep controls, and supports shared memory parallelism based on OpenMP. Independently usable modules provide equation of state, opacity, nuclear reaction rates, and atmosphere boundary conditions. Each module is constructed as a separate Fortran 95 library with its own public interface. Examples include comparisons to other codes and show evolutionary tracks of very low mass stars, brown dwarfs, and gas giant planets; the complete evolution of a 1 Msun star from the pre-main sequence to a cooling white dwarf; the Solar sound speed profile; the evolution of intermediate mass stars through the thermal pulses on the He-shell burning AGB phase; the interior structure of slowly pulsating B Stars and Beta Cepheids; evolutionary tracks of massive stars from the pre-main sequence to the onset of core collapse; stars undergoing Roche lobe overflow; and accretion onto a neutron star. Instructions for downloading and installing MESA can be found on the project web site (this http URL).
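A common way to inspect MESA star output is to load the plain-text history file with numpy and plot the evolutionary track. The file path, the header layout (five header lines before the row of column names), and the column names log_Teff and log_L used below are assumptions based on MESA's default history output, not something specified in the abstract.

```python
import numpy as np
import matplotlib.pyplot as plt

# Assumes MESA's default history format: a few header lines, a row of column
# names (e.g. log_Teff, log_L), then the data table.
hist = np.genfromtxt("LOGS/history.data", skip_header=5, names=True)

fig, ax = plt.subplots()
ax.plot(hist["log_Teff"], hist["log_L"])   # Hertzsprung-Russell track
ax.invert_xaxis()                          # hotter stars on the left, by convention
ax.set_xlabel("log Teff [K]")
ax.set_ylabel("log L / Lsun")
plt.show()
```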