
Showing papers by "Georgia Institute of Technology" published in 2002


Journal ArticleDOI
TL;DR: The concept of sensor networks, which has been made viable by the convergence of micro-electro-mechanical systems technology, wireless communications, and digital electronics, is described.

17,936 citations


Journal ArticleDOI
TL;DR: The current state of the art of sensor networks is captured in this article, where solutions are discussed under their related protocol stack layer sections.
Abstract: The advancement in wireless communications and electronics has enabled the development of low-cost sensor networks. The sensor networks can be used for various application areas (e.g., health, military, home). For different application areas, there are different technical issues that researchers are currently resolving. The current state of the art of sensor networks is captured in this article, where solutions are discussed under their related protocol stack layer sections. This article also points out the open research issues and intends to spark new interests and developments in this field.

14,048 citations


Journal ArticleDOI
02 Aug 2002-Science
TL;DR: Many potential applications have been proposed for carbon nanotubes, including conductive and high-strength composites; energy storage and energy conversion devices; sensors; field emission displays and radiation sources; hydrogen storage media; and nanometer-sized semiconductor devices, probes, and interconnects.
Abstract: Many potential applications have been proposed for carbon nanotubes, including conductive and high-strength composites; energy storage and energy conversion devices; sensors; field emission displays and radiation sources; hydrogen storage media; and nanometer-sized semiconductor devices, probes, and interconnects. Some of these applications are now realized in products. Others are demonstrated in early to advanced devices, and one, hydrogen storage, is clouded by controversy. Nanotube cost, polydispersity in nanotube type, and limitations in processing and assembly methods are important barriers for some applications of single-walled nanotubes.

9,693 citations


Journal ArticleDOI
TL;DR: In this article, the authors used the Total Ozone Mapping Spectrometer (TOMS) sensor on the Nimbus 7 satellite to map the global distribution of major atmospheric dust sources with the goal of identifying common environmental characteristics.
Abstract: We use the Total Ozone Mapping Spectrometer (TOMS) sensor on the Nimbus 7 satellite to map the global distribution of major atmospheric dust sources with the goal of identifying common environmental characteristics. The largest and most persistent sources are located in the Northern Hemisphere, mainly in a broad “dust belt” that extends from the west coast of North Africa, over the Middle East, Central and South Asia, to China. There is remarkably little large-scale dust activity outside this region. In particular, the Southern Hemisphere is devoid of major dust activity. Dust sources, regardless of size or strength, can usually be associated with topographical lows located in arid regions with annual rainfall under 200–250 mm. Although the source regions themselves are arid or hyperarid, the action of water is evident from the presence of ephemeral streams, rivers, lakes, and playas. Most major sources have been intermittently flooded through the Quaternary, as evidenced by deep alluvial deposits. Many sources are associated with areas where human impacts are well documented, e.g., the Caspian and Aral Seas, the Tigris-Euphrates River Basin, southwestern North America, and the loess lands in China. Nonetheless, the largest and most active sources are located in truly remote areas where there is little or no human activity. Thus, on a global scale, dust mobilization appears to be dominated by natural sources. Dust activity is extremely sensitive to many environmental parameters. The identification of major sources will enable us to focus on critical regions and to characterize emission rates in response to environmental conditions. With such knowledge we will be better able to improve global dust models and to assess the effects of climate change on emissions in the future. It will also facilitate the interpretation of the paleoclimate record based on dust contained in ocean sediments and ice cores.

2,653 citations


Journal ArticleDOI
TL;DR: This article showed that individual differences in working memory capacity are reflected in performance on antisaccade, Stroop, and dichotic listening tasks, and that WM capacity, or executive attention, is most important under conditions in which interference leads to retrieval of response tendencies that conflict with the current task.
Abstract: Performance on measures of working memory (WM) capacity predicts performance on a wide range of real-world cognitive tasks. I review the idea that WM capacity (a) is separable from short-term memory, (b) is an important component of general fluid intelligence, and (c) represents a domain-free limitation in ability to control attention. Studies show that individual differences in WM capacity are reflected in performance on antisaccade, Stroop, and dichotic-listening tasks. WM capacity, or executive attention, is most important under conditions in which interference leads to retrieval of response tendencies that conflict with the current task.

2,208 citations


Journal ArticleDOI
TL;DR: Although the dorsolateral PFC is but one critical structure in a network of anterior and posterior “attention control” areas, it does have a unique executive-attention role in actively maintaining access to stimulus representations and goals in interference-rich contexts.
Abstract: We provide an “executive-attention” framework for organizing the cognitive neuroscience research on the constructs of working-memory capacity (WMC), general fluid intelligence, and prefrontal cortex (PFC) function. Rather than provide a novel theory of PFC function, we synthesize a wealth of single-cell, brain-imaging, and neuropsychological research through the lens of our theory of normal individual differences in WMC and attention control (Engle, Kane, & Tuholski, 1999; Engle, Tuholski, Laughlin, & Conway, 1999). Our critical review confirms the prevalent view that dorsolateral PFC circuitry is critical to executive-attention functions. Moreover, although the dorsolateral PFC is but one critical structure in a network of anterior and posterior “attention control” areas, it does have a unique executive-attention role in actively maintaining access to stimulus representations and goals in interference-rich contexts. Our review suggests the utility of an executive-attention framework for guiding future research on both PFC function and cognitive control.

2,075 citations


Journal ArticleDOI
TL;DR: This work has shown how the emission wavelength of quantum-dot nanocrystals can be continuously tuned by changing the particle size, and a single light source can be used for simultaneous excitation of all different-sized dots.

2,066 citations


Journal ArticleDOI
TL;DR: A Monte Carlo simulation-based approach to stochastic discrete optimization problems, where a random sample is generated and the expected value function is approximated by the corresponding sample average function.
Abstract: In this paper we study a Monte Carlo simulation-based approach to stochastic discrete optimization problems. The basic idea of such methods is that a random sample is generated and the expected value function is approximated by the corresponding sample average function. The obtained sample average optimization problem is solved, and the procedure is repeated several times until a stopping criterion is satisfied. We discuss convergence rates, stopping rules, and computational complexity of this procedure and present a numerical example for the stochastic knapsack problem.
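
A minimal sketch of the sample average approximation idea summarized above, applied to a 0/1 stochastic knapsack: each replication draws a sample of the random item values, replaces the expected value with the sample average, and solves the resulting deterministic knapsack by dynamic programming. The distributions, sample sizes, and function names below are illustrative assumptions, not the paper's setup.

```python
import random

def knapsack_value(values, weights, capacity):
    """Best total value of a 0/1 knapsack with deterministic item values."""
    best = [0.0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

def saa_replication(mean_values, weights, capacity, sample_size):
    """One replication: sample the random values, average, solve the sample-average problem."""
    avg = [sum(random.expovariate(1.0 / m) for _ in range(sample_size)) / sample_size
           for m in mean_values]
    return knapsack_value(avg, weights, capacity)

# Repeating the replication and examining the spread of objective values is a
# simple illustrative stand-in for the stopping criterion mentioned above.
estimates = [saa_replication([10, 7, 4, 9], [5, 3, 2, 4], 8, 200) for _ in range(5)]
print(estimates)
```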

1,728 citations


Journal ArticleDOI
TL;DR: It is concluded that efforts to connect behavioral and brain data yield a more complete understanding of the aging mind; the study also finds little evidence for dedifferentiation of function at the behavioral level in old compared with young adults.
Abstract: The authors investigated the distinctiveness and interrelationships among visuospatial and verbal memory processes in short-term, working, and long-term memories in 345 adults. Beginning in the 20s, a continuous, regular decline occurs for processing-intensive tasks (e.g., speed of processing, working memory, and long-term memory), whereas verbal knowledge increases across the life span. There is little differentiation in the cognitive architecture of memory across the life span. Visuospatial and verbal working memory are distinct but highly interrelated systems with domain-specific short-term memory subsystems. In contrast to recent neuroimaging data, there is little evidence for dedifferentiation of function at the behavioral level in old compared with young adults. The authors conclude that efforts to connect behavioral and brain data yield a more complete understanding of the aging mind.

1,614 citations


Journal ArticleDOI
28 Nov 2002-Nature
TL;DR: The fabrication of exchange-coupled nanocomposites using nanoparticle self-assembly with an energy product that exceeds the theoretical limit of 13 MG Oe for non-exchange-coupled isotropic FePt by over 50 per cent is reported.
Abstract: Exchange-spring magnets are nanocomposites that are composed of magnetically hard and soft phases that interact by magnetic exchange coupling. Such systems are promising for advanced permanent magnetic applications, as they have a large energy product (the combination of permanent magnet field and magnetization) compared to traditional, single-phase materials. Conventional techniques, including melt-spinning, mechanical milling and sputtering, have been explored to prepare exchange-spring magnets. However, the requirement that both the hard and soft phases are controlled at the nanometre scale, to ensure efficient exchange coupling, has posed significant preparation challenges. Here we report the fabrication of exchange-coupled nanocomposites using nanoparticle self-assembly. In this approach, both FePt and Fe3O4 particles are incorporated as nanometre-scale building blocks into binary assemblies. Subsequent annealing converts the assembly into FePt-Fe3Pt nanocomposites, where FePt is a magnetically hard phase and Fe3Pt a soft phase. An optimum exchange coupling, and therefore an optimum energy product, can be obtained by independently tuning the size and composition of the individual building blocks. We have produced exchange-coupled isotropic FePt-Fe3Pt nanocomposites with an energy product of 20.1 MG Oe, which exceeds the theoretical limit of 13 MG Oe for non-exchange-coupled isotropic FePt by over 50 per cent.
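
As a quick arithmetic check on the closing figure: (20.1 - 13) / 13 ≈ 0.55, so the reported 20.1 MG Oe energy product exceeds the 13 MG Oe limit for non-exchange-coupled isotropic FePt by roughly 55%, consistent with the "over 50 per cent" stated above.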

1,483 citations


Book
01 Jan 2002
TL;DR: This book surveys the field of human-computer interaction, opening with Jonathan Grudin's introduction, “A Moving Target: The Evolution of Human-Computer Interaction,” followed by chapters spanning human factors, input and output technologies, design methods, evaluation, and application domains.
Abstract: Foreword by Ben Shneiderman Introduction: A Moving Target: The Evolution of Human-Computer Interaction, Jonathan Grudin Humans in HCI Perceptual-Motor Interaction: Some Implications for Human-Computer Interaction, Timothy N. Welsh, Sanjay Chandrasekharan, Matthew Ray, Heather Neyedli, Romeo Chua, and Daniel J. Weeks Human Information Processing: An Overview for Human-Computer Interaction, Robert W. Proctor and Kim-Phuong L. Vu Mental Models in Human-Computer Interaction, Stephen J. Payne Task Loading and Stress in Human-Computer Interaction: Theoretical Frameworks and Mitigation Strategies, James L. Szalma, Gabriella M. Hancock, and Peter A. Hancock Choices and Decisions of Computer Users, Anthony Jameson Computers in HCI Input Technologies and Techniques, Ken Hinckley and Daniel Wigdor Sensor- and Recognition-Based Input for Interaction, Andrew D. Wilson Visual Displays, Christopher M. Schlick, Carsten Winkelholz, Martina Ziefle, and Alexander Mertens Haptic Interface, Hiroo Iwata Nonspeech Auditory and Crossmodal Output, Eve Hoggan and Stephen Brewster Network-Based Interaction, Alan Dix Wearable Computers, Daniel Siewiorek, Asim Smailagic, and Thad Starner Design of Fixed, Portable, and Mobile Information Devices, Michael J. Smith and Pascale Carayon Designing Human-Computer Interactions Visual Design Principles for Usable Interfaces: Everything Is Designed: Why We Should Think before Doing, Suzanne Watzman and Margaret Re Globalization, Localization, and Cross-Cultural User-Interface Design, Aaron Marcus and Emilie W. Gould Speech and Language Interfaces, Applications, and Technologies, Clare-Marie Karat, Jennifer Lai, Osamuyimen Stewart, and Nicole Yankelovich Multimedia User Interface Design, Alistair Sutcliffe Multimodal Interfaces, Sharon Oviatt Systems That Adapt to Their Users, Anthony Jameson and Krzysztof Z. Gajos Mobile Interaction Design in the Age of Experience Ecosystems, Marco Susani Tangible User Interfaces, Hiroshi Ishii and Brygg Ullmer Achieving Psychological Simplicity: Measures and Methods to Reduce Cognitive Complexity, John C. Thomas and John T. Richards Information Visualization, Stuart Card Collaboration Technologies, Gary M. Olson and Judith S. Olson Human-Computer Interaction and the Web, Helen Ashman, Declan Dagger, Tim Brailsford, James Goulding, Declan O'Sullivan, Jan-Felix Schmakeit, and Vincent Wade Human-Centered Design of Decision-Support Systems, Philip J. Smith, Roger Beatty, Caroline C. Hayes, Adam Larson, Norman D. Geddes, and Michael C. Dorneich Online Communities, Panayiotis Zaphiris, Chee Siang Ang, and Andrew Laghos Virtual Environments, Kay M. Stanney and Joseph V. Cohn Privacy, Security, and Trust: Human-Computer Interaction Challenges and Opportunities at Their Intersection, John Karat, Clare-Marie Karat, and Carolyn Brodie Application-/Domain-Specific Design Human-Computer Interaction in Health Care, Francois Sainfort, Julie A. Jacko, Molly A. McClellan, and Paula J. Edwards Why We Play: Affect and the Fun of Games-Designing Emotions for Games, Entertainment Interfaces, and Interactive Products, Nicole Lazzaro Motor Vehicle-Driver Interfaces, Paul A. Green Human-Computer Interaction in Aerospace, Steven J. Landry User-Centered Design in Games Randy J. Pagulayan, Kevin Keeker, Thomas Fuller, Dennis Wixon, Ramon L. Romero, and Daniel V. Gunn Designing for Diversity Older Adults and Information Technology: Opportunities and Challenges, Sara J. 
Czaja, and Chin Chin Lee Human-Computer Interaction for Kids, Amy Bruckman, Alisa Bandlow, Jill Dimond, and Andrea Forte Information Technology for Communication and Cognitive Support, Alan F. Newell, Alex Carmichael, Peter Gregor, Norman Alm, Annalu Waller, Vicki L. Hanson, Graham Pullin, and Jesse Hoey Perceptual Impairments: New Advancements Promoting Technological Access, Julie A. Jacko, V. Kathlene Leonard, Molly A. McClellan, and Ingrid U. Scott Universal Accessibility and Low-Literacy Populations: Implications for Human-Computer Interaction Design and Research Methods, William M. Gribbons Computing Technologies for Deaf and Hard of Hearing Users, Vicki L. Hanson The Development Process Section A Requirements Specification User Experience Requirements Analysis within the Usability Engineering Lifecycle, Deborah J. Mayhew and Todd J. Follansbee Task Analysis, Catherine Courage, Jhilmil Jain, Janice (Ginny) Redish, and Dennis Wixon Contextual Design, Karen Holtzblatt Grounded Theory Method in Human-Computer Interaction and Computer-Supported Cooperative Work, Michael J. Muller and Sandra Kogan An Ethnographic Approach to Design, Jeanette Blomberg and Mark Burrell Section B Design and Development Putting Personas to Work: Employing User Personas to Focus Product Planning, Design, and Development, John Pruitt and Tamara Adlin Prototyping Tools and Techniques, Michel Beaudouin-Lafon and Wendy E. Mackay Scenario-Based Design, Mary Beth Rosson and John M. Carroll Participatory Design: The Third Space in Human-Computer Interaction, Michael J. Muller and Allison Druin Unified User Interface Development: A Software Refactoring Perspective, Anthony Savidis and Constantine Stephanidis Usability + Persuasiveness + Graphic Design = eCommerce User Experience, Deborah J. Mayhew Human-Computer Interaction and Software Engineering for User Interface Plasticity, Joelle Coutaz and Gaelle Calvary Section C Testing, Evaluation, and Technology Transfer Usability Testing, Joseph S. Dumas and Jean E. Fox Usability for Engaged Users: The Naturalistic Approach to Evaluation, David Siegel Survey Design and Implementation in HCI, A. Ant Ozok Inspection-Based Evaluations, Gilbert Cockton, Alan Woolrych, Kasper Hornbaek, and Erik Frokjaer Model-Based Evaluation, David Kieras Spreadsheet Tool for Simple Cost-Benefit Analyses of User Experience Engineering, Deborah J. Mayhew Technology Transfer, Kevin M. Schofield Emerging Phenomena in HCI Augmenting Cognition in HCI: Twenty-First Century Adaptive System Science and Technology, Kelly S. Hale, Kay M. Stanney, and Dylan D. Schmorrow Social Networks and Social Media, Molly A. McClellan, Julie A. Jacko, Francois Sainfort, and Layne M. Johnson Human-Computer Interaction for Development: Changing Human-Computer Interaction to Change the World, Susan M. Dray, Ann Light, Andrew M. Dearden, Vanessa Evers, Melissa Densmore, Divya Ramachandran, Matthew Kam, Gary Marsden, Nithya Sambasivan, Thomas Smyth, Darelle van Greunen, and Niall Winters Author Index Subject Index

Journal ArticleDOI
TL;DR: In this article, the binding energies of the benzene dimer were investigated at the second-order Moller-Plesset perturbation theory (MP2) level, and it was shown that more modest basis sets such as aug-cc-pVDZ are sufficient for geometry optimizations of intermolecular parameters.
Abstract: State-of-the-art electronic structure methods have been applied to the simplest prototype of aromatic π-π interactions, the benzene dimer. By comparison to results with a large aug-cc-pVTZ basis set, we demonstrate that more modest basis sets such as aug-cc-pVDZ are sufficient for geometry optimizations of intermolecular parameters at the second-order Moller-Plesset perturbation theory (MP2) level. However, basis sets even larger than aug-cc-pVTZ are important for accurate binding energies. The complete basis set MP2 binding energies, estimated by explicitly correlated MP2-R12/A techniques, are significantly larger in magnitude than previous estimates. When corrected for higher-order correlation effects via coupled cluster with singles, doubles, and perturbative triples [CCSD(T)], the binding energies De (D0) for the sandwich, T-shaped, and parallel-displaced configurations are found to be 1.8 (2.0), 2.7 (2.4), and 2.8 (2.7) kcal mol⁻¹, respectively.

Proceedings ArticleDOI
19 May 2002
TL;DR: This paper presents a new technique that uses color to visually map the participation of each program statement in the outcome of executing the program with a test suite consisting of both passed and failed test cases.
Abstract: One of the most expensive and time-consuming components of the debugging process is locating the errors or faults. To locate faults, developers must identify statements involved in failures and select suspicious statements that might contain faults. This paper presents a new technique that uses visualization to assist with these tasks. The technique uses color to visually map the participation of each program statement in the outcome of the execution of the program with a test suite, consisting of both passed and failed test cases. Based on this visual mapping, a user can inspect the statements in the program, identify statements involved in failures, and locate potentially faulty statements. The paper also describes a prototype tool that implements our technique along with a set of empirical studies that use the tool for evaluation of the technique. The empirical studies show that, for the subject we studied, the technique can be effective in helping a user locate faults in a program.
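
The abstract does not give an exact formula, so the following is only a hedged sketch of a passed/failed color mapping in the spirit described: a statement executed mostly by failing tests is shaded toward "suspicious" (red), one executed mostly by passing tests toward green. The function name and scaling are illustrative assumptions, not necessarily the paper's.

```python
def statement_hue(passed_cover, failed_cover, total_passed, total_failed):
    """Hue in [0, 1]: 1.0 ~ executed only by passing tests, 0.0 ~ only by failing ones."""
    p = passed_cover / total_passed if total_passed else 0.0
    f = failed_cover / total_failed if total_failed else 0.0
    if p + f == 0.0:
        return None  # statement never executed by the test suite
    return p / (p + f)

# A statement covered by 2 of 10 passing tests and 3 of 3 failing tests gets a
# low hue, i.e. it would be rendered as highly suspicious.
print(statement_hue(2, 3, 10, 3))  # ~0.17
```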

Book ChapterDOI
01 Jan 2002
TL;DR: A number of theories and frameworks have emerged that try to address both the potentials and limitations of effective cognitive and social functioning during the adult years (e.g., Baltes et al. as discussed by the authors).
Abstract: In recent years, a number of theories and frameworks have emerged that try to address both the potentials and limitations of effective cognitive and social functioning during the adult years (e.g., Baltes & Staudinger, 2000; Baltes, Staudinger, & Lindenberger, 1999; Hoyer & Rybash, 1994; Lemme, 1999; Rowe & Kahn, 1997; Seligman & Csikszentmihalyi, 2000; Staudinger & Pasupathi, 2000; Wapner & Demick, this volume). Such frameworks have aided the articulation of the characteristics of adult development by integrating observations that would otherwise have been disconnected pieces of a puzzle and less meaningful. In the study of adult cognitive development, for example, much of the available data and theory address the description and explanation of age-related cognitive decline. However, everyday observations and the results from some studies suggest that there are improvements or stability as well as declines in cognitive function during the adult years. The aim of the present chapter is to describe what is known about the potentials as well as limits of learning during the adult years.

Journal ArticleDOI
TL;DR: This review surveys major classes of covalent inhibitors, including vinyl sulfones and other Michael acceptors, acylating agents (aza-peptides, carbamates, peptidyl acyl hydroxamates, β-lactams, and heterocyclic inhibitors such as isocoumarins, benzoxazinones, and saccharins), phosphonylating agents such as peptide phosphonates and phosphonyl fluorides, and sulfonylating agents such as sulfonyl fluorides.
Abstract:
F. Vinyl Sulfones and Other Michael Acceptors 4683
G. Azodicarboxamides 4695
IV. Acylating Agents 4695
A. Aza-peptides 4695
B. Carbamates 4699
C. Peptidyl Acyl Hydroxamates 4700
D. β-Lactams and Related Inhibitors 4704
E. Heterocyclic Inhibitors 4714
1. Isocoumarins 4715
2. Benzoxazinones 4722
3. Saccharins 4725
4. Miscellaneous Heterocyclic Inhibitors 4728
V. Phosphonylation Agents 4728
A. Peptide Phosphonates 4728
B. Phosphonyl Fluorides 4734
VI. Sulfonylating Agents 4735
A. Sulfonyl Fluorides 4735
VII. Miscellaneous Inhibitors 4736
VIII. Summary and Perspectives 4737
IX. Acknowledgments 4740
X. Note Added in Proof 4740
XI. References 4740

Journal ArticleDOI
TL;DR: This research effort seeks to examine the theoretical foundation of nonparametric regression and to answer the question of whether nonparametric regression based on heuristically improved forecast generation methods approaches the single interval traffic flow prediction performance of seasonal ARIMA models.
Abstract: Single point short-term traffic flow forecasting will play a key role in supporting demand forecasts needed by operational network models. Seasonal autoregressive integrated moving average (ARIMA), a classic parametric modeling approach to time series, and nonparametric regression models have been proposed as well suited for application to single point short-term traffic flow forecasting. Past research has shown seasonal ARIMA models to deliver results that are statistically superior to basic implementations of nonparametric regression. However, the advantages associated with a data-driven nonparametric forecasting approach motivate further investigation of refined nonparametric forecasting methods. Following this motivation, this research effort seeks to examine the theoretical foundation of nonparametric regression and to answer the question of whether nonparametric regression based on heuristically improved forecast generation methods approach the single interval traffic flow prediction performance of seasonal ARIMA models.
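
To make the data-driven alternative concrete, here is a minimal k-nearest-neighbor nonparametric regression forecaster of the general kind the abstract contrasts with seasonal ARIMA. The state definition (the last `lag` observations), the Euclidean distance, and the values of `lag` and `k` are illustrative assumptions, not the refined forecast generation methods studied in the paper.

```python
import math

def knn_forecast(history, lag, k):
    """Forecast the next flow value from the k nearest lag-length patterns in the history."""
    state = history[-lag:]                     # current "state" of the series
    candidates = []
    for t in range(lag, len(history)):
        past = history[t - lag:t]              # an earlier state ...
        follower = history[t]                  # ... and the value that followed it
        candidates.append((math.dist(state, past), follower))
    candidates.sort(key=lambda c: c[0])
    neighbors = candidates[:k]
    return sum(v for _, v in neighbors) / len(neighbors)

flows = [320, 340, 360, 400, 380, 350, 330, 345, 365, 405, 385, 355]
print(knn_forecast(flows, lag=3, k=2))
```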

Journal ArticleDOI
TL;DR: In this paper, a model describing the maximum clock frequency distribution of a microprocessor is derived and compared with wafer sort data for a recent 0.25-μm microprocessor.
Abstract: A model describing the maximum clock frequency (FMAX) distribution of a microprocessor is derived and compared with wafer sort data for a recent 0.25-μm microprocessor. The model agrees closely with measured data in mean, variance, and shape. Results demonstrate that within-die fluctuations primarily impact the FMAX mean and die-to-die fluctuations determine the majority of the FMAX variance. Employing rigorously derived device and circuit models, the impact of die-to-die and within-die parameter fluctuations on future FMAX distributions is forecast for the 180, 130, 100, 70, and 50-nm technology generations. Model predictions reveal that systematic within-die fluctuations impose the largest performance degradation resulting from parameter fluctuations. Assuming a 3σ channel length deviation of 20%, projections for the 50-nm technology generation indicate that essentially a generation of performance gain can be lost due to systematic within-die fluctuations. Key insights from this work elucidate the recommendations that manufacturing process controls be targeted specifically toward sources of systematic within-die fluctuations, and the development of new circuit design methodologies be aimed at suppressing the effect of within-die parameter fluctuations.
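
The toy Monte Carlo below illustrates the qualitative finding quoted above: if a die's FMAX is set by its slowest critical path, independent per-path (within-die) variation mostly drags the mean down, while a shift shared by all paths on a die (die-to-die) mostly sets the spread. The path counts and sigma values are invented for illustration and are not the paper's calibrated device and circuit models.

```python
import random
import statistics

def simulate_fmax(n_dies=2000, paths_per_die=500,
                  nominal=1.0, sigma_d2d=0.05, sigma_wid=0.05):
    """Toy model: a die's FMAX = min over its critical paths of the path frequency."""
    fmax = []
    for _ in range(n_dies):
        d2d = random.gauss(0.0, sigma_d2d)              # shared by every path on the die
        slowest = min(nominal + d2d + random.gauss(0.0, sigma_wid)
                      for _ in range(paths_per_die))    # independent within-die noise
        fmax.append(slowest)
    return fmax

f = simulate_fmax()
print("mean FMAX:", round(statistics.mean(f), 3))   # pulled well below nominal by within-die noise
print("std  FMAX:", round(statistics.stdev(f), 3))  # close to the die-to-die sigma
```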

Journal ArticleDOI
TL;DR: Two techniques to improve the performance and reduce the complexity of channel parameter estimation are presented: optimum training-sequence design and simplified channel estimation.
Abstract: Multiple transmit-and-receive antennas can be used in orthogonal frequency division multiplexing (OFDM) systems to improve communication quality and capacity. In this paper, we present two techniques to improve the performance and reduce the complexity of channel parameter estimation: optimum training-sequence design and simplified channel estimation. The optimal training sequences not only simplify the initial channel estimation, but also attain the best estimation performance. The simplified channel estimation significantly reduces the complexity of the channel estimation at the expense of a negligible performance degradation. The effectiveness of the new techniques is demonstrated through the simulation of an OFDM system with two-transmit and two-receive antennas. The space-time coding with 240 information bits per codeword is used for transmit diversity. From the simulation, the required signal-to-noise ratio is only about 9 dB for a 10% word error rate for a channel with the typical urban- or hilly-terrain delay profile and a 40-Hz Doppler frequency.
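
For orientation, the sketch below sets up a textbook per-tone least-squares channel estimate for a two-transmit-antenna OFDM link using known training symbols. It is not the paper's optimum training-sequence design or its simplified estimator; the QPSK pilots, the sign-flip training construction, the flat per-tone channel, and the noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tones, n_tx, n_train = 64, 2, 2
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Known pilots; antenna 2 flips sign in the second training symbol so every
# per-tone 2x2 system is invertible (a common textbook construction, assumed here).
pilot = qpsk[rng.integers(0, 4, size=n_tones)]
train = np.empty((n_train, n_tones, n_tx), dtype=complex)
train[0, :, 0] = train[0, :, 1] = train[1, :, 0] = pilot
train[1, :, 1] = -pilot

# A random flat-per-tone channel and the received training observations.
h_true = (rng.normal(size=(n_tones, n_tx)) + 1j * rng.normal(size=(n_tones, n_tx))) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=(n_train, n_tones)) + 1j * rng.normal(size=(n_train, n_tones)))
y = np.einsum("ki,tki->tk", h_true, train) + noise

# Per-tone least-squares estimate: solve the 2x2 system given by the two training symbols.
h_est = np.stack([np.linalg.solve(train[:, k, :], y[:, k]) for k in range(n_tones)])
print("mean absolute estimation error:", np.mean(np.abs(h_est - h_true)))
```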

Journal ArticleDOI
TL;DR: This work has used knowledge of the electronic structure of excited states of acids to design molecules that exhibit enhanced excited-state acidity and are the strongest reversible photoacids known.
Abstract: We have used knowledge of the electronic structure of excited states of acids to design molecules that exhibit enhanced excited-state acidity. Such “super” photoacids are the strongest reversible photoacids known and allow the time evolution of proton transfer to be examined in a wide array of organic solvents. This includes breaking/formation of the hydrogen bonds in hundreds of femtoseconds, solvent reorientation and relaxation in picoseconds, proton dissociation, and, finally, diffusion and geminate recombination of the dissociated proton, observed in nanoseconds.

Journal ArticleDOI
TL;DR: This paper presents a new demosaicing technique that uses inter-channel correlation effectively in an alternating-projections scheme and outperforms various state-of-the-art demosaicing techniques, both visually and in terms of mean square error.
Abstract: Most commercial digital cameras use color filter arrays to sample red, green, and blue colors according to a specific pattern. At the location of each pixel only one color sample is taken, and the values of the other colors must be interpolated using neighboring samples. This color plane interpolation is known as demosaicing; it is one of the important tasks in a digital camera pipeline. If demosaicing is not performed appropriately, images suffer from highly visible color artifacts. In this paper we present a new demosaicing technique that uses inter-channel correlation effectively in an alternating-projections scheme. We have compared this technique with six state-of-the-art demosaicing techniques, and it outperforms all of them, both visually and in terms of mean square error.
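
For context only, here is a bare-bones bilinear demosaicing baseline on an assumed RGGB Bayer layout; it is NOT the alternating-projections method of the paper, just an illustration of the interpolation problem the abstract describes (each missing color sample is filled from neighboring samples of the same color).

```python
import numpy as np

def bilinear_demosaic(mosaic):
    """mosaic: 2-D array sampled on an RGGB Bayer pattern. Returns an H x W x 3 RGB image."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True        # red sample locations
    masks[0::2, 1::2, 1] = True        # green sample locations (two per 2x2 block)
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True        # blue sample locations
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    for c in range(3):
        chan = np.where(masks[:, :, c], mosaic, 0.0)
        weight = masks[:, :, c].astype(float)
        # Normalized convolution: each missing value becomes a weighted average of the
        # nearest available samples of the same color (wrap-around at the borders).
        num = sum(np.roll(np.roll(chan, dy, 0), dx, 1) * kernel[dy + 1, dx + 1]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        den = sum(np.roll(np.roll(weight, dy, 0), dx, 1) * kernel[dy + 1, dx + 1]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        out[:, :, c] = num / np.maximum(den, 1e-12)
    return out

rgb = bilinear_demosaic(np.random.rand(8, 8))
print(rgb.shape)  # (8, 8, 3)
```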

Journal ArticleDOI
TL;DR: The conceptual model described in this paper clarifies ways in which worker motivation is influenced and how health sector reform can positively affect worker motivation.

Journal ArticleDOI
TL;DR: In this paper, the authors demonstrate that spin-transfer torques occur in magnetic heterostructures because the transverse component of a spin current that flows from a nonmagnet into a ferromagnet is absorbed at the interface.
Abstract: Spin-transfer torques occur in magnetic heterostructures because the transverse component of a spin current that flows from a nonmagnet into a ferromagnet is absorbed at the interface. We demonstrate this fact explicitly using free-electron models and first-principles electronic structure calculations for real material interfaces. Three distinct processes contribute to the absorption: (1) spin-dependent reflection and transmission, (2) rotation of reflected and transmitted spins, and (3) spatial precession of spins in the ferromagnet. When summed over all Fermi surface electrons, these processes reduce the transverse component of the transmitted and reflected spin currents to nearly zero for most systems of interest. Therefore, to a good approximation, the torque on the magnetization is proportional to the transverse piece of the incoming spin current.

Journal ArticleDOI
TL;DR: In this paper, the community land model (CLM2) was proposed, where the surface is represented by five primary subgrid land cover types (glacier, lake, wetland, urban, vegetated) in each grid cell.
Abstract: The land surface parameterization used with the community climate model (CCM3) and the climate system model (CSM1), the National Center for Atmospheric Research land surface model (NCAR LSM1), has been modified as part of the development of the next version of these climate models. This new model is known as the community land model (CLM2). In CLM2, the surface is represented by five primary subgrid land cover types (glacier, lake, wetland, urban, vegetated) in each grid cell. The vegetated portion of a grid cell is further divided into patches of up to 4 of 16 plant functional types, each with its own leaf and stem area index and canopy height. The relative area of each subgrid unit, the plant functional type, and leaf area index are obtained from 1-km satellite data. The soil texture dataset allows vertical profiles of sand and clay. Most of the physical parameterizations in the model were also updated. Major model differences include: 10 layers for soil temperature and soil water with explicit treatment of liquid water and ice; a multilayer snowpack; runoff based on the TOPMODEL concept; new formulation of ground and vegetation fluxes; and vertical root profiles from a global synthesis of ecological studies. Simulations with CCM3 show significant improvements in surface air temperature, snow cover, and runoff for CLM2 compared to LSM1. CLM2 generally warms surface air temperature in all seasons compared to LSM1, reducing or eliminating many cold biases. Annual precipitation over land is reduced from 2.35 mm day⁻¹ in LSM1 to 2.14 mm day⁻¹ in CLM2. The hydrologic cycle is also different. Transpiration and ground evaporation are reduced. Leaves and stems evaporate more intercepted water annually in CLM2 than LSM1. Global runoff from land increases from 0.75 mm day⁻¹ in LSM1 to 0.84 mm day⁻¹ in CLM2. The annual cycle of runoff is greatly improved in CLM2, especially in arctic and boreal regions where the model has low runoff in cold seasons when the soil is frozen and high runoff during the snowmelt season. Most of the differences between CLM2 and LSM1 are attributed to particular parameterizations rather than to different surface datasets. Important processes include: multilayer snow, frozen water, interception, soil water limitation to latent heat, and higher aerodynamic resistances to heat exchange from ground.

Journal ArticleDOI
TL;DR: This paper surveys three broad classes of very large-scale neighborhood search (VLSN) algorithms: (1) variable-depth methods in which large neighborhoods are searched heuristically, (2) large neighborhoods in which the neighborhoods are searched using network flow techniques or dynamic programming, and (3) large neighborhoods induced by restrictions of the original problem that are solvable in polynomial time.

Journal ArticleDOI
TL;DR: How students use AV technology has a greater impact on effectiveness than what AV technology shows them, and an agenda for future research into AV effectiveness is formulated.
Abstract: Algorithm visualization (AV) technology graphically illustrates how algorithms work. Despite the intuitive appeal of the technology, it has failed to catch on in mainstream computer science education. Some have attributed this failure to the mixed results of experimental studies designed to substantiate AV technology's educational effectiveness. However, while several integrative reviews of AV technology have appeared, none has focused specifically on the software's effectiveness by analyzing this body of experimental studies as a whole. In order to better understand the effectiveness of AV technology, we present a systematic meta-study of 24 experimental studies. We pursue two separate analyses: an analysis of independent variables, in which we tie each study to a particular guiding learning theory in an attempt to determine which guiding theory has had the most predictive success; and an analysis of dependent variables, which enables us to determine which measurement techniques have been most sensitive to the learning benefits of AV technology. Our most significant finding is that how students use AV technology has a greater impact on effectiveness than what AV technology shows them. Based on our findings, we formulate an agenda for future research into AV effectiveness.

Posted Content
TL;DR: In this article, the authors prove that every Berge graph either falls into one of a few basic classes, or it has a kind of separation that cannot occur in a minimal imperfect graph.
Abstract: A graph G is perfect if for every induced subgraph H, the chromatic number of H equals the size of the largest complete subgraph of H, and G is Berge if no induced subgraph of G is an odd cycle of length at least 5 or the complement of one. The "strong perfect graph conjecture" (Berge, 1961) asserts that a graph is perfect if and only if it is Berge. A stronger conjecture was made recently by Conforti, Cornuejols and Vuskovic: that every Berge graph either falls into one of a few basic classes, or it has a kind of separation that cannot occur in a minimal imperfect graph. In this paper we prove both these conjectures.
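
Working directly from the definitions in the abstract, the brute-force sketch below checks whether a small graph is Berge by looking for induced odd cycles of length at least 5 in the graph and in its complement. It is exponential and purely illustrative; the adjacency-set encoding is an assumption.

```python
from itertools import combinations

def induced_is_chordless_cycle(adj, verts):
    """True iff the subgraph induced on verts is a single chordless cycle."""
    vs = set(verts)
    if any(len(adj[v] & vs) != 2 for v in verts):
        return False
    # All induced degrees are 2, so the induced subgraph is a union of cycles;
    # it is one chordless cycle exactly when it is connected.
    seen, stack = {verts[0]}, [verts[0]]
    while stack:
        for u in adj[stack.pop()] & vs:
            if u not in seen:
                seen.add(u)
                stack.append(u)
    return len(seen) == len(vs)

def has_odd_hole(adj):
    n = len(adj)
    return any(induced_is_chordless_cycle(adj, c)
               for k in range(5, n + 1, 2)      # odd lengths >= 5
               for c in combinations(range(n), k))

def is_berge(adj):
    """Berge: neither the graph nor its complement contains an odd hole."""
    n = len(adj)
    complement = [set(range(n)) - adj[v] - {v} for v in range(n)]
    return not has_odd_hole(adj) and not has_odd_hole(complement)

# C5, the 5-cycle, is its own odd hole, so it is not Berge (and hence not perfect).
c5 = [{1, 4}, {0, 2}, {1, 3}, {2, 4}, {3, 0}]
print(is_berge(c5))  # False
```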

Journal ArticleDOI
TL;DR: Random coefficient modeling analyses conducted on data collected from 2,585 soldiers in 86 combat units confirmed that soldiers' experience, role clarity, and psychological strain predicted self-efficacy to a greater extent than did leadership climate.
Abstract: This study identified potential discontinuities in the antecedents of efficacy beliefs across levels of analysis, with a particular focus on the role of leadership climate at different organizational levels. Random coefficient modeling analyses conducted on data collected from 2,585 soldiers in 86 combat units confirmed that soldiers’ experience, role clarity, and psychological strain predicted self-efficacy to a greater extent than did leadership climate. Also, leadership climate at a higher organizational level related to self-efficacy through role clarity, whereas leadership climate at a lower organizational level related to self-efficacy through psychological strain. Group-level analyses identified leadership climate at a higher organizational level as the strongest predictor of collective efficacy. Theoretical and practical implications and directions for future research are discussed.


Journal ArticleDOI
TL;DR: In this article, the effect of composition, firing temperature, and microstructure of composite cathodes on the electrochemical properties of SSC and SDC was investigated in anode-supported single cells at low temperatures (400-600 °C).

Proceedings ArticleDOI


28 Jul 2002
TL;DR: This paper applies Lifelong Planning A* to robot navigation in unknown terrain, including goal-directed navigation in unknown terrain and mapping of unknown terrain, and develops the resulting D* Lite algorithm, which implements the same behavior as Stentz' Focussed Dynamic A* but is algorithmically different.
Abstract: Incremental heuristic search methods use heuristics to focus their search and reuse information from previous searches to find solutions to series of similar search tasks much faster than is possible by solving each search task from scratch. In this paper, we apply Lifelong Planning A* to robot navigation in unknown terrain, including goal-directed navigation in unknown terrain and mapping of unknown terrain. The resulting D* Lite algorithm is easy to understand and analyze. It implements the same behavior as Stentz' Focussed Dynamic A* but is algorithmically different. We prove properties about D* Lite and demonstrate experimentally the advantages of combining incremental and heuristic search for the applications studied. We believe that these results provide a strong foundation for further research on fast replanning methods in artificial intelligence and robotics.
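
As a small illustration of the incremental-search machinery, the fragment below computes the two-part priority key D* Lite uses to order its queue, combining the better of a vertex's g and rhs values with a heuristic toward the robot's current position and a key modifier accumulated during replanning. It is only a sketch with assumed g/rhs dictionaries and an assumed grid heuristic, not a complete implementation of the algorithm.

```python
import heapq

def calculate_key(s, g, rhs, h, s_start, k_m):
    """D* Lite style priority: (min(g, rhs) + h(start, s) + k_m, min(g, rhs))."""
    m = min(g.get(s, float("inf")), rhs.get(s, float("inf")))
    return (m + h(s_start, s) + k_m, m)

def h(a, b):
    """Assumed heuristic for an 8-connected grid: Chebyshev distance."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

# Tiny usage example with made-up g/rhs values; vertices are grid cells.
g, rhs, k_m = {(2, 3): 5.0}, {(2, 3): 4.0}, 0.0
queue = []
heapq.heappush(queue, (calculate_key((2, 3), g, rhs, h, (0, 0), k_m), (2, 3)))
print(queue[0])  # ((7.0, 4.0), (2, 3))
```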