
Showing papers by "University of Maryland, College Park" published in 2004


Journal ArticleDOI
TL;DR: The authors argue that emerging perspectives in marketing are converging to form a new dominant logic, one in which service provision rather than goods is fundamental to economic exchange.
Abstract: Marketing inherited a model of exchange from economics, which had a dominant logic based on the exchange of “goods,” which usually are manufactured output. The dominant logic focused on tangible resources, embedded value, and transactions. Over the past several decades, new perspectives have emerged that have a revised logic focused on intangible resources, the cocreation of value, and relationships. The authors believe that the new perspectives are converging to form a new dominant logic for marketing, one in which service provision rather than goods is fundamental to economic exchange. The authors explore this evolving logic and the corresponding shift in perspective for marketing scholars, marketing practitioners, and marketing educators.

12,760 citations


Journal ArticleDOI
TL;DR: Spintronics, or spin electronics, involves the study of active control and manipulation of spin degrees of freedom in solid-state systems as discussed by the authors, where the primary focus is on the basic physical principles underlying the generation of carrier spin polarization, spin dynamics, and spin-polarized transport.
Abstract: Spintronics, or spin electronics, involves the study of active control and manipulation of spin degrees of freedom in solid-state systems. This article reviews the current status of this subject, including both recent advances and well-established results. The primary focus is on the basic physical principles underlying the generation of carrier spin polarization, spin dynamics, and spin-polarized transport in semiconductors and metals. Spin transport differs from charge transport in that spin is a nonconserved quantity in solids due to spin-orbit and hyperfine coupling. The authors discuss in detail spin decoherence mechanisms in metals and semiconductors. Various theories of spin injection and spin-polarized transport are applied to hybrid structures relevant to spin-based devices and fundamental studies of materials properties. Experimental work is reviewed with the emphasis on projected applications, in which external electric and magnetic fields and illumination by light will be used to control spin and charge dynamics to create new functionalities not feasible or ineffective with conventional electronics.

9,158 citations


Journal ArticleDOI
TL;DR: The Global Land Data Assimilation System (GLDAS) as mentioned in this paper is an uncoupled land surface modeling system that drives multiple models, integrates a huge quantity of observation-based data, runs globally at high resolution (0.25°), and produces results in near-real time (typically within 48 h of the present).
Abstract: A Global Land Data Assimilation System (GLDAS) has been developed. Its purpose is to ingest satellite- and ground-based observational data products, using advanced land surface modeling and data assimilation techniques, in order to generate optimal fields of land surface states and fluxes. GLDAS is unique in that it is an uncoupled land surface modeling system that drives multiple models, integrates a huge quantity of observation-based data, runs globally at high resolution (0.25°), and produces results in near–real time (typically within 48 h of the present). GLDAS is also a test bed for innovative modeling and assimilation capabilities. A vegetation-based “tiling” approach is used to simulate subgrid-scale variability, with a 1-km global vegetation dataset as its basis. Soil and elevation parameters are based on high-resolution global datasets. Observation-based precipitation and downward radiation and output fields from the best available global coupled atmospheric data assimilation systems are employed...

3,857 citations


Journal ArticleDOI
20 Aug 2004
TL;DR: The Swift mission as discussed by the authors is a multi-wavelength observatory for gamma-ray burst (GRB) astronomy, which is a first-of-its-kind autonomous rapid-slewing satellite for transient astronomy and pioneers the way for future rapid-reaction and multiwavelength missions.
Abstract: The Swift mission, scheduled for launch in 2004, is a multiwavelength observatory for gamma-ray burst (GRB) astronomy. It is a first-of-its-kind autonomous rapid-slewing satellite for transient astronomy and pioneers the way for future rapid-reaction and multiwavelength missions. It will be far more powerful than any previous GRB mission, observing more than 100 bursts yr⁻¹ and performing detailed X-ray and UV/optical afterglow observations spanning timescales from 1 minute to several days after the burst. The objectives are to (1) determine the origin of GRBs, (2) classify GRBs and search for new types, (3) study the interaction of the ultrarelativistic outflows of GRBs with their surrounding medium, and (4) use GRBs to study the early universe out to z > 10. The mission is being developed by a NASA-led international collaboration. It will carry three instruments: a new-generation wide-field gamma-ray (15-150 keV) detector that will detect bursts, calculate 1-4 arcmin positions, and trigger autonomous spacecraft slews; a narrow-field X-ray telescope that will give 5 arcsec positions and perform spectroscopy in the 0.2-10 keV band; and a narrow-field UV/optical telescope that will operate in the 170-600 nm band and provide 0.3 arcsec positions and optical finding charts. Redshift determinations will be made for most bursts. In addition to the primary GRB science, the mission will perform a hard X-ray survey to a sensitivity of ~1 mcrab (~2 × 10⁻¹¹ ergs cm⁻² s⁻¹ in the 15-150 keV band), more than an order of magnitude better than HEAO 1 A-4. A flexible data and operations system will allow rapid follow-up observations of all types of...

3,753 citations


Book
19 Aug 2004
TL;DR: In this book, the authors discuss what it is like to be an emerging adult and what it means to become an adult, covering the changing relationship with parents, love and sex, and the transition from emerging adulthood to young adulthood.
Abstract: Preface 1. A longer road to adulthood 2. What is it like to be an emerging adult: Four portraits 3. From conflict to companionship: A new relationship with parents 4. Love and sex 5. Meandering toward marriage 6. The road through college: Twists and turns 7. Work: More than a job 8. Sources of meaning: Religious beliefs and values 9. The age of possibilities: Four case studies 10. From emerging adulthood to young adulthood: What does it mean to become an adult?

3,598 citations


Journal ArticleDOI
TL;DR: To improve the treatment of the peptide backbone, quantum mechanical and molecular mechanical calculations were undertaken on the alanine, glycine, and proline dipeptides, and the results were combined with molecular dynamics simulations of proteins in crystal and aqueous environments to enhance the quality of the CHARMM force field.
Abstract: Computational studies of proteins based on empirical force fields represent a powerful tool to obtain structure-function relationships at an atomic level, and are central in current efforts to solve the protein folding problem. The results from studies applying these tools are, however, dependent on the quality of the force fields used. In particular, accurate treatment of the peptide backbone is crucial to achieve representative conformational distributions in simulation studies. To improve the treatment of the peptide backbone, quantum mechanical (QM) and molecular mechanical (MM) calculations were undertaken on the alanine, glycine, and proline dipeptides, and the results from these calculations were combined with molecular dynamics (MD) simulations of proteins in crystal and aqueous environments. QM potential energy maps of the alanine and glycine dipeptides at the LMP2/cc-pVxZ//MP2/6-31G* levels, where x = D, T, and Q, were determined, and are compared to available QM studies on these molecules. The LMP2/cc-pVQZ//MP2/6-31G* energy surfaces for all three dipeptides were then used to improve the MM treatment of the dipeptides. These improvements included additional parameter optimization via Monte Carlo simulated annealing and extension of the potential energy function to contain peptide backbone phi, psi dihedral crossterms or a phi, psi grid-based energy correction term. Simultaneously, MD simulations of up to seven proteins in their crystalline environments were used to validate the force field enhancements. Comparison with QM and crystallographic data showed that an additional optimization of the phi, psi dihedral parameters along with the grid-based energy correction were required to yield significant improvements over the CHARMM22 force field. However, systematic deviations in the treatment of phi and psi in the helical and sheet regions were evident. Accordingly, empirical adjustments were made to the grid-based energy correction for alanine and glycine to account for these systematic differences. These adjustments lead to greater deviations from QM data for the two dipeptides but also yielded improved agreement with experimental crystallographic data. These improvements enhance the quality of the CHARMM force field in treating proteins. This extension of the potential energy function is anticipated to facilitate improved treatment of biological macromolecules via MM approaches in general.
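At evaluation time, the grid-based phi, psi energy correction described in this abstract amounts to a table lookup with smooth interpolation over the backbone dihedral map. The sketch below shows one way such a correction could be evaluated using simple bilinear interpolation; the grid size, spacing, and function name are illustrative assumptions, and a production force field would use smoother (e.g., bicubic spline) interpolation plus analytic derivatives for the forces.

```python
import numpy as np

def grid_correction(phi, psi, grid, spacing=15.0):
    """Bilinearly interpolate a phi/psi grid-based energy correction.

    phi, psi : backbone dihedrals in degrees, in [-180, 180)
    grid     : square 2-D array of correction energies (kcal/mol) tabulated
               on a regular, periodic phi/psi grid with the given spacing
    """
    n = grid.shape[0]                          # e.g. 24 points for 15-degree spacing
    fi = (phi + 180.0) / spacing               # fractional grid coordinates
    fj = (psi + 180.0) / spacing
    i0, j0 = int(np.floor(fi)) % n, int(np.floor(fj)) % n
    i1, j1 = (i0 + 1) % n, (j0 + 1) % n        # wrap around: dihedrals are periodic
    u, v = fi - np.floor(fi), fj - np.floor(fj)
    # blend the four surrounding grid points
    return ((1 - u) * (1 - v) * grid[i0, j0] + u * (1 - v) * grid[i1, j0]
            + (1 - u) * v * grid[i0, j1] + u * v * grid[i1, j1])

# Toy usage: a random 24 x 24 correction map queried at an alpha-helical conformation
rng = np.random.default_rng(0)
cmap = rng.normal(scale=0.5, size=(24, 24))
print(grid_correction(-63.0, -42.0, cmap))
```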

3,271 citations


Journal ArticleDOI
TL;DR: In this article, the shape and intensity of the precipitation features are modified during the time between microwave sensor scans by performing a time-weighted linear interpolation, yielding spatially and temporally complete microwave-derived precipitation analyses, independent of the infrared temperature field.
Abstract: A new technique is presented in which half-hourly global precipitation estimates derived from passive microwave satellite scans are propagated by motion vectors derived from geostationary satellite infrared data. The Climate Prediction Center morphing method (CMORPH) uses motion vectors derived from half-hourly interval geostationary satellite IR imagery to propagate the relatively high quality precipitation estimates derived from passive microwave data. In addition, the shape and intensity of the precipitation features are modified (morphed) during the time between microwave sensor scans by performing a time-weighted linear interpolation. This process yields spatially and temporally complete microwave-derived precipitation analyses, independent of the infrared temperature field. CMORPH showed substantial improvements over both simple averaging of the microwave estimates and over techniques that blend microwave and infrared information but that derive estimates of precipitation from infrared data...
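The propagation-plus-morphing step described above can be pictured as advecting the two bracketing passive-microwave fields along IR-derived motion vectors and blending them with weights proportional to their proximity in time. The sketch below is a deliberately simplified illustration under strong assumptions (a single uniform motion vector, integer-pixel shifts, hypothetical function names); the operational CMORPH uses spatially varying motion fields derived from half-hourly geostationary IR imagery.

```python
import numpy as np

def morph(scan_prev, scan_next, motion, n_steps):
    """Propagate two bracketing microwave scans along a motion vector and blend
    them with time weights (a much-simplified, uniform-motion illustration).

    scan_prev, scan_next : 2-D precipitation fields at times t0 and t1
    motion               : (rows, cols) displacement in pixels over t1 - t0
    n_steps              : number of intermediate analysis times to generate
    """
    dy, dx = motion
    frames = []
    for k in range(1, n_steps + 1):
        w = k / (n_steps + 1)                 # fractional position between t0 and t1
        # advect the earlier scan forward and the later scan backward in time
        fwd = np.roll(scan_prev, (round(w * dy), round(w * dx)), axis=(0, 1))
        bwd = np.roll(scan_next, (-round((1 - w) * dy), -round((1 - w) * dx)), axis=(0, 1))
        # time-weighted linear interpolation ("morphing") of shape and intensity
        frames.append((1 - w) * fwd + w * bwd)
    return frames

# Toy usage: a rain cell that drifts six grid cells eastward between two scans
a = np.zeros((20, 20)); a[8:12, 2:6] = 5.0
b = np.roll(a, 6, axis=1)
intermediate = morph(a, b, motion=(0, 6), n_steps=2)
```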

2,784 citations


Journal ArticleDOI
13 May 2004-Nature
TL;DR: The results suggest that outbreaks can be contained by a strategy of targeted vaccination combined with early detection without resorting to mass vaccination of a population.
Abstract: Here we present a highly resolved agent-based simulation tool (EpiSims), which combines realistic estimates of population mobility, based on census and land-use data, with parameterized models for simulating the progress of a disease within a host and of transmission between hosts [10]. The simulation generates a large-scale, dynamic contact graph that replaces the differential equations of the classic approach. EpiSims is based on the Transportation Analysis and Simulation System (TRANSIMS) developed at Los Alamos National Laboratory, which produces estimates of social networks based on the assumption that the transportation infrastructure constrains people’s choices about where and when to perform activities [11]. TRANSIMS creates a synthetic population endowed with demographics such as age and income, consistent with joint distributions in census data. It then estimates positions and activities of all travellers on a second-by-second basis. For more information on TRANSIMS and its availability, see Supplementary Information. The resulting social network is the best extant estimate of the physical contact patterns among large groups of people—alternative methodologies are limited to physical contacts among hundreds of people or non-physical contacts (such as e-mail or citations) among large groups.

2,095 citations


Journal ArticleDOI
30 Jan 2004-Science
TL;DR: Thermodynamic analyses show that the magnetoelectric coupling in a nanostructured BaTiO3-CoFe2O4 ferroelectromagnet can be understood on the basis of the strong elastic interactions between the two phases.
Abstract: We report on the coupling between ferroelectric and magnetic order parameters in a nanostructured BaTiO3-CoFe2O4 ferroelectromagnet. This facilitates the interconversion of energies stored in electric and magnetic fields and plays an important role in many devices, including transducers, field sensors, etc. Such nanostructures were deposited on single-crystal SrTiO3 (001) substrates by pulsed laser deposition from a single Ba-Ti-Co-Fe-oxide target. The films are epitaxial in-plane as well as out-of-plane with self-assembled hexagonal arrays of CoFe2O4 nanopillars embedded in a BaTiO3 matrix. The CoFe2O4 nanopillars have uniform size and average spacing of 20 to 30 nanometers. Temperature-dependent magnetic measurements illustrate the coupling between the two order parameters, which is manifested as a change in magnetization at the ferroelectric Curie temperature. Thermodynamic analyses show that the magnetoelectric coupling in such a nanostructure can be understood on the basis of the strong elastic interactions between the two phases.

2,005 citations


Journal ArticleDOI
TL;DR: A multisite effort by the Treatment Fidelity Workgroup of the National Institutes of Health Behavior Change Consortium to identify treatment fidelity concepts and strategies in health behavior intervention research is described.
Abstract: Treatment fidelity refers to the methodological strategies used to monitor and enhance the reliability and validity of behavioral interventions. This article describes a multisite effort by the Treatment Fidelity Workgroup of the National Institutes of Health Behavior Change Consortium (BCC) to identify treatment fidelity concepts and strategies in health behavior intervention research. The work group reviewed treatment fidelity practices in the research literature, identified techniques used within the BCC, and developed recommendations for incorporating these practices more consistently. The recommendations cover study design, provider training, treatment delivery, treatment receipt, and enactment of treatment skills. Funding agencies, reviewers, and journal editors are encouraged to make treatment fidelity a standard part of the conduct and evaluation of health behavior intervention research.

1,987 citations


Journal ArticleDOI
TL;DR: In view of the current trend of increasing and widespread use of chronic bisphosphonate therapy, the observation of an associated risk of osteonecrosis of the jaw should alert practitioners to monitor for this previously unrecognized potential complication.

Journal ArticleDOI
TL;DR: In this article, the authors present a unified strategic framework that enables competing marketing strategy options to be traded off on the basis of projected financial return, which is operationalized as the change in a firm's customer equity relative to the incremental expenditure necessary to produce the change.
Abstract: The authors present a unified strategic framework that enables competing marketing strategy options to be traded off on the basis of projected financial return, which is operationalized as the change in a firm’s customer equity relative to the incremental expenditure necessary to produce the change. The change in the firm’s customer equity is the change in its current and future customers’ lifetime values, summed across all customers in the industry. Each customer’s lifetime value results from the frequency of category purchases, average quantity of purchase, and brand-switching patterns combined with the firm’s contribution margin. The brand-switching matrix can be estimated from either longitudinal panel data or cross-sectional survey data, using a logit choice model. Firms can analyze drivers that have the greatest impact, compare the drivers’ performance with that of competitors’ drivers, and project return on investment from improvements in the drivers. To demonstrate how the approach can be implemented in a specific corporate setting and to show the methods used to test and validate the model, the authors illustrate a detailed application of the approach by using data from the airline industry. Their framework enables what-if evaluation of marketing return on investment, which can include such criteria as return on quality, return on advertising, return on loyalty programs, and even return on corporate citizenship, given a particular shift in customer perceptions. This enables the firm to focus marketing efforts on strategic initiatives that generate the greatest return.
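The abstract above describes lifetime value as a combination of purchase frequency, purchase quantity (folded here into the per-purchase margin), brand-switching patterns, and contribution margin. A stylized sketch of how a switching matrix could be turned into a discounted expected contribution per brand is shown below; the horizon, discount rate, matrix values, and function name are illustrative assumptions rather than the authors' airline-industry calibration, and the switching matrix itself would be estimated from panel or survey data with a logit choice model.

```python
import numpy as np

def lifetime_value(switch, margin, freq, periods=20, discount=0.10, start_brand=0):
    """Discounted expected contribution of one customer to each brand.

    switch : row-stochastic brand-switching matrix, P[i, j] = Pr(next brand j | current brand i)
    margin : contribution margin earned per purchase by each brand
    freq   : category purchases per period
    """
    probs = np.zeros(len(margin))
    probs[start_brand] = 1.0                     # customer's current brand
    clv = np.zeros(len(margin))
    for t in range(periods):
        # expected margin earned by each brand this period, discounted to today
        clv += freq * probs * margin / (1.0 + discount) ** t
        probs = probs @ switch                   # brand choice may change next period
    return clv

# Toy usage: two brands; the customer currently buys brand 0 four times per period
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
print(lifetime_value(P, margin=np.array([40.0, 35.0]), freq=4))
```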

Journal ArticleDOI
TL;DR: It is shown how damage to different components of this framework can account for the major symptom clusters of the fluent aphasias, and some recent evidence concerning how sentence-level processing might be integrated into the framework is discussed.

Journal ArticleDOI
TL;DR: Structural equation modeling revealed a web of relationships that impact venture growth and communicated vision and self-efficacy were related to goals, and tenacity was related to new resource skill.
Abstract: Previous research on entrepreneurship as well as goal, social-cognitive, and leadership theories has guided hypotheses regarding the relationship between entrepreneurial traits and skill (passion, tenacity, and new resource skill) and situationally specific motivation (communicated vision, self-efficacy, and goals) to subsequent venture growth. Data from 229 entrepreneur-chief executive officers and 106 associates in a single industry were obtained in a 6-year longitudinal study. Structural equation modeling revealed a web of relationships that impact venture growth. Goals, self-efficacy, and communicated vision had direct effects on venture growth, and these factors mediated the effects of passion, tenacity, and new resource skill on subsequent growth. Furthermore, communicated vision and self-efficacy were related to goals, and tenacity was related to new resource skill.

Journal ArticleDOI
TL;DR: In this paper, the authors extend prior research by developing a conceptual framework linking all of these constructs in a business-to-business (B2B) service setting, and they hypothesize that customer satisfaction mediates the relationship between customer value and customer loyalty, and customer satisfaction and loyalty have significant reciprocal effects on each other.
Abstract: Although researchers and managers pay increasing attention to customer value, satisfaction, loyalty, and switching costs, not much is known about their interrelationships. Prior research has examined the relationships within subsets of these constructs, mainly in the business-to-consumer (B2C) environment. The authors extend prior research by developing a conceptual framework linking all of these constructs in a business-to-business (B2B) service setting. On the basis of the cognition-affect-behavior model, the authors hypothesize that customer satisfaction mediates the relationship between customer value and customer loyalty, and that customer satisfaction and loyalty have significant reciprocal effects on each other. Furthermore, the potential interaction effect of satisfaction and switching costs, and the quadratic effect of satisfaction, on loyalty are explored. The authors test the hypotheses on data obtained from a courier service provider in a B2B context. The results support most of the hypotheses and, in particular, confirm the mediating role of customer satisfaction.

Journal ArticleDOI
TL;DR: This article develops a novel system for reclassifying historical exchange rate regimes, employing monthly data on market-determined parallel exchange rates going back to 1946 for 153 countries, and shows that the breakup of Bretton Woods had less impact on exchange rate regimes than is popularly believed.
Abstract: We develop a novel system of reclassifying historical exchange rate regimes. One key difference between our study and previous classifications is that we employ monthly data on market-determined parallel exchange rates going back to 1946 for 153 countries. Our approach differs from the IMF official classification (which we show to be only a little better than random); it also differs radically from all previous attempts at historical reclassification. Our classification points to a rethinking of economic performance under alternative exchange rate regimes. Indeed, the breakup of Bretton Woods had less impact on exchange rate regimes than is popularly believed.

Journal ArticleDOI
TL;DR: The results demonstrate that there exist ideal binary time-frequency masks that can separate several speech signals from one mixture, and show that these ideal masks can be approximated in the case where two anechoic mixtures are provided.
Abstract: Binary time-frequency masks are powerful tools for the separation of sources from a single mixture. Perfect demixing via binary time-frequency masks is possible provided the time-frequency representations of the sources do not overlap: a condition we call W-disjoint orthogonality. We introduce here the concept of approximate W-disjoint orthogonality and present experimental results demonstrating the level of approximate W-disjoint orthogonality of speech in mixtures of various orders. The results demonstrate that there exist ideal binary time-frequency masks that can separate several speech signals from one mixture. While determining these masks blindly from just one mixture is an open problem, we show that we can approximate the ideal masks in the case where two anechoic mixtures are provided. Motivated by the maximum likelihood mixing parameter estimators, we define a power weighted two-dimensional (2-D) histogram constructed from the ratio of the time-frequency representations of the mixtures that is shown to have one peak for each source with peak location corresponding to the relative attenuation and delay mixing parameters. The histogram is used to create time-frequency masks that partition one of the mixtures into the original sources. Experimental results on speech mixtures verify the technique. Example demixing results can be found online at http://alum.mit.edu/www/rickard/bss.html.
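The estimator described above can be made concrete: from the short-time Fourier transforms of the two anechoic mixtures, each time-frequency bin yields a relative-attenuation and relative-delay estimate, and a power-weighted 2-D histogram of those estimates shows one peak per source; binary masks then assign each bin to its nearest peak. The sketch below covers only the histogram-building step, with illustrative parameter choices and function names (it is not the authors' reference implementation); peak picking and mask construction would follow.

```python
import numpy as np
from scipy.signal import stft

def duet_histogram(x1, x2, fs, nperseg=1024, p=1.0, q=0.0, bins=100):
    """Power-weighted 2-D histogram of relative attenuation and delay from two mixtures."""
    _, _, X1 = stft(x1, fs=fs, nperseg=nperseg)
    f, _, X2 = stft(x2, fs=fs, nperseg=nperseg)
    eps = 1e-12
    R = (X2 + eps) / (X1 + eps)                    # per-bin ratio of the two mixtures
    a = np.abs(R)
    alpha = a - 1.0 / a                            # symmetric attenuation estimate
    w = 2.0 * np.pi * f[:, None] / fs + eps        # bin frequency in rad/sample
    delta = -np.angle(R) / w                       # relative delay estimate (samples)
    weight = (np.abs(X1) * np.abs(X2)) ** p * np.abs(w) ** q
    hist, a_edges, d_edges = np.histogram2d(
        alpha.ravel(), delta.ravel(), bins=bins,
        range=[[-3.0, 3.0], [-5.0, 5.0]], weights=weight.ravel())
    return hist, a_edges, d_edges                  # one peak expected per source
```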

Journal ArticleDOI
TL;DR: In this article, carbon nanotube transistors with channel lengths exceeding 300 microns were fabricated; in these long transistors, carrier transport is diffusive and the channel resistance dominates the transport.
Abstract: Semiconducting carbon nanotube transistors with channel lengths exceeding 300 microns have been fabricated. In these long transistors, carrier transport is diffusive and the channel resistance dominates the transport. Transport characteristics are used to extract the field-effect mobility (79 000 cm²/Vs) and estimate the intrinsic mobility (>100 000 cm²/Vs) at room temperature. These values exceed those for all known semiconductors, which bodes well for application of nanotubes in high-speed transistors, single- and few-electron memories, and chemical/biochemical sensors.
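For context, a standard way to extract field-effect mobility from the gate-voltage dependence of the linear-regime conductance of a long-channel transistor is written out below; this is the conventional textbook expression (with C_g the total gate capacitance of a channel of length L), offered as background rather than a statement of the authors' exact extraction procedure or capacitance model.

```latex
% Field-effect mobility from the gate-voltage dependence of the
% linear-regime conductance G = I_ds / V_ds of a channel of length L
% with total gate capacitance C_g (conventional expression; the paper's
% exact capacitance model is not reproduced here)
\mu_{\mathrm{FE}} = \frac{L^{2}}{C_{g}}\,\frac{\mathrm{d}G}{\mathrm{d}V_{g}},
\qquad G = \frac{I_{\mathrm{ds}}}{V_{\mathrm{ds}}}
```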

Journal ArticleDOI
16 Dec 2004-Nature
TL;DR: It is suggested that lower crustal foundering occurred within the North China craton during the Late Jurassic, which provides constraints on the timing of lithosphere removal beneath the North China craton.
Abstract: Foundering of mafic lower continental crust into underlying convecting mantle has been proposed as one means to explain the unusually evolved chemical composition of Earth's continental crust, yet direct evidence of this process has been scarce. Here we report that Late Jurassic high-magnesium andesites, dacites and adakites (siliceous lavas with high strontium and low heavy-rare-earth element and yttrium contents) from the North China craton have chemical and petrographic features consistent with their origin as partial melts of eclogite that subsequently interacted with mantle peridotite. Similar features observed in adakites and some Archaean sodium-rich granitoids of the tonalite-trondhjemite-granodiorite series have been interpreted to result from interaction of slab melts with the mantle wedge. Unlike their arc-related counterparts, however, the Chinese magmas carry inherited Archaean zircons and have neodymium and strontium isotopic compositions overlapping those of eclogite xenoliths derived from the lower crust of the North China craton. Such features cannot be produced by crustal assimilation of slab melts, given the high Mg#, nickel and chromium contents of the lavas. We infer that the Chinese lavas derive from ancient mafic lower crust that foundered into the convecting mantle and subsequently melted and interacted with peridotite. We suggest that lower crustal foundering occurred within the North China craton during the Late Jurassic, and thus provides constraints on the timing of lithosphere removal beneath the North China craton.

Journal ArticleDOI
TL;DR: This Review integrates and summarizes knowledge gained from areas ranging from structural biology and medicinal chemistry to supramolecular chemistry and nanotechnology, with emphasis on G-quartet structure, function, and molecular recognition.
Abstract: Molecular self-assembly is central to many processes in both biology and supramolecular chemistry. The G-quartet, a hydrogen-bonded macrocycle formed by cation-templated assembly of guanosine, was first identified in 1962 as the basis for the aggregation of 5'-guanosine monophosphate. We now know that many nucleosides, oligonucleotides, and synthetic derivatives form a rich array of functional G-quartets. The G-quartet surfaces in areas ranging from structural biology and medicinal chemistry to supramolecular chemistry and nanotechnology. This Review integrates and summarizes knowledge gained from these different areas, with emphasis on G-quartet structure, function, and molecular recognition.

Journal ArticleDOI
TL;DR: A shortest cost path routing algorithm is proposed which uses link costs that reflect both the communication energy consumption rates and the residual energy levels at the two end nodes and is amenable to distributed implementation.
Abstract: A routing problem in static wireless ad hoc networks is considered as it arises in a rapidly deployed, sensor based, monitoring system known as the wireless sensor network. Information obtained by the monitoring nodes needs to be routed to a set of designated gateway nodes. In these networks, every node is capable of sensing, data processing, and communication, and operates on its limited amount of battery energy consumed mostly in transmission and reception at its radio transceiver. If we assume that the transmitter power level can be adjusted to use the minimum energy required to reach the intended next hop receiver then the energy consumption rate per unit information transmission depends on the choice of the next hop node, i.e., the routing decision. We formulate the routing problem as a linear programming problem, where the objective is to maximize the network lifetime, which is equivalent to the time until the network partition due to battery outage. Two different models are considered for the information-generation processes. One assumes constant rates and the other assumes an arbitrary process. A shortest cost path routing algorithm is proposed which uses link costs that reflect both the communication energy consumption rates and the residual energy levels at the two end nodes. The algorithm is amenable to distributed implementation. Simulation results with both information-generation process models show that the proposed algorithm can achieve network lifetime that is very close to the optimal network lifetime obtained by solving the linear programming problem.
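The routing rule described above combines two ingredients per link: the energy spent transmitting and receiving over it, and how depleted the batteries at its two end nodes are; routes are then found with an ordinary shortest-cost-path computation. The sketch below illustrates that structure with Dijkstra's algorithm; the specific cost expression, exponent values, and function names are illustrative assumptions rather than the exact formula from the paper.

```python
import heapq

def link_cost(e_tx, e_rx, res_u, res_v, init_u, init_v, x1=1.0, x2=50.0):
    """Energy-aware cost of link (u, v): communication energy scaled up as batteries drain.

    e_tx, e_rx     : transmit / receive energy per unit of information on the link
    res_*, init_*  : residual and initial battery energy of the two end nodes
    x1, x2         : weighting exponents (illustrative values, not the paper's calibration)
    """
    depletion = (init_u / max(res_u, 1e-9)) * (init_v / max(res_v, 1e-9))
    return (e_tx + e_rx) ** x1 * depletion ** x2

def shortest_cost_path(adj, src, dst):
    """Dijkstra over an adjacency map {u: {v: cost}}; returns (path, total cost)."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                              # stale heap entry
        for v, c in adj.get(u, {}).items():
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:                            # walk predecessors back to the source
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]
```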

Journal ArticleDOI
TL;DR: The authors argue that these characteristics do not distinguish services from goods, only have meaning from a manufacturing perspective, and imply inappropriate normative strategies, and suggest that advances made by service scholars can provide a foundation for a more service-dominant view of all exchange from which more appropriate normative strategies can be developed.
Abstract: Marketing was originally built on a goods-centered, manufacturing-based model of economic exchange developed during the Industrial Revolution. Since its beginning, marketing has been broadening its perspective to include the exchange of more than manufactured goods. The subdiscipline of service marketing has emerged to address much of this broadened perspective, but it is built on the same goods and manufacturing-based model. The influence of this model is evident in the prototypical characteristics that have been identified as distinguishing services from goods—intangibility, inseparability, heterogeneity, and perishability. The authors argue that these characteristics (a) do not distinguish services from goods, (b) only have meaning from a manufacturing perspective, and (c) imply inappropriate normative strategies. They suggest that advances made by service scholars can provide a foundation for a more service-dominant view of all exchange from which more appropriate normative strategies can be developed...

Journal ArticleDOI
TL;DR: A real-time and retrospective North American Land Data Assimilation System (NLDAS) is presented in this article, which consists of four land models executing in parallel in uncoupled mode, common hourly surface forcing, and common streamflow routing: all using a 1/8° grid over the continental United States.
Abstract: Results are presented from the multi-institution partnership to develop a real-time and retrospective North American Land Data Assimilation System (NLDAS). NLDAS consists of (1) four land models executing in parallel in uncoupled mode, (2) common hourly surface forcing, and (3) common streamflow routing: all using a 1/8° grid over the continental United States. The initiative is largely sponsored by the Global Energy and Water Cycle Experiment (GEWEX) Continental-Scale International Project (GCIP). As the overview for nine NLDAS papers, this paper describes and evaluates the 3-year NLDAS execution of 1 October 1996 to 30 September 1999, a period rich in observations for validation. The validation emphasizes (1) the land states, fluxes, and input forcing of four land models, (2) the application of new GCIP-sponsored products, and (3) a multiscale approach. The validation includes (1) mesoscale observing networks of land surface forcing, fluxes, and states, (2) regional snowpack measurements, (3) daily streamflow measurements, and (4) satellite-based retrievals of snow cover, land surface skin temperature (LST), and surface insolation. The results show substantial intermodel differences in surface evaporation and runoff (especially over nonsparse vegetation), soil moisture storage, snowpack, and LST. Owing to surprisingly large intermodel differences in aerodynamic conductance, intermodel differences in midday summer LST were unlike those expected from the intermodel differences in Bowen ratio. Last, anticipating future assimilation of LST, an NLDAS effort unique to this overview paper assesses geostationary-satellite-derived LST, determines the latter to be of good quality, and applies the latter to validate modeled LST.

Journal ArticleDOI
TL;DR: In this paper, the authors examine the cross-sectional variation in the marginal value of corporate cash holdings arising from differences in corporate financial policy and provide semi-quantitative predictions for the value of an extra dollar of cash depending upon whether that dollar will most likely go to increasing distributions to equity, reducing the amount of cash that needs to be raised in the capital markets, or servicing debt or other liabilities.
Abstract: We examine the cross-sectional variation in the marginal value of corporate cash holdings arising from differences in corporate financial policy. We begin by providing semi-quantitative predictions for the value of an extra dollar of cash depending upon whether that dollar will most likely go to i) increasing distributions to equity, ii) reducing the amount of cash that needs to be raised in the capital markets, or iii) servicing debt or other liabilities. We then relate firm financial structure characteristics to the likelihood of firms engaging in these actions, and derive a set of intuitive hypotheses to test empirically. We generate estimates of the marginal value of cash by examining the variation in excess stock returns over the fiscal year and find results that are both qualitatively and quantitatively consistent with all hypotheses tested. In particular, we find that the marginal value of cash declines with larger cash holdings, higher leverage, better access to capital markets, and as firms choose to distribute cash via dividends rather than repurchases.

Journal ArticleDOI
TL;DR: An overview of the issues associated with the development and application of empirical force fields to biomolecular systems and a summary of the force fields commonly applied to the different classes of biomolecules are presented.
Abstract: Empirical force field-based studies of biological macromolecules are becoming a common tool for investigating their structure-activity relationships at an atomic level of detail. Such studies facilitate interpretation of experimental data and allow for information not readily accessible to experimental methods to be obtained. A large part of the success of empirical force field-based methods is the quality of the force fields combined with the algorithmic advances that allow for more accurate reproduction of experimental observables. Presented is an overview of the issues associated with the development and application of empirical force fields to biomolecular systems. This is followed by a summary of the force fields commonly applied to the different classes of biomolecules; proteins, nucleic acids, lipids, and carbohydrates. In addition, issues associated with computational studies on "heterogeneous" biomolecular systems and the transferability of force fields to a wide range of organic molecules of pharmacological interest are discussed.

Journal ArticleDOI
TL;DR: In this article, the results from a second characterisation of the 91500 zircon are reported, including data from electron probe microanalysis, laser ablation inductively coupled plasma-mass spectrometry (LA-ICP-MS), secondary ion mass spectrometry (SIMS), and laser fluorination analyses.
Abstract: This paper reports the results from a second characterisation of the 91500 zircon, including data from electron probe microanalysis, laser ablation inductively coupled plasma-mass spectrometry (LA-ICP-MS), secondary ion mass spectrometry (SIMS) and laser fluorination analyses. The focus of this initiative was to establish the suitability of this large single zircon crystal for calibrating in situ analyses of the rare earth elements and oxygen isotopes, as well as to provide working values for key geochemical systems. In addition to extensive testing of the chemical and structural homogeneity of this sample, the occurrence of banding in 91500 in both backscattered electron and cathodoluminescence images is described in detail. Blind intercomparison data reported by both LA-ICP-MS and SIMS laboratories indicate that only small systematic differences exist between the data sets provided by these two techniques. Furthermore, the use of NIST SRM 610 glass as the calibrant for SIMS analyses was found to introduce little or no systematic error into the results for zircon. Based on both laser fluorination and SIMS data, zircon 91500 seems to be very well suited for calibrating in situ oxygen isotopic analyses.

Journal ArticleDOI
01 Sep 2004-Ecology
TL;DR: A change in approach is needed to determine whether pollen limitation reflects random fluctuations around a pollen–resource equilibrium, an adaptation to stochastic pollination environments, or a chronic syndrome caused by an environmental perturbation.
Abstract: Determining whether seed production is pollen limited has been an area of intensive empirical study over the last two decades. Yet current evidence does not allow satisfactory assessment of the causes or consequences of pollen limitation. Here, we critically evaluate existing theory and issues concerning pollen limitation. Our main conclusion is that a change in approach is needed to determine whether pollen limitation reflects random fluctuations around a pollen–resource equilibrium, an adaptation to stochastic pollination environments, or a chronic syndrome caused by an environmental perturbation. We formalize and extend D. Haig and M. Westoby's conceptual model, and illustrate its use in guiding research on the evolutionary consequences of pollen limitation, i.e., whether plants evolve or have evolved to ameliorate pollen limitation. This synthesis also reveals that we are only beginning to understand when and how pollen limitation at the plant level translates into effects on plant population dynamics...

Journal ArticleDOI
TL;DR: In this article, the authors proposed a theoretically consistent modification of gravity in the infrared, which is compatible with all current experimental observations and opens up a number of new avenues for attacking cosmological problems, including inflation, dark matter and dark energy.
Abstract: We propose a theoretically consistent modification of gravity in the infrared, which is compatible with all current experimental observations. This is an analog of the Higgs mechanism in general relativity, and can be thought of as arising from ghost condensation — a background where a scalar field has a constant velocity, ⟨φ̇⟩ = M². The ghost condensate is a new kind of fluid that can fill the universe, which has the same equation of state, ρ = −p, as a cosmological constant, and can hence drive de Sitter expansion of the universe. However, unlike a cosmological constant, it is a physical fluid with a physical scalar excitation, which can be described by a systematic effective field theory at low energies. The excitation has an unusual low-energy dispersion relation ω² ~ k⁴/M². If coupled to matter directly, it gives rise to small Lorentz-violating effects and a new long-range 1/r² spin-dependent force. In the ghost condensate, the energy that gravitates is not the same as the particle physics energy, leading to the possibility of both sources that can gravitate and anti-gravitate. The Newtonian potential is modified with an oscillatory behavior starting at the distance scale M_Pl/M² and the time scale M_Pl²/M³. This theory opens up a number of new avenues for attacking cosmological problems, including inflation, dark matter and dark energy.
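For readability, the relations quoted in the abstract are collected below in equation form (k is the spatial momentum of the scalar excitation and M_Pl the Planck mass); this is a transcription of the stated scaling relations, not a derivation.

```latex
% Ghost-condensate background and its equation of state
\langle \dot{\phi} \rangle = M^{2}, \qquad \rho = -p
% Low-energy dispersion relation of the scalar excitation
\omega^{2} \sim \frac{k^{4}}{M^{2}}
% Distance and time scales at which the Newtonian potential
% acquires its oscillatory modification
r \sim \frac{M_{\mathrm{Pl}}}{M^{2}}, \qquad t \sim \frac{M_{\mathrm{Pl}}^{2}}{M^{3}}
```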

Journal ArticleDOI
TL;DR: A new structural optimization method based on the harmony search (HS) meta-heuristic algorithm, which was conceptualized using the musical process of searching for a perfect state of harmony, is presented, and the effectiveness and robustness of the new method are demonstrated.
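Since the TL;DR above names the harmony search meta-heuristic without an accompanying abstract, a minimal sketch of the generic algorithm may help: candidate solutions ("harmonies") are stored in memory, new candidates are composed by recalling stored values, pitch-adjusting them, or improvising at random, and the worst stored harmony is replaced whenever the new one improves on it. The parameter values, the continuous-variable setting, and the function name below are illustrative assumptions; the paper's structural-design application is not reproduced here.

```python
import random

def harmony_search(obj, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    """Minimal harmony search minimizing obj over box-constrained real variables.

    hms  : harmony memory size
    hmcr : harmony memory considering rate (chance of recalling a stored value)
    par  : pitch adjusting rate (chance of perturbing a recalled value)
    bw   : pitch-adjustment bandwidth as a fraction of each variable's range
    """
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [obj(h) for h in memory]
    for _ in range(iters):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:                 # recall a value from memory
                x = random.choice(memory)[j]
                if random.random() < par:              # pitch adjustment
                    x += random.uniform(-1.0, 1.0) * bw * (hi - lo)
            else:                                      # random improvisation
                x = random.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        worst = max(range(hms), key=scores.__getitem__)
        s = obj(new)
        if s < scores[worst]:                          # keep improvements only
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

# Toy usage: minimize a 2-D sphere function over [-10, 10]^2
print(harmony_search(lambda v: sum(x * x for x in v), [(-10.0, 10.0)] * 2))
```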

Journal ArticleDOI
TL;DR: For too long, marketers have not been held accountable for showing how marketing expenditures add to shareholder value as mentioned in this paper, and this lack of accountability has undermined marketers' credibility, threatened the standing of the marketing function within the firm, and even threatened marketing's existence as a distinct capability.
Abstract: For too long, marketers have not been held accountable for showing how marketing expenditures add to shareholder value. As time has gone by, this lack of accountability has undermined marketers’ credibility, threatened the standing of the marketing function within the firm, and even threatened marketing’s existence as a distinct capability within the firm. This article proposes a broad framework for assessing marketing productivity, cataloging what is already known, and suggesting areas for further research. The authors conclude that it is possible to show how marketing expenditures add to shareholder value. The effective dissemination of new methods of assessing marketing productivity to the business community will be a major step toward raising marketing’s vitality in the firm and, more important, toward raising the performance of the firm itself. The authors also suggest many areas in which further research is essential to making methods of evaluating marketing productivity increasingly valid,...