
Showing papers by "University of Maryland, College Park" published in 2005


Journal ArticleDOI
22 Jul 2005-Science
TL;DR: Global croplands, pastures, plantations, and urban areas have expanded in recent decades, accompanied by large increases in energy, water, and fertilizer consumption, along with considerable losses of biodiversity.
Abstract: Land use has generally been considered a local environmental issue, but it is becoming a force of global importance. Worldwide changes to forests, farmlands, waterways, and air are being driven by the need to provide food, fiber, water, and shelter to more than six billion people. Global croplands, pastures, plantations, and urban areas have expanded in recent decades, accompanied by large increases in energy, water, and fertilizer consumption, along with considerable losses of biodiversity. Such changes in land use have enabled humans to appropriate an increasing share of the planet’s resources, but they also potentially undermine the capacity of ecosystems to sustain food production, maintain freshwater and forest resources, regulate climate and air quality, and ameliorate infectious diseases. We face the challenge of managing trade-offs between immediate human needs and maintaining the capacity of the biosphere to provide goods and services in the long term.

10,117 citations


Journal ArticleDOI
TL;DR: It is shown that information consensus under dynamically changing interaction topologies can be achieved asymptotically if the union of the directed interaction graphs has a spanning tree frequently enough as the system evolves.
Abstract: This note considers the problem of information consensus among multiple agents in the presence of limited and unreliable information exchange with dynamically changing interaction topologies. Both discrete and continuous update schemes are proposed for information consensus. This note shows that information consensus under dynamically changing interaction topologies can be achieved asymptotically if the union of the directed interaction graphs has a spanning tree frequently enough as the system evolves.
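As a toy illustration of the kind of discrete update rule analyzed here (a sketch with an assumed gain `eps`, not the note's exact scheme), each agent repeatedly moves toward the average of its in-neighbors' values while the directed topology switches every step; the two graphs below individually and jointly contain a spanning tree rooted at agent 0, so all agents converge to agent 0's value:

```python
# Illustrative discrete-time consensus update under a switching topology.
# Each agent i moves toward the mean of its in-neighbors' values.

def consensus_step(x, neighbors, eps=0.5):
    """One update: x_i <- x_i + eps * mean_j(x_j - x_i) over in-neighbors j."""
    new_x = list(x)
    for i, nbrs in neighbors.items():
        if nbrs:
            new_x[i] = x[i] + eps * sum(x[j] - x[i] for j in nbrs) / len(nbrs)
    return new_x

def run_consensus(x0, topologies, steps=200):
    x = list(x0)
    for t in range(steps):
        x = consensus_step(x, topologies[t % len(topologies)])
    return x

# Two switching directed topologies; each has a spanning tree rooted at 0.
topo_a = {0: [], 1: [0], 2: [1]}   # 0 -> 1 -> 2
topo_b = {0: [], 1: [0], 2: [0]}   # 0 -> 1, 0 -> 2
final = run_consensus([1.0, 5.0, 9.0], [topo_a, topo_b])
spread = max(final) - min(final)   # shrinks toward zero as agents agree
```

Agent 0 has no in-neighbors, so its value is the consensus point the others converge to; the spanning-tree condition is what guarantees such a "root" of influence exists over time.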

6,135 citations


Journal ArticleDOI
TL;DR: This study empirically test a model of knowledge contribution and finds that people contribute their knowledge when they perceive that it enhances their professional reputations, when they have the experience to share, and when they are structurally embedded in the network.
Abstract: Electronic networks of practice are computer-mediated discussion forums focused on problems of practice that enable individuals to exchange advice and ideas with others based on common interests. However, why individuals help strangers in these electronic networks is not well understood: there is no immediate benefit to the contributor, and free-riders are able to acquire the same knowledge as everyone else. To understand this paradox, we apply theories of collective action to examine how individual motivations and social capital influence knowledge contribution in electronic networks. This study reports on the activities of one electronic network supporting a professional legal association. Using archival, network, survey, and content analysis data, we empirically test a model of knowledge contribution. We find that people contribute their knowledge when they perceive that it enhances their professional reputations, when they have the experience to share, and when they are structurally embedded in the network. Surprisingly, contributions occur without regard to expectations of reciprocity from others or high levels of commitment to the network.

4,636 citations


Journal ArticleDOI
TL;DR: The authors applied cross-sectional and longitudinal propensity score matching estimators to data from the National Supported Work (NSW) Demonstration that have been previously analyzed by LaLonde (1986) and Dehejia and Wahba (1999, 2002).

2,380 citations


Journal ArticleDOI
TL;DR: The authors concluded that CQR is a viable qualitative method, suggested several ideas for research on the method itself, and made recommendations for modifications of the method.
Abstract: The authors reviewed the application of consensual qualitative research (CQR) in 27 studies published since the method's introduction to the field in 1997 by C. E. Hill, B. J. Thompson, and E. N. Williams (1997). After first describing the core components and the philosophical underpinnings of CQR, the authors examined how it has been applied in terms of the consensus process, biases, research teams, data collection, data analysis, and writing up the results and discussion sections of articles. On the basis of problems that have arisen in each of these areas, the authors made recommendations for modifications of the method. The authors concluded that CQR is a viable qualitative method and suggest several ideas for research on the method itself.

2,320 citations


Journal ArticleDOI
TL;DR: A surface plasmon polariton (SPP) is an electromagnetic excitation existing on the surface of a good metal, whose electromagnetic field decays exponentially with distance from the surface.

2,211 citations


Journal ArticleDOI
TL;DR: In 1981, the macrocyclic methylene-bridged glycoluril hexamer (CB[6]) was dubbed "cucurbituril" by Mock and co-workers because of its resemblance to the most prominent member of the cucurbitaceae family of plants--the pumpkin.
Abstract: In 1981, the macrocyclic methylene-bridged glycoluril hexamer (CB[6]) was dubbed "cucurbituril" by Mock and co-workers because of its resemblance to the most prominent member of the cucurbitaceae family of plants--the pumpkin. In the intervening years, the fundamental binding properties of CB[6]--high affinity, highly selective, and constrictive binding interactions--have been delineated by the pioneering work of the research groups of Mock, Kim, and Buschmann, and have led to their applications in waste-water remediation, as artificial enzymes, and as molecular switches. More recently, the cucurbit[n]uril family has grown to include homologues (CB[5]-CB[10]), derivatives, congeners, and analogues whose sizes span and exceed the range available with the alpha-, beta-, and gamma-cyclodextrins. Their shapes, solubility, and chemical functionality may now be tailored by synthetic chemistry to play a central role in molecular recognition, self-assembly, and nanotechnology. This Review focuses on the synthesis, recognition properties, and applications of these unique macrocycles.

2,074 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate the effect of financial, legal, and corruption problems on firms' growth rates and find that it is consistently the smallest firms that are most constrained.
Abstract: Using a unique firm-level survey database covering 54 countries, we investigate the effect of financial, legal, and corruption problems on firms' growth rates. Whether these factors constrain growth depends on firm size. It is consistently the smallest firms that are most constrained. Financial and institutional development weakens the constraining effects of financial, legal, and corruption obstacles and it is again the small firms that benefit the most. There is only a weak relation between firms' perception of the quality of the courts in their country and firm growth. We also provide evidence that the corruption of bank officials constrains firm growth.

2,030 citations


Journal ArticleDOI
TL;DR: In the most central Au+Au collisions at the highest beam energy, evidence is found for the formation of a very high energy density system whose description in terms of simple hadronic degrees of freedom is inappropriate.

1,786 citations


Journal ArticleDOI
TL;DR: A new harmony search (HS) meta-heuristic algorithm for engineering optimization problems with continuous design variables is proposed; conceptualized from the musical process of searching for a perfect state of harmony, it uses a stochastic random search instead of a gradient search.
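For readers unfamiliar with harmony search, the algorithm's core loop can be sketched as follows (a minimal sketch for continuous minimization; the parameter names `hms`, `hmcr`, and `par` follow common usage, and the published algorithm includes further refinements):

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=0):
    """Minimize f over box bounds using a basic harmony-search loop."""
    rng = random.Random(seed)
    # Harmony memory: hms random solutions, kept sorted best-first.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    memory.sort(key=f)
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # reuse a memorized value
                v = memory[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    v += rng.uniform(-1, 1) * 0.01 * (hi - lo)
            else:                                   # purely random note
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        if f(new) < f(memory[-1]):                  # replace worst harmony
            memory[-1] = new
            memory.sort(key=f)
    return memory[0]

# Minimize a simple quadratic with optimum at (3, -1).
best = harmony_search(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                      [(-10, 10), (-10, 10)])
```

Note that no gradient of `f` is ever taken: improvisation from memory plus small random pitch adjustments is the entire search mechanism, which is why the method applies to non-differentiable objectives.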

1,714 citations


Journal ArticleDOI
29 Apr 2005-Science
TL;DR: The authors developed a comprehensive database of >37,000 river restoration projects across the United States, most of which are intended to enhance water quality, manage riparian zones, improve in-stream habitat, allow fish passage, and stabilize stream banks.
Abstract: The authors of this Policy Forum developed a comprehensive database of >37,000 river restoration projects across the United States. Such projects have increased exponentially over the past decade with more than a billion dollars spent annually since 1990. Most are intended to enhance water quality, manage riparian zones, improve in-stream habitat, allow fish passage, and stabilize stream banks. Only 10% of project records document any form of project monitoring, and little if any of this information is either appropriate or available for assessing the ecological effectiveness of restoration activities.

Proceedings ArticleDOI
06 Jun 2005
TL;DR: The Horus system identifies different causes for the wireless channel variations and addresses them and uses location-clustering techniques to reduce the computational requirements of the algorithm and the lightweight Horus algorithm helps in supporting a larger number of users by running the algorithm at the clients.
Abstract: We present the design and implementation of the Horus WLAN location determination system. The design of the Horus system aims at satisfying two goals: high accuracy and low computational requirements. The Horus system identifies different causes for the wireless channel variations and addresses them to achieve its high accuracy. It uses location-clustering techniques to reduce the computational requirements of the algorithm. The lightweight Horus algorithm helps in supporting a larger number of users by running the algorithm at the clients. We discuss the different components of the Horus system and its implementation under two different operating systems and evaluate the performance of the Horus system on two testbeds. Our results show that the Horus system achieves its goal. It has an error of less than 0.6 meter on the average and its computational requirements are more than an order of magnitude better than other WLAN location determination systems. Moreover, the techniques developed in the context of the Horus system are general and can be applied to other WLAN location determination systems to enhance their accuracy. We also report lessons learned from experimenting with the Horus system and provide directions for future work.
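Horus itself uses probabilistic, clustering-based techniques; as a much simpler illustration of the underlying WLAN fingerprinting idea (all access-point names and RSSI readings below are hypothetical), a deterministic nearest-signature lookup might look like:

```python
import math

# Offline phase: record mean RSSI (dBm) per access point at known locations.
radio_map = {                      # hypothetical calibration data
    "room_a": {"ap1": -40, "ap2": -70, "ap3": -60},
    "room_b": {"ap1": -65, "ap2": -45, "ap3": -55},
    "hall":   {"ap1": -55, "ap2": -60, "ap3": -42},
}

def locate(scan, radio_map):
    """Online phase: return the location whose stored signature is nearest
    to the observed scan in signal space (Euclidean distance)."""
    def dist(sig):
        aps = set(scan) | set(sig)
        # An AP missing from a scan is treated as a weak -95 dBm reading.
        return math.sqrt(sum((scan.get(a, -95) - sig.get(a, -95)) ** 2
                             for a in aps))
    return min(radio_map, key=lambda loc: dist(radio_map[loc]))

loc = locate({"ap1": -42, "ap2": -72, "ap3": -58}, radio_map)  # -> "room_a"
```

Horus's contribution is precisely that such a naive deterministic match degrades under channel variation; modeling the RSSI distribution at each location and clustering locations is what buys its sub-meter accuracy at low cost.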

Posted Content
TL;DR: In this article, the authors investigate the nature of selection and productivity growth using data from industries where they observe producer-level quantities and prices separately, and show that there are important differences between revenue and physical productivity.
Abstract: There is considerable evidence that producer-level churning contributes substantially to aggregate (industry) productivity growth, as more productive businesses displace less productive ones. However, this research has been limited by the fact that producer-level prices are typically unobserved; thus within-industry price differences are embodied in productivity measures. If prices reflect idiosyncratic demand or market power shifts, high "productivity" businesses may not be particularly efficient, and the literature's findings might be better interpreted as evidence of entering businesses displacing less profitable, but not necessarily less productive, exiting businesses. In this paper, we investigate the nature of selection and productivity growth using data from industries where we observe producer-level quantities and prices separately. We show there are important differences between revenue and physical productivity. A key dissimilarity is that physical productivity is inversely correlated with plant-level prices while revenue productivity is positively correlated with prices. This implies that previous work linking (revenue-based) productivity to survival has confounded the separate and opposing effects of technical efficiency and demand on survival, understating the true impacts of both. We further show that young producers charge lower prices than incumbents, and as such the literature understates the productivity advantage of new producers and the contribution of entry to aggregate productivity growth.

Journal ArticleDOI
TL;DR: A real-time algorithm for foreground-background segmentation that can handle scenes containing moving backgrounds or illumination variations, and it achieves robust detection for different types of videos is presented.
Abstract: We present a real-time algorithm for foreground-background segmentation. Sample background values at each pixel are quantized into codebooks which represent a compressed form of background model for a long image sequence. This allows us to capture structural background variation due to periodic-like motion over a long period of time under limited memory. The codebook representation is efficient in memory and speed compared with other background modeling techniques. Our method can handle scenes containing moving backgrounds or illumination variations, and it achieves robust detection for different types of videos. We compared our method with other multimode modeling techniques. In addition to the basic algorithm, two features improving the algorithm are presented: layered modeling/detection and adaptive codebook updating. For performance evaluation, we have applied perturbation detection rate analysis to four background subtraction algorithms and two videos of different types of scenes.
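A heavily simplified, single-pixel grayscale version of the codebook idea (the paper's model also handles color distortion and brightness bounds) can be sketched as follows: training quantizes observed values into codewords, and detection marks a pixel foreground when no codeword matches within a tolerance `eps`:

```python
def train_codebook(samples, eps=10):
    """Quantize a pixel's training values into codewords [min_val, max_val]."""
    codewords = []
    for v in samples:
        for cw in codewords:
            if cw[0] - eps <= v <= cw[1] + eps:   # matches existing codeword
                cw[0] = min(cw[0], v)
                cw[1] = max(cw[1], v)
                break
        else:                                     # no match: new codeword
            codewords.append([v, v])
    return codewords

def is_foreground(v, codewords, eps=10):
    """Foreground if the new value matches no background codeword."""
    return not any(cw[0] - eps <= v <= cw[1] + eps for cw in codewords)

# A background pixel flickering between two intensity modes (e.g. a swaying
# branch) yields two codewords, so both modes stay classified as background.
history = [50, 52, 49, 120, 118, 51, 121]
cb = train_codebook(history)
```

The multimodal codebook is what lets moving backgrounds survive: a single-Gaussian model centered near the mean of `history` would misclassify both modes.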

Book
01 Jan 2005
TL;DR: In this paper, the authors discuss the changing contours of sexual violence and the challenge of HIV/AIDS in black sexual politics and discuss the power of a free mind in the face of racism and sexism.
Abstract: Acknowledgements Introduction: No Turning Back I. African Americans and the New Racism 1. Why Black Sexual Politics? 2. The Past Is Ever Present: Recognizing the New Racism 3. Prisons for Our Bodies, Closets for Our Minds: Racism, Heterosexism, and Black Sexuality II. Rethinking Black Gender Ideology 4. Get Your Freak On: Sex, Babies, and Images of Black Femininity 5. Booty Call: Sex, Violence, and Images of Black Masculinity 6. Very Necessary: Redefining Black Gender Ideology III. Toward a Progressive Black Sexual Politics 7. Assume the Position: The Changing Contours of Sexual Violence 8. No Storybook Romance: How Race and Gender Matter 9. Why We Can't Wait: Black Sexual Politics and the Challenge of HIV/AIDS Afterword: The Power of a Free Mind Notes Glossary Bibliography Index

Journal ArticleDOI
TL;DR: In this paper, the authors propose five criteria for measuring success of river restoration, with emphasis on an ecological perspective, and suggest standards of evaluation for each of the five criteria and provide examples of suitable indicators.
Abstract: Summary 1. Increasingly, river managers are turning from hard engineering solutions to ecologically based restoration activities in order to improve degraded waterways. River restoration projects aim to maintain or increase ecosystem goods and services while protecting downstream and coastal ecosystems. There is growing interest in applying river restoration techniques to solve environmental problems, yet little agreement exists on what constitutes a successful river restoration effort. 2. We propose five criteria for measuring success, with emphasis on an ecological perspective. First, the design of an ecological river restoration project should be based on a specified guiding image of a more dynamic, healthy river that could exist at the site. Secondly, the river's ecological condition must be measurably improved. Thirdly, the river system must be more self-sustaining and resilient to external perturbations so that only minimal follow-up maintenance is needed. Fourthly, during the construction phase, no lasting harm should be inflicted on the ecosystem. Fifthly, both pre- and post-assessment must be completed and data made publicly available. 3. Determining if these five criteria have been met for a particular project requires development of an assessment protocol. We suggest standards of evaluation for each of the five criteria and provide examples of suitable indicators. 4. Synthesis and applications. Billions of dollars are currently spent restoring streams and rivers, yet to date there are no agreed upon standards for what constitutes ecologically beneficial stream and river restoration. We propose five criteria that must be met for a river restoration project to be considered ecologically successful. It is critical that the broad restoration community, including funding agencies, practitioners and citizen restoration groups, adopt criteria for defining and assessing ecological success in restoration. Standards are needed because progress in the science and practice of river restoration has been hampered by the lack of agreed upon criteria for judging ecological success. Without well-accepted criteria that are ultimately supported by funding and implementing agencies, there is little incentive for practitioners to assess and report restoration outcomes. Improving methods and weighing the ecological benefits of various restoration approaches require organized national-level reporting systems.

Proceedings ArticleDOI
08 Jun 2005
TL;DR: A survey of consensus problems in multi-agent cooperative control with the goal of promoting research in this area is provided in this paper, where theoretical results regarding consensus seeking under both time-invariant and dynamically changing information exchange topologies are summarized.
Abstract: As a distributed solution to multi-agent coordination, consensus or agreement problems have been studied extensively in the literature. This paper provides a survey of consensus problems in multi-agent cooperative control with the goal of promoting research in this area. Theoretical results regarding consensus seeking under both time-invariant and dynamically changing information exchange topologies are summarized. Applications of consensus protocols to multiagent coordination are investigated. Future research directions and open problems are also proposed.

Journal ArticleDOI
TL;DR: This paper investigated the relation of the board of directors and institutional ownership with the properties of management earnings forecasts and found that firms with more outside directors and greater institutional ownership are more likely to issue a forecast and are inclined to forecast more frequently.
Abstract: We investigate the relation of the board of directors and institutional ownership with the properties of management earnings forecasts. We find that firms with more outside directors and greater institutional ownership are more likely to issue a forecast and are inclined to forecast more frequently. In addition, these forecasts tend to be more specific, accurate and less optimistically biased. These results are robust to changes in specification, Granger causality tests, and simultaneous equation analyses. The results are similar in the pre– and post–Regulation Fair Disclosure (Reg FD) eras. Additional analysis suggests that concentrated institutional ownership is negatively associated with forecast properties. This association is less negative in the post–Reg FD environment, which is consistent with Reg FD reducing the ability of firms to privately communicate information to select audiences.

Journal ArticleDOI
TL;DR: The authors examined the extent to which organizations' reputations encompass different types of stakeholders' perceptions, which may have differential effects on economic outcomes, and proposed a method to measure the influence of reputations on economic outcomes.
Abstract: We examined the extent to which organizations’ reputations encompass different types of stakeholders’ perceptions, which may have differential effects on economic outcomes. Specifically, we propose...

Proceedings ArticleDOI
25 Jun 2005
TL;DR: The model is formally a synchronous context-free grammar but is learned from a bitext without any syntactic information, which can be seen as a shift to the formal machinery of syntax-based translation systems without any linguistic commitment.
Abstract: We present a statistical phrase-based translation model that uses hierarchical phrases---phrases that contain subphrases. The model is formally a synchronous context-free grammar but is learned from a bitext without any syntactic information. Thus it can be seen as a shift to the formal machinery of syntax-based translation systems without any linguistic commitment. In our experiments using BLEU as a metric, the hierarchical phrase-based model achieves a relative improvement of 7.5% over Pharaoh, a state-of-the-art phrase-based system.

Journal ArticleDOI
TL;DR: The burst alert telescope (BAT) is one of three instruments on the Swift MIDEX spacecraft to study gamma-ray bursts (GRBs); it detects the GRB and localizes the burst direction to an accuracy of 1-4 arcmin within 20 s after the start of the event.
Abstract: The burst alert telescope (BAT) is one of three instruments on the Swift MIDEX spacecraft to study gamma-ray bursts (GRBs). The BAT first detects the GRB and localizes the burst direction to an accuracy of 1–4 arcmin within 20 s after the start of the event. The GRB trigger initiates an autonomous spacecraft slew to point the two narrow field-of-view (FOV) instruments at the burst location within 20–70 s so as to make follow-up X-ray and optical observations. The BAT is a wide-FOV, coded-aperture instrument with a CdZnTe detector plane. The detector plane is composed of 32,768 pieces of CdZnTe (4×4×2 mm), and the coded-aperture mask is composed of ∼52,000 pieces of lead (5×5×1 mm) with a 1-m separation between mask and detector plane. The BAT operates over the 15–150 keV energy range with ∼7 keV resolution, a sensitivity of ∼10−8 erg s−1 cm−2, and a 1.4 sr (half-coded) FOV. We expect to detect > 100 GRBs/year for a 2-year mission. The BAT also performs an all-sky hard X-ray survey with a sensitivity of ∼2 mCrab (systematic limit) and it serves as a hard X-ray transient monitor.

Journal ArticleDOI
TL;DR: A field study of top management teams and knowledge workers from 72 technology firms demonstrated that the rate of new product and service introduction was a function of organization members' ability to combine and exchange knowledge as mentioned in this paper.
Abstract: A field study of top management teams and knowledge workers from 72 technology firms demonstrated that the rate of new product and service introduction was a function of organization members’ ability to combine and exchange knowledge. We tested the following as bases of that ability: the existing knowledge of employees (their education levels and functional heterogeneity), knowledge from member ego networks (number of direct contacts and strength of ties), and organizational climates for risk taking and teamwork.

Journal ArticleDOI
TL;DR: These Standards will inform efforts in the field to find prevention programs and policies that are of proven efficacy, effectiveness, or readiness for adoption, and will guide prevention scientists as they seek to discover, research, and bring to the field new prevention programs and policies.
Abstract: Ever increasing demands for accountability, together with the proliferation of lists of evidence-based prevention programs and policies, led the Society for Prevention Research to charge a committee with establishing standards for identifying effective prevention programs and policies. Recognizing that interventions that are effective and ready for dissemination are a subset of effective programs and policies, and that effective programs and policies are a subset of efficacious interventions, SPR’s Standards Committee developed overlapping sets of standards. We designed these Standards to assist practitioners, policy makers, and administrators to determine which interventions are efficacious, which are effective, and which are ready for dissemination. Under these Standards, an efficacious intervention will have been tested in at least two rigorous trials that (1) involved defined samples from defined populations, (2) used psychometrically sound measures and data collection procedures; (3) analyzed their data with rigorous statistical approaches; (4) showed consistent positive effects (without serious iatrogenic effects); and (5) reported at least one significant long-term follow-up. An effective intervention under these Standards will not only meet all standards for efficacious interventions, but also will have (1) manuals, appropriate training, and technical support available to allow third parties to adopt and implement the intervention; (2) been evaluated under real-world conditions in studies that included sound measurement of the level of implementation and engagement of the target audience (in both the intervention and control conditions); (3) indicated the practical importance of intervention outcome effects; and (4) clearly demonstrated to whom intervention findings can be generalized. 
An intervention recognized as ready for broad dissemination under these Standards will not only meet all standards for efficacious and effective interventions, but will also provide (1) evidence of the ability to “go to scale”; (2) clear cost information; and (3) monitoring and evaluation tools so that adopting agencies can monitor or evaluate how well the intervention works in their settings. Finally, the Standards Committee identified possible standards desirable for current and future areas of prevention science as the field develops. If successful, these Standards will inform efforts in the field to find prevention programs and policies that are of proven efficacy, effectiveness, or readiness for adoption and will guide prevention scientists as they seek to discover, research, and bring to the field new prevention programs and policies.

Journal ArticleDOI
TL;DR: In this paper, three basic hysteretic models used in seismic demand evaluation are modified to include deterioration properties: bilinear, peak-oriented, and pinching, and the models incorporate an energy-based deterioration parameter that controls four cyclic deterioration modes: basic strength, postcapping strength, unloading stiffness, and accelerated reloading stiffness deterioration.
Abstract: This paper presents the description, calibration and application of relatively simple hysteretic models that include strength and stiffness deterioration properties, features that are critical for demand predictions as a structural system approaches collapse. Three of the basic hysteretic models used in seismic demand evaluation are modified to include deterioration properties: bilinear, peak-oriented, and pinching. The modified models include most of the sources of deterioration: i.e. various modes of cyclic deterioration and softening of the post-yielding stiffness, and also account for a residual strength after deterioration. The models incorporate an energy-based deterioration parameter that controls four cyclic deterioration modes: basic strength, post-capping strength, unloading stiffness, and accelerated reloading stiffness deterioration. Calibration of the hysteretic models on steel, plywood, and reinforced-concrete components demonstrates that the proposed models are capable of simulating the main characteristics that influence deterioration. An application of a peak-oriented deterioration model in the seismic evaluation of single-degree-of-freedom (SDOF) systems is illustrated. The advantages of using deteriorating hysteretic models for obtaining the response of highly inelastic systems are discussed. Copyright © 2005 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: It is suggested that the stress-proliferation process can be observed in instances of trauma, in early out-of-sequence transitions, and in the case of undesired changes that disrupt behaviors and relationships in established roles; efforts to close systemic health gaps must recognize their structural underpinnings.
Abstract: This article proposes several conceptual perspectives designed to advance our understanding of the material and experiential conditions contributing to persistent disparities in rates of morbidity and mortality among groups unequal in their social and economic statuses. An underlying assumption is that these disparities, which are in clear evidence at mid- and late life, may be anchored to earlier circumstances of the life course. Of particular interest are those circumstances resulting in people with the least privileged statuses having the greatest chances of exposure to health-related stressors. Among the stressors closely linked to status and status attainment are those that continue or are repeated across the life course, such as enduring economic strain and discriminatory experiences. Also taking a long-range toll on health are circumstances of stress proliferation, a process that places people exposed to a serious adversity at risk for later exposure to additional adversities. We suggest that this process can be observed in instances of trauma, in early out-of-sequence transitions, and in the case of undesired changes that disrupt behaviors and relationships in established roles. Effective effort to close the systemic health gaps must recognize their structural underpinnings.

Posted Content
TL;DR: In this article, the authors describe a method for data assimilation in large, spatio-temporally chaotic systems, in which the state estimate and its approximate uncertainty are represented at any given time by an ensemble of system states.
Abstract: Data assimilation is an iterative approach to the problem of estimating the state of a dynamical system using both current and past observations of the system together with a model for the system's time evolution. Rather than solving the problem from scratch each time new observations become available, one uses the model to "forecast" the current state, using a prior state estimate (which incorporates information from past data) as the initial condition, then uses current data to correct the prior forecast to a current state estimate. This Bayesian approach is most effective when the uncertainty in both the observations and in the state estimate, as it evolves over time, are accurately quantified. In this article, we describe a practical method for data assimilation in large, spatiotemporally chaotic systems. The method is a type of "ensemble Kalman filter", in which the state estimate and its approximate uncertainty are represented at any given time by an ensemble of system states. We discuss both the mathematical basis of this approach and its implementation; our primary emphasis is on ease of use and computational speed rather than improving accuracy over previously published approaches to ensemble Kalman filtering. We include some numerical results demonstrating the efficiency and accuracy of our implementation for assimilating real atmospheric data with the global forecast model used by the U.S. National Weather Service.
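The ensemble idea can be illustrated on a scalar state with a direct observation (a basic stochastic-EnKF sketch; the authors' particular filter differs in how it transforms and localizes the ensemble, and the numbers below are arbitrary test values):

```python
import random

def enkf_update(ensemble, y, H, obs_var, rng):
    """One stochastic EnKF analysis step for scalar state and observation.

    ensemble: list of state samples representing the prior and its spread.
    y: observed value; H: observation operator; obs_var: obs error variance.
    """
    n = len(ensemble)
    hx = [H(x) for x in ensemble]
    x_mean = sum(ensemble) / n
    h_mean = sum(hx) / n
    # Sample covariances estimated directly from the ensemble.
    p_xh = sum((x - x_mean) * (h - h_mean)
               for x, h in zip(ensemble, hx)) / (n - 1)
    p_hh = sum((h - h_mean) ** 2 for h in hx) / (n - 1)
    K = p_xh / (p_hh + obs_var)                    # Kalman gain
    # Update each member against a perturbed copy of the observation.
    return [x + K * (y + rng.gauss(0, obs_var ** 0.5) - h)
            for x, h in zip(ensemble, hx)]

rng = random.Random(1)
prior = [rng.gauss(0.0, 2.0) for _ in range(500)]  # prior: mean 0, sd 2
post = enkf_update(prior, 3.0, lambda x: x, obs_var=1.0, rng=rng)
post_mean = sum(post) / len(post)                  # pulled toward y = 3
```

After the update the ensemble mean moves toward the observation in proportion to the gain, and the ensemble spread shrinks, which is exactly the "approximate uncertainty" bookkeeping the abstract describes.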

Journal ArticleDOI
TL;DR: A new key predistribution scheme is proposed which substantially improves the resilience of the network compared to previous schemes, and an in-depth analysis of the scheme in terms of network resilience and associated overhead is given.
Abstract: To achieve security in wireless sensor networks, it is important to be able to encrypt and authenticate messages sent between sensor nodes. Before doing so, keys for performing encryption and authentication must be agreed upon by the communicating parties. Due to resource constraints, however, achieving key agreement in wireless sensor networks is nontrivial. Many key agreement schemes used in general networks, such as Diffie-Hellman and other public-key based schemes, are not suitable for wireless sensor networks due to the limited computational abilities of the sensor nodes. Predistribution of secret keys for all pairs of nodes is not viable due to the large amount of memory this requires when the network size is large. In this paper, we provide a framework in which to study the security of key predistribution schemes, propose a new key predistribution scheme which substantially improves the resilience of the network compared to previous schemes, and give an in-depth analysis of our scheme in terms of network resilience and associated overhead. Our scheme exhibits a nice threshold property: when the number of compromised nodes is less than the threshold, the probability that communications between any additional nodes are compromised is close to zero. This desirable property lowers the initial payoff of smaller-scale network breaches to an adversary, and makes it necessary for the adversary to attack a large fraction of the network before it can achieve any significant gain.
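As background for what the paper improves on, the basic random key predistribution baseline (Eschenauer-Gligor style, not the paper's own scheme) can be sketched as follows: each node is preloaded with a random subset of a large key pool, and two nodes can communicate securely iff their key rings intersect; the pool and ring sizes below are illustrative:

```python
import random

def deploy(pool_size, ring_size, n_nodes, seed=0):
    """Preload each node with a random ring of key IDs from a shared pool."""
    rng = random.Random(seed)
    pool = list(range(pool_size))
    return [set(rng.sample(pool, ring_size)) for _ in range(n_nodes)]

def shared_key(ring_a, ring_b):
    """Return a deterministic shared key ID, or None if the rings are disjoint."""
    common = ring_a & ring_b
    return min(common) if common else None

rings = deploy(pool_size=1000, ring_size=75, n_nodes=200)

# Fraction of node pairs that can establish a key directly; with these sizes
# the expected connectivity is roughly 1 - exp(-75**2 / 1000), i.e. >99%.
connected = sum(1 for i in range(len(rings)) for j in range(i + 1, len(rings))
                if rings[i] & rings[j])
total_pairs = len(rings) * (len(rings) - 1) // 2
```

The weakness of this baseline, which motivates the paper's threshold scheme, is that every captured node leaks pool keys shared with many other pairs; the proposed construction keeps compromise probability near zero until an attacker exceeds a node-capture threshold.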

Journal ArticleDOI
TL;DR: While network analysis has been studied in depth in particular areas such as social network analysis, hypertext mining, and web analysis, only recently has there been a cross-fertilization of ideas among these different communities.
Abstract: Many datasets of interest today are best described as a linked collection of interrelated objects. These may represent homogeneous networks, in which there is a single-object type and link type, or richer, heterogeneous networks, in which there may be multiple object and link types (and possibly other semantic information). Examples of homogeneous networks include single mode social networks, such as people connected by friendship links, or the WWW, a collection of linked web pages. Examples of heterogeneous networks include those in medical domains describing patients, diseases, treatments and contacts, or in bibliographic domains describing publications, authors, and venues. Link mining refers to data mining techniques that explicitly consider these links when building predictive or descriptive models of the linked data. Commonly addressed link mining tasks include object ranking, group detection, collective classification, link prediction and subgraph discovery. While network analysis has been studied in depth in particular areas such as social network analysis, hypertext mining, and web analysis, only recently has there been a cross-fertilization of ideas among these different communities. This is an exciting, rapidly expanding area. In this article, we review some of the common emerging themes.
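To make one of the listed link mining tasks concrete, here is a toy common-neighbors link predictor over a homogeneous friendship network of the kind the abstract describes. The graph, names, and scoring function are invented for illustration and are not from the article; common neighbors is simply a standard baseline for link prediction.

```python
from itertools import combinations

def common_neighbor_scores(edges):
    """Score each non-adjacent node pair by its number of shared neighbors,
    a classic baseline for the link prediction task."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return {
        (u, v): len(adj[u] & adj[v])
        for u, v in combinations(sorted(adj), 2)
        if v not in adj[u]  # only score pairs not already linked
    }

# A toy friendship network (hypothetical): ann and dave are not yet linked
# but share three friends, so they come out as the top predicted link.
edges = [("ann", "bob"), ("ann", "carol"), ("ann", "erin"),
         ("bob", "dave"), ("carol", "dave"), ("erin", "dave")]
scores = common_neighbor_scores(edges)
```

Heterogeneous networks complicate this picture: with multiple object and link types, a single adjacency set per node no longer suffices, which is part of what makes the cross-fertilization the survey describes interesting.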

Journal ArticleDOI
TL;DR: A new second-generation theory of fiscal federalism is emerging that provides new insights into the structure and working of federal systems and into the design of fiscal institutions.
Abstract: Drawing on a wide range of literature and ideas, a new “second-generation theory of fiscal federalism” is emerging that provides new insights into the structure and working of federal systems. After a restatement and review of the first-generation theory, this paper surveys this new body of work and offers some thoughts on the ways in which it is extending our understanding of fiscal federalism and on its implications for the design of fiscal institutions.

Journal ArticleDOI
TL;DR: This study uncovers and examines the variety of supply chain partnership configurations that exist based on differences in capability platforms, reflecting varying processes and information systems, and uses the absorptive capacity lens to build a conceptual framework that links these configurations with partner-enabled market knowledge creation.
Abstract: The need for continual value innovation is driving supply chains to evolve from a pure transactional focus to leveraging interorganizational partnerships for sharing information and, ultimately, market knowledge creation. Supply chain partners are (1) engaging in interlinked processes that enable rich (broad-ranging, high quality, and privileged) information sharing, and (2) building information technology infrastructures that allow them to process information obtained from their partners to create new knowledge. This study uncovers and examines the variety of supply chain partnership configurations that exist based on differences in capability platforms, reflecting varying processes and information systems. We use the absorptive capacity lens to build a conceptual framework that links these configurations with partner-enabled market knowledge creation. Absorptive capacity refers to the set of organizational routines and processes by which organizations acquire, assimilate, transform, and exploit knowledge to produce dynamic organizational capabilities. Through an exploratory field study conducted in the context of the RosettaNet consortium effort in the IT industry supply chain, we use cluster analysis to uncover and characterize five supply chain partnership configurations (collectors, connectors, crunchers, coercers, and collaborators). We compare their partner-enabled knowledge creation and operational efficiency, as well as the shortcomings in their capability platforms and the nature of information exchange. Through the characterization of each of the configurations, we are able to derive research propositions focused on enterprise absorptive capacity elements. These propositions provide insight into how partner-enabled market knowledge creation and operational efficiency can be affected, and highlight the interconnected roles of coordination information and rich information.
The paper concludes by drawing implications for research and practice from the uncovering of these configurations and the resultant research propositions. It also highlights fertile opportunities for advances in research on knowledge management through the study of supply chain contexts and other interorganizational partnering arrangements.
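The clustering step can be illustrated generically: a plain k-means over hypothetical partner capability profiles. All feature names and data below are invented for illustration; the abstract does not specify the study's variables or clustering procedure, so this is only a sketch of how configurations might be uncovered from capability measures.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Lloyd's k-means with deterministic farthest-first seeding."""
    # Seed centroids: start from X[0], then repeatedly add the point
    # farthest from every centroid chosen so far.
    idx = [0]
    for _ in range(k - 1):
        dist = np.min(np.linalg.norm(X[:, None] - X[idx][None], axis=2), axis=1)
        idx.append(int(dist.argmax()))
    centroids = X[idx].astype(float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Hypothetical partner profiles: (information-sharing richness,
# IT processing capability), scaled to [0, 1]; two obvious groups.
profiles = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.15],
                     [0.90, 0.80], [0.80, 0.90], [0.85, 0.85]])
labels, centers = kmeans(profiles, k=2)
```

With richer feature vectors and k = 5, the same procedure would partition partners into five groups analogous to the collectors-through-collaborators configurations, though the study's actual analysis may well have used a different clustering algorithm.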