
Showing papers by "Cornell University" published in 2002


Proceedings ArticleDOI
06 Jul 2002
TL;DR: This work considers the problem of classifying documents not by topic, but by overall sentiment, e.g., determining whether a review is positive or negative, and concludes by examining factors that make the sentiment classification problem more challenging.
Abstract: We consider the problem of classifying documents not by topic, but by overall sentiment, e.g., determining whether a review is positive or negative. Using movie reviews as data, we find that standard machine learning techniques definitively outperform human-produced baselines. However, the three machine learning methods we employed (Naive Bayes, maximum entropy classification, and support vector machines) do not perform as well on sentiment classification as on traditional topic-based categorization. We conclude by examining factors that make the sentiment classification problem more challenging.
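The three learners named above are standard supervised classifiers over bag-of-words features. As a minimal illustration of the Naive Bayes variant, here is a self-contained sketch with add-one smoothing; the tiny training set is invented for illustration and is not the paper's movie-review corpus:

```python
from collections import Counter
import math

def train_nb(docs):
    """Train a unigram Naive Bayes model; docs is a list of (tokens, label)."""
    class_counts = Counter()
    word_counts = {}
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab

def classify_nb(tokens, class_counts, word_counts, vocab):
    """Pick the label maximizing log P(label) + sum of log P(word|label),
    with add-one (Laplace) smoothing over the vocabulary."""
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in tokens:
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented toy reviews (not the paper's corpus)
docs = [
    ("great wonderful film".split(), "pos"),
    ("loved the great acting".split(), "pos"),
    ("terrible boring plot".split(), "neg"),
    ("awful waste boring".split(), "neg"),
]
model = train_nb(docs)
print(classify_nb("great film".split(), *model))  # pos
```

A real experiment would, as in the paper, train on thousands of reviews and compare against maximum entropy and SVM learners.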

6,626 citations


Journal ArticleDOI
David M. Post1
01 Mar 2002-Ecology
TL;DR: In this article, the authors develop and discuss methods for generating an isotopic baseline and evaluate the assumptions required to estimate the trophic position of consumers using stable isotopes in multiple ecosystem studies.
Abstract: The stable isotopes of nitrogen (δ15N) and carbon (δ13C) provide powerful tools for estimating the trophic positions of and carbon flow to consumers in food webs; however, the isotopic signature of a consumer alone is not generally sufficient to infer trophic position or carbon source without an appropriate isotopic baseline. In this paper, I develop and discuss methods for generating an isotopic baseline and evaluate the assumptions required to estimate the trophic position of consumers using stable isotopes in multiple ecosystem studies. I test the ability of two primary consumers, surface-grazing snails and filter-feeding mussels, to capture the spatial and temporal variation at the base of aquatic food webs. I find that snails reflect the isotopic signature of the base of the littoral food web, mussels reflect the isotopic signature of the pelagic food web, and together they provide a good isotopic baseline for estimating trophic position of secondary or higher trophic level consumers in lake ecosystems. Then, using data from 25 north temperate lakes, I evaluate how δ15N and δ13C of the base of aquatic food webs varies both among lakes and between the littoral and pelagic food webs within lakes. Using data from the literature, I show that the mean trophic fractionation of δ15N is 3.4‰ (1 SD = 1‰) and of δ13C is 0.4‰ (1 SD = 1.3‰), and that both, even though variable, are widely applicable. A sensitivity analysis reveals that estimates of trophic position are very sensitive to assumptions about the trophic fractionation of δ15N, moderately sensitive to different methods for generating an isotopic baseline, and not sensitive to assumptions about the trophic fractionation of δ13C when δ13C is used to estimate the proportion of nitrogen in a consumer derived from two sources. Finally, I compare my recommendations for generating an isotopic baseline to an alternative model proposed by M. J. Vander Zanden and J. B. Rasmussen.
With an appropriate isotopic baseline and an appreciation of the underlying assumptions and model sensitivity, stable isotopes can help answer some of the most difficult questions in food web ecology.
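The trophic-position estimate discussed above combines a δ13C mixing equation (to weight the two baselines) with the δ15N trophic fractionation of 3.4‰. A sketch of that two-end-member calculation, using invented isotope values for a hypothetical lake consumer:

```python
def trophic_position(d15n_c, d13c_c, base1, base2, lam=2.0, delta_n=3.4):
    """Two-end-member mixing model for trophic position.
    base1 / base2 are (d15N, d13C) of the littoral and pelagic baselines;
    lam is the trophic position of the baseline organisms (2 for primary
    consumers such as snails and mussels); delta_n is the per-level
    d15N fractionation (3.4 permil)."""
    d15n_1, d13c_1 = base1
    d15n_2, d13c_2 = base2
    # Fraction of carbon derived from baseline 1, from the d13C mixing equation
    alpha = (d13c_c - d13c_2) / (d13c_1 - d13c_2)
    alpha = min(max(alpha, 0.0), 1.0)  # clamp to the physically meaningful range
    d15n_base = alpha * d15n_1 + (1 - alpha) * d15n_2
    return lam + (d15n_c - d15n_base) / delta_n

# Illustrative values (not data from the paper): a consumer enriched roughly
# two trophic levels above a mixed littoral/pelagic baseline
tp = trophic_position(d15n_c=12.0, d13c_c=-26.0,
                      base1=(5.0, -24.0), base2=(4.0, -30.0))
print(round(tp, 2))  # about 4.16
```

Here λ = 2 because the baseline organisms (snails, mussels) are themselves primary consumers.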

5,648 citations



Journal ArticleDOI
TL;DR: In this paper, the authors discuss the discounted utility (DU) model, its historical development, underlying assumptions, and "anomalies" -the empirical regularities that are inconsistent with its theoretical predictions.
Abstract: This paper discusses the discounted utility (DU) model: its historical development, underlying assumptions, and "anomalies" - the empirical regularities that are inconsistent with its theoretical predictions. We then summarize the alternate theoretical formulations that have been advanced to address these anomalies. We also review three decades of empirical research on intertemporal choice, and discuss reasons for the spectacular variation in implicit discount rates across studies. Throughout the paper, we stress the importance of distinguishing time preference, per se, from many other considerations that also influence intertemporal choices.
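The DU model itself is compact: utility streams are aggregated with exponentially declining weights δ^t, which implies a constant implicit discount rate. The sketch below contrasts those weights with a hyperbolic alternative of the kind advanced to address the anomalies; the utility numbers and k = 1 are illustrative only:

```python
def du(utils, delta=0.9):
    """Discounted utility: U = sum over t of delta**t * u_t (exponential discounting)."""
    return sum(delta**t * u for t, u in enumerate(utils))

def hyperbolic_weight(t, k=1.0):
    """One common alternative: 1 / (1 + k*t) weights, which imply
    discount rates that decline with horizon."""
    return 1.0 / (1.0 + k * t)

# Exponential discounting values a constant-utility stream geometrically
print(round(du([1, 1, 1]), 2))  # 1 + 0.9 + 0.81 = 2.71

# The ratio of weights between adjacent periods is constant under
# exponential discounting but rises toward 1 under hyperbolic discounting,
# which is what generates preference reversals over time.
exp_ratios = [round(0.9**(t + 1) / 0.9**t, 2) for t in range(3)]
hyp_ratios = [round(hyperbolic_weight(t + 1) / hyperbolic_weight(t), 2) for t in range(3)]
print(exp_ratios)  # [0.9, 0.9, 0.9]
print(hyp_ratios)  # [0.5, 0.67, 0.75]
```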

5,242 citations


Proceedings ArticleDOI
23 Jul 2002
TL;DR: The goal of this paper is to develop a method that utilizes clickthrough data for training, namely the query-log of the search engine in connection with the log of links the users clicked on in the presented ranking.
Abstract: This paper presents an approach to automatically optimizing the retrieval quality of search engines using clickthrough data. Intuitively, a good information retrieval system should present relevant documents high in the ranking, with less relevant documents following below. While previous approaches to learning retrieval functions from examples exist, they typically require training data generated from relevance judgments by experts. This makes them difficult and expensive to apply. The goal of this paper is to develop a method that utilizes clickthrough data for training, namely the query-log of the search engine in connection with the log of links the users clicked on in the presented ranking. Such clickthrough data is available in abundance and can be recorded at very low cost. Taking a Support Vector Machine (SVM) approach, this paper presents a method for learning retrieval functions. From a theoretical perspective, this method is shown to be well-founded in a risk minimization framework. Furthermore, it is shown to be feasible even for large sets of queries and features. The theoretical results are verified in a controlled experiment. It shows that the method can effectively adapt the retrieval function of a meta-search engine to a particular group of users, outperforming Google in terms of retrieval quality after only a couple of hundred training examples.
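The key data-preparation step is turning a click log into relative preference judgments rather than absolute relevance labels. A sketch of one such extraction rule, in the spirit of the paper's approach (a clicked result is inferred to be preferred over unclicked results ranked above it); the document names and click pattern are hypothetical:

```python
def preference_pairs(ranking, clicked):
    """Derive training pairs from clickthrough data: a clicked result is
    preferred over every result ranked above it that was NOT clicked,
    since the user presumably saw and skipped those."""
    clicked = set(clicked)
    pairs = []
    for i, doc in enumerate(ranking):
        if doc in clicked:
            for skipped in ranking[:i]:
                if skipped not in clicked:
                    pairs.append((doc, skipped))  # doc preferred over skipped
    return pairs

# Hypothetical query log: the user clicked the 2nd and 4th results
ranking = ["d_a", "d_b", "d_c", "d_d"]
pairs = preference_pairs(ranking, clicked=["d_b", "d_d"])
print(pairs)  # [('d_b', 'd_a'), ('d_d', 'd_a'), ('d_d', 'd_c')]
```

Pairs like these become the ordering constraints that the ranking SVM is trained to satisfy.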

4,453 citations


Journal ArticleDOI
Jin Fan1, Bruce D. McCandliss1, Tobias Sommer1, Amir Raz1, Michael I. Posner1 
TL;DR: A study with 40 normal adult subjects indicates that the ANT produces reliable single subject estimates of alerting, orienting, and executive function, and further suggests that the efficiencies of these three networks are uncorrelated.
Abstract: In recent years, three attentional networks have been defined in anatomical and functional terms. These functions involve alerting, orienting, and executive attention. Reaction time measures can be used to quantify the processing efficiency within each of these three networks. The Attention Network Test (ANT) is designed to evaluate alerting, orienting, and executive attention within a single 30-min testing session that can be easily performed by children, patients, and monkeys. A study with 40 normal adult subjects indicates that the ANT produces reliable single subject estimates of alerting, orienting, and executive function, and further suggests that the efficiencies of these three networks are uncorrelated. There are, however, some interactions in which alerting and orienting can modulate the degree of interference from flankers. This procedure may prove to be convenient and useful in evaluating attentional abnormalities associated with cases of brain injury, stroke, schizophrenia, and attention-deficit disorder. The ANT may also serve as an activation task for neuroimaging studies and as a phenotype for the study of the influence of genes on attentional networks.
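The three network efficiencies are simple subtractions between cue and flanker conditions of the ANT. A sketch with illustrative reaction times (the numbers are invented, not data from the study):

```python
def ant_scores(rt):
    """Attention network scores from mean reaction times (ms) per condition:
    larger alerting/orienting scores mean more benefit from the cue;
    a larger conflict score means more interference from incongruent flankers."""
    return {
        "alerting":  rt["no_cue"] - rt["double_cue"],
        "orienting": rt["center_cue"] - rt["spatial_cue"],
        "conflict":  rt["incongruent"] - rt["congruent"],
    }

# Illustrative condition means for one subject
rt = {"no_cue": 560, "double_cue": 520, "center_cue": 540,
      "spatial_cue": 500, "incongruent": 620, "congruent": 530}
print(ant_scores(rt))  # {'alerting': 40, 'orienting': 40, 'conflict': 90}
```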

3,166 citations


Journal ArticleDOI
TL;DR: Thermal processing enhanced the nutritional value of tomatoes by increasing the bioaccessible lycopene content and total antioxidant activity and are against the notion that processed fruits and vegetables have lower nutritional value than fresh produce.
Abstract: Processed fruits and vegetables have been long considered to have lower nutritional value than their fresh commodities due to the loss of vitamin C during processing. This research group found vitamin C in apples contributed < 0.4% of total antioxidant activity, indicating most of the activity comes from the natural combination of phytochemicals. This suggests that processed fruits and vegetables may retain their antioxidant activity despite the loss of vitamin C. Here it is shown that thermal processing elevated total antioxidant activity and bioaccessible lycopene content in tomatoes and produced no significant changes in the total phenolics and total flavonoids content, although loss of vitamin C was observed. The raw tomato had 0.76 +/- 0.03 micromol of vitamin C/g of tomato. After 2, 15, and 30 min of heating at 88 degrees C, the vitamin C content significantly dropped to 0.68 +/- 0.02, 0.64 +/- 0.01, and 0.54 +/- 0.02 micromol of vitamin C/g of tomato, respectively (p < 0.01). The raw tomato had 2.01 +/- 0.04 mg of trans-lycopene/g of tomato. After 2, 15, and 30 min of heating at 88 degrees C, the trans-lycopene content had increased to 3.11 +/- 0.04, 5.45 +/- 0.02, and 5.32 +/- 0.05 mg of trans-lycopene/g of tomato (p < 0.01). The antioxidant activity of raw tomatoes was 4.13 +/- 0.36 micromol of vitamin C equiv/g of tomato. With heat treatment at 88 degrees C for 2, 15, and 30 min, the total antioxidant activity significantly increased to 5.29 +/- 0.26, 5.53 +/- 0.24, and 6.70 +/- 0.25 micromol of vitamin C equiv/g of tomato, respectively (p < 0.01). There were no significant changes in either total phenolics or total flavonoids. These findings indicate thermal processing enhanced the nutritional value of tomatoes by increasing the bioaccessible lycopene content and total antioxidant activity, and are against the notion that processed fruits and vegetables have lower nutritional value than fresh produce. This information may have a significant impact on consumers' food selection by increasing their consumption of fruits and vegetables to reduce the risks of chronic diseases.

2,738 citations


Journal ArticleDOI
Carl Nathan1
19 Dec 2002-Nature
TL;DR: The non-inflammatory state does not arise passively from an absence of inflammatory stimuli; rather, maintenance of health requires the positive actions of specific gene products to suppress reactions to potentially inflammatory stimuli that do not warrant a full response.
Abstract: Inflammation is a complex set of interactions among soluble factors and cells that can arise in any tissue in response to traumatic, infectious, post-ischaemic, toxic or autoimmune injury. The process normally leads to recovery from infection and to healing. However, if targeted destruction and assisted repair are not properly phased, inflammation can lead to persistent tissue damage by leukocytes, lymphocytes or collagen. Inflammation may be considered in terms of its checkpoints, where binary or higher-order signals drive each commitment to escalate, go signals trigger stop signals, and molecules responsible for mediating the inflammatory response also suppress it, depending on timing and context. The non-inflammatory state does not arise passively from an absence of inflammatory stimuli; rather, maintenance of health requires the positive actions of specific gene products to suppress reactions to potentially inflammatory stimuli that do not warrant a full response.

2,525 citations


Journal ArticleDOI
TL;DR: In this paper, the authors reviewed the available information about the physical and chemical properties of charcoal as affected by different combustion procedures, and the effects of its application in agricultural fields on nutrient retention and crop production.
Abstract: Rapid turnover of organic matter leads to a low efficiency of organic fertilizers applied to increase and sequester C in soils of the humid tropics. Charcoal was reported to be responsible for high soil organic matter contents and soil fertility of anthropogenic soils (Terra Preta) found in central Amazonia. Therefore, we reviewed the available information about the physical and chemical properties of charcoal as affected by different combustion procedures, and the effects of its application in agricultural fields on nutrient retention and crop production. Higher nutrient retention and nutrient availability were found after charcoal additions to soil, related to higher exchange capacity, surface area and direct nutrient additions. Higher charring temperatures generally improved exchange properties and surface area of the charcoal. Additionally, charcoal is relatively recalcitrant and can therefore be used as a long-term sink for atmospheric CO2. Several aspects of a charcoal management system remain unclear, such as the role of microorganisms in oxidizing charcoal surfaces and releasing nutrients and the possibilities to improve charcoal properties during production under field conditions. Several research needs were identified, such as field testing of charcoal production in tropical agroecosystems, the investigation of surface properties of the carbonized materials in the soil environment, and the evaluation of the agronomic and economic effectiveness of soil management with charcoal.

2,514 citations


Journal ArticleDOI
21 Jun 2002-Science
TL;DR: To improve the ability to predict epidemics in wild populations, it will be necessary to separate the independent and interactive effects of multiple climate drivers on disease impact.
Abstract: Infectious diseases can cause rapid population declines or species extinctions. Many pathogens of terrestrial and marine taxa are sensitive to temperature, rainfall, and humidity, creating synergisms that could affect biodiversity. Climate warming can increase pathogen development and survival rates, disease transmission, and host susceptibility. Although most host-parasite systems are predicted to experience more frequent or severe disease impacts with warming, a subset of pathogens might decline.

2,462 citations


Proceedings ArticleDOI
03 Jun 2002
TL;DR: This paper shows that XML's ordered data model can indeed be efficiently supported by a relational database system, and proposes three order encoding methods that can be used to represent XML order in the relational data model, and also proposes algorithms for translating ordered XPath expressions into SQL using these encoding methods.
Abstract: XML is quickly becoming the de facto standard for data exchange over the Internet. This is creating a new set of data management requirements involving XML, such as the need to store and query XML documents. Researchers have proposed using relational database systems to satisfy these requirements by devising ways to "shred" XML documents into relations, and translate XML queries into SQL queries over these relations. However, a key issue with such an approach, which has largely been ignored in the research literature, is how (and whether) the ordered XML data model can be efficiently supported by the unordered relational data model. This paper shows that XML's ordered data model can indeed be efficiently supported by a relational database system. This is accomplished by encoding order as a data value. We propose three order encoding methods that can be used to represent XML order in the relational data model, and also propose algorithms for translating ordered XPath expressions into SQL using these encoding methods. Finally, we report the results of an experimental study that investigates the performance of the proposed order encoding methods on a workload of ordered XML queries and updates.
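One of the proposed encodings, the Dewey order, keys each node by its path of sibling positions, so document order reduces to lexicographic comparison of the keys, something a relational index handles directly. A sketch of key assignment on a hypothetical shredded document:

```python
def dewey_keys(tree, prefix=()):
    """Assign Dewey order keys: each element's key is its parent's key
    extended with its 1-based position among its siblings. Document order
    is then lexicographic order on the key tuples."""
    keys = {}
    for i, (name, children) in enumerate(tree, start=1):
        key = prefix + (i,)
        keys[name] = key
        keys.update(dewey_keys(children, key))
    return keys

# Hypothetical document <a><b/><c><d/></c></a>, shredded into (name, children) pairs
doc = [("a", [("b", []), ("c", [("d", [])])])]
keys = dewey_keys(doc)
print(sorted(keys, key=keys.get))  # document order: ['a', 'b', 'c', 'd']
print(keys["d"])  # (1, 2, 1): first child of the second child of the root
```

Stored as a column, these keys let ordered XPath axes be translated into ordinary SQL comparisons.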

Journal ArticleDOI
TL;DR: A localization algorithm motivated from least-squares fitting theory is constructed and tested both on image stacks of 30-nm fluorescent beads and on computer-generated images (Monte Carlo simulations), and results show good agreement with the derived precision equation.

Journal ArticleDOI
Robert A. Holt1, G. Mani Subramanian1, Aaron L. Halpern1, Granger G. Sutton1, Rosane Charlab1, Deborah R. Nusskern1, Patrick Wincker2, Andrew G. Clark3, José M. C. Ribeiro4, Ron Wides5, Steven L. Salzberg6, Brendan J. Loftus6, Mark Yandell1, William H. Majoros6, William H. Majoros1, Douglas B. Rusch1, Zhongwu Lai1, Cheryl L. Kraft1, Josep F. Abril, Véronique Anthouard2, Peter Arensburger7, Peter W. Atkinson7, Holly Baden1, Véronique de Berardinis2, Danita Baldwin1, Vladimir Benes, Jim Biedler8, Claudia Blass, Randall Bolanos1, Didier Boscus2, Mary Barnstead1, Shuang Cai1, Kabir Chatuverdi1, George K. Christophides, Mathew A. Chrystal9, Michele Clamp10, Anibal Cravchik1, Val Curwen10, Ali N Dana9, Arthur L. Delcher1, Ian M. Dew1, Cheryl A. Evans1, Michael Flanigan1, Anne Grundschober-Freimoser11, Lisa Friedli7, Zhiping Gu1, Ping Guan1, Roderic Guigó, Maureen E. Hillenmeyer9, Susanne L. Hladun1, James R. Hogan9, Young S. Hong9, Jeffrey Hoover1, Olivier Jaillon2, Zhaoxi Ke1, Zhaoxi Ke9, Chinnappa D. Kodira1, Kokoza Eb, Anastasios C. Koutsos12, Ivica Letunic, Alex Levitsky1, Yong Liang1, Jhy-Jhu Lin6, Jhy-Jhu Lin1, Neil F. Lobo9, John Lopez1, Joel A. Malek6, Tina C. McIntosh1, Stephan Meister, Jason R. Miller1, Clark M. Mobarry1, Emmanuel Mongin13, Sean D. Murphy1, David A. O'Brochta11, Cynthia Pfannkoch1, Rong Qi1, Megan A. Regier1, Karin A. Remington1, Hongguang Shao8, Maria V. Sharakhova9, Cynthia Sitter1, Jyoti Shetty6, Thomas J. Smith1, Renee Strong1, Jingtao Sun1, Dana Thomasova, Lucas Q. Ton9, Pantelis Topalis12, Zhijian Tu8, Maria F. Unger9, Brian P. Walenz1, Aihui Wang1, Jian Wang1, Mei Wang1, X. Wang9, Kerry J. Woodford1, Jennifer R. Wortman6, Jennifer R. Wortman1, Martin Wu6, Alison Yao1, Evgeny M. Zdobnov, Hongyu Zhang1, Qi Zhao1, Shaying Zhao6, Shiaoping C. Zhu1, Igor F. Zhimulev, Mario Coluzzi14, Alessandra della Torre14, Charles Roth15, Christos Louis12, Francis Kalush1, Richard J. Mural1, Eugene W. Myers1, Mark Raymond Adams1, Hamilton O. 
Smith1, Samuel Broder1, Malcolm J. Gardner6, Claire M. Fraser6, Ewan Birney13, Peer Bork, Paul T. Brey15, J. Craig Venter6, J. Craig Venter1, Jean Weissenbach2, Fotis C. Kafatos, Frank H. Collins9, Stephen L. Hoffman1 
04 Oct 2002-Science
TL;DR: Analysis of the PEST strain of A. gambiae revealed strong evidence for about 14,000 protein-encoding transcripts, and prominent expansions in specific families of proteins likely involved in cell adhesion and immunity were noted.
Abstract: Anopheles gambiae is the principal vector of malaria, a disease that afflicts more than 500 million people and causes more than 1 million deaths each year. Tenfold shotgun sequence coverage was obtained from the PEST strain of A. gambiae and assembled into scaffolds that span 278 million base pairs. A total of 91% of the genome was organized in 303 scaffolds; the largest scaffold was 23.1 million base pairs. There was substantial genetic variation within this strain, and the apparent existence of two haplotypes of approximately equal frequency ("dual haplotypes") in a substantial fraction of the genome likely reflects the outbred nature of the PEST strain. The sequence produced a conservative inference of more than 400,000 single-nucleotide polymorphisms that showed a markedly bimodal density distribution. Analysis of the genome sequence revealed strong evidence for about 14,000 protein-encoding transcripts. Prominent expansions in specific families of proteins likely involved in cell adhesion and immunity were noted. An expressed sequence tag analysis of genes regulated by blood feeding provided insights into the physiological adaptations of a hematophagous insect.

Journal ArticleDOI
13 Jun 2002-Nature
TL;DR: Two related molecules containing a Co ion bonded to polypyridyl ligands, attached to insulating tethers of different lengths, are examined, enabling the fabrication of devices that exhibit either single-electron phenomena, such as Coulomb blockade, or the Kondo effect.
Abstract: Using molecules as electronic components is a powerful new direction in the science and technology of nanometre-scale systems1. Experiments to date have examined a multitude of molecules conducting in parallel2,3, or, in some cases, transport through single molecules. The latter includes molecules probed in a two-terminal geometry using mechanically controlled break junctions4,5 or scanning probes6,7 as well as three-terminal single-molecule transistors made from carbon nanotubes8, C60 molecules9, and conjugated molecules diluted in a less-conducting molecular layer10. The ultimate limit would be a device where electrons hop on to, and off from, a single atom between two contacts. Here we describe transistors incorporating a transition-metal complex designed so that electron transport occurs through well-defined charge states of a single atom. We examine two related molecules containing a Co ion bonded to polypyridyl ligands, attached to insulating tethers of different lengths. Changing the length of the insulating tether alters the coupling of the ion to the electrodes, enabling the fabrication of devices that exhibit either single-electron phenomena, such as Coulomb blockade, or the Kondo effect.

Journal ArticleDOI
31 May 2002-Cell
TL;DR: In this article, it is shown that BM ablation induces SDF-1, which upregulates MMP-9 expression and causes shedding of sKitL and recruitment of c-Kit+ stem/progenitors.

Journal ArticleDOI
TL;DR: By analyzing the biases of chain-referral methods in sufficient detail, respondent-driven sampling can be redesigned so that unbiased indicators with known levels of precision can be derived for hidden populations.
Abstract: Researchers studying hidden populations–including injection drug users, men who have sex with men, and the homeless–find that standard probability sampling methods are either inapplicable or prohibitively costly because their subjects lack a sampling frame, have privacy concerns, and constitute a small part of the general population. Therefore, researchers generally employ non-probability methods, including location sampling methods such as targeted sampling, and chain-referral methods such as snowball and respondent-driven sampling. Though nonprobability methods succeed in accessing the hidden populations, they have been insufficient for statistical inference. This paper extends the respondent-driven sampling method to show that when biases associated with chain-referral methods are analyzed in sufficient detail, a statistical theory of the sampling process can be constructed, based on which the sampling process can be redesigned to permit the derivation of indicators that are not biased and have known levels of precision. The results are based on a study of 190 injection drug users in a small Connecticut city.

Proceedings ArticleDOI
01 Jul 2002
TL;DR: The work presented in this paper leverages the time-tested techniques of photographic practice to develop a new tone reproduction operator and uses and extends the techniques developed by Ansel Adams to deal with digital images.
Abstract: A classic photographic task is the mapping of the potentially high dynamic range of real world luminances to the low dynamic range of the photographic print. This tone reproduction problem is also faced by computer graphics practitioners who map digital images to a low dynamic range print or screen. The work presented in this paper leverages the time-tested techniques of photographic practice to develop a new tone reproduction operator. In particular, we use and extend the techniques developed by Ansel Adams to deal with digital images. The resulting algorithm is simple and produces good results for a wide variety of images.
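The core of the global operator is a two-step mapping: scale scene luminances by a "key" value a relative to their log-average, then compress with L/(1+L). The sketch below implements that simplified global version only (no local dodging-and-burning); the input luminances are invented:

```python
import math

def tonemap(luminances, a=0.18, eps=1e-6):
    """Global photographic tone mapping: scale by the key a relative to the
    log-average luminance, then compress each value into [0, 1) with L/(1+L)."""
    log_avg = math.exp(sum(math.log(eps + L) for L in luminances) / len(luminances))
    scaled = [a * L / log_avg for L in luminances]
    return [L / (1.0 + L) for L in scaled]

# A hypothetical HDR scanline spanning four orders of magnitude
hdr = [0.01, 0.1, 1.0, 10.0, 100.0]
ldr = tonemap(hdr)
print([round(v, 3) for v in ldr])  # all values now fit in [0, 1)
```

The compression is monotone, so relative brightness ordering is preserved while the dynamic range collapses to displayable values.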

Journal ArticleDOI
TL;DR: The degradation in network performance due to unregulated traffic is quantified and it is proved that if the latency of each edge is a linear function of its congestion, then the total latency of the routes chosen by selfish network users is at most 4/3 times the minimum possible total latency.
Abstract: We consider the problem of routing traffic to optimize the performance of a congested network. We are given a network, a rate of traffic between each pair of nodes, and a latency function for each edge specifying the time needed to traverse the edge given its congestion; the objective is to route traffic such that the sum of all travel times---the total latency---is minimized.In many settings, it may be expensive or impossible to regulate network traffic so as to implement an optimal assignment of routes. In the absence of regulation by some central authority, we assume that each network user routes its traffic on the minimum-latency path available to it, given the network congestion caused by the other users. In general such a "selfishly motivated" assignment of traffic to paths will not minimize the total latency; hence, this lack of regulation carries the cost of decreased network performance.In this article, we quantify the degradation in network performance due to unregulated traffic. We prove that if the latency of each edge is a linear function of its congestion, then the total latency of the routes chosen by selfish network users is at most 4/3 times the minimum possible total latency (subject to the condition that all traffic must be routed). We also consider the more general setting in which edge latency functions are assumed only to be continuous and nondecreasing in the edge congestion. Here, the total latency of the routes chosen by unregulated selfish network users may be arbitrarily larger than the minimum possible total latency; however, we prove that it is no more than the total latency incurred by optimally routing twice as much traffic.
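The 4/3 bound is tight, and the worst case is already visible in Pigou's classic two-link example, which can be checked directly:

```python
def pigou_total_latency(x):
    """Total latency in Pigou's two-link network with one unit of traffic:
    a fraction x uses the link with latency l(x) = x, and the remaining
    1 - x uses the link with constant latency 1."""
    return x * x + (1 - x) * 1.0

# Selfish routing: the variable link is never worse than the constant one
# (x <= 1), so all traffic takes it and total latency is 1.
selfish = pigou_total_latency(1.0)

# Optimal routing: minimize x^2 + (1 - x); setting 2x - 1 = 0 gives x = 1/2.
optimal = pigou_total_latency(0.5)

print(selfish / optimal)  # 4/3, matching the worst-case bound for linear latencies
```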

Journal ArticleDOI
TL;DR: A framework is provided that describes 9 relevant dimensions and shows that the literature can productively be classified along these dimensions, with each study situated at the intersection of various dimensions.
Abstract: Despite a century's worth of research, arguments surrounding the question of whether far transfer occurs have made little progress toward resolution. The authors argue the reason for this confusion is a failure to specify various dimensions along which transfer can occur, resulting in comparisons of "apples and oranges." They provide a framework that describes 9 relevant dimensions and show that the literature can productively be classified along these dimensions, with each study situated at the intersection of various dimensions. Estimation of a single effect size for far transfer is misguided in view of this complexity. The past 100 years of research shows that evidence for transfer under some conditions is substantial, but critical conditions for many key questions are untested.

Journal ArticleDOI
TL;DR: Bound phytochemicals could survive stomach and intestinal digestion to reach the colon, and may partly explain the mechanism of grain consumption in the prevention of colon cancer, other digestive cancers, breast cancer, and prostate cancer, which is supported by epidemiological studies.
Abstract: Epidemiological studies have shown that consumption of whole grains and grain-based products is associated with reduced risk of chronic diseases. The health benefits of whole grains are attributed in part to their unique phytochemical composition. However, the phytochemical contents in grains have been commonly underestimated in the literature, because bound phytochemicals were not included. This study was designed to investigate the complete phytochemical profiles in free, soluble conjugated, and insoluble bound forms, as well as their antioxidant activities in uncooked whole grains. Corn had the highest total phenolic content (15.55 ± 0.60 μmol of gallic acid equiv/g of grain) of the grains tested, followed by wheat (7.99 ± 0.39 μmol of gallic acid equiv/g of grain), oats (6.53 ± 0.19 μmol of gallic acid equiv/g of grain), and rice (5.56 ± 0.17 μmol of gallic acid equiv/g of grain). The major portion of phenolics in grains existed in the bound form (85% in corn, 75% in oats and wheat, and 62% in rice), ...


Proceedings ArticleDOI
02 Jul 2002
TL;DR: This paper model data-centric routing and compare its performance with traditional end-to-end routing schemes, and examines the complexity of optimal data aggregation, showing that although it is an NP-hard problem in general, there exist useful polynomial-time special cases.
Abstract: Sensor networks are distributed event-based systems that differ from traditional communication networks in several ways: sensor networks have severe energy constraints, redundant low-rate data, and many-to-one flows. Data-centric mechanisms that perform in-network aggregation of data are needed in this setting for energy-efficient information flow. In this paper we model data-centric routing and compare its performance with traditional end-to-end routing schemes. We examine the impact of source-destination placement and communication network density on the energy costs and delay associated with data aggregation. We show that data-centric routing offers significant performance gains across a wide range of operational scenarios. We also examine the complexity of optimal data aggregation, showing that although it is an NP-hard problem in general, there exist useful polynomial-time special cases.
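The energy argument for in-network aggregation can be made concrete by counting link transmissions: without aggregation every source's packet traverses its full path to the sink, while with aggregation each shared tree edge carries a single combined packet. A sketch on a hypothetical three-source topology:

```python
def transmissions(paths, aggregate):
    """Count link transmissions for several sources reporting to one sink.
    Each path is the list of directed hops from a source to the sink.
    Without aggregation every source's packet crosses its whole path;
    with in-network aggregation each distinct link is used only once."""
    if aggregate:
        return len({hop for path in paths for hop in path})
    return sum(len(path) for path in paths)

# Hypothetical topology: three sources whose routes share a two-hop trunk
paths = [
    [("s1", "r"), ("r", "t"), ("t", "sink")],
    [("s2", "r"), ("r", "t"), ("t", "sink")],
    [("s3", "r"), ("r", "t"), ("t", "sink")],
]
print(transmissions(paths, aggregate=False))  # 9 transmissions end-to-end
print(transmissions(paths, aggregate=True))   # 5: the shared trunk carries one packet
```

The gain grows with network density and with how close together the sources are, consistent with the placement effects the paper examines.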

Journal ArticleDOI
TL;DR: A bioactivity index (BI) for dietary cancer prevention is proposed to provide a new alternative biomarker for future epidemiological studies in dietary cancer Prevention and health promotion.
Abstract: Consumption of fruits and vegetables has been associated with reduced risk of chronic diseases such as cardiovascular disease and cancer. Phytochemicals, especially phenolics, in fruits and vegetables are suggested to be the major bioactive compounds for the health benefits. However, the phenolic contents and their antioxidant activities in fruits and vegetables were underestimated in the literature, because bound phenolics were not included. This study was designed to investigate the profiles of total phenolics, including both soluble free and bound forms in common fruits, by applying solvent extraction, base digestion, and solid-phase extraction methods. Cranberry had the highest total phenolic content, followed by apple, red grape, strawberry, pineapple, banana, peach, lemon, orange, pear, and grapefruit. Total antioxidant activity was measured using the TOSC assay. Cranberry had the highest total antioxidant activity (177.0 +/- 4.3 micromol of vitamin C equiv/g of fruit), followed by apple, red grape, strawberry, peach, lemon, pear, banana, orange, grapefruit, and pineapple. Antiproliferation activities were also studied in vitro using HepG(2) human liver-cancer cells, and cranberry showed the highest inhibitory effect with an EC(50) of 14.5 +/- 0.5 mg/mL, followed by lemon, apple, strawberry, red grape, banana, grapefruit, and peach. A bioactivity index (BI) for dietary cancer prevention is proposed to provide a new alternative biomarker for future epidemiological studies in dietary cancer prevention and health promotion.

Journal ArticleDOI
TL;DR: A total of 2740 experimentally confirmed SSR markers for rice are made available, or approximately one SSR every 157 kb; AT-rich microsatellites had the longest average repeat tracts, while GC-rich motifs were the shortest.
Abstract: A total of 2414 new di-, tri- and tetra-nucleotide non-redundant SSR primer pairs, representing 2240 unique marker loci, have been developed and experimentally validated for rice (Oryza sativa L.). Duplicate primer pairs are reported for 7% (174) of the loci. The majority (92%) of primer pairs were developed in regions flanking perfect repeats > or = 24 bp in length. Using electronic PCR (e-PCR) to align primer pairs against 3284 publicly sequenced rice BAC and PAC clones (representing about 83% of the total rice genome), 65% of the SSR markers hit a BAC or PAC clone containing at least one genetically mapped marker and could be mapped by proxy. Additional information based on genetic mapping and "nearest marker" information provided the basis for locating a total of 1825 (81%) of the newly designed markers along rice chromosomes. Fifty-six SSR markers (2.8%) hit BAC clones on two or more different chromosomes and appeared to be multiple copy. The largest proportion of SSRs in this data set correspond to poly(GA) motifs (36%), followed by poly(AT) (15%) and poly(CCG) (8%) motifs. AT-rich microsatellites had the longest average repeat tracts, while GC-rich motifs were the shortest. In combination with the pool of 500 previously mapped SSR markers, this release makes available a total of 2740 experimentally confirmed SSR markers for rice, or approximately one SSR every 157 kb.

Journal ArticleDOI
TL;DR: A review of different types of antimicrobial polymers developed for food contact, commercial applications, testing methods, regulations and future trends is presented in this article, with a special emphasis on the advantages/disadvantages of each technology.
Abstract: Research and development of antimicrobial materials for food applications such as packaging and other food contact surfaces is expected to grow in the next decade with the advent of new polymer materials and antimicrobials. This article reviews the different types of antimicrobial polymers developed for food contact, commercial applications, testing methods, regulations and future trends. Special emphasis will be on the advantages/disadvantages of each technology.

Journal ArticleDOI
TL;DR: New guidelines for laboratory testing for patients with diabetes mellitus provide specific recommendations that are based on published data or derived from expert consensus, and several analytes have minimal clinical value at present and are not recommended.
Abstract: Background: Multiple laboratory tests are used in the diagnosis and management of patients with diabetes mellitus. The quality of the scientific evidence supporting the use of these assays varies substantially. Approach: An expert committee drafted evidence-based recommendations for the use of laboratory analysis in patients with diabetes. An external panel of experts reviewed a draft of the guidelines, which were modified in response to the reviewers’ suggestions. A revised draft was posted on the Internet and was presented at the AACC Annual Meeting in July 2000. The recommendations were modified again in response to oral and written comments. The guidelines were reviewed by the Professional Practice Committee of the American Diabetes Association. Content: Measurement of plasma glucose remains the sole diagnostic criterion for diabetes. Monitoring of glycemic control is performed by the patients, who measure their own plasma or blood glucose with meters, and by laboratory analysis of glycated hemoglobin. The potential roles of noninvasive glucose monitoring, genetic testing, autoantibodies, microalbumin, proinsulin, C-peptide, and other analytes are addressed. Summary: The guidelines provide specific recommendations based on published data or derived from expert consensus. Several analytes are of minimal clinical value at the present time, and measurement of them is not recommended.

Journal ArticleDOI
01 Sep 2002
TL;DR: This paper introduces the Cougar approach to tasking sensor networks through declarative queries, and proposes a natural architecture for a data management system for sensor networks, and describes open research problems in this area.
Abstract: The widespread distribution and availability of small-scale sensors, actuators, and embedded processors is transforming the physical world into a computing platform. One such example is a sensor network consisting of a large number of sensor nodes that combine physical sensing capabilities such as temperature, light, or seismic sensors with networking and computation capabilities. Applications range from environmental control, warehouse inventory, and health care to military environments. Existing sensor networks assume that the sensors are preprogrammed and send data to a central frontend where the data is aggregated and stored for offline querying and analysis. This approach has two major drawbacks. First, the user cannot change the behavior of the system on the fly. Second, conservation of battery power is a major design factor, but a central system cannot make use of in-network programming, which trades costly communication for cheap local computation. In this paper, we introduce the Cougar approach to tasking sensor networks through declarative queries. Given a user query, a query optimizer generates an efficient query plan for in-network query processing, which can vastly reduce resource usage and thus extend the lifetime of a sensor network. In addition, since queries are asked in a declarative language, the user is shielded from the physical characteristics of the network. We give a short overview of sensor networks, propose a natural architecture for a data management system for sensor networks, and describe open research problems in this area.
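A minimal sketch of the idea behind in-network query processing (an illustration of the general principle, not Cougar's actual query engine): for an aggregate query such as AVG, each sensor node merges its children's partial states and forwards a single constant-size (sum, count) pair, instead of relaying every raw reading hop-by-hop to the front end.

```python
# Toy comparison of in-network aggregation vs. raw forwarding for an AVG
# query over a routing tree. Readings and topology are hypothetical.

readings = {0: 20.0, 1: 22.0, 2: 21.0, 3: 23.0, 4: 24.0}  # node -> temperature
tree = {0: [1, 2], 1: [3, 4]}                              # routing tree, root 0

def in_network_avg(node=0):
    """Return ((sum, count), messages) for the subtree rooted at node."""
    s, c, msgs = readings[node], 1, 0
    for child in tree.get(node, []):
        (cs, cc), m = in_network_avg(child)
        s, c, msgs = s + cs, c + cc, msgs + m + 1  # one partial state per link
    return (s, c), msgs

def raw_forwarding_msgs(node=0, depth=0):
    """Messages if every raw reading is relayed hop-by-hop to the root."""
    return depth + sum(raw_forwarding_msgs(ch, depth + 1)
                       for ch in tree.get(node, []))

(s, c), agg_msgs = in_network_avg()
print(s / c, agg_msgs, raw_forwarding_msgs())  # → 22.0 4 6
```

The saving grows with network depth: raw forwarding costs one message per reading per hop, while the aggregated plan sends exactly one constant-size message over each tree edge.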

Journal ArticleDOI
Rosemary Batt1
TL;DR: The authors examined the relationship between human resource practices, employee quit rates, and organizational performance in the service sector and found that quit rates were lower and sales growth was higher in establishments that emphasized high skills, employee participation in decision making and in teams, and human resource incentives such as high relative pay and employment security.
Abstract: This study examined the relationship between human resource practices, employee quit rates, and organizational performance in the service sector. Drawing on a unique, nationally representative sample of call centers, multivariate analyses showed that quit rates were lower and sales growth was higher in establishments that emphasized high skills, employee participation in decision making and in teams, and human resource incentives such as high relative pay and employment security. Quit rates partially mediated the relationship between human resource practices and sales growth. These relationships were also moderated by the customer segment served.
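The partial-mediation finding can be illustrated with a stylized regression sketch on synthetic data (the numbers below are invented, not the study's call-center sample): if HR practices lower quit rates and quit rates lower sales growth, then controlling for quits shrinks the HR coefficient.

```python
import numpy as np

# Stylized sketch of the partial-mediation logic on synthetic data:
# HR practices reduce quits, and quits reduce sales growth, so the
# "direct" HR effect is smaller than the "total" effect.

rng = np.random.default_rng(0)
n = 1000
hr = rng.normal(size=n)                                  # HR-practices index
quit_rate = -0.5 * hr + rng.normal(scale=0.5, size=n)    # practices cut quits
sales = 0.4 * hr - 0.6 * quit_rate + rng.normal(scale=0.5, size=n)

def ols(y, *xs):
    """Least-squares slope coefficients (intercept dropped)."""
    X = np.column_stack([np.ones(n), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

(total,) = ols(sales, hr)              # total effect of HR on sales growth
direct, _ = ols(sales, hr, quit_rate)  # direct effect, controlling for quits

print(round(total, 2), round(direct, 2))  # total ≈ 0.7, direct ≈ 0.4
```

By construction the total effect is 0.4 + (−0.5)(−0.6) = 0.7, of which only 0.4 survives once the mediator is held fixed.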

Journal ArticleDOI
TL;DR: This article examined the impact of measuring accruals as the change in successive balance sheet accounts, as opposed to measuring accruals directly from the statement of cash flows. Its primary finding is that studies using a balance sheet approach to test for earnings management are potentially contaminated by measurement error in accrual estimates.
Abstract: This paper examines the impact of measuring accruals as the change in successive balance sheet accounts, as opposed to measuring accruals directly from the statement of cash flows. Our primary finding is that studies using a balance sheet approach to test for earnings management are potentially contaminated by measurement error in accruals estimates. In particular, if the partitioning variable used to indicate the presence of earnings management is correlated with the occurrence of mergers and acquisitions or discontinued operations, tests are biased and researchers are likely to erroneously conclude that earnings management exists when there is none. Additional results show that the errors in balance sheet accruals estimation can confound returns regressions where discretionary and non-discretionary accruals are used as explanatory variables. Moreover, we demonstrate that tests of market mispricing of accruals will be understated due to erroneous classification of “extreme” accruals firms.
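The measurement-error argument can be sketched with stylized numbers (hypothetical figures, not the paper's data): an acquisition adds working-capital accounts to the balance sheet without flowing through operations, so the balance sheet estimate diverges from accruals taken directly from the statement of cash flows.

```python
# Stylized example: a merger adds 30 of receivables and 15 of payables to
# the year-end balance sheet. The balance sheet approach counts that net 15
# as accruals; the cash flow statement approach does not.

def accruals_balance_sheet(bs_start, bs_end, depreciation):
    """Change in non-cash working capital, minus depreciation."""
    d = lambda k: bs_end[k] - bs_start[k]
    return ((d("current_assets") - d("cash"))
            - (d("current_liabilities") - d("short_term_debt"))
            - depreciation)

def accruals_cash_flow(earnings, cash_from_operations):
    """Accruals measured directly from the statement of cash flows."""
    return earnings - cash_from_operations

bs_start = {"current_assets": 100.0, "cash": 20.0,
            "current_liabilities": 50.0, "short_term_debt": 10.0}
bs_end = {"current_assets": 130.0, "cash": 20.0,     # includes acquired 30 / 15
          "current_liabilities": 70.0, "short_term_debt": 10.0}

bs_est = accruals_balance_sheet(bs_start, bs_end, depreciation=5.0)
cf_est = accruals_cash_flow(earnings=12.0, cash_from_operations=22.0)
print(bs_est, cf_est, bs_est - cf_est)  # → 5.0 -10.0 15.0
```

The 15.0 gap is exactly the acquired net working capital (30 − 15), which the balance sheet approach misclassifies as operating accruals.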

Journal ArticleDOI
TL;DR: Agent-based models (ABMs) as mentioned in this paper have been widely used in computational sociology to model social life as interactions among adaptive agents who influence one another in response to the influence they receive, such as diffusion of information, emergence of norms, coordination of conventions or participation in collective action.
Abstract: ■ Abstract Sociologists often model social processes as interactions among variables. We review an alternative approach that models social life as interactions among adaptive agents who influence one another in response to the influence they receive. These agent-based models (ABMs) show how simple and predictable local interactions can generate familiar but enigmatic global patterns, such as the diffusion of information, emergence of norms, coordination of conventions, or participation in collective action. Emergent social patterns can also appear unexpectedly and then just as dramatically transform or disappear, as happens in revolutions, market crashes, fads, and feeding frenzies. ABMs provide theoretical leverage where the global patterns of interest are more than the aggregation of individual attributes, but at the same time, the emergent pattern cannot be understood without a bottom up dynamical model of the microfoundations at the relational level. We begin with a brief historical sketch of the shift from “factors” to “actors” in computational sociology that shows how agent-based modeling differs fundamentally from earlier sociological uses of computer simulation. We then review recent contributions focused on the emergence of social structure and social order out of local interaction. Although sociology has lagged behind other social sciences in appreciating this new methodology, a distinctive sociological contribution is evident in the papers we review. First, theoretical interest focuses on dynamic social networks that shape and are shaped by agent interaction. Second, ABMs are used to perform virtual experiments that test macrosociological theories by manipulating structural factors like network topology, social stratification, or spatial mobility. We conclude our review with a series of recommendations for realizing the rich sociological potential of this approach.