
Showing papers by "Carleton University published in 2000"


Journal Article
D. E. Groom1, M. Aguilar-Benitez, Claude Amsler2, R. M. Barnett1, Patricia R. Burchat3, C. D. Carone4, C. Caso5, G. Conforto6, O. I. Dahl1, Michael Doser7, Semen Eidelman8, Jonathan L. Feng, L. K. Gibbons9, Maury Goodman10, Christoph Grab11, Atul Gurtu12, K. Hagiwara, K. G. Hayes13, J. J. Hernandez14, Ken Ichi Hikasa15, K. Honscheid16, Christopher Kolda1, Michelangelo L. Mangano7, Aneesh V. Manohar17, A. Masoni, Klaus Mönig, Hitoshi Murayama1, Hitoshi Murayama18, Koji Nakamura, S. Sánchez Navas19, Keith A. Olive20, Luc Pape7, A. Piepke21, Matts Roos22, Masaharu Tanabashi15, Nils A. Tornqvist22, T. G. Trippe1, Petr Vogel23, C. G. Wohl1, Ron L. Workman24, W-M. Yao1, B. Armstrong1, J. L. Casas Serradilla7, B. B. Filimonov, P. S. Gee1, S. B. Lugovsky, F. Nicholson7, K. S. Babu, D. Z. Besson25, Otmar Biebel26, P. Bloch7, Robert N. Cahn1, Ariella Cattai7, R. S. Chivukula27, R. Cousins28, Thibault Damour29, K. Desler, R. J. Donahue1, D. A. Edwards, Jens Erler30, V. V. Ezhela, A. Fassò3, W. Fetscher11, Daniel Froidevaux7, Masataka Fukugita31, Thomas K. Gaisser32, L. A. Garren33, S. Geer33, H J Gerber11, Frederick J. Gilman34, Howard E. Haber35, C. A. Hagmann36, Ian Hinchliffe1, Craig J. Hogan37, G. Höhler38, P. Igo-Kemenes39, John David Jackson1, Kurtis F Johnson40, D. Karlen41, Boris Kayser42, S. R. Klein1, Konrad Kleinknecht43, I.G. Knowles44, Edward W. Kolb45, Edward W. Kolb33, P. Kreitz3, R. Landua7, Paul Langacker30, L. S. Littenberg46, David Manley47, John March-Russell, T. Nakada48, Helen R. Quinn3, Georg G. Raffelt49, B. Renk43, L. Rolandi7, Michael T Ronan1, L.J. Rosenberg50, H. F.W. Sadrozinski35, A. I. Sanda51, Michael Schmitt52 
TL;DR: In this article, a biennial review summarizes much of particle physics using data from previous editions plus 2778 new measurements from 645 papers, including measurements of gauge bosons, leptons, quarks, mesons, and baryons.
Abstract: This biennial Review summarizes much of particle physics. Using data from previous editions, plus 2778 new measurements from 645 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We also summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. Among the 108 reviews are many that are new or heavily revised, including those on the CKM quark-mixing matrix, V-ud & V-us, V-cb & V-ub, top quark, muon anomalous magnetic moment, extra dimensions, particle detectors, cosmic background radiation, dark matter, cosmological parameters, and big bang cosmology.

1,520 citations


Book
01 Jan 2000
TL;DR: The Davenport-Schinzel sequences and their geometric applications, as well as randomized algorithms in computational geometry, are described.
Abstract: Preface. List of contributors. 1. Davenport-Schinzel sequences and their geometric applications (P.K. Agarwal and M. Sharir). 2. Arrangements and their applications (P.K. Agarwal and M. Sharir). 3. Discrete geometric shapes: Matching, interpolation, and approximation (H. Alt and L.J. Guibas). 4. Deterministic parallel computational geometry (M.J. Atallah and D.Z. Chen). 5. Voronoi diagrams (F. Aurenhammer and R. Klein). 6. Mesh generation (M. Bern and P. Plassmann). 7. Applications of computational geometry to geographic information systems (L. de Floriani, P. Magillo and E. Puppo). 8. Making geometry visible: An introduction to the animation of geometric algorithms (A. Hausner and D.P. Dobkin). 9. Spanning trees and spanners (D. Eppstein). 10. Geometric data structures (M.T. Goodrich and K. Ramaiyer). 11. Polygon decomposition (J.M. Keil). 12. Link distance problems (A. Maheshwari, J.-R. Sack and H. N. Djidjev). 13. Derandomization in computational geometry (J. Matousek). 14. Robustness and precision issues in geometric computation (S. Schirra). 15. Geometric shortest paths and network optimization (J.S.B. Mitchell). 16. Randomized algorithms in computational geometry (K. Mulmuley).

688 citations


Journal ArticleDOI
TL;DR: This paper examined the effects of storybook reading on the acquisition of vocabulary of 36 preschool children who had poor expressive vocabulary skills, averaging 13 months behind chronological age, and found that children with limited vocabularies learned new vocabulary from shared book-reading episodes.

651 citations


Journal ArticleDOI
TL;DR: For points in three dimensions it is shown that the problem of deciding whether a complete range assignment of a given cost exists is NP-hard, and an O(n^2)-time approximation algorithm is given which provides a complete range assignment with cost within a factor of two of the minimum.
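A standard way to obtain a factor-two guarantee for range assignment (whether it matches this paper's O(n^2) algorithm exactly is an assumption) is to build a minimum spanning tree on the points and give each point a range equal to its longest incident MST edge: every MST edge is then bridged in both directions, at a cost of at most twice that of charging each edge once. A minimal Python sketch:

```python
import numpy as np

def mst_edges(dist):
    """Prim's algorithm on a complete graph given a distance matrix."""
    n = dist.shape[0]
    visited = np.zeros(n, dtype=bool)
    visited[0] = True
    key = dist[0].copy()            # cheapest known edge into the tree
    parent = np.zeros(n, dtype=int)
    edges = []
    for _ in range(n - 1):
        v = int(np.argmin(np.where(visited, np.inf, key)))
        visited[v] = True
        edges.append((parent[v], v, dist[parent[v], v]))
        closer = ~visited & (dist[v] < key)
        key[closer] = dist[v][closer]
        parent[closer] = v
    return edges

def range_assignment(points, beta=2.0):
    """Assign each point its longest incident MST edge as its range.
    The cost model sum(r_i**beta) is an assumption for illustration."""
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    ranges = np.zeros(len(points))
    for u, v, w in mst_edges(dist):
        ranges[u] = max(ranges[u], w)
        ranges[v] = max(ranges[v], w)
    return ranges, float(np.sum(ranges ** beta))

# Three collinear points one unit apart: every point gets range 1.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
r, cost = range_assignment(pts)
print(r, cost)   # -> [1. 1. 1.] 3.0
```

The assignment is "complete" in the sense that the MST edges guarantee a connected communication graph.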

468 citations


Journal ArticleDOI
Nick Laskin1
TL;DR: In this paper, a new fractional Langevin-type stochastic differential equation is introduced, which is derived from the standard Langevin equation by replacing the first-order derivative with respect to time by the fractional derivative of order α, and by replacing the "white noise" Gaussian force by the generalized "shot noise", each pulse of which has a random amplitude with the α-stable Lévy distribution.
Abstract: A new extension of a fractality concept in financial mathematics has been developed. We have introduced a new fractional Langevin-type stochastic differential equation that differs from the standard Langevin equation: (i) by replacing the first-order derivative with respect to time by the fractional derivative of order α; and (ii) by replacing "white noise" Gaussian stochastic force by the generalized "shot noise", each pulse of which has a random amplitude with the α-stable Lévy distribution. As an application of the developed fractional non-Gaussian dynamical approach, the expression for the probability distribution function (pdf) of the returns has been established. It is shown that the obtained fractional pdf fits well the central part and the tails of the empirical distribution of S&P 500 returns.
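The two modifications (i) and (ii) can be written schematically as follows (a reconstruction: the Greek letters were lost in extraction, and using α for both the derivative order and the Lévy stable index is an assumption):

```latex
% standard Langevin form: first-order in time, Gaussian white noise F_G
\frac{dx(t)}{dt} = F_G(t)

% fractional generalization: derivative of order \alpha, and shot noise
% whose pulse amplitudes a_k follow an \alpha-stable L\'evy law
\frac{d^{\alpha}x(t)}{dt^{\alpha}} = F(t), \qquad
F(t) = \sum_{k} a_k\,\delta(t - t_k)
```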

394 citations


Journal ArticleDOI
TL;DR: The effects of climate variability and change on Canadian agriculture have become an important research field since the early 1980s, as discussed by the authors; the paper focuses on agricultural adaptation, a purposeful proactive or reactive response to changes associated with climate that is influenced by many factors.
Abstract: The effects of climatic variability and change on Canadian agriculture have become an important research field since the early 1980s. In this paper, we seek to synthesize this research, focusing on agricultural adaptation, a purposeful proactive or reactive response to changes associated with climate, and influenced by many factors. A distinctive feature of methods used in research on adaptation in Canadian agriculture is the focus on the important role of human agency. Many individual farmers perceive they are well adapted to climate, because of their extensive 'technological' tool-kit, giving them confidence in dealing with climatic change. In many regions, little concern is expressed over climatic change, except where there are particular types of climatic vulnerability. Farmers respond to biophysical factors, including climate, as they interact with a complex of human factors. Several of these, notably institutional and political ones, have tended to diminish the farm-level risks stemming from climatic variability and change, but may well increase the long term vulnerability of Canadian agriculture. Notwithstanding the technological and management adaptation measures available to producers, Canadian agriculture remains vulnerable to climatic variability and to climate change.

383 citations


Journal ArticleDOI
01 Sep 2000-Ecology
TL;DR: The results suggest that one must be cautious in applying the results of metapopulation analyses to species for which the habitat vs. nonhabitat categorization of the landscape is not appropriate; Rana pipiens, the northern leopard frog, is a good example.
Abstract: For many species, not all required resources are contained in breeding habitat. Such species depend on landscape complementation, i.e., linking together different landscape elements through movement, to complete their life cycles. We suggest that the dichotomous habitat classification of many metapopulation analyses (habitat vs. nonhabitat) masks our ability to detect metapopulation effects for such species. We tested this using a species for which landscape complementation is obligate and metapopulation structure is likely: Rana pipiens, the northern leopard frog. We used breeding chorus survey data to index relative abundance of leopard frogs in 34 "core" ponds and conducted Poisson regression analysis to determine the effects on frog density of local pond habitat, availability of summer habitat (landscape complementation), and number of occupied ponds in the surrounding landscapes (metapopulation structure). All of these factors had statistically significant effects on frog density. However, when summer habitat was not included in the statistical model, the metapopulation structure was no longer significant; i.e., its effect was masked. Our results suggest that one must be cautious in applying the results of metapopulation analyses to species for which the habitat vs. nonhabitat categorization of the landscape is not appropriate. The potential for rescue and recolonization to maintain a regional population must be assessed within the constraints of the entire landscape.
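The Poisson regression at the heart of this analysis can be sketched as follows; the predictors and coefficients are invented for illustration, and the Newton/IRLS fitter is a generic sketch, not the authors' software:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented stand-ins for the paper's predictors: local pond habitat,
# summer habitat (landscape complementation), and occupied ponds nearby.
n = 500
X = np.column_stack([
    np.ones(n),
    rng.normal(size=n),   # pond habitat
    rng.normal(size=n),   # summer habitat
    rng.normal(size=n),   # occupied neighbouring ponds
])
true_beta = np.array([1.0, 0.3, 0.5, 0.2])
y = rng.poisson(np.exp(X @ true_beta))   # counts, e.g. a chorus index

def poisson_irls(X, y, iters=50):
    """Fit log E[y] = X @ beta by Newton's method (IRLS)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)    # sensible starting intercept
    for _ in range(iters):
        mu = np.exp(X @ beta)
        # Poisson variance equals the mean, so the IRLS weights are mu.
        beta = beta + np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))
    return beta

beta_hat = poisson_irls(X, y)
print(np.round(beta_hat, 2))
```

Dropping one column (e.g. the summer-habitat predictor) and refitting is how one would probe the kind of masking effect the authors describe.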

368 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compared the response of landscape connectivity measures to habitat fragmentation and found that the connectivity measures were weakly correlated to each other and are therefore generally not comparable.
Abstract: The methods for measuring landscape connectivity have never been compared or tested for their responses to habitat fragmentation. We simulated movement, mortality and boundary reactions across a wide range of landscape structures to analyze the response of landscape connectivity measures to habitat fragmentation. Landscape connectivity was measured as either dispersal success or search time, based on immigration into all habitat patches in the landscape. Both measures indicated higher connectivity in more fragmented landscapes, a potential for problematic conclusions for conservation plans. We introduce cell immigration as a new measure for landscape connectivity. Cell immigration is the rate of immigration into equal-sized habitat cells in the landscape. It includes both within- and between-patch movement, and shows a negative response to habitat fragmentation. This complies with intuition and existing theoretical work. This method for measuring connectivity is highly robust to reductions in sample size (i.e., number of habitat cells included in the estimate), and we hypothesize that it therefore should be amenable to use in empirical studies. The connectivity measures were weakly correlated to each other and are therefore generally not comparable. We also tested immigration into a single patch as an index of connectivity by comparing it to cell immigration over the landscape. This is essentially a comparison between patch-scale and landscape-scale measurement, and revealed some potential for patch immigration to predict connectivity at the landscape scale. However, this relationship depends on the size of the single patch, the dispersal characteristics of the species, and the amount of habitat in the landscape. We conclude that the response of connectivity measures to habitat fragmentation should be understood before deriving conclusions for conservation management.

362 citations



Journal ArticleDOI
01 Sep 2000-Methods
TL;DR: This review highlights many recent developments in SPR-based immunoassay, functionalizations of the gold surface, novel receptors in molecular recognition, and advanced techniques for sensitivity enhancement, and describes the challenge of current problems and some insights toward the future technologies.

325 citations


Journal ArticleDOI
TL;DR: This work proposes a new approach in which some or all of the coefficients of the LP are specified as intervals, and finds the best optimum and the worst optimum for the model, and the point settings of the interval coefficients that yield these two extremes.
Abstract: In order to solve a linear programme, the model coefficients must be fixed at specific values, which implies that the coefficients are perfectly accurate. In practice, however, the coefficients are generally estimates. The only way to deal with uncertain coefficients is to test the sensitivity of the model to changes in their values, either singly or in very small groups. We propose a new approach in which some or all of the coefficients of the LP are specified as intervals. We then find the best optimum and the worst optimum for the model, and the point settings of the interval coefficients that yield these two extremes. This provides the range of the optimised objective function, and the coefficient settings give some insight into the likelihood of these extremes.
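For the special case where only the objective coefficients are intervals (constraint-coefficient intervals, which the paper also handles, are harder), the best and worst optima of a minimization with x ≥ 0 are attained at the interval endpoints. A sketch using `scipy.optimize.linprog` on a made-up two-variable LP:

```python
import numpy as np
from scipy.optimize import linprog

# Tiny illustrative interval LP (numbers are invented):
#   minimize  c1*x1 + c2*x2,  with c1 in [1, 2] and c2 in [3, 4]
#   subject to x1 + x2 >= 10,  x >= 0  (linprog's default bounds)
c_lo = np.array([1.0, 3.0])
c_hi = np.array([2.0, 4.0])
A_ub = np.array([[-1.0, -1.0]])   # -(x1 + x2) <= -10
b_ub = np.array([-10.0])

# With intervals only in the objective and x >= 0, the best optimum of a
# minimization uses the lower endpoints and the worst uses the upper ones.
best = linprog(c_lo, A_ub=A_ub, b_ub=b_ub).fun
worst = linprog(c_hi, A_ub=A_ub, b_ub=b_ub).fun
print(best, worst)   # -> 10.0 20.0
```

The gap between `best` and `worst` is exactly the range of the optimized objective function that the authors propose reporting.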

Journal Article
TL;DR: In this article, the authors examined the affective correlates of procrastination through experience-sampling and found that participants' appraisals of their tasks when paged revealed that they procrastinated on unpleasant, stressful and difficult tasks, while engaging in activities that were significantly more pleasant.
Abstract: Affective correlates of procrastination were examined through experience-sampling. Forty-five undergraduate students carried electronic pagers for five days preceding an academic deadline. Students were paged eight times daily. At each signal, the participants indicated what they were doing, extent of procrastination and affective state. Contrary to previous research, procrastination was not found to be correlated with either positive or negative affect. Participants' appraisals of their tasks when paged revealed that they procrastinated on unpleasant, stressful and difficult tasks, while engaging in activities that were significantly more pleasant. Specious rewards, self-regulation and the apparent short-term benefits of procrastination are discussed in relation to these findings and as a basis for counseling intervention.

Journal ArticleDOI
TL;DR: An overview of pricing concepts for broadband multiservice networks is provided, reviewing the notions of flat pricing, priority pricing, Paris-Metro pricing, smart-market pricing, responsive pricing, expected capacity pricing, edge pricing, and effective bandwidth pricing.
Abstract: In this article we provide an overview of pricing concepts for broadband multiservice networks. We review the notions of flat pricing, priority pricing, Paris-Metro pricing, smart-market pricing, responsive pricing, expected capacity pricing, edge pricing, and effective bandwidth pricing. We use numerous evaluation criteria, including network, economic, and social efficiency, as well as their suitability in using pricing as a means for congestion control. Some of the schemes are based on best-effort networks, and are thus unable to provide the user with quality of service (QoS) guarantees. Others build on networks with connection admission control functions and are thus able to provide individual QoS guarantees. We particularly investigate the relevant time frame over which pricing schemes are assumed to operate. The majority of the schemes work on short time frames (on the order of minutes), which makes them applicable to use pricing as an additional means for controlling congestion. We also consider technical aspects such as compliance with existing networking technologies or computational overheads associated with billing and accounting.

Journal ArticleDOI
TL;DR: A scalability metric based on cost-effectiveness, where the effectiveness is a function of the system's throughput and its quality of service is presented, which gives insight into the scaling capacity of the designs, and into how to improve the design.
Abstract: Many distributed systems must be scalable, meaning that they must be economically deployable in a wide range of sizes and configurations. This paper presents a scalability metric based on cost-effectiveness, where the effectiveness is a function of the system's throughput and its quality of service. It is part of a framework which also includes a scaling strategy for introducing changes as a function of a scale factor, and an automated virtual design optimization at each scale factor. This is an adaptation of concepts for scalability measures in parallel computing. Scalability is measured by the range of scale factors that give a satisfactory value of the metric, and good scalability is a joint property of the initial design and the scaling strategy. The results give insight into the scaling capacity of the designs, and into how to improve the design. A rapid simple bound on the metric is also described. The metric is demonstrated in this work by applying it to some well-known idealized systems, and to real prototypes of communications software.
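The metric can be sketched as a productivity ratio: effectiveness (throughput times a value function of the QoS) divided by cost, compared across scale factors. The numbers below are invented, and treating this exact form as the paper's definition is an assumption:

```python
def productivity(throughput, qos_value, cost):
    """Cost-effectiveness at one scale factor:
    throughput * value-per-response / cost (illustrative form)."""
    return throughput * qos_value / cost

def scalability(f1, f2):
    """Scalability from scale factor k1 to k2 as the productivity ratio."""
    return f2 / f1

# Invented numbers: doubling the deployment raises throughput 1.8x,
# slightly degrades the QoS value, and raises cost 1.9x.
F1 = productivity(throughput=100.0, qos_value=1.0, cost=10.0)
F2 = productivity(throughput=180.0, qos_value=0.95, cost=19.0)
psi = scalability(F1, F2)

# A design is judged scalable over the range of scale factors where
# psi stays above some acceptance threshold (e.g. 0.8).
print(round(psi, 3))  # -> 0.9
```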

Proceedings ArticleDOI
01 Jun 2000
TL;DR: Traditional techniques, namely ordinary least-squares regression and analysis of variance, outperformed analogy-based estimation and regression trees, and no significant difference was found in accuracy between estimates derived from company-specific data and estimates derived from multi-organizational data.
Abstract: Delivering a software product on time, within budget, and to an agreed level of quality is a critical concern for many software organizations. Underestimating software costs can have detrimental effects on the quality of the delivered software and thus on a company's business reputation and competitiveness. On the other hand, overestimation of software cost can result in missed opportunities to fund other projects. In response to industry demand, a myriad of estimation techniques has been proposed during the last three decades. In order to assess the suitability of a technique from a diverse selection, its performance and relative merits must be compared. The current study replicates a comprehensive comparison of common estimation techniques within different organizational contexts, using data from the European Space Agency. Our study is motivated by the challenge to assess the feasibility of using multi-organization data to build cost models and the benefits gained from company-specific data collection. Using the European Space Agency data set, we investigated a yet unexplored application domain, including military and space projects. The results showed that traditional techniques, namely ordinary least-squares regression and analysis of variance, outperformed analogy-based estimation and regression trees. Consistent with the results of the replicated study, no significant difference was found in accuracy between estimates derived from company-specific data and estimates derived from multi-organizational data.
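The comparison of a traditional regression against analogy-based estimation can be sketched on synthetic data; the log-linear effort model, the MMRE accuracy measure, and the nearest-neighbour analogy rule below are generic stand-ins, not the study's actual procedure or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic projects (illustrative only): log-effort linear in log-size.
n_train, n_test = 90, 30
log_size = rng.uniform(3.0, 8.0, n_train + n_test)
log_effort = 0.5 + 1.2 * log_size + rng.normal(0.0, 0.3, n_train + n_test)
tr, te = slice(0, n_train), slice(n_train, None)

# Ordinary least squares in log space (a "traditional" technique).
A = np.column_stack([np.ones(n_train), log_size[tr]])
coef, *_ = np.linalg.lstsq(A, log_effort[tr], rcond=None)
pred_ols = coef[0] + coef[1] * log_size[te]

# Analogy-based estimation: reuse the effort of the most similar
# (here: closest-sized) past project.
nearest = np.abs(log_size[te][:, None] - log_size[tr][None, :]).argmin(axis=1)
pred_analogy = log_effort[tr][nearest]

def mmre(actual_log, pred_log):
    """Mean magnitude of relative error on the raw effort scale."""
    a, p = np.exp(actual_log), np.exp(pred_log)
    return float(np.mean(np.abs(a - p) / a))

mmre_ols = mmre(log_effort[te], pred_ols)
mmre_analogy = mmre(log_effort[te], pred_analogy)
print(mmre_ols, mmre_analogy)
```

On data that really is log-linear, the regression typically attains the lower MMRE, echoing the study's finding that OLS and ANOVA outperformed analogy.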

Journal ArticleDOI
TL;DR: In this article, the geology of the western Grenville Province is presented in terms of three tectonic elements: (1)..., (2), and (3).
Abstract: Revised cross sections of the western Grenville Province incorporate new geologic results and reprocessed seismic reflection data. The geology is presented in terms of three tectonic elements: (1) ...

Journal ArticleDOI
TL;DR: In this paper, the differential response of women to part-time work as opposed to a career may be a function of motivational and work-context differences between career and non-career women.
Abstract: Results of this study suggest that the differential response of women to part-time work as opposed to a career may be a function of motivational and work-context differences between career and non-career women. Part-time work was associated with lower work-to-family interference, better time management ability, and greater life satisfaction for women in both career and earner-type positions. Role overload, family-to-work interference, and family time management, however, were dependent on job type with beneficial effects for earners but not for career women. Job type also played a role: Career women reported higher life satisfaction and lower depressed mood than did women in earner positions. © 2000 John Wiley & Sons, Inc.

Journal ArticleDOI
TL;DR: The levels of OH-PCBs in Inuit are higher than those previously reported in the literature for other populations and the possible role in mediating PCB-induced adverse effects needs to be investigated further.
Abstract: In this study, we identified the main hydroxylated polychlorinated biphenyls (OH-PCBs) and other chlorinated phenolic compounds and we determined their relative concentrations in whole blood from 13 male and 17 female Inuit from northern Quebec, Canada, and from a pooled whole blood sample from southern Quebec. We also determined concentrations of polychlorinated biphenyls (PCBs). Total OH-PCB concentrations were variable among the Inuit samples, ranging over 2 orders of magnitude (0.117-11.6 ng/g whole blood wet weight). These concentrations were equal to and up to 70 times those found for the southern Quebec pooled whole blood sample. Geometric mean concentrations of total OH-PCBs were 1.73 and 1.01 ng/g whole blood for Inuit men and women, respectively, and 0.161 ng/g whole blood for the southern population pool. There are limited data available for comparison, but the levels of OH-PCBs in Inuit are higher than those previously reported in the literature for other populations. There was a significant correlation (p < 0.005) between OH-PCBs and PCBs (r = 0.84) and both correlated significantly (p < 0.005) with age (r = 0.68 and 0.78, respectively). The ratio of OH-PCBs to PCBs was lower in Inuit (0.11) than in the southern Quebec pool (0.33). There is no apparent explanation for the difference. There was considerable variability in the congener pattern of the identified OH-PCBs. The main metabolite, 4-OH-CB109 (4-OH-2,3,3',4', 5-pentachlorobiphenyl), constituted 12-62% of the total OH-PCBs in the samples. Pentachlorophenol (PCP) was the dominant phenolic compound in blood, constituting 46% (geometric mean) of the total quantitated chlorinated phenolic compounds. PCP concentrations in Inuit blood ranged from 0.558 to 7.77 ng/g on a wet weight basis. All but two Inuit samples had lower concentrations than the southern Quebec pool (6.29 ng/g). The possible role of OH-PCBs in mediating PCB-induced adverse effects needs to be investigated further.

Journal ArticleDOI
TL;DR: In this article, the authors explore notions of task aversiveness across stages of personal projects and find that boredom, frustration and resentment emerge as PPA dimensions associated with task-aversiveness at each stage of project development.

Journal ArticleDOI
TL;DR: In this article, it was shown that over 60% of the differences in the economic performance can in fact be explained by uneven initial conditions, such as the level of development and pre-transition disproportions in industrial structure and trade patterns.
Abstract: The conventional explanation for the dynamics of output during transition is associated with “good” and “bad” economic policies, in particular with the progress achieved in the liberalization, as measured by the liberalization index, and with the success or failure in macroeconomic stabilization, as measured by the rates of inflation. This paper seeks to provide alternative explanation to the differing performance during transition: the supply-side recession, which in turn is caused by reallocation of resources needed to overcome disproportions inherited from the era of central planning. It is shown that over 60% of the differences in the economic performance can in fact be explained by uneven initial conditions, such as the level of development and pre-transition disproportions in industrial structure and trade patterns.

Journal ArticleDOI
TL;DR: It is suggested that immune activation may come to influence complex behavioral processes, as well as affective state, in view of the central amine variations induced by the cytokines.

Journal ArticleDOI
TL;DR: The authors focus on traditional inspections and estimate, based on actual inspection data, the degree of accuracy of relevant state-of-the-art capture-recapture models for which statistical estimators exist, and recommend using a model taking into account that defects have different probabilities of being detected and the corresponding Jackknife Estimator.
Abstract: An important requirement to control the inspection of software artifacts is to be able to decide, based on more objective information, whether the inspection can stop or whether it should continue to achieve a suitable level of artifact quality. A prediction of the number of remaining defects in an inspected artifact can be used for decision making. Several studies in software engineering have considered capture-recapture models to make a prediction. However, few studies compare the actual number of remaining defects to the one predicted by a capture-recapture model on real software engineering artifacts. The authors focus on traditional inspections and estimate, based on actual inspection data, the degree of accuracy of relevant state-of-the-art capture-recapture models for which statistical estimators exist. In order to assess their robustness, we look at the impact of the number of inspectors and the number of actual defects on the estimators' accuracy based on actual inspection data. Our results show that models are strongly affected by the number of inspectors, and therefore one must consider this factor before using capture-recapture models. When the number of inspectors is too small, no model is sufficiently accurate and underestimation may be substantial. In addition, some models perform better than others in a large number of conditions and plausible reasons are discussed. Based on our analyses, we recommend using a model taking into account that defects have different probabilities of being detected and the corresponding Jackknife Estimator. Furthermore, we calibrate the prediction models based on their relative error, as previously computed on other inspections. We identified theoretical limitations to this approach which were then confirmed by the data.
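The recommended estimator family can be illustrated with the first-order jackknife (Burnham-Overton); this specific order and formula are an assumption, since the paper evaluates several capture-recapture variants. With k inspectors, D distinct defects found, and f1 defects found by exactly one inspector, the estimated total is D + ((k-1)/k)·f1:

```python
def jackknife_total_defects(detections):
    """First-order jackknife estimate of the total number of defects.
    `detections` maps a defect id to the set of inspectors who found it."""
    inspectors = {i for found_by in detections.values() for i in found_by}
    k = len(inspectors)
    D = len(detections)   # distinct defects actually observed
    f1 = sum(1 for found_by in detections.values() if len(found_by) == 1)
    return D + (k - 1) / k * f1

# Toy inspection (hypothetical data): 3 inspectors, 5 distinct defects,
# two of which were spotted by only a single inspector.
found = {
    "d1": {"A", "B"},
    "d2": {"A"},
    "d3": {"B", "C"},
    "d4": {"C"},
    "d5": {"A", "B", "C"},
}
estimate = jackknife_total_defects(found)
remaining = estimate - len(found)   # predicted defects still undetected
print(round(estimate, 2), round(remaining, 2))   # -> 6.33 1.33
```

The `remaining` value is the quantity the inspection-stopping decision would be based on.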

Journal ArticleDOI
TL;DR: In this article, the phase field microelasticity theory is used to formulate a three-dimensional phase field model of a multivariant martensitic transformation under external load.

Journal ArticleDOI
TL;DR: Following the last generalized convulsive seizure triggered from the ipsilateral kindled amygdala in rats, the same brain region sample was used to assay for changes of all mRNA components as discussed by the authors.

Journal ArticleDOI
TL;DR: A novel fuzzy logic approach is adopted to develop a computer-based intelligent interpretation of transformer faults using Visual Basic and C++ programming; this highly reliable tool has been utilized in detection and verification of 20 transformer faults.
Abstract: Dissolved gas in oil analysis (DGA) is a well-established in-service technique for incipient fault detection in oil-insulated power transformers, and a great deal of experience and data is now available within the utilities. Traditionally, diagnostic interpretation has been done solely by human experts using past knowledge and standard techniques such as the ratio method. In this paper, a novel fuzzy logic approach is adopted to develop a computer-based intelligent interpretation of transformer faults using Visual Basic and C++ programming. The proposed fuzzy-logic-based software has been tested and tuned using over 800 DGA case histories. This highly reliable tool has then been utilized in detection and verification of 20 transformer faults. The proposed diagnostic tool is very useful to both expert and novice engineers in DGA result interpretation.

Journal ArticleDOI
TL;DR: It is suggested that woody borders play a role in maintaining biodiversity in agro-ecosystems, and that this role extends beyond the borders themselves, into the crop fields.

Journal ArticleDOI
TL;DR: The effect of baclofen pretreatment on cocaine self-administration is dependent on the unit injection dose of cocaine and on the response requirements of the schedule.
Abstract: Rationale: Recent reports have indicated that the γ-aminobutyric acid (GABA) B agonist baclofen attenuates the reinforcing effects of cocaine. Objectives: To further evaluate the effect of baclofen on cocaine self-administration under a fixed ratio (FR) and progressive ratio (PR) schedule of reinforcement. Methods: In the first series of experiments, three dose-response curves were generated that examined the effect of three doses of baclofen (1.8, 3.2, or 5.6 mg/kg, i.p.) against four unit-injection doses of cocaine (0.19, 0.38, 0.75, and 1.5 mg/kg per injection) reinforced under a FR1 schedule. For comparison, an additional group of rats was pretreated with haloperidol (32, 56, or 100 µg/kg, i.p.). A separate experiment examined the effect of baclofen (1.8, 3.2, or 5.6 mg/kg, i.p.) on responding for concurrently available cocaine or food reinforcement. Results: Under the FR1 schedule, baclofen suppressed intake of low but not high unit injection doses of cocaine. In contrast to haloperidol, baclofen had no effect on the distribution of inter-injection intervals and, instead, produced long pauses in cocaine self-administration. Baclofen dose dependently reduced cocaine-reinforced responding on a PR schedule; concurrent access to a food-reinforced lever demonstrated that the animals retained the capacity to respond at high rates. Conclusion: The effect of baclofen pretreatment on cocaine self-administration is dependent on the unit injection dose of cocaine and on the response requirements of the schedule.

Journal ArticleDOI
TL;DR: The main objectives of this article are to describe and evaluate narrow and broad definitions and provide some suggestions for achieving consensus in defining violence against women.
Abstract: There is considerable disagreement about what harmful behaviors should be included in a definition of nonlethal violence against women in intimate heterosexual relationships. For example, many researchers restrict their focus to physical and/or sexual assaults, whereas others offer formulations that include a much broader range of injurious acts. The main objectives of this article are to describe and evaluate narrow and broad definitions and provide some suggestions for achieving consensus in defining violence against women.

Journal ArticleDOI
TL;DR: The typical daily pattern in snake Tb was an increase in late morning to a plateau temperature in the preferred range, followed by a decrease in the evening to a nighttime plateau; when environmental conditions allowed, free-living body temperatures were nearly identical to those observed in captivity.
Abstract: We used more than 326 000 observations of temperature collected by radio telemetry from 38 individuals over three years to investigate thermoregulation and thermal relations of northern water snakes (Nerodia sipedon) near the northern limit of their distribution in Ontario, Canada. We tested hypotheses concerning the effects of feeding, season, sex, and reproductive condition on thermoregulation of individuals. The mean preferred body temperature (PBT) for captive snakes from the study population was 27.1°C, similar to that reported for other populations, and PBT range (defined as the 25th–75th percentiles of selected temperatures) was 25–30°C. When environmental conditions allowed, the mean and range of body temperature (Tb) of free-living snakes were nearly identical to those observed in captivity. The typical daily pattern in snake Tb was an increase in late morning to a plateau temperature in the preferred range, followed by a decrease in the evening with a nighttime plateau at approximately the temper...

Proceedings ArticleDOI
03 Oct 2000
TL;DR: This work describes the current investigations on what autonomous mobile robots can and cannot do with respect to some coordination problems, and aims to understand the fundamental limitations on what a set of autonomous mobile robots can achieve.
Abstract: The distributed coordination and control of a set of autonomous mobile robots is a problem widely studied in a variety of fields, such as engineering, artificial intelligence, artificial life, and robotics. Generally, in these areas the problem is studied mostly from an empirical point of view. In contrast, we aim to understand the fundamental limitations on what a set of autonomous mobile robots can achieve. We describe the current investigations on what autonomous mobile robots can and cannot do with respect to some coordination problems.