
Showing papers by "Carleton University" published in 2004


Journal ArticleDOI
TL;DR: The current norms represent a more comprehensive set of norms than previously available and will increase the ability of neuropsychologists to determine more precisely the degree to which scores on the TMT reflect impaired performance for varying ages and education.

2,280 citations


Journal ArticleDOI
TL;DR: An overview of important topics and applications in the context of relaying covers different approaches to exploiting the benefits of multihop communications via relays, such as solutions for radio range extension in mobile and wireless broadband cellular networks and solutions to combat shadowing at high radio frequencies.
Abstract: In recent years, there has been an upsurge of interest in multihop-augmented infrastructure-based networks in both industry and academia, such as the seed concept in 3GPP, mesh networks in IEEE 802.16, and coverage extension of HiperLAN/2 through relays or user-cooperative diversity mesh networks. This article, a synopsis of numerous contributions to Working Group 4 of the Wireless World Research Forum and other research work, presents an overview of important topics and applications in the context of relaying. It covers different approaches to exploiting the benefits of multihop communications via relays, such as solutions for radio range extension in mobile and wireless broadband cellular networks (trading range for capacity), and solutions to combat shadowing at high radio frequencies. Furthermore, relaying is presented as a means to reduce infrastructure deployment costs. It is also shown that through the exploitation of spatial diversity, multihop relaying can enhance capacity in cellular networks. We wish to emphasize that while this article focuses on fixed relays, many of the concepts presented can also be applied to systems with moving relays.

1,907 citations


Journal ArticleDOI
TL;DR: The findings suggest the value of models that include a wide range of health and health-determinant variables, and affirm the importance of looking more closely at gender differences in health.

715 citations


Journal ArticleDOI
TL;DR: This paper presents theoretical characterizations and analysis for the physical layer of multihop wireless communications channels, and demonstrates that amplified relaying does not suffer from the error propagation which limits the performance of decoded relaying channels to that of their weakest link.
Abstract: This paper presents theoretical characterizations and analysis for the physical layer of multihop wireless communications channels. Four channel models are considered and developed: the decoded relaying multihop channel; the amplified relaying multihop channel; the decoded relaying multihop diversity channel; and the amplified relaying multihop diversity channel. Two classifications are discussed: decoded relaying versus amplified relaying, and multihop channels versus multihop diversity channels. The channel models are compared, through analysis and simulations, with the "singlehop" (direct transmission) reference channel on the basis of signal-to-noise ratio, probability of outage, probability of error, and optimal power allocation. Each of the four channel models is shown to outperform the singlehop reference channel under the condition that the set of intermediate relaying terminals is selected intelligently. Multihop diversity channels are shown to outperform multihop channels. Amplified relaying is shown to outperform decoded relaying despite noise propagation. This is attributed to the fact that amplified relaying does not suffer from the error propagation which limits the performance of decoded relaying channels to that of their weakest link.

713 citations
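The two relaying classifications in this abstract can be illustrated with a small Monte Carlo sketch of the equivalent end-to-end SNRs commonly used for two-hop channels. This is an illustration under assumed Rayleigh fading with illustrative mean SNRs and threshold, not the paper's exact setup. Note that on raw equivalent SNR the decoded channel is always at least as good as the amplified one; the amplified-relaying advantage reported in the paper shows up in probability of error, where decoded relaying suffers error propagation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
gamma_bar = 10.0   # mean SNR per hop (linear scale, assumed value)
gamma_th = 2.0     # outage threshold (assumed value)

# Exponentially distributed per-hop instantaneous SNRs (Rayleigh fading)
g1 = rng.exponential(gamma_bar, N)
g2 = rng.exponential(gamma_bar, N)
g_direct = rng.exponential(gamma_bar / 4, N)  # direct path with extra path loss (assumed)

# Decoded (decode-and-forward) two-hop: limited by the weakest link
g_df = np.minimum(g1, g2)
# Amplified (amplify-and-forward) two-hop: relay noise is forwarded with the signal
g_af = g1 * g2 / (g1 + g2 + 1.0)

for name, g in [("direct", g_direct), ("decoded 2-hop", g_df), ("amplified 2-hop", g_af)]:
    print(f"{name:16s} outage P(SNR < {gamma_th}) = {np.mean(g < gamma_th):.3f}")
```

Because `min(g1, g2) >= g1*g2/(g1 + g2 + 1)` always holds, the decoded channel's outage probability is never worse than the amplified channel's in this SNR-only comparison.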


Journal ArticleDOI
S. N. Ahmed1, A. E. Anthony2, E. W. Beier3, Alain Bellerive4, S. D. Biller5, J. Boger6, M.G. Boulay7, M. G. Bowler5, T. J. Bowles7, S. J. Brice7, T. V. Bullard8, Yuen-Dat Chan9, M. L. Chen1, X. Chen9, B. T. Cleveland5, G. A. Cox8, X. Dai5, X. Dai4, F. Dalnoki-Veress4, P. J. Doe8, R. S. Dosanjh4, G. Doucas5, M. R. Dragowsky7, C. A. Duba8, F. A. Duncan1, Monica Dunford3, J. A. Dunmore5, E. D. Earle1, S. R. Elliott7, Hal Evans1, G. T. Ewan1, J. Farine4, J. Farine10, H. Fergani5, F. Fleurot10, Joseph A. Formaggio8, Malcolm M. Fowler7, K. Frame4, K. Frame5, B. G. Fulsom1, N. Gagnon, K. Graham1, Darren Grant4, R. L. Hahn6, J. C. Hall2, A. L. Hallin1, E. D. Hallman10, A. S. Hamer7, W. B. Handler1, C. K. Hargrove4, P. J. Harvey1, R. Hazama8, K. M. Heeger, W. J. Heintzelman3, J. Heise7, R. L. Helmer11, R. L. Helmer12, R. J. Hemingway4, Andrew Hime7, M. A. Howe8, P. Jagam13, N. A. Jelley5, Joshua R. Klein3, Joshua R. Klein2, M. Kos1, A. V. Krumins1, T. Kutter12, Christopher C. M. Kyba3, H. Labranche13, R. Lange6, J. Law13, I. T. Lawson13, K. T. Lesko9, J. R. Leslie1, I. Levine14, I. Levine4, S. Luoma10, R. MacLellan1, S. Majerus5, H. B. Mak1, J. Maneira1, A. D. Marino9, N. McCauley3, A. B. McDonald1, S. McGee8, G. McGregor5, C. Mifflin4, K.K.S. Miknaitis8, Guthrie Miller7, B. A. Moffat1, C. W. Nally12, Bernie G. Nickel13, A. J. Noble4, A. J. Noble11, A. J. Noble1, Eric B. Norman9, N. S. Oblath8, C. E. Okada9, R. W. Ollerhead13, John L. Orrell8, S. M. Oser3, S. M. Oser12, C. Ouellet1, S. J. M. Peeters5, A. W. P. Poon9, B. C. Robertson1, R. G. H. Robertson8, E. Rollin4, S. S.E. Rosendahl9, V. L. Rusu3, M. H. Schwendener10, O. Simard4, J. J. Simpson13, C. J. Sims5, David A. Sinclair4, David A. Sinclair11, P. Skensved1, M. W.E. Smith8, N. Starinsky4, R. G. Stokstad9, L. C. Stonehill8, Reda Tafirout10, Y. Takeuchi1, G. Tešić4, M. A. Thomson1, M. Thorman5, R. Van Berg3, R. G. Van de Water7, C. J. Virtue10, B. L. Wall8, D. Waller4, Chris Waltham12, H. Wan Chan Tseung5, D. L. 
Wark15, D. L. Wark16, N. West5, J. B. Wilhelmy7, J. F. Wilkerson8, J. R. Wilson5, J. M. Wouters7, Minfang Yeh6, Kai Zuber5 
TL;DR: The Sudbury Neutrino Observatory has precisely determined the total active (ν_x) ⁸B solar neutrino flux without assumptions about the energy dependence of the ν_e survival probability.
Abstract: The Sudbury Neutrino Observatory has precisely determined the total active (ν_x) ⁸B solar neutrino flux without assumptions about the energy dependence of the ν_e survival probability. The measurements were made with dissolved NaCl in heavy water to enhance the sensitivity and signature for neutral-current interactions. The flux is found to be 5.21 ± 0.27 (stat) ± 0.38 (syst) × 10^6 cm^-2 s^-1, in agreement with previous measurements and standard solar models. A global analysis of these and other solar and reactor neutrino results yields Δm² = 7.1 (+1.2/−0.6) × 10^-5 eV² and θ = 32.5 (+2.4/−2.3) degrees. Maximal mixing is rejected at the equivalent of 5.4 standard deviations.

705 citations


Journal ArticleDOI
TL;DR: It is proposed that fumonisins are potential risk factors for NTD, craniofacial anomalies, and other birth defects arising from neural crest cells because of their apparent interference with folate utilization.
Abstract: Fumonisins are a family of toxic and carcinogenic mycotoxins produced by Fusarium verticillioides (formerly Fusarium moniliforme), a common fungal contaminant of maize. Fumonisins inhibit ceramide synthase, causing accumulation of bioactive intermediates of sphingolipid metabolism (sphinganine and other sphingoid bases and derivatives) as well as depletion of complex sphingolipids, which interferes with the function of some membrane proteins, including the folate-binding protein (human folate receptor alpha). Fumonisin causes neural tube and craniofacial defects in mouse embryos in culture. Many of these effects are prevented by supplemental folic acid. Recent studies in LMBc mice found that fumonisin exposure in utero increases the frequency of developmental defects and administration of folate or a complex sphingolipid is preventive. High incidences of neural tube defects (NTD) occur in some regions of the world where substantial consumption of fumonisins has been documented or plausibly suggested (Guatemala, South Africa, and China); furthermore, a recent study of NTD in border counties of Texas found a significant association between NTD and consumption of tortillas during the first trimester. Hence, we propose that fumonisins are potential risk factors for NTD, craniofacial anomalies, and other birth defects arising from neural crest cells because of their apparent interference with folate utilization.

564 citations


Journal ArticleDOI
TL;DR: This study attempted to distinguish two types of social withdrawal in early childhood: (a) one based on social fear and anxiety despite a desire to interact socially (conflicted shyness) and (b) one based on the lack of a strong motivation to engage in social interaction (social disinterest).
Abstract: This study attempted to distinguish two types of social withdrawal in early childhood: (a) one based on social fear and anxiety despite a desire to interact socially (conflicted shyness) and (b) one based on the lack of a strong motivation to engage in social interaction (social disinterest). Two samples of preschoolers (n = 119 and n = 127) 3-5 years of age participated. Their mothers completed the newly developed Child Social Preference Scale, which was designed to assess conflicted shyness and social disinterest. Maternal ratings of child temperament, parenting style, and social goals, teacher ratings of child social adjustment, observations of child free-play behaviors, and child interview assessments of perceived competence and preference for playing with peers were also collected. Distinct patterns of associations were found between conflicted shyness and social disinterest and outcome variables. Implications for the motivational underpinnings and adjustment outcomes of shyness and social disinterest are explored.

564 citations


Journal ArticleDOI
TL;DR: Data suggest that surfactant composition must be considered in the evaluation of the toxicity of glyphosate-based herbicides; notably, thyroid hormone receptor beta mRNA transcript levels were elevated by exposure to formulations containing glyphosate and POEA.
Abstract: Glyphosate-based herbicides are among the most widely used pesticides in the world. We compared the acute toxicity of the glyphosate end-use formulation Roundup Original to four North American amphibian species (Rana clamitans, R. pipiens, R. sylvatica, and Bufo americanus) and the toxicity of glyphosate technical, the polyethoxylated tallowamine surfactant (POEA) commonly used in glyphosate-based herbicides, and five newer glyphosate formulations to R. clamitans. For R. clamitans, acute toxicity values in order of decreasing toxicity were POEA > Roundup Original > Roundup Transorb > Glyfos AU; no significant acute toxicity was observed with glyphosate technical material or the glyphosate formulations Roundup Biactive, Touchdown, or Glyfos BIO. Comparisons between the four amphibian species showed that the toxicity of Roundup Original varied with species and developmental stage. Rana pipiens tadpoles chronically exposed to environmentally relevant concentrations of POEA or glyphosate formulations containing POEA showed decreased snout-vent length at metamorphosis and increased time to metamorphosis, tail damage, and gonadal abnormalities. These effects may be caused, in some part, by disruption of hormone signaling, because thyroid hormone receptor beta mRNA transcript levels were elevated by exposure to formulations containing glyphosate and POEA. Taken together, the data suggest that surfactant composition must be considered in the evaluation of toxicity of glyphosate-based herbicides.

447 citations


Journal ArticleDOI
TL;DR: This paper reviewed the literature on the role of working memory in the solution of arithmetic problems such as 3 + 4 or 345 + 29 and concluded that mental arithmetic requires central executive resources, even for single-digit problems.
Abstract: We reviewed the literature on the role of working memory in the solution of arithmetic problems such as 3 + 4 or 345 + 29. The literature was neither comprehensive nor systematic, but a few conclusions are tenable. First, all three components of the working memory system proposed by Baddeley (i.e., central executive, phonological loop, and visual‐spatial sketchpad) play a role in mental arithmetic, albeit under different conditions. Second, mental arithmetic requires central executive resources, even for single‐digit problems. Third, further progress in understanding the role of working memory in arithmetic requires that researchers systematically manipulate factors such as presentation conditions (e.g., operand duration, format), problem complexity, task requirements (e.g., verification vs production), and response requirements (e.g., spoken vs written); and that they consider individual differences in solution procedures. Fourth, the encoding‐complex model (Campbell, 1994) seems more likely to account f...

439 citations


Journal ArticleDOI
TL;DR: This meta-analytic review examines the role of core correctional practices in reducing recidivism and provides strong preliminary evidence regarding their effectiveness.
Abstract: Several meta-analyses have rendered strong support for the clinically relevant and psychologically informed principles of human service, risk, need, and general responsivity. However, each of these reviews has focused on specific program components and not on the characteristics of the staff or the specific techniques used to deliver the program. This meta-analytic review examines the role of core correctional practices in reducing recidivism and provides strong preliminary evidence regarding their effectiveness. Staff characteristics and training in core skills must be addressed to ensure the maximum therapeutic impact of correctional treatment programs.

434 citations


Journal ArticleDOI
TL;DR: Interestingly, a partial analysis of the GABAA receptor functional genome revealed high cross-correlations between subunit expression in cortical regions of nondepressed individuals, suggesting a high degree of coordinated gene regulation, but in suicide brains, this regulation was perturbed, independent of overall subunit abundance.
Abstract: Corticotropin-releasing hormone (CRH) and GABA have been implicated in depression, and there is reason to believe that GABA may influence CRH functioning. The levels of CRH, and mRNA for CRH-binding protein, CRH1, and CRH2 receptors, as well as various GABAA receptor subunits (α1, α2, α3, α4, α5, δ, and γ2), were determined in several frontal cortical brain regions of depressed suicide victims and nondepressed individuals who had not died by suicide. Relative to the comparison group, CRH levels were elevated in frontopolar and dorsomedial prefrontal cortex, but not in the ventrolateral prefrontal cortex of suicide victims. Conversely, using quantitative PCR analyses, it was observed that, in frontopolar cortex, mRNA for CRH1, but not CRH2, receptors were reduced in suicide brains, possibly secondary to the high levels of CRH activity. In addition, mRNA of the α1, α3, α4, and δ receptor subunits was reduced in the frontopolar region of suicide victims. Interestingly, a partial analysis of the GABAA receptor functional genome revealed high cross-correlations between subunit expression in cortical regions of nondepressed individuals, suggesting a high degree of coordinated gene regulation. However, in suicide brains, this regulation was perturbed, independent of overall subunit abundance. These findings raise the possibility that the CRH and GABAA receptor subunit changes, or the disturbed coordination between these GABAA receptor subunits, contribute to depression and/or suicidality or are secondary to the illness/distress associated with it.

Journal ArticleDOI
TL;DR: This article reviews place branding, discusses implications for government, business, and research, and calls for integration of the various streams of thought in order to enhance our understanding of the field.
Abstract: Place image has traditionally been important in areas including tourism, country positioning in international relations, the protection of local producers from imports through ‘buy domestic’ campaigns and the export promotion of agricultural and manufactured products. Research and practice in each area, however, has developed independently of the others, even though they all revolve around the same notions of place-based marketing, whether practised systematically or not. More recently, the opening of new emerging markets, health scares including mad cow disease and avian flu, the events of September 11, 2001, labour shortages in technology, and the overall globalisation of markets, have resulted in greatly intensified global competition for increasing exports and for attracting everything from investment and tourism to foreign students and skilled labour. In turn, this has served to focus attention on place equity and systematic marketing, which is likely to have a major impact worldwide: developed nations now weigh-in the global arena with coordinated country branding campaigns, leading to intensified competition among them and a potentially significant disadvantage for weaker countries. This paper reviews place branding, discusses implications for government, business and research, and calls for integration of the various streams of thought in order to enhance our understanding of the field.

Journal ArticleDOI
TL;DR: In this article, an abelian category Cat_C is associated to each cluster C of a cluster algebra of type A_n, algebraic and geometric realizations of Cat_C are given, and the "denominator theorem" of Fomin and Zelevinsky is generalized to any cluster.
Abstract: Cluster algebras were introduced by S. Fomin and A. Zelevinsky in connection with dual canonical bases. Let U be a cluster algebra of type A_n. We associate to each cluster C of U an abelian category Cat_C such that the indecomposable objects of Cat_C are in natural correspondence with the cluster variables of U which are not in C. We give an algebraic realization and a geometric realization of Cat_C. Then, we generalize the ``denominator Theorem'' of Fomin and Zelevinsky to any cluster.
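For context, the "denominator theorem" being generalized concerns the Laurent expansion of cluster variables; in the Fomin-Zelevinsky setting it takes roughly the following shape (a sketch from the general theory, not the paper's precise statement):

```latex
% Laurent phenomenon: every cluster variable x of U, expanded in an
% initial cluster C = (x_1, \dots, x_n), is a Laurent polynomial
\[
  x \;=\; \frac{P(x_1,\dots,x_n)}{x_1^{d_1}\cdots x_n^{d_n}},
  \qquad P \in \mathbb{Z}[x_1,\dots,x_n],\quad x_i \nmid P,
\]
% The denominator theorem describes the exponent vector (d_1, \dots, d_n)
% attached to x; in type A_n these vectors put the cluster variables not in C
% in correspondence with (almost) positive roots.
```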

Journal ArticleDOI
TL;DR: In this paper, the authors evaluate popular and scholarly claims about the peace-promoting benefits of formal truth-telling and truth-seeking mechanisms in the aftermath of civil wars, and conclude that many such claims are flawed or highly contentious, and that truth-telling advocates claim far more about the power of truth-telling than logic or evidence dictates.
Abstract: This essay evaluates popular and scholarly claims about the peace-promoting benefits of formal truth-telling and truth-seeking mechanisms in the aftermath of civil wars. Its purpose is twofold. First, it synthesizes and clearly articulates in one place the full range of claims about the relationship between truth-telling and peacebuilding. Second, it evaluates these claims by systematically examining the core factual and theoretical assumptions on which they are based. An argument is made that many such claims—and their core assumptions—are flawed or highly contentious as well as that truth-telling advocates claim far more about the power of truth-telling than logic or evidence dictates. This is not to say that truth-telling has no role to play in preventing the resumption of violent conflict in postwar societies, only that proponents likely overstate its importance. Before proclaiming the necessity of truth commissions or trials in the aftermath of violent conflict, we need to better understand how truth-telling prevents the recurrence of civil war, how important it is relative to other factors and other peacebuilding strategies, and when it is likely to prove helpful, harmful, or irrelevant.

Journal ArticleDOI
TL;DR: This article examined the predictors of job stress in correctional officers and marked the first meta-analysis of this topic area, concluding that work attitudes (i.e., participation in decision-making, job satisfaction, commitment, and turnover intention) and specific correctional officer problems generated the strongest predictive relationships with job stress.

Journal ArticleDOI
TL;DR: This edited volume collects studies of formulaic sequences, covering their measurement, acquisition, and psycholinguistic processing, including the effect of typographic salience on the look-up and comprehension of unknown formulaic sequences.
Abstract:
1. Preface
2. Formulaic sequences in action: An introduction (by Schmitt, Norbert)
3. Measurement of formulaic sequences (by Read, John)
4. Formulaic performance in conventionalised varieties of speech (by Kuiper, Koenraad)
5. Knowledge and acquisition of formulaic sequences: A longitudinal study (by Schmitt, Norbert)
6. Individual differences and their effects on formulaic sequence acquisition (by Dornyei, Zoltan)
7. Social-cultural integration and the development of formulaic sequences (by Adolphs, Svenja)
8. Are corpus-derived recurrent clusters psycholinguistically valid? (by Schmitt, Norbert)
9. The eyes have it: An eye-movement study into the processing of formulaic sequences (by Underwood, Geoffrey)
10. Exploring the processing of formulaic sequences through a self-paced reading task (by Schmitt, Norbert)
11. Comparing knowledge of formulaic sequences across L1, L2, L3, and L4 (by Spottl, Carol)
12. The effect of typographic salience on the look up and comprehension of unknown formulaic sequences (by Bishop, Hugh)
13. 'Here's one I prepared earlier': Formulaic language learning on television (by Wray, Alison)
14. Facilitating the acquisition of formulaic sequences: An exploratory study in an EAP context (by Jones, Martha A.)
15. Index

Journal ArticleDOI
TL;DR: This paper presents a computational method (CADD) whereby a continuum region containing dislocation defects is coupled to a fully atomistic region, with two key advantages: the ability to accommodate discrete dislocations in the continuum region, and an algorithm for automatically detecting dislocations as they move from the atomistic region to the continuum region and correctly converting them into discrete dislocations, or vice versa.
Abstract: A computational method (CADD) is presented whereby a continuum region containing dislocation defects is coupled to a fully atomistic region. The model is related to previous hybrid models in which continuum finite elements are coupled to a fully atomistic region, with two key advantages: the ability to accommodate discrete dislocations in the continuum region and an algorithm for automatically detecting dislocations as they move from the atomistic region to the continuum region and then correctly "converting" the atomistic dislocations into discrete dislocations, or vice versa. The resulting CADD model allows for the study of 2d problems involving large numbers of defects where the system size is too big for fully atomistic simulation, and improves upon existing discrete dislocation techniques by preserving accurate atomistic details of dislocation nucleation and other atomic scale phenomena. Applications to nanoindentation, atomic scale void growth under tensile stress, and fracture are used to validate and demonstrate the capabilities of the model.

Posted Content
TL;DR: In this article, the authors examine and clarify the burgeoning stakeholder literature that currently seeks to inform management practice, corporate governance and public policy with particular emphasis on the UK, and assess some of the key arguments concerning its potential impact on business performance and competitiveness.
Abstract: The paper has three main objectives. The first aim is to examine and clarify the burgeoning stakeholder literature that currently seeks to inform management practice, corporate governance and public policy with particular emphasis on the UK. We do this by continuing the process of clarification started by Donaldson and Preston (1995), focusing mainly on the political and practitioner literature generated within the UK. We begin this task by setting out a critique of stakeholding and develop this by using four key themes of enquiry. First, we examine stakeholding's conceptual confusion; second, we outline and develop criticism of its underlying pluralist assumptions; third, we consider the problems of implementation; and finally, we assess some of the key arguments concerning its potential impact on business performance and competitiveness. The second aim is to develop and examine the central criticisms of stakeholding from both the neo-liberal and Marxist/radical perspectives. By so doing we identify the key theoretical and practical issues which stakeholder proponents must address if they are to convince sceptics of the model's validity. The third aim is to develop a conceptual framework capable of illustrating the different stakeholder perspectives and assumptions on which they are based. This consists of five continuums: the first locates authors on a left-right political continuum; the second distinguishes between those authors who use stakeholding primarily for analysis and those who use it to formulate and prescribe specific courses of action; the third differentiates between intrinsic (good in itself) and instrumental (means to an end) motives; the fourth identifies the various levels of proposed intervention; and the fifth illustrates the different degrees of enforcement advocated. We believe that this framework provides a clear illustration of our arguments and serves as a useful instrument for clarifying the stakeholder concept. 
In addition, it is used to position or map the work of key authors within the stakeholder debate and we believe it may provide a more coherent basis for future research and debate.

Journal ArticleDOI
TL;DR: The article explores the ways in which discourses of pleasure are deployed strategically in official commentaries on drug and alcohol consumption, arguing that problematic drug consumption appears both without reason (for example "bestial") and unfree (for example "compulsive"), and thus not as "pleasant".
Abstract: The article explores the ways in which discourses of pleasure are deployed strategically in official commentaries on drug and alcohol consumption. Pleasure as a warrantable motive for, or descriptor of, drug and alcohol consumption appears to be silenced the more that consumption appears problematic for liberal government. Tracing examples of this from the 18th century to the present, it is argued that discourses of ‘pleasure’ are linked to discourses of reason and freedom, so that problematic drug consumption appears both without reason (for example ‘bestial’) and unfree (for example ‘compulsive’), and thus not as ‘pleasant’. In turn, changes in this articulation of pleasure, drugs and freedom can be linked with shifts in the major forms taken by liberal governance in the past two centuries, as these constitute freedom differently.

Journal ArticleDOI
TL;DR: This commentary outlines the origins, history, and current status of research on children's social withdrawal: early research on children's peer relationships is explored, followed by a discussion of the relative "neglect" of social withdrawal prior to the 1980s; increased research attention since that time has provided a greater understanding of the causes, correlates, and consequences associated with "solitude".
Abstract: This commentary outlines the origins, history, and current status of research related to children's social withdrawal and social isolation. Early research related to children's peer relationships is first explored, followed by a discussion of the relative "neglect" of social withdrawal prior to the 1980s. Increased research attention since that time is briefly reviewed; this latter research has provided a greater understanding of the causes, correlates, and consequences associated with "solitude." In the latter half of this essay, the roles of biological factors and parenting are described. The essay closes with a discussion of future directions, including the exploration of risk and protective factors for socially withdrawn children, as well as the need for more research related to prevention and intervention.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the conditions under which fences reduce the impact of roads on population persistence and found that fences were more likely to be beneficial the lower the degree of road avoidance and the higher the probability of an animal being killed on the road.
Abstract: Roads affect animal populations in three adverse ways. They act as barriers to movement, enhance mortality due to collisions with vehicles, and reduce the amount and quality of habitat. Putting fences along roads removes the problem of road mortality but increases the barrier effect. We studied this trade-off through a stochastic, spatially explicit, individual-based model of population dynamics. We investigated the conditions under which fences reduce the impact of roads on population persistence. Our results showed that a fence may or may not reduce the effect of the road on population persistence, depending on the degree of road avoidance by the animal and the probability that an animal that enters the road is killed by a vehicle. Our model predicted a lower value of traffic mortality below which a fence was always harmful and an upper value of traffic mortality above which a fence was always beneficial. Between these two values the suitability of fences depended on the degree of road avoidance. Fences were more likely to be beneficial the lower the degree of road avoidance and the higher the probability of an animal being killed on the road. We recommend the use of fences when traffic is so high that animals almost never succeed in their attempts to cross the road or the population of the species of concern is declining and high traffic mortality is known to contribute to the decline. We discourage the use of fences when population size is stable or increasing or if the animals need access to resources on both sides of the road, unless fences are used in combination with wildlife crossing structures. In many cases, the use of fences may be beneficial as an interim measure until more permanent measures are implemented.
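The trade-off logic of this abstract can be sketched with a toy per-approach calculation. This is a deliberately simplified stand-in for the paper's stochastic individual-based model; the avoidance probability `a` and kill probability `k` are illustrative assumptions, not values from the study.

```python
# Toy per-approach model (illustrative, not the paper's individual-based model):
# an animal approaching the road avoids it with probability a; if it enters,
# it is killed with probability k. A fence blocks entry entirely, so with a
# fence both crossings and road kills are zero.
def per_approach(a, k):
    cross = (1 - a) * (1 - k)   # successful crossings without a fence
    death = (1 - a) * k         # road kills without a fence
    return cross, death

for k in (0.1, 0.5, 0.99):
    cross, death = per_approach(a=0.3, k=k)
    print(f"kill prob {k:4.2f}: crossings lost to a fence = {cross:.3f}, "
          f"deaths prevented = {death:.3f}")
```

When `k` is near 1, a fence forfeits almost no successful crossings while preventing most road kills, which matches the paper's recommendation to fence only where animals almost never succeed in crossing; at low `k` the fence removes far more connectivity than mortality.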

Journal ArticleDOI
TL;DR: The present simulations reveal that changes in Type I error rates are greater when sample sizes are smaller, when the difference in variances is slight rather than extreme, and when the significance level is more stringent.
Abstract: Preliminary tests of equality of variances used before a test of location are no longer widely recommended by statisticians, although they persist in some textbooks and software packages. The present study extends the findings of previous studies and provides further reasons for discontinuing the use of preliminary tests. The study found Type I error rates of a two-stage procedure, consisting of a preliminary Levene test on samples of different sizes with unequal variances, followed by either a Student pooled-variances t test or a Welch separate-variances t test. Simulations disclosed that the two-stage procedure fails to protect the significance level and usually makes the situation worse. Earlier studies have shown that preliminary tests often adversely affect the size of the test, and also that the Welch test is superior to the t test when variances are unequal. The present simulations reveal that changes in Type I error rates are greater when sample sizes are smaller, when the difference in variances is slight rather than extreme, and when the significance level is more stringent. Furthermore, the validity of the Welch test deteriorates if it is used only on those occasions where a preliminary test indicates it is needed. Optimum protection is assured by using a separate-variances test unconditionally whenever sample sizes are unequal.
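The two-stage procedure described in this abstract is easy to simulate. The sketch below uses illustrative choices of sample sizes, standard deviations, and replication count (not the paper's design), and SciPy's Levene test defaults to the median-centered Brown-Forsythe variant.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
reps, alpha = 5000, 0.05
n1, n2 = 10, 30        # unequal sample sizes (illustrative)
sd1, sd2 = 2.0, 1.0    # unequal variances, equal means, so H0 is true

rej_two_stage = rej_welch = 0
for _ in range(reps):
    x = rng.normal(0, sd1, n1)
    y = rng.normal(0, sd2, n2)
    # Stage 1: preliminary Levene test of equal variances
    if stats.levene(x, y).pvalue < alpha:
        p = stats.ttest_ind(x, y, equal_var=False).pvalue  # Welch t test
    else:
        p = stats.ttest_ind(x, y, equal_var=True).pvalue   # pooled t test
    rej_two_stage += p < alpha
    # Unconditional Welch test for comparison
    rej_welch += stats.ttest_ind(x, y, equal_var=False).pvalue < alpha

print(f"two-stage Type I error : {rej_two_stage / reps:.3f}")
print(f"Welch-only Type I error: {rej_welch / reps:.3f}")
```

With the smaller sample drawn from the higher-variance population, the two-stage procedure's empirical Type I error exceeds the nominal 0.05, while the unconditional Welch test stays close to it, consistent with the abstract's conclusion.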

Journal ArticleDOI
TL;DR: In this article, a database of 1700 digital seismograms from 186 earthquakes of magnitude mN 2.5-5.6 that occurred in southeastern Canada and the northeastern United States from 1990 to 2003 was compiled.
Abstract: A database of 1700 digital seismograms from 186 earthquakes of magnitude mN 2.5-5.6 that occurred in southeastern Canada and the northeastern United States from 1990 to 2003 was compiled. Maximum-likelihood regression analysis of the database was performed to determine a model for the attenuation of Fourier spectral amplitudes for the shear window, for the vertical and horizontal components of motion, for frequencies from 0.2 to 20 Hz. Fourier amplitudes follow a hinged trilinear attenuation model. Fourier spectral amplitudes decay as R^-1.3 (where R is hypocentral distance) within 70 km of the source. There is a transition zone from 70 to 140 km as the direct waves are joined by strong postcritical reflections, where the attenuation is described as R^+0.2; spectral amplitudes actually increase with distance in this range for low frequencies. Beyond 140 km, the attenuation is well described by R^-0.5, corresponding to geometric spreading in two dimensions. The associated model for the regional quality factor for frequencies greater than 1 Hz can be expressed as Q = 893f^0.32. Q can be better modeled over a wider frequency range (0.2-20 Hz) by a polynomial expression: log Q = 3.052 - 0.393 log f + 0.945 (log f)^2 - 0.327 (log f)^3. The polynomial expression accommodates the observation that Q values are at a minimum (about 1000) near 1 Hz and rise at both lower and higher frequencies. Correction factors for the spectral amplitude model that describe the effects of focal depth on the amplitudes and their attenuation are developed using the subset of events with known focal depth. The attenuation model is similar to that determined from an earlier study with more limited data (Atkinson and Mereu, 1992), but the enlarged database indicates more rapid near-source amplitude decay and higher Q.
The attenuation model is used to remove attenuation effects and thereby determine the apparent source spectrum for each earthquake in the database, and hence determine moment magnitude (M) and Brune stress drop. The events have moment magnitudes in the range from 2.5 to 5. Stress drop increases with moment magnitude for events of M < 4.3, then appears to attain a relatively constant level in the range from 100 to 200 bars for the larger events, as previously noted in Atkinson (1993b). The results of this study provide a useful framework for improving regional ground-motion relations in eastern North America. They further our understanding of attenuation in the region through analysis of an enlarged ground-motion database. In particular, the inclusion of the three-component broadband data gathered over the last decade allows extension of attenuation models to both horizontal and vertical components over a broad frequency range (0.2-20 Hz).
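The hinged trilinear spreading model and the polynomial Q expression quoted in the abstract translate directly into code. The sketch below hardcodes the abstract's coefficients (hinge distances 70 and 140 km; exponents -1.3, +0.2, -0.5) purely for illustration; function names are my own.

```python
import math

def geometric_spreading(R):
    """Trilinear geometric-spreading factor at hypocentral distance R (km):
    R^-1.3 within 70 km, R^+0.2 from 70 to 140 km, R^-0.5 beyond 140 km,
    with the three segments matched at the hinge points."""
    if R <= 70.0:
        return R ** -1.3
    g70 = 70.0 ** -1.3
    if R <= 140.0:
        return g70 * (R / 70.0) ** 0.2       # amplitudes grow with distance here
    return g70 * 2.0 ** 0.2 * (R / 140.0) ** -0.5

def quality_factor(f):
    """Polynomial Q model:
    log Q = 3.052 - 0.393 log f + 0.945 (log f)^2 - 0.327 (log f)^3."""
    x = math.log10(f)
    return 10.0 ** (3.052 - 0.393 * x + 0.945 * x ** 2 - 0.327 * x ** 3)
```

Evaluating `quality_factor` at 0.2, 1, and 20 Hz reproduces the abstract's qualitative claim: Q bottoms out around 10^3 near 1 Hz and rises toward both ends of the band.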

Journal ArticleDOI
TL;DR: This article explored the ways that work and family interact for stay-at-home fathers who "trade cash for care" while they remain connected to traditionally masculine sources of identity such as paid work and they take on unpaid masculine self-provisioning work at home and community work that builds on traditional male interests.
Abstract: Rooted in a qualitative research project with 70 stay-at-home fathers in Canada, this paper explores the ways that work and family interact for fathers who “trade cash for care.” While fathers are at home, they also remain connected to traditionally masculine sources of identity such as paid work and they take on unpaid masculine self-provisioning work at home and community work that builds on traditional male interests. They thus carve out complex sets of relations between home, paid and unpaid work, community work, and their own sense of masculinity. Narratives from stay-at-home fathers speak volumes about the ways in which the long shadow of hegemonic masculinity hangs over them while also pointing to hints of resistance and change as fathers begin to critique concepts of “male time” and market capitalism approaches to work and care. The paper concludes by pointing to several theoretical contributions to research on fatherhood and masculinities as well as to policy implications that arise from this study on the social valuing of unpaid work.

Journal ArticleDOI
TL;DR: In this paper, the authors considered all possible assignments for the recently discovered $X(3872)$ and gave numerical results for the $E1$ radiative widths as well as the three principal types of strong decays: open-charm, $c\overline{c}$ annihilation, and closed-charm hadronic transitions.
Abstract: In this paper we consider all possible $1D$ and $2P$ $c\overline{c}$ assignments for the recently discovered $X(3872).$ Taking the experimental mass as input, we give numerical results for the $E1$ radiative widths as well as the three principal types of strong decays: open-charm, $c\overline{c}$ annihilation, and closed-charm hadronic transitions. We find that many assignments may be immediately eliminated due to the small observed total width. The remaining viable $c\overline{c}$ assignments are $1{}^{3}{D}_{3},$ $1{}^{3}{D}_{2},$ $1{}^{1}{D}_{2},$ $2{}^{3}{P}_{1}$ and $2{}^{1}{P}_{1}.$ A search for the mode $J/\ensuremath{\psi}{\ensuremath{\pi}}^{0}{\ensuremath{\pi}}^{0}$ can establish the C parity of the $X(3872),$ which will eliminate many of these possibilities. Radiative transitions can then be used to test the remaining assignments, as they populate characteristic final states. The $1{}^{3}{D}_{2}$ and $1{}^{1}{D}_{2}$ states are predicted to have large (ca. 50%) radiative branching fractions to ${\ensuremath{\chi}}_{c1}\ensuremath{\gamma}$ and ${h}_{c}\ensuremath{\gamma},$ respectively. We predict that the $1{}^{3}{D}_{3}$ will also be relatively narrow and will have a significant (ca. 10%) branching fraction to ${\ensuremath{\chi}}_{c2}\ensuremath{\gamma},$ and should also be observable in B decay. Tests for non-$c\overline{c}$ $X(3872)$ assignments are also discussed.

Journal ArticleDOI
TL;DR: This article describes how the Canadian print media reported the theoretical risk of blood transmission of Creutzfeldt-Jakob disease and recommends that journalists report information from both expert opinion sources and published studies when communicating information on risk.
Abstract: The media play an important role at the interface of science and policy by communicating scientific information to the public and policy makers. In issues of theoretical risk, in which there is scientific uncertainty, the media's role as disseminators of information is particularly important due to the potential to influence public perception of the severity of the risk. In this article we describe how the Canadian print media reported the theoretical risk of blood transmission of Creutzfeldt-Jakob disease (CJD). We searched 3 newspaper databases for articles published by 6 major Canadian daily newspapers between January 1990 and December 1999. We identified all articles relating to blood transmission of CJD. In duplicate we extracted information from the articles and entered the information into a qualitative software program. We compared the observations obtained from this content analysis with information obtained from a previous policy analysis examining the Canadian blood system's decision-making concerning the potential transfusion transmission of CJD. Our search identified 245 relevant articles. We observed that newspapers in one instance accelerated a policy decision, which had important resource and health implications, by communicating information on risk to the public. We also observed that newspapers primarily relied upon expert opinion (47 articles) as opposed to published medical evidence (28 articles) when communicating risk information. Journalists we interviewed described the challenges of balancing their responsibility to raise awareness of potential health threats with not unnecessarily arousing fear amongst the public. Based on our findings we recommend that journalists report information from both expert opinion sources and from published studies when communicating information on risk. We also recommend that researchers work more closely with journalists to assist them in identifying and appraising relevant scientific information on risk.

Journal ArticleDOI
TL;DR: A very simple modification to the PEG construction for irregular codes is proposed, which considerably improves performance at high signal-to-noise ratios (SNRs) with no sacrifice in low-SNR performance.
Abstract: The progressive-edge-growth (PEG) algorithm is known to construct low-density parity-check codes at finite block lengths with very good performance. In this letter, we propose a very simple modification to the PEG construction for irregular codes, which considerably improves performance at high signal-to-noise ratios (SNRs) with no sacrifice in low-SNR performance.
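For context, the baseline PEG idea (due to Hu, Eleftheriou, and Arnold) that the letter modifies can be sketched as follows: each new edge from a variable node is placed on a check node as far away as possible in the current Tanner graph, breaking ties by lowest check degree. The code below is a simplified illustration of that baseline, not the letter's modified construction; all names are my own.

```python
def peg_construct(n_vars, n_checks, var_degrees):
    """Simplified progressive-edge-growth construction of a bipartite
    Tanner graph with the given variable-node degrees."""
    var_nbrs = [set() for _ in range(n_vars)]
    chk_nbrs = [set() for _ in range(n_checks)]

    def candidate_checks(v):
        # BFS from variable v over the bipartite graph. Prefer check nodes
        # not yet reachable from v (edge creates no cycle); otherwise take
        # the deepest reachable layer (edge creates the longest cycle).
        seen_chk, seen_var = set(), {v}
        frontier, last_layer = set(var_nbrs[v]), set()
        while frontier:
            seen_chk |= frontier
            last_layer = frontier
            next_vars = set().union(*(chk_nbrs[c] for c in frontier)) - seen_var
            seen_var |= next_vars
            frontier = set().union(*(var_nbrs[u] for u in next_vars)) - seen_chk
        unreached = set(range(n_checks)) - seen_chk
        # Fall back past v's own neighbors so no parallel edge is created.
        return unreached or (last_layer - var_nbrs[v]) or (set(range(n_checks)) - var_nbrs[v])

    for v in range(n_vars):
        for _ in range(var_degrees[v]):
            c = min(candidate_checks(v), key=lambda c: (len(chk_nbrs[c]), c))
            var_nbrs[v].add(c)
            chk_nbrs[c].add(v)
    return var_nbrs, chk_nbrs

# Tiny toy graph: 12 degree-2 variable nodes over 6 check nodes.
var_nbrs, chk_nbrs = peg_construct(12, 6, [2] * 12)
```

The min-degree tie-break keeps the check degrees nearly uniform, and the distance-maximizing placement is what gives PEG graphs their large girth at short block lengths.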

Journal ArticleDOI
TL;DR: In this article, the authors consider mean squared errors (MSE) of empirical predictors under a general setup, where ML or REML estimators are used for the second stage.
Abstract: The term “empirical predictor” refers to a two-stage predictor of a linear combination of fixed and random effects. In the first stage, a predictor is obtained but it involves unknown parameters; thus, in the second stage, the unknown parameters are replaced by their estimators. In this paper, we consider mean squared errors (MSE) of empirical predictors under a general setup, where ML or REML estimators are used for the second stage. We obtain second-order approximation to the MSE as well as an estimator of the MSE correct to the same order. The general results are applied to mixed linear models to obtain a second-order approximation to the MSE of the empirical best linear unbiased predictor (EBLUP) of a linear mixed effect and an estimator of the MSE of EBLUP whose bias is correct to second order. The general mixed linear model includes the mixed ANOVA model and the longitudinal model as special cases.
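In the simplest setting, the two-stage idea reduces to a shrinkage formula. The sketch below computes the EBLUP of mu + v_i in a one-way random-effects model; it is illustrative only, and the variance components would in practice come from ML or REML estimation (the "empirical" second stage whose extra uncertainty the paper's second-order MSE approximation accounts for).

```python
def eblup_random_intercept(group_means, group_sizes, sigma2_v, sigma2_e):
    """EBLUP of mu + v_i in the one-way model y_ij = mu + v_i + e_ij,
    with random effects v_i ~ N(0, sigma2_v) and errors e_ij ~ N(0, sigma2_e).

    Stage 1 (BLUP given variance components): GLS-estimate mu, then
    shrink each group mean toward it.  Stage 2 would plug in ML/REML
    estimates of sigma2_v and sigma2_e.
    """
    # GLS estimate of mu: weights inversely proportional to var(ybar_i).
    weights = [1.0 / (sigma2_v + sigma2_e / n) for n in group_sizes]
    mu_hat = sum(w * y for w, y in zip(weights, group_means)) / sum(weights)
    eblups = []
    for y, n in zip(group_means, group_sizes):
        gamma = sigma2_v / (sigma2_v + sigma2_e / n)   # shrinkage factor in (0, 1)
        eblups.append(gamma * y + (1.0 - gamma) * mu_hat)
    return mu_hat, eblups

mu_hat, eblups = eblup_random_intercept([2.0, 0.0], [5, 50], sigma2_v=1.0, sigma2_e=1.0)
```

Each predictor lands between its group mean and the overall estimate, with larger groups shrunk less; the naive MSE of this predictor ignores the variability of the estimated variance components, which is exactly the second-order term the paper supplies.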

Journal ArticleDOI
TL;DR: Application of the proposed construction of rate-compatible low-density parity-check codes to a type-II hybrid automatic repeat request (ARQ) scheme with information block length k=1024 and code rates 8/19 to 8/10 results in a throughput that is only about 0.7 dB away from the Shannon limit.
Abstract: In this letter, we present a framework for constructing rate-compatible low-density parity-check (LDPC) codes. The codes are linear-time encodable and are constructed from a mother code using puncturing and extending. Application of the proposed construction to a type-II hybrid automatic repeat request (ARQ) scheme with information block length k=1024 and code rates 8/19 to 8/10, using an optimized irregular mother code of rate 8/13, results in a throughput which is only about 0.7 dB away from the Shannon limit. This outperforms existing similar schemes based on turbo codes and LDPC codes by up to 0.5 dB.
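The block-length arithmetic behind the quoted rate family is easy to check. The sketch below is my own bookkeeping illustration (not the letter's code construction): for k = 1024 it derives how many parity bits must be punctured from, or appended to, the rate-8/13 mother code to reach each rate 8/19 through 8/10.

```python
def rate_family(k=1024, mother_den=13, dens=range(10, 20)):
    """Map each target rate 8/d to the puncture/extend count relative to
    the rate-8/mother_den mother code (block length n = k * d / 8)."""
    n_mother = k * mother_den // 8
    family = {}
    for d in dens:
        n = k * d // 8
        if n < n_mother:
            family[f"8/{d}"] = ("puncture", n_mother - n)   # fewer bits sent: higher rate
        elif n > n_mother:
            family[f"8/{d}"] = ("extend", n - n_mother)     # extra parity: lower rate
        else:
            family[f"8/{d}"] = ("mother", 0)
    return family

fam = rate_family()
```

Starting from the middle of the rate range (8/13) keeps both the maximum puncturing (384 bits, to reach 8/10) and the maximum extension (768 bits, to reach 8/19) moderate, which is one reason a mid-rate mother code is attractive for type-II hybrid ARQ.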

Journal ArticleDOI
TL;DR: A series of novel luminescent cyclometalated Ir(III) complexes has been synthesized and evaluated for use in unimolecular oxygen-sensing materials; the longest-lived, most emissive complexes are the most promising candidates for future luminescence-quenching-based oxygen-sensing studies.
Abstract: In this study, a series of novel luminescent cyclometalated Ir(III) complexes has been synthesized and evaluated for use in unimolecular oxygen-sensing materials. The complexes Ir(C6)2(vacac), 1, Ir(ppy)2(vacac), 2, fac-Ir(ppy)2(vppy), 3, and mer-Ir(ppy)2(vppy), 4, where C6 = Coumarin 6, vacac = allylacetoacetate, ppy = 2-phenylpyridine, and vppy = 2-(4-vinylphenyl)pyridine, all have pendent vinyl or allyl groups for polymer attachment via the hydrosilation reaction. These luminophore complexes were characterized by NMR, absorption, and emission spectroscopy, luminescence lifetime and quantum yield measurements, elemental analysis, and cyclic voltammetry. Complex 1 was structurally characterized using X-ray crystallography, and a series of 1-D (1H, 13C) and 2-D (1H−1H, 1H−13C) NMR experiments were used to resolve the solution structure of 4. Complexes 1 and 3 displayed the longest luminescence lifetimes and largest quantum efficiencies in solution (τ = 6.0 μs, φ = 0.22 for 1; τ = 0.4 μs, φ = 0.2 for 3) an...
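The oxygen-sensing principle referenced in the TL;DR is collisional (Stern-Volmer) quenching, where I0/I = tau0/tau = 1 + Ksv[O2] and Ksv = kq * tau0. The sketch below uses made-up quenching constants and oxygen levels purely to illustrate why the long-lived complexes (such as 1, tau0 = 6.0 us) are the stronger sensor candidates; it is not taken from the paper.

```python
def stern_volmer_ratio(tau0_us, kq_per_us, o2):
    """Unquenched/quenched intensity ratio I0/I = 1 + kq * tau0 * [O2]."""
    return 1.0 + kq_per_us * tau0_us * o2

# Illustrative numbers only: same bimolecular quenching rate and O2 level
# for two luminophores differing only in unquenched lifetime.
response_complex1 = stern_volmer_ratio(6.0, kq_per_us=0.5, o2=0.21)   # long-lived
response_complex3 = stern_volmer_ratio(0.4, kq_per_us=0.5, o2=0.21)   # short-lived
```

Because the sensitivity Ksv scales linearly with the unquenched lifetime, the 6.0 us complex shows a far larger intensity change for the same oxygen level, which is why lifetime and quantum yield are the figures of merit emphasized in the abstract.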