Showing papers by "University of North Texas" published in 2001


Journal ArticleDOI
TL;DR: In this paper, a review of the literature in the area of alternate gate dielectrics is given; based on reported results and fundamental considerations, the pseudobinary materials systems offer large flexibility and show the most promise toward success.
Abstract: Many materials systems are currently under consideration as potential replacements for SiO2 as the gate dielectric material for sub-0.1 μm complementary metal–oxide–semiconductor (CMOS) technology. A systematic consideration of the required properties of gate dielectrics indicates that the key guidelines for selecting an alternative gate dielectric are (a) permittivity, band gap, and band alignment to silicon, (b) thermodynamic stability, (c) film morphology, (d) interface quality, (e) compatibility with the current or expected materials to be used in processing for CMOS devices, (f) process compatibility, and (g) reliability. Many dielectrics appear favorable in some of these areas, but very few materials are promising with respect to all of these guidelines. A review of current work and literature in the area of alternate gate dielectrics is given. Based on reported results and fundamental considerations, the pseudobinary materials systems offer large flexibility and show the most promise toward success...
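A note for orientation, not part of the abstract: guideline (a), permittivity, is usually quantified through the equivalent oxide thickness (EOT) of a candidate dielectric. The relation below is the standard textbook form, with κ denoting relative permittivity and t_phys the physical film thickness (symbols introduced here for illustration).

\[
  t_{\mathrm{eq}} \;=\; \frac{\kappa_{\mathrm{SiO_2}}}{\kappa_{\mathrm{high}\text{-}\kappa}}\, t_{\mathrm{phys}}
  \;\approx\; \frac{3.9}{\kappa_{\mathrm{high}\text{-}\kappa}}\, t_{\mathrm{phys}}
\]

So a dielectric with κ near 20 can be roughly five times thicker than SiO2 while providing the same gate capacitance, which is why the permittivity criterion leads the list of selection guidelines above.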

5,711 citations


Journal ArticleDOI
TL;DR: As expected, lovastatin was effective in preventing coronary events in participants whose base-line ratio of total cholesterol to high-de...
Abstract: Background Elevated levels of C-reactive protein, even in the absence of hyperlipidemia, are associated with an increased risk of coronary events. Statin therapy reduces the level of C-reactive protein independently of its effect on lipid levels. We hypothesized that statins might prevent coronary events in persons with elevated C-reactive protein levels who did not have overt hyperlipidemia. Methods The level of C-reactive protein was measured at base line and after one year in 5742 participants in a five-year randomized trial of lovastatin for the primary prevention of acute coronary events. Results The rates of coronary events increased significantly with increases in the base-line levels of C-reactive protein. Lovastatin therapy reduced the C-reactive protein level by 14.8 percent (P<0.001), an effect not explained by lovastatin-induced changes in the lipid profile. As expected, lovastatin was effective in preventing coronary events in participants whose base-line ratio of total cholesterol to high-de...

1,444 citations


Journal ArticleDOI
Abstract: As the use of networks in public management increases, more and larger questions expand this research arena. In many ways, public network management is in search of a paradigm equivalent to the hierarchical-organizational authority paradigm of bureaucratic management. We raise and offer preliminary answers to seven metaquestions that address the nature of network management tasks, group process in collaboration, flexibil

1,140 citations


Journal ArticleDOI
TL;DR: In this article, the authors focus on the most commonly used estimate of reliability, internal consistency, which is critical when interpreting study effects and test results, and propose a method to estimate the reliability of test results.
Abstract: Although often ignored, reliability is critical when interpreting study effects and test results. Accordingly, this article focuses on the most commonly used estimate of reliability, internal consi...

964 citations


Journal ArticleDOI
TL;DR: This article provides a review of SET's foundational premises, how it has been used in the marketing literature, and its theoretical limitations, and is intended to assist researchers who wish to use SET to examine business-to-business relational exchange.
Abstract: Social exchange theory (SET) has been used extensively by marketing scholars to explain business-to-business relational exchange. Despite its popularity as a theoretical explanatory mechanism, there is no recent literature review that delineates SET's foundational premises, how it has been used in the marketing literature, and its theoretical limitations. This article provides such a review and is intended to assist researchers who wish to use SET to examine business-to-business relational exchange.

601 citations


Journal ArticleDOI
TL;DR: In this article, the authors conducted an over-the-telephone factorial experiment with 1,663 white Americans and found that Asian and Hispanic neighborhood composition do not matter to whites, but black neighborhood composition does matter to white Americans.
Abstract: Employing an alternative methodology and new data, the authors address the debate concerning the underlying causes of racial residential segregation. Are white Americans avoiding racially mixed neighborhoods because they do not want to live with nonwhites? And if so, is this the case independent of factors with which race is associated, such as crime levels or housing values? An over-the-telephone factorial experiment addresses these issues, measuring variables that shape white Americans' choice of purchasing a home. Based on a national, random-digit-dial survey of 1,663 white Americans, the effects of African American, Asian, and Hispanic neighborhood composition on whites' likelihood of buying a house are explored, as well as the other variables for which race may serve as a proxy. Results indicate that Asian and Hispanic neighborhood composition do not matter to whites. Black neighborhood composition, however, does matter, and matters even more for white Americans with children under age 18. The effect of black composition is net of the variables that whites offer as the primary reasons they do not want to live with blacks. The implications of these findings for segregation trends and for future research are considered.

453 citations


Journal ArticleDOI
TL;DR: Findings reveal significant differences in the reasons that these firms decided to adopt Web technology depending on when the firm made the adoption decision; early adopters placed more emphasis on perceived benefits and compatibility of the Web with existing technology and organizational norms than did later adopters.

446 citations


Journal ArticleDOI
TL;DR: This article examined the accuracy of break point estimation using the endogenous break unit root tests of Zivot and Andrews (1992) and Perron (1997) and found that these tests tend to identify the break point incorrectly at one period behind the true break point, where bias in estimating the persistence parameter and spurious rejections are the greatest.
Abstract: This paper examines the accuracy of break point estimation using the endogenous break unit root tests of Zivot and Andrews (1992) and Perron (1997). We find that these tests tend to identify the break point incorrectly at one‐period behind (TB‐1) the true break point (TB), where bias in estimating the persistence parameter and spurious rejections are the greatest. In addition, this outcome occurs under the null and alternative hypotheses, and more so as the magnitude of the break increases. Consequences of utilizing these endogenous break tests are similar to (incorrectly) omitting the break term Bt in Perron's (1989) exogenous test.
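To make the selection mechanics concrete, here is a minimal illustrative sketch (ours, not the authors' code) of how an endogenous-break unit-root test in the Zivot and Andrews (1992) style locates its break: an ADF-type regression with a level-shift dummy is estimated at every candidate break date, and the date that minimizes the t-statistic on the lagged level is reported. Lag augmentation and the trend-break variants are omitted, and the function and variable names are assumptions for illustration. The paper's finding is that the date chosen by this kind of search tends to land at TB-1 rather than at the true break TB.

import numpy as np

def ols_tstat_last(X, y):
    # OLS of y on X; return the t-statistic on the last regressor.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])

def endogenous_break_search(series, trim=0.15):
    # Grid search over candidate break dates: at each date, fit
    #   dy_t = mu + beta*t + theta*DU_t + alpha*y_{t-1} + e_t
    # (DU_t = 1 after the break) and keep the date with the most
    # negative t-statistic on alpha, mimicking endogenous-break tests.
    y = np.asarray(series, dtype=float)
    dy, ylag = np.diff(y), y[:-1]
    n = len(dy)
    trend = np.arange(1, n + 1, dtype=float)
    lo, hi = int(trim * n), int((1 - trim) * n)
    best_tb, best_t = None, np.inf
    for tb in range(lo, hi):
        du = (np.arange(n) >= tb).astype(float)  # level-shift dummy
        X = np.column_stack([np.ones(n), trend, du, ylag])
        t_alpha = ols_tstat_last(X, dy)
        if t_alpha < best_t:
            best_tb, best_t = tb, t_alpha
    return best_tb, best_t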

373 citations


Journal Article
TL;DR: The results indicate that pentylenetetrazole and picrotoxin interact with overlapping but distinct domains of the GABA(A) receptor, and PTZ decreased open probability by increasing the duration of closed states but had no effect on single-channel conductance or open state duration.
Abstract: Pentylenetetrazole (PTZ) is a central nervous system convulsant that is thought, based on binding studies, to act at the picrotoxin (PTX) site of the gamma-aminobutyric acid type A (GABA(A)) receptor. In the present study, we have investigated the mechanism and site of action of PTZ in recombinant GABA(A) receptors. In rat α1β2γ2 receptors, PTZ inhibited GABA-activated Cl⁻ current in a concentration-dependent, voltage-independent manner, with an IC₅₀ of 0.62 ± 0.13 mM. The mechanism of inhibition appeared competitive with respect to GABA in both rat and human α1β2γ2 receptors. Varying subunit configuration (change or lack of α subunit isoform or lack of γ2 subunit) had modest effects on PTZ-induced inhibition, as evidenced by comparable IC₅₀ values (0.6-2.2 mM) in all receptor configurations tested. This contrasts with PTX and other PTX-site ligands, which have greater affinity in receptors lacking an α subunit. Using a one-site model for PTZ interaction with α1β2γ2 receptors, the association rate (k₊₁) was found to be 1.14 × 10³ M⁻¹ s⁻¹ and the dissociation rate (k₋₁) was 0.476 s⁻¹, producing a functional Kd of 0.418 mM. PTZ could only gain access to its binding site extracellularly. Single-channel recordings demonstrated that PTZ decreased open probability by increasing the duration of closed states but had no effect on single-channel conductance or open state duration. α-Isopropyl-α-methyl-γ-butyrolactone, a compound known to antagonize effects of PTX, also diminished the effects of PTZ. Taken together, our results indicate that pentylenetetrazole and picrotoxin interact with overlapping but distinct domains of the GABA(A) receptor.
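As a quick consistency check (our arithmetic, not part of the abstract), the functional dissociation constant quoted above follows from the standard one-site kinetic relation:

\[
  K_d \;=\; \frac{k_{-1}}{k_{+1}}
  \;=\; \frac{0.476\ \mathrm{s^{-1}}}{1.14 \times 10^{3}\ \mathrm{M^{-1}\,s^{-1}}}
  \;\approx\; 4.2 \times 10^{-4}\ \mathrm{M} \;\approx\; 0.42\ \mathrm{mM},
\]

in agreement with the 0.418 mM value reported.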

336 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined sources of measurement error variance in the Teacher Efficacy Scale (TES), historically the most frequently used instrument for assessing teacher efficacy; reliability generalization was used to characterize the typical score reliability for the TES across studies.
Abstract: Teacher efficacy has proven to be an important variable in teacher effectiveness. It is consistently related to positive teaching behaviors and student outcomes. However, the measurement of this construct is the subject of current debate, which includes critical examination of predominant instruments used to assess teacher efficacy. The present study extends this critical evaluation and examines sources of measurement error variance in the Teacher Efficacy Scale (TES), historically the most frequently used instrument in the area. Reliability generalization was used to characterize the typical score reliability for the TES and potential sources of measurement error variance across studies. Other related instruments were also examined as regards measurement integrity.

313 citations


Journal ArticleDOI
TL;DR: Given the significant costs involved in putting technology into schools and given the potential to harm young children, one prominent report calls for “An immediate moratorium on the further introduction of computers in ... elementary education”.
Abstract: Given the significant costs involved in putting technology into schools and given the potential to harm young children, one prominent report calls for “An immediate moratorium on the further introduction of computers in ... elementary education” [3]. Rather than getting defensive, gesticulating wildly, and dragging out that favorite story about how one child we personally know accomplished an amazing thing with a computer, it’s time to come out of the closet: children simply aren’t using computers in K–12 schools and that’s why there isn’t substantial data on the impact of computers in K–12 education. Let’s look at some basic statistics about availability and use of computers in K–12:

Journal ArticleDOI
TL;DR: A general account of selection is set out to see how well it accommodates these very different sorts of selection, which can generate complexity and novelty primarily because they are so wasteful and inefficient.
Abstract: Authors frequently refer to gene-based selection in biological evolution, the reaction of the immune system to antigens, and operant learning as exemplifying selection processes in the same sense of this term. However, as obvious as this claim may seem on the surface, setting out an account of "selection" that is general enough to incorporate all three of these processes without becoming so general as to be vacuous is far from easy. In this target article, we set out such a general account of selection to see how well it accommodates these very different sorts of selection. The three fundamental elements of this account are replication, variation, and environmental interaction. For selection to occur, these three processes must be related in a very specific way. In particular, replication must alternate with environmental interaction so that any changes that occur in replication are passed on differentially because of environmental interaction. One of the main differences among the three sorts of selection that we investigate concerns the role of organisms. In traditional biological evolution, organisms play a central role with respect to environmental interaction. Although environmental interaction can occur at other levels of the organizational hierarchy, organisms are the primary focus of environmental interaction. In the functioning of the immune system, organisms function as containers. The interactions that result in selection of antibodies during a lifetime are between entities (antibodies and antigens) contained within the organism. Resulting changes in the immune system of one organism are not passed on to later organisms. Nor are changes in operant behavior resulting from behavioral selection passed on to later organisms. But operant behavior is not contained in the organism because most of the interactions that lead to differential replication include parts of the world outside the organism. Changes in the organism's nervous system are the effects of those interactions. The role of genes also varies in these three systems. Biological evolution is gene-based (i.e., genes are the primary replicators). Genes play very different roles in operant behavior and the immune system. However, in all three systems, iteration is central. All three selection processes are also incredibly wasteful and inefficient. They can generate complexity and novelty primarily because they are so wasteful and inefficient.

Journal ArticleDOI
TL;DR: In this article, the contingent rewards subscale of the Multifactor Leadership Questionnaire (MLQ) was examined in an attempt to theoretically explain recent empirical results linking contingent rewards to transformational rather than transactional leadership.
Abstract: The contingent rewards subscale of the Multifactor Leadership Questionnaire (MLQ) was examined in an attempt to theoretically explain recent empirical results linking contingent rewards to transformational rather than transactional leadership. In Study 1, we supported the proposal that the items in the contingent rewards subscale represented two separate factors, an explicit and an implicit psychological contract. In addition, the implicit factor loaded with other transformational subscales and the explicit factor loaded with other transactional subscales. We confirmed these results in Study 2, and supported other hypotheses from transformational leadership theory using the contingent rewards revision. Implications for the transformational leadership construct are discussed. Copyright © 2001 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This article found that sexual appeals were more persuasive overall than matched nonsexual appeals for social marketing topics, but that they had a negative effect on cognitive elaboration (e.g., support and counterarguments).
Abstract: Increasingly, social marketers are using sexual information in public service announcements and collateral material for a wide range of causes. This study builds on previous research to explain how sexual appeals can affect cognitive processing and persuasion for “help-self” social marketing topics. It also goes beyond traditional single-message research designs by testing matched pairs of appeals (sexual/nonsexual) for 13 social marketing topics. The major finding was that sexual appeals were more persuasive overall than matched nonsexual appeals for social marketing topics. Sexual appeals also stimulated more favorable ad execution related thoughts but had a negative effect on cognitive elaboration (e.g., support and counterarguments). Respondents also reported that sexual appeals were more attention getting, likeable, dynamic, and somewhat more apt to increase their interest in the topic than were nonsexual appeals. These findings suggest that persuasion is largely the result of peripheral pro...

Journal ArticleDOI
TL;DR: In this article, an academic year-long teacher research initiative was implemented in an alternative education school in a large school district in the southwest United States, where qualitative and quantitative methodologies were utilized to examine participatory teacher research as an active, collaborative means of professional development for teachers, including its effect on teacher efficacy and empowerment.

Journal ArticleDOI
TL;DR: Since oxidative stress exists at or before the onset of psychosis, the use of antioxidants from the very onset of schizophrenia may reduce the oxidative injury and dramatically improve the outcome of illness.
Abstract: 1. Schizophrenia is a major mental disorder that has a lifetime risk of 1% and typically has its onset at a young age (average age at onset 24 ± 4.6 years) in many cultures around the world. The etiology is unknown, the pathophysiology is complex, and most of the patients need treatment and care for the rest of their lives. 2. Cellular oxidative stress is inferred from tissue levels of reactive oxygen species (ROS, e.g., O2*-, OH*, OH-, NO* and ONOO--) higher than the cell's antioxidant defense, which cause peroxidative cell injury, i.e., peroxidation of membrane phospholipids, particularly esterified essential polyunsaturated fatty acids (EPUFAs), proteins and DNA. 3. Oxidative stress can lead to global cellular peroxidation, predominantly neuronal, since neurons are enriched in highly susceptible EPUFAs and proteins, and damaged DNA is not repaired effectively. 4. Such neuronal peroxidation may affect neuronal function (i.e., membrane transport, loss of mitochondrial energy production, gene expression and therefore receptor-mediated phospholipid-dependent signal transduction), which may explain the altered information processing in schizophrenia. 5. It is possible that the oxidative neuronal injury can be prevented by dietary supplementation of antioxidants (e.g., vitamins E, C and A; beta-carotene, Q-enzyme, flavons, etc.) and that membrane phospholipids can be corrected by dietary supplementation of EPUFAs. 6. It may be that the oxidative stress is lower in populations consuming a low caloric diet rich in antioxidants and EPUFAs, and minimizing smoking and drinking. 7. Oxidative stress exists in schizophrenia, based on altered antioxidant enzyme defense, increased lipid peroxidation and reduced levels of EPUFAs. The lifestyle of schizophrenic patients is also pro-oxidant, i.e., heavy smoking, drinking, high caloric intake with no physical activity, and treatment with pro-oxidant drugs. 8. The patients in developed countries show higher levels of lipid peroxidation and lower levels of membrane phospholipids as compared to patients in the developing countries. 9. Initial observations on the improved outcome of schizophrenia in patients supplemented with EPUFAs and antioxidants suggest the possible beneficial effects of dietary supplementation. 10. Since the oxidative stress exists at or before the onset of psychosis, the use of antioxidants from the very onset of psychosis may reduce the oxidative injury and dramatically improve the outcome of illness.

Journal ArticleDOI
TL;DR: This paper examined empirical relationships predicting the likelihood of inmate misconduct with individual-level (inmate) variables and aggregate levels of prison population crowding, and compared results from hierarchical logistic models and stepwise pooled logistic regression models to see whether results differ significantly by method of estimation.
Abstract: Penologists recognize that both inmate- and prison-level characteristics are relevant to an understanding of individual inmates' behaviors; yet extant studies have focused only on unilevel models with either individual- or aggregate-level predictors and outcomes. To explore the potential of multilevel modeling for related research, we examine empirical relationships predicting the likelihood of inmate misconduct with individual-level (inmate) variables and aggregate levels of prison population crowding. The framework for the model borrows from both individual- and aggregate-level theories of informal social control. We examine three secondary data sets, using information common to each set. We compare results from hierarchical logistic models with those from stepwise pooled logistic regression models to see whether results differ significantly by method of estimation. The pooled models reveal inconsistency in the significance of inmate predictors (social demographics and criminal histories) across the thr...
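For readers unfamiliar with the contrast the authors draw, a random-intercept (hierarchical) logistic specification of the kind described can be written as below; the notation is generic textbook notation rather than the paper's own, with i indexing inmates and j indexing prisons.

\[
  \operatorname{logit}\Pr(y_{ij}=1) \;=\; \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + \gamma\,\mathrm{crowding}_j + u_j,
  \qquad u_j \sim N(0, \tau^2),
\]

where x_ij collects the inmate-level predictors (social demographics and criminal histories). The pooled logistic models the paper compares against correspond to the special case τ² = 0, which ignores the clustering of inmates within prisons.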

Journal ArticleDOI
TL;DR: In this paper, the authors examined the effect of a technology-enriched classroom on student development of higher-order thinking skills and student attitudes toward computers and identified several implications related to classroom design to enhance the development of Higher-Order thinking skills.
Abstract: This study examined the effect of a technology-enriched classroom on student development of higher-order thinking skills and student attitudes toward computers. A sample of 80 sixth-grade and 86 fifth-grade students was tested using the Ross Test of Higher Cognitive Processes and surveyed using the Computer Attitude Questionnaire. The creation of a technology-enriched classroom environment appears to have had a positive effect on student acquisition of higher-order thinking skills. This study identified several implications related to classroom design to enhance the development of higher-order thinking skills. Teachers reported that the technology-enriched classroom differed from the traditional classroom in several significant ways.

Journal ArticleDOI
TL;DR: In this article, the authors discuss the nature of disaster and the future of emergency management, put forth a model of vulnerability, highlight the plethora of factors that contribute to calamitous events, and introduce the concept of invulnerable development as a method of vulnerability management.
Abstract: Discusses the nature of disaster and the future of emergency management. After exploring differing historical perspectives of disaster, puts forth a model of vulnerability and highlights the plethora of factors that contribute to calamitous events. Introduces the concept of invulnerable development as a method of vulnerability management and compares it to other terms that have been proposed as guides for future disaster policy. The central argument to be made is that vulnerability is, or should be, the key concept for disaster scholarship and reduction.

Journal ArticleDOI
TL;DR: Several technical developments are occurring that will increase the feasibility of cell-based biosensors for field applications; these developments include stem cell and 3D culture technologies.

Journal ArticleDOI
TL;DR: Findings show that an increase in effort sense during constant-load exercise can activate both insular and thalamic regions and elevate cardiovascular responses, but that decreases in effort sense do not reduce cardiovascular responses below the level required to sustain metabolic needs.
Abstract: The purpose of this investigation was to hypnotically manipulate effort sense during dynamic exercise and determine whether cerebral cortical structures previously implicated in the central modulat...

Journal ArticleDOI
TL;DR: In this paper, experimental manipulation of combinations of nonverbal and verbal immediacy allowed the authors to more precisely test these causal links in relation to recall, learning loss, and affective learning.
Abstract: Previous research involving few experiments generally claims that higher nonverbal and verbal immediacy by teachers increases students’ affective and cognitive learning. In this study, experimental manipulation of combinations of nonverbal and verbal immediacy allowed us to more precisely test these causal links in relation to recall, learning loss, and affective learning. Obtained effects strengthened previous research associating teacher nonverbal immediacy with enhanced cognitive and affective learning outcomes. However, higher verbal immediacy in the experimental manipulations, when combined with higher and lower nonverbal immediacy, was not observed to produce greater cognitive learning. Correlations among recall, learning loss, and affective learning measures were significant, but the cognitive measures were not strongly associated.

Book ChapterDOI
01 Jan 2001
TL;DR: In this paper, the authors present a review of the foundations of statistical mechanics and its thermodynamical implications, including theoretical, experimental and computational evidence and connections, as well as some perspectives for the future.
Abstract: The domain of validity of standard thermodynamics and Boltzmann-Gibbs statistical mechanics is focused on along a historical perspective. It is then formally enlarged in order to hopefully cover a variety of anomalous systems. The generalization concerns nonextensive systems, where nonextensivity is understood in the thermodynamical sense. This generalization was first proposed in 1988 inspired by the probabilistic description of multifractal geometry, and has been intensively studied during this decade. In the present effort, we describe the formalism, discuss the main ideas, and then exhibit the present status in what concerns theoretical, experimental and computational evidences and connections, as well as some perspectives for the future. The whole review can be considered as an attempt to clarify our current understanding of the foundations of statistical mechanics and its thermodynamical implications.
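For orientation (our summary, not quoted from the chapter), the 1988 generalization referred to here is the nonextensive entropy

\[
  S_q \;=\; k\,\frac{1 - \sum_i p_i^{\,q}}{q-1},
  \qquad \lim_{q \to 1} S_q \;=\; -k \sum_i p_i \ln p_i ,
\]

which recovers the Boltzmann-Gibbs entropy as q approaches 1. For two independent systems A and B it satisfies

\[
  S_q(A{+}B) \;=\; S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B),
\]

so the entropy is additive only in the q = 1 limit; this pseudo-additivity is the precise sense in which the generalized systems are called nonextensive in the thermodynamical sense.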

Journal ArticleDOI
TL;DR: This article examined the construct validity of the Ethics Position Questionnaire (EPQ) and found that the relationship between idealism and moral judgments demonstrated modest predictive validity, but the appreciably weaker influence of relativism and the emergence of a veracity factor raise questions about the utility of the EPQ typology.
Abstract: Individual differences in ethical ideology are believed to play a key role in ethical decision making. Forsyth’s (1980) Ethics Position Questionnaire (EPQ) is designed to measure ethical ideology along two dimensions, relativism and idealism. This study extends the work of Forsyth by examining the construct validity of the EPQ. Confirmatory factor analyses conducted with independent samples indicated three factors – idealism, relativism, and veracity – account for the relationships among EPQ items. In order to provide further evidence of the instrument’s nomological and convergent validity, correlations among the EPQ subscales, dogmatism, empathy, and individual differences in the use of moral rationales were examined. The relationship between EPQ measures of idealism and moral judgments demonstrated modest predictive validity, but the appreciably weaker influence of relativism and the emergence of a veracity factor raise questions about the utility of the EPQ typology.

Journal ArticleDOI
TL;DR: In this paper, the authors consider subshifts of finite type on the symbolic space generated by incidence matrices over a countably infinite alphabet and construct a new class of Gibbs states of Hölder continuous potentials on these symbol spaces.
Abstract: We consider subshifts of finite type on the symbolic space generated by incidence matrices over a countably infinite alphabet. We extend the definition of topological pressure to this context and, as our main result, we construct a new class of Gibbs states of Hölder continuous potentials on these symbol spaces. We establish some basic stochastic properties of these Gibbs states: exponential decay of correlations, central limit theorem and an a.s. invariance principle. This is accomplished via detailed studies of the associated Perron-Frobenius operator and its conjugate operator.
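For context, the defining Gibbs property being extended here can be stated in its standard finite-alphabet form (included for orientation rather than taken from the paper): a shift-invariant measure μ is a Gibbs state for a Hölder continuous potential φ if there is a constant C ≥ 1 such that

\[
  C^{-1} \;\le\;
  \frac{\mu\bigl([\omega_1 \dots \omega_n]\bigr)}
       {\exp\Bigl(-nP(\varphi) + \sum_{j=0}^{n-1} \varphi(\sigma^{j}\omega)\Bigr)}
  \;\le\; C
\]

for every admissible finite word and every point ω in the corresponding cylinder, where σ is the shift map and P(φ) the topological pressure. The paper's contribution is to extend the pressure and construct such states when the alphabet is countably infinite.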

Journal ArticleDOI
TL;DR: This study applies the uses and gratifications perspective to better understand the factors motivating commercial Web site use, and identifies a new media use gratification unique to the Internet: socialization, or using the medium to communicate with people.
Abstract: The uses and gratifications theoretical framework has continued to prove useful in the study of new and emerging media. In previous research on television as a medium, motivations for media use have been grouped into either process gratifications (motivations associated with using the medium, like channel surfing) or content gratifications (motivations related to information or entertainment delivered by the medium, like watching the evening news for information). This study applies the uses and gratifications perspective to better understand the factors motivating commercial Web site use, and identifies a new media use gratification unique to the Internet: socialization, or using the medium to communicate with people. Through the cooperation of two major on-line companies, this research reports the results of a two-part study that begins with the identification of 179 motivations for Web use and subsequently reduces those to five primary underlying factors. These factors are discussed and related to three key indicators: frequency of Web use, frequency of computer use, and affinity with the computer. Implications for new social gratifications for Internet use are discussed, and directions for future research are proposed.

Journal ArticleDOI
TL;DR: In this paper, Agranoff and McGuire explore two venerable models of management, top-down and donor-recipient, and two emergent models, jurisdiction-based and network, for public management in twenty-first-century federalism.
Abstract: Public administration and the processes of federalism have merged to a nearly indistinguishable point. Since the "cooperative federalism" of the 1930s, managing within the federal system has become an increasingly more important activity. However, federalism is not static. As policy responsibilities between the national and subnational governments have evolved and devolved, governing authority has overlapped across levels to a point where all actors are involved simultaneously to varying degrees (Wright 1988). Attention must be given to operations in such a system. Managing across governments and across organizations within the complex and continuously changing processes of federalism deserves the attention of both scholars and practitioners. Such activity has become the very heart of public administration and management. The continuing growth of federal grants and new regulatory programs, increased federal-state programming, the continuation of some federal-local programs, federal initiatives to nongovernmental organizations, and expanded roles for state government have changed the context of public administration from single-organization operations to boundary-spanning operations (Agranoff and McGuire 1998a). Not only do local public managers now operate within their home agency and jurisdiction, they also perform numerous identifiable activities within the vertical realm, which includes the state and federal governments, and also horizontal activities, which involve other local governments and many nongovernmental organizations (Agranoff and McGuire 1998b; Jennings and Krane 1994; Mandell 1990; Wright and Krane 1998). These forces have put a premium on collaborative actions and transactions across governmental boundaries. While the resilience of federalism as a form of governance is undeniable--its shape and operation has caused and been caused by changing social, economic, and political trends (Watts 1996)--the search for appropriate management models within the changing processes of federalism remains a difficult task. To help focus that search, this article explores models of management. Two venerable models, top-down and donor-recipient, and two emergent models, jurisdiction-based and network, are presented. Each model's prevalence and applicability to twenty-first-century federalism are examined. While they are adaptable to explanation in a number of policy arenas, the emerging models are confirmed by our empirical study of 237 city governments and the intergovernmental and collaborative activity of government officials promoting economic development. The primary concern of this paper is to contribute to the understanding of public-management approaches by demonstrating how emergent models exist alongside more traditional models as a result of shifts in federalism. The term "model" employed here follows Kaplan's (1964, 266-7) usage: It is a "scientific metaphor" that directs attention to certain resemblances between theoretical entities and the real subject-matter; one type of system can be shown to be a consistent interpretation of another. Our search here is not for characteristic metaphors of federalism itself (Wright 1988), but how policy making and management can vary within federalism across time and policy realms. Thus, each model is not only described, its temporal and policy-specific relevance is also analyzed. Concern for management models is hardly new. 
Elazar (1964, 248) suggests that, from the founding period, intergovernmental cooperation was necessary, and methods of providing for collaboration among the various parts of the federal system were sought out continually. Similarly, Grodzins (1966) equates administrative practices in federalism with shared functions, and Leach (1970) identifies the management of grants programs as involving joint action and manpower from all levels of government. Even during Nixon's New Federalism attempt to streamline the intergovernmental system, Walker (1974, 30) claimed that managing within federalism was still in "a state of considerable confusion. …

Journal ArticleDOI
TL;DR: Treatment with lovastatin 20 to 40 mg daily for primary prevention of coronary heart disease was well tolerated and reduced the risk of first acute coronary events without increasing the risk of either noncardiovascular mortality or cancer.
Abstract: This study presents the long-term safety data from AFCAPS/TexCAPS, the first primary prevention trial to demonstrate that men and women with average levels of low-density lipoprotein cholesterol (LDL-C) and below average levels of high-density lipoprotein cholesterol (HDL-C) can significantly benefit from long-term treatment to lower LDL-C; lovastatin 20 to 40 mg/day reduced the risk of a first acute major coronary event (fatal or nonfatal myocardial infarction, unstable angina, or sudden death) by 37% (p = 0.00008). This double-blind randomized, placebo-controlled trial, in 6,605 generally healthy middle-aged and older men and women, had prespecified end point and cancer analyses. All analyses were intention-to-treat. Safety monitoring included history, physical examination, and laboratory studies (including hepatic transaminases and creatine phosphokinase [CPK]). All participants, even those who discontinued treatment, were contacted annually for vital status, cardiovascular events, and cancer history. After an average of 5.2 years of follow-up, there were 157 deaths (80 receiving lovastatin and 77 receiving placebo; relative risk [RR] 1.04; 95% confidence interval [CI] 0.76 to 1.42; p = 0.82); of which 115 were noncardiovascular (RR 1.21; CI 0.84 to 1.74; p = 0.31), and of these, 82 were due to cancer (RR 1.41; CI 0.91 to 2.19; p = 0.13). There were no significant differences between treatment groups in overall cancer rates, discontinuations for noncardiovascular adverse experiences, or clinically important elevations of hepatic transaminases or CPK. Among those who used cytochrome P450 isoform (CYP3A4) inhibitors, there were no treatment group differences in the frequency of clinically important muscle-related adverse events. Treatment with lovastatin 20 to 40 mg daily for primary prevention of coronary heart disease was well tolerated and reduced the risk of first acute coronary events without increasing the risk of either noncardiovascular mortality or cancer.

Journal ArticleDOI
TL;DR: In this paper, a series of experiments was carried out on pure and carbon-doped nanocrystalline nickel, with carbon concentrations from 500 to 1000 ppm, at temperatures from room temperature to 300°C.
Abstract: The potential engineering applications of nanocrystalline materials call for more detailed study of deformation and fracture mechanisms at room and elevated temperatures under tensile loading. This paper reports results of a series of experiments carried out on nickel and carbon-doped nanocrystalline nickel with carbon concentrations from 500 to 1000 ppm, tested from room temperature to 300°C. Grain growth was observed in the nanocrystalline nickels as the testing temperature increased, with fast grain growth noticed at 300°C. Pure nanocrystalline nickel experienced abnormal grain growth at 500°C and its tensile properties were reduced to a very low level. The addition of carbon helped to stabilize the microstructure of nanocrystalline nickel at intermediate temperatures; however, the carbon-doped nickels exhibited lower tensile properties. The nanocrystalline nickels displayed a conventional Hall–Petch relationship. The results are discussed in relation to microstructural characteristics examined by TEM and SEM.
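For reference (standard form, not reproduced from the paper), the conventional Hall–Petch relationship the authors invoke relates yield strength to grain size as

\[
  \sigma_y \;=\; \sigma_0 + k_y\, d^{-1/2},
\]

where d is the mean grain diameter, σ₀ the friction stress, and k_y the Hall–Petch coefficient. Observing this relationship means strength continued to increase as grain size decreased, rather than showing the inverse (softening) behavior sometimes reported at the finest nanocrystalline grain sizes.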

Journal ArticleDOI
TL;DR: Using the magazine publishing industry as a context, this paper provides an empirical exploration of various revenue streams and relates them to manager assessments of the performance of the firm's online efforts, presenting, to the authors' knowledge, the first empirical exploration of the link between the performance of an online effort and the various revenue streams pursued.