Author

Ronald P. Evens

Bio: Ronald P. Evens is an academic researcher from Tufts University. The author has contributed to research in topics: Butorphanol & Pharmacy. The author has an h-index of 11 and has co-authored 21 publications receiving 312 citations. Previous affiliations of Ronald P. Evens include Bristol-Myers Squibb & Amgen.

Papers
Journal ArticleDOI
TL;DR: Subjects had lower total phenytoin 48-hr area under the curve (AUC) values when on aspirin but free phenytoin AUC values were unchanged, suggesting more rapid clearance of total phenytoin probably compensated for salicylate displacement of phenytoin from plasma protein binding sites.
Abstract: Six healthy male subjects received phenytoin sodium as nine 100-mg capsules alone or with aspirin in a randomized, crossover fashion. Aspirin, 975 mg every 6 hr, was started 22 hr before a phenytoin dose and continued for an additional 48 hr during blood sampling. Mean 4-hr plasma salicylate levels ranged from 104 to 157 micrograms/ml during the sampling period. Individual mean values for the free fraction of salicylate varied from 0.107 to 0.167. The fraction of free phenytoin in plasma rose from 0.128 +/- 0.004 to 0.163 +/- 0.009 when aspirin was given (p less than 0.001). Subjects had lower total phenytoin 48-hr area under the curve (AUC) values when on aspirin (323 +/- 36 without and 261 +/- 49 micrograms·hr·ml-1 with aspirin; p less than 0.001) but free phenytoin AUC values were unchanged (41.4 +/- 4.5 and 42.4 +/- 9.0 micrograms·hr·ml-1; p less than 0.5). Thus, more rapid clearance of total phenytoin probably compensated for salicylate displacement of phenytoin from plasma protein binding sites. Total phenytoin levels for therapeutic monitoring must be interpreted cautiously when patients also receive salicylate.
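The abstract's conclusion can be sanity-checked with simple arithmetic: free-drug exposure is approximately the total AUC multiplied by the free fraction. The short Python sketch below reproduces that check using the mean values reported above; the function name and the approximation free AUC ≈ free fraction × total AUC are illustrative assumptions, not the authors' analysis code.

```python
# A minimal sketch (not the authors' analysis code) checking internal
# consistency of the reported values: free AUC ~= free fraction x total AUC.

def free_auc(total_auc_ug_hr_ml: float, free_fraction: float) -> float:
    """Estimate free-drug AUC as total AUC multiplied by the free fraction."""
    return total_auc_ug_hr_ml * free_fraction

# Mean values taken from the abstract above
without_aspirin = free_auc(323.0, 0.128)  # ~41.3 ug*hr/ml (reported: 41.4)
with_aspirin = free_auc(261.0, 0.163)     # ~42.5 ug*hr/ml (reported: 42.4)

print(f"Free phenytoin AUC without aspirin: {without_aspirin:.1f} ug*hr/ml")
print(f"Free phenytoin AUC with aspirin:    {with_aspirin:.1f} ug*hr/ml")
```

Both estimates agree closely with the reported free AUC values (41.4 and 42.4 micrograms·hr·ml-1), consistent with increased clearance offsetting the protein-binding displacement.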

65 citations

Journal ArticleDOI
TL;DR: The future impact of biotechnology is promising, as long as the public and private sectors continue to foster policies and provide funds that lead to scientific breakthroughs and all stakeholders establish policies to ensure that the therapeutic advances that mitigate or cure medical conditions that currently have inadequate or no available therapies are accessible to the public at a reasonable cost.
Abstract: For more than three decades the field of biotechnology has had an extraordinary impact on science, health care, law, the regulatory environment, and business. During this time more than 260 novel b...

43 citations

Journal ArticleDOI
TL;DR: The R&D paradigm has changed, with biotechnology now one of the major focuses of new product development with novel molecules across the whole biopharma industry.
Abstract: The biotechnology segment of the overall biopharma industry has existed for only about 40–45 years, as a driver of new product development. This driving force was initiated with the FDA approval of recombinant human insulin in 1982, originating from the Genentech company. The pharma industry in the early years of the 1970s and 1980s engaged with biotechnology companies only to a small extent with their in-licensing of a few recombinant molecules, led by Roche, Eli Lilly, and Johnson and Johnson. However, subsequently and dramatically over the last 25 years, biotechnology has become a primary driver of product and technology innovation and has become a cornerstone in new product development by all biopharma companies. This review demonstrates these evolutionary changes regarding approved products, product pipelines, novelty of the products, FDA approval rates, product sales, financial R&D investments in biotechnology, partnerships, mergers and acquisitions, and patent issues. We now have about 300 biotechnology products approved in the USA, covering 16 medical disciplines and about 250 indications, with the engagement of 25 pharma companies, along with their biotechnology company innovators and partners. The biotechnology pipeline involves over 1000 molecules in clinical trials, including over 300 molecules associated with the top 10 pharma companies. Product approval rates by the FDA for biotechnology products are over double the rate for drugs. The R&D paradigm has changed, with biotechnology now one of the major focuses for new product development with novel molecules by the whole biopharma industry.

39 citations

Journal ArticleDOI
TL;DR: The marriage of biotechnology and the pharmaceutical industry (pharma) is predicated on an evolution in technology and product innovation that has come as a result of advances in both the science and the business practices of the biotechnology sector in the past 30 years.
Abstract: The marriage of biotechnology and the pharmaceutical industry (pharma) is predicated on an evolution in technology and product innovation. It has come as a result of advances in both the science and the business practices of the biotechnology sector in the past 30 years. Biotechnology products can be thought of as “intelligent pharmaceuticals,” in that they often provide novel mechanisms of action, new approaches to disease control, higher clinical success rates, improved patient care, extended patent protection, and a significant likelihood of reimbursement. Although the first biotechnology product, insulin, was approved just 32 years ago in 1982, today there are more than 200 biotechnology products commercially available. Research has expanded to include more than 900 biotechnology products in clinical trials. Pharma is substantially engaged in both the clinical development of these products and their commercialization. Clinical Pharmacology & Therapeutics (2014) 95(5), 528–532; doi:10.1038/clpt.2014.14

27 citations


Cited by
Journal ArticleDOI
TL;DR: Evidence from nonrandomized studies and everyday clinical experience does indicate that measuring serum concentrations of old and new generation antiepileptic drugs (AEDs) can have a valuable role in guiding patient management provided that concentrations are measured with a clear indication and are interpreted critically, taking into account the whole clinical context.
Abstract: Although no randomized studies have demonstrated a positive impact of therapeutic drug monitoring (TDM) on clinical outcome in epilepsy, evidence from nonrandomized studies and everyday clinical experience does indicate that measuring serum concentrations of old and new generation antiepileptic drugs (AEDs) can have a valuable role in guiding patient management provided that concentrations are measured with a clear indication and are interpreted critically, taking into account the whole clinical context. Situations in which AED measurements are most likely to be of benefit include (1) when a person has attained the desired clinical outcome, to establish an individual therapeutic concentration which can be used at subsequent times to assess potential causes for a change in drug response; (2) as an aid in the diagnosis of clinical toxicity; (3) to assess compliance, particularly in patients with uncontrolled seizures or breakthrough seizures; (4) to guide dosage adjustment in situations associated with increased pharmacokinetic variability (e.g., children, the elderly, patients with associated diseases, drug formulation changes); (5) when a potentially important pharmacokinetic change is anticipated (e.g., in pregnancy, or when an interacting drug is added or removed); (6) to guide dose adjustments for AEDs with dose-dependent pharmacokinetics, particularly phenytoin.
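Point (6) above rests on phenytoin's dose-dependent (Michaelis-Menten) kinetics, in which steady-state concentration rises disproportionately as the dosing rate approaches the maximum elimination capacity. The Python sketch below illustrates that nonlinearity; the Vmax and Km values are assumed textbook-typical adult figures used only for illustration, not values taken from this review.

```python
# Illustrative sketch of phenytoin's dose-dependent (Michaelis-Menten) kinetics,
# the reason item (6) singles it out for concentration monitoring.
# Vmax and Km are assumed textbook-typical adult values, not data from this review.

def phenytoin_css(dose_mg_per_day: float,
                  vmax_mg_per_day: float = 490.0,  # assumed ~7 mg/kg/day x 70 kg
                  km_mg_per_l: float = 4.0) -> float:
    """Predicted steady-state concentration (mg/L): Css = Km * R / (Vmax - R)."""
    if dose_mg_per_day >= vmax_mg_per_day:
        raise ValueError("dose rate must be below Vmax for a steady state to exist")
    return km_mg_per_l * dose_mg_per_day / (vmax_mg_per_day - dose_mg_per_day)

for dose in (300, 400):
    print(f"{dose} mg/day -> predicted Css ~ {phenytoin_css(dose):.1f} mg/L")
# 300 mg/day -> ~6.3 mg/L, 400 mg/day -> ~17.8 mg/L: a 33% dose increase roughly
# triples the predicted concentration, which is why dose adjustments are better
# guided by measured levels than by proportional scaling.
```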

901 citations

Journal ArticleDOI
14 Aug 1987 - JAMA
TL;DR: Although a variety of univariate statistics are included, certain topics that are important in medical research are not, and there is little or no discussion of multiple regression, life-table techniques, or pooling of studies.
Abstract: This book attempts to achieve a difficult goal: to teach statistics to the novice so as to impart a liking and understanding of statistics. The book is geared toward a medical audience, since most examples are from the medical literature. The structure of the book consists of the following elements in each chapter: a small number of statistical rules of thumb, followed by a nontechnical explanation, a demonstration of how to work through the mechanics of doing the statistical test in question, a summary, and sample problems to be solved by the reader. (The answers, with explanations, are provided in an appendix.) Although a variety of univariate statistics are included, certain topics that are important in medical research are not. For example, there is little or no discussion of multiple regression, life-table techniques, or pooling of studies. These omissions, especially of multiple regression, are unfortunate. The Primer was derived from

898 citations

Journal Article
TL;DR: These guidelines are based on consensus of Canadian experts in neurology, emergency medicine, psychiatry, psychology, family medicine and pharmacology, and consumers and are likely to lead to substantial benefits in both human and economic terms.
Abstract: OBJECTIVE: To provide physicians and allied health care professionals with guidelines for the diagnosis and management of migraine in clinical practice. OPTIONS: The full range and quality of diagnostic and therapeutic methods available for the management of migraine. OUTCOMES: Improvement in the diagnosis and treatment of migraine, which will lead to a reduction in suffering, increased productivity and decreased economic burden. EVIDENCE AND VALUES: The creation of the guidelines followed a needs assessment by members of the Canadian Headache Society and included a statement of objectives; development of guidelines by multidisciplinary working groups using information from literature reviews and other resources; comparison of alternative clinical pathways and description of how published data were analysed; definition of the level of evidence for data in each case; evaluation and revision of the guidelines at a consensus conference held in Ottawa on Oct. 27-29, 1995; redrafting and insertion of tables showing key variables and data from various studies and tables of data with recommendations; and reassessment by all conference participants. BENEFITS, HARMS AND COSTS: Accuracy in diagnosis is a major factor in improving therapeutic effectiveness. Improvement in the precise diagnosis of migraine, coupled with a rational plan for the treatment of acute attacks and for prophylactic therapy, is likely to lead to substantial benefits in both human and economic terms. RECOMMENDATIONS: The diagnosis of migraine can be improved by using modified criteria of the International Headache Society as well as a semistructured patient interview technique. Appropriate treatment of symptoms should take into account the severity of the migraine attack, since most patients will have attacks of differing severity and can learn to use medication appropriate for each attack. When headaches are frequent or particularly severe, prophylactic therapy should be considered. Both the avoidance of migraine trigger factors and the application of nonpharmacological therapies play important roles in overall migraine management and will be addressed at a later date. VALIDATION: The guidelines are based on consensus of Canadian experts in neurology, emergency medicine, psychiatry, psychology, family medicine and pharmacology, and consumers. Previous guidelines did not exist. Field testing of the guidelines is in progress.

213 citations

Book ChapterDOI
01 Jan 2007
TL;DR: The following information is available to be viewed/printed in Adobe Acrobat PDF format.
Abstract: The following information is available to be viewed/printed in Adobe Acrobat PDF format.
I. Statement of Purpose
   A. About the Uniform Requirements
   B. Potential Users of the Uniform Requirements
   C. How to Use the Uniform Requirements
II. Ethical Considerations in the Conduct and Reporting of Research
   A. Authorship and Contributorship
      1. Byline Authors
      2. Contributors Listed in Acknowledgments
   B. Editorship
      1. The Role of the Editor
      2. Editorial Freedom
   C. Peer Review
   D. Conflicts of Interest
      1. Potential Conflicts of Interest Related to Individual Authors’ Commitments
      2. Potential Conflicts of Interest Related to Project Support
      3. Potential Conflicts of Interest Related to Commitments of Editors, Journal Staff, or Reviewers
   E. Privacy and Confidentiality
      1. Patients and Study Participants
      2. Authors and Reviewers
   F. Protection of Human Subjects and Animals in Research
III. Publishing and Editorial Issues Related to Publication in Biomedical Journals
   A. Obligation to Publish Negative Studies
   B. Corrections, Retractions, and “Expressions of Concern”
   C. Copyright
   D. Overlapping Publications
      1. Duplicate Submission
      2. Redundant Publication
      3. Acceptable Secondary Publication
      4. Competing Manuscripts Based on the Same Study
         a. Differences in Analysis or Interpretation
         b. Differences in Reported Methods or Results
      5. Competing Manuscripts Based on the Same Database
   E. Correspondence
   F. Supplements, Theme Issues, and Special Series
   G. Electronic Publishing
   H. Advertising
   I. Medical Journals and the General Media
   J. Obligation to Register Clinical Trials
IV. Manuscript Preparation and Submission
   A. Preparing a Manuscript for Submission to Biomedical Journals
      1. a. General Principles
         b. Reporting Guidelines for Specific Study Designs
      2. Title page
      3. Conflict-of-interest Notification Page
      4. Abstract and

203 citations

01 Jan 2005
TL;DR: In this article, the authors survey the empirical literature on the use of transaction cost reasoning in the make-or-buy decision, in contexts such as outsourced manufacturing and data warehousing.
Abstract: The “transaction cost” theory of the firm introduced by Coase (1937) has become a standard framework for the study of institutional arrangements. The Coasian framework helps explain not only the existence of the firm, but also its size and scope. Why, in Coase’s (1937, pp. 393‐94) words, “does the entrepreneur not organize one less transaction or one more?” Some firms are highly integrated: IBM, for example, produces many of its components and software and maintains its own sales force for mainframe computers. Others are much more specialized: Dell Computer outsources virtually all its hardware and software components, selling directly to end users through its catalog and website, while the shoe company Reebok owns no manufacturing plants, relying on outside suppliers to make its products. U.S. manufacturing and service companies are increasingly contracting with specialized information technology firms for their computing and data warehousing needs, spending $7.2 billion on outsourced computer operations in 1990. Standard and Poor’s estimates total worldwide outsourcing for 2003 at $170 billion. Why do some firms choose a vertically integrated structure, while others specialize in one stage of production and outsource the remaining stages to other firms? In other words, should a firm make its own inputs, should it buy them on the spot market, or should it maintain an ongoing relationship with a particular supplier? Traditionally, economists viewed vertical integration or vertical control as an attempt to earn monopoly rents by gaining control of input markets or distribution channels. The transaction cost approach, by contrast, emphasizes that vertical coordination can be an efficient means of protecting relationship-specific investments or mitigating other potential conflicts under incomplete contracting. As transaction cost economics was developed in the 1970s and 1980s, a stream of empirical literature emerged explaining the “make-or-buy decision” using transaction cost reasoning. (The traditional approach has generated relatively few empirical applications beyond analyses of particular antitrust cases.) This chapter surveys the empirical literature

202 citations