Author

James Taylor

Bio: James Taylor is an academic researcher from Newcastle University. The author has contributed to research in topics: Laser & Fiber laser. The author has an h-index of 95 and has co-authored 1161 publications receiving 39945 citations. Previous affiliations of James Taylor include Institut national de la recherche agronomique & European Spallation Source.


Papers
Posted Content
TL;DR: In this article, an exponential smoothing formulation for seasonal intraday time series is introduced that smoothes an intraday cycle and an intraweek cycle while allowing parts of different days of the week to be treated as identical.
Abstract: This paper concerns the forecasting of seasonal intraday time series. An extension of Holt-Winters exponential smoothing has been proposed that smoothes an intraday cycle and an intraweek cycle. A recently proposed exponential smoothing method involves smoothing a different intraday cycle for each distinct type of day of the week. Similar days are allocated identical intraday cycles. A limitation is that the method allows only whole days to be treated as identical. We introduce an exponential smoothing formulation that allows parts of different days of the week to be treated as identical. The result is a method that involves the smoothing and initialisation of fewer terms than the other two exponential smoothing methods. We evaluate forecasting up to a day ahead using two empirical studies. For electricity load data, the new method compares well with a range of alternatives. The second study involves a series of arrivals at a call centre that is open for a shorter duration at the weekends than on weekdays. By contrast with the previously proposed exponential smoothing methods, our new method can model in a straightforward way this situation, where the number of periods on each day of the week is not the same.
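As a rough illustration of the kind of recursion involved, the sketch below implements a plain additive exponential smoothing scheme with one intraday and one intraweek seasonal cycle in Python. The smoothing parameters, the crude initialisation and the function name are illustrative assumptions; in particular, the sketch does not implement the paper's new feature of treating parts of different days of the week as identical.

```python
import numpy as np

def double_seasonal_es(y, s1, s2, alpha=0.1, delta=0.2, omega=0.2, horizon=48):
    """Additive exponential smoothing with an intraday cycle of period s1 and
    an intraweek cycle of period s2 (e.g. s1=48, s2=336 for half-hourly data).
    Returns forecasts for 1..horizon steps ahead (horizon <= s1 assumed)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    level = y[:s1].mean()        # crude initialisation: mean of the first day
    intraday = np.zeros(n)       # intraday seasonal terms d_t
    intraweek = np.zeros(n)      # intraweek seasonal terms w_t
    for t in range(n):
        d_prev = intraday[t - s1] if t >= s1 else 0.0
        w_prev = intraweek[t - s2] if t >= s2 else 0.0
        # update the level and both seasonal cycles from the latest observation
        new_level = alpha * (y[t] - d_prev - w_prev) + (1 - alpha) * level
        intraday[t] = delta * (y[t] - new_level - w_prev) + (1 - delta) * d_prev
        intraweek[t] = omega * (y[t] - new_level - d_prev) + (1 - omega) * w_prev
        level = new_level
    # h-step-ahead forecasts reuse the most recently updated seasonal terms
    return np.array([level + intraday[n + h - 1 - s1] + intraweek[n + h - 1 - s2]
                     for h in range(1, horizon + 1)])
```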

67 citations

Journal ArticleDOI
15 Oct 1992-Nature
TL;DR: Gamma radiation above 100 MeV in energy has been detected from the radio pulsar PSR1706-44; the gamma emission forms a single broad peak within the pulsar period of 102 ms, in contrast to the two narrow peaks seen in the other three known high-energy gamma-ray pulsars.
Abstract: Gamma radiation above 100 MeV in energy has been detected from the radio pulsar PSR1706-44. The gamma emission forms a single broad peak within the pulsar period of 102 ms, in contrast to the two narrow peaks seen in the other three known high-energy gamma-ray pulsars. The emission mechanism in all cases is probably the same, the differences arising from the geometry of the magnetic and rotation axes and the line of sight. Gamma-ray emission accounts for as much as 1 percent of the total neutron star spindown energy in these pulsars, much more than emerges at optical or radio frequencies. Thus, study of this emission is important in understanding pulsar emission and evolution.
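For context on the spindown-energy figure: the rotational energy loss rate of a pulsar is conventionally estimated from its period P and period derivative Ṗ, and the quoted 1 percent is the gamma-ray efficiency relative to that rate. The relations below are the standard textbook expressions, not formulas taken from this paper.

```latex
\dot{E} = \frac{4\pi^{2} I \dot{P}}{P^{3}},
\qquad
\eta_{\gamma} = \frac{L_{\gamma}}{\dot{E}} \lesssim 0.01
```

Here I is the neutron-star moment of inertia (canonically about 10^45 g cm^2) and L_gamma is the gamma-ray luminosity inferred from the observed flux together with an assumed distance and beaming geometry.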

67 citations

Journal ArticleDOI
TL;DR: Pumping of highly-nonlinear microstructured fibers with zero-dispersion around the pump wavelength of a 50 kW peak-power picosecond Yb-fiber laser allowed extensive polychromatic picosecond operation down to 525 nm in all-fibre format.
Abstract: Pumping of highly-nonlinear microstructured fibers with zero-dispersion around the pump wavelength of a 50 kW peak-power picosecond Yb-fiber laser allowed extensive polychromatic picosecond operation down to 525 nm in all-fibre format. Spectral power densities over 1 mW/nm and the potential for further pulse compression to femtoseconds are demonstrated.

66 citations

Journal ArticleDOI
TL;DR: A cw room-temperature Cr4+:YAG laser, tuning from 1.37 to 1.51 μm, is described, and mode locking of this novel laser is reported for what is to the authors' knowledge the first time.
Abstract: A cw room-temperature Cr4+:YAG laser, tuning from 1.37 to 1.51 μm, is described. Mode locking of this novel laser is reported for what is to our knowledge the first time.

66 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present an alternative hybrid approach which applies quantile regression to the empirical fit errors to produce forecast error quantile models, which are functions of the lead time, as suggested by the theoretical variance expressions.
Abstract: Exponential smoothing methods do not involve a formal procedure for identifying the underlying data generating process. The issue is then whether prediction intervals should be estimated by a theoretical approach, with the assumption that the method is optimal in some sense, or by an empirical procedure. In this paper we present an alternative hybrid approach which applies quantile regression to the empirical fit errors to produce forecast error quantile models. These models are functions of the lead time, as suggested by the theoretical variance expressions. In addition to avoiding the optimality assumption, the method is nonparametric, so there is no need for the common normality assumption. Application of the new approach to simple, Holt's, and damped Holt's exponential smoothing, using simulated and real data sets, gave encouraging results.
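A minimal sketch of the hybrid idea in Python, assuming the statsmodels QuantReg estimator: in-sample fit errors are pooled with their lead times, and a chosen quantile of the error is regressed on lead time. The regressors (a constant plus the lead time) and the function name are illustrative assumptions rather than the paper's exact specification.

```python
import numpy as np
from statsmodels.regression.quantile_regression import QuantReg

def error_quantile_model(errors, lead_times, q=0.95):
    """Regress the q-th quantile of empirical forecast errors on lead time."""
    X = np.column_stack([np.ones(len(lead_times)),
                         np.asarray(lead_times, dtype=float)])
    return QuantReg(np.asarray(errors, dtype=float), X).fit(q=q)

# Usage sketch: upper bounds of a prediction interval for lead times 1..14,
# given arrays `errors` and `lead_times` collected from the fitted smoother.
# fit = error_quantile_model(errors, lead_times, q=0.95)
# h = np.arange(1, 15)
# upper = fit.predict(np.column_stack([np.ones_like(h, dtype=float), h]))
```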

65 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast when first encountered, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i, the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

01 Jan 2016

14,604 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations

Book
01 Jan 1994
TL;DR: In this book, the authors present a brief history of LMIs in control theory and discuss a number of standard problems involving LMIs, along with linear differential inclusions and matrix problems with analytic solutions.
Abstract: Preface 1. Introduction Overview A Brief History of LMIs in Control Theory Notes on the Style of the Book Origin of the Book 2. Some Standard Problems Involving LMIs. Linear Matrix Inequalities Some Standard Problems Ellipsoid Algorithm Interior-Point Methods Strict and Nonstrict LMIs Miscellaneous Results on Matrix Inequalities Some LMI Problems with Analytic Solutions 3. Some Matrix Problems. Minimizing Condition Number by Scaling Minimizing Condition Number of a Positive-Definite Matrix Minimizing Norm by Scaling Rescaling a Matrix Positive-Definite Matrix Completion Problems Quadratic Approximation of a Polytopic Norm Ellipsoidal Approximation 4. Linear Differential Inclusions. Differential Inclusions Some Specific LDIs Nonlinear System Analysis via LDIs 5. Analysis of LDIs: State Properties. Quadratic Stability Invariant Ellipsoids 6. Analysis of LDIs: Input/Output Properties. Input-to-State Properties State-to-Output Properties Input-to-Output Properties 7. State-Feedback Synthesis for LDIs. Static State-Feedback Controllers State Properties Input-to-State Properties State-to-Output Properties Input-to-Output Properties Observer-Based Controllers for Nonlinear Systems 8. Lure and Multiplier Methods. Analysis of Lure Systems Integral Quadratic Constraints Multipliers for Systems with Unknown Parameters 9. Systems with Multiplicative Noise. Analysis of Systems with Multiplicative Noise State-Feedback Synthesis 10. Miscellaneous Problems. Optimization over an Affine Family of Linear Systems Analysis of Systems with LTI Perturbations Positive Orthant Stabilizability Linear Systems with Delays Interpolation Problems The Inverse Problem of Optimal Control System Realization Problems Multi-Criterion LQG Nonconvex Multi-Criterion Quadratic Problems Notation List of Acronyms Bibliography Index.

11,085 citations