SciSpace (formerly Typeset)
Author

Art R. T. Jonkers

Other affiliations: University of Münster
Bio: Art R. T. Jonkers is an academic researcher from the University of Liverpool. The author has contributed to research on topics including secular variation and global warming. The author has an h-index of 6 and has co-authored 6 publications receiving 76 citations. Previous affiliations of Art R. T. Jonkers include the University of Münster.

Papers
Journal Article
10 Nov 2016 - PLOS ONE
TL;DR: A model of daily mean river temperature for the whole of mainland Britain over three recent decades (1982–2011), incorporating geographical extrapolation to Scotland, is presented; results indicate April as the fastest-warming month and show that most rivers spend, on average, ever more days of the year at temperatures exceeding 10°C, a critical threshold for several fish pathogens.
Abstract: River water temperature is a hydrological feature primarily controlled by topographical, meteorological, climatological, and anthropogenic factors. For Britain, the study of freshwater temperatures has focussed mainly on observations made in England and Wales; similar comprehensive data sets for Scotland are currently unavailable. Here we present a model for the whole of mainland Britain over three recent decades (1982–2011) that incorporates geographical extrapolation to Scotland. The model estimates daily mean freshwater temperature for every river segment and for any day in the studied period, based upon physico-geographical features, daily mean air and sea temperatures, and available freshwater temperature measurements. We also extrapolate the model temporally to predict future warming of Britain’s rivers given current observed trends. Our results highlight the spatial and temporal diversity of British freshwater temperatures and warming rates. Over the studied period, Britain’s rivers had a mean temperature of 9.84°C and experienced a mean warming of +0.22°C per decade, with lower rates for segments near lakes and in coastal regions. Model results indicate April as the fastest-warming month (+0.63°C per decade on average), and show that most rivers spend on average ever more days of the year at temperatures exceeding 10°C, a critical threshold for several fish pathogens. Our results also identify exceptional warming in parts of the Scottish Highlands (in April and September) and pervasive cooling episodes, in December throughout Britain and in July in the southwest (in Wales, Cornwall, Devon, and Dorset). This regional heterogeneity in rates of change has ramifications for current and future water quality and aquatic ecosystems, as well as for the spread of waterborne diseases.
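
For readers who want to reproduce the kind of trend statistics reported above, the following is a minimal sketch, not the study's actual model: it assumes a synthetic daily water-temperature series for a single river segment, estimates a decadal warming rate with an ordinary least-squares trend, and counts days per year above the 10°C threshold mentioned in the abstract.

```python
# Minimal sketch (assumed synthetic data, not the study's model): estimate a
# decadal warming rate and count days per year above 10 degC for one segment.
import numpy as np
import pandas as pd

days = pd.date_range("1982-01-01", "2011-12-31", freq="D")
years_elapsed = (days - days[0]).days.to_numpy() / 365.25
doy = days.dayofyear.to_numpy()

rng = np.random.default_rng(0)
# Seasonal cycle around ~9.8 degC, a weak warming trend, and daily noise.
temp = (9.8
        + 6.0 * np.sin(2 * np.pi * (doy - 120) / 365.25)
        + 0.022 * years_elapsed
        + rng.normal(0.0, 1.5, len(days)))
series = pd.Series(temp, index=days)

# Decadal warming rate from an ordinary least-squares linear trend.
slope_per_year = np.polyfit(years_elapsed, series.to_numpy(), 1)[0]
print(f"warming rate: {10 * slope_per_year:+.2f} degC per decade")

# Days per year spent above the 10 degC pathogen-relevant threshold.
days_above = (series > 10.0).groupby(series.index.year).sum()
print(days_above.head())
```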

18 citations

Journal Article
TL;DR: The authors use bootstrapped discrete scale invariance, based upon log-periodic fits of modulated power-law scaling of size-ranked event durations, to quantify the characteristic timescales of the geodynamo.
Abstract: SUMMARY The geodynamo exhibits a bewildering gamut of time-dependent fluctuations, on timescales from years to at least hundreds of millions of years. No framework yet exists that comprises all and relates each to all others in a quantitative sense. The technique of bootstrapped discrete scale invariance quantifies characteristic timescales of a process, based upon log-periodic fits of modulated power-law scaling of size-ranked event durations. Four independent geomagnetic data sets are analysed therewith, each spanning different timescales: the sequence of 332 known dipole reversal intervals (0–161 Ma); dipole intensity fluctuations (0–2 Ma); archeomagnetic secular variation (5000 B.C.–1950 A.D.); and historical secular variation (1590–1990 A.D.). Six major characteristic timescales are empirically attested: circa 1.43 Ma, 56 Ka, and 763, 106, 21 and 3 yr. Moreover, all detected wavelengths and phases of the detected scaling signatures are highly similar, suggesting that a single process underlies all. This hypothesis is reinforced by extrapolating the log-periodic scaling signal of any particular data set to higher timescales than observed, through which predictions are obtained for characteristic scales attested elsewhere. Not only do many confirm one another, they also predict the typical duration of superchrons and geomagnetic jerks. A universal scaling bridge describes the complete range of geodynamo fluctuation timescales with a single power law.
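
As an illustration of the fitting step described above, here is a minimal sketch, not the authors' implementation: it fits a log-periodically modulated power law to synthetic size-ranked event durations with scipy, under assumed parameter values, and recovers the preferred scaling ratio from the fitted log-frequency.

```python
# Minimal sketch (synthetic data, assumed parameters; not the authors' code):
# fit a log-periodically modulated power law to size-ranked event durations.
import numpy as np
from scipy.optimize import curve_fit

def log_periodic_power_law(rank, A, alpha, B, omega, phi):
    # Pure power law in rank, modulated by a log-periodic oscillation,
    # the classic signature of discrete scale invariance.
    return A * rank ** (-alpha) * (1.0 + B * np.cos(omega * np.log(rank) + phi))

# Synthetic stand-in for 332 size-ranked polarity-interval durations (Myr).
rng = np.random.default_rng(0)
ranks = np.arange(1.0, 333.0)
clean = log_periodic_power_law(ranks, 5.0, 0.8, 0.15, 6.0, 1.0)
durations = clean * rng.lognormal(sigma=0.1, size=ranks.size)

popt, _ = curve_fit(log_periodic_power_law, ranks, durations,
                    p0=[5.0, 0.8, 0.1, 6.0, 0.0], maxfev=20000)
A, alpha, B, omega, phi = popt
lam = np.exp(2.0 * np.pi / omega)   # preferred scaling ratio between timescales
print(f"exponent alpha = {alpha:.2f}, scaling ratio lambda = {lam:.2f}")
```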

16 citations

Journal Article
TL;DR: The susceptibility of the English and Welsh fish farming and fisheries industry to emergent diseases is assessed using a stochastic simulation model that considers reactive, proactive, and hybrid methods of control, corresponding to a mixture of policy and the ease of disease detection.
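
As a rough illustration of how such a stochastic simulation can compare control policies, the sketch below, which is not the paper's model, runs a simple outbreak process on a random farm contact network and contrasts a proactive strategy (pre-emptive removal of a fraction of sites) with a reactive one (culling after a detection delay); the network and all parameter values are assumptions.

```python
# Minimal sketch (assumed network and parameters; not the paper's model):
# a stochastic outbreak on a random farm contact network under no control,
# reactive culling after a detection delay, or proactive pre-emptive removal.
import random
import networkx as nx

def outbreak_size(G, strategy="none", p_infect=0.1, infectious_days=5,
                  detect_delay=2, frac_proactive=0.05):
    G = G.copy()
    if strategy == "proactive":
        # Remove a random fraction of sites before the outbreak starts.
        n_remove = int(frac_proactive * G.number_of_nodes())
        G.remove_nodes_from(random.sample(list(G.nodes), n_remove))
    seed = random.choice(list(G.nodes))
    infected = {seed: 0}          # site -> day of infection
    active, day = {seed}, 0
    while active:
        day += 1
        # Reactive control detects and culls sites sooner than the natural
        # end of their infectious period.
        limit = detect_delay if strategy == "reactive" else infectious_days
        active = {s for s in active if day - infected[s] <= limit}
        newly = set()
        for site in active:
            for nbr in G.neighbors(site):
                if nbr not in infected and random.random() < p_infect:
                    newly.add(nbr)
                    infected[nbr] = day
        active |= newly
    return len(infected)

random.seed(1)
farms = nx.erdos_renyi_graph(500, 0.01)   # generic stand-in contact network
for strategy in ("none", "reactive", "proactive"):
    sizes = [outbreak_size(farms, strategy) for _ in range(200)]
    print(f"{strategy:>9}: mean outbreak size {sum(sizes) / len(sizes):.1f}")
```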

15 citations

Journal Article
TL;DR: The tension between empirical data and formal theory pervades the entire history of geomagnetism, from the Middle Ages up to the present day. A range of pertinent case studies supports classification of the early-modern period (1500-1800) as proto-scientific, characterised by the initial formation of theories being largely disconnected from observational constraints and their subsequent evolution being advanced primarily by empirical falsification.
Abstract: The tension between empirical data and formal theory pervades the entire history of geomagnetism, from the Middle Ages up to the present day. This paper explores its early-modern history (1500-1800), using a hybrid approach: it applies a methodological framework used in modern geophysics to interpret early-modern developments, exploring to what extent formal conjectures shaped observation and vice versa. A range of pertinent case studies supports classification of this entire period as proto-scientific, characterised by the initial formation of theories being largely disconnected from observational constraints, and their subsequent evolution being advanced primarily by their empirical falsification, and not necessarily associated with the introduction of an alternative. The few exceptional instances of purely data-driven discovery were essentially due to an improved signal-to-noise ratio.

15 citations

Journal Article
TL;DR: Realistic H5N1 avian influenza transmission probabilities and containment strategies, here modelled on the British poultry industry network, show that infection dynamics can express characteristic scales in addition to scale-free behaviour, and that hotspot sites can make more effective inoculation targets than highly connected hubs.
Abstract: Epidemics are frequently simulated on redundantly wired contact networks, which have many more links between sites than are minimally required to connect all. Consequently, the modelled pathogen can travel numerous alternative routes, complicating effective containment strategies. These networks have moreover been found to exhibit ‘scale-free’ properties and percolation, suggesting resilience to damage. However, realistic H5N1 avian influenza transmission probabilities and containment strategies, here modelled on the British poultry industry network, show that infection dynamics can additionally express characteristic scales. These system-preferred scales constitute small areas within an observed power law distribution that exhibit a lesser slope than the power law itself, indicating a slightly increased relative likelihood. These characteristic scales are here produced by a network-pervading intranet of so-called hotspot sites that propagate large epidemics below the percolation threshold. This intranet is, however, extremely vulnerable; targeted inoculation of a mere 3–6% (depending on incorporated biosecurity measures) of the British poultry industry network prevents large and moderate H5N1 outbreaks completely, offering an order of magnitude improvement over previously advocated strategies affecting the most highly connected ‘hub’ sites. In other words, hotspots and hubs are separate functional entities that do not necessarily coincide, and hotspots can make more effective inoculation targets. Given the ubiquity and relevance of networks (epidemics, Internet, power grids, protein interaction), recognition of this spreading regime elsewhere would suggest a similar disproportionate sensitivity to such surgical interventions.
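
The hub-versus-targeted-inoculation comparison can be illustrated with a minimal sketch, which is not the paper's model and uses a generic scale-free stand-in rather than the British poultry network: it measures mean outbreak size after inoculating the most highly connected sites versus an equally sized random set, with an assumed transmission probability and inoculation budget.

```python
# Minimal sketch (generic scale-free network, assumed parameters; not the
# paper's model): mean outbreak size after inoculating the most highly
# connected 'hub' sites versus an equally sized random set of sites.
import random
import networkx as nx

def mean_outbreak_size(G, inoculated, p_infect=0.2, trials=200):
    H = G.copy()
    H.remove_nodes_from(inoculated)      # inoculated sites cannot transmit
    nodes = list(H.nodes)
    sizes = []
    for _ in range(trials):
        seed = random.choice(nodes)
        infected, frontier = {seed}, {seed}
        while frontier:
            new = set()
            for site in frontier:
                for nbr in H.neighbors(site):
                    if nbr not in infected and random.random() < p_infect:
                        new.add(nbr)
            infected |= new
            frontier = new
        sizes.append(len(infected))
    return sum(sizes) / len(sizes)

random.seed(42)
G = nx.barabasi_albert_graph(2000, 2)            # scale-free stand-in network
budget = int(0.05 * G.number_of_nodes())         # inoculate 5% of sites
hubs = sorted(G.nodes, key=G.degree, reverse=True)[:budget]
randoms = random.sample(list(G.nodes), budget)
print("hub-targeted inoculation:", mean_outbreak_size(G, hubs))
print("random inoculation      :", mean_outbreak_size(G, randoms))
```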

9 citations


Cited by
01 Jan 2011
TL;DR: This paper used downscaled outputs from general circulation models coupled with a hydrologic model to forecast the effects of altered flows and increased temperatures on four interacting species of trout across the interior western United States (1.01 million km2), based on empirical statistical models built from fish surveys at 9,890 sites.
Abstract: Broad-scale studies of climate change effects on freshwater species have focused mainly on temperature, ignoring critical drivers such as flow regime and biotic interactions. We use downscaled outputs from general circulation models coupled with a hydrologic model to forecast the effects of altered flows and increased temperatures on four interacting species of trout across the interior western United States (1.01 million km2), based on empirical statistical models built from fish surveys at 9,890 sites. Projections under the 2080s A1B emissions scenario forecast a mean 47% decline in total suitable habitat for all trout, a group of fishes of major socioeconomic and ecological significance. We project that native cutthroat trout Oncorhynchus clarkii, already excluded from much of its potential range by nonnative species, will lose a further 58% of habitat due to an increase in temperatures beyond the species' physiological optima and continued negative biotic interactions. Habitat for nonnative brook trout Salvelinus fontinalis and brown trout Salmo trutta is predicted to decline by 77% and 48%, respectively, driven by increases in temperature and winter flood frequency caused by warmer, rainier winters. Habitat for rainbow trout, Oncorhynchus mykiss, is projected to decline the least (35%) because negative temperature effects are partly offset by flow regime shifts that benefit the species. These results illustrate how drivers other than temperature influence species response to climate change. Despite some uncertainty, large declines in trout habitat are likely, but our findings point to opportunities for strategic targeting of mitigation efforts to appropriate stressors and locations.
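
The flavour of the empirical statistical models described above can be conveyed with a minimal sketch, not the study's models: a logistic occupancy-style regression of trout presence on stream temperature and winter flood frequency, fitted to synthetic survey data and then used to project habitat change under an assumed warmer, floodier scenario.

```python
# Minimal sketch (synthetic survey data, assumed coefficients; not the
# study's models): logistic occupancy model of trout presence driven by
# stream temperature and winter flood frequency, projected under warming.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
temp = rng.normal(12.0, 3.0, n)        # summer stream temperature (deg C)
floods = rng.poisson(1.5, n)           # winter flood count
# Synthetic presence/absence: cooler, flood-poor reaches favour occupancy.
logit = 4.0 - 0.35 * temp - 0.5 * floods
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([temp, floods]), presence)

def suitable_fraction(temp, floods, threshold=0.5):
    # Fraction of surveyed reaches whose predicted occupancy exceeds a cutoff.
    p = model.predict_proba(np.column_stack([temp, floods]))[:, 1]
    return (p > threshold).mean()

baseline = suitable_fraction(temp, floods)
future = suitable_fraction(temp + 2.5, floods + 1)   # warmer, floodier scenario
print(f"projected change in suitable habitat: "
      f"{100.0 * (future - baseline) / baseline:+.0f}%")
```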

438 citations

Journal Article
TL;DR: This book by Nino Boccara presents a compilation of model systems commonly termed 'complex' and starts with a definition of the systems under consideration and how to build up a model to describe their complex dynamics.
Abstract: This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on the dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself, but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade, and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success!

268 citations

Journal Article
TL;DR: The Earth's internal magnetic field varies on timescales of months to billions of years; over very long timescales this variability may be related to changes in core-mantle boundary heat flow associated with mantle convection processes.
Abstract: The Earth's internal magnetic field varies on timescales of months to billions of years. The field is generated by convection in the liquid outer core, which in turn is influenced by the heat flowing from the core into the base of the overlying mantle. Much of the magnetic field's variation is thought to be stochastic, but over very long timescales, this variability may be related to changes in heat flow associated with mantle convection processes. Over the past 500 Myr, correlations between palaeomagnetic behaviour and surface processes were particularly striking during the middle to late Mesozoic era, beginning about 180 Myr ago. Simulations of the geodynamo suggest that transitions from periods of rapid polarity reversals to periods of prolonged stability — such as occurred between the Middle Jurassic and Middle Cretaceous periods — may have been triggered by a decrease in core-mantle boundary heat flow either globally or in equatorial regions. This decrease in heat flow could have been linked to reduced mantle-plume-head production at the core-mantle boundary, an episode of true polar wander, or a combination of the two.

160 citations

Journal Article
TL;DR: GEOMAGIA50.v3 is a comprehensive online database providing access to published paleomagnetic, rock magnetic, and chronological data from a variety of materials that record Earth's magnetic field over the past 50 ka.
Abstract: GEOMAGIA50.v3 is a comprehensive online database providing access to published paleomagnetic, rock magnetic, and chronological data from a variety of materials that record Earth’s magnetic field over the past 50 ka. Since its original release in 2006, the structure and function of the database have been updated and a significant number of data have been added. Notable modifications are the following: (1) the inclusion of additional intensity, directional and metadata from archeological and volcanic materials and an improved documentation of radiocarbon dates; (2) a new data model to accommodate paleomagnetic, rock magnetic, and chronological data from lake and marine sediments; (3) a refinement of the geographic constraints in the archeomagnetic/volcanic query allowing selection of particular locations; (4) more flexible methodological and statistical constraints in the archeomagnetic/volcanic query; (5) the calculation of predictions of the Holocene geomagnetic field from a series of time varying global field models; (6) searchable reference lists; and (7) an updated web interface. This paper describes general modifications to the database and specific aspects of the archeomagnetic and volcanic database. The reader is referred to a companion publication for a description of the sediment database. The archeomagnetic and volcanic part of GEOMAGIA50.v3 currently contains 14,645 data (declination, inclination, and paleointensity) from 461 studies published between 1959 and 2014. We review the paleomagnetic methods used to obtain these data and discuss applications of the data within the database. The database continues to expand as legacy data are added and new studies published. The web-based interface can be found at http://geomagia.gfz-potsdam.de .
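
For readers who export records from such a database, the sketch below shows one way to filter them by location and age; the column names (age, lat, lon, dec, inc, intensity) and the file name are hypothetical placeholders, and the real export format should be checked against the GEOMAGIA50.v3 documentation.

```python
# Minimal sketch for filtering records exported from a database such as
# GEOMAGIA50.v3. Column names (age, lat, lon, dec, inc, intensity) and the
# file name are hypothetical placeholders; check the actual export format.
import pandas as pd

def select_records(csv_path, lat_range, lon_range, max_age_ka=50):
    # Restrict records to a geographic box and an age window (ka before present).
    df = pd.read_csv(csv_path)
    keep = (
        df["lat"].between(*lat_range)
        & df["lon"].between(*lon_range)
        & (df["age"] <= max_age_ka * 1000)   # assuming age is stored in years BP
    )
    return df.loc[keep, ["age", "lat", "lon", "dec", "inc", "intensity"]]

# Example (hypothetical file): Holocene records over the British Isles.
# records = select_records("geomagia_export.csv", (49, 61), (-11, 2), max_age_ka=12)
# print(records.sort_values("age").head())
```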

142 citations

Journal Article
TL;DR: The magnetic field of the Earth is by far the best documented magnetic field of all known planets, thanks to the convergence of many different approaches and to the remarkable fact that surface rocks have quietly recorded much of its history.
Abstract: The magnetic field of the Earth is by far the best documented magnetic field of all known planets. Considerable progress has been made in our understanding of its characteristics and properties, thanks to the convergence of many different approaches and to the remarkable fact that surface rocks have quietly recorded much of its history. The usefulness of magnetic field charts for navigation and the dedication of a few individuals have also led to the patient construction of some of the longest series of quantitative observations in the history of science. More recently even more systematic observations have been made possible from space, leading to the possibility of observing the Earth's magnetic field in much more detail than was previously possible. The progressive increase in computer power was also crucial, leading to advanced ways of handling and analyzing this considerable corpus of data.

138 citations