
Showing papers by "Open University" published in 2016


Journal ArticleDOI
TL;DR: Gaia as discussed by the authors is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach.
Abstract: Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by European industry. The involvement of the scientific community focusses on data processing for which the international Gaia Data Processing and Analysis Consortium (DPAC) was selected in 2007. Gaia was launched on 19 December 2013 and arrived at its operating point, the second Lagrange point of the Sun-Earth-Moon system, a few weeks later. The commissioning of the spacecraft and payload was completed on 19 July 2014. The nominal five-year mission started with four weeks of special, ecliptic-pole scanning and subsequently transferred into full-sky scanning mode. We recall the scientific goals of Gaia and give a description of the as-built spacecraft that is currently (mid-2016) being operated to achieve these goals. We pay special attention to the payload module, the performance of which is closely related to the scientific performance of the mission. We provide a summary of the commissioning activities and findings, followed by a description of the routine operational mode. We summarise scientific performance estimates on the basis of in-orbit operations. Several intermediate Gaia data releases are planned and the data can be retrieved from the Gaia Archive, which is available through the Gaia home page.

5,164 citations


Journal ArticleDOI
TL;DR: The first Gaia data release, Gaia DR1 as discussed by the authors, consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues.
Abstract: Context: At about 1000 days after the launch of Gaia we present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7. Aims: A summary of Gaia DR1 is presented along with illustrations of the scientific quality of the data, followed by a discussion of the limitations due to the preliminary nature of this release. Methods: The raw data collected by Gaia during the first 14 months of the mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC) and turned into an astrometric and photometric catalogue. Results: Gaia DR1 consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues - a realisation of the Tycho-Gaia Astrometric Solution (TGAS) - and a secondary astrometric data set containing the positions for an additional 1.1 billion sources. The second component is the photometric data set, consisting of mean G-band magnitudes for all sources. The G-band light curves and the characteristics of 3000 Cepheid and RR Lyrae stars, observed at high cadence around the south ecliptic pole, form the third component. For the primary astrometric data set the typical uncertainty is about 0.3 mas for the positions and parallaxes, and about 1 mas yr^-1 for the proper motions. A systematic component of 0.3 mas should be added to the parallax uncertainties. For the subset of 94 000 Hipparcos stars in the primary data set, the proper motions are much more precise, at about 0.06 mas yr^-1. For the secondary astrometric data set, the typical uncertainty of the positions is 10 mas. The median uncertainties on the mean G-band magnitudes range from the mmag level to 0.03 mag over the magnitude range 5 to 20.7. Conclusions: Gaia DR1 is an important milestone ahead of the next Gaia data release, which will feature five-parameter astrometry for all sources. Extensive validation shows that Gaia DR1 represents a major advance in the mapping of the heavens and the availability of basic stellar data that underpin observational astrophysics. Nevertheless, the very preliminary nature of this first Gaia data release does lead to a number of important limitations to the data quality which should be carefully considered before drawing conclusions from the data.
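
A note on combining the quoted uncertainties (a minimal worked example; adding the systematic term in quadrature is the usual convention, though the release documentation should be consulted for the recommended treatment):

    \sigma_{\varpi,\mathrm{tot}} = \sqrt{\sigma_\varpi^2 + (0.3\ \mathrm{mas})^2}

so a TGAS star with the typical catalogue uncertainty of 0.3 mas carries a total parallax uncertainty of about \sqrt{0.3^2 + 0.3^2} \approx 0.42 mas.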

2,174 citations


Journal ArticleDOI
25 Aug 2016-Nature
TL;DR: Observations reveal the presence of a small planet with a minimum mass of about 1.3 Earth masses orbiting Proxima with a period of approximately 11.2 days at a semi-major-axis distance of around 0.05 astronomical units.
Abstract: At a distance of 1.295 parsecs, the red dwarf Proxima Centauri (α Centauri C, GL 551, HIP 70890 or simply Proxima) is the Sun's closest stellar neighbour and one of the best-studied low-mass stars. It has an effective temperature of only around 3,050 kelvin, a luminosity of 0.15 per cent of that of the Sun, a measured radius of 14 per cent of the radius of the Sun and a mass of about 12 per cent of the mass of the Sun. Although Proxima is considered a moderately active star, its rotation period is about 83 days (ref. 3) and its quiescent activity levels and X-ray luminosity are comparable to those of the Sun. Here we report observations that reveal the presence of a small planet with a minimum mass of about 1.3 Earth masses orbiting Proxima with a period of approximately 11.2 days at a semi-major-axis distance of around 0.05 astronomical units. Its equilibrium temperature is within the range where water could be liquid on its surface.
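
As a quick consistency check (a back-of-the-envelope application of Kepler's third law using the quoted stellar mass of about 0.12 solar masses; not a calculation from the paper), with M in solar masses, P in years and a in astronomical units:

    a = (M P^2)^{1/3} = (0.12 \times (11.2/365.25)^2)^{1/3} \approx 0.048\ \mathrm{AU}

in agreement with the reported semi-major-axis distance of around 0.05 astronomical units.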

1,052 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an analysis of the online sharing economy discourse, identifying that the sharing economy is framed as: an economic opportunity; a more sustainable form of consumption; a pathway to a decentralised, equitable and sustainable economy; creating unregulated marketplaces; reinforcing the neoliberal paradigm; and an incoherent field of innovation.

973 citations


Journal ArticleDOI
TL;DR: A publication of the 3D-HST Treasury Program (GO 12177, 12328), carried out with the NASA/ESA Hubble Space Telescope.
Funding: NASA [NAS5-26555]; NASA through Hubble Fellowship - Space Telescope Science Institute [HST-HF-51318.001, HST-HF2-51368]; 3D-HST Treasury Program [GO 12177, 12328]; NASA/ESA HST [GO 11600, GO 13420]

614 citations


Journal ArticleDOI
TL;DR: A review of established in vitro blood-brain barrier models with a focus on their validation regarding a set of well-established blood-brain barrier characteristics is given in this paper, with an overview of the advantages and drawbacks of the different models described.
Abstract: The endothelial cells lining the brain capillaries separate the blood from the brain parenchyma. The endothelial monolayer of the brain capillaries serves both as a crucial interface for exchange of nutrients, gases, and metabolites between blood and brain, and as a barrier for neurotoxic components of plasma and xenobiotics. This "blood-brain barrier" function is a major hindrance for drug uptake into the brain parenchyma. Cell culture models, based on either primary cells or immortalized brain endothelial cell lines, have been developed, in order to facilitate in vitro studies of drug transport to the brain and studies of endothelial cell biology and pathophysiology. In this review, we aim to give an overview of established in vitro blood-brain barrier models with a focus on their validation regarding a set of well-established blood-brain barrier characteristics. As an ideal cell culture model of the blood-brain barrier is yet to be developed, we also aim to give an overview of the advantages and drawbacks of the different models described.

540 citations


Journal ArticleDOI
TL;DR: This paper considers ways in which (mental) health professionals may assist the people with genderqueer and non-binary gender identities and/or expressions they may see in their practice and considers treatment options and associated risks.
Abstract: Some people have a gender which is neither male nor female and may identify as both male and female at one time, as different genders at different times, as no gender at all, or dispute the very idea of only two genders. The umbrella terms for such genders are 'genderqueer' or 'non-binary' genders. Such gender identities outside of the binary of female and male are increasingly being recognized in legal, medical and psychological systems and diagnostic classifications in line with the emerging presence and advocacy of these groups of people. Population-based studies show a small percentage--but a sizable proportion in terms of raw numbers--of people who identify as non-binary. While such genders have been extant historically and globally, they remain marginalized, and as such--while not being disorders or pathological in themselves--people with such genders remain at risk of victimization and of minority or marginalization stress as a result of discrimination. This paper therefore reviews the limited literature on this field and considers ways in which (mental) health professionals may assist the people with genderqueer and non-binary gender identities and/or expressions they may see in their practice. Treatment options and associated risks are discussed.

453 citations


Journal ArticleDOI
TL;DR: Learners' motivations and goals were found to shape how they conceptualised the purpose of the MOOC, which in turn affected their perception of the learning process.
Abstract: Massive open online courses (MOOCs) require individual learners to be able to self-regulate their learning, determining when and how they engage. However, MOOCs attract a diverse range of learners, each with different motivations and prior experience. This study investigates the self-regulated learning (SRL) learners apply in a MOOC, in particular focusing on how learners' motivations for taking a MOOC influence their behaviour and employment of SRL strategies. Following a quantitative investigation of the learning behaviours of 788 MOOC participants, follow-up interviews were conducted with 32 learners. The study compares the narrative descriptions of behaviour between learners with self-reported high and low SRL scores. Substantial differences were detected between the self-described learning behaviours of these two groups in five of the sub-processes examined. Learners' motivations and goals were found to shape how they conceptualised the purpose of the MOOC, which in turn affected their perception of the learning process.

428 citations


Book ChapterDOI
13 Sep 2016
TL;DR: This work proposes a permanent distributed record of intellectual effort and associated reputational reward, based on the blockchain that instantiates and democratises educational reputation beyond the academic community.
Abstract: The ‘blockchain’ is the core mechanism for the Bitcoin digital payment system. It embraces a set of inter-related technologies: the blockchain itself as a distributed record of digital events, the distributed consensus method to agree whether a new block is legitimate, automated smart contracts, and the data structure associated with each block. We propose a permanent distributed record of intellectual effort and associated reputational reward, based on the blockchain, that instantiates and democratises educational reputation beyond the academic community. We are undertaking initial trials of a private blockchain for storing educational records, drawing also on our previous research into reputation management for educational systems.
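
A minimal sketch of the hash-chained record structure the abstract describes, assuming hypothetical field names (an illustration of the general blockchain idea, not the authors' implementation):

    import hashlib, json, time

    def make_block(record, prev_hash):
        """Bundle an educational record with a hash link to the previous block."""
        block = {
            "timestamp": time.time(),   # when the achievement was recorded
            "record": record,           # e.g. {"learner": ..., "award": ...}
            "prev_hash": prev_hash,     # hash of the preceding block
        }
        # The block's own hash covers all of its contents, so any later
        # tampering with the record invalidates every subsequent block.
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        return block

    genesis = make_block({"learner": "A. Student", "award": "Badge: CS101"}, "0" * 64)
    block2 = make_block({"learner": "A. Student", "award": "Module pass"}, genesis["hash"])

Verifying a learner's record then amounts to recomputing each block's hash and checking it against the prev_hash stored in its successor.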

424 citations


Journal ArticleDOI
TL;DR: Different from typical lexicon-based approaches, SentiCircles takes into account the co-occurrence patterns of words in different contexts in tweets to capture their semantics and update their pre-assigned strength and polarity in sentiment lexicons accordingly.
Abstract: We propose a semantic sentiment representation of words called SentiCircle. SentiCircle captures the contextual semantics of words from their co-occurrences. SentiCircle updates the sentiment of words based on their contextual semantics. SentiCircle can be used to perform entity- and tweet-level sentiment analysis. Sentiment analysis on Twitter has attracted much attention recently due to its wide applications in both commercial and public sectors. In this paper we present SentiCircles, a lexicon-based approach for sentiment analysis on Twitter. Different from typical lexicon-based approaches, which offer fixed and static prior sentiment polarities of words regardless of their context, SentiCircles takes into account the co-occurrence patterns of words in different contexts in tweets to capture their semantics and update their pre-assigned strength and polarity in sentiment lexicons accordingly. Our approach allows for the detection of sentiment at both entity-level and tweet-level. We evaluate our proposed approach on three Twitter datasets using three different sentiment lexicons to derive word prior sentiments. Results show that our approach significantly outperforms the baselines in accuracy and F-measure for entity-level subjectivity (neutral vs. polar) and polarity (positive vs. negative) detections. For tweet-level sentiment detection, our approach performs better than the state-of-the-art SentiStrength by 4-5% in accuracy in two datasets, but falls marginally behind by 1% in F-measure in the third dataset.
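
A loose sketch of the underlying idea, i.e. letting co-occurrence context shift a word's prior lexicon polarity (illustrative lexicon, corpus, and weighting only; the actual SentiCircle representation uses a two-dimensional polar geometry not reproduced here):

    from collections import defaultdict

    # Illustrative prior lexicon: word -> polarity strength in [-1, 1].
    prior = {"good": 0.7, "bad": -0.7, "luck": 0.4}

    tweets = [["good", "luck"], ["bad", "luck"], ["bad", "luck"]]

    # Count co-occurrences of each word with its context terms.
    cooc = defaultdict(lambda: defaultdict(int))
    for tokens in tweets:
        for w in tokens:
            for c in tokens:
                if c != w:
                    cooc[w][c] += 1

    def contextual_sentiment(word):
        """Blend a word's prior polarity with the polarity of its context
        terms, weighted by co-occurrence frequency."""
        ctx = cooc[word]
        total = sum(ctx.values())
        if total == 0:
            return prior.get(word, 0.0)
        ctx_score = sum(prior.get(c, 0.0) * n for c, n in ctx.items()) / total
        return 0.5 * prior.get(word, 0.0) + 0.5 * ctx_score

    # "luck" starts mildly positive but is pulled down by its frequent
    # co-occurrence with "bad" in this toy corpus.
    print(contextual_sentiment("luck"))  # ~0.08, down from the prior 0.4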

375 citations


Journal ArticleDOI
TL;DR: The Baryon Oscillation Spectroscopic Survey (BOSS) as discussed by the authors provides the largest survey of galaxy redshifts available to date, in terms of both the number of galaxies measured by a single survey, and the effective cosmological volume covered.
Abstract: The Baryon Oscillation Spectroscopic Survey (BOSS), part of the Sloan Digital Sky Survey (SDSS) III project, has provided the largest survey of galaxy redshifts available to date, in terms of both the number of galaxy redshifts measured by a single survey, and the effective cosmological volume covered. Key to analysing the clustering of these data to provide cosmological measurements is understanding the detailed properties of this sample. Potential issues include variations in the target catalogue caused by changes either in the targeting algorithm or properties of the data used, the pattern of spectroscopic observations, the spatial distribution of targets for which redshifts were not obtained, and variations in the target sky density due to observational systematics. We document here the target selection algorithms used to create the galaxy samples that comprise BOSS. We also present the algorithms used to create large-scale structure catalogues for the final Data Release (DR12) samples and the associated random catalogues that quantify the survey mask. The algorithms are an evolution of those used by the BOSS team to construct catalogues from earlier data, and have been designed to accurately quantify the galaxy sample. The code used, designated MKSAMPLE, is released with this paper.
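
To see why the random catalogues matter, note that clustering is commonly quantified with the Landy-Szalay estimator (standard practice in the field, not a formula specific to this paper), in which the randoms encode the survey mask:

    \xi(s) = \frac{DD(s) - 2\,DR(s) + RR(s)}{RR(s)}

where DD, DR and RR are the normalised pair counts between data and random points at separation s; any unmodelled variation in the target density enters DD directly and biases the inferred clustering.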

Journal ArticleDOI
TL;DR: In this paper, the authors describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations.
Abstract: Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) is an optical fiber-bundle integral-field unit (IFU) spectroscopic survey that is one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV). With a spectral coverage of 3622–10354 A and an average footprint of ~500 arcsec^2 per IFU, the scientific data products derived from MaNGA will permit exploration of the internal structure of a statistically large sample of 10,000 low-redshift galaxies in unprecedented detail. Comprising 174 individually pluggable science and calibration IFUs with a near-constant data stream, MaNGA is expected to obtain ~100 million raw-frame spectra and ~10 million reduced galaxy spectra over the six-year lifetime of the survey. In this contribution, we describe the MaNGA Data Reduction Pipeline algorithms and centralized metadata framework that produce sky-subtracted, spectrophotometrically calibrated spectra and rectified three-dimensional data cubes that combine individual dithered observations. For the 1390 galaxy data cubes released in Summer 2016 as part of SDSS-IV Data Release 13, we demonstrate that the MaNGA data have nearly Poisson-limited sky subtraction shortward of ~8500 A and reach a typical 10σ limiting continuum surface brightness μ = 23.5 AB arcsec^-2 in a five-arcsecond-diameter aperture in the g-band. The wavelength calibration of the MaNGA data is accurate to 5 km s^-1 rms, with a median spatial resolution of 2.54 arcsec FWHM (1.8 kpc at the median redshift of 0.037) and a median spectral resolution of σ = 72 km s^-1.

Journal ArticleDOI
TL;DR: The mechanisms of action of radiation therapy with photons and ions in the presence and absence of nanoparticles, as well as the influence of some of the core and coating design parameters of nanoparticles on their radiosensitisation capabilities, are summarised.
Abstract: Radiotherapy is currently used in around 50% of cancer treatments and relies on the deposition of energy directly into tumour tissue. Although it is generally effective, some of the deposited energy can adversely affect healthy tissue outside the tumour volume, especially in the case of photon radiation (gamma and X-rays). Improved radiotherapy outcomes can be achieved by employing ion beams due to the characteristic energy deposition curve which culminates in a localised, high radiation dose (in the form of a Bragg peak). In addition to ion radiotherapy, novel sensitisers, such as nanoparticles, have been shown to locally increase the damaging effect of both photon and ion radiation, when both are applied to the tumour area. Amongst the available nanoparticle systems, gold nanoparticles have become particularly popular due to several advantages: biocompatibility, well-established methods for synthesis in a wide range of sizes, and the possibility of coating their surface with a large number of different molecules to provide partial control of, for example, surface charge or interaction with serum proteins. This gives a full range of options for design parameter combinations, in which the optimal choice is not always clear, partially due to a lack of understanding of many processes that take place upon irradiation of such complicated systems. In this review, we summarise the mechanisms of action of radiation therapy with photons and ions in the presence and absence of nanoparticles, as well as the influence of some of the core and coating design parameters of nanoparticles on their radiosensitisation capabilities.

Journal ArticleDOI
01 Mar 2016-Geology
TL;DR: In this article, a compilation of whole-rock geochemical data available in the literature is used to show that fractional crystallization alone is not sufficient to explain the distribution of Niobium/Tantalum (Nb/Ta) in most peraluminous granites.
Abstract: In their late stages of evolution, peraluminous granitic melts exsolve large amounts of fluids which can modify the chemical composition of granitic whole-rock samples. The niobium/tantalum (Nb/Ta) ratio is expected to decrease during the magmatic differentiation of granitic melts, but the behavior of both elements at the magmatic-hydrothermal transition remains unclear. Using a compilation of whole-rock geochemical data available in the literature, we demonstrate that fractional crystallization alone is not sufficient to explain the distribution of Nb-Ta in most peraluminous granites. However, we notice that most of the granitic samples displaying evidence of interactions with fluids have Nb/Ta < 5. We propose that the decrease of the Nb/Ta ratio in evolved melts is the consequence of both fractional crystallization and sub-solidus hydrothermal alteration. We suggest that the Nb/Ta value of ~5 fingerprints the magmatic-hydrothermal transition in peraluminous granites. Furthermore, a Nb/Ta ratio of ~5 appears to be a good marker to discriminate mineralized from barren peraluminous granites.

Journal ArticleDOI
TL;DR: It is found that emissions during the cold season account for ≥50% of the annual CH4 flux, with the highest emissions from noninundated upland tundra, and regional scale fluxes of CH4 derived from aircraft data demonstrate the large spatial extent of late season CH4 emissions.
Abstract: Arctic terrestrial ecosystems are major global sources of methane (CH4); hence, it is important to understand the seasonal and climatic controls on CH4 emissions from these systems. Here, we report year-round CH4 emissions from Alaskan Arctic tundra eddy flux sites and regional fluxes derived from aircraft data. We find that emissions during the cold season (September to May) account for ≥50% of the annual CH4 flux, with the highest emissions from noninundated upland tundra. A major fraction of cold season emissions occur during the “zero curtain” period, when subsurface soil temperatures are poised near 0 °C. The zero curtain may persist longer than the growing season, and CH4 emissions are enhanced when the duration is extended by a deep thawed layer as can occur with thick snow cover. Regional scale fluxes of CH4 derived from aircraft data demonstrate the large spatial extent of late season CH4 emissions. Scaled to the circumpolar Arctic, cold season fluxes from tundra total 12 ± 5 (95% confidence interval) Tg CH4 y^-1, ∼25% of global emissions from extratropical wetlands, or ∼6% of total global wetland methane emissions. The dominance of late-season emissions, sensitivity to soil environmental conditions, and importance of dry tundra are not currently simulated in most global climate models. Because Arctic warming disproportionally impacts the cold season, our results suggest that higher cold-season CH4 emissions will result from observed and predicted increases in snow thickness, active layer depth, and soil temperature, representing important positive feedbacks on climate warming.
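
For orientation, the quoted percentages imply the following approximate global totals (simple arithmetic on the numbers above):

    12\ \mathrm{Tg\,CH_4\,y^{-1}} / 0.25 \approx 48\ \mathrm{Tg\,CH_4\,y^{-1}}  (extratropical wetlands)
    12\ \mathrm{Tg\,CH_4\,y^{-1}} / 0.06 \approx 200\ \mathrm{Tg\,CH_4\,y^{-1}}  (all wetlands)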

Journal ArticleDOI
12 Sep 2016-BMJ
TL;DR: An overview of the SCCS method is provided, with examples of its use; limitations, assumptions, and potential biases that can arise where assumptions are not met are discussed, along with solutions and examples of good practice.
Abstract: The self controlled case series (SCCS) method is an epidemiological study design for which individuals act as their own control—ie, comparisons are made within individuals. Hence, only individuals who have experienced an event are included and all time invariant confounding is eliminated. The temporal association between a transient exposure and an event is estimated. SCCS was originally developed for evaluation of vaccine safety, but has since been applied in a range of settings where exact information on the size of the population at risk is lacking or identification of an appropriate comparison group is difficult—eg, for studies of adverse effects of drug treatments. We provide an overview of the SCCS method, with examples of its use, discuss limitations, assumptions, and potential biases that can arise where assumptions are not met, and provide solutions and examples of good practice.
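
As a hedged worked example of the within-person comparison (illustrative numbers, not drawn from the paper): the key quantity is the relative incidence between risk and baseline periods,

    \mathrm{RI} = \frac{n_{\mathrm{risk}} / T_{\mathrm{risk}}}{n_{\mathrm{base}} / T_{\mathrm{base}}}

For an individual observed for 365 days with a 30-day post-exposure risk window, T_risk = 30 and T_base = 335, so under the null hypothesis the probability that that individual's event falls in the risk window is 30/365 ≈ 0.08. In practice RI is estimated by conditional Poisson regression, which conditions out each individual's baseline rate and hence all time-invariant confounders.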

Journal ArticleDOI
TL;DR: The Fire Model Intercomparison Project (FireMIP) is an international initiative to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions as discussed by the authors.
Abstract: Biomass burning impacts vegetation dynamics, biogeochemical cycling, atmospheric chemistry, and climate, with sometimes deleterious socio-economic impacts. Under future climate projections it is often expected that the risk of wildfires will increase. Our ability to predict the magnitude and geographic pattern of future fire impacts rests on our ability to model fire regimes, using either well-founded empirical relationships or process-based models with good predictive skill. While a large variety of models exist today, it is still unclear which type of model or degree of complexity is required to model fire adequately at regional to global scales. This is the central question underpinning the creation of the Fire Model Intercomparison Project (FireMIP), an international initiative to compare and evaluate existing global fire models against benchmark data sets for present-day and historical conditions. In this paper we review how fires have been represented in fire-enabled dynamic global vegetation models (DGVMs) and give an overview of the current state of the art in fire-regime modelling. We indicate which challenges still remain in global fire modelling and stress the need for a comprehensive model evaluation and outline what lessons may be learned from FireMIP.

Journal ArticleDOI
TL;DR: The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) as discussed by the authors is one of the core programs in the Sloan Digital Sky Survey IV, which is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R ∼ 2000 from 3622 to 10354 A.
Abstract: The MaNGA Survey (Mapping Nearby Galaxies at Apache Point Observatory) is one of three core programs in the Sloan Digital Sky Survey IV. It is obtaining integral field spectroscopy for 10,000 nearby galaxies at a spectral resolution of R ∼ 2000 from 3622 to 10354 A. The design of the survey is driven by a set of science requirements on the precision of estimates of the following properties: star formation rate surface density, gas metallicity, stellar population age, metallicity, and abundance ratio, and their gradients; stellar and gas kinematics; and enclosed gravitational mass as a function of radius. We describe how these science requirements set the depth of the observations and dictate sample selection. The majority of targeted galaxies are selected to ensure uniform spatial coverage in units of effective radius (Re) while maximizing spatial resolution. About two-thirds of the sample is covered out to 1.5Re (Primary sample), and one-third of the sample is covered to 2.5Re (Secondary sample). We describe the survey execution with details that would be useful in the design of similar future surveys. We also present statistics on the achieved data quality, specifically the point-spread function, sampling uniformity, spectral resolution, sky subtraction, and flux calibration. For our Primary sample, the median r-band signal-to-noise ratio is ∼70 per 1.4 A pixel for spectra stacked between 1Re and 1.5Re. Measurements of various galaxy properties from the first-year data show that we are meeting or exceeding the defined requirements for the majority of our science goals.

Journal ArticleDOI
TL;DR: The worldwide pooled prevalence estimates are higher than assumed so far, but this was largely explained by geography and descent, and clear guidelines on assessing FASD prevalence are urgently needed.
Abstract: Background Although fetal alcohol spectrum disorders (FASD) affect communities worldwide, little is known about its prevalence. The objective of this study was to provide an overview of the global FASD prevalence. Methods We performed a search in multiple electronic bibliographic databases up to August 2015, supplemented with the ascendancy and descendancy approach. Studies were considered when published in English, included human participants, and reported empirical data on prevalence or incidence estimates of FASD. Raw prevalence estimates were transformed using the Freeman–Tukey double arcsine transformation so that the data followed an approximately normal distribution. Once the pooled prevalence estimates, 95% confidence intervals and prediction intervals were calculated based on multiple meta-analyses with transformed proportions using random effects models, these estimates were transformed back to regular prevalence rates. Heterogeneity was tested using Cochran's Q and described using the I2 statistic. Results Among studies that estimated prevalence in general population samples, considerable differences in prevalence rates between countries were found and therefore separate meta-analyses per country were conducted. Particularly high prevalence rates were observed in South Africa for fetal alcohol syndrome (55.42 per 1,000), for alcohol-related neurodevelopmental disorder (20.25 per 1,000), and FASD (113.22 per 1,000). For partial fetal alcohol syndrome, high rates were found in Croatia (43.01 per 1,000), Italy (36.89 per 1,000), and South Africa (28.29 per 1,000). In the case of alcohol-related birth defects, a prevalence of 10.82 per 1,000 was found in Australia. However, studies into FASD exhibited substantial heterogeneity, which could only partly be explained by moderators, most notably geography and descent, in meta-regressions. In addition, the moderators were confounded, making conclusions as to each moderator's relevance tentative at best. Conclusions The worldwide pooled prevalence estimates are higher than assumed so far, but this was largely explained by geography and descent. Furthermore, prevalence studies varied considerably in terms of the methodology used and methodological quality. The pooled estimates must therefore be interpreted with caution and for future research it is highly recommended to report methodology in a more comprehensive way. Finally, clear guidelines on assessing FASD prevalence are urgently needed, and a first step toward these guidelines is presented.
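
For reference, the Freeman–Tukey double arcsine transformation mentioned above maps x cases out of n participants to (the standard form of the transformation; the paper's exact variant is not quoted here):

    t = \arcsin\sqrt{\frac{x}{n+1}} + \arcsin\sqrt{\frac{x+1}{n+1}}, \qquad \mathrm{Var}(t) \approx \frac{1}{n + 1/2}

which stabilises the variance of small proportions so that studies of very different sizes can be pooled before back-transforming to a prevalence rate.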

Journal ArticleDOI
TL;DR: The findings strongly indicate the importance of learning design in predicting and understanding Virtual Learning Environment behaviour and performance of students in blended and online environments.

Proceedings ArticleDOI
25 Apr 2016
TL;DR: An eight-point checklist named DELICATE is presented that can be applied by researchers, policy makers and institutional managers to facilitate a trusted implementation of Learning Analytics.
Abstract: The widespread adoption of Learning Analytics (LA) and Educational Data Mining (EDM) has somewhat stagnated recently, and in some prominent cases even been reversed following concerns by governments, stakeholders and civil rights groups about privacy and ethics applied to the handling of personal data. In this ongoing discussion, fears and realities are often indistinguishably mixed up, leading to an atmosphere of uncertainty among potential beneficiaries of Learning Analytics, as well as hesitations among institutional managers who aim to innovate their institution's learning support by implementing data and analytics with a view to improving student success. In this paper, we try to get to the heart of the matter by analysing the most common views and the propositions made by the LA community to solve them. We conclude the paper with an eight-point checklist named DELICATE that can be applied by researchers, policy makers and institutional managers to facilitate a trusted implementation of Learning Analytics.

Journal ArticleDOI
TL;DR: The results of this study and previous research suggest that cerebral vascular basement membranes form the pathways by which fluid passes into and out of the brain but that different basement membrane layers are involved.
Abstract: In the absence of conventional lymphatics, drainage of interstitial fluid and solutes from the brain parenchyma to cervical lymph nodes is along basement membranes in the walls of cerebral capillaries and tunica media of arteries. Perivascular pathways are also involved in the entry of CSF into the brain by the convective influx/glymphatic system. The objective of this study is to differentiate the cerebral vascular basement membrane pathways by which fluid passes out of the brain from the pathway by which CSF enters the brain. Experiment 1: 0.5 µl of soluble biotinylated or fluorescent Aβ, or 1 µl 15 nm gold nanoparticles was injected into the mouse hippocampus and their distributions determined at 5 min by transmission electron microscopy. Aβ was distributed within the extracellular spaces of the hippocampus and within basement membranes of capillaries and tunica media of arteries. Nanoparticles did not enter capillary basement membranes from the extracellular spaces. Experiment 2: 2 µl of 15 nm nanoparticles were injected into mouse CSF. Within 5 min, groups of nanoparticles were present in the pial-glial basement membrane on the outer aspect of cortical arteries between the investing layer of pia mater and the glia limitans. The results of this study and previous research suggest that cerebral vascular basement membranes form the pathways by which fluid passes into and out of the brain but that different basement membrane layers are involved. The significance of these findings for neuroimmunology, Alzheimer's disease, drug delivery to the brain and the concept of the Virchow-Robin space is discussed.

Journal ArticleDOI
TL;DR: In this article, the effects of value, visibility, accessibility, and guardianship on cybercrime victimization have been studied based on a large sample (N = 9,161).
Abstract: The central question of this article is whether routine activity theory (RAT) can be used as an analytical framework to study cybercrimes. Both a theoretical analysis and an analysis of empirical studies have thus far failed to provide a clear answer. The multivariate analysis presented in this article tries to avoid some of the limitations of other RAT-based studies. Based on a large sample (N = 9,161), the effects of value, visibility, accessibility, and guardianship on victimization of six cybercrimes have been studied. Analysis shows some RAT elements are more applicable than others. Visibility clearly plays a role within cybercrime victimization. Accessibility and personal capable guardianship show varying results. Value and technical capable guardianship show almost no effects on cybercrime victimization.

Journal ArticleDOI
TL;DR: In this paper, the authors present the first wide area (19 deg^2), deep (≈120-150 μJy beam^-1), high-resolution (5.6 x 7.4 arcsec) LOFAR High Band Antenna image of the Bootes field made at 130-169 MHz.
Abstract: We present the first wide area (19 deg^2), deep (≈120-150 μJy beam^-1), high-resolution (5.6 x 7.4 arcsec) LOFAR High Band Antenna image of the Bootes field made at 130-169 MHz. This image is at least an order of magnitude deeper and 3-5 times higher in angular resolution than previously achieved for this field at low frequencies. The observations and data reduction, which includes full direction-dependent calibration, are described here. We present a radio source catalogue containing 6276 sources detected over an area of 19 deg^2, with a peak flux density threshold of 5σ. As the first thorough test of the facet calibration strategy, introduced by van Weeren et al., we investigate the flux and positional accuracy of the catalogue. We present differential source counts that reach an order of magnitude deeper in flux density than previously achieved at these low frequencies, and show flattening at 150-MHz flux densities below 10 mJy associated with the rise of low flux density star-forming galaxies and radio-quiet AGN.

Journal ArticleDOI
10 Jun 2016-Science
TL;DR: Indigenous land use practices have a fundamental role to play in controlling deforestation and reducing carbon dioxide emissions as discussed by the authors, and Satellite imagery suggests that indigenous lands contribute substantially to maintaining carbon stocks and enhancing biodiversity relative to adjoining territory.
Abstract: Indigenous land use practices have a fundamental role to play in controlling deforestation and reducing carbon dioxide emissions. Satellite imagery suggests that indigenous lands contribute substantially to maintaining carbon stocks and enhancing biodiversity relative to adjoining territory (1). Many of these sustainable land use practices are born, developed, and successfully implemented by the community without major influence from external stakeholders (2). A prerequisite for such community-owned solutions is indigenous knowledge, which is local and context-specific, transmitted orally or through imitation and demonstration, adaptive to changing environments, collectivized through a shared social memory, and situated within numerous interlinked facets of people's lives (3). Such local ecological knowledge is increasingly important given the growing global challenges of ecosystem degradation and climate change (4).

Journal ArticleDOI
TL;DR: In this article, a new calibration scheme, which is called facet calibration, is presented to obtain deep high-resolution LOFAR High Band Antenna images using the Dutch part of the array.
Abstract: LOFAR, the Low-Frequency Array, is a powerful new radio telescope operating between 10 and 240 MHz. LOFAR allows detailed sensitive high-resolution studies of the low-frequency radio sky. At the same time LOFAR also provides excellent short baseline coverage to map diffuse extended emission. However, producing high-quality deep images is challenging due to the presence of direction-dependent calibration errors, caused by imperfect knowledge of the station beam shapes and the ionosphere. Furthermore, the large data volume and presence of station clock errors present additional difficulties. In this paper we present a new calibration scheme, which we name facet calibration, to obtain deep high-resolution LOFAR High Band Antenna images using the Dutch part of the array. This scheme solves and corrects the direction-dependent errors in a number of facets that cover the observed field of view. Facet calibration provides close to thermal noise limited images for a typical 8 hr observing run at ∼5 arcsec resolution, meeting the specifications of the LOFAR Tier-1 northern survey.

Journal ArticleDOI
TL;DR: In this paper, the authors present new laboratory experiments on the low-temperature solid state formation of three complex molecules: methyl formate, glycolaldehyde and ethylene glycol, through recombination of free radicals formed via H-atom addition and abstraction reactions at different stages in the CO-H2CO-CH3OH hydrogenation network.
Abstract: Complex organic molecules (COMs) have been observed not only in the hot cores surrounding low- and high-mass protostars, but also in cold dark clouds. Therefore, it is interesting to understand how such species can be formed without the presence of embedded energy sources. We present new laboratory experiments on the low-temperature solid state formation of three complex molecules: methyl formate (HC(O)OCH3), glycolaldehyde (HC(O)CH2OH) and ethylene glycol (H2C(OH)CH2OH), through recombination of free radicals formed via H-atom addition and abstraction reactions at different stages in the CO-H2CO-CH3OH hydrogenation network at 15 K. The experiments extend previous CO hydrogenation studies and aim at resembling the physical and chemical conditions typical of the CO freeze-out stage in dark molecular clouds, when H2CO and CH3OH form by recombination of accreting CO molecules and H-atoms on ice grains. We confirm that H2CO, once formed through CO hydrogenation, not only yields CH3OH through ongoing H-atom addition reactions, but is also subject to H-atom-induced abstraction reactions, yielding CO again. In a similar way, H2CO is also formed in abstraction reactions involving CH3OH. The dominant methanol H-atom abstraction product is expected to be CH2OH, while H-atom additions to H2CO should at least partially proceed through CH3O intermediate radicals. The occurrence of H-atom abstraction reactions in ice mantles leads to more reactive intermediates (HCO, CH3O and CH2OH) than previously thought, when assuming sequential H-atom addition reactions only. This enhances the probability to form COMs through radical-radical recombination without the need of UV photolysis or cosmic rays as external triggers.
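
The network described above can be summarised as a reaction scheme (compiled from the reactions named in the abstract; the radical-radical recombinations then pair the intermediates into the three COMs):

    CO --(+H)--> HCO --(+H)--> H2CO --(+H)--> CH3O / CH2OH --(+H)--> CH3OH   (additions)
    H2CO + H -> HCO + H2;   CH3OH + H -> CH2OH + H2                          (abstractions)
    HCO + CH3O    -> HC(O)OCH3      (methyl formate)
    HCO + CH2OH   -> HC(O)CH2OH     (glycolaldehyde)
    CH2OH + CH2OH -> H2C(OH)CH2OH   (ethylene glycol)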

Journal ArticleDOI
TL;DR: This report proposes a set of metrics, drawn from existing data, that can form a starting point for policy makers to identify the structure and dynamics of private provision in their particular mixed health systems, and develops an illustrative and partial country typology showing how the scale and operation of the public sector can shape the private sector's structure and behaviour, and vice versa.

Journal ArticleDOI
TL;DR: The Rosetta probe, orbiting Jupiter-family comet 67P/Churyumov-Gerasimenko, has been detecting individual dust particles of mass larger than 10^-10 kg by means of the GIADA dust collector and the OSIRIS Wide Angle Camera and Narrow Angle Camera as mentioned in this paper.
Abstract: The Rosetta probe, orbiting Jupiter-family comet 67P/Churyumov–Gerasimenko, has been detecting individual dust particles of mass larger than 10^-10 kg by means of the GIADA dust collector and the OSIRIS Wide Angle Camera and Narrow Angle Camera since 2014 August and will continue until 2016 September. Detections of single dust particles allow us to estimate the anisotropic dust flux from 67P, infer the dust loss rate and size distribution at the surface of the sunlit nucleus, and see whether the dust size distribution of 67P evolves in time. The velocity of the Rosetta orbiter, relative to 67P, is much lower than the dust velocity measured by GIADA, thus dust counts when GIADA is nadir-pointing will directly provide the dust flux. In OSIRIS observations, the dust flux is derived from the measurement of the dust space density close to the spacecraft. Under the assumption of radial expansion of the dust, observations in the nadir direction provide the distance of the particles by measuring their trail length, with a parallax baseline determined by the motion of the spacecraft. The dust size distribution at sizes >1 mm observed by OSIRIS is consistent with a differential power index of −4, which was derived from models of 67P's trail. At sizes <1 mm, the size distribution observed by GIADA shows a strong time evolution, with a differential power index drifting from −2 beyond 2 au to −3.7 at perihelion, in agreement with the evolution derived from coma and tail models based on ground-based data. The refractory-to-water mass ratio of the nucleus is close to six during the entire inbound orbit and at perihelion.
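
For clarity, a differential power index of −4 means the number of particles per size interval scales as (the standard definition, plus an elementary consequence; not a statement from the paper):

    \frac{dN}{dD} \propto D^{-4}

so the mass contributed per logarithmic size bin, \propto D^3 \cdot D^{-4} \cdot D = D^0, is roughly constant: every decade of particle size carries a comparable share of the ejected dust mass.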