
Showing papers published by Aalto University in 2016


Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +334 moreInstitutions (82)
TL;DR: In this article, the authors present a cosmological analysis based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation.
Abstract: This paper presents cosmological results based on full-mission Planck observations of temperature and polarization anisotropies of the cosmic microwave background (CMB) radiation. Our results are in very good agreement with the 2013 analysis of the Planck nominal-mission temperature data, but with increased precision. The temperature and polarization power spectra are consistent with the standard spatially-flat 6-parameter ΛCDM cosmology with a power-law spectrum of adiabatic scalar perturbations (denoted “base ΛCDM” in this paper). From the Planck temperature data combined with Planck lensing, for this cosmology we find a Hubble constant, H0 = (67.8 ± 0.9) km s^-1 Mpc^-1, a matter density parameter Ωm = 0.308 ± 0.012, and a tilted scalar spectral index with ns = 0.968 ± 0.006, consistent with the 2013 analysis. Note that in this abstract we quote 68% confidence limits on measured parameters and 95% upper limits on other parameters. We present the first results of polarization measurements with the Low Frequency Instrument at large angular scales. Combined with the Planck temperature and lensing data, these measurements give a reionization optical depth of τ = 0.066 ± 0.016 and a corresponding reionization redshift. These results are consistent with those from WMAP polarization measurements cleaned for dust emission using 353-GHz polarization maps from the High Frequency Instrument. We find no evidence for any departure from base ΛCDM in the neutrino sector of the theory; for example, combining Planck observations with other astrophysical data we find Neff = 3.15 ± 0.23 for the effective number of relativistic degrees of freedom, consistent with the value Neff = 3.046 of the Standard Model of particle physics. The sum of neutrino masses is constrained to ∑ mν < 0.23 eV. The spatial curvature of our Universe is found to be very close to zero, with |ΩK| < 0.005. Adding a tensor component as a single-parameter extension to base ΛCDM we find an upper limit on the tensor-to-scalar ratio of r_0.002 < 0.11, consistent with the Planck 2013 results and consistent with the B-mode polarization constraints from a joint analysis of BICEP2, Keck Array, and Planck (BKP) data. Adding the BKP B-mode data to our analysis leads to a tighter constraint of r_0.002 < 0.09 and disfavours inflationary models with a V(φ) ∝ φ^2 potential. The addition of Planck polarization data leads to strong constraints on deviations from a purely adiabatic spectrum of fluctuations. We find no evidence for any contribution from isocurvature perturbations or from cosmic defects. Combining Planck data with other astrophysical data, including Type Ia supernovae, the equation of state of dark energy is constrained to w = −1.006 ± 0.045, consistent with the expected value for a cosmological constant. The standard big bang nucleosynthesis predictions for the helium and deuterium abundances for the best-fit Planck base ΛCDM cosmology are in excellent agreement with observations. We also present constraints on annihilating dark matter and on possible deviations from the standard recombination history. In neither case do we find evidence for new physics. The Planck results for base ΛCDM are in good agreement with baryon acoustic oscillation data and with the JLA sample of Type Ia supernovae. However, as in the 2013 analysis, the amplitude of the fluctuation spectrum is found to be higher than inferred from some analyses of rich cluster counts and weak gravitational lensing. We show that these tensions cannot easily be resolved with simple modifications of the base ΛCDM cosmology. Apart from these tensions, the base ΛCDM cosmology provides an excellent description of the Planck CMB observations and many other astrophysical data sets.
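As a quick worked check on the headline numbers quoted above (an editorial illustration, not part of the paper's abstract): with h ≡ H_0 / (100 km s^-1 Mpc^-1) = 0.678, the quoted Ωm implies a physical matter density of

    \omega_m \equiv \Omega_m h^2 = 0.308 \times 0.678^2 \approx 0.142,

which is the parameter combination the CMB constrains most directly.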

10,728 citations


Journal ArticleDOI
01 Sep 2016
TL;DR: Information and communications technologies (ICTs) have enabled the rise of so-called "Collaborative Consumption" (CC): the peer-to-peer-based activity of obtaining, giving, or sharing the access to goods and services.
Abstract: Information and communications technologies (ICTs) have enabled the rise of so-called "Collaborative Consumption" (CC): the peer-to-peer-based activity of obtaining, giving, or sharing the access to goods and services, coordinated through community-based online services. CC has been expected to alleviate societal problems such as hyper-consumption, pollution, and poverty by lowering the cost of economic coordination within communities. However, beyond anecdotal evidence, there is a dearth of understanding of why people participate in CC. Therefore, in this article we investigate people's motivations to participate in CC. The study employs survey data (N = 168) gathered from people registered on a CC site. The results show that participation in CC is motivated by many factors, such as its sustainability, enjoyment of the activity, and economic gains. An interesting detail in the results is that sustainability is not directly associated with participation unless it is at the same time also associated with positive attitudes towards CC. This suggests that sustainability might only be an important factor for those people for whom ecological consumption is important. Furthermore, the results suggest that an attitude-behavior gap might exist in CC; people perceive the activity positively and say good things about it, but this positive attitude does not necessarily translate into action.

2,051 citations


Journal ArticleDOI
03 Oct 2016-PLOS ONE
TL;DR: The objective is to understand the current research topics, challenges and future directions regarding Blockchain technology from the technical perspective, and recommendations on future research directions are provided for researchers.
Abstract: Blockchain is a decentralized transaction and data management technology developed first for the Bitcoin cryptocurrency. Interest in Blockchain technology has been increasing since the idea was coined in 2008. The reason for the interest in Blockchain is its central attributes that provide security, anonymity and data integrity without any third-party organization in control of the transactions, and therefore it creates interesting research areas, especially from the perspective of technical challenges and limitations. In this research, we have conducted a systematic mapping study with the goal of collecting all relevant research on Blockchain technology. Our objective is to understand the current research topics, challenges and future directions regarding Blockchain technology from the technical perspective. We have extracted 41 primary papers from scientific databases. The results show that the focus in over 80% of the papers is on the Bitcoin system and less than 20% deal with other Blockchain applications, including, e.g., smart contracts and licensing. The majority of the research focuses on revealing and improving limitations of Blockchain from the privacy and security perspectives, but many of the proposed solutions lack concrete evaluation of their effectiveness. Many other Blockchain scalability-related challenges, including throughput and latency, have been left unstudied. On the basis of this study, recommendations on future research directions are provided for researchers.

1,528 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a guided tour of the main aspects of community detection in networks, point out strengths and weaknesses of popular methods, and give guidance on their use.
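As a hypothetical illustration of one of the popular method families such a review surveys (not code from the paper), here is modularity-based community detection with the networkx library on a classic benchmark graph:

    # Greedy modularity maximization (Clauset-Newman-Moore) on the
    # Zachary karate club graph, a standard community detection benchmark.
    import networkx as nx
    from networkx.algorithms import community

    G = nx.karate_club_graph()
    communities = community.greedy_modularity_communities(G)
    for i, nodes in enumerate(communities):
        print(f"community {i}: {sorted(nodes)}")
    # Modularity quantifies how much denser the intra-community links are
    # than expected in a random graph with the same degree sequence.
    print("modularity:", community.modularity(G, communities))

Resolution limits and degeneracies of modularity optimization are among the weaknesses that reviews of this kind typically discuss.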

1,398 citations


Journal ArticleDOI
29 Apr 2016-Science
TL;DR: Deep sequencing of the gut microbiomes of 1135 participants from a Dutch population-based cohort shows relations between the microbiome and 126 exogenous and intrinsic host factors, including 31 intrinsic factors, 12 diseases, 19 drug groups, 4 smoking categories, and 60 dietary factors; these results are an important step toward a better understanding of environment-diet-microbe-host interactions.
Abstract: Deep sequencing of the gut microbiomes of 1135 participants from a Dutch population-based cohort shows relations between the microbiome and 126 exogenous and intrinsic host factors, including 31 intrinsic factors, 12 diseases, 19 drug groups, 4 smoking categories, and 60 dietary factors. These factors collectively explain 18.7% of the variation seen in the interindividual distance of microbial composition. We could associate 110 factors to 125 species and observed that fecal chromogranin A (CgA), a protein secreted by enteroendocrine cells, was exclusively associated with 61 microbial species whose abundance collectively accounted for 53% of microbial composition. Low CgA concentrations were seen in individuals with a more diverse microbiome. These results are an important step toward a better understanding of environment-diet-microbe-host interactions.

1,272 citations


Journal ArticleDOI
TL;DR: In this paper, the state of the art of optical modulators based on 2D materials, including graphene, transition metal dichalcogenides and black phosphorus, is reviewed.
Abstract: Light modulation is an essential operation in photonics and optoelectronics. With existing and emerging technologies increasingly demanding compact, efficient, fast and broadband optical modulators, high-performance light modulation solutions are becoming indispensable. The recent realization that 2D layered materials could modulate light with superior performance has prompted intense research and significant advances, paving the way for realistic applications. In this Review, we cover the state of the art of optical modulators based on 2D materials, including graphene, transition metal dichalcogenides and black phosphorus. We discuss recent advances employing hybrid structures, such as 2D heterostructures, plasmonic structures, and silicon and fibre integrated structures. We also take a look at the future perspectives and discuss the potential of yet relatively unexplored mechanisms, such as magneto-optic and acousto-optic modulation.

1,158 citations


Journal ArticleDOI
Kurt Lejaeghere1, Gustav Bihlmayer2, Torbjörn Björkman3, Torbjörn Björkman4, Peter Blaha5, Stefan Blügel2, Volker Blum6, Damien Caliste7, Ivano E. Castelli8, Stewart J. Clark9, Andrea Dal Corso10, Stefano de Gironcoli10, Thierry Deutsch7, J. K. Dewhurst11, Igor Di Marco12, Claudia Draxl13, Claudia Draxl14, Marcin Dulak15, Olle Eriksson12, José A. Flores-Livas11, Kevin F. Garrity16, Luigi Genovese7, Paolo Giannozzi17, Matteo Giantomassi18, Stefan Goedecker19, Xavier Gonze18, Oscar Grånäs20, Oscar Grånäs12, E. K. U. Gross11, Andris Gulans14, Andris Gulans13, Francois Gygi21, D. R. Hamann22, P. J. Hasnip23, Natalie Holzwarth24, Diana Iusan12, Dominik B. Jochym25, F. Jollet, Daniel M. Jones26, Georg Kresse27, Klaus Koepernik28, Klaus Koepernik29, Emine Kucukbenli10, Emine Kucukbenli8, Yaroslav Kvashnin12, Inka L. M. Locht12, Inka L. M. Locht30, Sven Lubeck14, Martijn Marsman27, Nicola Marzari8, Ulrike Nitzsche29, Lars Nordström12, Taisuke Ozaki31, Lorenzo Paulatto32, Chris J. Pickard33, Ward Poelmans1, Matt Probert23, Keith Refson34, Keith Refson25, Manuel Richter28, Manuel Richter29, Gian-Marco Rignanese18, Santanu Saha19, Matthias Scheffler35, Matthias Scheffler13, Martin Schlipf21, Karlheinz Schwarz5, Sangeeta Sharma11, Francesca Tavazza16, Patrik Thunström5, Alexandre Tkatchenko36, Alexandre Tkatchenko13, Marc Torrent, David Vanderbilt22, Michiel van Setten18, Veronique Van Speybroeck1, John M. Wills37, Jonathan R. Yates26, Guo-Xu Zhang38, Stefaan Cottenier1 
25 Mar 2016-Science
TL;DR: A procedure to assess the precision of DFT methods was devised and used to demonstrate reproducibility among many of the most widely used DFT codes, demonstrating that the precision of DFT implementations can be determined, even in the absence of one absolute reference code.
Abstract: The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We report the results of a community-wide effort that compared 15 solid-state codes, using 40 different potentials or basis set types, to assess the quality of the Perdew-Burke-Ernzerhof equations of state for 71 elemental crystals. We conclude that predictions from recent codes and pseudopotentials agree very well, with pairwise differences that are comparable to those between different high-precision experiments. Older methods, however, have less precise agreement. Our benchmark provides a framework for users and developers to document the precision of new applications and methodological improvements.
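The pairwise comparison behind this effort is commonly summarized by a Δ-gauge: a root-mean-square difference between two codes' equation-of-state curves E(V) for the same crystal, integrated over a volume interval around equilibrium. A minimal sketch of that construction, with toy quadratic curves rather than the paper's Birch-Murnaghan fits:

    import numpy as np

    def delta_rms(e1, e2, v_lo, v_hi, n=2001):
        """RMS energy difference per atom between two E(V) curves,
        averaged over the volume window [v_lo, v_hi]."""
        v = np.linspace(v_lo, v_hi, n)
        return np.sqrt(np.trapz((e1(v) - e2(v)) ** 2, v) / (v_hi - v_lo))

    # Two codes' fitted EOS for the same crystal (illustrative numbers,
    # energies in eV/atom, volumes in A^3/atom):
    eos_code_a = lambda v: 0.05 * (v - 20.0) ** 2
    eos_code_b = lambda v: 0.05 * (v - 20.1) ** 2 + 0.001
    print(delta_rms(eos_code_a, eos_code_b, 18.8, 21.2))  # order meV/atom

In the paper's terms, small pairwise values of such a measure between modern codes are what "agree very well" means quantitatively.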

1,141 citations


Journal ArticleDOI
TL;DR: In this article, the basic physics and applications of planar metamaterials, often called metasurfaces, which are composed of optically thin and densely packed planar arrays of resonant or nearly resonant subwavelength elements, are reviewed.

1,047 citations


Journal ArticleDOI
Nabila Aghanim1, Monique Arnaud2, M. Ashdown3, J. Aumont1  +291 moreInstitutions (73)
TL;DR: In this article, the authors present the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties.
Abstract: This paper presents the Planck 2015 likelihoods, statistical descriptions of the 2-point correlation functions of the cosmic microwave background (CMB) temperature and polarization fluctuations that account for relevant uncertainties, both instrumental and astrophysical in nature. They are based on the same hybrid approach used for the previous release, i.e., a pixel-based likelihood at low multipoles (l < 30) and a Gaussian approximation to the distribution of cross-power spectra at higher multipoles. The main improvements are the use of more and better processed data and of Planck polarization information, along with more detailed models of foregrounds and instrumental uncertainties. The increased redundancy brought by more than doubling the amount of data analysed enables further consistency checks and enhanced immunity to systematic effects. It also improves the constraining power of Planck, in particular with regard to small-scale foreground properties. Progress in the modelling of foreground emission enables the retention of a larger fraction of the sky to determine the properties of the CMB, which also contributes to the enhanced precision of the spectra. Improvements in data processing and instrumental modelling further reduce uncertainties. Extensive tests establish the robustness and accuracy of the likelihood results, from temperature alone, from polarization alone, and from their combination. For temperature, we also perform a full likelihood analysis of realistic end-to-end simulations of the instrumental response to the sky, which were fed into the actual data processing pipeline; this does not reveal biases from residual low-level instrumental systematics. Even with the increase in precision and robustness, the ΛCDM cosmological model continues to offer a very good fit to the Planck data. The slope of the primordial scalar fluctuations, n_s, is confirmed smaller than unity at more than 5σ from Planck alone. We further validate the robustness of the likelihood results against specific extensions to the baseline cosmology, which are particularly sensitive to data at high multipoles. For instance, the effective number of neutrino species remains compatible with the canonical value of 3.046. For this first detailed analysis of Planck polarization spectra, we concentrate at high multipoles on the E modes, leaving the analysis of the weaker B modes to future work. At low multipoles we use temperature maps at all Planck frequencies along with a subset of polarization data. These data take advantage of Planck’s wide frequency coverage to improve the separation of CMB and foreground emission. Within the baseline ΛCDM cosmology this requires τ = 0.078 ± 0.019 for the reionization optical depth, which is significantly lower than estimates without the use of high-frequency data for explicit monitoring of dust emission. At high multipoles we detect residual systematic errors in E polarization, typically at the μK^2 level; we therefore choose to retain temperature information alone for high multipoles as the recommended baseline, in particular for testing non-minimal models. Nevertheless, the high-multipole polarization spectra from Planck are already good enough to enable a separate high-precision determination of the parameters of the ΛCDM model, showing consistency with those established independently from temperature information alone.
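Schematically, the high-multipole part of such a hybrid likelihood treats the measured cross-power spectra as Gaussian-distributed around the model spectra (a sketch of the form only, not the paper's full expression with foreground and nuisance parameters):

    -2 \ln \mathcal{L}(\theta) = \sum_{\ell,\ell'} \left[ \hat{C}_\ell - C_\ell(\theta) \right] \left( \Sigma^{-1} \right)_{\ell\ell'} \left[ \hat{C}_{\ell'} - C_{\ell'}(\theta) \right] + \mathrm{const},

with Σ the covariance matrix of the spectra; below l = 30 this Gaussian approximation breaks down, which is why the pixel-based likelihood mentioned above is used there instead.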

932 citations



Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +301 moreInstitutions (72)
TL;DR: In this paper, the implications of Planck data for models of dark energy (DE) and modified gravity (MG) beyond the standard cosmological constant scenario were studied, and it was shown that the density of DE at early times has to be below 2% of the critical density, even when forced to play a role for z < 50.
Abstract: We study the implications of Planck data for models of dark energy (DE) and modified gravity (MG) beyond the standard cosmological constant scenario. We start with cases where the DE only directly affects the background evolution, considering Taylor expansions of the equation of state w(a), as well as principal component analysis and parameterizations related to the potential of a minimally coupled DE scalar field. When estimating the density of DE at early times, we significantly improve present constraints and find that it has to be below ~2% of the critical density (at 95% confidence), even when forced to play a role for z < 50 only. We then move to general parameterizations of the DE or MG perturbations that encompass both effective field theories and the phenomenology of gravitational potentials in MG models. Lastly, we test a range of specific models, such as k-essence, f(R) theories, and coupled DE. In addition to the latest Planck data, for our main analyses, we use background constraints from baryonic acoustic oscillations, Type Ia supernovae, and local measurements of the Hubble constant. We further show the impact of measurements of the cosmological perturbations, such as redshift-space distortions and weak gravitational lensing. These additional probes are important tools for testing MG models and for breaking degeneracies that are still present in the combination of Planck and background data sets. All results that include only background parameterizations (expansion of the equation of state, early DE, general potentials in minimally-coupled scalar fields or principal component analysis) are in agreement with ΛCDM. When testing models that also change perturbations (even when the background is fixed to ΛCDM), some tensions appear in a few scenarios: the maximum one found is ~2σ for Planck TT+lowP when parameterizing observables related to the gravitational potentials with a chosen time dependence; the tension increases to, at most, 3σ when external data sets are included. It however disappears when including CMB lensing.
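For concreteness, the first-order Taylor expansion of the equation of state referred to above is usually written in the Chevallier-Polarski-Linder form (standard notation, not spelled out in this abstract):

    w(a) = w_0 + w_a (1 - a),

so that w_0 is the value today (a = 1) and w_a controls its evolution; ΛCDM corresponds to w_0 = -1 and w_a = 0.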

Journal ArticleDOI
TL;DR: In this paper, the authors report a longitudinal study of the gut microbiome based on DNA sequence analysis of monthly stool samples and clinical information from 39 children, about half of whom received multiple courses of antibiotics during the first 3 years of life.
Abstract: The gut microbial community is dynamic during the first 3 years of life, before stabilizing to an adult-like state. However, little is known about the impact of environmental factors on the developing human gut microbiome. We report a longitudinal study of the gut microbiome based on DNA sequence analysis of monthly stool samples and clinical information from 39 children, about half of whom received multiple courses of antibiotics during the first 3 years of life. Whereas the gut microbiome of most children born by vaginal delivery was dominated by Bacteroides species, the four children born by cesarean section and about 20% of vaginally born children lacked Bacteroides in the first 6 to 18 months of life. Longitudinal sampling, coupled with whole-genome shotgun sequencing, allowed detection of strain-level variation as well as the abundance of antibiotic resistance genes. The microbiota of antibiotic-treated children was less diverse in terms of both bacterial species and strains, with some species often dominated by single strains. In addition, we observed short-term composition changes between consecutive samples from children treated with antibiotics. Antibiotic resistance genes carried on microbial chromosomes showed a peak in abundance after antibiotic treatment followed by a sharp decline, whereas some genes carried on mobile elements persisted longer after antibiotic therapy ended. Our results highlight the value of high-density longitudinal sampling studies with high-resolution strain profiling for studying the establishment and response to perturbation of the infant gut microbiome.

Journal ArticleDOI
TL;DR: A comprehensive survey of UAVs and the related issues is presented, the envisioned UAV-based architecture for the delivery of UAV-based value-added IoT services from the sky is introduced, and the relevant key challenges and requirements are discussed.
Abstract: Recently, unmanned aerial vehicles (UAVs), or drones, have attracted a lot of attention, since they represent a new potential market. Along with the maturity of the technology and relevant regulations, a worldwide deployment of these UAVs is expected. Thanks to the high mobility of drones, they can be used to provide many applications, such as service delivery, pollution mitigation, farming, and rescue operations. Due to their ubiquitous usability, UAVs will play an important role in the Internet of Things (IoT) vision, and they may become a key enabler of this vision. While these UAVs would be deployed for specific objectives (e.g., service delivery), they can, at the same time, be used to offer new IoT value-added services when they are equipped with suitable and remotely controllable machine-type communication (MTC) devices (i.e., sensors, cameras, and actuators). However, deploying UAVs for the envisioned purposes cannot be done before overcoming the relevant challenging issues. These challenges comprise not only technical issues, such as physical collision, but also regulatory issues, as this nascent technology could be associated with problems like violating people's privacy, or could even be used for illegal operations like drug smuggling. Providing communication to UAVs is another challenging issue facing the deployment of this technology. In this paper, a comprehensive survey of UAVs and the related issues is presented. In addition, our envisioned UAV-based architecture for the delivery of UAV-based value-added IoT services from the sky is introduced, and the relevant key challenges and requirements are discussed.

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, Frederico Arroja4  +306 moreInstitutions (75)
TL;DR: In this article, the Planck full mission cosmic microwave background (CMB) temperature and E-mode polarization maps are analysed to obtain constraints on primordial non-Gaussianity (NG).
Abstract: The Planck full mission cosmic microwave background (CMB) temperature and E-mode polarization maps are analysed to obtain constraints on primordial non-Gaussianity (NG). Using three classes of optimal bispectrum estimators – separable template-fitting (KSW), binned, and modal – we obtain consistent values for the primordial local, equilateral, and orthogonal bispectrum amplitudes, quoting as our final result from temperature alone f_NL^local = 2.5 ± 5.7, f_NL^equil = −16 ± 70, and f_NL^ortho = −34 ± 32 (68% CL, statistical). Combining temperature and polarization data we obtain f_NL^local = 0.8 ± 5.0, f_NL^equil = −4 ± 43, and f_NL^ortho = −26 ± 21 (68% CL, statistical). The results are based on comprehensive cross-validation of these estimators on Gaussian and non-Gaussian simulations, are stable across component separation techniques, pass an extensive suite of tests, and are consistent with estimators based on measuring the Minkowski functionals of the CMB. The effect of time-domain de-glitching systematics on the bispectrum is negligible. In spite of these test outcomes we conservatively label the results including polarization data as preliminary, owing to a known mismatch of the noise model in simulations and the data. Beyond estimates of individual shape amplitudes, we present model-independent, three-dimensional reconstructions of the Planck CMB bispectrum and derive constraints on early universe scenarios that generate primordial NG, including general single-field models of inflation, axion inflation, initial state modifications, models producing parity-violating tensor bispectra, and directionally dependent vector models. We present a wide survey of scale-dependent feature and resonance models, accounting for the “look elsewhere” effect in estimating the statistical significance of features. We also look for isocurvature NG, and find no signal, but we obtain constraints that improve significantly with the inclusion of polarization. The primordial trispectrum amplitude in the local model is constrained to be
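For reference, the "local" amplitude quoted above parameterizes a quadratic, point-by-point correction to a Gaussian potential φ (the standard definition, not restated in this abstract):

    \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}}^{\mathrm{local}} \left[ \phi^2(\mathbf{x}) - \langle \phi^2 \rangle \right],

while the equilateral and orthogonal amplitudes weight bispectrum templates that peak on different triangle shapes in Fourier space.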

Journal ArticleDOI
TL;DR: The results demonstrate the importance of understanding host–microbe interactions to gain better insight into human health and demonstrate the influence of host genetics on microbial species, pathways and gene ontology categories on the basis of metagenomic sequencing in 1,514 subjects.
Abstract: The gut microbiome is affected by multiple factors, including genetics. In this study, we assessed the influence of host genetics on microbial species, pathways and gene ontology categories, on the basis of metagenomic sequencing in 1,514 subjects. In a genome-wide analysis, we identified associations of 9 loci with microbial taxonomies and 33 loci with microbial pathways and gene ontology terms at P < 5 × 10^-8. Additionally, in a targeted analysis of regions involved in complex diseases, innate and adaptive immunity, or food preferences, 32 loci were identified at the suggestive level of P < 5 × 10^-6. Most of our reported associations are new, including genome-wide significance for the C-type lectin molecules CLEC4F-CD207 at 2p13.3 and CLEC4A-FAM90A1 at 12p13. We also identified association of a functional LCT SNP with the Bifidobacterium genus (P = 3.45 × 10^-8) and provide evidence of a gene-diet interaction in the regulation of Bifidobacterium abundance. Our results demonstrate the importance of understanding host-microbe interactions to gain better insight into human health.

Journal ArticleDOI
TL;DR: A systematic literature review on how agile methods and lean software development have been adopted at scale, focusing on reported challenges and success factors in the transformation; the review identified 35 reported challenges grouped into nine categories, and 29 success factors grouped into eleven categories.

Journal ArticleDOI
08 Apr 2016-Science
TL;DR: Identifying the most promising avenues to mechanically robust superhydrophobic materials calls for standardized characterization methods.
Abstract: Superhydrophobic surfaces have received rapidly increasing research interest since the late 1990s because of their tremendous application potential in areas such as self-cleaning and anti-icing surfaces, drag reduction, and enhanced heat transfer (1-3). A surface is considered superhydrophobic if a water droplet beads up (with contact angles >150°), and moreover, if the droplet can slide away from the surface readily (i.e., it has small contact angle hysteresis). Two essential features are generally required for superhydrophobicity: a micro- or nanostructured surface texture and a nonpolar surface chemistry, to help trap a thin air layer that reduces attractive interactions between the solid surface and the liquid (4, 5). However, such surface textures are highly susceptible to mechanical wear, and abrasion may also alter surface chemistry. Both processes can lead to loss of liquid repellency, which makes mechanical durability a central concern for practical applications (6, 7). Identifying the most promising avenues to mechanically robust superhydrophobic materials calls for standardized characterization methods.
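The trapped air layer described above is conventionally quantified by the Cassie-Baxter relation (a standard result, not stated in this piece): if the droplet rests on a solid fraction f_s of the textured surface, with Young contact angle θ_Y on the flat material, the apparent contact angle θ* satisfies

    \cos \theta^{*} = f_s \left( 1 + \cos \theta_Y \right) - 1,

so a small f_s (mostly air beneath the droplet) pushes θ* toward the >150° superhydrophobic regime.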

Journal ArticleDOI
TL;DR: A first step toward an inclusive big data research agenda for IS is offered by focusing on the interplay between big data’s characteristics, the information value chain encompassing people-process-technology, and the three dominant IS research traditions (behavioral, design, and economics of IS).
Abstract: Big data has received considerable attention from the information systems (IS) discipline over the past few years, with several recent commentaries, editorials, and special issue introductions on the topic appearing in leading IS outlets. These papers present varying perspectives on promising big data research topics and highlight some of the challenges that big data poses. In this editorial, we synthesize and contribute further to this discourse. We offer a first step toward an inclusive big data research agenda for IS by focusing on the interplay between big data’s characteristics, the information value chain encompassing people-process-technology, and the three dominant IS research traditions (behavioral, design, and economics of IS). We view big data as a disruption to the value chain that has widespread impacts, which include but are not limited to changing the way academics conduct scholarly work. Importantly, we critically discuss the opportunities and challenges for behavioral, design science, and economics of IS research and the emerging implications for theory and methodology arising due to big data’s disruptive effects.

Journal ArticleDOI
TL;DR: This study provides a first assessment of continuous sub-national trajectories of blue water consumption, renewable freshwater availability, and water scarcity for the entire 20th century to suggest measures for alleviating water scarcity and increasing sustainability.
Abstract: Water scarcity is a rapidly growing concern around the globe, but little is known about how it has developed over time. This study provides a first assessment of continuous sub-national trajectories of blue water consumption, renewable freshwater availability, and water scarcity for the entire 20th century. Water scarcity is analysed using the fundamental concepts of shortage (impacts due to low availability per capita) and stress (impacts due to high consumption relative to availability), which indicate difficulties in satisfying the needs of a population and overuse of resources, respectively. While water consumption increased fourfold within the study period, the population under water scarcity increased from 0.24 billion (14% of the global population) in the 1900s to 3.8 billion (58%) in the 2000s. Nearly all sub-national trajectories show an increasing trend in water scarcity. The concept of scarcity trajectory archetypes and shapes is introduced to characterize the historical development of water scarcity and suggest measures for alleviating water scarcity and increasing sustainability. Linking the scarcity trajectories to other datasets may help further deepen understanding of how trajectories relate to historical and future drivers, and hence help tackle these evolving challenges.
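A minimal sketch of the two indicators as defined above, with illustrative thresholds (commonly used values are assumed here; the abstract does not state the paper's exact cut-offs):

    def water_shortage(availability_m3_per_capita_yr, threshold=1700.0):
        """Shortage: impacts due to low availability per capita
        (Falkenmark-style threshold, in m3 per capita per year)."""
        return availability_m3_per_capita_yr < threshold

    def water_stress(consumption_m3, availability_m3, threshold=0.2):
        """Stress: impacts due to high consumption relative to
        availability (consumption-to-availability ratio)."""
        return consumption_m3 / availability_m3 > threshold

    print(water_shortage(1200.0))      # True: below 1700 m3/cap/yr
    print(water_stress(30.0, 100.0))   # True: ratio 0.3 exceeds 0.2

Note that a region can be under stress without shortage (heavy use of plentiful water) or under shortage without stress (light use of scarce water), which is why the study tracks the two concepts separately.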

Journal ArticleDOI
R. Adam1, Peter A. R. Ade2, Nabila Aghanim3, M. I. R. Alves4  +281 moreInstitutions (69)
TL;DR: In this paper, the authors consider the problem of diffuse astrophysical component separation, and process these maps within a Bayesian framework to derive an internally consistent set of full-sky astrophysical component maps.
Abstract: Planck has mapped the microwave sky in temperature over nine frequency bands between 30 and 857 GHz and in polarization over seven frequency bands between 30 and 353 GHz. In this paper we consider the problem of diffuse astrophysical component separation, and process these maps within a Bayesian framework to derive an internally consistent set of full-sky astrophysical component maps. Component separation dedicated to cosmic microwave background (CMB) reconstruction is described in a companion paper. For the temperature analysis, we combine the Planck observations with the 9-yr Wilkinson Microwave Anisotropy Probe (WMAP) sky maps and the Haslam et al. 408 MHz map, to derive a joint model of CMB, synchrotron, free-free, spinning dust, CO, line emission in the 94 and 100 GHz channels, and thermal dust emission. Full-sky maps are provided for each component, with an angular resolution varying between 7.5 arcmin and 1 deg. Global parameters (monopoles, dipoles, relative calibration, and bandpass errors) are fitted jointly with the sky model, and best-fit values are tabulated. For polarization, the model includes CMB, synchrotron, and thermal dust emission. These models provide excellent fits to the observed data, with rms temperature residuals smaller than 4 μK over 93% of the sky for all Planck frequencies up to 353 GHz, and fractional errors smaller than 1% in the remaining 7% of the sky. The main limitations of the temperature model at the lower frequencies are internal degeneracies among the spinning dust, free-free, and synchrotron components; additional observations from external low-frequency experiments will be essential to break these degeneracies. The main limitations of the temperature model at the higher frequencies are uncertainties in the 545 and 857 GHz calibration and zero-points. For polarization, the main outstanding issues are instrumental systematics in the 100–353 GHz bands on large angular scales in the form of temperature-to-polarization leakage, uncertainties in the analogue-to-digital conversion, and corrections for the very long time constant of the bolometer detectors, all of which are expected to improve in the near future.
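Schematically, the per-pixel data model fitted in this kind of Bayesian component separation is a sum of components with parametric frequency scalings; a simplified three-component version (the paper's full temperature model has more components, plus the monopole, dipole, calibration, and bandpass parameters mentioned above) reads

    d_\nu(p) = a_{\mathrm{cmb}}(p) + a_{\mathrm{s}}(p) \left( \frac{\nu}{\nu_{0,\mathrm{s}}} \right)^{\beta_{\mathrm{s}}} + a_{\mathrm{d}}(p) \left( \frac{\nu}{\nu_{0,\mathrm{d}}} \right)^{\beta_{\mathrm{d}}} \frac{B_\nu(T_{\mathrm{d}})}{B_{\nu_{0,\mathrm{d}}}(T_{\mathrm{d}})} + n_\nu(p),

with a synchrotron power law, a modified-blackbody thermal dust term, and noise n_ν; the amplitudes a(p) and spectral parameters are sampled jointly within the Bayesian framework.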

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, M. Ashdown4  +289 moreInstitutions (73)
TL;DR: The most significant measurement of the cosmic microwave background (CMB) lensing potential at a level of 40σ using temperature and polarization data from the Planck 2015 full-mission release was presented in this article.
Abstract: We present the most significant measurement of the cosmic microwave background (CMB) lensing potential to date (at a level of 40σ), using temperature and polarization data from the Planck 2015 full-mission release. Using a polarization-only estimator, we detect lensing at a significance of 5σ. We cross-check the accuracy of our measurement using the wide frequency coverage and complementarity of the temperature and polarization measurements. Public products based on this measurement include an estimate of the lensing potential over approximately 70% of the sky, an estimate of the lensing potential power spectrum in bandpowers for the multipole range 40 ≤ L ≤ 400, and an associated likelihood for cosmological parameter constraints. We find good agreement between our measurement of the lensing potential power spectrum and that found in the ΛCDM model that best fits the Planck temperature and polarization power spectra. Using the lensing likelihood alone we obtain a percent-level measurement of the parameter combination σ_8 Ω_m^0.25 = 0.591 ± 0.021. We combine our determination of the lensing potential with the E-mode polarization, also measured by Planck, to generate an estimate of the lensing B-mode. We show that this lensing B-mode estimate is correlated with the B-modes observed directly by Planck at the expected level and with a statistical significance of 10σ, confirming Planck’s sensitivity to this known sky signal. We also correlate our lensing potential estimate with the large-scale temperature anisotropies, detecting a cross-correlation at the 3σ level, as expected because of dark energy in the concordance ΛCDM model.

Journal ArticleDOI
TL;DR: In this article, the authors review the advances in applying terrestrial laser scanning (TLS) in forest inventories, discuss its properties with reference to other related techniques, and discuss the future prospects of the technique.
Abstract: Decision making on forest resources relies on precise information that is collected using inventories. There are many different kinds of forest inventory techniques that can be applied depending on the goal, scale, resources and the required accuracy. Most forest inventories are based on field samples; the accuracy of a forest inventory therefore depends on the quality and quantity of those samples. Conventionally, field samples have been measured using simple tools, and when maps are required, remote sensing materials are needed. Terrestrial laser scanning (TLS) provides a measurement technique that can acquire millimeter-level detail from the surrounding area, which allows rapid, automatic and periodical estimates of many important forest inventory attributes. It is expected that TLS will be used operationally in forest inventories as soon as appropriate software becomes available, best practices become known and general knowledge of these findings becomes more widespread. Meanwhile, mobile laser scanning, personal laser scanning, and image-based point clouds have become capable of capturing terrestrial point cloud data similar to TLS. This paper reviews the advances in applying TLS in forest inventories, discusses its properties with reference to other related techniques, and discusses the future prospects of the technique.

Journal ArticleDOI
TL;DR: In this paper, the authors explore the evolution of Design for Sustainability (DfS) and propose an evolutionary framework and map the reviewed DfS approaches onto this framework, showing how it progressively expanded from a technical and product-centric focus towards large scale system level changes in which sustainability is understood as a socio-technical challenge.

Journal ArticleDOI
TL;DR: In this paper, a general approach to the synthesis of metasurfaces for full control of transmitted and reflected plane waves is introduced, and it is shown that perfect performance can be realized based on the use of an equivalent impedance matrix model.
Abstract: Nonuniform metasurfaces (electrically thin composite layers) can be used for shaping refracted and reflected electromagnetic waves. However, known design approaches based on the generalized refraction and reflection laws do not allow realization of perfectly performing devices: there are always some parasitic reflections into undesired directions. In this paper we introduce and discuss a general approach to the synthesis of metasurfaces for full control of transmitted and reflected plane waves and show that perfect performance can be realized. The method is based on the use of an equivalent impedance matrix model which connects the tangential field components at the two sides of the metasurface. With this approach we are able to understand what physical properties of the metasurface are needed in order to perfectly realize the desired response. Furthermore, we determine the required polarizabilities of the metasurface unit cells and discuss suitable cell structures. It appears that only spatially dispersive metasurfaces allow realization of perfect refraction and reflection of incident plane waves into arbitrary directions. In particular, ideal refraction is possible only if the metasurface is bianisotropic (weak spatial dispersion), and ideal reflection without polarization transformation requires spatial dispersion with a specific, strongly nonlocal response to the fields.
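In such an impedance-matrix description, the tangential fields on the two sides of the sheet are linked by a 2×2 block relation (schematic form; sign and normalization conventions vary between papers):

    \begin{pmatrix} \mathbf{E}_{\mathrm{t}1} \\ \mathbf{E}_{\mathrm{t}2} \end{pmatrix} = \begin{pmatrix} Z_{11} & Z_{12} \\ Z_{21} & Z_{22} \end{pmatrix} \begin{pmatrix} \mathbf{n} \times \mathbf{H}_{\mathrm{t}1} \\ -\mathbf{n} \times \mathbf{H}_{\mathrm{t}2} \end{pmatrix},

and demanding a reflectionless transformation between given incident and transmitted plane waves fixes the required Z-parameters, revealing (as stated above) that bianisotropy is needed for ideal refraction.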

Journal ArticleDOI
R. Adam1, Nabila Aghanim2, M. Ashdown3, J. Aumont2  +218 moreInstitutions (58)
TL;DR: In this paper, the authors investigate constraints on cosmic reionization extracted from the Planck cosmic microwave background (CMB) data and find that the Universe is ionized at less than the 10% level at redshifts above z ≃ 10.
Abstract: We investigate constraints on cosmic reionization extracted from the Planck cosmic microwave background (CMB) data. We combine the Planck CMB anisotropy data in temperature with the low-multipole polarization data to fit ΛCDM models with various parameterizations of the reionization history. We obtain a Thomson optical depth τ = 0.058 ± 0.012 for the commonly adopted instantaneous reionization model. This confirms, with data solely from CMB anisotropies, the low value suggested by combining Planck 2015 results with other data sets, and also reduces the uncertainties. We reconstruct the history of the ionization fraction using either a symmetric or an asymmetric model for the transition between the neutral and ionized phases. To determine better constraints on the duration of the reionization process, we also make use of measurements of the amplitude of the kinetic Sunyaev-Zeldovich (kSZ) effect using additional information from the high-resolution Atacama Cosmology Telescope and South Pole Telescope experiments. The average redshift at which reionization occurs is found to lie between z = 7.8 and 8.8, depending on the model of reionization adopted. Using kSZ constraints and a redshift-symmetric reionization model, we find an upper limit to the width of the reionization period of Δz < 2.8. In all cases, we find that the Universe is ionized at less than the 10% level at redshifts above z ≃ 10. This suggests that an early onset of reionization is strongly disfavoured by the Planck data. We show that this result also reduces the tension between CMB-based analyses and constraints from other astrophysical sources.
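For reference, the Thomson optical depth constrained above is the line-of-sight integral of the free-electron density n_e over the reionization history (the standard definition, not restated in the abstract):

    \tau = c\, \sigma_{\mathrm{T}} \int_0^{z_{\max}} \frac{n_e(z)}{(1+z)\, H(z)}\, dz,

which is why different parameterizations of the ionization fraction x_e(z) can produce the same τ while implying different reionization redshifts and durations.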

Journal ArticleDOI
TL;DR: In this paper, the authors trace the content, scope and relatively short history of modern social innovation research across disciplines by applying network and bibliometric analyses, and explore their relevance to innovation studies.

Journal ArticleDOI
TL;DR: In this paper, a straightforward method to produce lignin nanoparticles from the waste lignin obtained from kraft pulping is introduced. Lignin is a natural biopolymer obtained mainly as a by-product of the pulp- and paper-making industries and is primarily burned to produce energy.

Proceedings Article
06 Feb 2016
TL;DR: This paper proposes a new inference model, the Ladder Variational Autoencoder, that recursively corrects the generative distribution by a data-dependent approximate likelihood in a process similar to the recently proposed Ladder Network.
Abstract: Variational autoencoders are powerful models for unsupervised learning. However, deep models with several layers of dependent stochastic variables are difficult to train, which limits the improvements obtained using these highly expressive models. We propose a new inference model, the Ladder Variational Autoencoder, that recursively corrects the generative distribution by a data-dependent approximate likelihood in a process resembling the recently proposed Ladder Network. We show that this model provides a state-of-the-art predictive log-likelihood and a tighter log-likelihood lower bound compared to the purely bottom-up inference in layered Variational Autoencoders and other generative models. We provide a detailed analysis of the learned hierarchical latent representation and show that our new inference model is qualitatively different and utilizes a deeper, more distributed hierarchy of latent variables. Finally, we observe that batch normalization and deterministic warm-up (gradually turning on the KL term) are crucial for training variational models with many stochastic layers.
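The core of the proposed inference model is a precision-weighted merge of the bottom-up (data-dependent) and top-down (generative) Gaussian estimates at each stochastic layer; a minimal numpy sketch of that combination step (illustrative names, not the authors' code):

    import numpy as np

    def ladder_merge(mu_bu, var_bu, mu_td, var_td):
        """Precision-weighted combination of two Gaussian estimates of
        the same latent variable: the bottom-up likelihood term and the
        top-down prior/generative term."""
        prec_bu, prec_td = 1.0 / var_bu, 1.0 / var_td
        var_q = 1.0 / (prec_bu + prec_td)                  # sharper than either input
        mu_q = (mu_bu * prec_bu + mu_td * prec_td) * var_q
        return mu_q, var_q

    mu_q, var_q = ladder_merge(np.array([0.5]), np.array([0.2]),
                               np.array([-0.1]), np.array([0.8]))
    print(mu_q, var_q)  # mean pulled toward the more certain (bottom-up) input

The merged distribution is sharper than either input, which is how the inference path "recursively corrects" the generative distribution layer by layer.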

Journal ArticleDOI
Peter A. R. Ade1, Nabila Aghanim2, Monique Arnaud3, Frederico Arroja4  +279 moreInstitutions (69)
TL;DR: The impact of primordial magnetic fields (PMFs) on the CMB temperature and polarization spectra was investigated in this paper, with different bounds depending on the specific effect that is analysed.
Abstract: We predict and investigate four types of imprint of a stochastic background of primordial magnetic fields (PMFs) on the cosmic microwave background (CMB) anisotropies: the impact of PMFs on the CMB temperature and polarization spectra, related to their contribution to cosmological perturbations; the effect on CMB polarization induced by Faraday rotation; magnetically-induced non-Gaussianities and related non-zero bispectra; and the magnetically-induced breaking of statistical isotropy. We present constraints on the amplitude of PMFs derived from different combinations of Planck data products, depending on the specific effect that is analysed. Overall, Planck data constrain the amplitude of PMFs to less than a few nanogauss, with different bounds depending on the considered model. In particular, individual limits coming from the analysis of the CMB angular power spectra, using the Planck likelihood, are B_1Mpc < 4.4 nG (where B_1Mpc is the comoving field amplitude at a scale of 1 Mpc) at the 95% confidence level, assuming zero helicity, and B_1Mpc < 5.6 nG when we consider a maximally helical field. For nearly scale-invariant PMFs we obtain B_1Mpc < 2.1 nG, and B_1Mpc < 0.7 nG if the impact of PMFs on the ionization history of the Universe is included in the analysis. From the analysis of magnetically-induced non-Gaussianity we obtain three different values, corresponding to three applied methods, all below 5 nG. The constraint from the magnetically-induced passive-tensor bispectrum is B_1Mpc < 2.8 nG. A search for preferred directions in the magnetically-induced passive bispectrum yields B_1Mpc < 4.5 nG, whereas the compensated-scalar bispectrum gives B_1Mpc < 3 nG. The analysis of the Faraday rotation of CMB polarization by PMFs uses the Planck power spectra in EE and BB at 70 GHz and gives B_1Mpc < 1380 nG. In our final analysis, we consider the harmonic-space correlations produced by Alfvén waves, finding no significant evidence for the presence of these waves. Together, these results comprise a comprehensive set of constraints on possible PMFs with Planck data.

Journal ArticleDOI
TL;DR: The research results indicate that, when all criteria are considered comprehensively, the IPMSM with V-shape PMs is the most satisfactory; its back-electromotive force (EMF), flux leakage coefficient, average torque, torque ripple, cogging torque, power per unit volume, power factor, and flux-weakening ability are investigated.
Abstract: As a kind of traction device, interior permanent-magnet synchronous machines (IPMSMs) are widely used in modern electric vehicles. This paper performs a design and comparative study of IPMSMs with different rotor topologies (spoke-type PMs, tangential-type PMs, U-shape PMs, and V-shape PMs). The research results indicate that, when all criteria are considered comprehensively, the IPMSM with V-shape PMs is the most satisfactory. Furthermore, the IPMSM with V-shape PMs is investigated in detail. The influences of geometrical parameters (the magnetic bridge, the angle between the two V-shape PMs under each pole, etc.) on the performance of the V-shape motor are evaluated based on the finite-element method (FEM). For accurate research, the effects of saturation, cross-magnetization, and the change in PM flux linkage on the d- and q-axis inductances are considered. The back-electromotive force (EMF), flux leakage coefficient, average torque, torque ripple, cogging torque, power per unit volume, power factor, and flux-weakening ability are investigated. The experimental results verify the validity and accuracy of the design process presented in this paper.
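The d- and q-axis inductances discussed above enter the standard dq-frame torque expression for an IPMSM (textbook form, not quoted in the paper's abstract):

    T_e = \frac{3}{2}\, p \left[ \psi_{\mathrm{PM}}\, i_q + \left( L_d - L_q \right) i_d\, i_q \right],

where p is the number of pole pairs and ψ_PM the PM flux linkage; the second (reluctance) term is why the saliency L_d ≠ L_q created by the V-shape magnets, and its behaviour under saturation and cross-magnetization, matters for average torque and flux-weakening ability.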