
Showing papers by "University of Lisbon" published in 2012


Journal ArticleDOI
Georges Aad1, T. Abajyan2, Brad Abbott3, Jalal Abdallah4  +2964 moreInstitutions (200)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented; the observed excess has a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.
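The quoted conversion between significance and background-fluctuation probability is, to good approximation, a one-sided Gaussian tail integral. A quick sanity check (the paper's 1.7×10−9 comes from the full test statistic, so this rounded conversion lands close but not exactly on it):

```python
from math import erfc, sqrt

def significance_to_pvalue(sig: float) -> float:
    """One-sided Gaussian tail probability for a significance in sigma."""
    return 0.5 * erfc(sig / sqrt(2))

# 5.9 standard deviations -> ~1.8e-9, consistent with the quoted 1.7e-9.
p = significance_to_pvalue(5.9)
```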

9,282 citations


Journal ArticleDOI
TL;DR: This review presents why PLGA has been chosen to design nanoparticles as drug delivery systems in various biomedical applications such as vaccination, cancer, inflammation and other diseases.

2,753 citations


Journal ArticleDOI
TL;DR: In this article, theoretical and phenomenological aspects of two-Higgs-doublet extensions of the Standard Model are discussed and a careful study of spontaneous CP violation is presented, including an analysis of the conditions which have to be satisfied in order for a vacuum to violate CP.

2,395 citations


Journal ArticleDOI
TL;DR: This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial to the present, including signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms.
Abstract: Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, spectra measured by HSCs are mixtures of the spectra of materials in a scene. Thus, accurate estimation requires unmixing. Pixels are assumed to be mixtures of a few materials, called endmembers. Unmixing involves estimating all or some of: the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem because of model inaccuracies, observation noise, environmental conditions, endmember variability, and data set size. Researchers have devised and investigated many models searching for robust, stable, tractable, and accurate unmixing algorithms. This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial to the present. Mixing models are first discussed. Signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms are described. Mathematical problems and potential solutions are described. Algorithm characteristics are illustrated experimentally.
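Under the linear mixing model sketched in the abstract, each pixel spectrum is approximately a nonnegative, sum-to-one combination of endmember spectra. A minimal illustration on synthetic data; the endmembers, abundances, and the simple projected-gradient solver below are all hypothetical stand-ins, not the algorithms surveyed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 endmember spectra over 50 bands (columns of M).
M = rng.uniform(0.1, 1.0, size=(50, 3))
true_a = np.array([0.6, 0.3, 0.1])             # abundances sum to one
pixel = M @ true_a + rng.normal(0, 0.001, 50)  # observed mixed spectrum

def unmix(M, y, iters=5000, lr=1e-2):
    """Abundance estimate under nonnegativity and sum-to-one constraints,
    via projected gradient descent on the least-squares misfit."""
    a = np.full(M.shape[1], 1.0 / M.shape[1])
    for _ in range(iters):
        a = a - lr * (M.T @ (M @ a - y))  # gradient step
        a = np.clip(a, 0.0, None)         # enforce nonnegativity
        a /= a.sum()                       # rescale onto the simplex
    return a

a_hat = unmix(M, pixel)
```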

2,373 citations


Posted Content
TL;DR: This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial [1] to the present.

1,808 citations


Journal ArticleDOI
TL;DR: In this article, Advanced Camera for Surveys, NICMOS and Keck adaptive-optics-assisted photometry of 20 Type Ia supernovae (SNe Ia) from the Hubble Space Telescope (HST) Cluster Supernova Survey was presented.
Abstract: We present Advanced Camera for Surveys, NICMOS, and Keck adaptive-optics-assisted photometry of 20 Type Ia supernovae (SNe Ia) from the Hubble Space Telescope (HST) Cluster Supernova Survey. The SNe Ia were discovered over the redshift interval 0.623 < z < 1.415. We describe how such a sample of z > 1 SNe Ia could be efficiently obtained by targeting cluster fields with WFC3 on board HST. The updated supernova Union2.1 compilation of 580 SNe is available at http://supernova.lbl.gov/Union.

1,784 citations


Journal ArticleDOI
TL;DR: This paper describes the novel object recognition paradigm in animals as a valuable measure of cognition and reviews the neurobiology and methodological modifications of the test commonly used in behavioral pharmacology.
Abstract: Animal models of memory have been the subject of many scientific publications since at least the beginning of the twentieth century. In humans, memory is often assessed through spoken or written language, while in animals, cognitive functions must be assessed through different kinds of behavior in many specific experimental models of memory and learning. Among them, the novel object recognition test evaluates memory by the differences in the exploration time of novel and familiar objects. Its application is not limited to one field of research and enables various issues to be studied, such as memory and learning, the preference for novelty, the influence of different brain regions on the process of recognition, and even the effects of different drugs. This paper describes the novel object recognition paradigms in animals as a valuable measure of cognition. The purpose of this work was to review the neurobiology and methodological modifications of the test commonly used in behavioral pharmacology.
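The exploration-time comparison described above is often summarized as a discrimination index. The formula below is one common variant used in the literature, not a prescription from this review:

```python
def discrimination_index(novel_s: float, familiar_s: float) -> float:
    """(TN - TF) / (TN + TF): positive values indicate novelty preference.

    Exact definitions vary between laboratories; this is one common form.
    """
    total = novel_s + familiar_s
    if total == 0:
        raise ValueError("no exploration recorded")
    return (novel_s - familiar_s) / total

# Hypothetical trial: 30 s on the novel object, 20 s on the familiar one.
di = discrimination_index(30.0, 20.0)  # -> 0.2
```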

1,635 citations


Journal ArticleDOI
TL;DR: A refined theory of basic individual values intended to provide greater heuristic and explanatory power than the original theory of 10 values is proposed and analyses of predictive validity demonstrate that the refined values theory provides greater and more precise insight into the value underpinnings of beliefs.
Abstract: We propose a refined theory of basic individual values intended to provide greater heuristic and explanatory power than the original theory of 10 values (Schwartz, 1992). The refined theory more accurately expresses the central assumption of the original theory that research has largely ignored: Values form a circular motivational continuum. The theory defines and orders 19 values on the continuum based on their compatible and conflicting motivations, expression of self-protection versus growth, and personal versus social focus. We assess the theory with a new instrument in 15 samples from 10 countries (N = 6,059). Confirmatory factor and multidimensional scaling analyses support discrimination of the 19 values, confirming the refined theory. Multidimensional scaling analyses largely support the predicted motivational order of the values. Analyses of predictive validity demonstrate that the refined values theory provides greater and more precise insight into the value underpinnings of beliefs. Each value correlates uniquely with external variables.

1,585 citations


Journal ArticleDOI
TL;DR: This paper introduces a new supervised segmentation algorithm for remotely sensed hyperspectral image data which integrates the spectral and spatial information in a Bayesian framework and represents an innovative contribution in the literature.
Abstract: This paper introduces a new supervised segmentation algorithm for remotely sensed hyperspectral image data which integrates the spectral and spatial information in a Bayesian framework. A multinomial logistic regression (MLR) algorithm is first used to learn the posterior probability distributions from the spectral information, using a subspace projection method to better characterize noise and highly mixed pixels. Then, contextual information is included using a multilevel logistic Markov random field (Markov-Gibbs) prior. Finally, a maximum a posteriori segmentation is efficiently computed by a min-cut-based integer optimization algorithm. The proposed segmentation approach is experimentally evaluated using both simulated and real hyperspectral data sets, exhibiting state-of-the-art performance when compared with recently introduced hyperspectral image classification methods. The integration of subspace projection methods with the MLR algorithm, combined with the use of spatial-contextual information, represents an innovative contribution to the literature. This approach is shown to provide accurate characterization of hyperspectral imagery in both the spectral and the spatial domain.
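The pipeline described here, per-pixel class posteriors followed by a spatially smoothed MAP labeling, can be mimicked on toy data. Everything below is an illustrative stand-in: the noisy log-probabilities replace the MLR posteriors, and iterated conditional modes replaces the min-cut optimizer used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 20x20 two-class image; noisy per-pixel log-probabilities
# stand in for the learned MLR posteriors.
labels_true = np.zeros((20, 20), dtype=int)
labels_true[:, 10:] = 1
logp = np.where(labels_true[..., None] == np.arange(2),
                np.log(0.6), np.log(0.4))       # P(correct class) = 0.6
logp = logp + rng.normal(0, 0.5, logp.shape)    # observation noise

def icm_segment(logp, beta=1.5, sweeps=5):
    """MAP-style segmentation: data term plus Potts smoothness term,
    optimized with iterated conditional modes (a simple stand-in for a
    min-cut solver)."""
    lab = logp.argmax(-1)
    H, W, K = logp.shape
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                nbr = [lab[x, y] for x, y in
                       ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                       if 0 <= x < H and 0 <= y < W]
                score = [logp[i, j, k] + beta * sum(n == k for n in nbr)
                         for k in range(K)]
                lab[i, j] = int(np.argmax(score))
    return lab

seg = icm_segment(logp)
```

The spatial prior cleans up isolated misclassifications that the noisy per-pixel data term alone would leave behind.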

678 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyze the perceptions of traditional agriculture in Europe and their influence on land management policies and argue that, contrary to the common perception, traditional agriculture practices were not environmentally friendly and that the standards of living of rural populations were low.
Abstract: For millennia, mankind has shaped landscapes, particularly through agriculture. In Europe, the age-old interaction between humans and ecosystems strongly influenced the cultural heritage. Yet European farmland is now being abandoned, especially in remote areas. The loss of the traditional agricultural landscapes and its consequences for biodiversity and ecosystem services is generating concerns in both the scientific community and the public. Here we ask to what extent farmland abandonment can be considered as an opportunity for rewilding ecosystems. We analyze the perceptions of traditional agriculture in Europe and their influence on land management policies. We argue that, contrary to the common perception, traditional agriculture practices were not environmentally friendly and that the standards of living of rural populations were low. We suggest that current policies to maintain extensive farming landscapes underestimate the human labor needed to sustain these landscapes and the recent and future dynamics of the socio-economic drivers behind abandonment. We examine the potential benefits for ecosystems and people from rewilding. We identify species that could benefit from land abandonment and forest regeneration and the ecosystem services that could be provided such as carbon sequestration and recreation. Finally, we discuss the challenges associated with rewilding, including the need to maintain open areas, the fire risks, and the conflicts between people and wildlife. Despite these challenges, we argue that rewilding should be recognized by policy-makers as one of the possible land management options in Europe, particularly in marginal areas.

624 citations


Journal ArticleDOI
TL;DR: The term "bioturbation" is frequently used to describe how living organisms affect the substratum in which they live as discussed by the authors, and it has been used in aquatic scientific disciplines to describe all transport processes carried out by animals that directly or indirectly affect sediment matrices.
Abstract: The term 'bioturbation' is frequently used to describe how living organisms affect the substratum in which they live. A closer look at the aquatic science literature reveals, however, an inconsistent usage of the term, with increasing perplexity in recent years. Faunal disturbance has often been referred to as particle reworking, while water movement (if considered) is referred to as bioirrigation in many cases. For consistency, we therefore propose that, for contemporary aquatic scientific disciplines, faunal bioturbation in aquatic environments includes all transport processes carried out by animals that directly or indirectly affect sediment matrices. These processes include both particle reworking and burrow ventilation. With this definition, bioturbation acts as an 'umbrella' term that covers all transport processes and their physical effects on the substratum. Particle reworking occurs through burrow construction and maintenance, as well as ingestion and defecation, and causes biomixing of the substratum. Organic matter and microorganisms are thus displaced vertically and laterally within the sediment matrix. Particle-reworking animals can be categorized as biodiffusors, upward conveyors, downward conveyors and regenerators depending on their behaviour, life style and feeding type. Burrow ventilation occurs when animals flush their open- or blind-ended burrows with overlying water for respiratory and feeding purposes, and it causes advective or diffusive bioirrigation exchange of solutes between the sediment pore water and the overlying water body. Many bioturbating species perform reworking and ventilation simultaneously. We also propose that the effects of bioturbation on other organisms and associated processes (e.g. microbially driven biogeochemical transformations) are considered within the conceptual framework of ecosystem engineering.

Journal ArticleDOI
Georges Aad1, Brad Abbott2, Jalal Abdallah3, S. Abdel Khalek  +3081 moreInstitutions (197)
TL;DR: A combined search for the Standard Model Higgs boson with the ATLAS experiment at the LHC using datasets corresponding to integrated luminosities from 1.04 fb(-1) to 4.9 fb(-1) of pp collisions is described in this paper.

Journal ArticleDOI
28 Sep 2012-Cell
TL;DR: In this paper, the authors sequenced the methylome of three haploid cell types from developing pollen: the sperm cell, the vegetative cell and their precursor, the postmeiotic microspore, and found that unlike in mammals the plant germline retains CG and CHG DNA methylation.

Journal ArticleDOI
TL;DR: These findings offer in vivo validation of the concept that exosome secretion can exert key pathophysiologic roles during tumor formation and progression, but they also highlight the idiosyncratic character of the tumor context.
Abstract: During progression from single cancer cells to a tumor mass and metastases, tumor cells send signals that can subvert their tissue microenvironment. These signals involve soluble molecules and various extracellular vesicles, including a particular type termed exosomes. The specific roles of exosomes secreted in the tumor microenvironment, however, are unclear. The small GTPases RAB27A and RAB27B regulate exocytosis of multivesicular endosomes, which leads to exosome secretion, in human HeLa cells. Here, we used mouse models to show that Rab27a blockade in mammary carcinoma cells decreased secretion of exosomes characterized by endocytic markers, but also of matrix metalloproteinase 9, which is not associated with exosomes. Rab27a blockade resulted in decreased primary tumor growth and lung dissemination of a metastatic carcinoma (4T1), but not of a nonmetastatic carcinoma (TS/A). Local growth of 4T1 tumors required mobilization of a population of neutrophil immune cells induced by Rab27a-dependent secretion of exosomes together with a specific combination of cytokines and/or metalloproteinases. Our findings offer in vivo validation of the concept that exosome secretion can exert key pathophysiologic roles during tumor formation and progression, but they also highlight the idiosyncratic character of the tumor context.

Journal ArticleDOI
TL;DR: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole genome sequencing approaches, and associated epidemiological data.
Abstract: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for a global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph overlaying the query results of any other epidemiological data available. PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net .
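The minimum-spanning-tree idea behind goeBURST can be sketched on toy allelic profiles. The profiles and the plain Prim construction below are illustrative only; goeBURST layers additional tie-break rules on top of this basic MST, and PHYLOViZ's actual implementation is not shown here:

```python
# Hypothetical 7-locus MLST allelic profiles (sequence type -> alleles).
profiles = {
    "ST1": (1, 3, 1, 1, 1, 1, 3),
    "ST2": (1, 3, 1, 1, 1, 1, 1),
    "ST3": (1, 3, 4, 1, 1, 1, 1),
    "ST4": (2, 3, 4, 1, 1, 1, 1),
}

def hamming(a, b):
    """Number of differing loci between two allelic profiles."""
    return sum(x != y for x, y in zip(a, b))

def minimum_spanning_tree(profiles):
    """Prim's algorithm over pairwise Hamming distances."""
    nodes = list(profiles)
    in_tree, edges = {nodes[0]}, []
    while len(in_tree) < len(nodes):
        # Cheapest edge leaving the current tree (ties broken by name).
        d, u, v = min((hamming(profiles[u], profiles[v]), u, v)
                      for u in in_tree for v in nodes if v not in in_tree)
        in_tree.add(v)
        edges.append((u, v, d))
    return edges

tree = minimum_spanning_tree(profiles)
# Each edge links a sequence type to its closest relative in the tree.
```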

Journal ArticleDOI
Georges Aad1, Brad Abbott2, J. Abdallah3, S. Abdel Khalek4  +3073 moreInstitutions (193)
TL;DR: In this paper, a Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>.
Abstract: Differential measurements of charged particle azimuthal anisotropy are presented for lead-lead collisions at root sNN = 2.76 TeV with the ATLAS detector at the LHC, based on an integrated luminosity of approximately 8 mu b(-1). This anisotropy is characterized via a Fourier expansion of the distribution of charged particles in azimuthal angle relative to the reaction plane, with the coefficients v(n) denoting the magnitude of the anisotropy. Significant v(2)-v(6) values are obtained as a function of transverse momentum (0.5 < p(T) < 20 GeV), pseudorapidity (|eta| < 2.5), and centrality using an event-plane method. The v(n) values for n >= 3 are found to vary weakly with both eta and centrality, and their p(T) dependencies are found to follow an approximate scaling relation, v(n)(1/n)(p(T)) proportional to v(2)(1/2)(p(T)), except in the top 5% most central collisions. A Fourier analysis of the charged particle pair distribution in relative azimuthal angle (Delta phi = phi(a) - phi(b)) is performed to extract the coefficients v(n,n) = <cos(n Delta phi)>. For pairs of charged particles with a large pseudorapidity gap (|Delta eta| = |eta(a) - eta(b)| > 2) and one particle with p(T) < 3 GeV, the v(2,2)-v(6,6) values are found to factorize as v(n,n)(p(T)(a), p(T)(b)) approximate to v(n)(p(T)(a))v(n)(p(T)(b)) in central and midcentral events. Such factorization suggests that these values of v(2,2)-v(6,6) are primarily attributable to the response of the created matter to the fluctuations in the geometry of the initial state. A detailed study shows that the v(1,1)(p(T)(a), p(T)(b)) data are consistent with the combined contributions from a rapidity-even v(1) and global momentum conservation. A two-component fit is used to extract the v(1) contribution. The extracted v(1) is observed to cross zero at p(T) approximately 1.0 GeV, reaches a maximum at 4-5 GeV with a value comparable to that for v(3), and decreases at higher p(T).
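The two-particle coefficients described here, v(n,n) = <cos(n Delta phi)>, and their factorization into v(n) values can be estimated from a toy sample of azimuthal angles. The generated distribution and sample sizes below are assumptions for illustration, not ATLAS data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy event sample: angles drawn from dN/dphi proportional to
# 1 + 2*v2*cos(2*phi) (reaction plane at phi = 0), via accept-reject.
v2_true = 0.1
phi = rng.uniform(-np.pi, np.pi, 200_000)
accept = rng.uniform(0.0, 1.0 + 2 * v2_true, phi.size)
phi = phi[accept < 1.0 + 2 * v2_true * np.cos(2 * phi)]

def vnn(phi_a, phi_b, n):
    """v_{n,n} = <cos(n*(phi_a - phi_b))> averaged over all a-b pairs.

    Uses the identity <cos n(a-b)> = <cos na><cos nb> + <sin na><sin nb>,
    which avoids forming the full pair matrix."""
    return (np.cos(n * phi_a).mean() * np.cos(n * phi_b).mean()
            + np.sin(n * phi_a).mean() * np.sin(n * phi_b).mean())

half = phi.size // 2
v22 = vnn(phi[:half], phi[half:], 2)  # factorizes as ~v2**2
```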

Journal ArticleDOI
R. K. Saito1, Maren Hempel1, Dante Minniti1, Dante Minniti2, Philip W. Lucas3, Marina Rejkuba4, Ignacio Toledo5, Oscar A. Gonzalez4, Javier Alonso-García1, Mike Irwin6, Eduardo Gonzalez-Solares6, Simon Hodgkin6, James R. Lewis6, Nicholas Cross7, Valentin D. Ivanov4, Eamonn Kerins8, Jim Emerson9, M. Soto10, E. B. Amôres11, Sebastián Gurovich12, I. Dékány1, R. Angeloni1, Juan Carlos Beamin1, Márcio Catelan1, Nelson Padilla1, Manuela Zoccali13, Manuela Zoccali1, P. Pietrukowicz14, C. Moni Bidin15, Francesco Mauro15, Doug Geisler15, S. L. Folkes16, Stuart E. Sale16, Stuart E. Sale1, Jura Borissova16, Radostin Kurtev16, Andrea Veronica Ahumada4, Andrea Veronica Ahumada17, M. V. Alonso12, M. V. Alonso17, A. Adamson, Julia Ines Arias10, Reba M. Bandyopadhyay18, Rodolfo H. Barbá19, Rodolfo H. Barbá10, Beatriz Barbuy20, Gustavo Baume21, Luigi R. Bedin13, Andrea Bellini22, Robert A. Benjamin23, Eduardo Luiz Damiani Bica24, Charles Jose Bonatto24, Leonardo Bronfman25, Giovanni Carraro4, André-Nicolas Chené15, André-Nicolas Chené16, Juan J. Clariá17, J. R. A. Clarke16, Carlos Contreras3, A. Corvillon1, R. de Grijs26, R. de Grijs27, Bruno Dias20, Janet E. Drew3, C. Farina21, Carlos Feinstein21, E. Fernández-Lajús21, Roberto Claudio Gamen21, Wolfgang Gieren15, Bertrand Goldman28, Carlos González-Fernández29, R. J. J. Grand30, G. Gunthardt17, Nigel Hambly7, Margaret M. Hanson31, Krzysztof G. Hełminiak1, Melvin G. Hoare32, L. Huckvale8, Andrés Jordán1, Karen Kinemuchi33, A. Longmore34, Martin Lopez-Corredoira35, Martin Lopez-Corredoira36, Thomas J. Maccarone37, Daniel J. Majaess38, Eric Martin35, N. Masetti, Ronald E. Mennickent15, I. F. Mirabel, Lorenzo Monaco4, Lorenzo Morelli22, Veronica Motta16, T. Palma17, M. C. Parisi17, Quentin A. Parker39, Quentin A. Parker40, F. Peñaloza16, Grzegorz Pietrzyński14, Grzegorz Pietrzyński15, Giuliano Pignata41, Bogdan Popescu31, Mike Read7, A. F. Rojas1, Alexandre Roman-Lopes10, Maria Teresa Ruiz25, Ivo Saviane4, Matthias R. Schreiber16, A. 
C. Schröder42, Saurabh Sharma43, Saurabh Sharma16, Michael D. Smith44, Laerte Sodré20, Joseph J. Stead32, Andrew W. Stephens, Motohide Tamura, C. Tappert16, Mark Thompson3, Elena Valenti4, Leonardo Vanzi1, Nicholas A. Walton6, W. A. Weidmann17, Albert A. Zijlstra8 
TL;DR: The ESO public survey VISTA Variables in the Vía Láctea (VVV) started in 2010 and is expected to run for about five years.
Abstract: Context: The ESO public survey VISTA Variables in the Vía Láctea (VVV) started in 2010. VVV targets 562 sq. deg. in the Galactic bulge and an adjacent plane region and is expected to run for about five years. Aims: We describe the progress of the survey observations in the first observing season, the observing strategy, and the quality of the data obtained. Methods: The observations are carried out on the 4-m VISTA telescope in the ZYJHKs filters. In addition to the multi-band imaging, the variability monitoring campaign in the Ks filter has started. Data reduction is carried out using the pipeline at the Cambridge Astronomical Survey Unit. The photometric and astrometric calibration is performed via the numerous 2MASS sources observed in each pointing. Results: The first data release contains the aperture photometry and astrometric catalogues for 348 individual pointings in the ZYJHKs filters taken in the 2010 observing season. The typical image quality is 0.9-1.0 arcsec. The stringent photometric and image quality requirements of the survey are satisfied in 100% of the JHKs images in the disk area and 90% of the JHKs images in the bulge area. The completeness in the Z and Y images is 84% in the disk and 40% in the bulge. The first season catalogues contain 1.28 × 10^8 stellar sources in the bulge and 1.68 × 10^8 in the disk area detected in at least one of the photometric bands. The combined, multi-band catalogues contain more than 1.63 × 10^8 stellar sources. About 10% of these are double detections because of overlapping adjacent pointings. These overlapping multiple detections are used to characterise the quality of the data. The images in the JHKs bands extend typically 4 mag deeper than 2MASS. The magnitude limit and photometric quality depend strongly on crowding in the inner Galactic regions. The astrometry for Ks = 15-18 mag has rms 35-175 mas. Conclusions: The VVV Survey data products offer a unique dataset to map the stellar populations in the Galactic bulge and the adjacent plane and provide an exciting new tool for the study of the structure, content, and star-formation history of our Galaxy, as well as for investigations of the newly discovered star clusters, star-forming regions in the disk, high proper motion stars, asteroids, planetary nebulae, and other interesting objects. Based on observations taken within the ESO VISTA Public Survey VVV, Programme ID 179.B-2002.

Journal ArticleDOI
Georges Aad1, Georges Aad2, Brad Abbott2, Brad Abbott3  +5592 moreInstitutions (189)
TL;DR: The ATLAS trigger system as discussed by the authors selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy.
Abstract: Proton-proton collisions at root s = 7 TeV and heavy ion collisions at root(NN)-N-s = 2.76 TeV were produced by the LHC and recorded using the ATLAS experiment's trigger system in 2010. The LHC is designed with a maximum bunch crossing rate of 40 MHz and the ATLAS trigger system is designed to record approximately 200 of these per second. The trigger system selects events by rapidly identifying signatures of muon, electron, photon, tau lepton, jet, and B meson candidates, as well as using global event signatures, such as missing transverse energy. An overview of the ATLAS trigger system, the evolution of the system during 2010 and the performance of the trigger system components and selections based on the 2010 collision data are shown. A brief outline of plans for the trigger system in 2011 is presented.

Journal ArticleDOI
TL;DR: The most important sources of atmospheric moisture at the global scale are identified, both oceanic and terrestrial, and a characterization is made of how continental regions are influenced by water from different moisture source regions as discussed by the authors.
Abstract: [1] The most important sources of atmospheric moisture at the global scale are herein identified, both oceanic and terrestrial, and a characterization is made of how continental regions are influenced by water from different moisture source regions. The methods used to establish source-sink relationships of atmospheric water vapor are reviewed, and the advantages and caveats associated with each technique are discussed. The methods described include analytical and box models, numerical water vapor tracers, and physical water vapor tracers (isotopes). In particular, consideration is given to the wide range of recently developed Lagrangian techniques suitable both for evaluating the origin of water that falls during extreme precipitation events and for establishing climatologies of moisture source-sink relationships. As far as oceanic sources are concerned, the subtropical northern Atlantic Ocean plays a particularly important role, providing moisture for precipitation to the largest continental area, extending from Mexico to parts of Eurasia, and even to the South American continent during the Northern Hemisphere winter. In contrast, the influence of the southern Indian Ocean and North Pacific Ocean sources extends only over smaller continental areas. The South Pacific and the Indian Ocean represent the principal source of moisture for both Australia and Indonesia. Some landmasses only receive moisture from the evaporation that occurs in the same hemisphere (e.g., northern Europe and eastern North America), while others receive moisture from both hemispheres with large seasonal variations (e.g., northern South America). The monsoonal regimes in India, tropical Africa, and North America are provided with moisture from a large number of regions, highlighting the complexities of the global patterns of precipitation.
Some very important contributions are also seen from relatively small areas of ocean, such as the Mediterranean Basin (important for Europe and North Africa) and the Red Sea, which provides water for a large area between the Gulf of Guinea and Indochina (summer) and between the African Great Lakes and Asia (winter). The geographical regions of Eurasia, North and South America, and Africa, and also the internationally important basins of the Mississippi, Amazon, Congo, and Yangtze Rivers, are also considered, as is the importance of terrestrial sources in monsoonal regimes. The role of atmospheric rivers, and particularly their relationship with extreme events, is discussed. Droughts can be caused by the reduced supply of water vapor from oceanic moisture source regions. Some of the implications of climate change for the hydrological cycle are also reviewed, including changes in water vapor concentrations, precipitation, soil moisture, and aridity. It is important to achieve a combined diagnosis of moisture sources using all available information, including stable water isotope measurements. A summary is given of the major research questions that remain unanswered, including (1) the lack of a full understanding of how moisture sources influence precipitation isotopes; (2) the stationarity of moisture sources over long periods; (3) the way in which possible changes in the intensity of the sources (where evaporation exceeds precipitation to a greater or lesser degree) and in their locations could affect the distribution of continental precipitation in a changing climate; and (4) the role played by the main modes of climate variability, such as the North Atlantic Oscillation or the El Nino-Southern Oscillation, in the variability of the moisture source regions, as well as a full evaluation of the moisture transported by low-level jets and atmospheric rivers.

Journal ArticleDOI
TL;DR: In this article, the authors estimate Tmax, Tmin, and Tavg for a 10-year period from MODIS LST data and auxiliary data, using a statistical optimization procedure with mixed bootstrap and jackknife resampling.
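A bootstrap layer over an ordinary least-squares calibration gives the flavor of the resampling step mentioned above. The data, linear model, and plain bootstrap below are hypothetical simplifications, not the paper's mixed bootstrap/jackknife procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical calibration sample: station air temperature vs. satellite
# land surface temperature (LST), with a known linear relation plus noise.
lst = rng.uniform(5, 35, 200)
tair = 0.8 * lst + 2.0 + rng.normal(0, 1.0, 200)

def bootstrap_fit(x, y, n_boot=500):
    """Bootstrap resampling of an OLS fit: refit on resampled pairs to
    get a spread for the slope and intercept estimates."""
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, x.size, x.size)  # resample with replacement
        coefs.append(np.polyfit(x[idx], y[idx], 1))
    return np.array(coefs)                     # rows of (slope, intercept)

coefs = bootstrap_fit(lst, tair)
slope_mean, slope_sd = coefs[:, 0].mean(), coefs[:, 0].std()
```

The bootstrap spread serves as an uncertainty estimate for the fitted coefficients without assuming a noise model.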

Journal ArticleDOI
Richard Anney1, Lambertus Klei2, Dalila Pinto3, Dalila Pinto4, Joana Almeida, Elena Bacchelli5, Gillian Baird6, Nadia Bolshakova1, Sven Bölte7, Patrick Bolton8, Thomas Bourgeron9, Thomas Bourgeron10, Sean Brennan1, Jessica Brian3, Jillian P. Casey11, Judith Conroy11, Catarina Correia12, Catarina Correia13, Christina Corsello14, Emily L. Crawford15, Maretha de Jonge16, Richard Delorme, Eftichia Duketis7, Frederico Duque, Annette Estes17, Penny Farrar18, Bridget A. Fernandez19, Susan E. Folstein20, Eric Fombonne21, John R. Gilbert20, Christopher Gillberg22, Joseph T. Glessner23, Andrew Green11, Jonathan Green24, Stephen J. Guter25, Elizabeth A. Heron1, Richard Holt18, Jennifer L. Howe3, Gillian Hughes1, Vanessa Hus14, Roberta Igliozzi, Suma Jacob25, Graham Kenny1, Cecilia Kim23, Alexander Kolevzon4, Vlad Kustanovich, Clara Lajonchere, Janine A. Lamb24, Miriam Law-Smith1, Marion Leboyer9, Ann Le Couteur26, Bennett L. Leventhal27, Bennett L. Leventhal28, Xiao-Qing Liu29, Frances Lombard1, Catherine Lord30, Linda Lotspeich31, Sabata C. Lund15, Tiago R. Magalhaes12, Tiago R. Magalhaes13, Carine Mantoulan32, Christopher J. McDougle33, Christopher J. McDougle34, Nadine M. Melhem2, Alison K. Merikangas1, Nancy J. Minshew2, Ghazala Mirza18, Jeff Munson17, Carolyn Noakes3, Gudrun Nygren22, Katerina Papanikolaou35, Alistair T. Pagnamenta18, Barbara Parrini, Tara Paton3, Andrew Pickles24, David J. Posey33, Fritz Poustka7, Jiannis Ragoussis18, Regina Regan11, Wendy Roberts3, Kathryn Roeder36, Bernadette Rogé32, Michael Rutter37, Sabine Schlitt7, Naisha Shah11, Val C. Sheffield38, Latha Soorya4, Inês Sousa18, Vera Stoppioni, Nuala Sykes18, Raffaella Tancredi, Ann P. Thompson39, Susanne Thomson15, Ana Tryfon4, John Tsiantis35, Herman van Engeland16, John B. Vincent3, Fred R. Volkmar40, Jacob A. S. Vorstman16, Simon Wallace18, Kirsty Wing18, Kerstin Wittemeyer18, Shawn Wood2, Danielle Zurawiecki4, Lonnie Zwaigenbaum41, Anthony J. Bailey42, Agatino Battaglia, Rita M. 
Cantor43, Hilary Coon44, Michael L. Cuccaro20, Geraldine Dawson45, Geraldine Dawson46, Sean Ennis11, Christine M. Freitag7, Daniel H. Geschwind43, Jonathan L. Haines47, Sabine M. Klauck48, William M. McMahon44, Elena Maestrini5, Judith Miller44, Judith Miller23, Anthony P. Monaco49, Anthony P. Monaco18, Stanley F. Nelson43, John I. Nurnberger33, Guiomar Oliveira, Jeremy R. Parr26, Margaret A. Pericak-Vance20, Joseph Piven46, Gerard D. Schellenberg23, Stephen W. Scherer3, Astrid M. Vicente13, Astrid M. Vicente12, Thomas H. Wassink38, Ellen M. Wijsman17, Catalina Betancur50, Catalina Betancur51, Catalina Betancur52, Joseph D. Buxbaum4, Edwin H. Cook25, Louise Gallagher1, Michael Gill1, Joachim Hallmayer31, Andrew D. Paterson3, James S. Sutcliffe15, Peter Szatmari39, Veronica J. Vieland53, Hakon Hakonarson23, Bernie Devlin2 
Trinity College, Dublin1, University of Pittsburgh2, University of Toronto3, Icahn School of Medicine at Mount Sinai4, University of Bologna5, Guy's and St Thomas' NHS Foundation Trust6, Goethe University Frankfurt7, King's College London8, University of Paris9, Pasteur Institute10, University College Dublin11, Instituto Gulbenkian de Ciência12, University of Lisbon13, University of Michigan14, Vanderbilt University15, Utrecht University16, University of Washington17, University of Oxford18, Memorial University of Newfoundland19, University of Miami20, McGill University21, University of Gothenburg22, University of Pennsylvania23, University of Manchester24, University of Illinois at Chicago25, Newcastle University26, Nathan Kline Institute for Psychiatric Research27, New York University28, University of Manitoba29, Cornell University30, Stanford University31, University of Toulouse32, Indiana University33, Harvard University34, National and Kapodistrian University of Athens35, Carnegie Mellon University36, Medical Research Council37, University of Iowa38, McMaster University39, Yale University40, University of Alberta41, University of British Columbia42, University of California, Los Angeles43, University of Utah44, Autism Speaks45, University of North Carolina at Chapel Hill46, Veterans Health Administration47, German Cancer Research Center48, Tufts University49, Centre national de la recherche scientifique50, Pierre-and-Marie-Curie University51, French Institute of Health and Medical Research52, Ohio State University53
TL;DR: Stage 2 of the Autism Genome Project genome-wide association study is reported, adding 1301 ASD families and bringing the total to 2705 families analysed; the results support the conclusion that common variants affect the risk for ASD but that their individual effects are modest.
Abstract: While it is apparent that rare variation can play an important role in the genetic architecture of autism spectrum disorders (ASDs), the contribution of common variation to the risk of developing ASD is less clear. To produce a more comprehensive picture, we report Stage 2 of the Autism Genome Project genome-wide association study, adding 1301 ASD families and bringing the total to 2705 families analysed (Stages 1 and 2). In addition to evaluating the association of individual single nucleotide polymorphisms (SNPs), we also sought evidence that common variants, en masse, might affect the risk. Despite genotyping over a million SNPs covering the genome, no single SNP shows significant association with ASD or selected phenotypes at a genome-wide level. The SNP that achieves the smallest P-value from secondary analyses is rs1718101. It falls in CNTNAP2, a gene previously implicated in susceptibility for ASD. This SNP also shows modest association with age of word/phrase acquisition in ASD subjects, of interest because features of language development are also associated with other variation in CNTNAP2. In contrast, allele scores derived from the transmission of common alleles to Stage 1 cases significantly predict case status in the independent Stage 2 sample. Despite being significant, the variance explained by these allele scores was small (Vm < 1%). Based on results from individual SNPs and their en masse effect on risk, as inferred from the allele score results, it is reasonable to conclude that common variants affect the risk for ASD but their individual effects are modest.
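The allele-score approach described in the abstract — weighting each transmitted common allele by its effect estimated in the Stage 1 discovery sample and summing per individual to predict case status in Stage 2 — can be sketched as follows. The function, SNP identifiers, weights, and toy genotypes below are all illustrative, not taken from the AGP study.

```python
# Sketch of an allele-score (polygenic score) computation: each SNP's allele
# count (0, 1, or 2 copies of the risk allele) is weighted by an effect size
# estimated in a discovery sample, then summed per individual.
# All names and values are invented for illustration.

def allele_score(genotypes, weights):
    """genotypes: dict snp_id -> risk-allele count (0, 1, or 2);
    weights: dict snp_id -> per-allele weight (e.g. log-odds) from discovery."""
    return sum(weights[snp] * count
               for snp, count in genotypes.items() if snp in weights)

# Toy Stage 1 weights and two Stage 2 individuals.
stage1_weights = {"rs0001": 0.05, "rs0002": -0.02, "rs0003": 0.11}
case = {"rs0001": 2, "rs0002": 0, "rs0003": 1}
control = {"rs0001": 0, "rs0002": 2, "rs0003": 0}

print(allele_score(case, stage1_weights))     # 0.21
print(allele_score(control, stage1_weights))  # -0.04
```

Case status is then predicted from the score (e.g. by logistic regression); the abstract's point is that such scores predict significantly even though no single SNP reaches genome-wide significance.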

Journal ArticleDOI
TL;DR: This companion manuscript provides a practical guide framework for the appropriate use and reporting of novel frequency domain (FD) OCT imaging to guide interventional procedures, with particular attention to the comparison with intravascular ultrasound (IVUS).
Abstract: This document is complementary to an Expert Review Document on Optical Coherence Tomography (OCT) for the study of coronary arteries and atherosclerosis.1 The goal of this companion manuscript is to provide a practical guide framework for the appropriate use and reporting of the novel frequency domain (FD) OCT imaging to guide interventional procedures, with particular attention to the comparison with intravascular ultrasound (IVUS).1–4 In the OCT Expert Review Document on Atherosclerosis, a comprehensive description of the physical principles for OCT imaging and time domain (TD) catheters (St Jude Medical, Westford, MA, USA) was provided.1 The main advantage of FD-OCT is that the technology enables rapid imaging of the coronary artery, using a non-occlusive acquisition modality. The FD-OCT catheter (DragonflyTM; St Jude Medical) employs a single-mode optical fibre, enclosed in a hollow metal torque wire that rotates at a speed of 100 r.p.s. It is compatible with a conventional 0.014″ angioplasty guide wire, inserted into a short monorail lumen at the tip. The lateral resolution of FD-OCT is improved compared with TD-OCT, while the axial resolution is unchanged. These features, together with reduced motion artefacts and an increased maximum field of view up to 11 mm, have significantly improved both the quality and ease of use of OCT in the catheterization laboratory.3,4 However, the imaging depth of FD-OCT is still limited to 0.5–2.0 mm.5 The main obstacle to the adoption of TD-OCT imaging in clinical practice is that OCT cannot image through a blood field, and therefore requires clearing or flushing of blood from the lumen.1 The 6 Fr compatible DragonflyTM FD-OCT catheter is so far the only one on the market, as two other systems from Volcano and Terumo, which …

Journal ArticleDOI
TL;DR: The final results of the Phase II SIMPLE measurements are reported, comprising two run stages of 15 superheated droplet detectors each, the second stage including improved neutron shielding, together with a revised nucleation efficiency based on a reanalysis of previously reported monochromatic neutron irradiations.
Abstract: We report the final results of the Phase II SIMPLE measurements, comprising two run stages of 15 superheated droplet detectors each, with the second stage including improved neutron shielding. The analyses include a refined signal analysis and a revised nucleation efficiency based on a reanalysis of previously reported monochromatic neutron irradiations. The combined results yield a contour minimum

Journal ArticleDOI
TL;DR: Reporting of EM stimulation dose should be guided by the principle of reproducibility: sufficient information about the stimulation parameters should be provided so that the dose can be replicated.

Journal ArticleDOI
Georges Aad1, Georges Aad2, Brad Abbott3, Brad Abbott2  +5559 moreInstitutions (188)
TL;DR: In this paper, the performance of the missing transverse momentum reconstruction was evaluated using data collected in pp collisions at a centre-of-mass energy of 7 TeV in 2010.
Abstract: The measurement of missing transverse momentum in the ATLAS detector, described in this paper, makes use of the full event reconstruction and a calibration based on reconstructed physics objects. The performance of the missing transverse momentum reconstruction is evaluated using data collected in pp collisions at a centre-of-mass energy of 7 TeV in 2010. Minimum bias events and events with jets of hadrons are used from data samples corresponding to an integrated luminosity of about 0.3 nb(-1) and 600 nb(-1) respectively, together with events containing a Z boson decaying to two leptons (electrons or muons) or a W boson decaying to a lepton (electron or muon) and a neutrino, from a data sample corresponding to an integrated luminosity of about 36 pb(-1). An estimate of the systematic uncertainty on the missing transverse momentum scale is presented.
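The quantity being calibrated above has a simple definition: the missing transverse momentum is the negative vector sum of the transverse momenta (px, py) of all reconstructed objects in the event, and E_T^miss is its magnitude. A minimal sketch, with an invented toy event (the actual ATLAS reconstruction applies object-specific calibrations on top of this):

```python
import math

# Missing transverse momentum: the negative vector sum of the (px, py) of all
# reconstructed objects; E_T^miss is the magnitude of that vector.
# The toy event below is illustrative only.

def missing_et(objects):
    """objects: list of (px, py) in GeV; returns (mex, mey, E_T^miss)."""
    mex = -sum(px for px, _ in objects)
    mey = -sum(py for _, py in objects)
    return mex, mey, math.hypot(mex, mey)

# A toy W -> lepton + neutrino style event: one lepton and one recoiling jet.
event = [(40.0, 0.0), (-10.0, -20.0)]
mex, mey, met = missing_et(event)
print(met)  # sqrt(30^2 + 20^2) ~ 36.06 GeV, attributed to the neutrino
```

In W → lν events such as those used in the paper, the E_T^miss recovered this way is the experimental proxy for the undetected neutrino's transverse momentum.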

Journal ArticleDOI
TL;DR: In this article, the authors analyse the technical efficiency of Japanese banks from 2000 to 2007, showing that NPLs remain a significant burden on banks' performance and that banks' inputs, particularly labour and premises, have to be utilised more efficiently.
Abstract: The paper analyses the technical efficiency of Japanese banks from 2000 to 2007. The estimation technique is based on the Russell directional distance function, which takes into consideration not only desirable outputs but also an undesirable output, represented by non-performing loans (NPLs). The results indicate that NPLs remain a significant burden on banks' performance. We show that banks' inputs have to be utilised more efficiently, particularly labour and premises. We also argue that a further restructuring process is needed in the segment of Regional Banks. We conclude that the Japanese banking system is still far from being fully consolidated and restructured.
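For context, the directional distance function family that this estimator belongs to can be written as below. This is the standard textbook form with a single expansion factor β; the Russell variant used in the paper generalizes it by giving each input and output its own scaling factor. The notation here is generic, not the paper's own.

```latex
% Directional distance function with inputs x, desirable outputs y, and an
% undesirable output b (here, non-performing loans), relative to the
% technology set T and direction vector g = (g_x, g_y, g_b):
\vec{D}(x, y, b;\, g_x, g_y, g_b)
  = \max \{\, \beta \ge 0 :
      (x - \beta g_x,\; y + \beta g_y,\; b - \beta g_b) \in T \,\}
```

A bank is efficient when \(\vec{D} = 0\): no feasible direction remains that simultaneously contracts inputs and NPLs while expanding desirable outputs.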

Journal ArticleDOI
TL;DR: It is demonstrated that total prokaryotic community structure can be directly correlated to geochemistry within these sediments, thus enhancing the understanding of biogeochemical cycling and the ability to predict metabolisms of uncultured microbes in deep-sea sediments.
Abstract: Microbial communities and their associated metabolic activity in marine sediments have a profound impact on global biogeochemical cycles. Their composition and structure are attributed to geochemical and physical factors, but finding direct correlations has remained a challenge. Here we show a significant statistical relationship between variation in geochemical composition and prokaryotic community structure within deep-sea sediments. We obtained comprehensive geochemical data from two gravity cores near the hydrothermal vent field Loki’s Castle at the Arctic Mid-Ocean Ridge, in the Norwegian-Greenland Sea. Geochemical properties in the rift valley sediments exhibited strong centimeter-scale stratigraphic variability. Microbial populations were profiled by pyrosequencing from 15 sediment horizons (59,364 16S rRNA gene tags), quantitatively assessed by qPCR, and phylogenetically analyzed. Although the same taxa were generally present in all samples, their relative abundances varied substantially among horizons and fluctuated between Bacteria- and Archaea-dominated communities. By independently summarizing covariance structures of the relative abundance data and geochemical data, using principal components analysis, we found a significant correlation between changes in geochemical composition and changes in community structure. Differences in organic carbon and mineralogy shaped the relative abundance of microbial taxa. We used correlations to build hypotheses about energy metabolisms, particularly of the Deep Sea Archaeal Group, specific Deltaproteobacteria, and sediment lineages of potentially anaerobic Marine Group I Archaea. We demonstrate that total prokaryotic community structure can be directly correlated to geochemistry within these sediments, thus enhancing our understanding of biogeochemical cycling and our ability to predict metabolisms of uncultured microbes in deep-sea sediments.
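The core analytical move above — independently summarizing the covariance structure of each data matrix with PCA and then correlating the resulting sample scores — can be sketched in a few lines. The synthetic data below (a shared down-core gradient plus noise) are purely illustrative; the study's matrices were measured geochemistry and 16S relative abundances.

```python
import numpy as np

# Sketch: summarize each data matrix (geochemistry; taxon relative abundances)
# by PCA, then correlate the per-sample scores on the leading component.
# Synthetic data: both matrices track the same depth gradient, plus noise.

def pc_scores(X, n_components=1):
    """Principal-component scores via SVD of the column-centered matrix."""
    Xc = X - X.mean(axis=0)
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_components] * s[:n_components]

rng = np.random.default_rng(0)
depth = np.linspace(0, 1, 15)[:, None]                    # 15 sediment horizons
geochem = depth + 0.1 * rng.standard_normal((15, 6))      # 6 geochemical variables
community = depth + 0.1 * rng.standard_normal((15, 20))   # 20 taxa

r = np.corrcoef(pc_scores(geochem).ravel(),
                pc_scores(community).ravel())[0, 1]
print(abs(r))  # high, since both PC1s capture the shared depth gradient
```

The sign of each PC is arbitrary (an SVD convention), so the magnitude |r| is the meaningful quantity; significance in the study was assessed statistically rather than read off a single correlation.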

Journal ArticleDOI
TL;DR: This paper presents a mixed-integer linear programming formulation that is flexible enough to incorporate most of the reverse network structures plausible in practice; it proposes a multi-commodity formulation and uses a reverse bill of materials to capture component commonality among different products.
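MILP network-design formulations of this kind build on the fixed-charge facility-location model: binary variables open or close recovery facilities, and flows are assigned at minimum cost subject to the opening decisions. A deliberately tiny instance is solved below by brute-force enumeration purely for illustration (a real instance would go to a MILP solver, and the paper's model adds multi-commodity flows and a reverse bill of materials); all costs are invented.

```python
from itertools import product

# Toy fixed-charge location model: decide which recovery centers to open
# (binary y-variables) so that fixed opening costs plus the cost of serving
# each collection point from its cheapest open center are minimized.
# Brute-force enumeration stands in for a MILP solver on this tiny instance.

fixed_cost = [50, 60]          # cost of opening each candidate center
ship_cost = [[10, 40],         # ship_cost[point][center]
             [35, 12],
             [30, 25]]

def total_cost(open_centers):
    if not any(open_centers):
        return float("inf")    # at least one center must be open
    cost = sum(f for f, y in zip(fixed_cost, open_centers) if y)
    for row in ship_cost:
        cost += min(c for c, y in zip(row, open_centers) if y)
    return cost

best = min(product([0, 1], repeat=len(fixed_cost)), key=total_cost)
print(best, total_cost(best))  # (1, 0) 125: opening only center 0 is cheapest
```

The MILP encodes the same logic with linking constraints (a point may only ship to an open center) instead of enumeration, which is what keeps realistic instance sizes tractable.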

Journal ArticleDOI
TL;DR: The current literature is reviewed and existing data is integrated in order to propose possible mechanisms of secretion, cell dysfunction, and death that might open novel avenues for the development of new therapeutic strategies.
Abstract: The aggregation, deposition, and dysfunction of alpha-synuclein (aSyn) are common events in neurodegenerative disorders known as synucleinopathies. These include Parkinson's disease, dementia with Lewy bodies, and multiple system atrophy. A growing body of knowledge on the biology of aSyn is emerging and enabling novel hypotheses to be tested. In particular, the hypothesis that aSyn is secreted from neurons, thus contributing to the spreading of pathology not only in the brain but also in other organs, is gaining momentum. Nevertheless, the precise mechanism(s) of secretion, as well as the consequences of extracellular aSyn species for neighboring cells are still unclear. Here, we review the current literature and integrate existing data in order to propose possible mechanisms of secretion, cell dysfunction, and death. Ultimately, the complete understanding of these processes might open novel avenues for the development of new therapeutic strategies.

Journal ArticleDOI
TL;DR: In this article, a model of the customer relationship outcomes of service customization and of the efficacy of customization is developed; two large-scale, representative, cross-sectional studies in different service industries, based on the European Customer Satisfaction Index framework, are conducted, and PLS path modeling is applied to test the model.
Abstract: Although practitioners and scholars alike embrace service customization as a possibly powerful management instrument, its impact on customer relationships, as well as the contingencies for its effective application, are not well understood. Drawing from relationship marketing and exchange theory, this paper aims to develop a model of the customer relationship outcomes of service customization and of the efficacy of service customization. Two large-scale, representative, cross-sectional studies in different service industries, based on the European Customer Satisfaction Index framework, are conducted, and PLS path modeling is applied to test the model. Customization increases perceived service quality, customer satisfaction, customer trust, and ultimately customer loyalty toward a service provider. Customization has both direct and mediated effects on customer loyalty and interacts with the effects of customer satisfaction and customer trust on loyalty. Service customization is a viable instrument for relationship marketing; its efficacy depends on customer satisfaction and customer trust. While this study focuses solely on the impact of service customization, future research could assess the relative importance of service customization in the presence of other relationship marketing instruments. Service providers can use service customization as an effective instrument for achieving not only higher customer satisfaction but also higher customer loyalty. Service customization is most effective for companies that have deficits in satisfying their customers while their customer relationships are nevertheless characterized by a high level of trust. These results help managers decide on resource allocation to enhance customer satisfaction, trust, and loyalty. This paper investigates the simultaneous effects of service customization on customer loyalty and other relationship variables and offers new insights into the nature and size of customization effects. It fills an important gap in knowledge of customization outcomes and clarifies under which circumstances service customization is most effective. The paper is of value for service providers that face the decision of whether or not to customize their offering.
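The moderation claim above — customization changing how strongly satisfaction and trust translate into loyalty — is estimated in the paper with PLS path modeling. The same structure can be seen in miniature as a regression with an interaction term; the sketch below uses ordinary least squares on synthetic data with invented coefficients, as a conceptual illustration rather than a reproduction of the PLS analysis.

```python
import numpy as np

# Moderated regression sketch: loyalty depends on satisfaction, customization,
# and their interaction (the effect of satisfaction is steeper when the
# service is customized). Synthetic data; all coefficients are invented.

rng = np.random.default_rng(1)
n = 500
satisfaction = rng.standard_normal(n)
customization = rng.standard_normal(n)
loyalty = (0.5 * satisfaction + 0.3 * customization
           + 0.4 * satisfaction * customization
           + 0.1 * rng.standard_normal(n))

# OLS with an interaction column recovers the three effects.
X = np.column_stack([np.ones(n), satisfaction, customization,
                     satisfaction * customization])
beta, *_ = np.linalg.lstsq(X, loyalty, rcond=None)
print(beta.round(2))  # roughly [0.0, 0.5, 0.3, 0.4]
```

A nonzero interaction coefficient (here ~0.4) is the regression analogue of the paper's finding that customization's efficacy is contingent on satisfaction and trust; PLS path modeling additionally handles the latent, multi-item nature of these constructs.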