
Showing papers by "State University of Campinas published in 2013"


Proceedings ArticleDOI
01 Jan 2013
TL;DR: This paper introduces V-REP, a versatile, scalable, yet powerful general-purpose robot simulation framework that allows for direct incorporation of various control techniques and renders simulations and simulation models more accessible to the general public by reducing simulation model deployment complexity.
Abstract: From exploring planets to cleaning homes, the reach and versatility of robotics is vast. The integration of actuation, sensing and control makes robotics systems powerful, but complicates their simulation. This paper introduces a versatile, scalable, yet powerful general-purpose robot simulation framework called V-REP. The paper discusses the utility of a portable and flexible simulation framework that allows for direct incorporation of various control techniques. This renders simulations and simulation models more accessible to the general public by reducing the simulation model deployment complexity. It also increases productivity by offering built-in and ready-to-use functionalities, as well as a multitude of programming approaches. This allows for a multitude of applications including rapid algorithm development, system verification, rapid prototyping, and deployment for cases such as safety/remote monitoring, training and education, hardware control, and factory automation simulation.

1,293 citations


Journal ArticleDOI
TL;DR: This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem, and introduces a novel “1-vs-set machine,” which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel.
Abstract: To date, almost all experimental evaluations of machine learning-based recognition algorithms in computer vision have taken the form of “closed set” recognition, whereby all testing classes are known at training time. A more realistic scenario for vision applications is “open set” recognition, where incomplete knowledge of the world is present at training time, and unknown classes can be submitted to an algorithm during testing. This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem. The open set recognition problem is not well addressed by existing algorithms because it requires strong generalization. As a step toward a solution, we introduce a novel “1-vs-set machine,” which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel. This methodology applies to several different applications in computer vision where open set recognition is a challenging problem, including object recognition and face verification. We consider both in this work, with large scale cross-dataset experiments performed over the Caltech 256 and ImageNet sets, as well as face matching experiments performed over the Labeled Faces in the Wild set. The experiments highlight the effectiveness of machines adapted for open set evaluation compared to existing 1-class and binary SVMs for the same tasks.
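To make the open-set idea above concrete, here is a minimal, illustrative sketch in Python: a linear SVM scores samples and two thresholds bound the accepted region, so points far from the training data are rejected as unknown. This is only a rough analogue of the "1-vs-set machine" idea, not the paper's constrained-minimization formulation; the data, classifier settings, and threshold choices are all hypothetical.

```python
# Minimal sketch of the open-set idea behind a "1-vs-set"-style decision rule:
# a linear SVM scores samples, and TWO thresholds bound the accepted region,
# so points far beyond the training data on either side are rejected as "unknown".
# Illustration only, not the paper's exact constrained-minimization solver.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X_known = np.vstack([rng.normal(0, 1, (100, 2)),      # negative class
                     rng.normal(4, 1, (100, 2))])     # positive class
y_known = np.array([0] * 100 + [1] * 100)

svm = LinearSVC(C=1.0).fit(X_known, y_known)
scores = svm.decision_function(X_known[y_known == 1])

# Bound the positive half-space from both sides using the positive training
# scores; the percentile cut and slack are hypothetical tuning choices.
lower, upper = np.percentile(scores, 5), np.percentile(scores, 95) + 1.0

def predict_open_set(x):
    s = svm.decision_function(np.atleast_2d(x))[0]
    return "positive" if lower <= s <= upper else "unknown_or_negative"

print(predict_open_set([4, 4]))     # near the positive class -> accepted
print(predict_open_set([40, 40]))   # far outside training support -> rejected
```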

1,029 citations


Journal ArticleDOI
Hans ter Steege1, Hans ter Steege2, Nigel C. A. Pitman3, Daniel Sabatier4, Christopher Baraloto5, Rafael de Paiva Salomão6, Juan Ernesto Guevara7, Oliver L. Phillips8, Carolina V. Castilho9, William E. Magnusson10, Jean-François Molino4, Abel Monteagudo, Percy Núñez Vargas11, Juan Carlos Montero10, Ted R. Feldpausch12, Ted R. Feldpausch8, Eurídice N. Honorio Coronado8, Timothy J. Killeen13, Bonifacio Mostacedo14, Rodolfo Vasquez, Rafael L. Assis15, Rafael L. Assis10, John Terborgh3, Florian Wittmann16, Ana Andrade10, William F. Laurance17, Susan G. Laurance17, Beatriz Schwantes Marimon18, Ben Hur Marimon18, Ima Célia Guimarães Vieira6, Iêda Leão do Amaral10, Roel J. W. Brienen8, Hernán Castellanos, Dairon Cárdenas López, Joost F. Duivenvoorden19, Hugo Mogollón20, Francisca Dionízia de Almeida Matos10, Nállarett Dávila21, Roosevelt García-Villacorta22, Pablo Roberto Stevenson Diaz23, Flávia R. C. Costa10, Thaise Emilio10, Carolina Levis10, Juliana Schietti10, Priscila Souza10, Alfonso Alonso24, Francisco Dallmeier24, Álvaro Javier Duque Montoya25, Maria Teresa Fernandez Piedade10, Alejandro Araujo-Murakami, Luzmila Arroyo, Rogério Gribel, Paul V. A. Fine7, Carlos A. Peres26, Marisol Toledo14, A C Gerardo Aymard, Timothy R. Baker8, Carlos Cerón27, Julien Engel28, Terry W. Henkel29, Paul J. M. Maas2, Pascal Petronelli, Juliana Stropp, Charles E. Zartman10, Doug Daly30, David A. Neill, Marcos Silveira31, Marcos Ríos Paredes, Jérôme Chave32, Diogenes de Andrade Lima Filho10, Peter M. Jørgensen33, Alfredo F. Fuentes33, Jochen Schöngart16, Fernando Cornejo Valverde34, Anthony Di Fiore35, E. M. Jimenez25, Maria Cristina Peñuela Mora25, Juan Fernando Phillips, Gonzalo Rivas36, Tinde van Andel2, Patricio von Hildebrand, Bruce Hoffman2, Egleé L. Zent37, Yadvinder Malhi38, Adriana Prieto25, Agustín Rudas25, Ademir R. Ruschell9, Natalino Silva39, Vincent A. Vos, Stanford Zent37, Alexandre Adalardo de Oliveira40, Angela Cano Schutz23, Therany Gonzales34, Marcelo Trindade Nascimento41, Hirma Ramírez-Angulo23, Rodrigo Sierra, Milton Tirado, Maria Natalia Umaña Medina23, Geertje M. F. van der Heijden42, Geertje M. F. van der Heijden43, César I.A. Vela11, Emilio Vilanova Torre23, Corine Vriesendorp, Ophelia Wang44, Kenneth R. Young35, Cláudia Baider40, Henrik Balslev45, Cid Ferreira10, Italo Mesones7, Armando Torres-Lezama23, Ligia Estela Urrego Giraldo25, Roderick Zagt46, Miguel Alexiades47, Lionel Hernández, Isau Huamantupa-Chuquimaco, William Milliken48, Walter Palacios Cuenca, Daniela Pauletto, Elvis H. Valderrama Sandoval49, Elvis H. Valderrama Sandoval50, Luis Valenzuela Gamarra, Kyle G. Dexter22, Kenneth J. Feeley51, Kenneth J. Feeley52, Gabriela Lopez-Gonzalez8, Miles R. Silman53 
Utrecht University1, Naturalis2, Duke University3, Institut de recherche pour le développement4, Institut national de la recherche agronomique5, Museu Paraense Emílio Goeldi6, University of California, Berkeley7, University of Leeds8, Empresa Brasileira de Pesquisa Agropecuária9, National Institute of Amazonian Research10, National University of Saint Anthony the Abbot in Cuzco11, University of Exeter12, World Wide Fund for Nature13, Universidad Autónoma Gabriel René Moreno14, Norwegian University of Life Sciences15, Max Planck Society16, James Cook University17, Universidade do Estado de Mato Grosso18, University of Amsterdam19, Silver Spring Networks20, State University of Campinas21, University of Edinburgh22, University of Los Andes23, Smithsonian Conservation Biology Institute24, National University of Colombia25, University of East Anglia26, Central University of Ecuador27, Centre national de la recherche scientifique28, Humboldt State University29, New York Botanical Garden30, Universidade Federal do Acre31, Paul Sabatier University32, Missouri Botanical Garden33, Amazon.com34, University of Texas at Austin35, University of Florida36, Venezuelan Institute for Scientific Research37, Environmental Change Institute38, Federal Rural University of Amazonia39, University of São Paulo40, State University of Norte Fluminense41, Smithsonian Tropical Research Institute42, University of Wisconsin–Milwaukee43, Northern Arizona University44, Aarhus University45, Tropenbos International46, University of Kent47, Royal Botanic Gardens48, Universidad Nacional de la Amazonía Peruana49, University of Missouri–St. Louis50, Fairchild Tropical Botanic Garden51, Florida International University52, Wake Forest University53
18 Oct 2013 - Science
TL;DR: The finding that Amazonia is dominated by just 227 tree species implies that most biogeochemical cycling in the world’s largest tropical forest is performed by a tiny sliver of its diversity.
Abstract: The vast extent of the Amazon Basin has historically restricted the study of its tree communities to the local and regional scales. Here, we provide empirical data on the commonness, rarity, and richness of lowland tree species across the entire Amazon Basin and Guiana Shield (Amazonia), collected in 1170 tree plots in all major forest types. Extrapolations suggest that Amazonia harbors roughly 16,000 tree species, of which just 227 (1.4%) account for half of all trees. Most of these are habitat specialists and only dominant in one or two regions of the basin. We discuss some implications of the finding that a small group of species—less diverse than the North American tree flora—accounts for half of the world’s most diverse tree community.
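As an aside on how a hyperdominance figure like "227 species account for half of all trees" is computed from abundance data, the short sketch below sorts species by stem count and counts how many are needed to reach half of all stems. The abundance distribution used here is synthetic, not the plot data analyzed in the paper.

```python
# Worked example of the "hyperdominance" statistic: the smallest number of
# species whose summed stem counts reach half of all stems. Abundances are
# synthetic (a skewed lognormal distribution), not the Amazonian plot data.
import numpy as np

rng = np.random.default_rng(1)
n_species = 16000
abundances = rng.lognormal(mean=2.0, sigma=2.5, size=n_species)  # stems per species

sorted_ab = np.sort(abundances)[::-1]          # most abundant first
cumulative = np.cumsum(sorted_ab)
half = 0.5 * sorted_ab.sum()
n_hyperdominant = int(np.searchsorted(cumulative, half) + 1)

print(f"{n_hyperdominant} of {n_species} species "
      f"({100 * n_hyperdominant / n_species:.1f}%) hold half of all stems")
```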

963 citations


Journal ArticleDOI
TL;DR: The proposed international consensus classification of hippocampal neuronal cell loss will aid in the characterization of specific clinicopathologic syndromes, and explore variability in imaging and electrophysiology findings, and in postsurgical seizure control.
Abstract: Hippocampal sclerosis (HS) is the most frequent histopathology encountered in patients with drug-resistant temporal lobe epilepsy (TLE). Over the past decades, various attempts have been made to classify specific patterns of hippocampal neuronal cell loss and correlate subtypes with postsurgical outcome. However, no international consensus about definitions and terminology has been achieved. A task force reviewed previous classification schemes and proposes a system based on semiquantitative hippocampal cell loss patterns that can be applied in any histopathology laboratory. Interobserver and intraobserver agreement studies reached consensus to classify three types in anatomically well-preserved hippocampal specimens: HS International League Against Epilepsy (ILAE) type 1 refers always to severe neuronal cell loss and gliosis predominantly in CA1 and CA4 regions, compared to CA1 predominant neuronal cell loss and gliosis (HS ILAE type 2), or CA4 predominant neuronal cell loss and gliosis (HS ILAE type 3). Surgical hippocampus specimens obtained from patients with TLE may also show normal content of neurons with reactive gliosis only (no-HS). HS ILAE type 1 is more often associated with a history of initial precipitating injuries before age 5 years, with early seizure onset, and favorable postsurgical seizure control. CA1 predominant HS ILAE type 2 and CA4 predominant HS ILAE type 3 have been studied less systematically so far, but some reports point to less favorable outcome, and to differences regarding epilepsy history, including age of seizure onset. The proposed international consensus classification will aid in the characterization of specific clinicopathologic syndromes, and explore variability in imaging and electrophysiology findings, and in postsurgical seizure control. © 2013 Wiley Periodicals, Inc.
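A rough illustration of how the subtype definitions quoted above could be encoded as a decision rule follows. The numeric cell-loss thresholds and the two-subfield simplification are hypothetical; the actual consensus protocol scores additional subfields and patterns in more detail.

```python
# Illustrative mapping from semiquantitative neuronal cell-loss scores
# (0 = none ... 2 = severe, per hippocampal subfield) to the ILAE HS types
# named in the abstract. The thresholds are hypothetical simplifications;
# the published consensus protocol is more detailed.
def ilae_hs_type(ca1_loss: int, ca4_loss: int) -> str:
    if ca1_loss >= 2 and ca4_loss >= 2:
        return "HS ILAE type 1 (severe CA1 and CA4 loss)"
    if ca1_loss >= 2:
        return "HS ILAE type 2 (CA1-predominant loss)"
    if ca4_loss >= 2:
        return "HS ILAE type 3 (CA4-predominant loss)"
    return "no-HS (gliosis only, normal neuronal content)"

print(ilae_hs_type(ca1_loss=2, ca4_loss=2))  # -> type 1
print(ilae_hs_type(ca1_loss=2, ca4_loss=0))  # -> type 2
```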

743 citations


Journal ArticleDOI
TL;DR: In this paper, the potential of combining maltodextrin with different wall materials in the microencapsulation of flaxseed oil by spray drying was evaluated, in order to maximize encapsulation efficiency and minimize lipid oxidation.

724 citations


Journal ArticleDOI
TL;DR: In this article, the authors give a detailed description of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson.
Abstract: A detailed description is reported of the analysis used by the CMS Collaboration in the search for the standard model Higgs boson in pp collisions at the LHC, which led to the observation of a new boson. The data sample corresponds to integrated luminosities up to 5.1 inverse femtobarns at sqrt(s) = 7 TeV, and up to 5.3 inverse femtobarns at sqrt(s) = 8 TeV. The results for five Higgs boson decay modes gamma gamma, ZZ, WW, tau tau, and bb, which show a combined local significance of 5 standard deviations near 125 GeV, are reviewed. A fit to the invariant mass of the two high resolution channels, gamma gamma and ZZ to 4 ell, gives a mass estimate of 125.3 +/- 0.4 (stat) +/- 0.5 (syst) GeV. The measurements are interpreted in the context of the standard model Lagrangian for the scalar Higgs field interacting with fermions and vector bosons. The measured values of the corresponding couplings are compared to the standard model predictions. The hypothesis of custodial symmetry is tested through the measurement of the ratio of the couplings to the W and Z bosons. All the results are consistent, within their uncertainties, with the expectations for a standard model Higgs boson.
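For intuition about how the high-resolution channels drive the quoted mass value, the sketch below performs a naive inverse-variance combination of two hypothetical per-channel measurements. The CMS number itself comes from a simultaneous likelihood fit to the gamma gamma and ZZ to 4 ell invariant-mass spectra, so the per-channel values and uncertainties used here are placeholders, not CMS inputs.

```python
# Naive inverse-variance combination of two hypothetical per-channel mass
# measurements (value, total uncertainty in GeV). The actual CMS result,
# 125.3 +/- 0.4 (stat) +/- 0.5 (syst) GeV, comes from a full likelihood fit
# to the gamma-gamma and ZZ->4l invariant-mass spectra, not from this shortcut.
channels = {"gamma-gamma": (125.1, 0.6), "ZZ->4l": (125.6, 0.8)}  # placeholders

weights = {name: 1.0 / sigma**2 for name, (_, sigma) in channels.items()}
w_sum = sum(weights.values())
m_comb = sum(weights[n] * m for n, (m, _) in channels.items()) / w_sum
sigma_comb = w_sum ** -0.5

print(f"combined mass ~ {m_comb:.2f} +/- {sigma_comb:.2f} GeV (illustrative only)")
```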

643 citations


Journal ArticleDOI
TL;DR: Although ethosuximide and valproic acid now have level A efficacy/effectiveness evidence as initial monotherapy for children with absence seizures, there continues to be an alarming lack of well designed, properly conducted epilepsy RCTs for patients with generalized seizures/epilepsies and in children in general.
Abstract: The purpose of this report was to update the 2006 International League Against Epilepsy (ILAE) report and identify the level of evidence for long-term efficacy or effectiveness for antiepileptic drugs (AEDs) as initial monotherapy for patients with newly diagnosed or untreated epilepsy. All applicable articles from July 2005 until March 2012 were identified, evaluated, and combined with the previous analysis (Glauser et al., 2006) to provide a comprehensive update. The prior analysis methodology was utilized with three modifications: (1) the detectable noninferiority boundary approach was dropped and both failed superiority studies and prespecified noninferiority studies were analyzed using a noninferiority approach, (2) the definition of an adequate comparator was clarified and now includes an absolute minimum point estimate for efficacy/effectiveness, and (3) the relationship table between clinical trial ratings, level of evidence, and conclusions no longer includes a recommendation column to reinforce that this review of efficacy/evidence for specific seizure types does not imply treatment recommendations. This evidence review contains one clarification: The commission has determined that class I superiority studies can be designed to detect up to a 20% absolute (rather than relative) difference in the point estimate of efficacy/effectiveness between study treatment and comparator using an intent-to-treat analysis. Since July, 2005, three class I randomized controlled trials (RCT) and 11 class III RCTs have been published. The combined analysis (1940-2012) now includes a total of 64 RCTs (7 with class I evidence, 2 with class II evidence) and 11 meta-analyses. New efficacy/effectiveness findings include the following: levetiracetam and zonisamide have level A evidence in adults with partial onset seizures and both ethosuximide and valproic acid have level A evidence in children with childhood absence epilepsy. There are no major changes in the level of evidence for any other subgroup. Levetiracetam and zonisamide join carbamazepine and phenytoin with level A efficacy/effectiveness evidence as initial monotherapy for adults with partial onset seizures. Although ethosuximide and valproic acid now have level A efficacy/effectiveness evidence as initial monotherapy for children with absence seizures, there continues to be an alarming lack of well designed, properly conducted epilepsy RCTs for patients with generalized seizures/epilepsies and in children in general. These findings reinforce the need for multicenter, multinational efforts to design, conduct, and analyze future clinically relevant adequately designed RCTs. When selecting a patient's AED, all relevant variables and not just efficacy and effectiveness should be considered.

589 citations


Journal ArticleDOI
TL;DR: This work finds that because of significant charging of quantum dots with extra electrons, Auger recombination greatly impacts both LED efficiency and the onset of efficiency roll-off at high currents, and demonstrates two specific approaches using heterostructured quantum dots.
Abstract: Development of light-emitting diodes (LEDs) based on colloidal quantum dots is driven by attractive properties of these fluorophores such as spectrally narrow, tunable emission and facile processibility via solution-based methods. A current obstacle towards improved LED performance is an incomplete understanding of the roles of extrinsic factors, such as non-radiative recombination at surface defects, versus intrinsic processes, such as multicarrier Auger recombination or electron-hole separation due to applied electric field. Here we address this problem with studies that correlate the excited state dynamics of structurally engineered quantum dots with their emissive performance within LEDs. We find that because of significant charging of quantum dots with extra electrons, Auger recombination greatly impacts both LED efficiency and the onset of efficiency roll-off at high currents. Further, we demonstrate two specific approaches for mitigating this problem using heterostructured quantum dots, either by suppressing Auger decay through the introduction of an intermediate alloyed layer, or by using an additional shell that impedes electron transfer into the quantum dot to help balance electron and hole injection.

572 citations


Journal ArticleDOI
TL;DR: High coverage of essential interventions did not imply reduced maternal mortality in the health-care facilities the authors studied, and the maternal severity index (MSI) had good accuracy for maternal death prediction in women with markers of organ dysfunction.

533 citations


Journal ArticleDOI
TL;DR: This paper shows that the contamination of the channel estimates happens whenever a pilot sequence is received at a base station simultaneously with non-orthogonal signals coming from other users, and proposes a method to avoid such simultaneous transmissions from adjacent cells, thus significantly decreasing interference.
Abstract: In this paper we study the performance of cellular networks when their base stations have an unlimited number of antennas. In previous work, the asymptotic behavior of the signal to interference plus noise ratio (SINR) was obtained. We revisit these results by deriving the rigorous expression for the SINR of both downlink and uplink in the scenario of an infinite number of antennas. We show that the contamination of the channel estimates happens whenever a pilot sequence is received at a base station simultaneously with non-orthogonal signals coming from other users. We propose a method to avoid such simultaneous transmissions from adjacent cells, thus significantly decreasing interference. We also investigate the effects of power allocation in this interference-limited scenario, and show that it results in gains of over 15 dB in the signal to interference ratio for the scenario simulated here. The combination of these two techniques results in rate gains of about 18 times in our simulations.
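The following toy sketch illustrates the pilot-contamination mechanism described above: when an adjacent cell reuses the same pilot sequence, the least-squares channel estimate at the base station picks up the interfering user's channel as well. It is a minimal illustration of the effect, not the paper's system model or its proposed scheme for avoiding simultaneous pilot transmissions; all dimensions and power levels are arbitrary.

```python
# Toy illustration of pilot contamination: when an adjacent cell reuses the
# same pilot sequence, the base station's least-squares channel estimate is
# the sum of its own user's channel and the interfering user's channel.
import numpy as np

rng = np.random.default_rng(2)
M = 64                                   # base-station antennas
pilot = np.ones(8) / np.sqrt(8)          # same pilot reused in both cells

h_own = (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)
h_interf = 0.5 * (rng.normal(size=M) + 1j * rng.normal(size=M)) / np.sqrt(2)
noise = 0.05 * (rng.normal(size=(M, 8)) + 1j * rng.normal(size=(M, 8)))

# Received pilot block: own user + contaminating user + noise.
Y = np.outer(h_own, pilot) + np.outer(h_interf, pilot) + noise

h_hat = Y @ pilot                        # least-squares estimate (unit-norm pilot)
err = np.linalg.norm(h_hat - h_own) / np.linalg.norm(h_own)
print(f"relative channel-estimation error with contamination: {err:.2f}")
```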

497 citations


Journal ArticleDOI
Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3, Andrew Marshall Adare4 +1002 more · Institutions (89)
04 Mar 2013
TL;DR: In this paper, the authors measured the transverse-momentum (p_T) distributions and yields of π, K, and p in Pb-Pb collisions at √s_NN = 2.76 TeV.
Abstract: In this paper measurements are presented of π±, K±, p, and p̄ production at midrapidity (|y| < 0.5), in Pb-Pb collisions at √s_NN = 2.76 TeV as a function of centrality. The measurement covers the transverse-momentum (p_T) range from 100, 200, and 300 MeV/c up to 3, 3, and 4.6 GeV/c for π, K, and p, respectively. The measured p_T distributions and yields are compared to expectations based on hydrodynamic, thermal and recombination models. The spectral shapes of central collisions show a stronger radial flow than measured at lower energies, which can be described in hydrodynamic models. In peripheral collisions, the p_T distributions are not well reproduced by hydrodynamic models. Ratios of integrated particle yields are found to be nearly independent of centrality. The yield of protons normalized to pions is a factor of ~1.5 lower than the expectation from thermal models.

Book ChapterDOI
02 Apr 2013
TL;DR: In this article, the authors present a framework for responsible innovation, based on four dimensions (anticipatory, reflective, deliberative, and responsive), to reflect on both the products and purposes of science and innovation.
Abstract: "Responsible innovation," in reality, lacks definition and clarity, both in concept and practice. This chapter begins by emphasizing that science and innovation have not only produced understanding, knowledge, and value (economic, social, or otherwise), but also questions, dilemmas, and unintended impacts. It presents a framework for responsible innovation, based on four dimensions: anticipatory, reflective, deliberative, and responsive. The chapter emphasizes that this must be able to reflect on both the products and purposes of science and innovation. It recalls the rich history in a number of fields of study, both theoretical and applied, which make a significant and important contribution both to the framing and definition of responsible innovation, and to how this is translated through the dimensions into practice. The chapter concludes with some closing thoughts on implementation, on how the embedding of responsible innovation as a genuinely transformative and constructive approach can be supported.

Journal ArticleDOI
TL;DR: This review describes some of the most prominent examples of paper-based sensors and takes a closer look at recent advances in immunoassays fabricated on paper, excluding simple lateral flow tests assembled on nitrocellulose.
Abstract: Paper has been present in the world of analytical chemistry for centuries, but it seems that just a few years back it was rediscovered as a valuable substrate for sensors. We can easily list some of the countless advantages of this simple cellulosic substrate, including mechanical properties, three-dimensional fibrous structure, biocompatibility and biodegradability, easiness of production and modification, reasonable price, and availability all over the world. Those characteristics make paper a first-choice substrate for disposable sensors and integrated sensing platforms. Nowadays, numerous examples of paper-based sensors are being presented in the literature. This review describes some of the most prominent examples classifying them by type of detection: optical (colorimetric, fluorescence, surface-enhanced Raman spectroscopy, and transmittance methods) and electrochemical (voltammetric, potentiometric, and conductivity-based methods). We take a closer look at recent advances in immunoassays fabricated on paper, excluding simple lateral flow tests assembled on nitrocellulose. This review also summarizes the main advantages and disadvantages of the use of paper as a substrate for sensors, as well as its impact on their performance and application, presents a short history of paper in analytical chemistry, and discusses fabrication methods and available sources of paper.

Journal ArticleDOI
Mirko Manchia1, Mazda Adli2, Nirmala Akula3, Raffaella Ardau, Jean-Michel Aubry4, Lena Backlund5, Claudio E. M. Banzato6, Bernhard T. Baune7, Frank Bellivier8, Susanne Bengesser9, Joanna M. Biernacka10, Clara Brichant-Petitjean8, Elise Bui3, Cynthia V. Calkin1, Andrew T. A. Cheng11, Caterina Chillotti, Sven Cichon12, Scott R. Clark7, Piotr M. Czerski, Clarissa de Rosalmeida Dantas6, Maria Del Zompo13, J. Raymond DePaulo14, Sevilla D. Detera-Wadleigh3, Bruno Etain15, Peter Falkai16, Louise Frisén5, Mark A. Frye10, Janice M. Fullerton17, Sébastien Gard, Julie Garnham1, Fernando S. Goes14, Paul Grof18, Oliver Gruber19, Ryota Hashimoto20, Joanna Hauser, Urs Heilbronner19, Rebecca Hoban21, Rebecca Hoban22, Liping Hou3, Stéphane Jamain15, Jean-Pierre Kahn, Layla Kassem3, Tadafumi Kato, John R. Kelsoe21, John R. Kelsoe22, Sarah Kittel-Schneider23, Sebastian Kliwicki, Po-Hsiu Kuo24, Ichiro Kusumi25, Gonzalo Laje3, Catharina Lavebratt5, Marion Leboyer15, Susan G. Leckband22, Susan G. Leckband21, Carlos Jaramillo26, Mario Maj27, Alain Malafosse4, Lina Martinsson5, Takuya Masui25, Philip B. Mitchell28, Frank Mondimore14, Palmiero Monteleone27, Audrey Nallet4, Maria Neuner23, Tomas Novak3, Claire O'Donovan1, Urban Ösby5, Norio Ozaki29, Norio Ozaki30, Roy H. Perlis31, Andrea Pfennig32, James B. Potash14, James B. Potash33, Daniela Reich-Erkelenz19, Andreas Reif23, Eva Z. Reininghaus9, Sara Richardson3, Guy A. Rouleau34, Janusz K. Rybakowski, Martin Schalling5, Peter R. Schofield17, O. Schubert7, Barbara W. Schweizer14, Florian Seemüller16, Maria Grigoroiu-Serbanescu, Giovanni Severino13, Lisa R. Seymour10, Claire Slaney1, Jordan W. Smoller31, Alessio Squassina13, Thomas Stamm2, Jo Steele3, Pavla Stopkova3, Sarah K. Tighe14, Alfonso Tortorella27, Gustavo Turecki, Naomi R. Wray35, Adam Wright28, Peter P. Zandi14, David Zilles19, Michael Bauer32, Marcella Rietschel36, Francis J. McMahon3, Thomas G. Schulze, Martin Alda1 
19 Jun 2013
TL;DR: The key phenotypic measures of the “Retrospective Criteria of Long-Term Treatment Response in Research Subjects with Bipolar Disorder” scale currently used in the Consortium on Lithium Genetics (ConLiGen) study are reported.
Abstract: OBJECTIVE: The assessment of response to lithium maintenance treatment in bipolar disorder (BD) is complicated by variable length of treatment, unpredictable clinical course, and often inconsistent compliance. Prospective and retrospective methods of assessment of lithium response have been proposed in the literature. In this study we report the key phenotypic measures of the "Retrospective Criteria of Long-Term Treatment Response in Research Subjects with Bipolar Disorder" scale currently used in the Consortium on Lithium Genetics (ConLiGen) study. MATERIALS AND METHODS: Twenty-nine ConLiGen sites took part in a two-stage case-vignette rating procedure to examine inter-rater agreement [Kappa (κ)] and reliability [intra-class correlation coefficient (ICC)] of lithium response. Annotated first-round vignettes and rating guidelines were circulated to expert research clinicians for training purposes between the two stages. Further, we analyzed the distributional properties of the treatment response scores available for 1,308 patients using mixture modeling. RESULTS: Substantial and moderate agreement was shown across sites in the first and second sets of vignettes (κ = 0.66 and κ = 0.54, respectively), without significant improvement from training. However, definition of response using the A score as a quantitative trait and selecting cases with B criteria of 4 or less showed an improvement between the two stages (ICC1 = 0.71 and ICC2 = 0.75, respectively). Mixture modeling of score distribution indicated three subpopulations (full responders, partial responders, non responders). CONCLUSIONS: We identified two definitions of lithium response, one dichotomous and the other continuous, with moderate to substantial inter-rater agreement and reliability. Accurate phenotypic measurement of lithium response is crucial for the ongoing ConLiGen pharmacogenomic study.
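The mixture-modeling step mentioned in the abstract can be sketched as fitting a three-component Gaussian mixture to a one-dimensional response score, mirroring the full/partial/non-responder subpopulations. The scores below are synthetic, not ConLiGen data, and the paper's exact mixture specification may differ.

```python
# Sketch of the mixture-modelling step: fit a three-component Gaussian mixture
# to one-dimensional treatment-response scores, mirroring the full/partial/
# non-responder subpopulations reported. The scores are synthetic.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
scores = np.concatenate([rng.normal(1, 1.0, 400),    # non responders
                         rng.normal(5, 1.0, 500),    # partial responders
                         rng.normal(9, 0.8, 400)])   # full responders

gmm = GaussianMixture(n_components=3, random_state=0).fit(scores.reshape(-1, 1))
order = np.argsort(gmm.means_.ravel())
print("component means:", gmm.means_.ravel()[order])
print("component weights:", gmm.weights_[order])
```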

Journal ArticleDOI
Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3, Andrew Marshall Adare4 +997 more · Institutions (89)
18 Jan 2013
TL;DR: In this article, the authors measured the centrality of inelastic Pb-Pb collisions at a center-of-mass energy of 2.76 TeV per colliding nucleon pair with ALICE.
Abstract: This publication describes the methods used to measure the centrality of inelastic Pb-Pb collisions at a center-of-mass energy of 2.76 TeV per colliding nucleon pair with ALICE. The centrality is a key parameter in the study of the properties of QCD matter at extreme temperature and energy density, because it is directly related to the initial overlap region of the colliding nuclei. Geometrical properties of the collision, such as the number of participating nucleons and the number of binary nucleon-nucleon collisions, are deduced from a Glauber model with a sharp impact parameter selection and shown to be consistent with those extracted from the data. The centrality determination provides a tool to compare ALICE measurements with those of other experiments and with theoretical calculations.
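A minimal Glauber Monte Carlo sketch of the geometrical quantities mentioned above (number of participating nucleons and number of binary nucleon-nucleon collisions) is given below. For brevity it places nucleons uniformly inside a sphere and uses a hard-sphere collision criterion; the ALICE analysis uses a realistic (Woods-Saxon) nuclear density and a more careful treatment, so this is illustrative only, and the nuclear radius and cross-section values are assumptions.

```python
# Minimal Glauber Monte Carlo sketch for one Pb-Pb event: count participating
# nucleons (N_part) and binary collisions (N_coll) given an impact parameter.
# Nucleons are placed uniformly in a sphere for brevity; real analyses use a
# Woods-Saxon density. sigma_nn ~ 64 mb is an assumed NN cross section.
import numpy as np

def sample_nucleus(n_nucleons=208, radius=6.62, rng=None):
    rng = rng or np.random.default_rng()
    r = radius * rng.random(n_nucleons) ** (1 / 3)           # uniform in a sphere
    costh = rng.uniform(-1, 1, n_nucleons)
    phi = rng.uniform(0, 2 * np.pi, n_nucleons)
    sinth = np.sqrt(1 - costh**2)
    # keep only the transverse (x, y) coordinates
    return np.column_stack([r * sinth * np.cos(phi), r * sinth * np.sin(phi)])

def glauber_event(b=7.0, sigma_nn_mb=64.0, rng=None):
    rng = rng or np.random.default_rng(4)
    d_max2 = sigma_nn_mb * 0.1 / np.pi                        # mb -> fm^2, hard sphere
    A = sample_nucleus(rng=rng) + [b / 2, 0]                  # shift by impact parameter
    B = sample_nucleus(rng=rng) - [b / 2, 0]
    dist2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)    # pairwise transverse dist^2
    hits = dist2 < d_max2
    n_coll = int(hits.sum())
    n_part = int(hits.any(axis=1).sum() + hits.any(axis=0).sum())
    return n_part, n_coll

print(glauber_event(b=7.0))
```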

Journal ArticleDOI
TL;DR: The interaction between mitochondrial reactive oxygen and nitrogen species, and the involvement of these oxidants in mitochondrial diseases, cancer, neurological, and cardiovascular disorders are discussed.
Abstract: Mitochondrially generated reactive oxygen species are involved in a myriad of signaling and damaging pathways in different tissues. In addition, mitochondria are an important target of reactive oxygen and nitrogen species. Here, we discuss basic mechanisms of mitochondrial oxidant generation and removal and the main factors affecting mitochondrial redox balance. We also discuss the interaction between mitochondrial reactive oxygen and nitrogen species, and the involvement of these oxidants in mitochondrial diseases, cancer, neurological, and cardiovascular disorders. Antioxid. Redox Signal. 18, 2029–2074.

Journal ArticleDOI
TL;DR: This paper presents a review of the publications on propolis, its biological constituents, and patents covering its applications.
Abstract: Propolis is the generic name given to the product obtained from resinous substances, which is gummy and balsamic and which is collected by bees from flowers, buds, and exudates of plants. It is a popular folk medicine possessing a broad spectrum of biological activities. These biological properties are related to its chemical composition and more specifically to the phenolic compounds that vary in their structure and concentration depending on the region of production, availability of sources to collect plant resins, genetic variability of the queen bee, the technique used for production, and the season in which propolis is produced. Many scientific articles are published every year in different international journal, and several groups of researchers have focused their attention on the chemical compounds and biological activity of propolis. This paper presents a review on the publications on propolis and patents of applications and biological constituents of propolis.


Journal ArticleDOI
P. Adamson1, I. Anghel2, I. Anghel3, C. Backhouse4, G.D. Barr4, M. Bishai5, Andrew Blake6, G. J. Bock1, D. Bogert1, S. V. Cao7, C. M. Castromonte8, S. Childress1, Joao A B Coelho9, Joao A B Coelho10, L. Corwin11, Daniel P Cronin-Hennessy, J. K. De Jong4, A. V. Devan12, N. E. Devenish13, M. V. Diwan5, Carlos Escobar9, J. J. Evans, E. Falk13, G. J. Feldman14, M. V. Frohne15, H. R. Gallagher10, R. A. Gomes8, Maury Goodman2, P. Gouffon16, N. Graf17, R. Gran, K. Grzelak18, Alec Habig, S. R. Hahn1, J. Hartnell13, R. Hatcher1, A. Himmel19, A. Holin20, J. Hylen1, G. M. Irwin21, Z. Isvan22, Z. Isvan5, C. James1, D. A. Jensen1, T. Kafka10, S. M. S. Kasahara23, G. Koizumi1, M. Kordosky12, A. E. Kreymer1, Karol Lang7, P. J. Litchfield, P. Lucas, W. A. Mann, Marvin L Marshak, M. Mathis, N. Mayer, A. M. McGowan, M. M. Medeiros, R. Mehdiyev, J. R. Meier, M. D. Messier, D. G. Michael, W. H. Miller, S. R. Mishra, S. Moed Sher, C. D. Moore, L. Mualem, J. A. Musser, D. Naples, J. K. Nelson, Harvey B Newman, R. J. Nichol, J. A. Nowak, J. O'Connor, W. P. Oliver, M. Orchanian, R. B. Pahlka, J. M. Paley, R. B. Patterson, Gregory J Pawloski, S. Phan-Budd, R. K. Plunkett, X. Qiu, A. Radovic, B. Rebel, C. Rosenfeld, H. A. Rubin, M. C. Sanchez, J. Schneps, A. Schreckenberger, P. Schreiner, R. Sharma, A. Sousa, N. Tagg, R. L. Talaga, Juergen Thomas, M. A. Thomson, G. Tinti, S. C. Tognini, R. Toner, D. Torretta, G. Tzanakos, J. Urheim, P. Vahle, B. Viren, A. C. Weber, R. C. Webb, Christopher G. White, L. Whitehead, L. H. Whitehead, Stanley G. Wojcicki, R. Zwaska 
TL;DR: Measurements of oscillation parameters from ν_μ and ν̄_μ disappearance using beam and atmospheric data from MINOS are reported, with minimal change to the neutrino parameters.
Abstract: We report measurements of oscillation parameters from ν_μ and ν̄_μ disappearance using beam and atmospheric data from MINOS. The data comprise exposures of 10.71 × 10²⁰ protons on target in the ν_μ-dominated beam, 3.36 × 10²⁰ protons on target in the ν̄_μ-enhanced beam, and 37.88 kton·yr of atmospheric neutrinos. Assuming identical ν and ν̄ oscillation parameters, we measure |Δm²| = 2.41 (+0.09/-0.10) × 10⁻³ eV² and sin²(2θ) = 0.950 (+0.035/-0.036). Allowing independent ν and ν̄ oscillations, we measure antineutrino parameters of |Δm̄²| = 2.50 (+0.23/-0.25) × 10⁻³ eV² and sin²(2θ̄) = 0.97 (+0.03/-0.08), with minimal change to the neutrino parameters.
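For context, these parameters enter the standard two-flavour survival probability, which the short calculation below evaluates with the central values quoted in the abstract. The 735 km baseline is the Fermilab-to-Soudan MINOS baseline; the 3 GeV energy is only a representative beam energy, not a value taken from the paper.

```python
# Two-flavour survival probability into which the fitted parameters enter:
# P(nu_mu -> nu_mu) = 1 - sin^2(2*theta) * sin^2(1.267 * dm2[eV^2] * L[km] / E[GeV]).
# Parameter values are the central values from the abstract; L = 735 km is the
# MINOS baseline and E = 3 GeV is just a representative beam energy.
import numpy as np

dm2 = 2.41e-3          # |Delta m^2| in eV^2
sin2_2theta = 0.950
L_km, E_GeV = 735.0, 3.0

p_survive = 1 - sin2_2theta * np.sin(1.267 * dm2 * L_km / E_GeV) ** 2
print(f"P(nu_mu survives) at {E_GeV} GeV over {L_km} km: {p_survive:.3f}")
```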

Journal ArticleDOI
TL;DR: There is evidence to support the association of NAFLD with subclinical atherosclerosis independent of traditional risk factors and metabolic syndrome, however, there is need for future longitudinal studies to review this association to ascertain causality and include other ethnic populations.

Journal ArticleDOI
TL;DR: This document presents a comprehensive revision of mass spectrometry terminology that updates and extends earlier IUPAC recommendations on terms, definitions, symbols, acronyms, and abbreviations.
Abstract: This document contains recommendations for terminology in mass spectrometry. Development of standard terms dates back to 1974 when the IUPAC Commission on Analytical Nomenclature issued recommendations on mass spectrometry terms and definitions. In 1978, the IUPAC Commission on Molecular Structure and Spectroscopy updated and extended the recommendations and made further recommendations regarding symbols, acronyms, and abbreviations. The IUPAC Physical Chemistry Division Commission on Molecular Structure and Spectroscopy's Subcommittee on Mass Spectroscopy revised the recommended terms in 1991 and appended terms relating to vacuum technology. Some additional terms related to tandem mass spectrometry were added in 1993 and accelerator mass spectrometry in 1994. Owing to the rapid expansion of the field in the intervening years, particularly in mass spectrometry of biomolecules, a further revision of the recommendations has become necessary. This document contains a comprehensive revision of mass spectrometry terminology that represents the current consensus of the mass spectrometry community.

Journal ArticleDOI
01 Dec 2013
TL;DR: This study has identified challenges in the field, including the immense diversity and inconsistency of terminologies, limited documentation, sparse comparison and benchmarking criteria, and nonexistence of standardized query languages.
Abstract: Advances in Web technology and the proliferation of mobile devices and sensors connected to the Internet have resulted in immense processing and storage requirements. Cloud computing has emerged as a paradigm that promises to meet these requirements. This work focuses on the storage aspect of cloud computing, specifically on data management in cloud environments. Traditional relational databases were designed in a different hardware and software era and are facing challenges in meeting the performance and scale requirements of Big Data. NoSQL and NewSQL data stores present themselves as alternatives that can handle huge volume of data. Because of the large number and diversity of existing NoSQL and NewSQL solutions, it is difficult to comprehend the domain and even more challenging to choose an appropriate solution for a specific task. Therefore, this paper reviews NoSQL and NewSQL solutions with the objective of: (1) providing a perspective in the field, (2) providing guidance to practitioners and researchers to choose the appropriate data store, and (3) identifying challenges and opportunities in the field. Specifically, the most prominent solutions are compared focusing on data models, querying, scaling, and security related capabilities. Features driving the ability to scale read requests and write requests, or scaling data storage are investigated, in particular partitioning, replication, consistency, and concurrency control. Furthermore, use cases and scenarios in which NoSQL and NewSQL data stores have been used are discussed and the suitability of various solutions for different sets of applications is examined. Consequently, this study has identified challenges in the field, including the immense diversity and inconsistency of terminologies, limited documentation, sparse comparison and benchmarking criteria, and nonexistence of standardized query languages.
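As one concrete example of the partitioning mechanisms such surveys compare, the sketch below implements consistent hashing with virtual nodes and shows that adding a node remaps only a fraction of the keys. It is a generic illustration, not the implementation of any particular NoSQL or NewSQL store discussed in the paper.

```python
# Compact consistent-hashing sketch, one of the partitioning schemes compared
# across NoSQL stores: keys map to the first node clockwise on a hash ring,
# so adding or removing a node remaps only a fraction of the keys.
import bisect
import hashlib

def _hash(value: str) -> int:
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class HashRing:
    def __init__(self, nodes, vnodes=100):
        # Each physical node gets several virtual positions on the ring.
        self.ring = sorted((_hash(f"{n}#{i}"), n) for n in nodes for i in range(vnodes))
        self.keys = [h for h, _ in self.ring]

    def node_for(self, key: str) -> str:
        idx = bisect.bisect(self.keys, _hash(key)) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["node-a", "node-b", "node-c"])
keys = [f"user:{i}" for i in range(10000)]
before = {k: ring.node_for(k) for k in keys}

ring = HashRing(["node-a", "node-b", "node-c", "node-d"])   # add one node
moved = sum(before[k] != ring.node_for(k) for k in keys)
print(f"{moved / len(keys):.1%} of keys moved after adding a node")
```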

Journal ArticleDOI
11 Dec 2013
TL;DR: Prophylactic rFIXFc, administered every 1 to 2 weeks, resulted in low annualized bleeding rates in patients with hemophilia B, mostly consistent with those expected in the general population of patients with hemophilia.
Abstract: Background Prophylactic factor replacement in patients with hemophilia B improves outcomes but requires frequent injections. A recombinant factor IX Fc fusion protein (rFIXFc) with a prolonged half-life was developed to reduce the frequency of injections required. Methods We conducted a phase 3, nonrandomized, open-label study of the safety, efficacy, and pharmacokinetics of rFIXFc for prophylaxis, treatment of bleeding, and perioperative hemostasis in 123 previously treated male patients. All participants were 12 years of age or older and had severe hemophilia B (endogenous factor IX level of ≤2 IU per deciliter, or ≤2% of normal levels). The study included four treatment groups: group 1 received weekly dose-adjusted prophylaxis (50 IU of rFIXFc per kilogram of body weight to start), group 2 received interval-adjusted prophylaxis (100 IU per kilogram every 10 days to start), group 3 received treatment as needed for bleeding episodes (20 to 100 IU per kilogram), and group 4 received treatment in the perio...

Journal ArticleDOI
TL;DR: The XOS applications described in this paper highlight that xylooligosaccharides are considered soluble dietary fibers with prebiotic activity, favoring improved bowel and immune function and offering antimicrobial and other health benefits.

Proceedings ArticleDOI
04 Jun 2013
TL;DR: This article assesses how well existing face anti-spoofing countermeasures can work in a more realistic condition and introduces two strategies that show promising results.
Abstract: User authentication is an important step to protect information, and in this field face biometrics is advantageous. Face biometrics is natural, easy to use and less human-invasive. Unfortunately, recent work has revealed that face biometrics is vulnerable to spoofing attacks using low-tech equipment. This article assesses how well existing face anti-spoofing countermeasures can work in a more realistic condition. Experiments carried out with two freely available video databases (Replay Attack Database and CASIA Face Anti-Spoofing Database) show low generalization and possible database bias in the evaluated countermeasures. To generalize and deal with the diversity of attacks in a real world scenario we introduce two strategies that show promising results.
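A skeleton of the cross-database protocol described above is sketched below: train a countermeasure on one database, fix its decision threshold there, then evaluate on the other database and report the Half Total Error Rate (HTER). The feature matrices, classifier, and threshold rule are placeholders, not the countermeasures or strategies evaluated in the paper.

```python
# Skeleton of a cross-database anti-spoofing evaluation: train on one database,
# fix the threshold there, test on the other, report HTER. The classifier and
# features are placeholders for whatever countermeasure is being assessed.
import numpy as np
from sklearn.linear_model import LogisticRegression

def hter(scores, labels, threshold):
    far = np.mean(scores[labels == 0] >= threshold)   # attacks accepted as real
    frr = np.mean(scores[labels == 1] < threshold)    # real accesses rejected
    return 0.5 * (far + frr)

def cross_database_eval(X_train, y_train, X_test, y_test):
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    dev_scores = clf.predict_proba(X_train)[:, 1]
    # Threshold fixed on the training database (here simply the median score;
    # an equal-error-rate threshold would be the usual choice), then applied
    # unchanged to the other database.
    threshold = np.median(dev_scores)
    test_scores = clf.predict_proba(X_test)[:, 1]
    return hter(test_scores, y_test, threshold)

# Usage with hypothetical pre-extracted feature matrices for the two databases:
# hter_cross = cross_database_eval(X_replay, y_replay, X_casia, y_casia)
```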

Journal ArticleDOI
TL;DR: In this article, results of searches for heavy stable charged particles produced in pp collisions at 7 and 8 TeV are presented, corresponding to integrated luminosities of 5.0 and 18.8 inverse femtobarns, respectively.
Abstract: Results of searches for heavy stable charged particles produced in pp collisions at sqrt(s) = 7 and 8 TeV are presented corresponding to an integrated luminosity of 5.0 inverse femtobarns and 18.8 inverse femtobarns, respectively. Data collected with the CMS detector are used to study the momentum, energy deposition, and time-of-flight of signal candidates. Leptons with an electric charge between e/3 and 8e, as well as bound states that can undergo charge exchange with the detector material, are studied. Analysis results are presented for various combinations of signatures in the inner tracker only, inner tracker and muon detector, and muon detector only. Detector signatures utilized are long time-of-flight to the outer muon system and anomalously high (or low) energy deposition in the inner tracker. The data are consistent with the expected background, and upper limits are set on the production cross section of long-lived gluinos, scalar top quarks, and scalar tau leptons, as well as pair produced long-lived leptons. Corresponding lower mass limits, ranging up to 1322 GeV for gluinos, are the most stringent to date.

Journal ArticleDOI
Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3, Andrew Marshall Adare4 +963 more · Institutions (95)
TL;DR: In this paper, the ALICE measurement of K⁰_S and Λ production at midrapidity in Pb-Pb collisions at √s_NN = 2.76 TeV is presented.
Abstract: The ALICE measurement of K⁰_S and Λ production at mid-rapidity in Pb-Pb collisions at √s_NN = 2.76 TeV is presented. The transverse momentum (p_T) spectra are shown for several collision centrality intervals and in the p_T range from 0.4 GeV/c (0.6 GeV/c for Λ) to 12 GeV/c. The p_T dependence of the Λ/K⁰_S ratios exhibits maxima in the vicinity of 3 GeV/c, and the positions of the maxima shift towards higher p_T with increasing collision centrality. The magnitude of these maxima increases by almost a factor of three between most peripheral and most central Pb-Pb collisions. This baryon excess at intermediate p_T is not observed in pp interactions at √s = 0.9 TeV and at √s = 7 TeV. Qualitatively, the baryon enhancement in heavy-ion collisions is expected from radial flow. However, the measured p_T spectra above 2 GeV/c progressively decouple from hydrodynamical-model calculations. For higher values of p_T, models that incorporate the influence of the medium on the fragmentation and hadronization processes describe qualitatively the p_T dependence of the Λ/K⁰_S ratio.

Journal ArticleDOI
Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3, Andrew Marshall Adare4 +969 more · Institutions (88)
11 Jul 2013
TL;DR: In this paper, the ALICE detector was used to measure the long-range correlations between trigger particles and various species of charged associated particles (unidentified particles, pions, kaons, protons and antiprotons).
Abstract: Angular correlations between unidentified charged trigger particles and various species of charged associated particles (unidentified particles, pions, kaons, protons and antiprotons) are measured by the ALICE detector in p-Pb collisions at a nucleon-nucleon centre-of-mass energy of 5.02 TeV in the transverse-momentum range 0.3 < p_T < 4 GeV/c. The correlations expressed as associated yield per trigger particle are obtained in the pseudorapidity range |η_lab| < 0.8. Fourier coefficients are extracted from the long-range correlations projected onto the azimuthal angle difference and studied as a function of p_T and in intervals of event multiplicity. In high-multiplicity events, the second-order coefficient for protons, v_2(p), is observed to be smaller than that for pions, v_2(π), up to about p_T = 2 GeV/c. To reduce correlations due to jets, the per-trigger yield measured in low-multiplicity events is subtracted from that in high-multiplicity events. A two-ridge structure is obtained for all particle species. The Fourier decomposition of this structure shows that the second-order coefficients for pions and kaons are similar. The v_2(p) is found to be smaller at low p_T and larger at higher p_T than v_2(π), with a crossing occurring at about 2 GeV/c. This is qualitatively similar to the elliptic-flow pattern observed in heavy-ion collisions. A mass ordering effect at low transverse momenta is consistent with expectations from hydrodynamic model calculations assuming a collectively expanding system. © 2013 CERN. Published by Elsevier B.V. All rights reserved.
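The Fourier-coefficient extraction mentioned above can be illustrated with a short calculation: given a per-trigger yield versus Δφ, the coefficient V_n is the yield-weighted average of cos(nΔφ). The correlation shape below is synthetic, and the subtraction of the low-multiplicity yield used in the paper is omitted; this only illustrates the decomposition step.

```python
# Extract azimuthal Fourier coefficients from a per-trigger Delta-phi yield:
# V_n = <cos(n * Delta-phi)>, averaged with the yield as weight. The toy
# "double ridge" shape below is synthetic, not ALICE data.
import numpy as np

dphi = np.linspace(-np.pi / 2, 3 * np.pi / 2, 72, endpoint=False)
V2_true, V3_true = 0.06, 0.02
y = 1.0 + 2 * V2_true * np.cos(2 * dphi) + 2 * V3_true * np.cos(3 * dphi)

def fourier_coeff(n, dphi, y):
    # yield-weighted average of cos(n * Delta-phi)
    return np.sum(y * np.cos(n * dphi)) / np.sum(y)

print(f"extracted V2 = {fourier_coeff(2, dphi, y):.3f} (input {V2_true})")
print(f"extracted V3 = {fourier_coeff(3, dphi, y):.3f} (input {V3_true})")
```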

Journal ArticleDOI
E. Abbas, Betty Abelev1, Jaroslav Adam2, Dagmar Adamová3 +1019 more · Institutions (91)
TL;DR: The ALICE VZERO system, made of two scintillator arrays at asymmetric positions, one on each side of the interaction point, plays a central role in ALICE and is used to monitor LHC beam conditions, to reject beam-induced backgrounds and to measure basic physics quantities such as luminosity, particle multiplicity, centrality and event plane direction as mentioned in this paper.
Abstract: ALICE is an LHC experiment devoted to the study of strongly interacting matter in proton-proton, proton-nucleus and nucleus-nucleus collisions at ultra-relativistic energies. The ALICE VZERO system, made of two scintillator arrays at asymmetric positions, one on each side of the interaction point, plays a central role in ALICE. In addition to its core function as a trigger source, the VZERO system is used to monitor LHC beam conditions, to reject beam-induced backgrounds and to measure basic physics quantities such as luminosity, particle multiplicity, centrality and event plane direction in nucleus-nucleus collisions. After describing the VZERO system, this publication presents its performance over more than four years of operation at the LHC.

Journal ArticleDOI
TL;DR: The as-quenched martensitic parts showed yield and ultimate compressive strengths similar to the as-processed parts, and these were greater than those observed for the fully annealed samples that had the lamellar microstructure of the equilibrium α+β phases.
Abstract: Rapid prototyping allows titanium porous parts with mechanical properties close to those of bone tissue to be obtained. In this article, porous parts of the Ti-6Al-4V alloy with three levels of porosity were obtained by selective laser melting with two different energy inputs. Thermal treatments were performed to determine the influence of the microstructure on the mechanical properties. The porous parts were characterized by both optical and scanning electron microscopy. The effective modulus, yield and ultimate compressive strength were determined by compressive tests. The martensitic α' microstructure was observed in all of the as-processed parts. The struts resulting from the processing conditions investigated were thinner than those defined by CAD models, and consequently, larger pores and a higher experimental porosity were achieved. The use of the high-energy input parameters produced parts with higher oxygen and nitrogen content, struts that were even thinner, and a homogeneous porosity distribution. Greater mechanical properties for a given relative density were obtained using the high-energy input parameters. The as-quenched martensitic parts showed yield and ultimate compressive strengths similar to the as-processed parts, and these were greater than those observed for the fully annealed samples that had the lamellar microstructure of the equilibrium α+β phases. The effective modulus was not significantly influenced by the thermal treatments. A comparison between these results and those of porous parts with similar geometry obtained by selective electron beam melting shows that the use of a laser allows parts with higher mechanical properties for a given relative density to be obtained.