
Showing papers by "Helsinki University of Technology published in 2010"


Journal ArticleDOI
TL;DR: Cellulose nanocrystals (CNs), the defect-free, rod-like crystalline residues obtained by acid hydrolysis of cellulose fibers, have attracted enormous attention in the materials community as abundant, renewable, low-cost reinforcing agents for nanocomposites.
Abstract: Cellulose constitutes the most abundant renewable polymer resource available today. As a chemical raw material, it has been used in the form of fibers or derivatives for nearly 150 years for a wide spectrum of products and materials in daily life. What has not been known until relatively recently is that when cellulose fibers are subjected to acid hydrolysis, they yield defect-free, rod-like crystalline residues. These cellulose nanocrystals (CNs) have garnered a tremendous level of attention in the materials community that does not appear to be relenting. These biopolymeric assemblies warrant such attention not only because of their quintessential physical and chemical properties (as will become evident in the review) but also because of their inherent renewability, sustainability, and abundance. They have been the subject of a wide array of research efforts as reinforcing agents in nanocomposites due to their low cost, availability, renewability, light weight, nanoscale dimensions, and unique morphology. Indeed, CNs are the fundamental constitutive polymeric motifs of macroscopic cellulose-based fibers, whose sheer volume dwarfs any known natural or synthetic biomaterial. Chem. Rev. 2010, 110, 3479–3500

4,664 citations


Journal ArticleDOI
TL;DR: The proposed OP-ELM methodology runs several orders of magnitude faster than the other algorithms used in this brief, except the original ELM, while maintaining an accuracy comparable to that of the SVM.
Abstract: In this brief, the optimally pruned extreme learning machine (OP-ELM) methodology is presented. It is based on the original extreme learning machine (ELM) algorithm, with additional steps to make it more robust and generic. The whole methodology is presented in detail and then applied to several regression and classification problems. Results for both computational time and accuracy (mean square error) are compared to the original ELM and to three other widely used methodologies: the multilayer perceptron (MLP), support vector machine (SVM), and Gaussian process (GP). As the experiments for both regression and classification illustrate, the proposed OP-ELM methodology runs several orders of magnitude faster than the other algorithms used in this brief, except the original ELM. Despite its simplicity and speed, OP-ELM still maintains an accuracy comparable to that of the SVM. A toolbox for the OP-ELM is publicly available online.
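The original ELM at the core of OP-ELM is simple enough to sketch: hidden-layer weights are drawn at random and never trained, and only the linear output layer is solved, by least squares. The following minimal sketch (not the authors' toolbox, and without the pruning step that OP-ELM adds) illustrates that idea:

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Fit a basic extreme learning machine for regression."""
    rng = np.random.default_rng(seed)
    # Hidden layer: random weights and biases, fixed at initialisation.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden activations
    # Only the output weights are learned, via ordinary least squares.
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression problem: approximate y = sin(x).
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

OP-ELM extends this basic scheme by ranking the random neurons (e.g. with multiresponse sparse regression) and pruning the least useful ones via leave-one-out validation, which is what makes it more robust and generic.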

745 citations


Journal ArticleDOI
TL;DR: A conceptual framework is proposed that aims to capture interdisciplinarity in the wider sense of knowledge integration by exploring the concepts of diversity and coherence; the combination of these two approaches may be useful for comparative studies of emergent scientific and technological fields.
Abstract: The multidimensional character and inherent conflict with categorisation of interdisciplinarity makes its mapping and evaluation a challenging task. We propose a conceptual framework that aims to capture interdisciplinarity in the wider sense of knowledge integration, by exploring the concepts of diversity and coherence. Disciplinary diversity indicators are developed to describe the heterogeneity of a bibliometric set viewed from predefined categories, i.e. using a top-down approach that locates the set on the global map of science. Network coherence indicators are constructed to measure the intensity of similarity relations within a bibliometric set, i.e. using a bottom-up approach, which reveals the structural consistency of the publications network. We carry out case studies on individual articles in bionanoscience to illustrate how these two perspectives identify different aspects of interdisciplinarity: disciplinary diversity indicates the large-scale breadth of the knowledge base of a publication; network coherence reflects the novelty of its knowledge integration. We suggest that the combination of these two approaches may be useful for comparative studies of emergent scientific and technological fields, where new and controversial categorisations are accompanied by equally contested claims of novelty and interdisciplinarity.
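The diversity side of such a framework is often operationalised with a Rao-Stirling-type measure, which weights how a publication's references spread over disciplinary categories by how cognitively distant those categories are. A minimal sketch follows; this is one common formulation, not necessarily the exact indicator used in the paper, and the distance values are invented for illustration:

```python
import numpy as np

def rao_stirling(p, d):
    """Diversity = sum over category pairs of p_i * p_j * d_ij,
    where p holds discipline proportions and d pairwise distances.
    Diagonal terms contribute nothing: a category is at distance 0
    from itself."""
    p = np.asarray(p, dtype=float)
    d = np.asarray(d, dtype=float)
    return float(p @ d @ p)

# Hypothetical distances between three disciplines
# (1.0 = cognitively distant, 0.2 = close).
d = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 0.2],
              [1.0, 0.2, 0.0]])
balanced = rao_stirling([1/3, 1/3, 1/3], d)      # references spread evenly
narrow   = rao_stirling([0.9, 0.05, 0.05], d)    # concentrated in one field
```

A publication drawing evenly on distant fields scores higher than one concentrated in a single discipline, matching the intuition of "large-scale breadth of the knowledge base" described above.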

515 citations


Journal ArticleDOI
TL;DR: The elastic modulus of the nanocomposite mats increased significantly as a consequence of the reinforcing effect of CNs via the percolation network held by hydrogen bonds, but this organization-driven crystallization was limited as observed by the reduction in the degree of crystallinity of the CN-loaded composite fibers.

490 citations


Proceedings ArticleDOI
18 Apr 2010
TL;DR: The evaluation results of the proposed mode selection procedure show that it enables a much more reliable device-to-device communication with limited interference to the cellular network compared to simpler mode selection procedures.
Abstract: Device-to-Device communication underlaying a cellular network enables local services with limited interference to the cellular network. In this paper we study the optimal selection of possible resource sharing modes with the cellular network in a single cell. Based on the learning from the single cell studies we propose a mode selection procedure for a multi-cell environment. Our evaluation results of the proposed procedure show that it enables a much more reliable device-to-device communication with limited interference to the cellular network compared to simpler mode selection procedures. A well performing and practical mode selection is critical to enable the adoption of underlay device-to-device communication in cellular networks.

476 citations


Journal ArticleDOI
TL;DR: It is proposed that the MMN is, in essence, a latency- and amplitude-modulated expression of the auditory N1 response, generated by fresh-afferent activity of cortical neurons that are under nonuniform levels of adaptation.
Abstract: The current review constitutes the first comprehensive look at the possibility that the mismatch negativity (MMN, the deflection of the auditory ERP/ERF elicited by stimulus change) might be generated by so-called fresh-afferent neuronal activity. This possibility has been repeatedly ruled out for the past 30 years, with the prevailing theoretical accounts relying on a memory-based explanation instead. We propose that the MMN is, in essence, a latency- and amplitude-modulated expression of the auditory N1 response, generated by fresh-afferent activity of cortical neurons that are under nonuniform levels of adaptation.

476 citations


Journal ArticleDOI
TL;DR: This research proves that models like TAM should not treat mobile services as a generic concept but should instead specifically address individual mobile services; it also demonstrates the unique value of combining objective usage measurements with traditional survey data to model service adoption more comprehensively.

456 citations


Journal ArticleDOI
TL;DR: In this article, a conceptual framework for identifying interdisciplinarity in research documents has been proposed, which attempts to fulfill the need for a robust and nuanced approach that is grounded in deeper knowledge of interdisciplinary research.

455 citations


Journal ArticleDOI
TL;DR: A surprisingly simple and rapid methodology is presented for large-area, lightweight, and thick nacre-mimetic films and laminates with superior material properties, ready for scale-up via continuous roll-to-roll processes.
Abstract: Although remarkable success has been achieved to mimic the mechanically excellent structure of nacre in laboratory-scale models, it remains difficult to foresee mainstream applications due to time-consuming sequential depositions or energy-intensive processes. Here, we introduce a surprisingly simple and rapid methodology for large-area, lightweight, and thick nacre-mimetic films and laminates with superior material properties. Nanoclay sheets with soft polymer coatings are used as ideal building blocks with intrinsic hard/soft character. They are forced to rapidly self-assemble into aligned nacre-mimetic films via paper-making, doctor-blading or simple painting, giving rise to strong and thick films with tensile modulus of 45 GPa and strength of 250 MPa, that is, partly exceeding nacre. The concepts are environmentally friendly, energy-efficient, and economic and are ready for scale-up via continuous roll-to-roll processes. Excellent gas barrier properties, optical translucency, and extraordinary shape-p...

425 citations


Journal ArticleDOI
TL;DR: The combination of transcranial magnetic stimulation with simultaneous electroencephalography (EEG) provides the possibility to non-invasively probe the brain’s excitability, time-resolved connectivity and instantaneous state.
Abstract: The combination of transcranial magnetic stimulation (TMS) with simultaneous electroencephalography (EEG) makes it possible to non-invasively probe the brain’s excitability, time-resolved connectivity, and instantaneous state. Early attempts to combine TMS and EEG suffered from the huge electromagnetic artifacts seen in EEG as a result of the electric field induced by the stimulus pulses. To deal with this problem, TMS-compatible EEG systems have been developed. However, even with amplifiers that are either immune to or recover quickly from the pulse, great challenges remain. Artifacts may arise from the movement of electrodes, from muscles activated by the pulse, from eye movements, from electrode polarization, or from brain responses evoked by the coil click. With careful precautions, many of these problems can be avoided. The remaining artifacts can usually be reduced by filtering, but control experiments are often needed to make sure that the measured signals actually originate in the brain. Several studies have shown the power of TMS–EEG by giving valuable information about the excitability or connectivity of the brain.

362 citations


Journal ArticleDOI
J. A. Tauber, Nazzareno Mandolesi, J.-L. Puget, T. Banos, and 499 more authors (61 institutions)
TL;DR: The European Space Agency's Planck satellite, launched on 14 May 2009, is the third-generation space experiment in the field of cosmic microwave background (CMB) research.
Abstract: The European Space Agency's Planck satellite, launched on 14 May 2009, is the third-generation space experiment in the field of cosmic microwave background (CMB) research. It will image the anisotropies of the CMB over the whole sky, with unprecedented sensitivity (~2 × 10⁻⁶) and angular resolution (~5 arcmin). Planck will provide a major source of information relevant to many fundamental cosmological problems and will test current theories of the early evolution of the Universe and the origin of structure. It will also address a wide range of areas of astrophysical research related to the Milky Way as well as external galaxies and clusters of galaxies. The ability of Planck to measure polarization across a wide frequency range (30-350 GHz), with high precision and accuracy, and over the whole sky, will provide unique insight, not only into specific cosmological questions, but also into the properties of the interstellar medium. This paper is part of a series which describes the technical capabilities of the Planck scientific payload. It is based on the knowledge gathered during the on-ground calibration campaigns of the major subsystems, principally its telescope and its two scientific instruments, and of tests at fully integrated satellite level. It represents the best estimate before launch of the technical performance that the satellite and its payload will achieve in flight. In this paper, we summarise the main elements of the payload performance, which is described in detail in the accompanying papers. In addition, we describe the satellite performance elements which are most relevant for science, and provide an overview of the plans for scientific operations and data analysis.

Journal ArticleDOI
TL;DR: In this paper, the Mesh Adaptive Direct Search (MADS) algorithm is used to minimize the cost function of the system while constraining it to meet the customer demand and safety.

Journal ArticleDOI
TL;DR: In this paper, microfibrillated celluloses (MFCs) and associated films generated from wood pulps of different yields (containing extractives, lignin, and hemicelluloses) have been investigated.
Abstract: The interactions with water and the physical properties of microfibrillated celluloses (MFCs) and associated films generated from wood pulps of different yields (containing extractives, lignin, and hemicelluloses) have been investigated. MFCs were produced by combining mechanical refining and a high pressure treatment using a homogenizer. The produced MFCs were characterized by morphology analysis, water retention, hard-to-remove water content, and specific surface area. Regardless of chemical composition, processing to convert macrofibrils to microfibrils resulted in a decrease in water adsorption and water vapor transmission rate, both important properties for food packaging applications. After homogenization, MFCs with high lignin content had a higher water vapor transmission rate, even with a higher initial contact angle, hypothesized to be due to large hydrophobic pores in the film. A small amount of paraffin wax, less than 10%, reduced the WVTR to a similar value as low density polyethylene. Hard-to-remove water content correlated with specific surface area up to approximately 50 m2/g, but not with water retention value. The drying rate of the MFCs increased with the specific surface area. Hornified fibers from recycled paper also have the potential to be used as starting materials for MFC production as the physical and optical properties of the films were similar to the films from virgin fibers. In summary, the utilization of lignin containing MFCs resulted in unique properties and should reduce MFC production costs by reducing wood, chemical, and energy requirements.

Journal ArticleDOI
TL;DR: In this article, the effect of minor alloying and impurity elements, typically present in electronics manufacturing environment, on the interfacial reactions between Sn and Cu, which is the base system for Pb-free soldering is analyzed.
Abstract: The objective of this review is to study the effect of minor alloying and impurity elements, typically present in the electronics manufacturing environment, on the interfacial reactions between Sn and Cu, which is the base system for Pb-free soldering. In particular, the reasons leading to the observed interfacial reaction layers and their microstructural evolution are analysed. The following conclusions have been reached. Alloying and impurity elements can have three major effects on the reactions between the Sn-based solder and the conductor metal: firstly, they can increase or decrease the reaction/growth rate; secondly, they can change the physical properties of the phases formed (in the case of Cu and Sn, ɛ and η); thirdly, they can form additional reaction layers at the interface, or they can displace the binary phases that would normally appear and form other reaction products instead. Further, the alloying and impurity elements can be divided roughly into two major categories: (i) elements (Ni, Au, Sb, In, Co, Pt, Pd, and Zn) that show marked solubility in the intermetallic compound (IMC) layer and generally take part in the interfacial reaction in question, and (ii) elements (Bi, Ag, Fe, Al, P, rare-earth elements, Ti, and S) that are not extensively soluble in the IMC layer and only change the activities of the species taking part in the interfacial reaction without usually participating themselves. The elements belonging to category (i) usually have the most pronounced effect on IMC formation. It is also shown that by adding appropriate amounts of certain alloying elements to a Sn-based solder, it is possible to tailor the properties of the interfacial compounds to exhibit, for example, better drop-test reliability. Further, it is demonstrated that if excess amounts of the same alloying elements are used, a drastic decrease in reliability can occur. The analysis of this behaviour is based on the so-called thermodynamic–kinetic method.

Journal ArticleDOI
TL;DR: Interestingly, after homogenization, the presence of lignin significantly increased film toughness, tensile index, and elastic modulus, indicating that MFC films can potentially be made from low-cost recycled cellulosic materials.

Journal ArticleDOI
TL;DR: In this article, the authors present a new synthesis, based on a suite of complementary approaches, of the primary production and carbon sink in forests of the 25 member states of the European Union (EU-25) during 1990-2005.
Abstract: We present a new synthesis, based on a suite of complementary approaches, of the primary production and carbon sink in forests of the 25 member states of the European Union (EU-25) during 1990-2005. Upscaled terrestrial observations and model-based approaches agree within 25% on the mean net primary production (NPP) of forests, i.e. 520 +/- 75 g C m⁻² yr⁻¹ over a forest area of 1.32 × 10⁶ km² to 1.55 × 10⁶ km² (EU-25). New estimates of the mean long-term forest carbon sink (net biome production, NBP) of EU-25 forests amount to 75 +/- 20 g C m⁻² yr⁻¹. The ratio of NBP to NPP is 0.15 +/- 0.05. Estimates of the fate of the carbon inputs via NPP in wood harvests, forest fires, losses to lakes and rivers, and heterotrophic respiration remain uncertain, which explains the considerable uncertainty of NBP. Inventory-based assessments and assumptions suggest that 29 +/- 15% of the NBP (i.e., 22 g C m⁻² yr⁻¹) is sequestered in the forest soil, but large uncertainty remains concerning the drivers and future of the soil organic carbon. The remaining 71 +/- 15% of the NBP (i.e., 53 g C m⁻² yr⁻¹) is realized as woody biomass increments. In the EU-25, the relatively large forest NBP is thought to be the result of a sustained difference between NPP, which increased during the past decades, and carbon losses, primarily by harvest and heterotrophic respiration, which increased less over the same period.
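The carbon budget quoted in the abstract is internally consistent and easy to check; the figures below are taken directly from it:

```python
npp = 520   # mean net primary production, g C m^-2 yr^-1
nbp = 75    # mean net biome production (carbon sink), g C m^-2 yr^-1

ratio = nbp / npp            # quoted as 0.15 +/- 0.05
soil  = round(0.29 * nbp)    # ~29% of NBP sequestered in forest soil
wood  = round(0.71 * nbp)    # ~71% realised as woody biomass increment
```

75/520 ≈ 0.144, consistent with the quoted ratio of 0.15 ± 0.05, and the 29%/71% split of 75 g C m⁻² yr⁻¹ reproduces the 22 and 53 g C m⁻² yr⁻¹ figures for soil sequestration and woody biomass increment.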

Journal ArticleDOI
TL;DR: In this article, the current status of the long-term stability of dye solar cells, and the factors affecting it, is reviewed; the authors conclude that techniques giving chemical information are needed in the stability investigations of DSCs to reveal possible ways to improve their lifetime.
Abstract: The current status of the long-term stability of dye solar cells (DSCs), and the factors affecting it, is reviewed. The purpose is to clarify present knowledge of degradation phenomena and factors in these cells by critically separating the assumptions from the solid experimental evidence reported in the literature. Important degradation processes are covered, such as dye desorption, decrease in the tri-iodide concentration, degradation at the photoelectrode and counter electrode, the effect of ultraviolet light and moisture, and issues related to the sealing. It is concluded that techniques giving chemical information are needed in the stability investigations of DSCs to reveal possible ways to improve their lifetime. In this regard, experimental methods suitable for separating degradation mechanisms in complete cells during long-term testing are proposed, employing specifically designed sealed cell structures, called segmented cells, that provide windows to measure specific cell components without being obscured by the others.

Journal ArticleDOI
TL;DR: In this paper, the authors present a new generic modelling approach, with a focus on cooking fuel choices, to explore response strategies for energy poverty eradication in India; the approach analyzes the determinants of fuel consumption choices for heterogeneous household groups, incorporating the effect of income distributions and traditionally more intangible factors such as preferences and private discount rates.

Journal ArticleDOI
TL;DR: Five approaches to software development are presented, organized from integration-centric to composition-oriented, and their areas of applicability are described.

Journal ArticleDOI
TL;DR: The current results support the view that cognitive and motor functions are segregated in the cerebellum and suggest that the posterior cerebellar activity during a demanding cognitive task is involved with optimization of the response speed.
Abstract: We applied fMRI and diffusion-weighted MRI to study the segregation of cognitive and motor functions in the human cerebro-cerebellar system. Our fMRI results show that a load increase in a nonverbal auditory working memory task is associated with enhanced brain activity in the parietal, dorsal premotor, and lateral prefrontal cortices and in lobules VII-VIII of the posterior cerebellum, whereas a sensory-motor control task activated the motor/somatosensory, medial prefrontal, and posterior cingulate cortices and lobules V/VI of the anterior cerebellum. The load-dependent activity in the crus I/II had a specific relationship with cognitive performance: This activity correlated negatively with load-dependent increase in RTs. This correlation between brain activity and RTs was not observed in the sensory-motor task in the activated cerebellar regions. Furthermore, probabilistic tractography analysis of the diffusion-weighted MRI data suggests that the tracts between the cerebral and the cerebellar areas exhibiting cognitive load-dependent and sensory-motor activity are mainly projected via separated pontine (feed-forward tracts) and thalamic (feedback tracts) nuclei. The tractography results also indicate that the crus I/II in the posterior cerebellum is linked with the lateral prefrontal areas activated by cognitive load increase, whereas the anterior cerebellar lobe is not. The current results support the view that cognitive and motor functions are segregated in the cerebellum. On the basis of these results and theories of the function of the cerebellum, we suggest that the posterior cerebellar activity during a demanding cognitive task is involved with optimization of the response speed.

Journal ArticleDOI
TL;DR: In this paper, the dependence of the brightness of blazars on the intrinsic properties of their parsec-scale radio jets and the implication for relativistic beaming was investigated by combining high-resolution VLBA images from the MOJAVE program with millimetrewavelength flux density monitoring data from Metsahovi Radio Observatory.
Abstract: Aims. We investigate the dependence of γ-ray brightness of blazars on intrinsic properties of their parsec-scale radio jets and the implication for relativistic beaming. Methods. By combining apparent jet speeds derived from high-resolution VLBA images from the MOJAVE program with millimetre-wavelength flux density monitoring data from Metsahovi Radio Observatory, we estimate the jet Doppler factors, Lorentz factors, and viewing angles for a sample of 62 blazars. We study the trends in these quantities between the sources which were detected in γ-rays by the Fermi Large Area Telescope (LAT) during its first three months of science operations and those which were not detected. Results. The LAT-detected blazars have on average higher Doppler factors than non-LAT-detected blazars, as has been implied indirectly in several earlier studies. We find statistically significant differences in the viewing angle distributions between γ-ray bright and weak sources. Most interestingly, γ-ray bright blazars have a distribution of comoving frame viewing angles that is significantly narrower than that of γ-ray weak blazars and centred roughly perpendicular to the jet axis. The lack of γ-ray bright blazars at large comoving frame viewing angles can be explained by relativistic beaming of γ-rays, while the apparent lack of γ-ray bright blazars at small comoving frame viewing angles, if confirmed with larger samples, may suggest an intrinsic anisotropy or Lorentz factor dependence of the γ-ray emission.
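Given a measured apparent jet speed β_app (in units of c) and a variability-estimated Doppler factor δ, the Lorentz factor and viewing angle follow from the standard beaming relations Γ = (β_app² + δ² + 1)/(2δ) and θ = arctan[2β_app/(β_app² + δ² − 1)]. The sketch below applies these standard relations with illustrative numbers, not values from the paper:

```python
import math

def jet_parameters(beta_app, delta):
    """Lorentz factor and viewing angle (degrees) from the apparent
    superluminal speed (units of c) and the Doppler factor, using the
    standard relativistic beaming relations."""
    gamma = (beta_app**2 + delta**2 + 1.0) / (2.0 * delta)
    # atan2 keeps the angle in the correct quadrant even when
    # beta_app^2 + delta^2 - 1 is negative.
    theta = math.degrees(math.atan2(2.0 * beta_app,
                                    beta_app**2 + delta**2 - 1.0))
    return gamma, theta

# Illustrative blazar-like values (not from the paper):
gamma, theta = jet_parameters(beta_app=10.0, delta=10.0)
```

For β_app = δ = 10 this gives Γ ≈ 10.05 and θ ≈ 5.7°, the familiar result that bright superluminal sources are viewed within a few degrees of the jet axis.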

Journal ArticleDOI
TL;DR: In this article, the behavior of the parsec-scale jet of the quasar 3C 454.3 during pronounced flaring in 2005-2008 was analyzed and correlations between optical variations and those at X-ray and γ-ray energies were found.
Abstract: We analyze the behavior of the parsec-scale jet of the quasar 3C 454.3 during pronounced flaring in 2005-2008. Three major disturbances propagated down the jet along different trajectories with Lorentz factors Γ > 10. The disturbances show a clear connection with millimeter-wave outbursts, in 2005 May/June, 2007 July, and 2007 December. High-amplitude optical events in the R-band light curve precede peaks of the millimeter-wave outbursts by 15-50 days. Each optical outburst is accompanied by an increase in X-ray activity. We associate the optical outbursts with propagation of the superluminal knots and derive the location of sites of energy dissipation in the form of radiation. The most prominent and long lasting of these, in 2005 May, occurred closer to the black hole, while the outbursts with a shorter duration in 2005 autumn and in 2007 might be connected with the passage of a disturbance through the millimeter-wave core of the jet. The optical outbursts, which coincide with the passage of superluminal radio knots through the core, are accompanied by systematic rotation of the position angle of optical linear polarization. Such rotation appears to be a common feature during the early stages of flares in blazars. We find correlations between optical variations and those at X-ray and γ-ray energies. We conclude that the emergence of a superluminal knot from the core yields a series of optical and high-energy outbursts, and that the millimeter-wave core lies at the end of the jet's acceleration and collimation zone. We infer that the X-ray emission is produced via inverse Compton scattering by relativistic electrons of photons both from within the jet (synchrotron self-Compton) and external to the jet (external Compton, or EC); which one dominates depends on the physical parameters of the jet. A broken power-law model of the γ-ray spectrum reflects a steepening of the synchrotron emission spectrum from near-IR to soft UV wavelengths. 
We propose that the γ-ray emission is dominated by the EC mechanism, with the sheath of the jet supplying seed photons for γ-ray events that occur near the millimeter-wave core.

Journal ArticleDOI
Marco Bersanelli, Nazzareno Mandolesi, R. C. Butler, A. Mennella, F. Villa, Beatriz Aja, Eduardo Artal, E. Artina, Carlo Baccigalupi, M. Balasini, G. Baldan, A. J. Banday, P. Bastia, P. Battaglia, T. Bernardino, E. Blackhurst, L. Boschini, Carlo Burigana, G. Cafagna, B. Cappellini, Francesco Cavaliere, F. Colombo, G. Crone, F. Cuttaia, Ocleto D'Arcangelo, Luigi Danese, R. D. Davies, R. J. Davis, L. De Angelis, G. de Gasperis, L. de la Fuente, A. de Rosa, G. de Zotti, M. C. Falvella, Fabricio Ferrari, R. Ferretti, Lorenzo Figini, S. Fogliani, Cristian Franceschet, E. Franceschi, T. C. Gaier, S. Garavaglia, F. Gomez, Krzysztof M. Gorski, A. Gregorio, P. Guzzi, J. M. Herreros, Sergi R. Hildebrandt, Roger J. Hoyland, N. Hughes, Michael Janssen, P. Jukkala, D. Kettle, V. H. Kilpiä, M. Laaninen, P. M. Lapolla, Charles R. Lawrence, D. Lawson, J. P. Leahy, Rodrigo Leonardi, P. Leutenegger, Steven Levin, P. B. Lilje, S. R. Lowe, Philip Lubin, Davide Maino, M. Malaspina, Michele Maris, J. Marti-Canales, E. Martínez-González, Angel Mediavilla, Peter Meinhold, M. Miccolis, Gianluca Morgante, P. Natoli, Renzo Nesti, L. Pagan, Christopher G. Paine, Bruce Partridge, Juan Pablo Pascual, Fabio Pasian, David Pearson, M. Pecora, Francesca Perrotta, Paola Platania, Marian Pospieszalski, T. Poutanen, M. Prina, Rafael Rebolo, N. Roddis, Jose Alberto Rubino-Martin, M. J. Salmon, M. Sandri, Michael Seiffert, R. Silvestri, Alessandro Simonetto, P. Sjoman, G. F. Smoot, Carlo Sozzi, Luca Stringhetti, E. Taddei, Jan Tauber, Luca Terenzi, M. Tomasi, Jussi Tuovinen, Luca Valenziano, Jussi Varis, Nicola Vittorio, Lawrence A. Wade, Althea Wilkinson, F. Winder, Andrea Zacchei, Andrea Zonca
TL;DR: The Planck Low Frequency Instrument (LFI) as mentioned in this paper is an array of microwave radiometers based on state-of-the-art Indium Phosphide cryogenic HEMT amplifiers implemented in a differential system using blackbody loads as reference signals.
Abstract: In this paper we present the Low Frequency Instrument (LFI), designed and developed as part of the Planck space mission, the ESA program dedicated to precision imaging of the cosmic microwave background (CMB). Planck-LFI will observe the full sky in intensity and polarisation in three frequency bands centred at 30, 44 and 70 GHz, while higher frequencies (100-850 GHz) will be covered by the HFI instrument. The LFI is an array of microwave radiometers based on state-of-the-art Indium Phosphide cryogenic HEMT amplifiers implemented in a differential system using blackbody loads as reference signals. The front-end is cooled to 20K for optimal sensitivity and the reference loads are cooled to 4K to minimise low frequency noise. We provide an overview of the LFI, discuss the leading scientific requirements and describe the design solutions adopted for the various hardware subsystems. The main drivers of the radiometric, optical and thermal design are discussed, including the stringent requirements on sensitivity, stability, and rejection of systematic effects. Further details on the key instrument units and the results of ground calibration are provided in a set of companion papers.

Journal ArticleDOI
TL;DR: In this article, the authors model the dynamics of the demagnetization of a dovetail machine under a constant load torque and show that the thermal model should be included in the demagnetic calculations.
Abstract: The demagnetization of permanent magnets in permanent-magnet machines has been under discussion in many publications lately. Demagnetization models have been used, for example, to optimize machine structures, but there have been no studies on how the demagnetization is coupled with the loading and temperature rise of the machine and how this coupling should be modeled. In this paper, we model the dynamics of the demagnetization of a dovetail machine under a constant load torque. We show that the thermal model should be included in the demagnetization calculations. The demagnetization will cause an increase in the copper losses, which will raise the temperatures of the machine. This will cause further demagnetization and might even lead to a stall in a situation in which a model neglecting the thermal effects predicts stable operation without additional demagnetization.
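The runaway mechanism the abstract describes (demagnetization raises copper losses, which raises temperature, which demagnetizes further) can be illustrated with a toy fixed-point iteration. This is a minimal sketch only, not the paper's coupled field-thermal model; the coefficients, rated temperature, and stall threshold below are all hypothetical.

```python
def simulate(k_demag, steps=200):
    """Toy magnet-thermal feedback loop under constant load torque.

    k_demag: per-kelvin remanence loss above the rated magnet
    temperature (a hypothetical, lumped coefficient).
    """
    B_r, T = 1.0, 80.0  # remanence (T) and magnet temperature (degC), toy values
    for _ in range(steps):
        # constant torque: weaker magnets demand more current, so I^2*R grows
        current = 1.0 / max(B_r, 1e-6)
        copper_loss = current ** 2
        # lumped steady-state thermal model: temperature rise tracks losses
        T = 80.0 + 20.0 * copper_loss
        # temperature above rating erodes remanence
        B_r = 1.0 - k_demag * max(T - 80.0, 0.0)
        if B_r <= 0.2:
            return "stall", T
    return "stable", T

print(simulate(k_demag=0.004))  # weak coupling: settles at a new operating point
print(simulate(k_demag=0.02))   # strong coupling: runaway demagnetization
```

With weak coupling the iteration converges to a slightly demagnetized but stable operating point; above a critical coupling the feedback diverges, mirroring the stall scenario that a model without the thermal loop would miss.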

Journal ArticleDOI
15 Jun 2010-PLOS ONE
TL;DR: The results confirm previous findings that neural activity increases during enhanced working memory performance and suggest that superior working memory task performance in musicians relies on an enhanced ability to exert sustained cognitive control.
Abstract: Musical competence may confer cognitive advantages that extend beyond the processing of familiar musical sounds. Behavioural evidence indicates a general enhancement of both working memory and attention in musicians. It is possible that musicians, due to their training, are better able to maintain focus on task-relevant stimuli, a skill which is crucial to working memory. We measured the blood oxygenation level dependent (BOLD) activation signal in musicians and non-musicians during working memory of musical sounds to determine the relation among performance, musical competence and generally enhanced cognition. All participants easily distinguished the stimuli. We tested the hypothesis that musicians would nonetheless perform better, and that differential brain activity would mainly be present in cortical areas involved in cognitive control such as the lateral prefrontal cortex. The musicians performed better as reflected in reaction times and error rates. Musicians also had larger BOLD responses than non-musicians in neuronal networks that sustain attention and cognitive control, including regions of the lateral prefrontal cortex, lateral parietal cortex, insula, and putamen in the right hemisphere, and bilaterally in the posterior dorsal prefrontal cortex and anterior cingulate gyrus. The relationship between task performance and the magnitude of the BOLD response was more positive in musicians than in non-musicians, particularly during the most difficult working memory task. The results confirm previous findings that neural activity increases during enhanced working memory performance. The results also suggest that superior working memory task performance in musicians relies on an enhanced ability to exert sustained cognitive control. This cognitive benefit in musicians may be a consequence of focused musical training.

Journal ArticleDOI
TL;DR: This research aimed to identify the requirements posed by precision agriculture on farm management information systems (FMIS) and then evaluate a modern Web-based approach to the implementation of an FMIS that fulfilled these additional requirements.

Journal ArticleDOI
TL;DR: This work proposes to apply ICA to short-time Fourier transforms of EEG/MEG signals, in order to find more "interesting" sources than with time-domain ICA, and to sort the obtained components more meaningfully.
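The core idea, applying ICA to short-time Fourier transform coefficients rather than to the raw time series, can be sketched in a few lines. This is a simplified illustration on synthetic data, not the paper's algorithm: the original method operates on complex-valued STFT coefficients, whereas for brevity this sketch feeds spectral amplitudes to a real-valued FastICA.

```python
import numpy as np
from scipy.signal import stft
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs = 200.0
t = np.arange(0, 10, 1 / fs)

# synthetic 4-channel "EEG": two oscillatory bursts mixed linearly
s1 = np.sin(2 * np.pi * 10 * t) * (t > 5)   # 10 Hz burst in the second half
s2 = np.sin(2 * np.pi * 20 * t) * (t < 5)   # 20 Hz burst in the first half
mixing = rng.normal(size=(4, 2))
x = mixing @ np.vstack([s1, s2]) + 0.1 * rng.normal(size=(4, t.size))

# short-time Fourier transform of each channel: Z has shape (channels, freqs, windows)
f, seg_times, Z = stft(x, fs=fs, nperseg=64)

# treat each (frequency, window) point as one observation across channels;
# amplitudes are used here to keep the sketch real-valued
features = np.abs(Z).reshape(Z.shape[0], -1).T   # (freqs * windows, channels)
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(features)            # spectro-temporal components

print(sources.shape)
```

Reshaping a component column back to `(len(f), len(seg_times))` recovers its time-frequency signature, which is what makes the extracted sources easier to interpret and rank than time-domain ICA components.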

Journal ArticleDOI
TL;DR: In this paper, the authors suggest the use of solution-specific business models with six key business model elements and develop a typology of five solution-specific business models, which can also be used for assessing the performance of individual solutions.

Journal ArticleDOI
TL;DR: The experimental section shows that multiple-output approaches represent a competitive choice for tackling long-term forecasting tasks; the work goes a step beyond the authors' previous contributions by extending the multiple-output approach with a query-based criterion.
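The multiple-output strategy mentioned here trains one model that maps a window of past values to the entire vector of future values, instead of iterating a one-step model and accumulating its errors. A minimal sketch of that idea follows; the lag, horizon, synthetic series, and the choice of a nearest-neighbour regressor are all illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# synthetic series: a noisy sine wave
rng = np.random.default_rng(1)
y = np.sin(np.linspace(0, 60, 600)) + 0.05 * rng.normal(size=600)

lag, horizon = 12, 6   # embed 12 past values, predict the next 6 jointly

# build (input window -> whole future vector) training pairs
X = np.array([y[i:i + lag] for i in range(len(y) - lag - horizon)])
Y = np.array([y[i + lag:i + lag + horizon] for i in range(len(y) - lag - horizon)])

# one model predicts all horizon steps at once (multiple-output strategy),
# avoiding the error accumulation of recursive one-step forecasting
model = KNeighborsRegressor(n_neighbors=5).fit(X, Y)
forecast = model.predict(y[-lag:].reshape(1, -1))
print(forecast.shape)   # one 6-step-ahead trajectory
```

A recursive forecaster would instead call a one-step model six times, feeding each prediction back as input; the multiple-output formulation sidesteps that feedback loop, which is the competitive advantage the abstract refers to.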

Journal ArticleDOI
TL;DR: The thermodynamics of proteins, the hydrophobic effect and cold denaturation are reviewed and summarized, starting by accounting for these phenomena macroscopically and then moving to their atomic-level description.