
Showing papers from Aalto University published in 2010


Journal ArticleDOI
05 Aug 2010-Nature
TL;DR: The results identify several novel loci associated with plasma lipids that are also associated with CAD and provide the foundation to develop a broader biological understanding of lipoprotein metabolism and to identify new therapeutic opportunities for the prevention of CAD.
Abstract: Plasma concentrations of total cholesterol, low-density lipoprotein cholesterol, high-density lipoprotein cholesterol and triglycerides are among the most important risk factors for coronary artery disease (CAD) and are targets for therapeutic intervention. We screened the genome for common variants associated with plasma lipids in >100,000 individuals of European ancestry. Here we report 95 significantly associated loci (P < 5 x 10(-8)), with 59 showing genome-wide significant association with lipid traits for the first time. The newly reported associations include single nucleotide polymorphisms (SNPs) near known lipid regulators (for example, CYP7A1, NPC1L1 and SCARB1) as well as in scores of loci not previously implicated in lipoprotein metabolism. The 95 loci contribute not only to normal variation in lipid traits but also to extreme lipid phenotypes and have an impact on lipid traits in three non-European populations (East Asians, South Asians and African Americans). Our results identify several novel loci associated with plasma lipids that are also associated with CAD. Finally, we validated three of the novel genes-GALNT2, PPP1R3B and TTC39B-with experiments in mouse models. Taken together, our findings provide the foundation to develop a broader biological understanding of lipoprotein metabolism and to identify new therapeutic opportunities for the prevention of CAD.
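The genome-wide significance threshold P < 5 × 10⁻⁸ used above is conventionally motivated as a Bonferroni correction of a 0.05 family-wise error rate across roughly one million independent common variants; a minimal sketch of that arithmetic (the test count is the conventional estimate, an assumption here, not a figure from the paper):

```python
# Hypothetical illustration: Bonferroni correction behind the
# genome-wide significance threshold P < 5e-8.
family_wise_alpha = 0.05
independent_tests = 1_000_000  # conventional estimate, assumed here

per_test_threshold = family_wise_alpha / independent_tests
# per_test_threshold is 5e-8
```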

3,469 citations


Journal ArticleDOI
TL;DR: This article presents the projector augmented-wave (PAW) method as implemented in the GPAW program package using a uniform real-space grid representation of the electronic wavefunctions and implements the two common formulations of TDDFT, namely the linear-response and the time propagation schemes.
Abstract: Electronic structure calculations have become an indispensable tool in many areas of materials science and quantum chemistry. Even though the Kohn-Sham formulation of the density-functional theory (DFT) simplifies the many-body problem significantly, one is still confronted with several numerical challenges. In this article we present the projector augmented-wave (PAW) method as implemented in the GPAW program package (https://wiki.fysik.dtu.dk/gpaw) using a uniform real-space grid representation of the electronic wavefunctions. Compared to more traditional plane wave or localized basis set approaches, real-space grids offer several advantages, most notably good computational scalability and systematic convergence properties. However, as a unique feature GPAW also facilitates a localized atomic-orbital basis set in addition to the grid. The efficient atomic basis set is complementary to the more accurate grid, and the possibility to seamlessly switch between the two representations provides great flexibility. While DFT allows one to study ground state properties, time-dependent density-functional theory (TDDFT) provides access to the excited states. We have implemented the two common formulations of TDDFT, namely the linear-response and the time propagation schemes. Electron transport calculations under finite-bias conditions can be performed with GPAW using non-equilibrium Green functions and the localized basis set. In addition to the basic features of the real-space PAW method, we also describe the implementation of selected exchange-correlation functionals, parallelization schemes, Delta SCF-method, x-ray absorption spectra, and maximally localized Wannier orbitals.
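The systematic convergence of uniform real-space grids mentioned above can be illustrated with a toy calculation (this is not GPAW code; the function name and box setup are our own): a finite-difference kinetic-energy operator for a 1D particle in a box (ħ = m = 1, box length 1) whose lowest eigenvalue approaches the exact value π²/2 as the grid spacing shrinks.

```python
import numpy as np

# Diagonalize -1/2 d^2/dx^2 on a uniform grid with Dirichlet boundaries
# using the standard 3-point stencil; the error falls off as O(h^2).
def ground_state_energy(n_points, length=1.0):
    h = length / (n_points + 1)            # grid spacing (box ends excluded)
    main = np.full(n_points, 1.0 / h**2)   # diagonal of -1/2 * second derivative
    off = np.full(n_points - 1, -0.5 / h**2)
    t = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(t)[0]

exact = np.pi**2 / 2
errors = [abs(ground_state_energy(n) - exact) for n in (20, 40, 80)]
# The error shrinks roughly 4x each time the spacing is halved.
```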

1,822 citations


Journal ArticleDOI
TL;DR: It is shown that freeze-dried bacterial cellulose nanofibril aerogels can be used as templates for making lightweight porous magnetic aerogels, which can be compacted into a stiff magnetic nanopaper.
Abstract: Nanostructured biological materials inspire the creation of materials with tunable mechanical properties. Strong cellulose nanofibrils derived from bacteria or wood can form ductile or tough networks that are suitable as functional materials. Here, we show that freeze-dried bacterial cellulose nanofibril aerogels can be used as templates for making lightweight porous magnetic aerogels, which can be compacted into a stiff magnetic nanopaper. The 20-70-nm-thick cellulose nanofibrils act as templates for the non-agglomerated growth of ferromagnetic cobalt ferrite nanoparticles (diameter, 40-120 nm). Unlike solvent-swollen gels and ferrogels, our magnetic aerogel is dry, lightweight, porous (98%), flexible, and can be actuated by a small household magnet. Moreover, it can absorb water and release it upon compression. Owing to their flexibility, high porosity and surface area, these aerogels are expected to be useful in microfluidics devices and as electronic actuators.

753 citations


Journal ArticleDOI
07 Oct 2010-Nature
TL;DR: High-fidelity single-shot spin readout in silicon opens the way to the development of a new generation of quantum computing and spintronic devices, built using the most important material in the semiconductor industry.
Abstract: The size of silicon transistors used in microelectronic devices is shrinking to the level at which quantum effects become important. Although this presents a significant challenge for the further scaling of microprocessors, it provides the potential for radical innovations in the form of spin-based quantum computers and spintronic devices. An electron spin in silicon can represent a well-isolated quantum bit with long coherence times because of the weak spin-orbit coupling and the possibility of eliminating nuclear spins from the bulk crystal. However, the control of single electrons in silicon has proved challenging, and so far the observation and manipulation of a single spin has been impossible. Here we report the demonstration of single-shot, time-resolved readout of an electron spin in silicon. This has been performed in a device consisting of implanted phosphorus donors coupled to a metal-oxide-semiconductor single-electron transistor-compatible with current microelectronic technology. We observed a spin lifetime of ∼6 seconds at a magnetic field of 1.5 tesla, and achieved a spin readout fidelity better than 90 per cent. High-fidelity single-shot spin readout in silicon opens the way to the development of a new generation of quantum computing and spintronic devices, built using the most important material in the semiconductor industry.

669 citations


Journal ArticleDOI
21 May 2010-Science
TL;DR: Dendrimersomes marry the stability and mechanical strength obtainable from polymersomes with the biological function of stabilized phospholipid liposomes, plus superior uniformity of size, ease of formation, and chemical functionalization, providing access to systematic tuning of molecular structure and of self-assembled architecture.
Abstract: Self-assembled nanostructures obtained from natural and synthetic amphiphiles serve as mimics of biological membranes and enable the delivery of drugs, proteins, genes, and imaging agents. Yet the precise molecular arrangements demanded by these functions are difficult to achieve. Libraries of amphiphilic Janus dendrimers, prepared by facile coupling of tailored hydrophilic and hydrophobic branched segments, have been screened by cryogenic transmission electron microscopy, revealing a rich palette of morphologies in water, including vesicles, denoted dendrimersomes, cubosomes, disks, tubular vesicles, and helical ribbons. Dendrimersomes marry the stability and mechanical strength obtainable from polymersomes with the biological function of stabilized phospholipid liposomes, plus superior uniformity of size, ease of formation, and chemical functionalization. This modular synthesis strategy provides access to systematic tuning of molecular structure and of self-assembled architecture.

622 citations


Proceedings ArticleDOI
28 Oct 2010
TL;DR: The review concludes that too many new systems are being developed, but also acknowledges the current reasons for the phenomenon and encourages opening up existing systems and joining efforts to develop them further.
Abstract: This paper presents a systematic literature review of the recent (2006--2010) development of automatic assessment tools for programming exercises. We discuss the major features that the tools support and the different approaches they are using both from the pedagogical and the technical point of view. Examples of these features are ways for the teacher to define tests, resubmission policies, security issues, and so forth. We have also identified a list of novel features, like assessing web software, that are likely to get more research attention in the future. As a conclusion, we state that too many new systems are developed, but also acknowledge the current reasons for the phenomenon. As one solution we encourage opening up the existing systems and joining efforts on developing those further. Selected systems from our survey are briefly described in Appendix A.

499 citations


Book
08 Nov 2010
TL;DR: An introduction to exact exponential algorithms for NP-hard problems, explaining the most common algorithmic techniques, with exercises and detailed notes for further reading.
Abstract: Today most computer scientists believe that NP-hard problems cannot be solved by polynomial-time algorithms. From the polynomial-time perspective, all NP-complete problems are equivalent but their exponential-time properties vary widely. Why do some NP-hard problems appear to be easier than others? Are there algorithmic techniques for solving hard problems that are significantly faster than the exhaustive, brute-force methods? The algorithms that address these questions are known as exact exponential algorithms. The history of exact exponential algorithms for NP-hard problems dates back to the 1960s. The two classical examples are the Bellman-Held-Karp dynamic programming algorithm for the traveling salesman problem and Ryser's inclusion-exclusion formula for the permanent of a matrix. The design and analysis of exact algorithms leads to a better understanding of hard problems and initiates interesting new combinatorial and algorithmic challenges. The last decade has witnessed a rapid development of the area, with many new algorithmic techniques discovered. This has transformed exact algorithms into a very active research field. This book provides an introduction to the area and explains the most common algorithmic techniques, and the text is supported throughout with exercises and detailed notes for further reading. The book is intended for advanced students and researchers in computer science, operations research, optimization and combinatorics.
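The Bellman-Held-Karp dynamic program cited above can be sketched compactly: the optimal tour is assembled from optimal costs dp[(S, j)] of reaching city j having visited exactly the set S, giving O(2^n · n^2) time instead of the brute-force O(n!). The function name and example matrix below are ours.

```python
from itertools import combinations

# Held-Karp dynamic programming for the traveling salesman problem.
# dist is a symmetric matrix of pairwise distances; the tour starts
# and ends at city 0.
def held_karp(dist):
    n = len(dist)
    # Base case: paths 0 -> j visiting only {j}.
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            bits = 0
            for j in subset:
                bits |= 1 << j
            for j in subset:
                prev = bits & ~(1 << j)  # subset with j removed
                dp[(bits, j)] = min(dp[(prev, k)] + dist[k][j]
                                    for k in subset if k != j)
    full = (1 << n) - 2  # all cities except the start
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

# Unit square with side 1 and diagonal 2: the optimal tour has cost 4.
square = [[0, 1, 2, 1],
          [1, 0, 1, 2],
          [2, 1, 0, 1],
          [1, 2, 1, 0]]
# held_karp(square) -> 4
```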

494 citations


Journal ArticleDOI
TL;DR: In this article, the authors present results from monitoring the multi-waveband flux, linear polarization, and parsec-scale structure of the quasar PKS 1510-089, concentrating on eight major γ-ray flares that occurred during the interval 2009.0-2009.5.
Abstract: We present results from monitoring the multi-waveband flux, linear polarization, and parsec-scale structure of the quasar PKS 1510-089, concentrating on eight major γ-ray flares that occurred during the interval 2009.0-2009.5. The γ-ray peaks were essentially simultaneous with maxima at optical wavelengths, although the flux ratio of the two wave bands varied by an order of magnitude. The optical polarization vector rotated by 720° during a five-day period encompassing six of these flares. This culminated in a very bright, ~1 day, optical and γ-ray flare as a bright knot of emission passed through the highest-intensity, stationary feature (the "core") seen in 43 GHz Very Long Baseline Array images. The knot continued to propagate down the jet at an apparent speed of 22c and emit strongly at γ-ray energies as a months-long X-ray/radio outburst intensified. We interpret these events as the result of the knot following a spiral path through a mainly toroidal magnetic field pattern in the acceleration and collimation zone of the jet, after which it passes through a standing shock in the 43 GHz core and then continues downstream. In this picture, the rapid γ-ray flares result from scattering of infrared seed photons from a relatively slow sheath of the jet as well as from optical synchrotron radiation in the faster spine. The 2006-2009.7 radio and X-ray flux variations are correlated at very high significance; we conclude that the X-rays are mainly from inverse Compton scattering of infrared seed photons by 20-40 MeV electrons.

490 citations


Posted Content
TL;DR: This paper found that stock market participation is monotonically related to IQ, controlling for wealth, income, age, and other demographic and occupational information, and that high-IQ investors are more likely to hold mutual funds and larger numbers of stocks, experience lower risk, and earn higher Sharpe ratios.
Abstract: Stock market participation is monotonically related to IQ, controlling for wealth, income, age, and other demographic and occupational information. The high correlation between IQ, measured early in adult life, and participation, exists even among the affluent. Supplemental data from siblings, studied with an instrumental variables approach and regressions that control for family effects, demonstrate that IQ’s influence on participation extends to females and does not arise from omitted familial and non-familial variables. High-IQ investors are more likely to hold mutual funds and larger numbers of stocks, experience lower risk, and earn higher Sharpe ratios. We discuss implications for policy and finance research.

448 citations


Journal ArticleDOI
TL;DR: An electron transport study of lithographically fabricated graphene nanoribbons of various widths and lengths finds that charging effects constitute a significant portion of the activation energy.
Abstract: We report an electron transport study of lithographically fabricated graphene nanoribbons (GNRs) of various widths and lengths. At the charge neutrality point, a length-independent transport gap forms whose size is inversely proportional to the GNR width. In this gap, electrons are localized, and charge transport exhibits a transition between thermally activated behavior at higher temperatures and variable range hopping at lower temperatures. By varying the geometric capacitance, we find that charging effects constitute a significant portion of the activation energy.
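The inverse width scaling stated above can be written as Δ(W) ≈ α/W. A minimal numeric sketch follows; the prefactor α = 0.2 eV·nm is a commonly quoted empirical value for lithographic ribbons and is used here only as an illustrative assumption, not a parameter from this paper.

```python
# Illustrative inverse-width scaling of the transport gap in
# lithographic graphene nanoribbons: Delta(W) ~ alpha / W.
ALPHA_EV_NM = 0.2  # assumed empirical prefactor (eV * nm)

def transport_gap_ev(width_nm):
    """Estimated transport gap (eV) for a ribbon of the given width (nm)."""
    return ALPHA_EV_NM / width_nm

# Narrower ribbons open larger gaps: halving the width doubles the gap.
```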

434 citations


Journal ArticleDOI
TL;DR: The model parameters can be determined from complete DSCs by current-dependent EIS and incident-photon-to-collected-electron (IPCE) measurements, supplemented by optical characterization, and used to quantify performance losses in DSCs.
Abstract: Design of new materials for nanostructured dye solar cells (DSC) requires understanding the link between the material properties and cell efficiency. This paper gives an overview of the fundamental and practical aspects of the modeling and characterization of DSCs, and integrates the knowledge into a user-friendly DSC device model. Starting from basic physical and electrochemical concepts, mathematical expressions for the IV curve and differential resistance of all resistive cell components are derived and their relation to electrochemical impedance spectroscopy (EIS) is explained. The current understanding of the associated physics is discussed in detail and clarified. It is shown how the model parameters can be determined from complete DSCs by current dependent EIS and incident-photon-to-collected-electron (IPCE) measurements, supplemented by optical characterization, and used to quantify performance losses in DSCs. The paper aims to give a necessary theoretical background and practical guidelines for establishing an effective feedback-loop for DSC testing and development.
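A minimal sketch of the kind of IV-curve expression the paper derives (this is the generic one-diode approximation with illustrative parameter values, not the paper's full DSC device model): the net current density is J(V) = J_ph − J_0·(exp(qV/(m·kT)) − 1), from which the open-circuit voltage follows analytically.

```python
import math

# One-diode IV-curve approximation; all parameter values are
# illustrative assumptions.
Q_OVER_KT = 1.0 / 0.02569  # q/(kT) at ~298 K, in 1/V

def current_density(v, j_ph=16e-3, j_0=1e-9, ideality=2.0):
    """Net current density (A/cm^2) at cell voltage v (V)."""
    return j_ph - j_0 * (math.exp(Q_OVER_KT * v / ideality) - 1.0)

def open_circuit_voltage(j_ph=16e-3, j_0=1e-9, ideality=2.0):
    """Voltage at which the net current crosses zero."""
    return ideality / Q_OVER_KT * math.log(j_ph / j_0 + 1.0)

# current_density(open_circuit_voltage()) is ~0 by construction, and the
# short-circuit current current_density(0) equals j_ph.
```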

Journal Article
TL;DR: This exploratory study examines consumers' values, needs, and objectives related to mobile games, developing a preliminary set of issues from earlier literature on mobile services and surveying mobile game users to identify the key needs and values of mobile gamers.
Abstract: Mobile games are one of the largest mobile application areas and one where users are often willing to pay for services. Furthermore, the market for mobile games is expected to grow dramatically as most phones sold now are capable of running games. Despite this, there is surprisingly little research concerning user expectations from mobile games. In this exploratory study, we examine the consumers' values, needs, and objectives related to mobile games. Based on earlier literature on mobile services, we developed a preliminary set of issues and did an exploratory survey of mobile game users to find the key needs and values of mobile gamers. The results of the study are especially interesting for mobile game developers and mobile phone operators, as they shed light on the demographics and choices of mobile gamers. We argue that if mobile games are ever to be diffused in greater extent to the market, then a deeper understanding of the values and needs of the potential mobile game users must be obtained. This understanding can then be used to guide the development of new game offerings.

Keywords: mobile games, values, objectives, consumers, principal component analysis, cluster analysis

INTRODUCTION

During the last two decades, mobile phones have diffused all over the planet, and the core services provided by the telecom operators (e.g., voice and text messaging) have become commodities. Markets for commodities are typically efficient and quickly respond to changes in supply and demand, driving down prices and making the basis of competition on price. Therefore, in order to remain competitive, many mobile operators have sought cost efficiencies through economies of scale. This has led to a high level of consolidation in the mobile phone operator market. To deviate from competition, companies usually explore ways to provide value-added services for their customers. Operators have considered mobile games as a good value-added service for a long time.
According to several market research firms (e.g., Juniper, Gartner) the Asian mobile gaming market is growing very fast and the total number of mobile gamers is estimated at 400 million people. The value of the global mobile games market is expected to rise from $5.4 billion in 2008 to more than $10 billion in 2013 (RCR 2008). It is notable that these estimates are based only on OTA downloads through cellular networks. The logic behind these bold estimates is the expansion in the smart phone markets. The shipments of smart phones have grown rapidly during the last few years. According to Canalys.com (2008), 35.5 million smart phone devices were sold in the fourth quarter of 2007, displaying growth rates of over 50 percent during the last two years. The growth of the smart phone market creates a more fruitful basis for the diffusion of mobile games as the games played on smart phones are more sophisticated and more interesting. The recent introduction of Apple AppStore, together with iPhone3G, highlights the importance of games, as almost a third of available titles are games-over 1,700 games as of January 2009 (Rybicki 2009). Previous research on mobile games has dealt with the new possibilities of mobility, e.g., location-based games (Han et al. 2005) and support for combining dimensions of the physical world and our social surroundings into games (Peltola and Karsten 2006). Most mobile game research to date has dealt with technical aspects of the games (see, e.g., Bell et al. 2006; Fritsch et al. 2006). In addition, there is some emerging research into the business models of the software companies producing mobile games (Rajala et al. 2007). Despite the potential of mobile games for different stakeholders, the extant literature provides little empirical research on the actual consumer values regarding mobile games (Anckar and D'Incau 2002; Barnes 2002; MGAIN 2003). 
Therefore, in this paper, our objective is to explore the values, needs, and objectives related to the purchasing process of mobile games. …

Journal ArticleDOI
TL;DR: In this article, the authors analyse the temporal development of physical population-driven water scarcity, over the period 0 AD to 2005 AD, using population data derived from the HYDE dataset, and water resource availability based on the WaterGAP model results for the period 1961-90.
Abstract: In this letter we analyse the temporal development of physical population-driven water scarcity, i.e. water shortage, over the period 0 AD to 2005 AD. This was done using population data derived from the HYDE dataset, and water resource availability based on the WaterGAP model results for the period 1961-90. Changes in historical water resources availability were simulated with the STREAM model, forced by climate output data of the ECBilt-CLIO-VECODE climate model. The water crowding index, i.e. Falkenmark water stress indicator, was used to identify water shortage in 284 sub-basins. Although our results show a few areas with moderate water shortage (1000-1700 m3/capita/yr) around the year 1800, water shortage began in earnest at around 1900, when 2% of the world population was under chronic water shortage (<1000 m3/capita/yr). By 1960, this percentage had risen to 9%. From then on, the number of people under water shortage increased rapidly to the year 2005, by which time 35% of the world population lived in areas with chronic water shortage. In this study, the effects of changes in population on water shortage are roughly four times more important than changes in water availability as a result of long-term climatic change. Global trends in adaptation measures to cope with reduced water resources per capita, such as irrigated area, reservoir storage, groundwater abstraction, and global trade of agricultural products, closely follow the recent increase in global water shortage.
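The Falkenmark water crowding thresholds used above translate directly into a small classifier: annual renewable water below 1000 m3/capita marks chronic shortage, and 1000-1700 m3/capita moderate shortage. The function name is ours.

```python
# Falkenmark water stress indicator thresholds as used in the study.
def water_shortage_class(m3_per_capita_per_year):
    if m3_per_capita_per_year < 1000:
        return "chronic shortage"
    if m3_per_capita_per_year < 1700:
        return "moderate shortage"
    return "no shortage"

# water_shortage_class(900) -> "chronic shortage"
```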

Journal ArticleDOI
TL;DR: In this paper, the authors describe and evaluate the potential approaches to introduce rapid manufacturing (RM) in the spare parts supply chain and illustrate the potential benefits in terms of simultaneously improved service and reduced inventory.
Abstract: Purpose – The purpose of this paper is to describe and evaluate the potential approaches to introduce rapid manufacturing (RM) in the spare parts supply chain.Design/methodology/approach – Alternative conceptual designs for deploying RM technology in the spare parts supply chain were proposed. The potential benefits are illustrated for the aircraft industry. The general feasibility was discussed based on literature.Findings – The potential supply chain benefits in terms of simultaneously improved service and reduced inventory makes the distributed deployment of RM very interesting for spare parts supply. However, considering the trade‐offs affecting deployment it is proposed that most feasible is centralized deployment by original equipment manufacturers (OEMs), or deployment close to the point of use by generalist service providers of RM.Research limitations/implications – The limited part range that is currently possible to produce using the technology means that a RM‐based service supply chain is feasi...

Journal Article
TL;DR: A probabilistic formulation of PCA provides a good foundation for handling missing values, and formulas for doing that are provided, and a novel fast algorithm is introduced and extended to variational Bayesian learning.
Abstract: Principal component analysis (PCA) is a classical data analysis technique that finds linear transformations of data that retain the maximal amount of variance. We study a case where some of the data values are missing, and show that this problem has many features which are usually associated with nonlinear models, such as overfitting and bad locally optimal solutions. A probabilistic formulation of PCA provides a good foundation for handling missing values, and we provide formulas for doing that. In case of high dimensional and very sparse data, overfitting becomes a severe problem and traditional algorithms for PCA are very slow. We introduce a novel fast algorithm and extend it to variational Bayesian learning. Different versions of PCA are compared in artificial experiments, demonstrating the effects of regularization and modeling of posterior variance. The scalability of the proposed algorithm is demonstrated by applying it to the Netflix problem.
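A minimal sketch of PCA with missing values by alternating imputation (this is the basic hard-imputation idea, not the paper's probabilistic or variational Bayesian algorithm, which additionally regularizes to curb the overfitting the abstract mentions): fill the missing cells, fit a rank-k truncated SVD, refill the missing cells from the low-rank reconstruction, and iterate.

```python
import numpy as np

# Iterative low-rank imputation: each pass replaces only the missing
# cells with their rank-k reconstruction.
def pca_impute(x, rank=1, iters=100):
    x = np.array(x, dtype=float)
    mask = np.isnan(x)
    filled = np.where(mask, np.nanmean(x, axis=0), x)  # start from column means
    for _ in range(iters):
        mean = filled.mean(axis=0)
        u, s, vt = np.linalg.svd(filled - mean, full_matrices=False)
        low_rank = mean + (u[:, :rank] * s[:rank]) @ vt[:rank]
        filled[mask] = low_rank[mask]                  # update missing cells only
    return filled

# Rank-1 data with one value hidden: the iteration recovers it closely
# (the completed value approaches 6.0).
data = [[1.0, 2.0], [2.0, 4.0], [3.0, np.nan]]
```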

Proceedings ArticleDOI
01 Nov 2010
TL;DR: Describes Netalyzr's architecture and system implementation, and presents a detailed study of 130,000 measurement sessions that the service has recorded since it was made publicly available in June 2009.
Abstract: In this paper we present Netalyzr, a network measurement and debugging service that evaluates the functionality provided by people's Internet connectivity. The design aims to prove both comprehensive in terms of the properties we measure and easy to employ and understand for users with little technical background. We structure Netalyzr as a signed Java applet (which users access via their Web browser) that communicates with a suite of measurement-specific servers. Traffic between the two then probes for a diverse set of network properties, including outbound port filtering, hidden in-network HTTP caches, DNS manipulations, NAT behavior, path MTU issues, IPv6 support, and access-modem buffer capacity. In addition to reporting results to the user, Netalyzr also forms the foundation for an extensive measurement of edge-network properties. To this end, along with describing Netalyzr 's architecture and system implementation, we present a detailed study of 130,000 measurement sessions that the service has recorded since we made it publicly available in June 2009.

Journal ArticleDOI
TL;DR: In this paper, the effects of ion irradiation on graphene were studied using atomistic computer simulations based on analytical potential and density functional theory models, and the authors identified the types and concentrations of defects which appeared in graphene under impacts of various ions with energies ranging from tens of electron volts to mega-electron volts.
Abstract: Using atomistic computer simulations based on analytical potential and density-functional theory models, we study effects of ion irradiation on graphene. We identify the types and concentrations of defects which appear in graphene under impacts of various ions with energies ranging from tens of electron volts to mega-electron volts. For two-dimensional targets, defects beyond single and double vacancies are formed via in-plane recoils. We demonstrate that the conventional approach based on binary-collision approximation and stochastic algorithms developed for bulk solids cannot be applied to graphene and other low-dimensional systems. Finally, taking into account the gas-holding capacity of graphene, we suggest the use of graphene as the ultimate membrane for ion-beam analysis of gases and other volatile systems which cannot be put in the high vacuum required for the operation of ion beams.

Journal ArticleDOI
TL;DR: In this paper, the meaning of appearance, and especially clothing and fashion, is understood in a social context; an interdisciplinary approach treats eco-clothes as cultural and design objects that intertwine consumers' ethical attitudes and values.
Abstract: This paper aims to contribute to a better understanding of eco-fashion consumption and consumer purchase decisions while constructing one's self with external symbols, such as appearance, clothing and fashion items. This study approaches sustainable clothing from a grounding in design research and the meanings of material culture. The study uses sociology and social psychology; hence, the meaning of appearance and especially clothing and fashion is understood in a social context. This paper also takes an interdisciplinary approach to eco-clothes as cultural and design objects in a social and sustainable development context, objects that intertwine consumers' ethical attitudes and values and how they construct a concept of ‘self’ using external symbols.

Posted Content
TL;DR: A new understanding of roof area distribution and potential PV outputs has immense significance for energy policy formulation in Ontario, and the methodology developed here is transferable to other regions to assist in solar PV deployment.
Abstract: Solar photovoltaic (PV) technology has matured to become a technically viable large-scale source of sustainable energy. Understanding the rooftop PV potential is critical for utility planning, accommodating grid capacity, deploying financing schemes and formulating future adaptive energy policies. This paper merges the capabilities of geographic information systems and object-based image recognition to determine the available rooftop area for PV deployment in an example large-scale region in southeastern Ontario. An innovative five-step procedure has been developed for estimating total rooftop PV potential which involves geographical division of the region; sampling using the Feature Analyst extraction software; extrapolation using roof area-population relationships; reduction for shading, other uses and orientation; and conversion to power and energy outputs. A relationship across the region was found between roof area and population of 70.0 m2/capita ± 6.2%. For this region with appropriate rooftops covered with commercial solar cells the potential PV peak power output is 5.74 GW (157% of the region's peak power demands) and the potential annual energy production is 6909 GWh (5% of Ontario's total annual demand). This suggests that 30% of Ontario's demand can be met with province-wide rooftop PV deployment. This new understanding of roof area distribution and potential PV outputs has immense significance for energy policy formulation in Ontario, and the methodology developed here is transferable to other regions to assist in solar PV deployment.
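The paper's estimation chain (roof area per capita, reductions, conversion to power) can be sketched as a back-of-envelope calculation. The usable roof fraction and module power density below are illustrative assumptions of ours, not the paper's calibrated values; only the 70 m2/capita relation comes from the abstract.

```python
# Back-of-envelope rooftop PV peak-power estimate:
# population -> roof area -> usable area -> peak power.
def rooftop_pv_peak_gw(population, roof_m2_per_capita=70.0,
                       usable_fraction=0.3, module_kw_per_m2=0.15):
    roof_area_m2 = population * roof_m2_per_capita
    usable_m2 = roof_area_m2 * usable_fraction    # shading, orientation, other uses
    return usable_m2 * module_kw_per_m2 / 1e6     # kW -> GW

# For a region of 1 million people under these assumptions:
# 1e6 * 70 * 0.3 * 0.15 kW = 3.15 GW of peak capacity.
```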

Journal ArticleDOI
J. A. Tauber1, Nazzareno Mandolesi2, J.-L. Puget3, T. Banos4 +499 more (61 institutions)
TL;DR: The European Space Agency's Planck satellite, launched on 14 May 2009, is the third-generation space experiment in the field of cosmic microwave background (CMB) research as mentioned in this paper.
Abstract: The European Space Agency's Planck satellite, launched on 14 May 2009, is the third-generation space experiment in the field of cosmic microwave background (CMB) research. It will image the anisotropies of the CMB over the whole sky, with unprecedented sensitivity (~2 × 10^-6) and angular resolution (~5 arcmin). Planck will provide a major source of information relevant to many fundamental cosmological problems and will test current theories of the early evolution of the Universe and the origin of structure. It will also address a wide range of areas of astrophysical research related to the Milky Way as well as external galaxies and clusters of galaxies. The ability of Planck to measure polarization across a wide frequency range (30-350 GHz), with high precision and accuracy, and over the whole sky, will provide unique insight, not only into specific cosmological questions, but also into the properties of the interstellar medium. This paper is part of a series which describes the technical capabilities of the Planck scientific payload. It is based on the knowledge gathered during the on-ground calibration campaigns of the major subsystems, principally its telescope and its two scientific instruments, and of tests at fully integrated satellite level. It represents the best estimate before launch of the technical performance that the satellite and its payload will achieve in flight. In this paper, we summarise the main elements of the payload performance, which is described in detail in the accompanying papers. In addition, we describe the satellite performance elements which are most relevant for science, and provide an overview of the plans for scientific operations and data analysis.

Journal ArticleDOI
TL;DR: In this paper, different types of microfibrillated cellulose (MFC) and fines suspensions were produced, characterized, and then added to a paper-making pulp suspension, and the effects of salt concentration, pH, fixative type, dosage and type of fibrillar material on drainage were examined.
Abstract: Different types of microfibrillated cellulose (MFC) and fines suspensions were produced, characterized, and then added to a papermaking pulp suspension. High and medium molar mass cationic polyelectrolytes were used as fixatives. The drainage behavior of the pulp suspensions with additives were evaluated against the strength properties of hand sheets made thereof. The effects of salt concentration, pH, fixative type, dosage and type of fibrillar material on drainage were examined. All the MFC and fines samples produced had clearly different properties due to their dissimilar production methods, and they also introduced specific responses on the measured drainage and paper strength. Generally, the addition of MFC decreased the drainage rate of pulp suspension and increased the strength of paper. However, it was shown that by optimum selection of materials and process conditions an enhancement of the strength properties could be achieved without simultaneously deteriorating the drainage.

Journal ArticleDOI
TL;DR: In this paper, a case study approach that builds on the foundations of moderate constructionism and abduction is presented, where the authors discuss the case study method and its role in industrial marketing, especially in business to business networks.

Journal ArticleDOI
TL;DR: The authors conducted a content analysis of the 145 case studies published in three key journals (Industrial Marketing Management, Journal of Business-to-Business Marketing and Journal of Business & Industrial Marketing) over a 10-year period (1997-2006).

Journal ArticleDOI
TL;DR: The increased reactivity of the distorted π-electron system in strained graphene allows us to attach metal atoms and to tailor the properties of graphene.
Abstract: Reconstructed point defects in graphene are created by electron irradiation and annealing. By applying electron microscopy and density functional theory, it is shown that the strain field around these defects reaches far into the unperturbed hexagonal network and that metal atoms have a high affinity to the nonperfect and strained regions of graphene. Metal atoms are attracted by reconstructed defects and bonded with energies of about 2 eV. The increased reactivity of the distorted π-electron system in strained graphene allows us to attach metal atoms and to tailor the properties of graphene.

Journal ArticleDOI
TL;DR: In this article, a protocol is developed to estimate the trapping efficiency (TE) of the existing and planned reservoirs in the Mekong Basin based on Brune's method. The existing reservoirs have a basin TE of 15-18%; should all the planned reservoirs be built, this would increase to 51-69%.
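Brune's method ties a reservoir's trapping efficiency to its residence time (capacity divided by annual inflow). As a hedged illustration, the sketch below uses the common analytic fit to Brune's median curve, TE = 1 − 0.05/√τ with τ in years; the paper's actual protocol may differ in detail.

```python
import math

def brune_trapping_efficiency(capacity_km3, inflow_km3_per_yr):
    """Approximate sediment trapping efficiency from Brune's median curve.

    Uses the analytic fit TE = 1 - 0.05 / sqrt(tau), where tau is the
    residence time (reservoir capacity / annual inflow) in years.
    Returns a fraction clipped to [0, 1].
    """
    tau = capacity_km3 / inflow_km3_per_yr
    te = 1.0 - 0.05 / math.sqrt(tau)
    return max(0.0, te)
```

A reservoir holding one full year of inflow (τ = 1) traps roughly 95% of incoming sediment under this fit, while very small run-of-river impoundments trap essentially nothing.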

Journal ArticleDOI
12 Aug 2010-PLOS ONE
TL;DR: A systematic empirical analysis of the statistical properties of communities in large information, communication, technological, biological, and social networks finds that the mesoscopic organization of networks of the same category is remarkably similar.
Abstract: Background Community structure is one of the key properties of complex networks and plays a crucial role in their topology and function. While an impressive amount of work has been done on the issue of community detection, very little attention has so far been devoted to the investigation of communities in real networks.
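Community quality in analyses like this is conventionally scored with Newman-Girvan modularity. A generic, self-contained sketch (an illustrative stand-in, not the paper's detection pipeline):

```python
def modularity(adj, partition):
    """Newman-Girvan modularity Q of a node partition.

    adj: dict mapping node -> set of neighbours (undirected, no self-loops).
    partition: dict mapping node -> community label.
    """
    two_m = sum(len(nbrs) for nbrs in adj.values())  # 2 * number of edges
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    q = 0.0
    for i in adj:
        for j in adj:
            if partition[i] != partition[j]:
                continue  # only same-community pairs contribute
            a_ij = 1.0 if j in adj[i] else 0.0
            q += a_ij - deg[i] * deg[j] / two_m
    return q / two_m
```

For two triangles joined by a single edge, splitting the nodes into the two triangles gives Q = 5/14 ≈ 0.357, illustrating how well-separated groups score positively.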

Journal ArticleDOI
TL;DR: In this article, an exact selfconsistent operator description of the spin and orbital angular momenta, position, and spin-orbit interactions of nonparaxial light in free space is presented.
Abstract: We give an exact self-consistent operator description of the spin and orbital angular momenta, position, and spin-orbit interactions of nonparaxial light in free space. Both quantum-operator formalism and classical energy-flow approach are presented. We apply the general theory to symmetric and asymmetric Bessel beams exhibiting spin- and orbital-dependent intensity profiles. The exact wave solutions are clearly interpreted in terms of the Berry phases, quantization of caustics, and Hall effects of light, which can be readily observed experimentally.
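For reference, the scalar prototype of the Bessel beams discussed above takes the standard textbook form (the paper's exact vector solutions add spin-dependent corrections to this profile):

```latex
\psi_\ell(\rho,\phi,z) = J_\ell(\kappa\rho)\, e^{i\ell\phi}\, e^{i k_z z},
\qquad \kappa^2 + k_z^2 = k^2,
```

where $J_\ell$ is the Bessel function of order $\ell$ and each photon carries orbital angular momentum $\ell\hbar$ about the beam axis.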

Journal ArticleDOI
TL;DR: In this paper, first-principles molecular-dynamics simulations with high-resolution transmission electron microscopy experiments are combined to draw a detailed microscopic picture of irradiation effects in hexagonal boron nitride ($h$-BN) monolayers.
Abstract: We combine first-principles molecular-dynamics simulations with high-resolution transmission electron microscopy experiments to draw a detailed microscopic picture of irradiation effects in hexagonal boron nitride ($h$-BN) monolayers. We determine the displacement threshold energies for boron and nitrogen atoms in $h$-BN, which differ significantly from the tight-binding estimates found in the literature and remove ambiguity from the interpretation of the experimental results. We further develop a kinetic Monte Carlo model which allows us to extend the simulations to macroscopic time scales and make a direct comparison between theory and experiments. Our results provide a comprehensive picture of the response of $h$-BN nanostructures to electron irradiation.
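The kinetic Monte Carlo extension rests on the standard rejection-free update: choose an event with probability proportional to its rate, then advance the clock by an exponentially distributed increment. A minimal generic sketch (the rates here are placeholders, not the paper's h-BN event catalogue):

```python
import math
import random

def kmc_step(rates, rng=random):
    """One rejection-free kinetic Monte Carlo step.

    rates: positive transition rates of the currently available events.
    Returns (chosen event index, time increment).
    """
    total = sum(rates)
    # Choose an event with probability rate_i / total.
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            break
    # Waiting time is exponentially distributed with mean 1 / total.
    dt = -math.log(1.0 - rng.random()) / total
    return i, dt
```

Repeating this step while updating the rate list after each event yields trajectories over macroscopic time scales at a cost independent of the fastest vibrational time step.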

Journal ArticleDOI
TL;DR: In this paper, the authors show that interorganizational partnerships represent a potentially important resource for the development of ambidexterity, but little is known about how a firm's ambidext...
Abstract: Recent research indicates that interorganizational partnerships represent a potentially important resource for the development of ambidexterity. However, little is known about how a firm’s ambidext...

Journal ArticleDOI
06 Oct 2010-Neuron
TL;DR: Although the sensory modalities driving the neurons in the reorganized OC of blind individuals are altered, the functional specialization of extrastriate cortex is retained regardless of visual experience.