
Showing papers by "Rensselaer Polytechnic Institute published in 2010"


Journal ArticleDOI
TL;DR: This review shows that, despite their apparent simplicity, the development of a general eye detection technique involves addressing many challenges, requires further theoretical developments, and is consequently of interest to many problem domains in computer vision and beyond.
Abstract: Despite active research and significant progress in the last 30 years, eye detection and tracking remains challenging due to the individuality of eyes, occlusion, and variability in scale, location, and light conditions. Data on eye location and details of eye movements have numerous applications and are essential in face detection, biometric identification, and particular human-computer interaction tasks. This paper reviews current progress and state of the art in video-based eye detection and tracking in order to identify promising techniques as well as issues to be further addressed. We present a detailed review of recent eye models and techniques for eye detection and tracking. We also survey methods for gaze estimation and compare them based on their geometric properties and reported accuracies. This review shows that, despite their apparent simplicity, the development of a general eye detection technique involves addressing many challenges, requires further theoretical developments, and is consequently of interest to many problem domains in computer vision and beyond.

1,514 citations


Journal ArticleDOI
01 Mar 2010
TL;DR: In this paper, a set of meta-analyses were conducted to examine the relationship of personality to outcomes associated with two different stages of the entrepreneurial process: entrepreneurial intentions and entrepreneurial performance.
Abstract: A set of meta-analyses were conducted to examine the relationship of personality to outcomes associated with two different stages of the entrepreneurial process: entrepreneurial intentions and entrepreneurial performance. A broad range of personality scales were categorized into a parsimonious set of constructs using the Five Factor model of personality. The results show that four of the Big Five personality dimensions were associated with both dependent variables, with agreeableness failing to be associated with either. Multivariate effect sizes were moderate for the full set of Big Five personality variables on entrepreneurial intentions (multiple R = .36) and entrepreneurial performance (multiple R = .31). Risk propensity, included as a separate dimension of personality, was positively associated with entrepreneurial intentions but was not related to entrepreneurial performance. These effects suggest that personality plays a role in the emergence and success of entrepreneurs.

1,216 citations


Journal ArticleDOI
18 Jan 2010-Small
TL;DR: The fracture toughness, fracture energy, and fatigue properties of an epoxy polymer reinforced with various weight fractions of functionalized graphene sheets, and under fatigue conditions, are reported.
Abstract: Graphene, a single-atom-thick sheet of sp²-bonded carbon atoms, has generated much interest due to its high specific area and novel mechanical, electrical, and thermal properties. Recent advances in the production of bulk quantities of exfoliated graphene sheets from graphite have enabled the fabrication of graphene–polymer composites. Such composites show tremendous potential for mechanical-property enhancement due to their combination of high specific surface area, strong nanofiller–matrix adhesion, and the outstanding mechanical properties of the sp² carbon bonding network in graphene. Graphene fillers have been successfully dispersed in poly(styrene), poly(acrylonitrile), and poly(methyl methacrylate) matrices, and the responses of their Young's modulus, ultimate tensile strength, and glass-transition temperature have been characterized. However, to the best of our knowledge there is no report on the fracture toughness and fatigue properties of graphene–polymer composites. Fracture toughness describes the ability of a material containing a crack to resist fracture, and it is a critically important material property for design applications. Fatigue involves dynamic propagation of cracks under cyclic loading and is one of the primary causes of catastrophic failure in structural materials. Consequently, a material's resistance to fracture and fatigue crack propagation is of paramount importance to prevent failure. Herein we report the fracture toughness, fracture energy, and fatigue properties of an epoxy polymer reinforced with various weight fractions of functionalized graphene sheets. Remarkably, only 0.125% weight of functionalized graphene sheets was observed to increase the fracture toughness of the pristine (unfilled) epoxy by 65% and the fracture energy by 115%. To achieve comparable enhancement, carbon nanotube (CNT) and nanoparticle epoxy composites require one to two orders of magnitude larger weight fractions of nanofillers.
Under fatigue conditions, incorporation of 0.125% weight of functionalized graphene sheets drastically reduced the rate of crack propagation in the epoxy 25-fold. Fractography analysis

809 citations


Journal ArticleDOI
08 Jul 2010-Polymer
TL;DR: In this paper, the state of the art regarding the understanding and prediction of the macro-scale properties of polymers reinforced with nanometer-sized solid inclusions over a wide temperature range is established.

778 citations


Journal ArticleDOI
TL;DR: Results suggest that a 2-3% penetration of cell phones in the driver population is enough to provide accurate measurements of the velocity of the traffic flow, demonstrating the feasibility of the proposed system for real-time traffic monitoring.
Abstract: The growing need of the driving public for accurate traffic information has spurred the deployment of large-scale dedicated monitoring infrastructure systems, which mainly consist of inductive loop detectors and video cameras. On-board electronic devices have been proposed as an alternative traffic-sensing infrastructure, as they usually provide a cost-effective way to collect traffic data, leveraging existing communication infrastructure such as the cellular phone network. A traffic monitoring system based on GPS-enabled smartphones exploits the extensive coverage provided by the cellular network, the high accuracy in position and velocity measurements provided by GPS devices, and the existing infrastructure of the communication network. This article presents a field experiment, nicknamed Mobile Century, which was conceived as a proof of concept of such a system. Mobile Century included 100 vehicles carrying a GPS-enabled Nokia N95 phone driving loops on a 10-mile stretch of I-880 near Union City, California, for 8 hours. Data were collected using virtual trip lines, which are geographical markers stored in the handset that probabilistically trigger position and speed updates when the handset crosses them. The proposed prototype system provided sufficient data for traffic monitoring purposes while managing the privacy of participants. The data obtained in the experiment were processed in real time and successfully broadcast on the internet, demonstrating the feasibility of the proposed system for real-time traffic monitoring. Results suggest that a 2-3% penetration of cell phones in the driver population is enough to provide accurate measurements of the velocity of the traffic flow.
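The virtual trip lines described above reduce to a simple geometric test: emit a (position, speed) update when the segment between two consecutive GPS fixes crosses a stored line. A minimal planar sketch of that test (the deployed system used geographic coordinates, probabilistic triggering, and privacy filtering not modeled here; all function and variable names are mine):

```python
# Hypothetical sketch of a "virtual trip line": a stored geographic segment
# that triggers a (position, speed) update when the straight line between two
# consecutive GPS fixes crosses it. Coordinates are treated as planar (x, y).

def ccw(a, b, c):
    """True if points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 properly intersects segment q1-q2."""
    return (ccw(p1, q1, q2) != ccw(p2, q1, q2)) and \
           (ccw(p1, p2, q1) != ccw(p1, p2, q2))

def trip_line_updates(fixes, trip_line):
    """Emit (position, speed) updates for each crossing of the trip line.

    fixes: list of (x, y, speed) tuples in time order.
    trip_line: ((x1, y1), (x2, y2)) endpoints of the stored marker segment.
    """
    updates = []
    for (x0, y0, _), (x1, y1, v1) in zip(fixes, fixes[1:]):
        if segments_cross((x0, y0), (x1, y1), *trip_line):
            updates.append(((x1, y1), v1))  # report the fix after the crossing
    return updates
```

For example, a trajectory driving east past a north-south trip line at x = 5 produces exactly one update, at the first fix beyond the line.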

773 citations


Journal ArticleDOI
TL;DR: The results demonstrate that genipin‐crosslinked gelatin microspheres can be used to deliver growth factors locally to cells in order to direct their function.
Abstract: A main challenge in tissue engineering and regenerative medicine is achieving local and efficient growth factor release to guide cell function. Gelatin is a denatured form of collagen that cells can bind to and degrade through enzymatic action. In this study, gelatin microspheres were used to release bone morphogenetic protein 2 (BMP2). Spherical microparticles with diameters in the range of 2-6 µm were created by an emulsification process and were stabilized by crosslinking with the small molecule genipin. The degree of crosslinking was varied by controlling the incubation time in genipin solution. Loading rate studies, using soy bean trypsin inhibitor as a model protein, showed rapid protein uptake over the first 24 h, followed by a levelling off and then a further increase after approximately 3 days, as the microspheres swelled. Growth factor release studies using microspheres crosslinked to 20%, 50% and 80% of saturation and then loaded with BMP2 showed that higher degrees of crosslinking resulted in higher loading efficiency and slower protein release. After 24 h, the concentration profiles produced by all microsphere formulations were steady and approximately equal. Microspheres incubated with adult human mesenchymal stem cells accumulated preferentially on the cell surface, and degraded over time in culture. BMP2-loaded microspheres caused a three- to eight-fold increase in expression of the bone sialoprotein gene after 14 days in culture, with more crosslinked beads producing a greater effect. These results demonstrate that genipin-crosslinked gelatin microspheres can be used to deliver growth factors locally to cells in order to direct their function.

755 citations


Journal ArticleDOI
TL;DR: This paper presents a robust and accurate novel method for segmenting cell nuclei using a combination of ideas, and presents an efficient semiautomated approach to editing automated segmentation results that requires two mouse clicks per operation.
Abstract: Automatic segmentation of cell nuclei is an essential step in image cytometry and histometry. Despite substantial progress, there is a need to improve accuracy, speed, level of automation, and adaptability to new applications. This paper presents a robust and accurate novel method for segmenting cell nuclei using a combination of ideas. The image foreground is extracted automatically using a graph-cuts-based binarization. Next, nuclear seed points are detected by a novel method combining multiscale Laplacian-of-Gaussian filtering constrained by distance-map-based adaptive scale selection. These points are used to perform an initial segmentation that is refined using a second graph-cuts-based algorithm incorporating the method of alpha expansions and graph coloring to reduce computational complexity. Nuclear segmentation results were manually validated over 25 representative images (15 in vitro images and 10 in vivo images, containing more than 7400 nuclei) drawn from diverse cancer histopathology studies, and four types of segmentation errors were investigated. The overall accuracy of the proposed segmentation algorithm exceeded 86%. The accuracy was found to exceed 94% when only over- and undersegmentation errors were considered. The confounding image characteristics that led to most detection/segmentation errors were high cell density, high degree of clustering, poor image contrast and noisy background, damaged/irregular nuclei, and poor edge information. We present an efficient semiautomated approach to editing automated segmentation results that requires two mouse clicks per operation.
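The seed-detection step above can be sketched in a simplified form. The fragment below detects nuclear seed points with multiscale Laplacian-of-Gaussian filtering, but substitutes a plain intensity threshold for the paper's graph-cuts binarization and a fixed scale range for its distance-map-based adaptive scale selection; function and parameter names are mine:

```python
import numpy as np
from scipy import ndimage

def detect_seeds(img, sigmas=(3, 4, 6, 8), rel_thresh=0.2):
    """Simplified nuclear seed detection via multiscale LoG filtering.

    Bright blobs give strongly negative LoG responses, so we take the maximum
    of the scale-normalized negative LoG over a fixed set of scales and keep
    local maxima inside a crudely thresholded foreground. Returns a list of
    (row, col) seed coordinates.
    """
    img = img.astype(float)
    # Scale-normalized negative LoG response, maximized over scales.
    response = np.max(
        [-(s ** 2) * ndimage.gaussian_laplace(img, sigma=s) for s in sigmas],
        axis=0,
    )
    foreground = img > 0.5 * img.max()      # crude stand-in for graph cuts
    peaks = response == ndimage.maximum_filter(response, size=11)
    seeds = peaks & foreground & (response > rel_thresh * response.max())
    return list(zip(*np.nonzero(seeds)))
```

On a synthetic image containing two disk-shaped "nuclei", this returns one seed near each disk center; the real pipeline would pass these seeds to the graph-cuts refinement stage.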

683 citations


BookDOI
01 Mar 2010
TL;DR: In this article, the authors show that variation in biological diversity relates to the operations of ecosystems in at least three ways: 1. increase in diversity often leads to an increase in productivity due to complementary traits among species for resource use, and productivity itself underpins many ecosystem services; 2. increased diversity leads to increased response diversity (range of traits related to how species within the same functional group respond to environmental drivers), resulting in less variability in functioning over time as the environment changes; and 3. idiosyncratic effects due to keystone species properties and unique trait-combinations, which may result in a disproportionate effect of losing one particular species compared to losing individual species at random.
Abstract: All ecosystems are shaped by people, directly or indirectly, and all people, rich or poor, rural or urban, depend on the capacity of ecosystems to generate essential ecosystem services. In this sense, people and ecosystems are interdependent social-ecological systems. The ecosystem concept describes the interrelationships between living organisms (people included) and the non-living environment and provides a holistic approach to understanding the generation of services from an environment that both delivers benefits to and imposes costs on people. Variation in biological diversity relates to the operations of ecosystems in at least three ways: 1. increase in diversity often leads to an increase in productivity due to complementary traits among species for resource use, and productivity itself underpins many ecosystem services; 2. increased diversity leads to an increase in response diversity (the range of traits related to how species within the same functional group respond to environmental drivers), resulting in less variability in functioning over time as the environment changes; and 3. idiosyncratic effects due to keystone species properties and unique trait-combinations, which may result in a disproportionate effect of losing one particular species compared to the effect of losing individual species at random. Ecosystems produce multiple services, and these interact in complex ways, different services being interlinked both negatively and positively. Delivery of many services will therefore vary in a correlated manner, but when an ecosystem is managed principally for the delivery of a single service (e.g., food production), other services are nearly always affected negatively. Ecosystems vary in their ability to buffer and adapt to both natural and anthropogenic changes as well as recover after changes (i.e., resilience).
When subjected to severe change, ecosystems may cross thresholds and move into different and often less desirable ecological states or trajectories. A major challenge is how to design ecosystem management in ways that maintain resilience and avoid passing undesirable thresholds. There is clear evidence for a central role of biodiversity in the delivery of some, but not all, services, viewed individually. However, ecosystems need to be managed to deliver multiple services to sustain human well-being, and also managed at the level of landscapes and seascapes in ways that avoid the passing of dangerous tipping points. We can state with high certainty that maintaining functioning ecosystems capable of delivering multiple services requires a general approach to sustaining biodiversity, over the long term, even when a single service is the focus.

510 citations


Journal ArticleDOI
TL;DR: In this article, an implicit methodology based on chemical group theory was used to formulate a jet aviation fuel surrogate guided by measurements of several combustion-related fuel properties, and the validity of the proposed surrogate was evaluated by experimental measurement of select combustion properties of POSF 4658 and the POSF 4658 surrogate.

505 citations


Journal ArticleDOI
TL;DR: A scalable and facile technique for noncovalent functionalization of graphene with 1-pyrenecarboxylic acid that exfoliates single-, few-, and multilayered graphene flakes into stable aqueous dispersions is presented.
Abstract: We present a scalable and facile technique for noncovalent functionalization of graphene with 1-pyrenecarboxylic acid that exfoliates single-, few-, and multilayered graphene flakes into stable aqueous dispersions. The exfoliation mechanism is established using stringent control experiments and detailed characterization steps. Using the exfoliated graphene, we demonstrate highly sensitive and selective conductometric sensors (whose resistance rapidly changes >10 000% in saturated ethanol vapor), and ultracapacitors with extremely high specific capacitance (∼120 F/g), power density (∼10^5 kW/kg), and energy density (∼9.2 Wh/kg).

470 citations


Journal ArticleDOI
TL;DR: In this article, a least-squares method was used to fit Ti concentrations in quartz from all experiments to a simple expression for the P–T dependence of Ti-in-quartz solubility, where R is the gas constant (8.3145 J mol⁻¹ K⁻¹), T is temperature in Kelvin, and X_TiO2^quartz is the mole fraction of TiO2 in quartz.
Abstract: Quartz and rutile were synthesized from silica-saturated aqueous fluids between 5 and 20 kbar and from 700 to 940°C in a piston-cylinder apparatus to explore the potential pressure effect on Ti solubility in quartz. A systematic decrease in Ti-in-quartz solubility occurs between 5 and 20 kbar. Titanium K-edge X-ray absorption near-edge structure (XANES) measurements demonstrate that Ti4+ substitutes for Si4+ on fourfold tetrahedral sites in quartz at all conditions studied. Molecular dynamics simulations support the XANES measurements and demonstrate that Ti incorporation onto fourfold sites is favored over interstitial solubility mechanisms. To account for the P–T dependence of Ti-in-quartz solubility, a least-squares method was used to fit Ti concentrations in quartz from all experiments to the simple expression RT·ln X_TiO2^quartz = −60952 + 1.520·T(K) − 1741·P(kbar) + RT·ln a_TiO2, where R is the gas constant (8.3145 J mol⁻¹ K⁻¹), T is temperature in Kelvin, X_TiO2^quartz is the mole fraction of TiO2 in quartz, and a_TiO2 is the activity of TiO2 in the system. The P–T dependence of Ti-in-quartz solubility can be used as a thermobarometer when combined with another thermobarometer in a coexisting mineral, an independent P or T estimate of quartz crystallization, or well-constrained phase equilibria. If temperature can be constrained within ±25°C, pressure can be constrained to approximately ±1.2 kbar. Alternatively, if pressure can be constrained to within ±1 kbar, then temperature can be constrained to approximately ±20°C.
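Rearranged for the mole fraction, the fitted expression gives the equilibrium TiO2 content of quartz directly. A minimal sketch of that evaluation (names are mine; setting the activity to 1 assumes rutile saturation, as in the synthesis experiments):

```python
import math

R = 8.3145  # gas constant, J/(mol*K)

def x_tio2_quartz(T_K, P_kbar, a_tio2=1.0):
    """Mole fraction of TiO2 in quartz from the fitted expression
    RT*ln(X) = -60952 + 1.520*T(K) - 1741*P(kbar) + RT*ln(a_TiO2),
    with T in Kelvin and P in kbar. a_tio2 = 1.0 corresponds to rutile
    saturation (rutile present), as in the experiments.
    """
    ln_x = (-60952 + 1.520 * T_K - 1741 * P_kbar) / (R * T_K) + math.log(a_tio2)
    return math.exp(ln_x)
```

At a fixed temperature, the fitted coefficients reproduce the reported systematic decrease in Ti-in-quartz solubility between 5 and 20 kbar, which is what makes the expression usable as a barometer once T is independently constrained.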

Journal ArticleDOI
TL;DR: It is demonstrated here that this roughness effect in conjunction with the surface chemistry of the graphene sheets can be used to dramatically alter the wettability of the substrate.
Abstract: Adv. Mater. 2010, 22, 2151–2154. Superhydrophobic materials with water contact angles above 150° are the key enabler for antisticking, anticontamination, and self-cleaning technologies. Similarly, superhydrophilic materials with water contact angles below 10° have many important applications; for example, as a wicking material in heat pipes and for enhanced boiling heat transfer. In general, the wettability of a solid surface is strongly influenced both by its chemical composition and by its geometric structure (or surface roughness). Several experimental and modeling studies have focused on exploiting surface roughness to engineer superhydrophobicity or superhydrophilicity. Both microscale roughness features (e.g., micromachined silicon pillars) and nanoscale features (e.g., aligned arrays of carbon nanotubes) have been investigated. However, so far the wetting properties of graphene-based coatings have not been studied in detail. Graphene is a single-atom-thick sheet of sp²-hybridized carbon atoms. When deposited on a planar substrate, the individual graphene sheets form an interconnected film, which increases the surface roughness of the substrate by one to two orders of magnitude. We demonstrate here that this roughness effect, in conjunction with the surface chemistry of the graphene sheets, can be used to dramatically alter the wettability of the substrate. If hydrophilic graphene sheets are used (for example, by sonicating the as-produced graphene in water), the substrate acquires a superhydrophilic character. Conversely, if hydrophobic graphene sheets are used (by sonicating the as-produced graphene in acetone), then the roughness effect imparts superhydrophobicity to the underlying substrate. By controlling the relative proportion of acetone and water in the solvent, the contact angle of the resulting graphene film can be tailored over a wide range (from superhydrophobic to superhydrophilic).
Such graphene-based coatings with controllable wetting properties provide a facile and effective means to modify the wettability of a variety of surfaces. The graphene sheets used in this study were extracted from graphite using the method developed in References [19,20]. In this method, partially oxygenated graphene sheets are generated by the rapid thermal expansion (>2000 °C min⁻¹) of completely oxidized graphite oxide. The protocols used to oxidize graphite to graphite oxide and then generate graphene sheets (Fig. 1a) by the thermal exfoliation of graphite oxide are provided in the Experimental section. Figure 1b illustrates a transmission electron microscopy (TEM) image of a typical graphene flake synthesized by the above method and deposited on a standard TEM grid for imaging. The flake is several micrometers in dimension; note the wrinkled (rough) surface texture of the graphene flake. Figure 1c displays a high-resolution TEM (HRTEM) image of the edge of a typical graphene flake, indicating that each flake comprises 3 individual graphene sheets. The electron diffraction pattern (shown in the inset) confirms the signature of few-layered graphene.

Book
01 Oct 2010
TL;DR: The papers presented cover a range of topics, including computer vision, mapping, terrain identification, distributed systems, localization, manipulation, collision avoidance, multibody dynamics, obstacle detection, microrobotic systems, pursuit-evasion, grasping and manipulation, tracking, spatial kinematics, machine learning, and sensor networks.
Abstract: Robotics: Science and Systems IV spans a wide spectrum of robotics, bringing together researchers working on the foundations of robotics, robotics applications, and the analysis of robotics systems. This volume presents the proceedings of the fourth annual Robotics: Science and Systems conference, held in 2008 at the Swiss Federal Institute of Technology in Zurich. The papers presented cover a range of topics, including computer vision, mapping, terrain identification, distributed systems, localization, manipulation, collision avoidance, multibody dynamics, obstacle detection, microrobotic systems, pursuit-evasion, grasping and manipulation, tracking, spatial kinematics, machine learning, and sensor networks as well as such applications as autonomous driving and the design of manipulators for use in functional-MRI. The conference and its proceedings reflect not only the tremendous growth of robotics as a discipline but also the desire in the robotics community for a flagship event at which the best of the research in the field can be presented.

Journal ArticleDOI
TL;DR: It is shown that proliferative SVZ progenitor cells home to endothelial cells in a stromal-derived factor 1 (SDF1)- and CXC chemokine receptor 4-dependent manner and that SDF1 increases the motility of type A neuroblasts, which migrate from the SVZ toward the olfactory bulb.

Journal ArticleDOI
TL;DR: In this article, the effect of firm leaders on corporate social performance (CSP) was examined by using upper echelon theory, and the KLD Research Analytics CSP ratings, to show that observable CEO characteristics predict differences in CSP between firms, even when firm and industry characteristics are controlled for.
Abstract: While there are growing bodies of research examining both the differences between strongly and poorly socially performing firms and the impact of firm leaders on other strategic outcomes, little has been done to examine the effect of firm leaders on corporate social performance (CSP). This study directly addresses this issue by using upper echelon theory, and the KLD Research & Analytics CSP ratings, to show that observable CEO characteristics predict differences in CSP between firms, even when firm and industry characteristics are controlled for. Using a sample of 650 public US firms, I find that strong or exemplary CSP, as measured by the strengths categories of KLD's ratings, is positively related to the CEO having a bachelor's degree in humanities, having a breadth of career experience, and being female. I find that KLD strength ratings are negatively related to the CEO having a bachelor's degree in economics and to their level of short-term compensation. Preliminary tests of causality support the assertion that these effects reflect CEO discretion rather than being an artifact of reverse causality. Significant relationships between CEO characteristics and poor social performance, as measured by the concerns categories of KLD's ratings, are not found. This suggests that CEOs have more discretion in influencing strong and exemplary social performance than in impacting poor CSP. Implications, particularly for economics education, are discussed. While there are many fruitful directions for future research, the benefits and challenges of conducting similar studies in other countries are highlighted, in keeping with the global nature of this special issue.

Journal ArticleDOI
TL;DR: The study demonstrates the analytic potential of the concept of undone science to deepen understanding of the systematic nonproduction of knowledge in the institutional matrix of state, industry, and social movements that is characteristic of recent calls for a "new political sociology of science."
Abstract: "Undone science" refers to areas of research that are left unfunded, incomplete, or generally ignored but that social movements or civil society organizations often identify as worthy of more research. This study mobilizes four recent studies to further elaborate the concept of undone science as it relates to the political construction of research agendas. Using these cases, we develop the argument that undone science is part of a broader politics of knowledge, wherein multiple and competing groups struggle over the construction and implementation of alternative research agendas. Overall, the study demonstrates the analytic potential of the concept of undone science to deepen understanding of the systematic nonproduction of knowledge in the institutional matrix of state, industry, and social movements that is characteristic of recent calls for a "new political sociology of science."

Journal ArticleDOI
TL;DR: The study's findings suggest that the three types of IT-enabled knowledge capabilities have differential effects on firm innovation, and this study substantially contributes to the information systems research, methodology, and practice in multiple ways.
Abstract: We theoretically and empirically investigate the relationship between information technology (IT) and firm innovation. Invoking absorptive capacity (ACAP) theory, we introduce and develop the concepts of three types of IT-enabled knowledge capabilities. Firm innovation is examined through two observable innovation outcomes: patents, and new product and service introductions. These innovation outcomes are often labeled as competitive actions aggressively undertaken by firms to gain market share or to achieve profitability. We use secondary data about the IT-enabled knowledge capabilities and innovation outcomes of 110 firms. Our results provide strong support for our main assertion that knowledge capabilities enhanced through the use of IT contribute to firm innovation. The study's findings suggest that the three types of IT-enabled knowledge capabilities have differential effects on firm innovation. This study contributes substantially to information systems (IS) research, methodology, and practice in multiple ways.

Journal ArticleDOI
TL;DR: The results demonstrated that bio-printing of VEGF-containing fibrin gel supported sustained release of the GF in the collagen scaffold, and can be gainfully used in the development of three-dimensional (3D) artificial tissue assays and neural tissue regeneration applications.

Journal ArticleDOI
TL;DR: These single crystals exhibited enhanced photocatalytic activities for degradation of Methylene Blue dye under ultraviolet light irradiation.

Journal ArticleDOI
TL;DR: This paper investigated the effects of focus versus diversification on bank performance using data on Chinese banks during the 1996-2006 period and found that diversification is associated with reduced profits and higher costs.
Abstract: This paper investigates the effects of focus versus diversification on bank performance using data on Chinese banks during the 1996–2006 period. We construct a new measure, economies of diversification, and compare the results to those of the more conventional focus indices, which are based on the sum of squares of shares in different products or regions. Diversification is captured in four dimensions: loans, deposits, assets, and geography. We find that all four dimensions of diversification are associated with reduced profits and higher costs. These results are robust regardless of alternative measures of diversification and performance. Furthermore, we observe that banks with foreign ownership (both majority and minority ownership) and banks with conglomerate affiliation are associated with fewer diseconomies of diversification, suggesting that foreign ownership and conglomerate affiliation may play important mitigating roles. This analysis may provide important implications for bank managers and regulators in China as well as in other emerging economies.
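A conventional focus index of the kind the paper compares against is simply the sum of squared shares across products or regions, i.e. a Herfindahl-type measure; the authors' economies-of-diversification measure is more involved and is not reproduced here. A minimal sketch, with names of my choosing:

```python
def focus_index(shares):
    """Conventional focus index: the sum of squared shares across products
    or regions (a Herfindahl-type measure). Shares are normalized internally,
    so raw amounts may be passed. The index equals 1 for a fully focused bank
    and 1/n for one spread evenly over n product lines or regions.
    """
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)
```

For example, a bank with loans split 70/20/10 across three products scores 0.54, markedly more focused than an even three-way split (≈0.33); lower index values correspond to greater diversification along that dimension.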

Book ChapterDOI
07 Nov 2010
TL;DR: It is described how referentially opaque contexts that do not allow inference exist, and some varieties of referentially opaque alternatives to owl:sameAs are outlined, to shed light upon how owl:sameAs is being used (and misused) on the Web of data.
Abstract: In Linked Data, the use of owl:sameAs is ubiquitous in interlinking data-sets. There is however, ongoing discussion about its use, and potential misuse, particularly with regards to interactions with inference. In fact, owl:sameAs can be viewed as encoding only one point on a scale of similarity, one that is often too strong for many of its current uses. We describe how referentially opaque contexts that do not allow inference exist, and then outline some varieties of referentially-opaque alternatives to owl:sameAs. Finally, we report on an empirical experiment over randomly selected owl:sameAs statements from the Web of data. This theoretical apparatus and experiment shed light upon how owl:sameAs is being used (and misused) on the Web of data.
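Because owl:sameAs is symmetric and transitive, a reasoner effectively merges the linked identifiers into equivalence classes, which is why a single over-strong link can conflate distinct entities across data sets. A small illustrative sketch of that closure (example URIs and all names are mine, not from the paper):

```python
# Illustrative sketch (not from the paper): owl:sameAs asserts identity, so a
# reasoner may merge everything known about the linked resources. Computing
# the equivalence classes with union-find shows how sameAs links propagate.

def sameas_classes(pairs):
    """Group URIs into equivalence classes under the symmetric/transitive
    closure of the given owl:sameAs statements (pairs of URI strings)."""
    parent = {}

    def find(u):
        parent.setdefault(u, u)
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    for a, b in pairs:
        parent[find(a)] = find(b)  # union the two classes

    classes = {}
    for u in parent:
        classes.setdefault(find(u), set()).add(u)
    return list(classes.values())
```

Two sameAs statements chaining three identifiers collapse them into one class; if any one of those links is really only a similarity claim, every property of every member leaks to the others, which is the misuse the experiment in the paper probes.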

Journal ArticleDOI
TL;DR: In this paper, the authors construct a stakeholder welfare score measuring the extent to which firms meet the expectation of their non-shareholder stakeholders (such as employees, customers, communities, and environment), and find it to be associated with positive valuation effects.
Abstract: Using data from the independent social choice investment advisory firm Kinder, Lydenberg, Domini (KLD), we construct a stakeholder welfare score measuring the extent to which firms meet the expectations of their non-shareholder stakeholders (such as employees, customers, communities, and the environment), and find it to be associated with positive valuation effects: an increase of 1 in the stakeholder welfare score leads to an increase of 0.587 in Tobin's Q. Furthermore, the valuation effects vary across stakeholders, and the aforementioned positive effects are driven by firms' performance on employee relations and environmental issues. These results suggest that stakeholder welfare (in particular, employee welfare and environmental performance) represents intangibles (such as reputation or human capital) crucial for shareholder value creation rather than private benefits managers pursue for their own social or economic needs.

Journal ArticleDOI
TL;DR: Using global patent data, the authors empirically investigated the importance of both the quantity and quality of innovation on economic growth, controlling for past measures of inventive inputs, and examined how innovation inputs can be translated into per capita growth under various economic structures and stages of economic development.

Journal ArticleDOI
TL;DR: A unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory is proposed, which represents an initial step in the development of process-based theories of creativity encompassing incubation, insight, and various other related phenomena.
Abstract: This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of incubation and insight). The explicit-implicit interaction theory relies mainly on 5 basic principles, namely, (a) the coexistence of and the difference between explicit and implicit knowledge, (b) the simultaneous involvement of implicit and explicit processes in most tasks, (c) the redundant representation of explicit and implicit knowledge, (d) the integration of the results of explicit and implicit processing, and (e) the iterative (and possibly bidirectional) processing. A computational implementation of the theory is developed based on the CLARION cognitive architecture and applied to the simulation of relevant human data. This work represents an initial step in the development of process-based theories of creativity encompassing incubation, insight, and various other related phenomena.

Journal ArticleDOI
22 Nov 2010-Small
TL;DR: The graphene band structure is sensitive to lattice symmetry, and several methods have been developed to break this symmetry and open an energy gap; the absence of such a gap is the major obstacle limiting the use of graphene in nano-electronic and -photonic devices, such as p–n junctions, transistors, photodiodes, and lasers.
Abstract: Graphene, a single-atom-thick layer of sp2-hybridized carbon atoms, has generated considerable excitement in the scientific community due to its peculiar electronic band structure, which leads to unusual phenomena such as the anomalous quantum Hall effect, [ 1,2 ] spin-resolved quantum interference, [ 3 ] ballistic electron transport, [ 4 ] and bipolar supercurrent. [ 5 ] However, pristine graphene is a semimetal with zero bandgap; the local density of states at the Fermi level is zero and conduction can only occur by the thermal excitation of electrons. [ 2 ] This lack of an electronic bandgap is the major obstacle limiting the utilization of graphene in nano-electronic and -photonic devices, [ 6,7 ] such as p–n junctions, transistors, photodiodes, and lasers. The graphene band structure is sensitive to lattice symmetry and several methods have been developed to break this symmetry and open an energy gap. These methods are based on a variety of techniques, such as defect generation, [ 8 ] doping (e.g., with potassium [ 9 ] ), applied bias, [ 10–12 ] and interaction with gases [ 13 ] (e.g., nitrogen dioxide). For instance, in reference [ 12 ] a tunable bandgap of up to 0.25 eV was achieved for electrically gated bilayer graphene by a variable external electric field. Similarly, an internal electric field produced by an imbalance of doped charge between two graphene layers has been shown to open a bandgap. [ 9 ] It has been demonstrated that a gap of ≈ 0.26 eV can be produced by growing graphene epitaxially on silicon carbide substrates. [ 14 ] This gap originated from the breaking of sublattice symmetry due to the graphene–substrate interaction. Patterned adsorption of atomic hydrogen onto the Moiré superlattice positions of graphene [ 15 ] has resulted in a bandgap of ≈ 0.73 eV opening, while half-hydrogenated graphene [ 16 ] resulted in a bandgap of ≈ 0.43 eV. A graphene nanomesh structure [ 17 ] has also been shown to exhibit a bandgap.
In this graphene structure, lateral quantum confinement and localization effects due to

Journal ArticleDOI
TL;DR: In this paper, the authors examined the relationship between the Kamlet-Taft α, β, and π* solvent polarity parameters of different room temperature ionic liquids (RTILs) and effective pretreatment of lignocellulosic biomass.

Journal ArticleDOI
14 Apr 2010-Langmuir
TL;DR: A fast and highly reproducible chemical synthesis method for colloidal gold nanoparticles which are negatively charged in nonpolar solvents and coated with hydrophobic organic molecules that can be deposited to any substrate without any limit in size.
Abstract: We report a fast and highly reproducible chemical synthesis method for colloidal gold nanoparticles which are negatively charged in nonpolar solvents and coated with hydrophobic organic molecules. If a hexane droplet containing charged gold nanoparticles is mixed with a larger toluene droplet, the nanoparticles immediately float to the air–toluene interface and form a close-packed monolayer film. After evaporation of the solvent molecules, the monolayer film of nanoparticles can be deposited onto any substrate without any limit in size. The synthesis does not require a postsynthesis cleaning step, since the two immiscible liquid phases separate the reaction byproducts from the gold nanoparticles and a minimal amount of coating molecules is used.

Journal ArticleDOI
TL;DR: Surprisingly, resveratrol does not remodel non-toxic oligomers or accelerate Aβ monomer aggregation, despite the fact that both conformers possess random-coil secondary structures indistinguishable from soluble oligomers and significantly different from their β-sheet-rich, fibrillar counterparts.

Journal ArticleDOI
TL;DR: The study indicates two key mechanisms for the performance improvement, an optimized 2DHA design that permits an efficient coupling of light from the far-field to a localized plasmonic mode and the close spatial matching of the QD layers to the wave function extent of the plasMonic mode.
Abstract: In this paper, we report a successful realization and integration of a gold two-dimensional hole array (2DHA) structure with semiconductor InAs quantum dot (QD). We show experimentally that a properly designed 2DHA-QD photodetector can facilitate a strong plasmonic-QD interaction, leading to a 130% absolute enhancement of infrared photoresponse at the plasmonic resonance. Our study indicates two key mechanisms for the performance improvement. One is an optimized 2DHA design that permits an efficient coupling of light from the far-field to a localized plasmonic mode. The other is the close spatial matching of the QD layers to the wave function extent of the plasmonic mode. Furthermore, the processing of our 2DHA is amenable to large scale fabrication and, more importantly, does not degrade the noise current characteristics of the photodetector. We believe that this demonstration would bring the performance of QD-based infrared detectors to a level suitable for emerging surveillance and medical diagnostic applications.

Journal ArticleDOI
TL;DR: In this paper, the authors review materials growth, device physics, design, fabrication, and performance of DUV LEDs with wavelength ranging from 210 to 365 nm and describe prototype systems for water purification and sterilization.
Abstract: Compact solid-state deep-ultraviolet (DUV) light-emitting diodes (LEDs) go far beyond replacing conventional DUV sources such as mercury lamps. DUV LEDs enable new applications for air, water, and surface sterilization and decontamination, bioagent detection and identification, UV curing, and biomedical and analytical instrumentation. We review materials growth, device physics, design, fabrication, and performance of DUV LEDs with wavelength ranging from 210 to 365 nm, describe prototype systems for water purification and sterilization, and discuss other emerging applications and systems using DUV LEDs.