
Showing papers by "Johannes Kepler University of Linz" published in 2018


Journal ArticleDOI
22 Jun 2018-Science
TL;DR: It is demonstrated that, in the general population, the personality trait neuroticism is significantly correlated with almost every psychiatric disorder as well as with migraine, and that both psychiatric and neurological disorders show robust correlations with cognitive and personality measures.
Abstract: Disorders of the brain can exhibit considerable epidemiological comorbidity and often share symptoms, provoking debate about their etiologic overlap. We quantified the genetic sharing of 25 brain disorders from genome-wide association studies of 265,218 patients and 784,643 control participants and assessed their relationship to 17 phenotypes from 1,191,588 individuals. Psychiatric disorders share common variant risk, whereas neurological disorders appear more distinct from one another and from the psychiatric disorders. We also identified significant sharing between disorders and a number of brain phenotypes, including cognitive measures. Further, we conducted simulations to explore how statistical power, diagnostic misclassification, and phenotypic heterogeneity affect genetic correlations. These results highlight the importance of common genetic variation as a risk factor for brain disorders and the value of heritability-based methods in understanding their etiology.
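The simulation analyses mentioned above can be illustrated with a toy liability-scale model. The sketch below is a minimal illustration, assuming disorder B's observed genetic liability is contaminated by a fraction m of disorder A's; all variable names and parameter values are hypothetical and are not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000          # individuals (toy scale, not the study's sample sizes)
rg_true = 0.3        # assumed true genetic correlation between disorders A and B

# Correlated genetic liabilities for disorders A and B
shared = rng.standard_normal(n)
a = np.sqrt(rg_true) * shared + np.sqrt(1 - rg_true) * rng.standard_normal(n)
b = np.sqrt(rg_true) * shared + np.sqrt(1 - rg_true) * rng.standard_normal(n)

# Diagnostic misclassification: a fraction m of "B" liability actually comes from A
for m in (0.0, 0.1, 0.3):
    b_obs = (1 - m) * b + m * a
    b_obs /= b_obs.std()
    print(f"m={m:.1f}  observed correlation ≈ {np.corrcoef(a, b_obs)[0, 1]:.3f}")
```

In this toy model, misclassification inflates the apparent correlation between the two disorders, which is one reason such simulations matter when interpreting observed genetic correlations.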

1,357 citations


Journal ArticleDOI
TL;DR: Questioning the disruptive talk associated with digital transformation, it is suggested that the institutional perspective is a productive lens for studying digital innovation and transformation, and that existing institutional arrangements are pivotal arbiters of whether and how novel arrangements gain acceptance.

572 citations


Journal ArticleDOI
24 Dec 2018
TL;DR: This paper conducted preregistered replications of 28 classic and contemporary published findings, with protocols that were peer reviewed in advance, to examine variation in effect magnitudes across samples and settings; very little heterogeneity was attributable to the order in which the tasks were performed or to whether the tasks were administered in the lab versus online.
Abstract: We conducted preregistered replications of 28 classic and contemporary published findings, with protocols that were peer reviewed in advance, to examine variation in effect magnitudes across samples and settings. Each protocol was administered to approximately half of 125 samples that comprised 15,305 participants from 36 countries and territories. Using the conventional criterion of statistical significance (p < .05), we found that 15 (54%) of the replications provided evidence of a statistically significant effect in the same direction as the original finding. With a strict significance criterion (p < .0001), 14 (50%) of the replications still provided such evidence, a reflection of the extremely high-powered design. Seven (25%) of the replications yielded effect sizes larger than the original ones, and 21 (75%) yielded effect sizes smaller than the original ones. The median comparable Cohen’s ds were 0.60 for the original findings and 0.15 for the replications. The effect sizes were small (< 0.20) in 16 of the replications (57%), and 9 effects (32%) were in the direction opposite the direction of the original effect. Across settings, the Q statistic indicated significant heterogeneity in 11 (39%) of the replication effects, and most of those were among the findings with the largest overall effect sizes; only 1 effect that was near zero in the aggregate showed significant heterogeneity according to this measure. Only 1 effect had a tau value greater than .20, an indication of moderate heterogeneity. Eight others had tau values near or slightly above .10, an indication of slight heterogeneity. Moderation tests indicated that very little heterogeneity was attributable to the order in which the tasks were performed or whether the tasks were administered in lab versus online. Exploratory comparisons revealed little heterogeneity between Western, educated, industrialized, rich, and democratic (WEIRD) cultures and less WEIRD cultures (i.e., cultures with relatively high and low WEIRDness scores, respectively). Cumulatively, variability in the observed effect sizes was attributable more to the effect being studied than to the sample or setting in which it was studied.
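The Q statistic and tau reported above are standard meta-analytic heterogeneity measures. Below is a minimal sketch of how they are computed (inverse-variance weights with the DerSimonian–Laird tau² estimator); the effect sizes and variances are hypothetical, not Many Labs 2 data.

```python
import numpy as np

def heterogeneity(d, v):
    """Cochran's Q and DerSimonian-Laird tau for per-sample effects.

    d : array of effect sizes (e.g., Cohen's d per replication sample)
    v : array of their sampling variances
    """
    w = 1.0 / v                                  # inverse-variance weights
    d_bar = np.sum(w * d) / np.sum(w)            # fixed-effect pooled estimate
    Q = np.sum(w * (d - d_bar) ** 2)             # heterogeneity statistic
    df = len(d) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                # between-sample variance
    return d_bar, Q, np.sqrt(tau2)

# Hypothetical effects from a handful of samples
d = np.array([0.55, 0.70, 0.48, 0.62, 0.58])
v = np.array([0.010, 0.020, 0.015, 0.012, 0.020])
pooled, Q, tau = heterogeneity(d, v)
print(f"pooled d = {pooled:.2f}, Q = {Q:.2f}, tau = {tau:.2f}")
```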

495 citations


Journal ArticleDOI
TL;DR: A broad review of wetting in membrane distillation (MD) processes is carried out, describing wetting mechanisms, causes, and detection methods, as well as hydrophobicity measurements of MD membranes.

446 citations


Journal ArticleDOI
TL;DR: The largest comparative study to date of nine state-of-the-art drug target prediction methods finds that deep learning outperforms all other competitors.
Abstract: Deep learning is currently the most successful machine learning technique in a wide range of application areas and has recently been applied successfully in drug discovery research to predict potential drug targets and to screen for active molecules. However, due to (1) the lack of large-scale studies, (2) the compound series bias that is characteristic of drug discovery datasets and (3) the hyperparameter selection bias that comes with the high number of potential deep learning architectures, it remains unclear whether deep learning can indeed outperform existing computational methods in drug discovery tasks. We therefore assessed the performance of several deep learning methods on a large-scale drug discovery dataset and compared the results with those of other machine learning and target prediction methods. To avoid potential biases from hyperparameter selection or compound series, we used a nested cluster-cross-validation strategy. We found (1) that deep learning methods significantly outperform all competing methods and (2) that the predictive performance of deep learning is in many cases comparable to that of tests performed in wet labs (i.e., in vitro assays).
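A minimal sketch of the nested cluster-cross-validation idea: compound-series clusters are kept intact across folds, so hyperparameter selection (inner loop) and performance estimation (outer loop) never mix compounds from the same series. The data, model, and hyperparameter grid below are hypothetical stand-ins, not the study's networks or datasets.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 32))            # compound descriptors (toy)
y = rng.integers(0, 2, 600)                   # active / inactive labels (toy)
clusters = rng.integers(0, 20, 600)           # compound-series cluster IDs

outer = GroupKFold(n_splits=5)
for train, test in outer.split(X, y, groups=clusters):
    best_auc, best_depth = -np.inf, None
    inner = GroupKFold(n_splits=3)
    for depth in (4, 8, 16):                  # hyperparameter grid (toy)
        aucs = []
        for itr, ival in inner.split(X[train], y[train], groups=clusters[train]):
            clf = RandomForestClassifier(max_depth=depth, random_state=0)
            clf.fit(X[train][itr], y[train][itr])
            aucs.append(roc_auc_score(y[train][ival],
                                      clf.predict_proba(X[train][ival])[:, 1]))
        if np.mean(aucs) > best_auc:
            best_auc, best_depth = np.mean(aucs), depth
    clf = RandomForestClassifier(max_depth=best_depth, random_state=0)
    clf.fit(X[train], y[train])
    print("outer-fold AUC:", roc_auc_score(y[test], clf.predict_proba(X[test])[:, 1]))
```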

339 citations


Journal ArticleDOI
TL;DR: DeepSynergy uses chemical and genomic information as input information, a normalization strategy to account for input data heterogeneity, and conical layers to model drug synergies and could be a valuable tool for selecting novel synergistic drug combinations.
Abstract: Motivation While drug combination therapies are a well-established concept in cancer treatment, identifying novel synergistic combinations is challenging due to the size of combinatorial space. However, computational approaches have emerged as a time- and cost-efficient way to prioritize combinations to test, based on recently available large-scale combination screening data. Recently, Deep Learning has had an impact in many research areas by achieving new state-of-the-art model performance. However, Deep Learning has not yet been applied to drug synergy prediction, which is the approach we present here, termed DeepSynergy. DeepSynergy uses chemical and genomic information as input information, a normalization strategy to account for input data heterogeneity, and conical layers to model drug synergies. Results DeepSynergy was compared to other machine learning methods such as Gradient Boosting Machines, Random Forests, Support Vector Machines and Elastic Nets on the largest publicly available synergy dataset with respect to mean squared error. DeepSynergy significantly outperformed the other methods with an improvement of 7.2% over the second best method at the prediction of novel drug combinations within the space of explored drugs and cell lines. At this task, the mean Pearson correlation coefficient between the measured and the predicted values of DeepSynergy was 0.73. Applying DeepSynergy for classification of these novel drug combinations resulted in a high predictive performance of an AUC of 0.90. Furthermore, we found that all compared methods exhibit low predictive performance when extrapolating to unexplored drugs or cell lines, which we suggest is due to limitations in the size and diversity of the dataset. We envision that DeepSynergy could be a valuable tool for selecting novel synergistic drug combinations. Availability and implementation DeepSynergy is available via www.bioinf.jku.at/software/DeepSynergy. Contact klambauer@bioinf.jku.at. Supplementary information Supplementary data are available at Bioinformatics online.
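A minimal sketch of a "conical" (tapering) fully connected network of the kind the abstract describes, written in PyTorch under assumed layer widths and input sizes; the actual DeepSynergy hyperparameters are specified in the paper, not here.

```python
import torch
import torch.nn as nn

# Tapering ("conical") fully connected stack: each hidden layer is narrower
# than the last, funnelling the concatenated drug/cell-line features toward
# a single synergy-score output. Layer sizes are illustrative only.
def conical_mlp(in_dim: int, widths=(2048, 1024, 512), p_drop=0.5) -> nn.Sequential:
    layers, prev = [], in_dim
    for w in widths:
        layers += [nn.Linear(prev, w), nn.ReLU(), nn.Dropout(p_drop)]
        prev = w
    layers.append(nn.Linear(prev, 1))        # regression head: synergy score
    return nn.Sequential(*layers)

# Input: chemical descriptors of both drugs + genomic features of the cell line
x = torch.randn(8, 4096)                     # batch of 8 hypothetical examples
model = conical_mlp(4096)
print(model(x).shape)                        # torch.Size([8, 1])
```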

317 citations


Journal ArticleDOI
TL;DR: In this paper, the authors presented a solid-state source of on-demand single photons yielding a raw second-order coherence of g(2)(0) = (7.5±1.6)×10⁻⁵ without any background subtraction or data processing.
Abstract: True on-demand high-repetition-rate single-photon sources are highly sought after for quantum information processing applications. However, any coherently driven two-level quantum system suffers from a finite re-excitation probability under pulsed excitation, causing undesirable multi-photon emission. Here, we present a solid-state source of on-demand single photons yielding a raw second-order coherence of g(2)(0) = (7.5±1.6)×10⁻⁵ without any background subtraction or data processing. To date, this is the lowest value of g(2)(0) reported for any single-photon source, even compared with the best previously reported background-subtracted values. We achieve this result on GaAs/AlGaAs quantum dots embedded in a low-Q planar cavity by employing (i) a two-photon excitation process and (ii) a filtering and detection setup featuring two superconducting single-photon detectors with ultralow dark-count rates of (0.0056±0.0007) s⁻¹ and (0.017±0.001) s⁻¹, respectively. Re-excitation processes are dramatically suppressed...
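Under pulsed excitation, g(2)(0) is commonly estimated as the ratio of coincidences in the zero-delay peak of a Hanbury Brown–Twiss histogram to the average side-peak coincidences. A minimal sketch with hypothetical counts:

```python
import numpy as np

# Integrated coincidence counts per laser-repetition peak of an HBT
# histogram (hypothetical numbers; the zero-delay peak sits in the middle).
side_peaks = np.array([41210, 40987, 41105, 41002, 40876, 41190])
zero_peak = 3

g2_0 = zero_peak / side_peaks.mean()
err = g2_0 / np.sqrt(zero_peak)     # Poisson error, dominated by the tiny peak
print(f"g2(0) = {g2_0:.1e} ± {err:.1e}")
```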

232 citations


Journal ArticleDOI
TL;DR: Germanium's electronic structure and large, tunable spin-orbit coupling make it a good material for constructing hole-based quantum devices, and two-axis control of a hole spin qubit in a germanium double quantum dot is demonstrated.
Abstract: Holes confined in quantum dots have gained considerable interest in the past few years due to their potential as spin qubits. Here we demonstrate two-axis control of a spin 3/2 qubit in natural Ge. The qubit is formed in a hut wire double quantum dot device. The Pauli spin blockade principle allowed us to demonstrate electric dipole spin resonance by applying a radio frequency electric field to one of the electrodes defining the double quantum dot. Coherent hole spin oscillations with Rabi frequencies reaching 140 MHz are demonstrated and dephasing times of 130 ns are measured. The reported results emphasize the potential of Ge as a platform for fast and electrically tunable hole spin qubit devices.
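The reported Rabi frequency and dephasing time can be plugged into a standard decaying-oscillation model. A minimal sketch assuming a Gaussian decay envelope (one common phenomenological choice; the paper's actual fit function may differ):

```python
import numpy as np

f_rabi = 140e6      # Rabi frequency reported in the abstract (140 MHz)
T2_star = 130e-9    # dephasing time reported in the abstract (130 ns)
t = np.linspace(0, 400e-9, 9)   # drive-burst durations

# Spin-up probability of a decaying Rabi oscillation; the Gaussian envelope
# exp(-(t/T2*)^2) is an assumed phenomenological choice.
p_up = 0.5 * (1 - np.cos(2 * np.pi * f_rabi * t) * np.exp(-(t / T2_star) ** 2))
for ti, pi in zip(t, p_up):
    print(f"t = {ti*1e9:5.1f} ns  P_up = {pi:.2f}")
```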

214 citations


Posted Content
TL;DR: A large-scale human study is contributed, which confirms that FVD correlates well with qualitative human judgment of generated videos, and provides initial benchmark results on SCV.
Abstract: Recent advances in deep generative models have led to remarkable progress in synthesizing high quality images. Following their successful application in image processing and representation learning, an important next step is to consider videos. Learning generative models of video is a much harder task, requiring a model to capture the temporal dynamics of a scene, in addition to the visual presentation of objects. While recent attempts at formulating generative models of video have had some success, current progress is hampered by (1) the lack of qualitative metrics that consider visual quality, temporal coherence, and diversity of samples, and (2) the wide gap between purely synthetic video data sets and challenging real-world data sets in terms of complexity. To this end we propose Fréchet Video Distance (FVD), a new metric for generative models of video, and StarCraft 2 Videos (SCV), a benchmark of game play from custom StarCraft 2 scenarios that challenge the current capabilities of generative models of video. We contribute a large-scale human study, which confirms that FVD correlates well with qualitative human judgment of generated videos, and provide initial benchmark results on SCV.

210 citations


Journal ArticleDOI
TL;DR: An evaluation metric for generative models called the Fréchet ChemNet distance (FCD) is proposed that can detect whether generated molecules are diverse and have chemical and biological properties similar to those of real molecules.
Abstract: The new wave of successful generative models in machine learning has increased the interest in deep-learning-driven de novo drug design. However, method comparison is difficult because of various flaws of the currently employed evaluation metrics. We propose an evaluation metric for generative models called the Fréchet ChemNet distance (FCD). The advantage of the FCD over previous metrics is that it can detect whether generated molecules are diverse and have chemical and biological properties similar to those of real molecules.
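Both the FCD and the FVD mentioned earlier are Fréchet (2-Wasserstein) distances between Gaussians fitted to hidden-layer activations, d² = |μ₁−μ₂|² + Tr(C₁+C₂−2(C₁C₂)^½). A minimal sketch with hypothetical 64-dimensional feature statistics:

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, cov1, mu2, cov2):
    """Fréchet distance between two Gaussians fitted to feature activations."""
    covmean = sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):          # numerical noise can yield tiny
        covmean = covmean.real            # imaginary parts; discard them
    diff = mu1 - mu2
    return diff @ diff + np.trace(cov1 + cov2 - 2 * covmean)

# Hypothetical feature statistics of "real" vs. "generated" molecules
rng = np.random.default_rng(0)
real = rng.standard_normal((1000, 64))
fake = rng.standard_normal((1000, 64)) + 0.1
fcd = frechet_distance(real.mean(0), np.cov(real, rowvar=False),
                       fake.mean(0), np.cov(fake, rowvar=False))
print(f"Fréchet distance ≈ {fcd:.3f}")
```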

207 citations


Journal ArticleDOI
TL;DR: This study establishes an open resource for dissecting DNA methylation heterogeneity in a genetically diverse and heterogeneous cancer, and demonstrates the feasibility of integrating epigenomics, radiology, and digital pathology for a national cohort, thereby leveraging existing samples and data collected as part of routine clinical practice.
Abstract: Glioblastoma is characterized by widespread genetic and transcriptional heterogeneity, yet little is known about the role of the epigenome in glioblastoma disease progression. Here, we present genome-scale maps of DNA methylation in matched primary and recurring glioblastoma tumors, using data from a highly annotated clinical cohort that was selected through a national patient registry. We demonstrate the feasibility of DNA methylation mapping in a large set of routinely collected FFPE samples, and we validate bisulfite sequencing as a multipurpose assay that allowed us to infer a range of different genetic, epigenetic, and transcriptional characteristics of the profiled tumor samples. On the basis of these data, we identified subtle differences between primary and recurring tumors, links between DNA methylation and the tumor microenvironment, and an association of epigenetic tumor heterogeneity with patient survival. In summary, this study establishes an open resource for dissecting DNA methylation heterogeneity in a genetically diverse and heterogeneous cancer, and it demonstrates the feasibility of integrating epigenomics, radiology, and digital pathology for a national cohort, thereby leveraging existing samples and data collected as part of routine clinical practice. In-depth methylation analysis of formalin-fixed paraffin-embedded glioblastoma samples demonstrates heterogeneity between primary and recurring tumors and enables prediction of composition of the tumor microenvironment and insights into progression.
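At the level of individual CpG sites, bisulfite sequencing data reduce to methylated/unmethylated read counts, from which a methylation level is estimated. A minimal sketch with hypothetical counts (the generic computation, not the study's full analysis pipeline):

```python
import numpy as np

# Per-CpG methylated (M) and unmethylated (U) read counts from bisulfite
# sequencing (hypothetical values for a handful of CpG sites).
M = np.array([18, 2, 30, 7, 0])
U = np.array([2, 25, 1, 8, 19])

beta = M / (M + U + 1e-9)          # methylation level in [0, 1] per CpG
print(np.round(beta, 2))
```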

Journal ArticleDOI
26 Feb 2018-Polymers
TL;DR: The main approaches for preparing chitosan nanoparticles by self-assembly through both grafting and polyelectrolyte complexes with polyanions are reviewed, and the state of the art of their application in drug delivery is illustrated.
Abstract: Chitosan is a cationic polysaccharide that is usually obtained by alkaline deacetylation of chitin, poly(N-acetylglucosamine). It is biocompatible, biodegradable, mucoadhesive, and non-toxic. These excellent biological properties make chitosan a good candidate for a platform in developing drug delivery systems having improved biodistribution, increased specificity and sensitivity, and reduced pharmacological toxicity. In particular, chitosan nanoparticles are found to be appropriate for non-invasive routes of drug administration: oral, nasal, pulmonary and ocular routes. These applications are facilitated by the absorption-enhancing effect of chitosan. Many procedures for obtaining chitosan nanoparticles have been proposed. Particularly, the introduction of hydrophobic moieties into chitosan molecules by grafting to generate a hydrophobic-hydrophilic balance promoting self-assembly is a current and appealing approach. The grafting agent can be a hydrophobic moiety forming micelles that can entrap lipophilic drugs or it can be the drug itself. Another suitable way to generate self-assembled chitosan nanoparticles is through the formation of polyelectrolyte complexes with polyanions. This paper reviews the main approaches for preparing chitosan nanoparticles by self-assembly through both procedures, and illustrates the state of the art of their application in drug delivery.

Journal ArticleDOI
TL;DR: This paper serves as a reference for both academics and practicing engineers on recent developments and future trends in electrical machine design optimization, and it defines optimization scenarios covering geometry specification and goal setting.
Abstract: This paper surveys disruptive innovations in electrical machine design optimization, motivated by emerging trends. Improvements in mathematics and computer science enable more detailed optimization scenarios that cover ever more aspects of physics. In the past, electrical machine design was equivalent to investigating the electromagnetic performance. Nowadays, thermal, rotor-dynamic, power-electronic, and control aspects are included. Materials science and engineering have introduced new dimensions to the optimization process, and the impact of manufacturing and of unavoidable tolerances should be considered. Consequently, multifaceted scenarios are analyzed and improvements in numerous fields take effect. This paper is a reference for both academics and practicing engineers regarding recent developments and future trends. It comprises the definition of optimization scenarios regarding geometry specification and goal setting. Moreover, a materials-based perspective and techniques for solving optimization problems are included. Finally, a collection of examples from the literature is presented and two particular scenarios are illustrated in detail.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the volatility spillovers and co-movements among oil prices and stock prices of major oil and gas corporations over the period between 18th June 2001 and 1st February 2016.

Journal ArticleDOI
TL;DR: In this article, the authors provide an inventory of and structure for extant research on the image and reputation of family firms, and summarize the ways in which the public perceives family firms and existing influencing factors, courses of action and impacts.
Abstract: The special characteristics of family firms, such as the owning family’s involvement and control or its strong identification with the business, make creating and preserving a good reputation desirable. Recent studies confirm the positive influence of a firm’s reputation on organizational success and non-financial goals, such as customer retention and social capital. The image and reputation of family firms have been the subject of numerous studies. Despite increasing research intensity, a comprehensive overview of this topic is still lacking. This work provides an inventory of and structure for extant research on the image and reputation of family firms. To this end, a systematic literature analysis has been performed, which includes 73 papers from scientific journals from various business fields. Image and reputation are discussed in different theoretical and geographical contexts. Moreover, this contribution summarizes the ways in which the public perceives family firms and existing influencing factors, courses of action and impacts; in a subsequent step, this work integrates these findings into a model that can serve as starting point for future research activities.

Journal ArticleDOI
TL;DR: In this article, the authors used GaAs quantum dots integrated on a patterned piezoelectric actuator capable of suppressing the exciton fine structure splitting to achieve nearly maximally entangled photon pairs from semiconductor quantum dots without resorting to postselection techniques.
Abstract: We report on the observation of nearly maximally entangled photon pairs from semiconductor quantum dots, without resorting to postselection techniques. We use GaAs quantum dots integrated on a patterned piezoelectric actuator capable of suppressing the exciton fine structure splitting. By using a resonant two-photon excitation, we coherently drive the biexciton state and demonstrate experimentally that our device generates polarization-entangled photons with a fidelity of 0.978(5) and a concurrence of 0.97(1) taking into account the nonidealities stemming from the experimental setup. By combining fine-structure-dependent fidelity measurements and a theoretical model, we identify an exciton spin-scattering process as a possible residual decoherence mechanism. We suggest that this imperfection may be overcome using a modest Purcell enhancement so as to achieve fidelities >0.99, thus making quantum dots evenly matched with the best probabilistic entangled photon sources.
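The concurrence quoted above is the standard Wootters measure for two-qubit states. A minimal sketch computing it for a hypothetical slightly mixed Bell state (not the measured density matrix from the experiment):

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix:
    C = max(0, l1 - l2 - l3 - l4), with l_i the square roots of the
    eigenvalues of rho * (sy x sy) rho* (sy x sy), sorted descending."""
    sy = np.array([[0, -1j], [1j, 0]])
    R = rho @ np.kron(sy, sy) @ rho.conj() @ np.kron(sy, sy)
    lam = np.sqrt(np.abs(np.sort(np.linalg.eigvals(R).real)[::-1]))
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Slightly mixed Bell state as a hypothetical stand-in for the measured
# two-photon polarization state (the paper reports C = 0.97(1)).
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = 0.98 * np.outer(phi_plus, phi_plus) + 0.02 * np.eye(4) / 4
print(f"concurrence = {concurrence(rho):.3f}")
```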

Journal ArticleDOI
TL;DR: It is hypothesized that data from a single high-throughput imaging assay can be repurposed to predict the biological activity of compounds in other assays, even those targeting alternate pathways or biological processes.

Journal ArticleDOI
TL;DR: This paper proposes a stochastic model predictive control approach to optimize the fuel consumption in a vehicle following context using a conditional linear Gauss model to estimate the probability distribution of the future velocity of the preceding vehicle.
Abstract: This paper proposes a stochastic model predictive control (MPC) approach to optimize the fuel consumption in a vehicle following context. The practical solution of that problem requires solving a constrained moving horizon optimal control problem using a short-term prediction of the preceding vehicle’s velocity. In a deterministic framework, the prediction errors lead to constraint violations and to harsh control reactions. Instead, the suggested method considers errors, and limits the probability of a constraint violation. A conditional linear Gauss model is developed and trained with real measurements to estimate the probability distribution of the future velocity of the preceding vehicle. The prediction model is used to evaluate two different stochastic MPC approaches. On the one hand, an MPC with individual chance constraints is applied. On the other hand, samples are drawn from the conditional Gaussian model and used for a scenario-based optimization approach. Finally, both developed control strategies are evaluated and compared against a standard deterministic MPC. The evaluation of the controllers shows a significant reduction of the fuel consumption compared with standard adaptive cruise control algorithms.
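A minimal sketch of the individual-chance-constraint idea: the deterministic minimum-gap constraint is tightened by a quantile of the predicted position error of the preceding vehicle, which grows along the horizon. All numbers are hypothetical, and this is not the paper's full MPC formulation.

```python
import numpy as np
from scipy.stats import norm

# Individual chance constraint: require P(gap >= d_min) >= 1 - eps by
# backing off the deterministic constraint with a quantile of the predicted
# position error of the preceding vehicle.
eps = 0.05                   # allowed violation probability
dt, horizon = 0.5, 10        # step length (s) and horizon (steps)
d_min = 5.0                  # minimum safe gap (m)
sigma_v = 0.4                # std of one-step velocity prediction error (m/s)

for k in range(1, horizon + 1):
    # Assuming independent per-step velocity errors, the position error
    # standard deviation grows with sqrt(k) along the horizon.
    sigma_pos = sigma_v * dt * np.sqrt(k)
    backoff = norm.ppf(1 - eps) * sigma_pos
    print(f"step {k:2d}: enforce gap >= {d_min + backoff:.2f} m")
```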

Journal ArticleDOI
TL;DR: In this article, the authors identify and shed light on what they believe are the most pressing challenges in recommender systems from both academic and industry perspectives, and detail possible future directions and visions for the further evolution of the field.
Abstract: Music recommender systems (MRSs) have experienced a boom in recent years, thanks to the emergence and success of online streaming services, which nowadays make almost all of the world's music available at the user's fingertips. While today's MRSs considerably help users find interesting music in these huge catalogs, MRS research still faces substantial challenges. In particular, when it comes to building, incorporating, and evaluating recommendation strategies that integrate information beyond simple user–item interactions or content-based descriptors and dig deep into the very essence of listener needs, preferences, and intentions, MRS research becomes a major endeavor, and related publications remain quite sparse. The purpose of this trends and survey article is twofold. We first identify and shed light on what we believe are the most pressing challenges MRS research is facing, from both academic and industry perspectives. We review the state of the art toward solving these challenges and discuss its limitations. Second, we detail possible future directions and visions we contemplate for the further evolution of the field. The article should therefore serve two purposes: giving the interested reader an overview of current challenges in MRS research and providing guidance for young researchers by identifying interesting, yet under-researched, directions in the field.

Journal ArticleDOI
05 Jun 2018-JAMA
TL;DR: When assessed among hospitalized adults with suspected infection in 9 LMIC cohorts, the qSOFA score identified infected patients at risk of death beyond that explained by baseline factors, however, the predictive validity varied among cohorts and settings, and further research is needed to better understand potential generalizability.
Abstract: Importance The quick Sequential (Sepsis-Related) Organ Failure Assessment (qSOFA) score has not been well evaluated in low- and middle-income countries (LMICs). Objective To assess the association of qSOFA with excess hospital death among patients with suspected infection in LMICs and to compare qSOFA with the systemic inflammatory response syndrome (SIRS) criteria. Design, Settings, and Participants Retrospective secondary analysis of 8 cohort studies and 1 randomized clinical trial from 2003 to 2017. This study included 6569 hospitalized adults with suspected infection in emergency departments, inpatient wards, and intensive care units of 17 hospitals in 10 LMICs across sub-Saharan Africa, Asia, and the Americas. Exposures Low (0), moderate (1), or high (≥2) qSOFA score (range, 0 [best] to 3 [worst]) or SIRS criteria (range, 0 [best] to 4 [worst]) within 24 hours of presentation to the study hospital. Main Outcomes and Measures Predictive validity (measured as incremental hospital mortality beyond that predicted by baseline risk factors, as a marker of sepsis or an analogous severe infectious course) of the qSOFA score (primary) and SIRS criteria (secondary). Results The cohorts were diverse in enrollment criteria, demographics (median ages, 29-54 years; males range, 36%-76%), HIV prevalence (range, 2%-43%), cause of infection, and hospital mortality (range, 1%-39%). Among 6218 patients with nonmissing outcome status in the combined cohort, 643 (10%) died. Compared with a low or moderate score, a high qSOFA score was associated with increased risk of death overall (19% vs 6%; difference, 13% [95% CI, 11%-14%]; odds ratio, 3.6 [95% CI, 3.0-4.2]) and across cohorts. Conclusions and Relevance When assessed among hospitalized adults with suspected infection in 9 LMIC cohorts, the qSOFA score identified infected patients at risk of death beyond that explained by baseline factors. However, the predictive validity varied among cohorts and settings, and further research is needed to better understand potential generalizability.
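The qSOFA score itself is simple to compute from three bedside variables. A minimal sketch using the standard criteria (respiratory rate ≥ 22/min, systolic blood pressure ≤ 100 mm Hg, altered mentation):

```python
def qsofa(resp_rate: float, sys_bp: float, gcs: int) -> int:
    """quick SOFA: one point each for respiratory rate >= 22/min,
    systolic blood pressure <= 100 mm Hg, and altered mentation (GCS < 15)."""
    return int(resp_rate >= 22) + int(sys_bp <= 100) + int(gcs < 15)

# A hypothetical patient at presentation
score = qsofa(resp_rate=24, sys_bp=95, gcs=15)
band = ["low", "moderate", "high"][min(score, 2)]
print(score, band)   # 2 -> "high" (>= 2), matching the study's categories
```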

Journal ArticleDOI
TL;DR: A model for the logic synthesis of QCA circuits is proposed that considers and abstracts all main physical aspects, in particular, energy dissipation, and provides the basis for a new generation of synthesis approaches at the logic level that are explicitly dedicated to QCA systems.
Abstract: Quantum-dot cellular automata (QCA) are an emerging field-coupled nanotechnology with remarkable performance and energy efficiency. In order to enable the exploration of this technology, we propose a model for the logic synthesis of QCA circuits that, for the first time, considers and abstracts all main physical aspects—in particular, energy dissipation. To this end, we review in detail how energy is dissipated in QCA cells and present a corresponding environment that allows for the estimation of the energy dissipation with respect to any specific set of technology parameters. Based on that, we derive a model for logic synthesis. A case study confirms the accuracy of the proposed model and reveals that interconnections have a significant impact in this technology—motivating a more rigorous consideration. These findings eventually provide the basis for a new generation of synthesis approaches at the logic level that are explicitly dedicated to QCA systems.

Journal ArticleDOI
TL;DR: In this article, the authors review the relevant experiments which have led to these important discoveries and discuss the remaining challenges for the anticipated quantum technologies, as well as the remaining opportunities for quantum information science and technology.
Abstract: More than 80 years have passed since the first publication on entangled quantum states. In this period of time, the concept of spookily interacting quantum states became an emerging field of science. After various experiments proving the existence of such non-classical states, visionary ideas were put forward to exploit entanglement in quantum information science and technology. These novel concepts have not yet left the experimental stage, mostly because of the lack of suitable, deterministic sources of entangled quantum states. Among the many systems under investigation, semiconductor quantum dots are particularly appealing emitters of on-demand, single polarization-entangled photon pairs. It was originally believed, however, that quantum dots must exhibit a limited degree of entanglement owing to the numerous decoherence effects present in the solid state. Recent studies have invalidated this premise of unavoidable entanglement-degrading effects. We review the relevant experiments which have led to these important discoveries and discuss the remaining challenges for the anticipated quantum technologies.

Journal ArticleDOI
TL;DR: VL, compared with DL, is associated with a greater rate of first-pass emergency intubation in the ICU and among less experienced clinicians, and it reduces oesophageal intubations; however, VL is associated with a greater incidence of arterial hypotension.
Abstract: Videolaryngoscopy (VL) may improve the success of orotracheal intubation compared with direct laryngoscopy (DL). We performed a systematic search of PubMed, Embase, and CENTRAL databases for studies comparing VL and DL for emergency orotracheal intubations outside the operating room. The primary outcome was rate of first-pass intubation, with subgroup analyses by location, device used, clinician experience, and clinical scenario. The secondary outcome was complication rates. Data are presented as [odds ratio (95% confidence intervals); P-values]. We identified 32 studies with 15 064 emergency intubations. There was no difference in first-pass intubation with VL compared with DL [OR=1.28, (0.99–1.65); P=0.06]. First-pass intubations were increased with VL compared with DL in the intensive care unit (ICU) [2.02 (1.43–2.85); P
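The odds ratios above come from pooling 2×2 tables. A minimal sketch of the single-table computation with a Woolf (log-OR) confidence interval; the counts are hypothetical, not data from the meta-analysis.

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with 95% CI from a 2x2 table:
    a = VL success, b = VL failure, c = DL success, d = DL failure."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se)
    return or_, lo, hi

# Hypothetical single-study counts
print(odds_ratio_ci(180, 20, 150, 50))
```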

Journal ArticleDOI
TL;DR: The results substantiate the conclusion that these devices are the first non-Si optoelectronic platform capable of sufficiently large photovoltages and displacement currents to enable true capacitive stimulation of excitable cells.
Abstract: An efficient nanoscale semiconducting optoelectronic system is reported, which is optimized for neuronal stimulation: the organic electrolytic photocapacitor. The devices comprise a thin (80 nm) trilayer of metal and p-n semiconducting organic nanocrystals. When illuminated in physiological solution, these metal-semiconductor devices charge up, transducing light pulses into localized displacement currents that are strong enough to electrically stimulate neurons with safe light intensities. The devices are freestanding, requiring no wiring or external bias, and are stable in physiological conditions. The semiconductor layers are made using ubiquitous and nontoxic commercial pigments via simple and scalable deposition techniques. It is described how, in physiological media, photovoltage and charging behavior depend on device geometry. To test cell viability and capability of neural stimulation, photostimulation of primary neurons cultured for three weeks on photocapacitor films is shown. Finally, the efficacy of the device is demonstrated by achieving direct optoelectronic stimulation of light-insensitive retinas, proving the potential of this device platform for retinal implant technologies and for stimulation of electrogenic tissues in general. These results substantiate the conclusion that these devices are the first non-Si optoelectronic platform capable of sufficiently large photovoltages and displacement currents to enable true capacitive stimulation of excitable cells.

Journal ArticleDOI
TL;DR: This study measured food waste at a hotel breakfast buffet and identified the following guest and breakfast characteristics as being significantly associated with higher plate waste: more children in the guest mix, more Russian and fewer Austrian or German guests, fewer hotel guests in the breakfast buffet area, and more buffet stations being set up.
Abstract: Tourists bite off more than they can chew at hotel breakfast buffets. Food waste from hotel buffets means unnecessary food cost for hotels as well as an unnecessary burden on the environment. The present study measured food waste at a hotel breakfast buffet and identified the following guest and breakfast characteristics as being significantly associated with higher plate waste: more children in the guest mix, more Russian and fewer Austrian or German guests, fewer hotel guests in the breakfast buffet area, and more buffet stations being set up. These insights contribute to knowledge on environmental sustainability in tourism, pointing to interesting market segments for targeting in high-demand periods as well as promising target segments for interventions (e.g., families), and indicate that simple measures such as rearrangements of the breakfast room may reduce food waste.

Journal ArticleDOI
TL;DR: In this paper, the 3-loop master integrals for heavy quark correlators and the 3-loop quantum chromodynamics corrections to the ρ-parameter were derived in terms of 2F1 Gauss hypergeometric functions at rational argument.
Abstract: We calculate 3-loop master integrals for heavy quark correlators and the 3-loop quantum chromodynamics corrections to the ρ-parameter. They obey non-factorizing differential equations of second order with more than three singularities, which cannot be factorized in Mellin-N space either. The solution of the homogeneous equations is possible in terms of 2F1 Gauss hypergeometric functions at rational argument. In some cases, integrals of this type can be mapped to complete elliptic integrals at rational argument. This class of functions appears to be the next one arising in the calculation of more complicated Feynman integrals, following the harmonic polylogarithms, generalized polylogarithms, cyclotomic harmonic polylogarithms, square-root valued iterated integrals, and combinations thereof, which appear in simpler cases. The inhomogeneous solution of the corresponding differential equations can be given in terms of iterative integrals, where the new innermost letter itself is not an iterative integral. A new class of iterative integrals is introduced containing letters in which (multiple) definite integrals appear as factors. For the elliptic case, we also derive the solution in terms of integrals over modular functions and also modular forms, using q-product and series representations implied by Jacobi's ϑ_i functions and Dedekind's η-function. The corresponding representations can be traced back to polynomials out of Lambert–Eisenstein series, having representations also as elliptic polylogarithms, a q-factorial 1/η^k(τ), logarithms, and polylogarithms of q and their q-integrals. Due to the specific form of the physical variable x(q) for different processes, different representations do usually appear. Numerical results are also presented.
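For reference, the modular building blocks named in the abstract have standard textbook definitions (with nome q = e^{2πiτ}; theta-function conventions vary between texts):

```latex
% Dedekind's eta function and one of Jacobi's theta functions,
% with q = e^{2\pi i \tau} (conventions for \vartheta vary by a
% factor in the exponent):
\eta(\tau) = q^{1/24}\prod_{n=1}^{\infty}\bigl(1-q^{n}\bigr),
\qquad
\vartheta_{3}(\tau) = \sum_{n=-\infty}^{\infty} q^{\,n^{2}/2}.
```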

Journal ArticleDOI
TL;DR: The efficacy of the pENsemble has been numerically demonstrated through rigorous numerical studies with dynamic and evolving data streams, where it delivers the most encouraging performance in attaining a tradeoff between accuracy and complexity.
Abstract: The concept of ensemble learning offers a promising avenue in learning from data streams under complex environments because it better addresses the bias and variance dilemma than its single-model counterpart and features a reconfigurable structure, which is well suited to the given context. While various extensions of ensemble learning for mining nonstationary data streams can be found in the literature, most of them are crafted under a static base-classifier and revisit preceding samples in the sliding window for a retraining step. This feature causes computationally prohibitive complexity and is not flexible enough to cope with rapidly changing environments. Their complexities are often demanding because they involve a large collection of offline classifiers, due to the absence of structural complexity reduction mechanisms, and they lack an online feature selection mechanism. A novel evolving ensemble classifier, namely Parsimonious Ensemble (pENsemble), is proposed in this paper. pENsemble differs from existing architectures in the fact that it is built upon an evolving classifier from data streams, termed Parsimonious Classifier. pENsemble is equipped with an ensemble pruning mechanism, which estimates a localized generalization error of a base classifier. A dynamic online feature selection scenario is integrated into the pENsemble. This method allows for dynamic selection and deselection of input features on the fly. pENsemble adopts a dynamic ensemble structure to output a final classification decision, where it features a novel drift detection scenario to grow the ensemble's structure. The efficacy of the pENsemble has been numerically demonstrated through rigorous numerical studies with dynamic and evolving data streams, where it delivers the most encouraging performance in attaining a tradeoff between accuracy and complexity.
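A minimal sketch of the general evolving-ensemble pattern described above (online members, error-based pruning, and a crude drift reaction); this is an illustrative stand-in, not the pENsemble algorithm itself.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

class OnlineEnsemble:
    """Toy evolving ensemble: members learn online, each member's error is
    tracked with an exponentially weighted moving average, persistently weak
    members are pruned, and a fresh member is spawned when all survivors are
    weak (a crude stand-in for a drift detector)."""

    def __init__(self, n_classes=2, decay=0.99, prune_at=0.6, max_members=5):
        self.decay, self.prune_at, self.max_members = decay, prune_at, max_members
        self.classes = np.arange(n_classes)
        self.members = [SGDClassifier(loss="log_loss")]
        self.errs = [0.5]

    def update(self, x, y):
        x = x.reshape(1, -1)
        for i, m in enumerate(self.members):
            if hasattr(m, "coef_"):                     # member already fitted
                err = float(m.predict(x)[0] != y)
                self.errs[i] = self.decay * self.errs[i] + (1 - self.decay) * err
            m.partial_fit(x, [y], classes=self.classes)
        # prune persistently weak members, always keeping at least one
        keep = [i for i, e in enumerate(self.errs) if e < self.prune_at] or [0]
        self.members = [self.members[i] for i in keep]
        self.errs = [self.errs[i] for i in keep]
        # naive drift reaction: spawn a fresh member if all survivors are weak
        if min(self.errs) > 0.4 and len(self.members) < self.max_members:
            self.members.append(SGDClassifier(loss="log_loss"))
            self.errs.append(0.5)
```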

Journal ArticleDOI
TL;DR: It is shown that a methanogenic archaeon, Methanothermococcus okinawensis, can produce CH4 under physicochemical conditions extrapolated for Saturn’s icy moon, Enceladus, and that serpentinization may produce sufficient H2 for biological methane production.
Abstract: The detection of silica-rich dust particles, as an indication for ongoing hydrothermal activity, and the presence of water and organic molecules in the plume of Enceladus, have made Saturn’s icy moon a hot spot in the search for potential extraterrestrial life. Methanogenic archaea are among the organisms that could potentially thrive under the predicted conditions on Enceladus, considering that both molecular hydrogen (H2) and methane (CH4) have been detected in the plume. Here we show that a methanogenic archaeon, Methanothermococcus okinawensis, can produce CH4 under physicochemical conditions extrapolated for Enceladus. Up to 72% carbon dioxide to CH4 conversion is reached at 50 bar in the presence of potential inhibitors. Furthermore, kinetic and thermodynamic computations of low-temperature serpentinization indicate that there may be sufficient H2 gas production to serve as a substrate for CH4 production on Enceladus. We conclude that some of the CH4 detected in the plume of Enceladus might, in principle, be produced by methanogens. Many methanogenic archaea use H2 and CO2 to produce methane. Here, Taubner et al. show that Methanothermococcus okinawensis produces methane under conditions extrapolated for Saturn’s icy moon, Enceladus, and estimate that serpentinization may produce sufficient H2 for biological methane production.
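The underlying metabolism is standard hydrogenotrophic methanogenesis:

```latex
% Overall stoichiometry of hydrogenotrophic methanogenesis,
% the pathway used by M. okinawensis:
\mathrm{CO_{2} + 4\,H_{2} \;\longrightarrow\; CH_{4} + 2\,H_{2}O}
```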

Journal ArticleDOI
TL;DR: In this article, the authors review the basic principles of magnetometry and present a representative discussion of artifacts which can occur in studying samples like soft magnetic materials as well as low moment samples.
Abstract: In the field of nanomagnetism and spintronics, integral magnetometry is nowadays challenged by samples with low magnetic moments and/or low coercive fields. Commercial superconducting quantum interference device magnetometers are versatile experimental tools to magnetically characterize samples with ultimate sensitivity as well as with a high degree of automation. For realistic experimental conditions, the as-recorded magnetic signal contains several artifacts, especially if small signals are measured on top of a large magnetic background or low magnetic fields are required. In this Tutorial, we will briefly review the basic principles of magnetometry and present a representative discussion of artifacts which can occur in studying samples like soft magnetic materials as well as low moment samples. It turns out that special attention is needed to quantify and correct the residual fields of the superconducting magnet to derive useful information from integral magnetometry while pushing the limits of detection and to avoid erroneous conclusions.
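A common correction of the kind discussed here is removing a linear diamagnetic or paramagnetic background from M(H) loops by fitting the high-field slope. A minimal sketch on synthetic data (units and numbers are illustrative):

```python
import numpy as np

# Toy correction of a linear background in M(H) data, a standard step
# before interpreting low-moment SQUID loops. Field H in Oe, moment M in emu.
H = np.linspace(-70_000, 70_000, 201)
M = 1e-5 * np.tanh(H / 5_000) - 2e-10 * H          # signal + linear background

high = H > 50_000                                  # fit the saturated branch only
slope = np.polyfit(H[high], M[high], 1)[0]         # background susceptibility
M_corr = M - slope * H
print(f"fitted background slope: {slope:.2e} emu/Oe")
```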

Journal ArticleDOI
TL;DR: This work realizes highly compliant magnetosensitive skins with directional perception that enable magnetic cognition, body position tracking, and touchless object manipulation, opening up applications from navigation and motion tracking in robotics to regenerative medicine, sports, gaming, and interaction in supplemented reality.
Abstract: Electronic skins equipped with artificial receptors are able to extend our perception beyond the modalities that have naturally evolved. These synthetic receptors offer complementary information on our surroundings and endow us with novel means of manipulating physical or even virtual objects. We realize highly compliant magnetosensitive skins with directional perception that enable magnetic cognition, body position tracking, and touchless object manipulation. Transfer printing of eight high-performance spin valve sensors arranged into two Wheatstone bridges onto 1.7-μm-thick polyimide foils ensures mechanical imperceptibility. This represents a new class of interactive devices that extract information from the surroundings through magnetic tags. We demonstrate this concept in augmented reality systems with virtual knob-turning functions and the operation of virtual dialing pads, based on the interaction with magnetic fields. This technology will enable a cornucopia of applications, from navigation and motion tracking in robotics to regenerative medicine, sports, gaming, and interaction in supplemented reality.
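The sensing principle rests on the Wheatstone bridge: field-induced resistance changes of the spin valves unbalance the bridge and produce a voltage. A minimal sketch with hypothetical resistances:

```python
# Output of a Wheatstone bridge of four spin-valve resistors, as used in the
# e-skin to convert field-dependent resistance changes into a voltage.
def bridge_output(v_in, r1, r2, r3, r4):
    """V_out between the two half-bridge midpoints."""
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

# Hypothetical values: two opposite arms change by +/-1% in a magnetic field
r0, dr = 1_000.0, 10.0
print(f"{bridge_output(3.3, r0 - dr, r0 + dr, r0 + dr, r0 - dr) * 1e3:.1f} mV")
```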