
Showing papers in "Springer US in 2015"



Book ChapterDOI
TL;DR: This chapter provides an overview of the class of multi-criteria recommender systems, i.e., the category of recommender systems that use multi-criteria preference ratings, and concludes with a discussion of open issues and future challenges for this class.
Abstract: This chapter aims to provide an overview of the class of multi-criteria recommender systems, i.e., the category of recommender systems that use multi-criteria preference ratings. Traditionally, the vast majority of recommender systems literature has focused on providing recommendations by modelling a user’s utility (or preference) for an item as a single preference rating. However, where possible, capturing richer user preferences along several dimensions—for example, capturing not only the user’s overall preference for a given movie but also her preferences for specific movie aspects (such as acting, story, or visual effects)—can provide opportunities for further improvements in recommendation quality. As a result, a number of recommendation techniques that attempt to take advantage of such multi-criteria preference information have been developed in recent years. A review of current algorithms that use multi-criteria ratings for calculating predictions and generating recommendations is provided. The chapter concludes with a discussion on open issues and future challenges for the class of multi-criteria rating recommenders.
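As a rough, hypothetical illustration of the aggregation idea described above (not an algorithm taken from the chapter), the sketch below learns how per-criterion ratings such as acting, story, and visual effects combine into an overall rating; the criteria names and data are invented.

```python
# Hypothetical sketch: learn an aggregation function that maps multi-criteria
# sub-ratings to an overall rating. Criteria and data are invented examples.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [acting, story, visual_effects] ratings on a 1-5 scale.
criteria_ratings = np.array([
    [5, 4, 3],
    [3, 2, 4],
    [4, 5, 5],
    [2, 3, 1],
])
overall_ratings = np.array([4.5, 3.0, 5.0, 2.0])

# Fit a simple linear aggregation of the criteria into the overall preference.
aggregator = LinearRegression().fit(criteria_ratings, overall_ratings)

# Predict the overall rating of an unseen item from its criterion ratings.
print(aggregator.predict(np.array([[4, 4, 2]])))
```

In practice such an aggregation step is combined with collaborative filtering over users and items rather than used in isolation.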

292 citations



Journal Article
TL;DR: In this article, the effects of shared decision-making authority in human-robot and human-only teams were studied, and it was found that people valued human teammates more than robotic teammates.
Abstract: In manufacturing, advanced robotic technology has opened up the possibility of integrating highly autonomous mobile robots into human teams. However, with this capability comes the issue of how to maximize both team efficiency and the desire of human team members to work with these robotic counterparts. To address this concern, we conducted a set of experiments studying the effects of shared decision-making authority in human–robot and human-only teams. We found that an autonomous robot can outperform a human worker in the execution of part or all of the process of task allocation (p < 0.001 for both), and that people preferred to cede their control authority to the robot (p < 0.001). We also established that people value human teammates more than robotic teammates; however, providing robots authority over team coordination more strongly improved the perceived value of these agents than giving similar authority to another human teammate (p < 0.001). In post hoc analysis, we found that people were more likely to assign a disproportionate amount of the work to themselves when working with a robot (p < 0.01) rather than human teammates only. Based upon our findings, we provide design guidance for roboticists and industry practitioners to design robotic assistants for better integration into the human workplace.

106 citations


Journal Article
TL;DR: This paper exploits the properties of the wavelet transform, a well-developed multiscale transform in signal processing, to automatically handle the interaction between temporal and spatial scales, and proposes a novel algorithm that computes a data similarity graph at appropriate scales and detects events of different scales simultaneously through a single graph-based clustering process.
Abstract: Event detection has been one of the most important research topics in social media analysis. Most of the traditional approaches detect events based on fixed temporal and spatial resolutions, while in reality events of different scales usually occur simultaneously, namely, they span different intervals in time and space. In this paper, we propose a novel approach towards multiscale event detection using social media data, which takes into account different temporal and spatial scales of events in the data. Specifically, we explore the properties of the wavelet transform, which is a well-developed multiscale transform in signal processing, to enable automatic handling of the interaction between temporal and spatial scales. We then propose a novel algorithm to compute a data similarity graph at appropriate scales and detect events of different scales simultaneously by a single graph-based clustering process. Furthermore, we present spatiotemporal statistical analysis of the noisy information present in the data stream, which allows us to define a novel term-filtering procedure for the proposed event detection algorithm and helps us study its behavior using simulated noisy data. Experimental results on both synthetically generated data and real world data collected from Twitter demonstrate the meaningfulness and effectiveness of the proposed approach. Our framework further extends to numerous application domains that involve multiscale and multiresolution data analysis.
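The toy sketch below only illustrates the multiscale intuition behind the wavelet transform mentioned above, not the authors' spatiotemporal detection pipeline; the synthetic term-count series and the Haar wavelet choice are assumptions for illustration.

```python
# Toy illustration: a multilevel Haar wavelet decomposition separates bursts
# of different durations in a synthetic term-count time series.
import numpy as np
import pywt

rng = np.random.default_rng(0)
counts = rng.poisson(2, size=256).astype(float)  # background term counts
counts[40:44] += 20      # short, sharp burst (fine temporal scale)
counts[128:192] += 5     # long, diffuse burst (coarse temporal scale)

# Detail coefficients at level j capture fluctuations at roughly 2**j time bins.
coeffs = pywt.wavedec(counts, 'haar', level=4)
for level, detail in enumerate(coeffs[:0:-1], start=1):
    print(f"level {level} (~{2**level} bins): max |detail| = "
          f"{np.max(np.abs(detail)):.1f}")
```

In the paper this kind of scale separation is coupled with spatial information and a graph-based clustering step; the sketch stops at the temporal decomposition.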

88 citations


Journal Article
TL;DR: In this article, the authors demonstrated a new method to fabricate tri-layer biomimetic blood vessel-like structures on a microfluidic platform using photocrosslinkable gelatin hydrogel.
Abstract: There is an immense need for tissue engineered blood vessels. However, current tissue engineering approaches still lack the ability to build native blood vessel-like perfusable structures with multi-layered vascular walls. This paper demonstrated a new method to fabricate tri-layer biomimetic blood vessel-like structures on a microfluidic platform using photocrosslinkable gelatin hydrogel. The presented method enables fabrication of physiological blood vessel-like structures with mono-, bi- or tri-layer vascular walls. The diameter of the vessels, the total thickness of the vessel wall and the thickness of each individual layer of the wall were independently controlled. The developed fabrication process is a simple and rapid method, allowing the physical fabrication of the vascular structure in minutes, and the formation of a vascular endothelial cell layer inside the vessels in 3–5 days. The fabricated vascular constructs can potentially be used in numerous applications including drug screening, development of in vitro models for cardiovascular diseases and/or cancer metastasis, and study of vascular biology and mechanobiology.

85 citations



Reference BookDOI
TL;DR: This reference work covers the basic sciences and clinical foundations of addiction treatment, screening and early interventions, pharmacotherapies and behavioural approaches, systems of care, behavioural addictions, medical and psychiatric comorbidities, special populations, and education and training.
Abstract: Basic Sciences and Clinical Foundations.- Screening and Early Interventions.- Drugs of Abuse and Pharmacotherapies for Substance Disorders.- Behavioural Approaches.- Social Therapies and Treatment Settings.- Main Elements of a Systems Approach to Addiction Treatment.- Behavioural Addictions and Management Applications.- Medical Disorders and Complications of Alcohol and Other Drugs, Pain Addiction.- Psychiatric Comorbidities and Complications of Alcohol and Other Drugs.- Special Populations.- Children, Adolescents and Young Adults.- Education and Training.

71 citations


Journal Article
TL;DR: In this paper, the authors present a new algorithm for independent component analysis (ICA) which has provable performance guarantees, where the covariance of Gaussian noise is not known in advance.
Abstract: We present a new algorithm for Independent Component Analysis (ICA) which has provable performance guarantees. In particular, suppose we are given samples of the form $y = Ax + \eta$ where $A$ is an unknown $n \times n$ matrix and $x$ is a random variable whose components are independent and have a fourth moment strictly less than that of a standard Gaussian random variable and $\eta$ is an $n$-dimensional Gaussian random variable with unknown covariance $\Sigma$: We give an algorithm that provable recovers $A$ and $\Sigma$ up to an additive $\epsilon$ and whose running time and sample complexity are polynomial in $n$ and $1 / \epsilon$. To accomplish this, we introduce a novel "quasi-whitening" step that may be useful in other contexts in which the covariance of Gaussian noise is not known in advance. We also give a general framework for finding all local optima of a function (given an oracle for approximately finding just one) and this is a crucial step in our algorithm, one that has been overlooked in previous attempts, and allows us to control the accumulation of error when we find the columns of $A$ one by one via local search.
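For orientation, the snippet below runs a standard FastICA baseline on synthetic noiseless mixtures of the form $y = Ax$; it is not the authors' provably correct, noise-robust algorithm (which relies on the quasi-whitening step described above), just a reminder of the estimation task.

```python
# Baseline ICA illustration with standard FastICA on synthetic data
# (NOT the provable, Gaussian-noise-robust algorithm of the paper).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n, samples = 3, 2000
x = rng.laplace(size=(samples, n))   # independent non-Gaussian sources
A = rng.normal(size=(n, n))          # unknown mixing matrix
y = x @ A.T                          # observed mixtures (no Gaussian noise here)

ica = FastICA(n_components=n, random_state=0)
x_hat = ica.fit_transform(y)         # recovered sources, up to permutation/scale
A_hat = ica.mixing_                  # estimate of the mixing matrix A
print(A_hat.shape, x_hat.shape)
```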

68 citations


Journal Article
TL;DR: The Clock Drawing Test is administered using a digitizing ballpoint pen that reports its position with considerable spatial and temporal precision, making available far more detailed data about the subject’s performance, offering the possibility of substantial improvement in detecting cognitive impairment earlier than currently possible.
Abstract: The Clock Drawing Test--a simple pencil and paper test--has been used for more than 50 years as a screening tool to differentiate normal individuals from those with cognitive impairment, and has proven useful in helping to diagnose cognitive dysfunction associated with neurological disorders such as Alzheimer's disease, Parkinson's disease, and other dementias and conditions. We have been administering the test using a digitizing ballpoint pen that reports its position with considerable spatial and temporal precision, making available far more detailed data about the subject's performance. Using pen stroke data from these drawings categorized by our software, we designed and computed a large collection of features, then explored the tradeoffs in performance and interpretability in classifiers built using a number of different subsets of these features and a variety of different machine learning techniques. We used traditional machine learning methods to build prediction models that achieve high accuracy. We operationalized widely used manual scoring systems so that we could use them as benchmarks for our models. We worked with clinicians to define guidelines for model interpretability, and constructed sparse linear models and rule lists designed to be as easy to use as scoring systems currently used by clinicians, but more accurate. While our models will require additional testing for validation, they offer the possibility of substantial improvement in detecting cognitive impairment earlier than currently possible, a development with considerable potential impact in practice.
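As a generic, hypothetical illustration of the sparse linear models mentioned above (the features and data below are invented and are not the authors' pen-stroke features), an L1-penalized logistic regression keeps only a few non-zero weights, which is what makes such models easy to read alongside clinical scoring systems.

```python
# Hypothetical sketch: a sparse (L1-penalized) linear classifier over invented
# pen-stroke features; feature names and labels are for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["total_time_s", "n_strokes", "mean_pen_speed", "pause_ratio"]
X = np.array([
    [35.0, 14, 3.2, 0.10],
    [90.0, 30, 1.1, 0.45],
    [40.0, 15, 2.9, 0.12],
    [110.0, 35, 0.9, 0.50],
])
y = np.array([0, 1, 0, 1])  # 0 = screened normal, 1 = possible impairment

clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)
for name, weight in zip(feature_names, clf.coef_[0]):
    print(f"{name}: {weight:+.2f}")  # zero weights drop out of the model
```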

Journal Article
TL;DR: This literature review determines the type and quality of published charge display studies and synthesizes their findings; in the majority of studies, charge information changed physicians' ordering and prescribing behavior.
Abstract: BACKGROUND: While studies have been published in the last 30 years that examine the effect of charge display during physician decision-making, no analysis or synthesis of these studies has been conducted. OBJECTIVE: We aimed to determine the type and quality of charge display studies that have been published, and to synthesize this information in the form of a literature review. METHODS: English-language articles published between 1982 and 2013 were identified using MEDLINE, Web of Knowledge, ABI-Inform, and Academic Search Premier. Article titles, abstracts, and text were reviewed for relevancy by two authors. Data were then extracted and subsequently synthesized and analyzed. RESULTS: Seventeen articles were identified that fell into two topic categories: the effect of charge display on radiology and laboratory test ordering versus on medication choice. Seven articles were randomized controlled trials, eight were pre-intervention vs. post-intervention studies, and two had concurrent control and intervention groups but were not randomized. Twelve studies were conducted in a clinical environment, whereas five were survey studies. Of the nine clinically based interventions that examined test ordering, seven had statistically significant reductions in cost and/or the number of tests ordered. Two of the three clinical studies looking at medication expenditures found significant reductions in cost. In the survey studies, physicians consistently chose fewer tests or lower cost options in the theoretical scenarios presented. CONCLUSIONS: In the majority of studies, charge information changed ordering and prescribing behavior.

Book ChapterDOI
TL;DR: Health impact assessment is a multidisciplinary and comprehensive governance tool used by decision-makers in order to build a healthy public policy and reduce health inequalities.
Abstract: Health impact assessment (HIA) is a combination of procedures, methods and tools used in order to evaluate the potential effects of a policy or project on the health of a population. HIA, therefore, is a multidisciplinary and comprehensive governance tool used by decision-makers in order to build a healthy public policy and reduce health inequalities.

Journal Article
TL;DR: In this article, the authors theoretically investigate the performance of refractive index sensors, utilizing square and hexagonal arrays of nanoholes, that can monitor the spectral position of EOT signals.
Abstract: Nanohole arrays in metal films allow extraordinary optical transmission (EOT); the phenomenon is highly advantageous for biosensing applications. In this article, we theoretically investigate the performance of refractive index sensors, utilizing square and hexagonal arrays of nanoholes, that can monitor the spectral position of EOT signals. We present near- and far-field characteristics of the aperture arrays and investigate the influence of geometrical device parameters in detail. We numerically compare the refractive index sensitivities of the two lattice geometries and show that the hexagonal array supports larger figure-of-merit values due to its sharper EOT response. Furthermore, the presence of a thin dielectric film that covers the gold surface and mimics a biomolecular layer causes larger spectral shifts within the EOT resonance for the hexagonal array. We also investigate the dependence of the transmission responses on hole radius and demonstrate that hexagonal lattice is highly promising for applications demanding strong light transmission.
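For background, the sensitivity and figure-of-merit quantities compared in the study are conventionally defined as follows for resonance-shift refractometric sensors (standard definitions stated here for context, not equations quoted from the article):

```latex
S = \frac{\Delta\lambda_{\mathrm{res}}}{\Delta n}\ \ [\mathrm{nm/RIU}],
\qquad
\mathrm{FOM} = \frac{S}{\mathrm{FWHM}},
```

where $\Delta\lambda_{\mathrm{res}}$ is the shift of the EOT resonance wavelength for a refractive index change $\Delta n$ and FWHM is the resonance linewidth; a sharper EOT peak therefore yields a larger figure of merit, consistent with the advantage reported for the hexagonal lattice.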



BookDOI
TL;DR: The history of chromosome rearrangements and gene fusions in cancer and the molecular basis of MLL leukaemias are reviewed.
Abstract: Part I Introduction Chapter 1 - A short history of chromosome rearrangements and gene fusions in cancer Felix Mitelman Part II General Chapter 2 Molecular genetics methods in discovery of chromosome structure Donna Albertson Chapter 3 Mechanism of recurrent chromosomal translocations Fred Alt, Richard Frock, Jiazhi Hu Chapter 4 Chromosome translocations, cancer initiation and clonal evolution Mel Greaves, Anthony M Ford Chapter 5 Chromosomal fragile sites and cancer Michelle LeBeau, Yanwen Jiang, Isabelle Lucas Chapter 6 Copy number changes in carcinomas: applications Pamela Rabbitts, Henry Wood Part III LEUKAEMIA/LYMPHOMA Chapter 7 Chronic myeloid leukaemia Junia Melo, Debora A Casolari Chapter 8 Immunoglobulin and MYC rearrangements in multiple myeloma pathogenesis P. Leif Bergsagel, W. Michael Kuehl Chapter 9 Genetic alterations in B cell lymphoma Riccardo Dalla Favera, Marco Fangazio, Laura Pasqualucci Chapter 10 Chromosomal translocations and gene rearrangements in acute lymphoblastic leukaemia Tom Look, Marc Mansour Chapter 11 Cellular and molecular basis of MLL leukaemias: from transformation mechanisms to novel therapeutic strategies Chi Wai Eric So, Bernd Zeisig Chapter 12 Acute promyelocytic leukaemia: from a specific translocation to cure by targeted therapies Hugues de The, Kim Rice Chapter 13 Chromosome Abnormalities in AML and their clinical importance Clara Bloomfield, Krzysztof Mrozek Part IV SARCOMAS Chapter 14 Fusion Oncogenes of Sarcomas Pierre Aman Chapter 15 Translocations in Ewing Sarcoma Stephen L Lessnick, Jason M Tanner Part V EPITHELIAL TUMOURS Chapter 16 RET and thyroid carcinomas Giancarlo Vecchio, Maria Domenica Castellone Chapter 17 Gene fusions in prostate cancer Scott Tomlins, A. S. McDaniel Chapter 18 Chromosomal Translocations in Lung Cancer Hiroyuki Mano Chapter 19 Colon & Ovarian translocations Paul T. Spellman Part VI OTHER ASPECTS Chapter 20 Pre-clinical modeling of chromosomal translocations and inversions Terry Rabbitts, Katia Ruggero Chapter 21 Protein complex hierarchy & translocation gene products Jacqueline Matthews Chapter 22 Aberrant transcriptional programming in blood cancers Constanze Bonifer, Peter N. Cockerill, Anetta Ptasinska Index

Journal Article
TL;DR: In this article, two new single-parameter constraint-handling methodologies based on parent-centric and inverse parabolic probability (IP) distributions are proposed for real-parameter optimization in the presence of constraints.
Abstract: Evolutionary algorithms (EAs) are being routinely applied for a variety of optimization tasks, and real-parameter optimization in the presence of constraints is one such important area. During constrained optimization EAs often create solutions that fall outside the feasible region; hence a viable constraint-handling strategy is needed. This paper focuses on the class of constraint-handling strategies that repair infeasible solutions by bringing them back into the search space and explicitly preserve feasibility of the solutions. Several existing constraint-handling strategies are studied, and two new single-parameter constraint-handling methodologies based on parent-centric and inverse parabolic probability (IP) distributions are proposed. The existing and newly proposed constraint-handling methods are first studied with PSO, DE, and GAs, and simulation results on four scalable test problems under different location settings of the optimum are presented. The newly proposed constraint-handling methods exhibit robust performance and also succeed on search spaces comprising up to 500 variables while locating the optimum within an error of $10^{-10}$. The working principle of the IP-based methods is also demonstrated on (i) some generic constrained optimization problems, and (ii) a classic 'Weld' problem from structural design and mechanics. The successful performance of the proposed methods clearly exhibits their efficacy as a generic constraint-handling strategy for a wide range of applications.
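The sketch below illustrates only the general repair-and-preserve-feasibility idea studied in the paper, for simple box constraints; the blend factor and the use of a feasible parent are assumptions for illustration, not the authors' exact parent-centric or inverse parabolic (IP) operators.

```python
# Generic feasibility-preserving repair for box constraints: pull each
# out-of-bounds variable of an offspring back between the violated bound and
# a feasible parent. A simplified stand-in, not the paper's IP operator.
import numpy as np

rng = np.random.default_rng(0)

def repair_toward_parent(child, parent, lower, upper):
    child = np.asarray(child, dtype=float).copy()
    parent = np.asarray(parent, dtype=float)
    for i, x in enumerate(child):
        if x < lower[i] or x > upper[i]:
            bound = lower[i] if x < lower[i] else upper[i]
            # The parent is assumed feasible, so the repaired value stays in the box.
            child[i] = bound + rng.random() * (parent[i] - bound)
    return child

lower, upper = np.zeros(3), np.ones(3)
print(repair_toward_parent([1.4, 0.5, -0.2], [0.8, 0.5, 0.3], lower, upper))
```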

BookDOI
TL;DR: This edited volume collects papers on truth predicates in natural and formal languages, uses of truth, truth as a substantial notion, deflationism and conservativity, and approaches to the semantic paradoxes.
Abstract: Introduction.- Part 1. Truth and Natural Language.- 'Truth Predicates' in Natural Language Friederike Moltmann.- Truth and Language, Natural and Formal John Collins.- Truth and Trustworthiness Michael Sheard.- Part 2. Uses of Truth.- Putting Davidson's Semantics to Work to Solve Frege's Paradox on Concept and Object Philippe de Rouilhan.- Sets, truth, and recursion Reinhard Kahle.- Unfolding feasible arithmetic and weak truth Sebastian Eberhard and Thomas Strahm.- Some remarks on the finite theory of revision Ricardo Bruni.- Part 3. Truth as a Substantial Notion.- Truth as a Composite Correspondence Gila Sher.- Complexity and Hierarchy in Truth Predicates Michael Glanzberg.- Can Deflationism Account for the Norm of Truth? Pascal Engel.- Part 4. Deflationism and Conservativity.- Norms For Theories Of Reflexive Truth Volker Halbach and Leon Horsten.- Some weak theories of truth Graham E. Leigh.- Deflationism and Instrumentalism Martin Fischer.- Typed and Untyped Disquotational Truth Cezary Cieslinski.- New Constructions Of Satisfaction Classes Ali Enayat and Albert Visser.- Part 5. Truth Without Paradox.- Truth, Pretense and the Liar Paradox Bradley Armour-Garb and James A. Woodbridge.- Groundedness, Truth and Dependence Denis Bonnay and Floris Tijmen van Vugt.- On Stratified Truth A. Cantini.- Part 6. Inferentialism and Revisionary Approach.- Truth, Signification and Paradox Stephen Read.- Vagueness, truth and permissive consequence Pablo Cobreros, Paul Egre, David Ripley, Robert van Rooij.- Validity and Truth-Preservation Julien Murzi and Lionel Shapiro.- Getting One for Two, or the Contractors' Bad Deal. Towards a Unified Solution to the Semantic Paradoxes Zardini.- Kripke's Thought-Paradox and the 5th Antinomy Graham Priest.

Book ChapterDOI
TL;DR: In this paper, the authors present some open problems and obtain partial results for spectral optimization problems involving measure, torsional rigidity and first Dirichlet eigenvalue.
Abstract: We present some open problems and obtain some partial results for spectral optimization problems involving measure, torsional rigidity and first Dirichlet eigenvalue.
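For readers outside the field, the two shape functionals named above have the following standard variational definitions on an open set $\Omega \subset \mathbb{R}^d$ (textbook definitions given for context, not formulas reproduced from the chapter):

```latex
\lambda_1(\Omega) = \min_{u \in H_0^1(\Omega)\setminus\{0\}}
  \frac{\int_\Omega |\nabla u|^2 \, dx}{\int_\Omega u^2 \, dx},
\qquad
T(\Omega) = \max_{u \in H_0^1(\Omega)\setminus\{0\}}
  \frac{\bigl(\int_\Omega u \, dx\bigr)^2}{\int_\Omega |\nabla u|^2 \, dx},
```

where $\lambda_1$ is the first Dirichlet eigenvalue of the Laplacian and $T$ the torsional rigidity; the optimization problems in question then seek sets $\Omega$ of prescribed measure that extremize combinations of these quantities.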


Journal Article
TL;DR: In this paper, a tensile output bar is used to measure the tensile force acting on the specimen boundaries and a high speed camera system is employed to measure displacement history at the specimen level through planar digital image correlation.
Abstract: A new set-up is proposed to perform high strain rate tension experiments on sheet metal using a compression Hopkinson bar system on the input side. With the help of a custom-made load inversion device, the compression loading pulse is converted into tensile loading of the specimen boundary. A tensile output bar is used to measure the tensile force acting on the specimen boundaries. A high speed camera system is employed to measure the displacement history at the specimen level through planar digital image correlation. The output bar is positioned on top of the input bar. As a result, the valid experiment duration of the proposed system is twice as long as that of conventional Kolsky systems. It therefore facilitates the execution of intermediate strain rate (~100/s) experiments without increasing total system length. Numerical simulations are carried out to assess the effect of spurious bending effects that are introduced through the eccentricity of the input and output bar axes. In addition, experiments are performed on straight and notched specimens to demonstrate the characterization of the rate dependent plasticity and fracture properties of a 1.06 mm thick DP780 steel.
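As background on how force and strain are typically extracted in such a set-up, the standard Kolsky-bar and DIC relations below apply; the symbols are introduced here for context and are not taken from the paper.

```latex
F(t) = E_b \, A_b \, \varepsilon_t(t),
\qquad
\varepsilon_{\mathrm{eng}}(t) = \frac{u_2(t) - u_1(t)}{L_0},
```

where $E_b$ and $A_b$ are the Young's modulus and cross-sectional area of the output bar, $\varepsilon_t(t)$ is the transmitted strain measured on it, $u_1(t)$ and $u_2(t)$ are the displacements of the two ends of the gauge section tracked by digital image correlation, and $L_0$ is the initial gauge length.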


Journal Article
TL;DR: In this paper, the performance of electrospun cellulose acetate filters with different mean fiber diameters was compared with that of two conventional filter media, a glass fiber filter and a cellulose acetate microfiber filter.
Abstract: Aerosol filtration using electrospun cellulose acetate filters with different mean fiber diameters is reported, and the results are compared with those for two conventional filter media, a glass fiber filter and a cellulose acetate microfiber filter. The performance of these filters was studied using two aerosols, one solid (NaCl) and one liquid (diethyl hexyl sebacate), under conditions of relatively high face velocity (45 cm/s). The experimental observations are compared to theoretical predictions based on single fiber filtration efficiency. Our results indicate that the mechanisms for single fiber filtration efficiency provide reasonable predictions of the most penetrating particle size (MPPS), in the range of 40–270 nm, percentage penetration from 0.03 to 70 %, and fiber diameter in the range from 0.1 to 24 µm. Using an analysis based on blocking filtration laws, we conclude that filtration by cake formation dominated in the case of NaCl aerosols on electrospun filter media, whereas filters with larger fiber diameter showed a transition in mechanisms, from an initial regime characterized by pore blocking to a later regime characterized by cake formation. The liquid aerosol did not exhibit cake formation, even for the smallest fiber diameters, and also had much smaller influence on pressure drop than did the solid aerosol. The electrospun filters demonstrated slightly better quality factors compared to the commercial glass fiber filter, at a much lower thickness. In general, this study demonstrates control of the properties of electrospun cellulose acetate fibers for air filtration application.
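For context, classical single-fiber theory relates overall penetration to the single-fiber efficiency through the textbook expression below (standard relations, not results specific to this study):

```latex
P = \exp\!\left( \frac{-4\,\alpha\, E_\Sigma\, Z}{\pi\, d_f\, (1-\alpha)} \right),
\qquad
QF = \frac{-\ln P}{\Delta p},
```

where $P$ is the fractional penetration, $\alpha$ the filter solidity, $E_\Sigma$ the total single-fiber efficiency, $Z$ the filter thickness, $d_f$ the fiber diameter, $\Delta p$ the pressure drop, and $QF$ the quality factor used above to compare the electrospun and commercial media.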




Book ChapterDOI
TL;DR: In this article, a framework for exploratory OLAP over RDF sources is proposed, which uses a multidimensional schema of the OLAP cube expressed in RDF vocabularies.
Abstract: Business Intelligence (BI) tools provide fundamental support for analyzing large volumes of information. Data Warehouses (DW) and Online Analytical Processing (OLAP) tools are used to store and analyze data. Nowadays more and more information is available on the Web in the form of Resource Description Framework (RDF), and BI tools have a huge potential of achieving better results by integrating real-time data from web sources into the analysis process. In this paper, we describe a framework for so-called exploratory OLAP over RDF sources. We propose a system that uses a multidimensional schema of the OLAP cube expressed in RDF vocabularies. Based on this information the system is able to query data sources, extract and aggregate data, and build a cube. We also propose a computer-aided process for discovering previously unknown data sources and building a multidimensional schema of the cube. We present a use case to demonstrate the applicability of the approach.
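The hypothetical sketch below illustrates the general extract-and-aggregate pattern of querying an RDF endpoint and pivoting the results into a small cube; the endpoint URL, prefixes, and predicates are invented for illustration and do not correspond to the paper's vocabularies or system.

```python
# Hypothetical sketch: query an RDF endpoint with SPARQL and aggregate the
# results into a small OLAP-style cube. Endpoint and predicates are invented.
import pandas as pd
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://example.org/sparql")  # assumed endpoint
endpoint.setQuery("""
    PREFIX ex: <http://example.org/schema#>
    SELECT ?region ?year ?amount WHERE {
        ?sale ex:region ?region ; ex:year ?year ; ex:amount ?amount .
    }
""")
endpoint.setReturnFormat(JSON)
rows = endpoint.query().convert()["results"]["bindings"]

df = pd.DataFrame(
    {"region": r["region"]["value"],
     "year": r["year"]["value"],
     "amount": float(r["amount"]["value"])}
    for r in rows
)
# A two-dimensional cube: SUM(amount) by region x year.
cube = df.pivot_table(index="region", columns="year",
                      values="amount", aggfunc="sum")
print(cube)
```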

BookDOI
TL;DR: The authors examine the distinctive and highly problematic ethical questions surrounding conflict archaeology, including the ethical issues that conflict sites and artifacts raise for archaeologists.
Abstract: This volume examines the distinctive and highly problematic ethical questions surrounding conflict archaeology.

Journal Article
TL;DR: The results suggest that the organizational maxim about human behavior, “you get what you measure”—i.e., sharing metrics with people causes them to focus on optimizing those metrics while de-emphasizing other objectives—also applies to the training of agents.
Abstract: In this work, we address a relatively unexplored aspect of designing agents that learn from human reward. We investigate how an agent's non-task behavior can affect a human trainer's training and agent learning. We use the TAMER framework, which facilitates the training of agents by human-generated reward signals, i.e., judgements of the quality of the agent's actions, as the foundation for our investigation. Then, starting from the premise that the interaction between the agent and the trainer should be bi-directional, we propose two new training interfaces to increase a human trainer's active involvement in the training process and thereby improve the agent's task performance. One provides information on the agent's uncertainty, a metric calculated as data coverage; the other provides information on its performance. Our results from a 51-subject user study show that these interfaces can induce the trainers to train longer and give more feedback. The agent's performance, however, increases only in response to the addition of performance-oriented information, not by sharing uncertainty levels. These results suggest that the organizational maxim about human behavior, "you get what you measure" (i.e., sharing metrics with people causes them to focus on optimizing those metrics while de-emphasizing other objectives), also applies to the training of agents. Using principal component analysis, we show how trainers in the two conditions train agents differently. In addition, by simulating the influence of the agent's uncertainty-informative behavior on a human's training behavior, we show that trainers could be distracted by the agent sharing its uncertainty levels about its actions, giving poor feedback for the sake of reducing the agent's uncertainty without improving the agent's performance.
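As a minimal sketch of the human-reward learning setting that TAMER formalizes (a simplified illustration, not the study's interfaces or implementation), the snippet below maintains a table of predicted human reward H(s, a), acts greedily with respect to it, and updates it from trainer feedback.

```python
# Minimal TAMER-style sketch: learn a table H(s, a) of predicted human reward
# and act greedily on it. States, actions, and the update rule are simplified
# illustrations, not the implementation used in the study.
import random
from collections import defaultdict

H = defaultdict(float)      # predicted human reward for (state, action)
actions = ["left", "right", "wait"]
alpha = 0.2                 # learning rate for the reward model

def choose_action(state):
    # Greedy with respect to predicted human reward, ties broken randomly.
    best = max(H[(state, a)] for a in actions)
    return random.choice([a for a in actions if H[(state, a)] == best])

def update(state, action, human_reward):
    # Move the prediction toward the trainer's feedback signal.
    key = (state, action)
    H[key] += alpha * (human_reward - H[key])

# Example interaction: the trainer rewards "right" in state "s0".
a = choose_action("s0")
update("s0", a, human_reward=1.0 if a == "right" else -1.0)
print(a, dict(H))
```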