
Showing papers from "Conservatoire national des arts et métiers" published in 2018


Journal ArticleDOI
TL;DR: In this article, Zhang et al. provided a conceptualization of the gratifications derived from brand-consumer interactions, referred to as Brand-Consumer Social Sharing Value (BCSV), using a sample of brands' Facebook page users.

170 citations


Proceedings ArticleDOI
27 Jun 2018
TL;DR: In this article, a cross-modal retrieval model aligning visual and textual data (like pictures of dishes and their recipes) in a shared representation space is proposed, and validated on the Recipe1M dataset containing nearly 1 million picture-recipe pairs.
Abstract: Designing powerful tools that support cooking activities has rapidly gained popularity due to the massive amounts of available data, as well as recent advances in machine learning that are capable of analyzing them. In this paper, we propose a cross-modal retrieval model aligning visual and textual data (like pictures of dishes and their recipes) in a shared representation space. We describe an effective learning scheme, capable of tackling large-scale problems, and validate it on the Recipe1M dataset containing nearly 1 million picture-recipe pairs. We show the effectiveness of our approach regarding previous state-of-the-art models and present qualitative results over computational cooking use cases.
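Alignment in a shared representation space is typically trained with a ranking objective. Below is a minimal, illustrative sketch of a triplet-style retrieval loss on toy embeddings; the vectors and margin are invented values, and the paper's actual model uses learned deep encoders and a more elaborate training scheme.

```python
import numpy as np

def cos_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def triplet_retrieval_loss(img, pos_txt, neg_txt, margin=0.3):
    """Hinge triplet loss: push the image closer (in cosine similarity)
    to its matching recipe than to a non-matching one, by at least `margin`."""
    return max(0.0, margin - cos_sim(img, pos_txt) + cos_sim(img, neg_txt))

# Toy 3-d embeddings in a shared space (illustrative values only).
img = np.array([1.0, 0.0, 0.0])
pos = np.array([0.9, 0.1, 0.0])   # matching recipe embedding
neg = np.array([0.0, 1.0, 0.0])   # non-matching recipe embedding
loss = triplet_retrieval_loss(img, pos, neg)   # pair already well-separated: 0.0
```

With the matched pair already more similar than the mismatched one by more than the margin, the loss is zero; swapping the two text embeddings produces a large positive loss that would drive the encoders apart.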

124 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare multiple imputation methods for multilevel continuous and binary data where variables are systematically and sporadically missing and highlight that valid inferences can only be obtained if the dataset includes a large number of clusters.
Abstract: We present and compare multiple imputation methods for multilevel continuous and binary data where variables are systematically and sporadically missing. The methods are compared from a theoretical point of view and through an extensive simulation study motivated by a real dataset comprising multiple studies. The comparisons show that these multiple imputation methods are the most appropriate to handle missing values in a multilevel setting and why their relative performances can vary according to the missing data pattern, the multilevel structure and the type of missing variables. This study shows that valid inferences can only be obtained if the dataset includes a large number of clusters. In addition, it highlights that heteroscedastic multiple imputation methods provide more accurate inferences than homoscedastic methods, which should be reserved for data with few individuals per cluster. Finally, guidelines are given to choose the most suitable multiple imputation method according to the structure of the data.
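The distinction between sporadically and systematically missing variables can be illustrated with a toy example. This is only a sketch using naive mean imputation with invented clusters and values, not the multiple imputation procedures compared in the paper; it simply shows why systematic missingness forces borrowing information across clusters.

```python
from statistics import mean

# Toy two-cluster dataset: x is sporadically missing in cluster A
# (some values observed) and systematically missing in cluster B
# (never observed). Values are invented for illustration.
data = {"A": [1.0, None, 3.0], "B": [None, None]}

def impute_within(values):
    """Fill gaps with the cluster's own mean; only works for sporadic gaps."""
    observed = [v for v in values if v is not None]
    if not observed:
        return list(values)   # systematic missingness: nothing to work with
    return [v if v is not None else mean(observed) for v in values]

within = {c: impute_within(v) for c, v in data.items()}

# Cluster B has no within-cluster information, so values must be borrowed
# across clusters, which is exactly where the multilevel model matters.
all_observed = [v for vs in data.values() for v in vs if v is not None]
pooled_B = [mean(all_observed)] * len(data["B"])
```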

91 citations


Journal ArticleDOI
TL;DR: In this paper, the authors study turbulent flows in pressure-driven ducts with square cross-section through direct numerical simulation in a wide enough range of Reynolds number to reach flow conditions which are representative of fully developed turbulence.
Abstract: We study turbulent flows in pressure-driven ducts with square cross-section through direct numerical simulation, in a range of Reynolds number wide enough to reach flow conditions representative of fully developed turbulence. Numerical simulations are carried out over very long integration times to obtain adequate convergence of the flow statistics, and specifically to achieve a high-fidelity representation of the secondary motions which arise. The intensity of the latter is found to be of the order of 1%–2% of the bulk velocity, and approximately unaffected by Reynolds number variation, at least in the range under scrutiny. The smallness of the mean convection terms in the streamwise vorticity equation points to a simple characterization of the secondary flows, which in the asymptotic high-Reynolds-number regime are approximated with good accuracy, in the core part of the duct, by eigenfunctions of the Laplace operator. Despite their effect of redistributing the wall shear stress along the duct perimeter, we find that secondary motions do not have a large influence on the bulk flow properties, and the streamwise velocity field can be characterized with good accuracy as the superposition of four flat walls in isolation. As a consequence, we find that parametrizations based on the hydraulic diameter concept, and modifications thereof, are successful in predicting the duct friction coefficient.

90 citations


Journal ArticleDOI
TL;DR: The results add important evidence towards the possible role of DAAs in HCC recurrence and stress the need for further mechanistic studies and clinical trials to accurately confirm this role and to identify patient characteristics that may be associated with this event.
Abstract: In Egypt, hepatocellular carcinoma (HCC) is the most common form of cancer and direct-acting antivirals (DAA) are administered on a large scale to patients with chronic HCV infection to reduce the risk. In this unique setting, we aimed to determine the association of DAA exposure with early-phase HCC recurrence in patients with a history of HCV-related liver cancer. This was a prospective cohort study of an HCV-infected population from one Egyptian specialized HCC management centre starting from the time of successful HCC intervention. The incidence rates of HCC recurrence between DAA-exposed and nonexposed patients were compared, starting from the date of HCC complete radiological response and censoring after 2 years. DAA exposure was treated as time varying. Two Poisson regression models were used to control for potential differences between the exposed and nonexposed groups: multivariable adjustment, and balancing using inverse probability of treatment weighting (IPTW). We included 116 patients: 53 treated with DAAs and 63 not treated with DAAs. There was 37.7% and 25.4% recurrence in each group after a median of 16.0 and 23.0 months of follow-up, respectively. Poisson regression using IPTW demonstrated an association between DAAs and HCC recurrence with an incidence rate ratio of 3.83 (95% CI: 2.02-7.25), which was similar in the multivariable-adjusted model and various sensitivity analyses. These results add important evidence towards the possible role of DAAs in HCC recurrence and stress the need for further mechanistic studies and clinical trials to accurately confirm this role and to identify patient characteristics that may be associated with this event.
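The IPTW balancing step mentioned above can be sketched in a few lines: each subject is weighted by the inverse of the probability of the exposure they actually received, so that the weighted groups resemble each other on covariates. The propensity scores below are invented numbers for illustration, not estimates from the study.

```python
# Toy IPTW computation: weight each subject by the inverse probability of
# the exposure they actually received. Propensity scores are invented.
subjects = [
    {"daa_exposed": True,  "propensity": 0.8},   # P(exposure | covariates)
    {"daa_exposed": True,  "propensity": 0.5},
    {"daa_exposed": False, "propensity": 0.4},
    {"daa_exposed": False, "propensity": 0.2},
]

for s in subjects:
    p = s["propensity"]
    # Exposed: weight 1/p. Unexposed: weight 1/(1 - p).
    s["weight"] = 1.0 / p if s["daa_exposed"] else 1.0 / (1.0 - p)

weights = [s["weight"] for s in subjects]   # approx. [1.25, 2.0, 1.67, 1.25]
```

In practice the propensity scores would come from a fitted model (e.g. logistic regression of exposure on covariates), and the weighted sample then feeds the outcome regression, here a Poisson model for recurrence.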

76 citations


Journal ArticleDOI
TL;DR: It is shown that it is possible to calculate the observer gains making the estimation error dynamics cooperative and stable via some change of coordinates under arbitrary switching sequences.

73 citations


Journal ArticleDOI
TL;DR: This review focuses on the selection of decoy compounds, which has changed considerably over the years, from randomly selected compounds to highly customized or experimentally validated negative compounds, and proposes recommendations for the selection and design of benchmarking datasets.
Abstract: Virtual Screening (VS) is designed to prospectively help identifying potential hits, i.e., compounds capable of interacting with a given target and potentially modulate its activity, out of large compound collections. Among the variety of methodologies, it is crucial to select the protocol that is the most adapted to the query/target system under study and that yields the most reliable output. To this aim, the performance of VS methods is commonly evaluated and compared by computing their ability to retrieve active compounds in benchmarking datasets. The benchmarking datasets contain a subset of known active compounds together with a subset of decoys, i.e., assumed non-active molecules. The composition of both the active and the decoy compounds subsets is critical to limit the biases in the evaluation of the VS methods. In this review, we focus on the selection of decoy compounds that has considerably changed over the years, from randomly selected compounds to highly customized or experimentally validated negative compounds. We first outline the evolution of decoys selection in benchmarking databases as well as current benchmarking databases that tend to minimize the introduction of biases, and secondly, we propose recommendations for the selection and the design of benchmarking datasets.

63 citations


Journal ArticleDOI
TL;DR: The description of seasonality offers an explanation for heterogeneities in the West-East YF burden across Africa, and could highlight areas of increased transmission and provide insights into the occurrence of large outbreaks.
Abstract: Background Yellow fever virus (YFV) is a vector-borne flavivirus endemic to Africa and Latin America. Ninety per cent of the global burden occurs in Africa where it is primarily transmitted by Aedes spp, with Aedes aegypti the main vector for urban yellow fever (YF). Mosquito life cycle and viral replication in the mosquito are heavily dependent on climate, particularly temperature and rainfall. We aimed to assess whether seasonal variations in climatic factors are associated with the seasonality of YF reports. Methodology/principal findings We constructed a temperature suitability index for YFV transmission, capturing the temperature dependence of mosquito behaviour and viral replication within the mosquito. We then fitted a series of multilevel logistic regression models to a dataset of YF reports across Africa, considering location and seasonality of occurrence for seasonal models, against the temperature suitability index, rainfall and the Enhanced Vegetation Index (EVI) as covariates alongside further demographic indicators. Model fit was assessed by the Area Under the Curve (AUC), and models were ranked by Akaike's Information Criterion which was used to weight model outputs to create combined model predictions. The seasonal model accurately captured both the geographic and temporal heterogeneities in YF transmission (AUC = 0.81), and did not perform significantly worse than the annual model which only captured the geographic distribution. The interaction between temperature suitability and rainfall accounted for much of the occurrence of YF, which offers a statistical explanation for the spatio-temporal variability in transmission. Conclusions/significance The description of seasonality offers an explanation for heterogeneities in the West-East YF burden across Africa. Annual climatic variables may indicate a transmission suitability not always reflected in seasonal interactions. 
This finding, in conjunction with forecasted data, could highlight areas of increased transmission and provide insights into the occurrence of large outbreaks, such as those seen in Angola, the Democratic Republic of the Congo and Brazil.

62 citations


Journal ArticleDOI
TL;DR: This analysis estimated the differential health impact and household economic impact of ten antigens and their corresponding vaccines across income quintiles for forty-one low- and middle-income countries and indicated that benefits would accrue predominantly in the lowest income quintile.
Abstract: With social policies increasingly directed toward enhancing equity through health programs, it is important that methods for estimating the health and economic benefits of these programs by subpopulation be developed, to assess both equity concerns and the programs' total impact. We estimated the differential health impact (measured as the number of deaths averted) and household economic impact (measured as the number of cases of medical impoverishment averted) of ten antigens and their corresponding vaccines across income quintiles for forty-one low- and middle-income countries. Our analysis indicated that benefits across these vaccines would accrue predominantly in the lowest income quintiles. Policy makers should be informed about the large health and economic distributional impact that vaccines could have, and they should view vaccination policies as potentially important channels for improving health equity. Our results provide insight into the distribution of vaccine-preventable diseases and the hea

53 citations


Journal ArticleDOI
TL;DR: This study provides a broad view of the current status and trends in DED research and may help clinicians, researchers and policy makers better understand this research field and predict its dynamic directions.
Abstract: Purpose To perform a bibliometric analysis in the field of dry eye disease (DED) research to characterize the current international status of DED research and to identify the most effective actors (journals, countries, authors) involved in this field. Methods Scientometric methods were used to evaluate global scientific production and development trends in DED research, using the Web of Science Core Collection. Results The growth of the literature related to DED averaged 12.18% over the last 10 years. A total of 5522 original and review articles, published in 821 different journals, were identified. The USA was the most productive country with 34.53% of the overall articles studied and 46.10% of the overall citations. The Ocular Surface published a very high percentage of articles related to DED relative to the total number of articles published (31.87%). The most productive institutions and the most frequently cited articles were from the USA and Japan. A network visualization map for country collaboration revealed that most European countries developed most of their collaborations with countries belonging to their own continent, which was not the case for the USA or Japan. A total of 41,956 KeyWords Plus were found with an average of 7.6 (SD = 3.15) KeyWords Plus per article. Conclusions This study provides a broad view of the current status and trends in DED research and may help clinicians, researchers and policy makers better understand this research field and predict its dynamic directions.

52 citations


Journal ArticleDOI
TL;DR: Some recent results on interval observers for several dynamical systems classes such as continuous-time and switched systems are presented.
Abstract: Based on the theory of positive systems, the goal of interval observers is to compute sets of admissible values of the state vector at each instant of time for systems subject to bounded uncertainties (noises, disturbances and parameters). The size of the estimated sets, which should be minimised, has to be proportional to the model uncertainties. An interval estimation can be seen as a conventional point estimation (the centre of the interval) with an estimation error given by the interval radius. The reliable uncertainties propagation performed in this context can be useful in several fields such as robust control, diagnosis and fault-tolerant control. This paper presents some recent results on interval observers for several dynamical systems classes such as continuous-time and switched systems.
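The idea of enclosing the true state between two bound trajectories can be sketched for the simplest case: a scalar, cooperative discrete-time system with a bounded disturbance. The system parameters and bounds below are invented for illustration; real interval observers handle vector systems, output injection and switching, as the paper describes.

```python
# Minimal sketch of interval estimation for a scalar discrete-time system
#   x[k+1] = a*x[k] + w[k],  with 0 < a < 1 (cooperative) and |w[k]| <= w_bar.
# The two bound trajectories enclose every admissible true state.
a, w_bar = 0.5, 0.1
lo, hi = -1.0, 1.0          # initial enclosure of x[0]

history = []
for _ in range(20):
    lo, hi = a * lo - w_bar, a * hi + w_bar   # worst-case propagation
    history.append((lo, hi))

# Since a < 1, the enclosure is stable: the interval radius converges to
# w_bar / (1 - a) = 0.2, i.e. proportional to the model uncertainty.
radius = (hi - lo) / 2
```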

Journal ArticleDOI
TL;DR: With the redefinition of the kelvin, the broad research activities of the temperature community on the determination of the Boltzmann constant have been very successfully completed.
Abstract: The International Committee for Weights and Measures (CIPM), at its meeting in October 2017, followed the recommendation of the Consultative Committee for Units (CCU) on the redefinition of the kilogram, ampere, kelvin and mole. For the redefinition of the kelvin, the Boltzmann constant will be fixed with the numerical value 1.380 649 × 10⁻²³ J K⁻¹. The relative standard uncertainty to be transferred to the thermodynamic temperature value of the triple point of water will be 3.7 × 10⁻⁷, corresponding to an uncertainty in temperature of 0.10 mK, sufficiently low for all practical purposes. With the redefinition of the kelvin, the broad research activities of the temperature community on the determination of the Boltzmann constant have been very successfully completed. In the following, a review of the determinations of the Boltzmann constant k, important for the new definition of the kelvin and performed in the last decade, is given.
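The uncertainty figure quoted above can be checked by direct arithmetic: the relative standard uncertainty of 3.7 × 10⁻⁷, applied to the triple point of water at 273.16 K, gives about 0.10 mK.

```python
# Transferring the relative standard uncertainty to the triple point of water.
T_TPW = 273.16                  # K, triple-point-of-water temperature
rel_u = 3.7e-7                  # relative standard uncertainty
u_T_mK = T_TPW * rel_u * 1e3    # absolute uncertainty in millikelvin (~0.10 mK)

k_fixed = 1.380649e-23          # J/K, fixed numerical value of k
```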

Journal ArticleDOI
TL;DR: In this paper, the authors characterize the eruptive events that occurred at Mt Etna between January 2011 and December 2015, leading to the emplacement of numerous lava flows and to the formation of a new pyroclastic cone (NSEC) on the eastern flank of the South East Crater.
Abstract: Estimates of lava volumes provide important data on the lava flooding history and evolution of a volcano. For mapping these volcanic deposits, the advancement of satellite remote sensing techniques offers great potential. Here we characterize the eruptive events that occurred at Mt Etna between January 2011 and December 2015, leading to the emplacement of numerous lava flows and to the formation of a new pyroclastic cone (NSEC) on the eastern flank of the South East Crater. The HOTSAT system is used to analyze remote sensing data acquired by the SEVIRI sensor in order to detect the thermal anomalies from active lava flows and calculate the associated radiative power. The time-series analysis of SEVIRI data provides an estimate of the magnitude and intensity of the effusive material erupted during each event. The cumulative volume estimated from SEVIRI images from 2011 to 2015 adds up to ~106 million cubic meters of lava and is constrained using a topographic approach, i.e. by subtracting the last topography of Etna, updated to 2005, from a 2015 digital elevation model produced using tri-stereo Pleiades satellite images acquired on December 18, 2015. The total volume of products erupted from 2005 to 2015, calculated from the topography difference by integration of the thickness distribution over the area covered, is about 287×10⁶ m³, of which ~55×10⁶ m³ is the volume of the NSEC cone.

Journal ArticleDOI
TL;DR: The FTC, based on a linear state feedback, is designed to compensate the impact of actuator faults on system performance by stabilising the closed-loop system using interval observers.
Abstract: This paper addresses the problem of passive fault-tolerant control for linear parameter-varying systems subject to actuator faults. The FTC, based on a linear state feedback, is designed to compensate the impact of actuator faults on system performance by stabilising the closed-loop system using interval observers. The design of interval observers is based on the discrete-time Luenberger observer structure, where uncertainties and faults with known bounds are considered. Sufficient conditions for the existence of the proposed observer are explicitly provided. Simulation results are presented to show the effectiveness of the proposed approach.

Journal ArticleDOI
TL;DR: The objective of this study is to develop the first fully passive nonlinear piezoelectric tuned vibration absorber (NPTVA), designed to mitigate a specific resonance of a nonlinear host structure.
Abstract: The objective of this study is to develop the first fully passive nonlinear piezoelectric tuned vibration absorber (NPTVA). The NPTVA is designed to mitigate a specific resonance of a nonlinear host structure. To avoid the use of synthetic inductors, which require external power, closed magnetic circuits in ferrite material realize the large inductance values required by vibration mitigation at low frequencies. The saturation of an additional passive inductor is then exploited to build the nonlinearity in the NPTVA. The performance of the proposed device is demonstrated both numerically and experimentally.

Journal ArticleDOI
TL;DR: In this paper, two aeroelastic phenomena, Vortex Induced Vibration (VIV) and cross-flow galloping, are investigated to harvest energy from the wind.

Journal ArticleDOI
TL;DR: The theoretical years-lost method was adapted to calculate gains in longevity (years saved) according to specific risks under the competing risks model, and was implemented in R software.
Abstract: To quantify the years of life saved from cardiovascular (CVD), cancer and overall deaths among elite athletes according to their main type of physiological effort performed in the Olympic Games. All French athletes participating in the Games from 1912 to 2012, with vital status validated and cause of death (if concerned) identified by the national registries, were included (n = 2814, 455 died) and classified according to 6 groups of effort: POWER (continuous effort < 45 s); INTERMEDIATE (45 s ≤ continuous effort < 600 s); ENDURANCE (continuous effort ≥ 600 s); POLYVALENT (participating in different events entering different classifications); INTERMITTENT (intermittent effort, i.e. team sports); PRECISION (targeting events). The theoretical years-lost method was adapted to calculate gains in longevity (years saved) according to specific risks under the competing risks model, and was implemented in R software. Considering overall deaths, all groups significantly saved, on average, 6.5 years of life (95% CI 5.8–7.2) compared to the general population. This longevity advantage is mainly driven by a lower risk of cancer, which, in isolation, accounted for a significant average saving of 2.3 years of life (95% CI 1.2–1.9) in each group. The risk of CVD-related mortality in the ENDURANCE and PRECISION groups is not significantly different from that of the general population. The other groups significantly saved, on average, 1.6 years of life (95% CI 1.2–1.9) from CVD death. The longevity benefits in elite athletes are associated with the type of effort performed during their career, mainly due to differences in the CVD-related risk of death.

Book
19 Jun 2018
TL;DR: The Return of Work in Critical Theory as discussed by the authors is an account of the human significance of work and the human costs of contemporary forms of work organization, which brings together empirical research with incisive analysis of the political stakes of contemporary work.
Abstract: From John Maynard Keynes's prediction of a fifteen-hour workweek to present-day speculation about automation, we have not stopped forecasting the end of work. Critical theory and political philosophy have turned their attention away from the workplace to focus on other realms of domination and emancipation. But far from coming to an end, work continues to occupy a central place in our lives. This is not only because of the amount of time people spend on the job. Many of our deepest hopes and fears are bound up in our labor: what jobs we perform, how we relate to others, how we might flourish. The Return of Work in Critical Theory presents a bold new account of the human significance of work and the human costs of contemporary forms of work organization. A collaboration among experts in philosophy, social theory, and clinical psychology, it brings together empirical research with incisive analysis of the political stakes of contemporary work. The Return of Work in Critical Theory begins by looking in detail at the ways in which work today fails to meet our expectations. It then sketches a phenomenological description of work and examines the normative premises that underlie the experience of work. Finally, it puts forward a novel conception of work that can renew critical theory's engagement with work and point toward possibilities for transformation. Inspired by Max Horkheimer's vision of critical theory as empirically informed reflection on the sources of social suffering with emancipatory intent, The Return of Work in Critical Theory is a lucid diagnosis of the malaise and pathologies of contemporary work that proposes powerful remedies.

Journal ArticleDOI
TL;DR: In this article, a space-frequency multiplicative regularization is developed to identify mechanical forces acting on a structure, which takes advantage of one's prior knowledge of the nature and the location of excitation sources, as well as that of their spectral contents.

Journal ArticleDOI
Abstract: We use a direct numerical simulations (DNS) database for turbulent flow in a square duct up to bulk Reynolds number to quantitatively analyse the role of secondary motions on the mean flow structure. For that purpose we derive a generalized form of the identity of Fukagata, Iwamoto and Kasagi (FIK), which allows one to quantify the effect of cross-stream convection on the mean streamwise velocity, wall shear stress and bulk friction coefficient. Secondary motions are found to contribute approximately 6 % of the total friction, and to act as a self-regulating mechanism of turbulence whereby wall shear stress non-uniformities induced by corners are equalized, and universality of the wall-normal velocity profiles is established. We also carry out numerical experiments whereby the secondary motions are artificially suppressed, in which case their equalizing role is partially taken by the turbulent stresses.

Book ChapterDOI
TL;DR: SMILE, a new deep convolutional neural network that addresses the issue of learning with incomplete ground truth, aims to identify ambiguous labels in order to ignore them during training and avoid propagating incorrect or noisy information.
Abstract: Annotation of medical images for semantic segmentation is a very time-consuming and difficult task. Moreover, clinical experts often focus on specific anatomical structures and thus produce partially annotated images. In this paper, we introduce SMILE, a new deep convolutional neural network which addresses the issue of learning with incomplete ground truth. SMILE aims to identify ambiguous labels in order to ignore them during training, and not propagate incorrect or noisy information. A second contribution is SMILEr, which uses SMILE as initialization for automatically relabeling missing annotations, using a curriculum strategy. Experiments on 3 organ classes (liver, stomach, pancreas) show the relevance of the proposed approach for semantic segmentation: with 70% of missing annotations, SMILEr performs similarly to a baseline trained with complete ground truth annotations.
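Ignoring ambiguous labels during training amounts to masking them out of the loss. A minimal sketch, assuming a per-pixel negative log-likelihood and a hypothetical IGNORE label id; the actual SMILE architecture and its label-identification mechanism are more involved than this.

```python
import math

IGNORE = -1   # hypothetical label id for ambiguous / unannotated pixels

def masked_nll(log_probs, labels):
    """Mean negative log-likelihood over labelled pixels only: pixels
    marked IGNORE contribute neither loss nor gradient."""
    terms = [-lp[y] for lp, y in zip(log_probs, labels) if y != IGNORE]
    return sum(terms) / len(terms) if terms else 0.0

# Two-class toy example with 3 pixels, one of them unannotated.
log_probs = [
    [math.log(0.9), math.log(0.1)],
    [math.log(0.2), math.log(0.8)],
    [math.log(0.5), math.log(0.5)],   # this pixel carries no label
]
labels = [0, 1, IGNORE]
loss = masked_nll(log_probs, labels)   # averages over the 2 labelled pixels
```

Deep learning frameworks expose the same idea through an ignore index on their cross-entropy losses; the point here is only that masked pixels drop out of both the average and the gradient.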

Journal ArticleDOI
TL;DR: In this article, a validated method for the separation of the silymarin constituents has been developed to ensure precision and accuracy in their quantification; each compound was separated with high reproducibility.
Abstract: Fruits of Silybum marianum (L.) Gaertn. are the main source of taxifolin-derived flavonolignans. Together, these molecules constitute a mixture called silymarin with many useful applications for the cosmetic and pharmaceutic industries. Here, a validated method for the separation of the silymarin constituents has been developed to ensure precision and accuracy in their quantification. Each compound was separated with high reproducibility. Precision and repeatability of the quantification method were validated according to the AOAC recommendations. The method was then applied to study the natural variability of wild accessions of S. marianum. Analysis of the variation in the fruit composition of 12 accessions from Pakistan revealed considerable natural diversity. Correlation analysis suggested a synergistic action of the different flavonolignans to reach the maximal antioxidant activity, as determined by cupric ion reducing antioxidant capacity (CUPRAC) and ferric reducing antioxidant power (FRAP) assays. Principal component analysis (PCA) separated the 12 accessions into three distinct groups differing in their silymarin contents, whereas hierarchical clustering analysis (HCA) evidenced strong variations in their silymarin composition, leading to the identification of new silybin-rich chemotypes. These results proved that the present method allows for an efficient separation and quantification of the main flavonolignans with potent antioxidant activities.

Journal ArticleDOI
TL;DR: A cost-effective, low-consumption radio-over-fiber system based on 850-nm Single Mode Vertical Cavity Surface Emitting Lasers and Standard Single Mode Fibers allows, even in critical cases, the quality of the received signal to be maintained at a high level.
Abstract: In view of the realization of short- and medium-range Mobile Front-Haul connections for present (LTE) and future (5G) cellular networks, a cost-effective, low-consumption radio-over-fiber system is proposed, based on 850-nm Single Mode Vertical Cavity Surface Emitting Lasers and Standard Single Mode Fibers (SSMFs). An efficient countermeasure to possible impairments due to the bimodal behavior of SSMFs at 850 nm makes it possible, even in critical cases, to maintain the quality of the received signal at a high level. The performance is evaluated with reference to the Physical Downlink Shared Channel of an entire LTE frame with 20-MHz bandwidth centered in band 20 of the standard. In terms of error vector magnitude and outage probability under temperature stress, the system is able to transmit 256-QAM signals in compliance with the LTE standard, which corresponds to a raw data rate of 134.4 Mbit/s, up to distances of 1.5 km.

Journal ArticleDOI
TL;DR: L-amB induction treatment improves survival in patients with PVE-C, and medical treatment followed by long-term maintenance fluconazole may be the best treatment option for frail patients.
Abstract: Background: Prosthetic valve endocarditis caused by Candida spp. (PVE-C) is rare and devastating, with international guidelines based on expert recommendations supporting the combination of surgery and subsequent azole treatment. Methods: We retrospectively analyzed PVE-C cases collected in Spain and France between 2001 and 2015, with a focus on management and outcome. Results: Forty-six cases were followed up for a median of 9 months. Twenty-two patients (48%) had a history of endocarditis, 30 cases (65%) were nosocomial or healthcare related, and 9 (20%) patients were intravenous drug users. "Induction" therapy consisted mainly of liposomal amphotericin B (L-amB)-based (n = 21) or echinocandin-based therapy (n = 13). Overall, 19 patients (41%) were operated on. Patients <66 years old and without cardiac failure were more likely to undergo cardiac surgery (adjusted odds ratios [aORs], 6.80 [95% confidence interval [CI], 1.59-29.13] and 10.92 [1.15-104.06], respectively). Surgery was not associated with better survival rates at 6 months. Patients who received L-amB alone had a better 6-month survival rate than those who received an echinocandin alone (aOR, 13.52; 95% CI, 1.03-838.10). "Maintenance" fluconazole therapy, prescribed in 21 patients for a median duration of 13 months (range, 2-84 months), led to minor adverse effects. Conclusion: L-amB induction treatment improves survival in patients with PVE-C. Medical treatment followed by long-term maintenance fluconazole may be the best treatment option for frail patients.

Journal ArticleDOI
TL;DR: The proposed MRMI and ERMI indices are effective tools for accounting for health-state severity according to outcomes of interest in a population-based setting.

Journal ArticleDOI
TL;DR: A new series of 18 small-sized, fully organic VEGF-A165/NRP-1 antagonists (NRPas) is reported; 2a (renamed NRPa-308) emerges as a promising "hit", exerting not only potent anti-angiogenic activity but also significant effects on the cell viability of a large panel of human solid and haematological cancer cell lines.

Journal ArticleDOI
TL;DR: Evaluated the safety of baclofen off-label use compared to the main approved drugs for AUD (acamprosate, naltrexone, nalmefene) in France.
Abstract: PURPOSE Baclofen is widely used off-label for alcohol use disorders (AUD) in France, despite its uncertain efficacy and safety, particularly at high doses. This study was designed to evaluate the safety of this off-label use compared to the main approved drugs for AUD (acamprosate, naltrexone, nalmefene). METHODS This cohort study from the French Health Insurance claims database included patients, aged 18 to 70 years, with no serious comorbidity (assessed by the Charlson score) initiating baclofen or approved drugs for AUD between 2009 and 2015. The risk of hospitalisation or death associated with baclofen, at variable doses over time (from low doses <30 mg/day to high doses ≥180 mg/day), compared to approved drugs, was evaluated by a Cox model adjusted to sociodemographic and medical characteristics. RESULTS The cohort included 165 334 patients, 47 614 of whom were exposed to baclofen. Patients exposed to baclofen differed from those treated with approved drugs in terms of sociodemographic and medical characteristics (more females, higher socioeconomic status, fewer hospitalisations for alcohol-related problems), but these differences tended to fade at higher doses of baclofen. Baclofen exposure was significantly associated with hospitalisation (hazard ratio [HR] = 1.13 [95%CI: 1.09-1.17]) and death (HR = 1.31 [95%CI: 1.08-1.60]). The risk increased with dose, reaching 1.46 [1.28-1.65] for hospitalisation and 2.27 [1.27-4.07] for death at high doses. Similar results were observed in patients with a history of hospitalisation for alcohol-related problems. CONCLUSIONS This study raises concerns about the safety of baclofen for AUD, particularly at high doses, with higher risks of hospitalisation and mortality than approved drugs.

Journal ArticleDOI
TL;DR: A newly developed anaerobic microbial consortium for thermophilic consolidated bioprocessing is a promising candidate for H2 production in space applications such as in situ resource utilization.

Book ChapterDOI
08 Sep 2018
TL;DR: A new model for leveraging unlabeled data to improve generalization performances of image classifiers: a two-branch encoder-decoder architecture called HybridNet, able to outperform state-of-the-art results on CIFAR-10, SVHN and STL-10 in various semi-supervised settings.
Abstract: In this paper, we introduce a new model for leveraging unlabeled data to improve generalization performances of image classifiers: a two-branch encoder-decoder architecture called HybridNet. The first branch receives supervision signal and is dedicated to the extraction of invariant class-related representations. The second branch is fully unsupervised and dedicated to model information discarded by the first branch to reconstruct input data. To further support the expected behavior of our model, we propose an original training objective. It favors stability in the discriminative branch and complementarity between the learned representations in the two branches. HybridNet is able to outperform state-of-the-art results on CIFAR-10, SVHN and STL-10 in various semi-supervised settings. In addition, visualizations and ablation studies validate our contributions and the behavior of the model on both CIFAR-10 and STL-10 datasets.