
Showing papers by "Paris Dauphine University published in 2020"


Posted Content
TL;DR: This work presents a new method that views object detection as a direct set prediction problem, and demonstrates accuracy and run-time performance on par with the well-established and highly-optimized Faster R-CNN baseline on the challenging COCO object detection dataset.
Abstract: We present a new method that views object detection as a direct set prediction problem. Our approach streamlines the detection pipeline, effectively removing the need for many hand-designed components like a non-maximum suppression procedure or anchor generation that explicitly encode our prior knowledge about the task. The main ingredients of the new framework, called DEtection TRansformer or DETR, are a set-based global loss that forces unique predictions via bipartite matching, and a transformer encoder-decoder architecture. Given a fixed small set of learned object queries, DETR reasons about the relations of the objects and the global image context to directly output the final set of predictions in parallel. The new model is conceptually simple and does not require a specialized library, unlike many other modern detectors. DETR demonstrates accuracy and run-time performance on par with the well-established and highly-optimized Faster R-CNN baseline on the challenging COCO object detection dataset. Moreover, DETR can be easily generalized to produce panoptic segmentation in a unified manner. We show that it significantly outperforms competitive baselines. Training code and pretrained models are available at this https URL.

4,122 citations
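The bipartite-matching idea at the core of DETR's set loss can be sketched with the Hungarian algorithm. This is an illustrative toy, not the authors' implementation: DETR's actual matching cost combines class probabilities with L1 and generalized-IoU box terms, whereas here the box term is a plain L1 distance on hypothetical 2-D box centers.

```python
# Sketch: one-to-one matching between a fixed set of predictions and the
# ground-truth objects, minimizing a combined class + box cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_predictions(pred_boxes, pred_probs, gt_boxes, gt_labels):
    """Return (pred_idx, gt_idx) pairs minimizing the total matching cost."""
    # Classification cost: negative predicted probability of the true class.
    class_cost = -pred_probs[:, gt_labels]                            # (P, G)
    # Box cost (simplified): L1 distance between box centers.
    box_cost = np.abs(pred_boxes[:, None, :] - gt_boxes[None, :, :]).sum(-1)
    cost = class_cost + box_cost
    return linear_sum_assignment(cost)   # Hungarian algorithm

# Toy example: 3 object queries, 2 ground-truth objects.
probs = np.array([[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]])   # class probabilities
boxes = np.array([[0.0, 0.0], [1.0, 1.0], [0.5, 0.5]])   # predicted "centers"
gt_b = np.array([[1.0, 1.0], [0.0, 0.0]])
gt_l = np.array([1, 0])
p, g = match_predictions(boxes, probs, gt_b, gt_l)
```

Unmatched queries (here the third one) would be trained to predict a "no object" class; the unique matching is what removes the need for non-maximum suppression.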


Book ChapterDOI
23 Aug 2020
TL;DR: DETR, as proposed in this paper, combines a set-based global loss that forces unique predictions via bipartite matching with a transformer encoder-decoder architecture to directly output the final set of predictions in parallel.
Abstract: We present a new method that views object detection as a direct set prediction problem. Our approach streamlines the detection pipeline, effectively removing the need for many hand-designed components like a non-maximum suppression procedure or anchor generation that explicitly encode our prior knowledge about the task. The main ingredients of the new framework, called DEtection TRansformer or DETR, are a set-based global loss that forces unique predictions via bipartite matching, and a transformer encoder-decoder architecture. Given a fixed small set of learned object queries, DETR reasons about the relations of the objects and the global image context to directly output the final set of predictions in parallel. The new model is conceptually simple and does not require a specialized library, unlike many other modern detectors. DETR demonstrates accuracy and run-time performance on par with the well-established and highly-optimized Faster R-CNN baseline on the challenging COCO object detection dataset. Moreover, DETR can be easily generalized to produce panoptic segmentation in a unified manner. We show that it significantly outperforms competitive baselines. Training code and pretrained models are available at https://github.com/facebookresearch/detr.

2,009 citations


Journal ArticleDOI
TL;DR: The key result is an abrupt 8.8% decrease in global CO2 emissions in the first half of 2020 compared to the same period in 2019, larger than during previous economic downturns or World War II.
Abstract: The COVID-19 pandemic is impacting human activities, and in turn energy use and carbon dioxide (CO2) emissions. Here we present daily estimates of country-level CO2 emissions for different sectors based on near-real-time activity data. The key result is an abrupt 8.8% decrease in global CO2 emissions (-1551 Mt CO2) in the first half of 2020 compared to the same period in 2019. The magnitude of this decrease is larger than during previous economic downturns or World War II. The timing of emissions decreases corresponds to lockdown measures in each country. By July 1st, the pandemic's effects on global emissions diminished as lockdown restrictions relaxed and some economic activities restarted, especially in China and several European countries, but substantial differences persist between countries, with continuing emission declines in the U.S. where coronavirus cases are still increasing substantially.

405 citations


Journal ArticleDOI
TL;DR: A Correction to this paper has been published: https://doi.org/10.1038/s41467-020-20254-5.
Abstract: A Correction to this paper has been published: https://doi.org/10.1038/s41467-020-20254-5. Author(s): Liu, Zhu; Ciais, Philippe; Deng, Zhu; Lei, Ruixue; Davis, Steven J; Feng, Sha; Zheng, Bo; Cui, Duo; Dou, Xinyu; Zhu, Biqing; Guo, Rui; Ke, Piyu; Sun, Taochun; Lu, Chenxi; He, Pan; Wang, Yuan; Yue, Xu; Wang, Yilong; Lei, Yadong; Zhou, Hao; Cai, Zhaonan; Wu, Yuhui; Guo, Runtao; Han, Tingxuan; Xue, Jinjun; Boucher, Olivier; Boucher, Eulalie; Chevallier, Frederic; Tanaka, Katsumasa; Wei, Yiming; Zhong, Haiwang; Kang, Chongqing; Zhang, Ning; Chen, Bin; Xi, Fengming; Liu, Miaomiao; Breon, Francois-Marie; Lu, Yonglong; Zhang, Qiang; Guan, Dabo; Gong, Peng; Kammen, Daniel M; He, Kebin; Schellnhuber, Hans Joachim

173 citations


Journal ArticleDOI
TL;DR: This survey is intended to be beneficial for visualization researchers whose interests involve making ML models more trustworthy, as well as researchers and practitioners from other disciplines in their search for effective visualization techniques suitable for solving their tasks with confidence and conveying meaning to their data.
Abstract: Machine learning (ML) models are nowadays used in complex applications in various domains such as medicine, bioinformatics, and other sciences. Due to their black box nature, however, it may sometimes ...

123 citations


Journal ArticleDOI
TL;DR: The Carbon Monitor near-real-time CO2 emission dataset shows an 8.8% decline in CO2 emissions globally from January 1st to June 30th in 2020 when compared with the same period in 2019 and detects a regrowth of CO2 emissions by late April, which is mainly attributed to the recovery of economic activities in China and a partial easing of lockdowns in other countries.
Abstract: We constructed a near-real-time daily CO2 emission dataset, the Carbon Monitor, to monitor the variations in CO2 emissions from fossil fuel combustion and cement production since January 1, 2019, at the national level, with near-global coverage on a daily basis and the potential to be frequently updated. Daily CO2 emissions are estimated from a diverse range of activity data, including the hourly to daily electrical power generation data of 31 countries, monthly production data and production indices of industry processes of 62 countries/regions, and daily mobility data and mobility indices for the ground transportation of 416 cities worldwide. Individual flight location data and monthly data were utilized for aviation and maritime transportation sector estimates. In addition, monthly fuel consumption data corrected for the daily air temperature of 206 countries were used to estimate the emissions from commercial and residential buildings. This Carbon Monitor dataset manifests the dynamic nature of CO2 emissions through daily, weekly and seasonal variations as influenced by workdays and holidays, as well as by the unfolding impacts of the COVID-19 pandemic. The Carbon Monitor near-real-time CO2 emission dataset shows an 8.8% decline in CO2 emissions globally from January 1st to June 30th in 2020 when compared with the same period in 2019 and detects a regrowth of CO2 emissions by late April, which is mainly attributed to the recovery of economic activities in China and a partial easing of lockdowns in other countries. This daily updated CO2 emission dataset could offer a range of opportunities for related scientific research and policy making.

83 citations


Posted Content
TL;DR: It is proved that bounded twin-width is preserved by FO interpretations and transductions (allowing operations such as squaring or complementing a graph), which unifies and significantly extends the knowledge on fixed-parameter tractability of FO model checking on non-monotone classes, such as the FPT algorithm on bounded-width posets.
Abstract: Inspired by a width invariant defined on permutations by Guillemot and Marx [SODA '14], we introduce the notion of twin-width on graphs and on matrices. Proper minor-closed classes, bounded rank-width graphs, map graphs, $K_t$-free unit $d$-dimensional ball graphs, posets with antichains of bounded size, and proper subclasses of dimension-2 posets all have bounded twin-width. On all these classes (except map graphs without geometric embedding) we show how to compute in polynomial time a sequence of $d$-contractions, witnessing that the twin-width is at most $d$. We show that FO model checking, that is deciding if a given first-order formula $\phi$ evaluates to true for a given binary structure $G$ on a domain $D$, is FPT in $|\phi|$ on classes of bounded twin-width, provided the witness is given. More precisely, being given a $d$-contraction sequence for $G$, our algorithm runs in time $f(d,|\phi|) \cdot |D|$ where $f$ is a computable but non-elementary function. We also prove that bounded twin-width is preserved by FO interpretations and transductions (allowing operations such as squaring or complementing a graph). This unifies and significantly extends the knowledge on fixed-parameter tractability of FO model checking on non-monotone classes, such as the FPT algorithm on bounded-width posets by Gajarský et al. [FOCS '15].

74 citations


Journal ArticleDOI
TL;DR: The concept of species charisma has been explored in the context of invasive alien species (IAS) in this paper, where the authors discuss how it can affect species introductions, media portrayals, public perceptions, opposition to management, research efforts, and public participation in research and management.
Abstract: The concept of charismatic species – commonly used in the scholarly literature to refer to the "attractiveness", "appeal", or "beauty" of a given species (Panel 1) – has recently garnered attention in conservation science due to its potential to stimulate public awareness and support, especially through the use of flagship species (Veríssimo et al. 2011; Courchamp et al. 2018). The charisma of any introduced species, and invasive alien species (IAS) in particular, can affect people's perceptions and attitudes toward management of that species (McNeely 2001; Veitch and Clout 2001; Shackleton et al. 2019). Research demonstrates how IAS charisma can influence the invasion process across a wide range of organisms spanning different taxonomic groups and regions (WebTables 1–3; Figure 1). Unlike the charisma of threatened species, which has a positive effect on management efforts, charisma in IAS usually represents a hindrance to management (Genovesi and Bertolino 2001; Bertolino and Genovesi 2003). Charisma can reduce public support for IAS management attempts and contribute to conflicting perceptions and interests, and ultimately impede management efforts (eg by delaying or preventing control implementation; Estévez et al. 2015; Novoa et al. 2018). However, the issue of species charisma in relation to IAS has not yet been systematically explored. We discuss the concept of species charisma in the context of IAS, and explore how it can affect species introductions, media portrayals, public perceptions, opposition to management, research efforts, and public participation in research and management (Figure 1). In addition to clarifying the concept of charismatic IAS (Panel 1), we illustrate how the perception of charisma is highly context-dependent and varies over space and time.
Identifying these issues enables us to provide a set of recommendations for further research, and to highlight both management implications and measures that can be taken to address this issue.

74 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider a power network with distributed local power generation and storage, where each node can either buy or sell electricity, impacting the electricity spot price and the objective at each node is to minimize energy and storage costs by optimally controlling the storage device.
Abstract: We consider a stylized model for a power network with distributed local power generation and storage. This system is modeled as a network connection of a large number of nodes, where each node is characterized by a local electricity consumption, has a local electricity production (photovoltaic panels for example) and manages a local storage device. Depending on its instantaneous consumption and production rate as well as its storage management decision, each node may either buy or sell electricity, impacting the electricity spot price. The objective at each node is to minimize energy and storage costs by optimally controlling the storage device. In a noncooperative game setting, we are led to the analysis of a nonzero sum stochastic game with N players where the interaction takes place through the spot price mechanism. For an infinite number of agents, our model corresponds to an extended mean field game. We are able to compare this solution to the optimal strategy of a central planner and in a linear quadratic setting, we obtain and explicit solution to the extended mean field game and we show that it provides an approximate Nash equilibrium for N-player game.

69 citations


Journal ArticleDOI
TL;DR: This paper addresses the problem of fairly sharing multiple resources between slices, in the critical situation in which the network does not have enough resources to fully satisfy slice demands, by proposing a versatile optimization framework based on the Ordered Weighted Average (OWA) operator that takes into account different fairness approaches.
Abstract: Among the novelties introduced by 5G networks, the formalization of the ‘network slice’ as a resource allocation unit is an important one. In legacy networks, resources such as link bandwidth, spectrum, computing capacity are allocated independently of each other. In 5G environments, a network slice is meant to directly serve end-to-end services, or verticals: behind a network slice demand, a tenant expresses the need to access a precise service type, under a fully qualified set of computing and network requirements. The resource allocation decision encompasses, therefore, a combination of different resources. In this paper, we address the problem of fairly sharing multiple resources between slices, in the critical situation in which the network does not have enough resources to fully satisfy slice demands. We model the problem as a multi-resource allocation problem, proposing a versatile optimization framework based on the Ordered Weighted Average (OWA) operator, that takes into account different fairness approaches. We show how, adapting the OWA utility function, our framework can generalize classical single-resource allocation methods, existing multi-resource allocation solutions at the state of the art, and implement novel multi-resource allocation solutions. We compare analytically and by extensive simulations the different methods in terms of fairness and system efficiency.

68 citations
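The OWA operator at the heart of the framework above can be sketched in a few lines. This is a minimal illustration of the general idea, not the paper's full optimization framework (the assumed form here applies fixed weights to the utilities sorted in ascending order, so that decreasing weights emphasize the worst-off slices):

```python
# Sketch of an Ordered Weighted Average: the weight attaches to the *rank* of
# a utility value, not to a particular slice.
def owa(utilities, weights):
    assert len(utilities) == len(weights)
    ordered = sorted(utilities)              # ascending: worst-off slice first
    return sum(w * u for w, u in zip(weights, ordered))

allocs = [0.2, 0.9, 0.6]                     # per-slice satisfaction ratios
maxmin = owa(allocs, [1.0, 0.0, 0.0])        # all weight on the minimum
mean = owa(allocs, [1 / 3, 1 / 3, 1 / 3])    # uniform weights
```

Choosing the weight vector recovers classical criteria as special cases: weight concentrated on the first rank gives max-min fairness, uniform weights give plain utilitarian efficiency, and intermediate decreasing weights interpolate between the two.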


Posted ContentDOI
18 Sep 2020-medRxiv
TL;DR: At the end of the lockdown the prevalence of anti-SARS-CoV-2 IgG or neutralizing antibodies remained low in the French adult population, even in regions with high reported rates of COVID-19.
Abstract: Aim To estimate the seroprevalence of SARS-CoV-2 infection in May-June 2020 after the lockdown in adults living in three regions in France and to identify the associated risk factors. Methods Participants in a survey on COVID-19 from an existing consortium of three general adult population cohorts living in the Ile-de-France (IDF) or Grand Est (GE), two regions with high rate of COVID-19, or in the Nouvelle-Aquitaine (NA), with a low rate, were asked to take a dried-blood spot (DBS) for anti-SARS-CoV-2 antibodies assessment. The primary outcome was a positive anti-SARS-CoV-2 ELISA IgG result against the spike protein of the virus (ELISA-S). The secondary outcomes were a positive ELISA IgG against the nucleocapsid protein (ELISA-NP), anti-SARS-CoV-2 neutralizing antibodies titers >=40 (SN), and predicted positivity obtained from a multiple imputation model (MI). Prevalence estimates were adjusted using sampling weights and post-stratification methods. Findings Between May 4, 2020 and June 23, 2020, 16,000 participants were asked to provide DBS, and 14,628 were included in the analysis, 983 with a positive ELISA-S, 511 with a positive ELISA-NP, 424 with SN>=40 and 941 (Standard Deviation=31) with a positive MI. Adjusted estimates of seroprevalence (positive ELISA-S) were 10.0% (95%CI 9.1%;10.9%) in IDF, 9.0% (95%CI 7.7%; 10.2%) in GE and 3.1% (95%CI 2.4%; 3.7%), in NA. The adjusted prevalence of positive ELISA-NP, SN and MI were 5.7%, 5.0% and 10.0% in IDF, 6.0%, 4.3% and 8.6% in GE, and 0.6%, 1.3% and 2.5% in NA, respectively. A higher seroprevalence was observed in younger participants and when at least one child or adolescent lived in the same household. A lower seroprevalence was observed in smokers compared to non-smokers. Interpretation At the end of the lockdown the prevalence of anti-SARS-CoV-2 IgG or neutralizing antibodies remained low in the French adult population, even in regions with high reported rates of COVID-19.
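The adjusted estimates above rest on sampling weights. A minimal sketch of the weighting step follows, with toy weights rather than the survey's actual post-stratification cells:

```python
# Sketch: a weighted prevalence estimate. Each respondent carries a sampling
# weight; prevalence is the weight-normalized share of positives.
def weighted_prevalence(results, weights):
    """results: 1 for a positive test, 0 otherwise; weights: sampling weights."""
    total = sum(weights)
    return sum(r * w for r, w in zip(results, weights)) / total

# Toy data: an over-sampled group (down-weighted to 0.5) with high positivity
# and an under-sampled group (up-weighted to 2.0) with low positivity.
results = [1, 1, 0, 0, 1, 0, 0, 0]
weights = [0.5, 0.5, 0.5, 0.5, 2.0, 2.0, 2.0, 2.0]
crude = sum(results) / len(results)
adjusted = weighted_prevalence(results, weights)
```

Here the crude estimate (0.375) overstates the adjusted one (0.3) because positives are concentrated in the over-sampled group; the same mechanism underlies the difference between raw and adjusted seroprevalence figures.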

Journal ArticleDOI
TL;DR: It is suggested that economic development and gender equality in rights go hand-in-hand with a reshaping rather than a suppression of gender norms, with the emergence of new and more horizontal forms of social differentiation across genders.
Abstract: The so-called “gender-equality paradox” is the fact that gender segregation across occupations is more pronounced in more egalitarian and more developed countries. Some scholars have explained this paradox by the existence of deeply rooted or intrinsic gender differences in preferences that materialize more easily in countries where economic constraints are more limited. In line with a strand of research in sociology, we show instead that it can be explained by cross-country differences in essentialist gender norms regarding math aptitudes and appropriate occupational choices. To this aim, we propose a measure of the prevalence and extent of internalization of the stereotype that “math is not for girls” at the country level. This is done using individual-level data on the math attitudes of 300,000 15-y-old female and male students in 64 countries. The stereotype associating math to men is stronger in more egalitarian and developed countries. It is also strongly associated with various measures of female underrepresentation in math-intensive fields and can therefore entirely explain the gender-equality paradox. We suggest that economic development and gender equality in rights go hand-in-hand with a reshaping rather than a suppression of gender norms, with the emergence of new and more horizontal forms of social differentiation across genders.

Journal ArticleDOI
TL;DR: The theoretical results demonstrate that even though the model is misspecified, under regularity conditions, the accept–reject ABC approach concentrates posterior mass on an appropriately defined pseudotrue parameter value, but under model misspecification the ABC posterior does not yield credible sets with valid frequentist coverage and has non‐standard asymptotic behaviour.
Abstract: We analyze the behavior of approximate Bayesian computation (ABC) when the model generating the simulated data differs from the actual data generating process; i.e., when the data simulator in ABC is misspecified. We demonstrate both theoretically and in simple, but practically relevant, examples that when the model is misspecified different versions of ABC can yield substantially different results. Our theoretical results demonstrate that even though the model is misspecified, under regularity conditions, the accept/reject ABC approach concentrates posterior mass on an appropriately defined pseudo-true parameter value. However, under model misspecification the ABC posterior does not yield credible sets with valid frequentist coverage and has non-standard asymptotic behavior. In addition, we examine the theoretical behavior of the popular local regression adjustment to ABC under model misspecification and demonstrate that this approach concentrates posterior mass on a completely different pseudo-true value than accept/reject ABC. Using our theoretical results, we suggest two approaches to diagnose model misspecification in ABC. All theoretical results and diagnostics are illustrated in a simple running example.
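The accept/reject mechanism under misspecification can be illustrated with a toy example (illustrative parameters, not the paper's experiments): the assumed model is N(theta, 1), but the data are drawn from N(1, 2^2). Rejection ABC still concentrates near a pseudo-true value, here the value matching the sample-mean summary statistic.

```python
# Toy rejection ABC: draw theta from the prior, simulate data under the
# (misspecified) model, accept theta when the simulated summary statistic is
# close enough to the observed one.
import random

random.seed(0)

def rejection_abc(observed_mean, n, n_sims=5000, tol=0.1):
    accepted = []
    for _ in range(n_sims):
        theta = random.uniform(-5, 5)                      # prior draw
        sim_mean = sum(random.gauss(theta, 1.0) for _ in range(n)) / n
        if abs(sim_mean - observed_mean) < tol:            # compare summaries
            accepted.append(theta)
    return accepted

data = [random.gauss(1.0, 2.0) for _ in range(50)]         # true sd is 2, not 1
obs = sum(data) / len(data)
posterior = rejection_abc(obs, 50)
estimate = sum(posterior) / len(posterior)
```

Because the summary is the sample mean, the accepted draws cluster around the observed mean even though the assumed variance is wrong; with a variance-sensitive summary the two ABC variants discussed in the paper could land in very different places.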

Journal ArticleDOI
TL;DR: In this article, the effects of customer experience (CX) on customer behavior are shown to depend on different combinations of its dimensions: complementarity and substitutability effects arise when the combined dimensions form a close match, not simply from adding extra dimensions.

Journal ArticleDOI
TL;DR: In this article, the authors show that Hairer's regularity structures, a major extension of rough path theory, also provide a new and powerful tool to analyze rough volatility models.
Abstract: A new paradigm has emerged recently in financial modeling: rough (stochastic) volatility. First observed by Gatheral et al. in high-frequency data, subsequently derived within market microstructure models, rough volatility captures parsimoniously key stylized facts of the entire implied volatility surface, including extreme skews (as observed earlier by Alos et al.) that were thought to be outside the scope of stochastic volatility models. On the mathematical side, Markovianity and, partially, semimartingality are lost. In this paper, we show that Hairer's regularity structures, a major extension of rough path theory, which caused a revolution in the field of stochastic partial differential equations, also provide a new and powerful tool to analyze rough volatility models.
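The "roughness" in rough volatility comes from driving log-volatility by fractional Brownian motion with a small Hurst index H. A minimal sketch, assuming exact simulation via Cholesky factorization of the fBm covariance (generic simulation code, not the paper's regularity-structure machinery):

```python
# Sketch: sample a fractional Brownian motion path; paths get visibly rougher
# as the Hurst index H drops below 1/2.
import numpy as np

def fbm(n, hurst, T=1.0, seed=0):
    """One fBm path sampled on n grid points via Cholesky of the covariance."""
    t = T * np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    # fBm covariance: E[B_s B_t] = 0.5 * (s^2H + t^2H - |t - s|^2H)
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst) - np.abs(s - u) ** (2 * hurst))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))   # tiny jitter for stability
    rng = np.random.default_rng(seed)
    return t, L @ rng.standard_normal(n)

t, rough = fbm(50, hurst=0.1)    # "rough" regime, as estimated from data
t, smooth = fbm(50, hurst=0.5)   # H = 1/2 recovers ordinary Brownian motion
```

Increment sizes scale like dt^H, so the H = 0.1 path moves much more per step on a fine grid than the Brownian one, which is the parsimonious mechanism behind the steep short-maturity skews mentioned in the abstract.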

Journal ArticleDOI
TL;DR: In this paper, the authors consider the control of the COVID-19 pandemic, modeled by a standard SIR compartmental model, and show that the divergence between the individual and societal strategies happens after the epidemic peak but while significant propagation is still underway.
Abstract: We consider the control of the COVID-19 pandemic, modeled by a standard SIR compartmental model. The control of the epidemic is induced by the aggregation of individuals' decisions to limit their social interactions: on one side, when the epidemic is ongoing, an individual is encouraged to diminish his/her contact rate in order to avoid getting infected, but, on the other side, this effort comes at a social cost. If each individual lowers his/her contact rate, the epidemic vanishes faster but the effort cost may be high. A Mean Field Nash equilibrium at the population level is formed, resulting in a lower effective transmission rate of the virus. However, it is not clear that the individual's interest aligns with that of the society. We prove that the equilibrium exists and compute it numerically. The equilibrium selects a sub-optimal solution in comparison to the societal optimum (a centralized decision respected fully by all individuals), meaning that the cost of anarchy is strictly positive. We provide numerical examples and a sensitivity analysis. We show that the divergence between the individual and societal strategies happens after the epidemic peak but while significant propagation is still underway.
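The trade-off each individual faces can be illustrated with a bare-bones SIR integration (toy parameters, not the paper's Nash-equilibrium computation): lowering the contact rate beta shrinks the final epidemic size, at the social cost of reduced interactions.

```python
# Sketch: forward-Euler SIR integration; return the fraction ever infected.
def sir_final_size(beta, gamma=0.1, days=1000, dt=0.1, i0=1e-3):
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt      # mass-action infections
        new_rec = gamma * i * dt         # recoveries
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r

full = sir_final_size(beta=0.3)     # R0 = 3: no contact reduction
eased = sir_final_size(beta=0.15)   # R0 = 1.5: contacts halved
```

The equilibrium question in the paper is which point on this trade-off curve self-interested individuals select, versus the point a central planner would impose.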

Journal ArticleDOI
TL;DR: In this article, the authors present an agenda for the engagement of the social sciences with microbiome research and its implications for public policy and social change, based on existing multidisciplinary science-policy agenda-setting exercises.
Abstract: The human microbiome is an important emergent area of cross, multi and transdisciplinary study. The complexity of this topic leads to conflicting narratives and regulatory challenges. It raises questions about the benefits of its commercialisation and drives debates about alternative models for engaging with its publics, patients and other potential beneficiaries. The social sciences and the humanities have begun to explore the microbiome as an object of empirical study and as an opportunity for theoretical innovation. They can play an important role in facilitating the development of research that is socially relevant, that incorporates cultural norms and expectations around microbes and that investigates how social and biological lives intersect. This is a propitious moment to establish lines of collaboration in the study of the microbiome that incorporate the concerns and capabilities of the social sciences and the humanities together with those of the natural sciences and relevant stakeholders outside academia. This paper presents an agenda for the engagement of the social sciences with microbiome research and its implications for public policy and social change. Our methods were informed by existing multidisciplinary science-policy agenda-setting exercises. We recruited 36 academics and stakeholders and asked them to produce a list of important questions about the microbiome that were in need of further social science research. We refined this initial list into an agenda of 32 questions and organised them into eight themes that both complement and extend existing research trajectories. This agenda was further developed through a structured workshop where 21 of our participants refined the agenda and reflected on the challenges and the limitations of the exercise itself. 
The agenda identifies the need for research that addresses the implications of the human microbiome for human health, public health, public and private sector research and notions of self and identity. It also suggests new lines of research sensitive to the complexity and heterogeneity of human–microbiome relations, and how these intersect with questions of environmental governance, social and spatial inequality and public engagement with science.

Journal ArticleDOI
TL;DR: This multidisciplinary paper proposes three hallmarks that can support robust international antibiotic policy, namely Structural, Equitable and Tracked, and argues that their consideration should aid the design and evaluation of international antibiotic policies with maximal benefit at both local and international scales.
Abstract: There is increasing concern globally about the enormity of the threats posed by antimicrobial resistance (AMR) to human, animal, plant and environmental health. A proliferation of international, national and institutional reports on the problems posed by AMR and the need for antibiotic stewardship have galvanised attention on the global stage. However, the AMR community increasingly laments a lack of action, often identified as an ‘implementation gap’. At a policy level, the design of internationally salient solutions that are able to address AMR’s interconnected biological and social (historical, political, economic and cultural) dimensions is not straightforward. This multidisciplinary paper responds by asking two basic questions: (A) Is a universal approach to AMR policy and antibiotic stewardship possible? (B) If yes, what hallmarks characterise ‘good’ antibiotic policy? Our multistage analysis revealed four central challenges facing current international antibiotic policy: metrics, prioritisation, implementation and inequality. In response to this diagnosis, we propose three hallmarks that can support robust international antibiotic policy. Emerging hallmarks for good antibiotic policies are: Structural, Equitable and Tracked. We describe these hallmarks and propose their consideration should aid the design and evaluation of international antibiotic policies with maximal benefit at both local and international scales.

Journal ArticleDOI
TL;DR: In this article, the uniqueness and non-degeneracy of positive radial solutions to the double power non-linearity problem were shown to imply the uniqueness of energy minimizers at fixed mass in certain regimes.
Abstract: In this paper we first prove a general result about the uniqueness and non-degeneracy of positive radial solutions to equations of the form $\Delta u+g(u)=0$. Our result applies in particular to the double power non-linearity where $g(u)=u^q-u^p-\mu u$ for $p>q>1$ and $\mu>0$, which we discuss in more detail. In this case, the non-degeneracy of the unique solution $u_\mu$ allows us to derive its behavior in the two limits $\mu \rightarrow 0$ and $\mu \rightarrow \mu_*$, where $\mu_*$ is the threshold of existence. This gives the uniqueness of energy minimizers at fixed mass in certain regimes. We also make a conjecture about the variations of the $L^2$ mass of $u_\mu$ in terms of $\mu$, which we illustrate with numerical simulations. If valid, this conjecture would imply the uniqueness of energy minimizers in all cases and also give some important information about the orbital stability of $u_\mu$.

Journal ArticleDOI
TL;DR: In this paper, the authors study variants of the SEIR model for interpreting some qualitative features of the statistics of the Covid-19 epidemic in France and propose a possible explanation that lockdown is creating social heterogeneity: even if a large majority of the population complies with the lockdown rules, a small fraction of population still has to maintain a normal or high level of social interactions.
Abstract: We study variants of the SEIR model for interpreting some qualitative features of the statistics of the Covid-19 epidemic in France. Standard SEIR models distinguish essentially two regimes: either the disease is controlled and the number of infected people rapidly decreases, or the disease spreads and contaminates a significant fraction of the population until herd immunity is achieved. After lockdown, at first sight it seems that social distancing is not enough to control the outbreak. We discuss here a possible explanation, namely that the lockdown is creating social heterogeneity: even if a large majority of the population complies with the lockdown rules, a small fraction of the population still has to maintain a normal or high level of social interactions, such as health workers, providers of essential services, etc. This results in an apparent high level of epidemic propagation as measured through re-estimations of the basic reproduction ratio. However, these measures are limited to averages, while variance inside the population plays an essential role on the peak and the size of the epidemic outbreak and tends to lower these two indicators. We provide theoretical and numerical results to sustain such a view.
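The variance effect described above can be illustrated with a toy two-group SIR model under proportionate mixing (a sketch with assumed parameters, not the paper's SEIR variants): both runs share the same average activity level, but splitting the population into low- and high-contact groups lowers the final epidemic size, because the high-contact group burns out early.

```python
# Sketch: SIR with activity-weighted (proportionate) mixing across groups.
def final_size(activities, shares, beta=0.3, gamma=0.1, steps=20000, dt=0.1):
    """Final epidemic size; activities are per-group contact multipliers."""
    i = [1e-3 * n for n in shares]
    s = [n - x for n, x in zip(shares, i)]
    mean_a = sum(a * n for a, n in zip(activities, shares))
    for _ in range(steps):
        # force of infection felt by a susceptible with activity a is a * lam
        lam = beta * sum(a * x for a, x in zip(activities, i)) / mean_a
        for k, a in enumerate(activities):
            new_inf = a * lam * s[k] * dt
            s[k] -= new_inf
            i[k] += new_inf - gamma * i[k] * dt
    return 1.0 - sum(s)

homogeneous = final_size([1.0], [1.0])
heterogeneous = final_size([0.5, 1.5], [0.5, 0.5])  # same mean activity of 1.0
```

An average-based reproduction-ratio estimate would treat both scenarios alike, yet their outcomes differ, which is the abstract's point about averages hiding the role of variance.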

Posted Content
TL;DR: The twin-width of a graph is defined as the minimum integer $d$ such that it has a sequence of iterated vertex identifications for which the overall maximum number of red edges incident to a single vertex is at most $d$, and the "small conjecture" that every small hereditary class has bounded twin-width is explored.
Abstract: The twin-width of a graph $G$ is the minimum integer $d$ such that $G$ has a $d$-contraction sequence, that is, a sequence of $|V(G)|-1$ iterated vertex identifications for which the overall maximum number of red edges incident to a single vertex is at most $d$, where a red edge appears between two sets of identified vertices if they are not homogeneous in $G$. We show that if a graph admits a $d$-contraction sequence, then it also has a linear-arity tree of $f(d)$-contractions, for some function $f$. First, this permits to show that every bounded twin-width class is small, i.e., has at most $n!c^n$ graphs labeled by $[n]$, for some constant $c$. This unifies and extends the same result for bounded treewidth graphs [Beineke and Pippert, JCT '69], proper subclasses of permutation graphs [Marcus and Tardos, JCTA '04], and proper minor-free classes [Norine et al., JCTB '06]. The second consequence is an $O(\log n)$-adjacency labeling scheme for bounded twin-width graphs, confirming several cases of the implicit graph conjecture. We then explore the "small conjecture" that, conversely, every small hereditary class has bounded twin-width. Inspired by sorting networks of logarithmic depth, we show that $\log_{\Theta(\log \log d)}n$-subdivisions of $K_n$ (a small class when $d$ is constant) have twin-width at most $d$. We obtain a rather sharp converse with a surprisingly direct proof: the $\log_{d+1}n$-subdivision of $K_n$ has twin-width at least $d$. Secondly, graphs with bounded stack or queue number (also small classes) have bounded twin-width. Thirdly, we show that cubic expanders obtained by iterated random 2-lifts from $K_4$ [Bilu and Linial, Combinatorica '06] have bounded twin-width, too. We suggest a promising connection between the small conjecture and group theory. Finally, we define a robust notion of sparse twin-width and discuss how it compares with other sparse classes.
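The red-edge bookkeeping behind a $d$-contraction sequence can be checked directly on small graphs. This brute-force sketch (illustrative only, not the paper's algorithms) recolors every pair of parts after each identification: an edge between two parts is red unless the parts are fully adjacent or fully non-adjacent in the original graph.

```python
# Sketch: verify the maximum red degree along a given contraction sequence.
from itertools import product

def max_red_degree(n, edges, merges):
    """Max red degree seen along a contraction sequence (brute force)."""
    adj = {frozenset(e) for e in edges}
    parts = [{v} for v in range(n)]

    def colour(a, b):
        hits = sum(frozenset((u, v)) in adj for u, v in product(a, b))
        if hits == 0:
            return "none"                  # fully non-adjacent
        if hits == len(a) * len(b):
            return "black"                 # fully adjacent (homogeneous)
        return "red"                       # mixed: a red edge

    worst = 0
    for x, y in merges:   # identify the part containing y into the one with x
        a = next(p for p in parts if x in p)
        b = next(p for p in parts if y in p)
        parts.remove(b)
        a |= b
        for p in parts:
            deg = sum(colour(p, q) == "red" for q in parts if q is not p)
            worst = max(worst, deg)
    return worst

# The path 0-1-2-3 admits a 1-contraction sequence.
path_tw = max_red_degree(4, [(0, 1), (1, 2), (2, 3)], [(0, 1), (0, 2), (0, 3)])
```

Contracting true twins (vertices with identical neighborhoods) never creates a red edge, which is why cographs have twin-width 0; the path example above needs one red edge at each step.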

Journal ArticleDOI
08 Dec 2020-PLOS ONE
TL;DR: Optimal control analysis shows that an increasing dose titration protocol, a very common clinical dosing process, can achieve tumor stabilization for a wide range of potential initial tumor compositions and volumes.
Abstract: In the absence of curative therapies, treatment of metastatic castrate-resistant prostate cancer (mCRPC) using currently available drugs can be improved by integrating evolutionary principles that govern proliferation of resistant subpopulations into current treatment protocols. Here we develop what is coined an ‘evolutionary stable therapy’, within the context of the mathematical model that has been used to inform the first adaptive therapy clinical trial of mCRPC. The objective of this therapy is to maintain a stable polymorphic tumor heterogeneity of therapy-sensitive and therapy-resistant cells in order to prolong treatment efficacy and progression-free survival. Optimal control analysis shows that an increasing dose titration protocol, a very common clinical dosing process, can achieve tumor stabilization for a wide range of potential initial tumor compositions and volumes. Furthermore, larger tumor volumes may, counterintuitively, be more likely to be stabilized if sensitive cells dominate the tumor composition at the time of initial treatment, suggesting that delaying initial treatment could prove beneficial. While it remains uncertain if metastatic disease in humans has the properties that allow it to be truly stabilized, the benefits of a dose titration protocol warrant additional pre-clinical and clinical investigations.

Proceedings ArticleDOI
01 Nov 2020
TL;DR: The notion of twin-width on graphs and on matrices is introduced in this paper, inspired by a width invariant defined on permutations by Guillemot and Marx, and FO model checking is shown to be fixed-parameter tractable on classes of bounded twin-width, provided a $d$-contraction sequence witnessing the width is given.
Abstract: Inspired by a width invariant defined on permutations by Guillemot and Marx [SODA '14], we introduce the notion of twin-width on graphs and on matrices. Proper minor-closed classes, bounded rank-width graphs, map graphs, $K_t$-free unit $d$-dimensional ball graphs, posets with antichains of bounded size, and proper subclasses of dimension-2 posets all have bounded twin-width. On all these classes (except map graphs without geometric embedding) we show how to compute in polynomial time a sequence of $d$-contractions, witnessing that the twin-width is at most $d$. We show that FO model checking, that is, deciding if a given first-order formula $\phi$ evaluates to true for a given binary structure $G$ on a domain $D$, is FPT in $\vert \phi\vert$ on classes of bounded twin-width, provided the witness is given. More precisely, given a $d$-contraction sequence for $G$, our algorithm runs in time $f(d,\vert \phi\vert)\cdot\vert D\vert$ where $f$ is a computable but non-elementary function. We also prove that bounded twin-width is preserved by FO interpretations and transductions (allowing operations such as squaring or complementing a graph). This unifies and significantly extends the knowledge on fixed-parameter tractability of FO model checking on non-monotone classes, such as the FPT algorithm on bounded-width posets by Gajarský et al. [FOCS '15].

Journal ArticleDOI
TL;DR: It is proposed that the aryl hydrocarbon receptor constitutes a valuable target to protect intestinal functions in metabolic diseases, which can be achieved in the future via food or drug ligands.
Abstract: Objective Obesity is characterized by systemic and low-grade tissue inflammation. In the intestine, alteration of the intestinal barrier and accumulation of inflammatory cells in the epithelium are important contributors of gut inflammation. Recent studies demonstrated the role of the aryl hydrocarbon receptor (AhR) in the maintenance of immune cells at mucosal barrier sites. A wide range of ligands of external and local origin can activate this receptor. We studied the causal relationship between AhR activation and gut inflammation in obesity. Methods Jejunum samples from subjects with normal weight and severe obesity were phenotyped according to T lymphocyte infiltration in the epithelium from lamina propria and assayed for the mRNA level of AhR target genes. The effect of an AhR agonist was studied in mice and Caco-2/TC7 cells. AhR target gene expression, permeability to small molecules and ions, and location of cell-cell junction proteins were recorded under conditions of altered intestinal permeability. Results We showed that a low AhR tone correlated with a high inflammatory score in the intestinal epithelium in severe human obesity. Moreover, AhR activation protected junctional complexes in the intestinal epithelium in mice challenged by an oral lipid load. AhR ligands prevented chemically induced damage to barrier integrity and cytokine expression in Caco-2/TC7 cells. The PKC and p38MAPK signaling pathways were involved in this AhR action. Conclusions The results of these series of human, mouse, and cell culture experiments demonstrate the protective effect of AhR activation in the intestine targeting particularly tight junctions and cytokine expression. We propose that AhR constitutes a valuable target to protect intestinal functions in metabolic diseases, which can be achieved in the future via food or drug ligands.

Journal ArticleDOI
TL;DR: The collaborative economy is grounded in communities as discussed by the authors, and communities and platforms pervade all aspects of the collaborative economy. Yet, they exist in apparent tension, and these are typically characteri...
Abstract: Communities and platforms pervade all aspects of the collaborative economy. Yet, they exist in apparent tension. The collaborative economy is grounded in communities. These are typically characteri...

Book ChapterDOI
01 Jan 2020
TL;DR: In this article, the authors introduce the Mean Field Game (MFG) theory, which models differential games involving infinitely many interacting players, and focus on the Partial Differential Equations (PDEs) approach to MFGs.
Abstract: These notes are an introduction to Mean Field Game (MFG) theory, which models differential games involving infinitely many interacting players. We focus here on the Partial Differential Equations (PDEs) approach to MFGs. The two main parts of the text correspond to the two emblematic equations in MFG theory: the first part is dedicated to the MFG system, while the second part is devoted to the master equation.
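For orientation, the MFG system referred to above is classically written as a backward Hamilton–Jacobi–Bellman equation for the value function $u$ coupled with a forward Fokker–Planck equation for the population density $m$. In a standard form from the PDE literature (notation not quoted from the text):

```latex
\begin{cases}
-\partial_t u - \nu \Delta u + H(x, Du) = f(x, m(t)) & \text{in } (0,T)\times\mathbb{R}^d,\\
\partial_t m - \nu \Delta m - \operatorname{div}\!\bigl(m\, D_p H(x, Du)\bigr) = 0 & \text{in } (0,T)\times\mathbb{R}^d,\\
m(0) = m_0, \qquad u(T,x) = g(x, m(T)).
\end{cases}
```

The master equation, treated in the second part, subsumes this system into a single equation for a function of both the state $x$ and the measure $m$.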

Journal ArticleDOI
TL;DR: In this article, the bottom of the spectrum of the Anderson Hamiltonian with Dirichlet or Neumann boundary conditions was studied, and it was shown that the shape of each eigenfunction, recentered around its maximum and properly rescaled, is given by the inverse of a hyperbolic cosine.
Abstract: We study the bottom of the spectrum of the Anderson Hamiltonian $\mathcal{H}_L := -\partial_x^2 + \xi$ on $[0, L]$ driven by a white noise $\xi$ and endowed with either Dirichlet or Neumann boundary conditions. We show that, as $L\rightarrow \infty$, the point process of the (appropriately shifted and rescaled) eigenvalues converges to a Poisson point process on $\mathbf{R}$ with intensity $e^x\,dx$, and that the (appropriately rescaled) eigenfunctions converge to Dirac masses located at independent and uniformly distributed points. Furthermore, we show that the shape of each eigenfunction, recentered around its maximum and properly rescaled, is given by the inverse of a hyperbolic cosine. We also show that the eigenfunctions decay exponentially from their localization centers at an explicit rate, and we obtain very precise information on the zeros and local maxima of these eigenfunctions. Finally, we show that the eigenvalues/eigenfunctions in the Dirichlet and Neumann cases are very close to each other and converge to the same limits.

Proceedings Article
12 Jul 2020
TL;DR: It is shown that, under mild conditions on the dataset distribution, any deterministic classifier can be outperformed by a randomized one, which gives arguments for using randomization and leads to a new algorithm for building randomized classifiers that are robust to strong adversarial attacks.
Abstract: Is there a classifier that ensures optimal robustness against all adversarial attacks? This paper answers this question by adopting a game-theoretic point of view. We show that adversarial attacks and defenses form an infinite zero-sum game where classical results (e.g. Sion's theorem) do not apply. We demonstrate the non-existence of a Nash equilibrium in our game when the classifier and the Adversary are both deterministic, hence giving a negative answer to the above question in the deterministic regime. Nonetheless, the question remains open in the randomized regime. We tackle this problem by showing that, under mild conditions on the dataset distribution, any deterministic classifier can be outperformed by a randomized one. This gives arguments for using randomization, and leads us to a new algorithm for building randomized classifiers that are robust to strong adversarial attacks. Empirical results validate our theoretical analysis, and show that our defense method considerably outperforms Adversarial Training against state-of-the-art attacks.
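The advantage of randomization can be seen already in a two-by-two toy game. The sketch below is an illustration of the general principle, not the paper's defense algorithm: it compares the worst-case loss of a deterministic classifier with that of a uniform mixture when each pure classifier is fully defeated by one attack.

```python
# Rows: deterministic classifiers, columns: attacks (toy loss values).
# Each pure classifier loses completely to exactly one attack.
L = [[1.0, 0.0],
     [0.0, 1.0]]

def worst_case_loss(p, L):
    """Expected loss of mixture p over classifiers, against the
    best-responding (loss-maximizing) attack."""
    n_attacks = len(L[0])
    return max(sum(p[i] * L[i][j] for i in range(len(L)))
               for j in range(n_attacks))

pure = worst_case_loss([1.0, 0.0], L)    # any deterministic choice: loss 1.0
mixed = worst_case_loss([0.5, 0.5], L)   # uniform randomization: loss 0.5
```

The attacker best-responds to whichever classifier is played; only by mixing does the defender deny the attacker a single profitable target, halving the worst-case loss.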

Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced computational cost.
Abstract: We consider the problem of model reduction of parametrized PDEs where the goal is to approximate any function belonging to the set of solutions at a reduced computational cost. For this, most strategies have so far been based on the approximation of the solution set by linear spaces on Hilbert or Banach spaces. This approach can be expected to be successful only when the Kolmogorov width of the set decays fast. While this is the case for certain parabolic or elliptic problems, most transport-dominated problems are expected to present a slowly decaying width and require the study of nonlinear approximation methods. In this work, we propose to address the reduction problem from the perspective of general metric spaces with a suitably defined notion of distance. We develop and compare two different approaches, one based on barycenters and another one using tangent spaces when the metric space has an additional Riemannian structure. Since the notion of linear vector spaces does not exist in general metric spaces, both approaches result in nonlinear approximation methods. We give theoretical and numerical evidence of their efficiency to reduce complexity for one-dimensional conservative PDEs where the underlying metric space can be chosen to be the $L^2$-Wasserstein space.
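In one dimension, the $L^2$-Wasserstein barycenter has a classical closed form: its quantile function is the weighted average of the input quantile functions. The sketch below illustrates this standard fact for empirical measures with equally many atoms; it is not the reduction method of the paper.

```python
def wasserstein_barycenter_1d(samples, weights):
    """W2 barycenter of 1-D empirical measures with equally many atoms.

    In one dimension, the barycenter's quantile function is the weighted
    average of the input quantile functions; for equal-size samples this
    reduces to averaging sorted values position by position.
    """
    sorted_samples = [sorted(s) for s in samples]
    n = len(sorted_samples[0])
    assert all(len(s) == n for s in sorted_samples), "equal sizes required"
    return [sum(w * s[k] for w, s in zip(weights, sorted_samples))
            for k in range(n)]
```

For instance, the equal-weight barycenter of the atom sets {0, 1, 2} and {4, 5, 6} is {2, 3, 4}: the measure is translated halfway, rather than smeared over both supports as an L2 average of densities would be.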

Journal ArticleDOI
TL;DR: In this article, the authors proposed a Multiple Criteria Decision Analysis (MCDA) method aimed to elaborate walkability decision maps for different groups of citizens that reflect their capability to walk in the urban environment.
Abstract: The analysis of urban walkability has been extensively explored in the last decades. Despite this growing attention, there is a lack of studies attentive to how citizens' values, individual abilities and the urban environment favour or hinder the propensity to walk. Hence, there is a need to explore how preferences and values of citizens vary in space in order to design walkability policies able to improve the capability set of citizens. In this perspective, the design of spatial decision tools aimed at planning public policies for the development of walkable cities needs further investigation. We propose a Multiple Criteria Decision Analysis (MCDA) method aimed at elaborating walkability decision maps for different groups of citizens that reflect their capability to walk in the urban environment. We tested the method in the city of Alghero (Italy). First, we analysed walkability under a normative model named CAWS; then we conducted a survey with 358 participants in order to study the driving values that influence their choice to walk and to build an evaluation model attentive to individual differences. Cluster analysis was employed to group citizens into 11 groups based on their sociodemographic characteristics and preferences on spatial criteria of walkability. Finally, by integrating GIS with MCDA, we built a set of decision maps representative of the walkability of the 11 groups of citizens. Results highlight the importance of citizens’ values for policy design, allow the interpersonal comparison of individual and group preferences, and give new suggestions for the formulation of walkability-oriented urban policies. Moreover, the results confirm the usability of the general method as a decision support tool for the design of urban policies.
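A decision map of this kind can be sketched as a weighted sum of normalised criterion layers, with one weight vector per citizen group. The fragment below is a hypothetical illustration: criterion names, weights and cell values are invented, and the paper's CAWS model and group-specific evaluation are richer than a plain weighted sum.

```python
def walkability_scores(criteria, weights):
    """Weighted-sum MCDA score per spatial cell.

    criteria: dict criterion_name -> list of normalised values in [0, 1],
              one value per cell of the map.
    weights:  dict criterion_name -> weight for one group of citizens
              (assumed to sum to 1).
    """
    names = list(criteria)
    n_cells = len(criteria[names[0]])
    return [sum(weights[c] * criteria[c][i] for c in names)
            for i in range(n_cells)]

# Two hypothetical criteria over three map cells, for one citizen group:
criteria = {"sidewalk_quality": [0.8, 0.2, 0.5],
            "green_areas":      [0.4, 0.9, 0.5]}
weights = {"sidewalk_quality": 0.7, "green_areas": 0.3}
scores = walkability_scores(criteria, weights)
```

Re-running the same layers with a different group's weight vector yields a different map, which is the mechanism behind the 11 group-specific decision maps described above.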