
Showing papers published by the University of North Carolina at Charlotte in 2021


Journal ArticleDOI
TL;DR: In this article, the authors present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes.
Abstract: In 2008, we published the first set of guidelines for standardizing research in autophagy. Since then, this topic has received increasing attention, and many scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Thus, it is important to formulate on a regular basis updated guidelines for monitoring autophagy in different organisms. Despite numerous reviews, there continues to be confusion regarding acceptable methods to evaluate autophagy, especially in multicellular eukaryotes. Here, we present a set of guidelines for investigators to select and interpret methods to examine autophagy and related processes, and for reviewers to provide realistic and reasonable critiques of reports that are focused on these processes. These guidelines are not meant to be a dogmatic set of rules, because the appropriateness of any assay largely depends on the question being asked and the system being used. Moreover, no individual assay is perfect for every situation, calling for the use of multiple techniques to properly monitor autophagy in each experimental setting. Finally, several core components of the autophagy machinery have been implicated in distinct autophagic processes (canonical and noncanonical autophagy), implying that genetic approaches to block autophagy should rely on targeting two or more autophagy-related genes that ideally participate in distinct steps of the pathway. Along similar lines, because multiple proteins involved in autophagy also regulate other cellular pathways including apoptosis, not all of them can be used as a specific marker for bona fide autophagic responses. Here, we critically discuss current methods of assessing autophagy and the information they can, or cannot, provide. Our ultimate goal is to encourage intellectual and technical innovation in the field.

1,129 citations


Journal ArticleDOI
TL;DR: Total fluorine (TF) measurements complemented by suspect screening using high resolution mass spectrometry are emerging as essential tools for PFAS exposure assessment to better understand contributions from precursor compounds that degrade into terminal perfluoroalkyl acids (PFAA).
Abstract: We synthesize current understanding of the magnitudes and methods for assessing human and wildlife exposures to poly- and perfluoroalkyl substances (PFAS). Most human exposure assessments have focused on 2 to 5 legacy PFAS, and wildlife assessments are typically limited to targeted PFAS (up to ~30 substances). However, shifts in chemical production are occurring rapidly, and targeted methods for detecting PFAS have not kept pace with these changes. Total fluorine measurements complemented by suspect screening using high-resolution mass spectrometry are thus emerging as essential tools for PFAS exposure assessment. Such methods enable researchers to better understand contributions from precursor compounds that degrade into terminal perfluoroalkyl acids. Available data suggest that diet is the major human exposure pathway for some PFAS, but there is large variability across populations and PFAS compounds. Additional data on total fluorine in exposure media and the fraction of unidentified organofluorine are needed. Drinking water has been established as the major exposure source in contaminated communities. As water supplies are remediated, for the general population, exposures from dust, personal care products, indoor environments, and other sources may be more important. A major challenge for exposure assessments is the lack of statistically representative population surveys. For wildlife, bioaccumulation processes differ substantially between PFAS and neutral lipophilic organic compounds, prompting a reevaluation of traditional bioaccumulation metrics. There is evidence that both phospholipids and proteins are important for the tissue partitioning and accumulation of PFAS. New mechanistic models for PFAS bioaccumulation are being developed that will assist in wildlife risk evaluations. Environ Toxicol Chem 2021;40:631-657. © 2020 SETAC.

212 citations


Journal ArticleDOI
TL;DR: In this paper, the authors report the outcomes of a wastewater surveillance pilot program at the University of North Carolina at Charlotte, a large urban university with a substantial population of students living in on-campus dormitories.

163 citations


Journal ArticleDOI
20 Feb 2021
TL;DR: This article begins with a brief history of freeform optics, focusing on imaging systems, including marketplace emergence, and describes fabrication methods, emphasizing deterministic computer numerical control grinding, polishing, and diamond machining.
Abstract: In the last 10 years, freeform optics has enabled compact and high-performance imaging systems. This article begins with a brief history of freeform optics, focusing on imaging systems, including marketplace emergence. The development of this technology is motivated by the clear opportunity to enable science across a wide range of applications, spanning from extreme ultraviolet lithography to space optics. Next, we define freeform optics and discuss concurrent engineering that brings together design, fabrication, testing, and assembly into one process. We then lay out the foundations of the aberration theory for freeform optics and emerging design methodologies. We describe fabrication methods, emphasizing deterministic computer numerical control grinding, polishing, and diamond machining. Next, we consider mid-spatial frequency errors that inherently result from freeform fabrication techniques. We realize that metrologies of freeform optics are simultaneously sparse in their existence but diverse in their potential. Thus, we focus on metrology techniques demonstrated for the measurement of freeform optics. We conclude this review with an outlook on the future of freeform optics.

123 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide an extensive survey of the applications and examples where hydrodynamic instabilities play a central role, including solar prominences, ionospheric flows in space, supernovae, inertial fusion and pulsed-power experiments, pulsed detonation engines and Scramjets.

123 citations


Posted Content
TL;DR: PoseFormer, as discussed by the authors, is a purely transformer-based approach for 3D human pose estimation in videos without convolutional architectures involved, and it achieves state-of-the-art performance on two popular and standard benchmark datasets.
Abstract: Transformer architectures have become the model of choice in natural language processing and are now being introduced into computer vision tasks such as image classification, object detection, and semantic segmentation. However, in the field of human pose estimation, convolutional architectures still remain dominant. In this work, we present PoseFormer, a purely transformer-based approach for 3D human pose estimation in videos without convolutional architectures involved. Inspired by recent developments in vision transformers, we design a spatial-temporal transformer structure to comprehensively model the human joint relations within each frame as well as the temporal correlations across frames, then output an accurate 3D human pose of the center frame. We quantitatively and qualitatively evaluate our method on two popular and standard benchmark datasets: Human3.6M and MPI-INF-3DHP. Extensive experiments show that PoseFormer achieves state-of-the-art performance on both datasets. Code is available at this https URL
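The spatial-temporal design described above lends itself to a compact illustration. The following is a minimal sketch, not the authors' released code: the joint count, clip length, and embedding sizes are toy placeholders, and stock transformer encoders stand in for the paper's tailored modules. It models joint relations within each frame, then correlations across frames, and regresses the 3D pose of the center frame.

```python
# Minimal sketch of a spatial-temporal transformer for 2D-to-3D pose lifting.
# All dimensions are illustrative assumptions, not PoseFormer's actual config.
import torch
import torch.nn as nn

class SpatialTemporalSketch(nn.Module):
    def __init__(self, num_joints=17, num_frames=9, dim=32):
        super().__init__()
        self.joint_embed = nn.Linear(2, dim)  # lift each 2D joint to a token
        spatial_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.spatial = nn.TransformerEncoder(spatial_layer, num_layers=2)
        temporal_layer = nn.TransformerEncoderLayer(d_model=num_joints * dim, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(temporal_layer, num_layers=2)
        self.head = nn.Linear(num_joints * dim, num_joints * 3)  # 3D pose, center frame

    def forward(self, x):                        # x: (batch, frames, joints, 2)
        b, f, j, _ = x.shape
        tokens = self.joint_embed(x).reshape(b * f, j, -1)
        tokens = self.spatial(tokens)            # joint relations within each frame
        frames = tokens.reshape(b, f, -1)
        frames = self.temporal(frames)           # temporal correlations across frames
        return self.head(frames[:, f // 2]).reshape(b, j, 3)

pose2d = torch.randn(4, 9, 17, 2)                # 4 clips, 9 frames, 17 joints
print(SpatialTemporalSketch()(pose2d).shape)     # torch.Size([4, 17, 3])
```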

111 citations


Journal ArticleDOI
TL;DR: Research suggests youth with disabilities are less likely to experience positive outcomes than peers without disabilities, motivating the identification of in-school predictors of postschool success.
Abstract: Research suggests youth with disabilities are less likely to experience positive outcomes compared to peers without disabilities. Identification of in-school predictors of postschool success can pr...

110 citations


Journal ArticleDOI
TL;DR: In this article, composite materials with graphene reinforcement, which have attracted commercial notice in engineering applications owing to the rapid development of process manufacturing, are considered.
Abstract: Due to the rapid development of process manufacturing, composite materials with graphene reinforcement have attracted commercial notice in engineering applications. In this regard, vibr...

103 citations


Journal ArticleDOI
TL;DR: The findings, based on a sample of 509 Spanish small and medium-sized enterprises (SMEs), suggest that not all DC dimensions are equally important for SME performance.
Abstract: We investigate how the four dimensions of the dynamic capabilities (DC) construct (sensing, learning, integrating, and coordinating) individually affect firm performance and the moderating role of ...

102 citations


Journal ArticleDOI
Chloe Mirzayi1, Audrey Renson2, Massive Analysis1, Fatima Zohra1, Shaimaa Elsafoury1, Ludwig Geistlinger1, Lora J. Kasselman1, Kelly Eckenrode3, Janneke van de Wijgert4, Amy Loughman5, Francine Z Marques6, David A MacIntyre7, Manimozhiyan Arumugam1, Rimsha Azhar8, Francesco Beghini9, Kirk Bergstrom10, Ami Bhatt11, Jordan E Bisanz12, Jonathan Braun13, Hector Corrada Bravo14, Gregory A Buck15, Frederic D. Bushman12, David Casero16, Gerard Clarke17, Maria Carmen Collado18, Maria Carmen Collado16, Paul D. Cotter16, John F. Cryan19, Ryan T Demmer12, Suzanne Devkota20, Eran Elinav, Juan S Escobar14, Jennifer Fettweis21, Robert D. Finn22, Anthony A. Fodor23, Sofia Forslund24, Andre Franke, Cesare Furlanello25, Jack Gilbert15, Elizabeth Grice26, Benjamin Haibe-Kains27, Scott Handley28, Pamela Herd10, Susan Holmes29, Jonathan P Jacobs30, Lisa Karstens25, Rob Knight19, Dan Knights31, Omry Koren32, Douglas S Kwon33, Morgan G. I. Langille34, Brianna Lindsay12, Dermot P.B. McGovern, Alice C. McHardy30, Shannon McWeeney35, Noel T. Mueller, Luigi Nezi10, Matthew Olm36, Noah Palm37, Edoardo Pasolli38, Jeroen Raes2, Matthew R. Redinbo24, Malte Rühlemann2, R Balfour Sartor39, Patrick D. Schloss34, Lynn Schriml20, Eran Segal34, Michelle Shardell40, Thomas Sharpton14, Ekaterina Smirnova41, Harry Sokol10, Justin L Sonnenburg42, Sujatha Srinivasan24, Louise B. Thingholm43, Peter J. Turnbaugh43, Vaibhav Upadhyay44, Ramona L Walls45, Paul Wilmes46, Takuji Yamada, Georg Zeller35, Mingyu Zhang35, Ni Zhao47, Liping Zhao48, Wenjun Bao32, Aedin Culhane49, Viswanath Devanarayan, Joaquin Dopazo50, Xiaohui Fan51, Xiaohui Fan52, Matthias Fischer53, Wendell D. Jones, Rebecca Kusko54, Christopher E. Mason55, Tim R Mercer56, Susanna-Assunta Sansone57, Andreas Scherer58, Leming Shi59, Shraddha Thakkar60, Weida Tong48, Russell D. Wolfinger, Christopher Hunter8, Nicola Segata32, Curtis Huttenhower56, Jennifer B Dowd1, Heidi E. Jones1, Levi Waldron1 
The Graduate Center, CUNY1, University of North Carolina at Chapel Hill2, Utrecht University3, Deakin University4, Monash University5, Imperial College London6, University of Copenhagen7, University of Trento8, University of British Columbia9, Stanford University10, Pennsylvania State University11, Cedars-Sinai Medical Center12, Genentech13, Virginia Commonwealth University14, University of Pennsylvania15, University College Cork16, National Research Council17, Teagasc18, University of Minnesota19, Weizmann Institute of Science20, European Bioinformatics Institute21, University of North Carolina at Charlotte22, Max Delbrück Center for Molecular Medicine23, University of Kiel24, University of California, San Diego25, Princess Margaret Cancer Centre26, Washington University in St. Louis27, Georgetown University28, University of California, Los Angeles29, Oregon Health & Science University30, Bar-Ilan University31, Harvard University32, Dalhousie University33, University of Maryland, Baltimore34, Johns Hopkins University35, Yale University36, University of Naples Federico II37, Katholieke Universiteit Leuven38, University of Michigan39, Oregon State University40, University of Paris41, Fred Hutchinson Cancer Research Center42, University of California, San Francisco43, Critical Path Institute44, University of Luxembourg45, Tokyo Institute of Technology46, Rutgers University47, SAS Institute48, Eisai49, Zhejiang University50, University of Cologne51, Boston Children's Hospital52, Durham University53, Cornell University54, University of Queensland55, University of Oxford56, University of Helsinki57, Fudan University58, Center for Drug Evaluation and Research59, National Center for Toxicological Research60
TL;DR: The STORMS tool as mentioned in this paper is composed of a 17-item checklist organized into six sections that correspond to the typical sections of a scientific publication, presented as an editable table for inclusion in supplementary materials.
Abstract: The particularly interdisciplinary nature of human microbiome research makes the organization and reporting of results spanning epidemiology, biology, bioinformatics, translational medicine and statistics a challenge. Commonly used reporting guidelines for observational or genetic epidemiology studies lack key features specific to microbiome studies. Therefore, a multidisciplinary group of microbiome epidemiology researchers adapted guidelines for observational and genetic studies to culture-independent human microbiome studies, and also developed new reporting elements for laboratory, bioinformatics and statistical analyses tailored to microbiome studies. The resulting tool, called 'Strengthening The Organization and Reporting of Microbiome Studies' (STORMS), is composed of a 17-item checklist organized into six sections that correspond to the typical sections of a scientific publication, presented as an editable table for inclusion in supplementary materials. The STORMS checklist provides guidance for concise and complete reporting of microbiome studies that will facilitate manuscript preparation, peer review, and reader comprehension of publications and comparative analysis of published results.

99 citations


Journal ArticleDOI
TL;DR: In this paper, the authors synthesize recent research on pandemic-related changes to work and family in the United States, and apply an intersectionality lens to discuss the gendered implications of these changes.
Abstract: The COVID-19 pandemic has affected nearly all aspects of society since its onset in early 2020. In addition to infecting and taking the lives of millions of global citizens, the pandemic has fundamentally changed family and work patterns. The pandemic and associated mitigation measures have increased unemployment rates, amplified health risks for essential workers required to work on-site, and led to unprecedented rates of telecommuting. Additionally, due to school/daycare closures and social distancing, many parents have lost access to institutional and informal childcare support during the COVID-19 crisis. Such losses in childcare support have significantly impacted the paid and unpaid labor of parents, particularly of mothers. In this article, we synthesize recent research on pandemic-related changes to work and family in the United States. Applying an intersectionality lens, we discuss the gendered implications of these changes. Because gender inequalities in family and work are connected, COVID-19 has, in many cases, deepened the pre-existing gender inequalities in both realms.

Journal ArticleDOI
01 Oct 2021
TL;DR: In this paper, the authors used data on the durability of immunity among evolutionarily close coronavirus relatives of SARS-CoV-2 to estimate times to reinfection by a comparative evolutionary analysis of related viruses.
Abstract: Background: Among the most consequential unknowns of the devastating COVID-19 pandemic are the durability of immunity and time to likely reinfection. There are limited direct data on SARS-CoV-2 long-term immune responses and reinfection. The aim of this study is to use data on the durability of immunity among evolutionarily close coronavirus relatives of SARS-CoV-2 to estimate times to reinfection by a comparative evolutionary analysis of the related viruses SARS-CoV, MERS-CoV, human coronavirus (HCoV)-229E, HCoV-OC43, and HCoV-NL63. Methods: We conducted phylogenetic analyses of the S, M, and ORF1b genes to reconstruct a maximum-likelihood molecular phylogeny of human-infecting coronaviruses. This phylogeny enabled comparative analyses of peak-normalised nucleocapsid protein, spike protein, and whole-virus lysate IgG antibody optical density levels, in conjunction with reinfection data on endemic human-infecting coronaviruses. We performed ancestral and descendent states analyses to estimate the expected declines in antibody levels over time, the probabilities of reinfection based on antibody level, and the anticipated times to reinfection after recovery under conditions of endemic transmission for SARS-CoV-2, as well as the other human-infecting coronaviruses. Findings: We obtained antibody optical density data for six human-infecting coronaviruses, extending from 128 days to 28 years after infection between 1984 and 2020. These data provided a means to estimate profiles of the typical antibody decline and probabilities of reinfection over time under endemic conditions. Reinfection by SARS-CoV-2 under endemic conditions would likely occur between 3 months and 5.1 years after peak antibody response, with a median of 16 months. This protection is less than half the duration revealed for the endemic coronaviruses circulating among humans (5–95% quantiles 15 months to 10 years for HCoV-OC43, 31 months to 12 years for HCoV-NL63, and 16 months to 12 years for HCoV-229E). For SARS-CoV, the 5–95% quantiles were 4 months to 6 years, whereas the 95% quantiles for MERS-CoV were inconsistent by dataset. Interpretation: The timeframe for reinfection is fundamental to numerous aspects of public health decision making. As the COVID-19 pandemic continues, reinfection is likely to become increasingly common. Maintaining public health measures that curb transmission—including among individuals who were previously infected with SARS-CoV-2—coupled with persistent efforts to accelerate vaccination worldwide is critical to the prevention of COVID-19 morbidity and mortality. Funding: US National Science Foundation.
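The study's estimates come from phylogenetic ancestral and descendent states analyses, not from a simple parametric fit; still, a toy model can illustrate the two quantities being combined: antibody levels that decline after the peak response, and a reinfection probability that rises as they fall. In the sketch below, the exponential half-life and logistic parameters are made-up placeholders, chosen only for illustration.

```python
# Purely illustrative toy (not the paper's phylogenetic method) of combining
# antibody waning with a level-to-reinfection-probability mapping.
import math

def antibody_level(t_months, half_life_months=12.0):
    """Peak-normalised IgG level under assumed simple exponential waning."""
    return math.exp(-math.log(2) * t_months / half_life_months)

def p_reinfection(level, midpoint=0.2, steepness=10.0):
    """Assumed logistic map from antibody level to reinfection probability."""
    return 1.0 / (1.0 + math.exp(steepness * (level - midpoint)))

for t in (3, 16, 61):  # months after peak antibody response
    lvl = antibody_level(t)
    print(f"t={t:3d} mo  level={lvl:.2f}  P(reinfection)~{p_reinfection(lvl):.2f}")
```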

Proceedings ArticleDOI
01 Feb 2021
TL;DR: This paper proposes a flexible and optimized dataflow for GCNs that simultaneously improves resource utilization and reduces data movement, and introduces a novel accelerator architecture called GCNAX, which tailors the compute engine, buffer structure and size based on the proposed dataflow.
Abstract: Graph convolutional neural networks (GCNs) have emerged as an effective approach to extend deep learning for graph data analytics. Given that graphs are usually irregular, as nodes in a graph may have a varying number of neighbors, processing GCNs efficiently poses a significant challenge on the underlying hardware. Although specialized GCN accelerators have been proposed to deliver better performance over generic processors, prior accelerators not only under-utilize the compute engine, but also impose redundant data accesses that reduce throughput and energy efficiency. Therefore, optimizing the overall flow of data between compute engines and memory, i.e., the GCN dataflow, which maximizes utilization and minimizes data movement, is crucial for achieving efficient GCN processing. In this paper, we propose a flexible and optimized dataflow for GCNs that simultaneously improves resource utilization and reduces data movement. This is realized by fully exploring the design space of GCN dataflows and evaluating the number of execution cycles and DRAM accesses through an analysis framework. Unlike prior GCN dataflows, which employ rigid loop orders and loop fusion strategies, the proposed dataflow can reconfigure the loop order and loop fusion strategy to adapt to different GCN configurations, which results in much improved efficiency. We then introduce a novel accelerator architecture called GCNAX, which tailors the compute engine, buffer structure and size based on the proposed dataflow. Evaluated on five real-world graph datasets, our simulation results show that GCNAX reduces DRAM accesses by a factor of 8.1× and 2.4×, while achieving 8.9× and 1.6× speedup and 9.5× and 2.3× energy savings on average over HyGCN and AWB-GCN, respectively.
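The loop-order and loop-fusion choices the abstract refers to have a simple software analogue: a GCN layer computes A·X·W, and whether aggregation (A·X) or combination (X·W) happens first changes both the intermediate buffer size and the multiply count. The sketch below is an illustrative numpy analogy of that design-space trade-off, not the GCNAX hardware model; the matrix sizes and sparsity are placeholders.

```python
# Why GCN "dataflow" matters: the two orders of A @ X @ W give the same
# result but different intermediate sizes and dense-equivalent FLOP counts.
import numpy as np

n, f_in, f_out = 1000, 256, 64                           # nodes, in/out features
A = (np.random.rand(n, n) < 0.005).astype(np.float32)    # sparse-ish adjacency
X = np.random.rand(n, f_in).astype(np.float32)           # node features
W = np.random.rand(f_in, f_out).astype(np.float32)       # layer weights

# Order 1: (A @ X) @ W -- aggregation first, intermediate buffer is n x f_in
flops1 = 2 * n * n * f_in + 2 * n * f_in * f_out
# Order 2: A @ (X @ W) -- combination first, intermediate buffer is n x f_out
flops2 = 2 * n * f_in * f_out + 2 * n * n * f_out

out1 = (A @ X) @ W
out2 = A @ (X @ W)
assert np.allclose(out1, out2, rtol=1e-3)                # same math either way
print(f"dense-equivalent FLOPs, aggregation-first: {flops1:.2e}")
print(f"dense-equivalent FLOPs, combination-first: {flops2:.2e}")
```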

Proceedings ArticleDOI
20 Jun 2021
TL;DR: This work proposes a dynamic prototype unit (DPU) to encode the normal dynamics as prototypes in real time, free from extra memory cost, and introduces meta-learning to the authors' DPU to form a novel few-shot normalcy learner, namely Meta-Prototype Unit (MPU).
Abstract: Frame reconstruction (current or future frame) based on Auto-Encoder (AE) is a popular method for video anomaly detection. With models trained on the normal data, the reconstruction errors of anomalous scenes are usually much larger than those of normal ones. Previous methods introduced the memory bank into AE, for encoding diverse normal patterns across the training videos. However, they are memory-consuming and cannot cope with unseen new scenarios in the testing data. In this work, we propose a dynamic prototype unit (DPU) to encode the normal dynamics as prototypes in real time, free from extra memory cost. In addition, we introduce meta-learning to our DPU to form a novel few-shot normalcy learner, namely Meta-Prototype Unit (MPU). It enables the fast adaption capability on new scenes by only consuming a few iterations of update. Extensive experiments are conducted on various benchmarks. The superior performance over the state-of-the-art demonstrates the effectiveness of our method. Our code is available at https://github.com/ktr-hubrt/MPN/.
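The premise underlying this line of work is that an autoencoder trained only on normal data reconstructs anomalies poorly. Below is a minimal sketch of that scoring principle, not of the DPU/MPU modules themselves; the tiny fully-connected autoencoder and random data standing in for video frames are illustrative assumptions.

```python
# Reconstruction-error anomaly scoring: train an AE on "normal" data only,
# then flag inputs whose per-frame reconstruction error is large.
import torch
import torch.nn as nn

ae = nn.Sequential(                                  # tiny AE over 32x32 frames
    nn.Flatten(), nn.Linear(32 * 32, 128), nn.ReLU(),
    nn.Linear(128, 32 * 32), nn.Unflatten(1, (1, 32, 32)),
)
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)

normal = torch.rand(256, 1, 32, 32) * 0.1            # stand-in "normal" frames
for _ in range(200):                                  # train on normal data only
    opt.zero_grad()
    loss = ((ae(normal) - normal) ** 2).mean()
    loss.backward()
    opt.step()

def anomaly_score(frames):
    """Per-frame mean squared reconstruction error."""
    with torch.no_grad():
        return ((ae(frames) - frames) ** 2).mean(dim=(1, 2, 3))

print(anomaly_score(normal[:4]))                      # low scores: seen pattern
print(anomaly_score(torch.rand(4, 1, 32, 32)))        # higher: unseen pattern
```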

Journal ArticleDOI
TL;DR: A deep learning architecture, called DeepDSC, to improve the performance of drug sensitivity prediction based on data from high-throughput screening technologies.
Abstract: High-throughput screening technologies have provided a large amount of drug sensitivity data for a panel of cancer cell lines and hundreds of compounds. Computational approaches to analyzing these data can benefit anticancer therapeutics by identifying molecular genomic determinants of drug sensitivity and developing new anticancer drugs. In this study, we have developed a deep learning architecture to improve the performance of drug sensitivity prediction based on these data. We integrated both genomic features of cell lines and chemical information of compounds to predict the half maximal inhibitory concentrations (IC50) on the Cancer Cell Line Encyclopedia (CCLE) and the Genomics of Drug Sensitivity in Cancer (GDSC) datasets using a deep neural network, which we called DeepDSC. Specifically, we first applied a stacked deep autoencoder to extract genomic features of cell lines from gene expression data, and then combined the compounds' chemical features with these genomic features to produce final response data. We conducted 10-fold cross-validation to demonstrate the performance of our deep model in terms of root-mean-square error (RMSE) and coefficient of determination R². We show that our model outperforms the previous approaches with RMSE of 0.23 and R² of 0.78 on the CCLE dataset, and RMSE of 0.52 and R² of 0.78 on the GDSC dataset, respectively. Moreover, to demonstrate the prediction ability of our models on novel cell lines or novel compounds, we left cell lines originating from the same tissue and each compound out as the test sets, respectively, and the rest as training sets. The performance was comparable to other methods.
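As described, the pipeline has two stages: autoencoder pretraining on expression data, then regression on concatenated genomic and chemical features. The following is a minimal sketch under assumed feature sizes and random stand-in data, not the published DeepDSC implementation; one optimization step is shown per stage for brevity.

```python
# Two-stage sketch: (1) pretrain an autoencoder on gene expression,
# (2) regress IC50 from [genomic latent features ; drug chemical features].
import torch
import torch.nn as nn

n_genes, n_chem, n_latent = 2000, 256, 64            # assumed feature sizes
encoder = nn.Sequential(nn.Linear(n_genes, 512), nn.ReLU(), nn.Linear(512, n_latent))
decoder = nn.Sequential(nn.Linear(n_latent, 512), nn.ReLU(), nn.Linear(512, n_genes))
regressor = nn.Sequential(nn.Linear(n_latent + n_chem, 128), nn.ReLU(), nn.Linear(128, 1))

expr = torch.rand(100, n_genes)                       # cell-line expression profiles
chem = torch.rand(100, n_chem)                        # drug chemical fingerprints
ic50 = torch.rand(100, 1)                             # measured responses (stand-in)

# Stage 1: pretrain the autoencoder on expression alone.
ae_opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-3)
ae_loss = ((decoder(encoder(expr)) - expr) ** 2).mean()
ae_opt.zero_grad(); ae_loss.backward(); ae_opt.step()

# Stage 2: regress IC50 from concatenated genomic and chemical features.
reg_opt = torch.optim.Adam(regressor.parameters(), lr=1e-3)
pred = regressor(torch.cat([encoder(expr).detach(), chem], dim=1))
reg_loss = ((pred - ic50) ** 2).mean()
reg_opt.zero_grad(); reg_loss.backward(); reg_opt.step()
print(pred.shape)                                     # torch.Size([100, 1])
```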

Journal ArticleDOI
TL;DR: An umbrella review of systematic reviews with meta-analyses of observational studies on handgrip strength and all health outcomes found highly suggestive evidence for three outcomes: lower all-cause mortality, lower cardiovascular mortality, and lower risk of disability.

Journal ArticleDOI
TL;DR: A machine learning approach, MSHub, is engineered to enable auto-deconvolution of gas chromatography–mass spectrometry data, and workflows are designed to enable the community to store, process, share, annotate, compare and perform molecular networking of GC–MS data within the GNPS Molecular Networking analysis platform.
Abstract: We engineered a machine learning approach, MSHub, to enable auto-deconvolution of gas chromatography-mass spectrometry (GC-MS) data. We then designed workflows to enable the community to store, process, share, annotate, compare and perform molecular networking of GC-MS data within the Global Natural Product Social (GNPS) Molecular Networking analysis platform. MSHub/GNPS performs auto-deconvolution of compound fragmentation patterns via unsupervised non-negative matrix factorization and quantifies the reproducibility of fragmentation patterns across samples.
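The core factorization step named in the abstract can be illustrated with scikit-learn's NMF: a GC-MS run is treated as a non-negative matrix of retention-time scans by m/z channels and split into component fragmentation spectra and their elution profiles. This sketch uses synthetic data and omits the alignment, scaling, and reproducibility scoring that MSHub performs.

```python
# Unsupervised non-negative matrix factorization as a GC-MS deconvolution toy:
# data (scans x m/z) ~= W (elution profiles) @ H (component spectra).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_scans, n_mz, n_compounds = 300, 120, 3
profiles = rng.random((n_scans, n_compounds))        # stand-in elution profiles
spectra = rng.random((n_compounds, n_mz))            # stand-in fragmentation spectra
data = profiles @ spectra + 0.01 * rng.random((n_scans, n_mz))  # observed matrix

model = NMF(n_components=n_compounds, init="nndsvda", max_iter=500)
W = model.fit_transform(data)   # deconvolved elution profiles (scans x components)
H = model.components_           # deconvolved spectra (components x m/z)
print(W.shape, H.shape)         # (300, 3) (3, 120)
```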

Journal ArticleDOI
TL;DR: In this article, a new conceptualization of ethical leadership behavior (ELB) defined as signaling behavior by the leader (individual) targeted at stakeholders (e.g., an individual follower, group of followers, or clients) comprising the enactment of prosocial values combined with expressions of moral emotions is presented.
Abstract: Ethical leadership has attracted massive attention in the twenty-first century. Yet despite this vast literature, knowledge of ethical leadership suffers from two critical limitations: First, existing conceptualizations conflate ethical leader behaviors with followers' evaluations of leaders' characteristics, values, traits, and followers' cognitions. Second, we know little to nothing regarding the causes and consequences of ethical leadership behaviors as most of the evidence not only confounds concepts, but also precludes causal inferences due to design problems. Thus, we first present a review of the definitions of ethical leadership that alarmingly reveals a hodgepodge of follower evaluations of leader behaviors, traits, and values. We then address this concept confusion by drawing upon signaling theory in presenting a new conceptualization of ethical leadership behavior (ELB) defined as signaling behavior by the leader (individual) targeted at stakeholders (e.g., an individual follower, group of followers, or clients) comprising the enactment of prosocial values combined with expressions of moral emotions. As such, enacting prosocial values and expressing moral emotions are each necessary for ethical leadership. Next, we review the nomological network of ELB at the individual, dyad, and group levels. We conclude with a discussion of future research directions in testing new theoretical models, including a set of theoretical and methodological recommendations.

Journal ArticleDOI
TL;DR: In this article, a systematic search of four major databases (Psychology and Behavioral Sciences Collection, PsycARTICLES, PsycINFO, and ScienceDirect) was conducted.

Journal ArticleDOI
TL;DR: According to the results, factors such as “ambulance rescue” and “curved roadway” produce temporally stable effects on pedestrian injury severity; however, strong temporal instabilities in the effects on pedestrian injury severity are found for most factors across the three-year period and across weekdays/weekends.

Journal ArticleDOI
TL;DR: Transition education should be grounded in quality research, and educators need information on which practices are effective for teaching transition-related skills to students with disabilities.
Abstract: Transition education should be grounded in quality research. To do so, educators need information on which practices are effective for teaching students with disabilities transition-related skills....

Journal ArticleDOI
TL;DR: Issues relating to the accumulation of antibiotics and antibiotic resistance (AR) determinants in agricultural lands and crops following treated wastewater (TWW) irrigation and biosolid amendment are discussed.

Journal ArticleDOI
TL;DR: This article reviews the advances of ABM in social, ecological, and socio-ecological systems, compares ABM with traditional, equation-based models, provides guidelines for ABM novices, modelers, and reviewers, and points out the challenges and impending tasks that the ABM community needs to address.

Journal ArticleDOI
TL;DR: In this article, dual-phase mechanical metamaterial composites are developed by employing architected lattice materials as the constituent matrix and reinforcement phases, and the authors find that strength and toughness can be simultaneously enhanced with the addition of reinforcement-phase grains.
Abstract: Nature's materials are generally hybrid composites with superior mechanical properties achieved through delicate architectural designs. Inspired by the precipitation hardening mechanisms observed in biological materials as well as engineering alloys, we develop here dual-phase mechanical metamaterial composites by employing architected lattice materials as the constituent matrix and reinforcement phases. The composite metamaterials made from austenitic stainless steel are simply fabricated using selective laser melting based additive manufacturing. Using quasi-static compression tests and simulation studies, we find that strength and toughness can be simultaneously enhanced with the addition of reinforcement phase grains. Effects of reinforcement phase patterning and connectivity are examined. By fully utilizing the energy dissipation from phase-boundary slip, an optimized dual-phase metamaterial is designed with the maximum slip area, where every truss unit in the matrix phase is completely surrounded by reinforcement phase lattices; this material exhibits a specific energy absorption capability that is ~25 times that of the constituent matrix phase lattices. The design rationale for dissipative dual-phase metamaterials is analyzed and summarized with a focus on phase patterning. The present digital multi-phase mechanical metamaterials can emulate almost any of nature's architectures and toughening mechanisms, offering a novel pathway to manipulate mechanical properties through arbitrary phase-material selection and patterning. We believe that this could markedly expand the design space for the development of future materials.

Journal ArticleDOI
TL;DR: The authors explored the notion of emotional capital in relation to language teachers' emotion labor and the role of reflection in understanding their emotional experiences, arguing that as language teachers struggle to orient to the feeling rules of their institutions, they develop the capacity to perform the emotions that they believe are expected of them.
Abstract: In this article we explore the notion of emotional capital in relation to language teachers’ emotion labor and the role of reflection in understanding their emotional experiences. We draw on interview narratives with teachers (n = 25) working in higher education institutions in the US and the UK. During these interview conversations, we elicited accounts of teachers’ emotionally charged experiences that arise as part of their ongoing, mundane teaching practice and how they respond to these situations. We argue that as language teachers struggle to orient to the feeling rules of their institutions, they develop the capacity to perform the emotions that they believe are expected of them. This capacity is further shaped through their reflective practice, both as individual reflection and collaborative reflection with colleagues. We thus analyze how language teachers’ accruing emotional capital, developed through emotion labor and reflective activity, can be converted into social and cultural capital. We also point to how language teachers’ emotional capital is entangled in power relations and thus requires careful scrutiny.

Journal ArticleDOI
TL;DR: In this article, the authors leverage a spatiotemporal model, which allows one to characterize the age of information (AoI) from a joint queueing-geometry perspective, for the design of a decentralized scheduling policy that exploits local observation to make transmission decisions that minimize the AoI.
Abstract: Optimization of information freshness in wireless networks has usually been performed based on queueing analysis that captures only the temporal traffic dynamics associated with the transmitters and receivers. However, the effect of interference, which is mainly dominated by the interferers’ geographic locations, is not well understood. In this paper, we leverage a spatiotemporal model, which allows one to characterize the age of information (AoI) from a joint queueing-geometry perspective, for the design of a decentralized scheduling policy that exploits local observation to make transmission decisions that minimize the AoI. To quantify the performance, we also derive accurate and tractable expressions for the peak AoI. Numerical results reveal that: i) the packet arrival rate directly affects the service process due to queueing interactions, ii) the proposed scheme can adapt to traffic variations and largely reduce the peak AoI, and iii) the proposed scheme scales well as the network grows in size. This is done by adaptively adjusting the radio access probability at each transmitter to the change of the ambient environment.
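For a concrete point of reference on what "peak AoI" measures, the sketch below evaluates the classic closed-form average peak AoI of a single M/M/1 first-come-first-served link, E[PAoI] = 1/λ + 1/(μ − λ). This textbook baseline is far simpler than the paper's spatiotemporal expressions, but it exhibits the same tension the abstract's first numerical result points to: updating too rarely and queueing too heavily both age the information.

```python
# Classic average peak AoI for an M/M/1 FCFS queue (a standard baseline,
# not the paper's spatiotemporal model): E[PAoI] = 1/lam + 1/(mu - lam).
def peak_aoi_mm1(lam, mu):
    """Average peak age of information; requires lam < mu for stability."""
    assert lam < mu, "queue must be stable"
    return 1.0 / lam + 1.0 / (mu - lam)

mu = 1.0  # service rate
for lam in (0.1, 0.3, 0.5, 0.7, 0.9):  # packet arrival rates
    print(f"arrival rate {lam:.1f}: mean peak AoI = {peak_aoi_mm1(lam, mu):.2f}")
```

Sweeping the arrival rate shows the U-shaped trade-off: sparse updates leave information stale, while aggressive updating inflates queueing delay, which is why adaptive transmission decisions help.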

Journal ArticleDOI
TL;DR: In this article, a comparison of deleterious burden between sorghum and maize revealed that sorghum, in contrast to maize, departed from the domestication-cost hypothesis, which predicts a higher deleterious burden among domesticates compared with wild lines.
Abstract: Sorghum and maize share a close evolutionary history that can be explored through comparative genomics1,2. To perform a large-scale comparison of the genomic variation between these two species, we analysed ~13 million variants identified from whole-genome resequencing of 499 sorghum lines together with 25 million variants previously identified in 1,218 maize lines. Deleterious mutations in both species were prevalent in pericentromeric regions, enriched in non-syntenic genes and present at low allele frequencies. A comparison of deleterious burden between sorghum and maize revealed that sorghum, in contrast to maize, departed from the domestication-cost hypothesis that predicts a higher deleterious burden among domesticates compared with wild lines. Additionally, sorghum and maize population genetic summary statistics were used to predict a gene deleterious index with an accuracy greater than 0.5. This research represents a key step towards understanding the evolutionary dynamics of deleterious variants in sorghum and provides a comparative genomics framework to start prioritizing these variants for removal through genome editing and breeding. Comparative genomics revealed similar distribution patterns of deleterious mutations in maize and sorghum but a post-domestication reduction of genetic load in sorghum, which is probably caused by sorghum’s high selfing rate and unique domestication history.

Journal ArticleDOI
TL;DR: This paper showed that employees may continue to work remotely for a substantial amount of time, even after the end of the pandemic, and their established theories of group processes and intergroup relations can help us underspec...
Abstract: Employees may continue to work remotely for a substantial amount of time, even after the end of the pandemic. Our established theories of group processes and intergroup relations can help us unders...

Journal ArticleDOI
TL;DR: Do voters have stable immigration views? While any account of immigration politics must make an assumption about whether underlying attitudes are stable, the literature has been ambiguous on this question, and there is no evidence to support such assumptions.
Abstract: Do voters have stable immigration views? While any account of immigration politics must make an assumption about whether underlying attitudes are stable, the literature has been ambiguous regarding...

Journal ArticleDOI
TL;DR: In this article, the critical physicochemical properties of nanoantibiotic (nAbts) conjugates, which can be utilized in manufacturing and designing various nAbts types to restore the efficacy of conventional antibiotics, are discussed.
Abstract: One primary mechanism for bacteria developing resistance is frequent exposure to antibiotics. Nanoantibiotics (nAbts) are one of the strategies being explored to counteract the surge of antibiotic-resistant bacteria. nAbts are antibiotic molecules encapsulated with engineered nanoparticles (NPs) or artificially synthesized pure antibiotics with a size range of ≤100 nm in at least one dimension. NPs may restore drug efficacy because of their nanoscale functionalities. As carriers and delivery agents, nAbts can reach target sites inside a bacterium by crossing the cell membrane, interfering with cellular components, and damaging metabolic machinery. Nanoscale systems deliver antibiotics at enormous particle number concentrations. The unique size-, shape-, and composition-related properties of nAbts pose multiple simultaneous assaults on bacteria. Resistance of bacteria toward diverse nanoscale conjugates is considerably slower because NPs generate non-biological adverse effects. NPs physically break down bacteria and interfere with critical molecules used in bacterial processes. Genetic mutations from abiotic assault exerted by nAbts are less probable. This paper discusses how to exploit the fundamental physical and chemical properties of NPs to restore the efficacy of conventional antibiotics. We first described the concept of nAbts and explained their importance. We then summarized the critical physicochemical properties of nAbts that can be utilized in manufacturing and designing various nAbts types. nAbts epitomize a potential Trojan horse strategy to circumvent antibiotic resistance mechanisms. The availability of diverse types and multiple targets of nAbts is increasing due to advances in nanotechnology. Studying nanoscale functions and properties may provide an understanding in preventing future outbreaks caused by antibiotic resistance and in developing successful nAbts.