
Showing papers in "American Journal of Tropical Medicine and Hygiene in 2019"


Journal ArticleDOI
TL;DR: It is found that enveloped viruses differed in their susceptibility to irradiation treatment, with absorbed doses for inactivation of a target dose of 1 × 10⁶ 50% tissue culture infectious dose (TCID₅₀)/mL ranging from 1 to 5 MRads.
Abstract: Gamma irradiation using a cobalt-60 source is a commonly used method for the inactivation of infectious specimens so that they can be handled safely in subsequent laboratory procedures. Here, we determined irradiation doses that safely inactivate liquid proteinaceous specimens harboring different emerging/reemerging viral pathogens known to cause neglected tropical and other diseases of regional or global public health importance. By using a representative arenavirus, bunyavirus, coronavirus, filovirus, flavivirus, orthomyxovirus, and paramyxovirus, we found that these enveloped viruses differed in their susceptibility to irradiation treatment, with absorbed doses for inactivation of a target dose of 1 × 10⁶ 50% tissue culture infectious dose (TCID₅₀)/mL ranging from 1 to 5 MRads. This finding seemed generally inversely correlated with genome size. Our data may help to guide other facilities in testing and verifying safe inactivation procedures.

89 citations


Journal ArticleDOI
TL;DR: In this paper, national implementation of the Model Law on Medicines and Crime, a quantifiable Sustainable Development Goal target, and an international convention to ensure drug quality and safety are identified as urgent priorities, in addition to strengthening international and national pharmaceutical governance.
Abstract: Falsified and substandard medicines are associated with tens of thousands of deaths, mainly in young children in poor countries. Poor-quality drugs exact an annual economic toll of up to US$200 billion and contribute to the increasing peril of antimicrobial resistance. The WHO has emerged recently as the global leader in the battle against poor-quality drugs, and pharmaceutical companies have increased their roles in assuring the integrity of drug supply chains. Despite advances in drug quality surveillance and detection technology, more efforts are urgently required in research, policy, and field monitoring to halt the pandemic of bad drugs. In addition to strengthening international and national pharmaceutical governance, in part by national implementation of the Model Law on Medicines and Crime, a quantifiable Sustainable Development Goal target and an international convention to ensure drug quality and safety are urgent priorities.

72 citations


Journal ArticleDOI
TL;DR: This quantitative point-of-care diagnostic test for G6PD deficiency can provide equal access to safe radical cure of P. vivax cases in high- and low-resource settings, for males and females, and may support malaria elimination in countries where P. vivax is endemic.
Abstract: Glucose-6-phosphate dehydrogenase (G6PD) is an essential enzyme that protects red blood cells from oxidative damage caused by certain drugs, diseases, and foods.1,2 The X-linked human G6PD gene is highly polymorphic, with many mutations resulting in reduced enzyme activity in red blood cells, or G6PD deficiency. Exposure to oxidative agents can induce hemolysis in red blood cells with low G6PD activity levels and cause severe anemia, sometimes requiring blood transfusion or causing irreversible renal damage and even mortality, if not managed promptly. Glucose-6-phosphate dehydrogenase deficiency presents clinically in the neonate as jaundice resulting from hyperbilirubinemia; this may lead to kernicterus, a form of brain damage.3,4 Several medications, including rasburicase and 8-aminoquinoline–based antimalarial drugs such as primaquine, are known to cause clinically significant hemolysis in G6PD-deficient individuals. A high dose of primaquine (a 7- or 14-day regimen) is required to cure patients of Plasmodium vivax malaria. If a patient is not cured of P. vivax, they are at risk of relapse, with increasing risk of morbidity and further transmission of the parasite.5,6 Relapse contributes to more than 50% of the disease burden in a community.7,8 A single dose of tafenoquine, another 8-aminoquinoline, when given with chloroquine is capable of curing a patient of P. vivax.9 Tafenoquine, recently approved by the United States Food and Drug Administration (FDA) for radical cure of P. vivax under the Krintafel label, is indicated for individuals with greater than 70% G6PD activity. Tafenoquine has also been approved by the FDA with a different dosage for prophylaxis, under the Arakoda label, with similar indications for G6PD deficiency. Many countries will struggle to meet their target malaria elimination goals without broader safe access to radical cure of P. vivax. Diagnostic tests that determine a patient's G6PD status are needed at or near where patients seek treatment.10 Glucose-6-phosphate dehydrogenase deficiency is determined by measuring G6PD activity, adjusted for temperature, in blood normalized for either red blood cell count or hemoglobin. Quantitative testing for G6PD is performed in reference or specialized laboratories using a complex assay on a temperature-regulated instrument because of the large temperature impact on enzyme activity. The most commonly used test for clinical screening is the qualitative fluorescent spot test, which accurately discriminates hemizygous-deficient males and homozygous- or heterozygous-deficient females, who typically have G6PD activity less than 30% of normal. Although this is adequate for males, who are either deficient (< 30% activity) or normal (> 80% activity), it is inadequate to classify females, who can be G6PD deficient, intermediate (30–80% activity), or normal.11–15 Quantitative point-of-care G6PD tests that can be used in low-resource settings can impact health on many levels, including reductions in neonatal morbidity and mortality. They can also impact health by providing access to safe radical curative treatment for P. vivax malaria, which can reduce the relapse burden, prevent onward malaria transmission, and accelerate malaria elimination.7–10,16 Both qualitative and quantitative G6PD deficiency tests that can be performed closer to the patient are beginning to emerge.17–19 In 2017, production of a commonly used reference assay for evaluating new G6PD products, the G6PDH quantitative test by Trinity Biotech, was suspended.20 This study presents the results of a bridging evaluation of the Trinity Biotech kit against a commercially available U.S. Food and Drug Administration–cleared reference assay by Pointe Scientific. The innovative point-of-care STANDARD G6PD test by SD Biosensor was also evaluated against the same reference assay.
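
As a concrete illustration of the activity bands quoted above (deficient < 30%, intermediate 30–80% for females, with tafenoquine indicated above 70%), here is a minimal sketch of the classification logic. The function and its handling of edge cases are hypothetical and are not part of any of the assays evaluated in the study:

```python
def classify_g6pd(activity_pct: float, sex: str) -> str:
    """Classify G6PD status from activity expressed as a percentage of the
    population-normal median, using the 30%/80% bands cited in the abstract.

    Illustrative only: real assays first normalize measured activity to
    hemoglobin (or red cell count) and adjust for temperature.
    """
    if sex == "male":
        # Males are hemizygous for the X-linked gene: deficient or normal.
        return "deficient" if activity_pct < 30 else "normal"
    # Heterozygous females can fall in an intermediate band.
    if activity_pct < 30:
        return "deficient"
    if activity_pct <= 80:
        return "intermediate"
    return "normal"

# A female with 45% activity is intermediate, so she would not qualify for
# tafenoquine under the > 70% activity indication mentioned above.
print(classify_g6pd(45.0, "female"))  # -> intermediate
```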

61 citations


Journal ArticleDOI
TL;DR: The T-cell data suggest PfSPZ Vaccine may be more protective in children than in adults, whereas infants may not be immunologically mature enough to respond to the PfSPZ Vaccine immunization regimen assessed.
Abstract: In 2016, there were more cases and deaths caused by malaria globally than in 2015. An effective vaccine would be an ideal additional tool for reducing malaria's impact. Sanaria® PfSPZ Vaccine, composed of radiation-attenuated, aseptic, purified, cryopreserved Plasmodium falciparum (Pf) sporozoites (SPZ), has been well tolerated and safe in malaria-naive and malaria-experienced adults in the United States and Mali and protective against controlled human malaria infection with Pf in the United States and field transmission of Pf in Mali, but it had not been assessed in younger age groups. We, therefore, evaluated PfSPZ Vaccine in 93 Tanzanians aged 45 years down to 6 months in a randomized, double-blind, normal saline placebo-controlled trial. There were no significant differences in adverse events between vaccinees and controls or between dosage regimens. Because all age groups received three doses of 9.0 × 10⁵ PfSPZ of PfSPZ Vaccine, immune responses were compared at this dosage. Median antibody responses against Pf circumsporozoite protein and PfSPZ were highest in infants and lowest in adults. T-cell responses were highest in 6–10-year-olds after one dose and 1–5-year-olds after three doses; infants had no significant positive T-cell responses. The safety data were used to support initiation of trials in > 300 infants in Kenya and Equatorial Guinea. Because PfSPZ Vaccine-induced protection is thought to be mediated by T cells, the T-cell data suggest PfSPZ Vaccine may be more protective in children than in adults, whereas infants may not be immunologically mature enough to respond to the PfSPZ Vaccine immunization regimen assessed.

57 citations


Journal ArticleDOI
TL;DR: Five Mentoring-the-Mentor workshops in Africa, South America, and Asia aimed at strengthening the capacity for evidence-based, LMIC-specific institutional mentoring programs globally are presented in this special edition of the American Journal of Tropical Medicine and Hygiene.
Abstract: Mentoring is a proven path to scientific progress, but it is not a common practice in low- and middle-income countries (LMICs). Existing mentoring approaches and guidelines are geared toward high-income country settings, without considering in detail the differences in resources, culture, and structure of research systems of LMICs. To address this gap, we conducted five Mentoring-the-Mentor workshops in Africa, South America, and Asia, which aimed at strengthening the capacity for evidence-based, LMIC-specific institutional mentoring programs globally. The outcomes of the workshops and two follow-up working meetings are presented in this special edition of the American Journal of Tropical Medicine and Hygiene. Seven articles offer recommendations on how to tailor mentoring to the context and culture of LMICs, and provide guidance on how to implement mentoring programs. This introductory article provides both a prelude and executive summary to the seven articles, describing the motivation, cultural context and relevant background, and presenting key findings, conclusions, and recommendations.

52 citations


Journal ArticleDOI
TL;DR: The researchers investigated the prevalence of CTX-M extended-spectrum beta-lactamases in chickens from small-scale poultry farms and in children living on the farms in rural Ecuador, and reported a shared evolutionary history between chicken and human samples.
Abstract: In 1940, one year before the first administration of penicillin in man, two members of the team who discovered the drug revealed that resistance to penicillin already existed. Since then, as antimicrobial resistance (AMR) has progressed in the wake of exponential antimicrobial use, scientists have raced against extraordinarily efficient microbial gene dissemination and evolution to provide effective antimicrobial therapeutics. Today, with the existence of genes resistant to every natural and synthetic antimicrobial compound, national surveillance systems track AMR in human and animal populations to deepen our understanding of resistance and find ways to circumvent it. Although we have established surveillance systems across North America and Europe, pathogens do not respect international boundaries, and the emergence of resistance in any country poses a worldwide threat. In this issue of the AJTMH, Hedman et al. report spillover of AMR to developing world settings with no prior history of agricultural antimicrobial use. We are reminded that surveillance must become a global "One Health" effort if we are to solve one of today's most significant threats to human, animal, and environmental health. Antimicrobial resistance has reached its tipping point, and some are saying we are now in the post-antibiotic era. Recent reports have highlighted this trend, including the emergence of multiple plasmid-mediated colistin resistance genes in human and animal pathogens, spread of metallo-beta-lactamase-1 in India, and the emergence of plasmid-mediated carbapenem-resistant Enterobacteriaceae in swine for the first time in the United States. Leading world health agencies consider the threat of AMR as paramount and recognize its complex causation: expanding human and domestic animal populations; increased globalization, international trade, and demand for animal source foods; and all-too-easy access to antimicrobials in both developed and developing countries. The proficiency of genome evolution via horizontal gene transfer and the emergence of new forms of resistance have compounded the lack of new antibiotic discovery and development, while intensifying the threat posed by drug-resistant pathogens. By 2050, an estimated 10 million human lives per year will be at risk if we fail to attenuate the rise of drug resistance, and critical medical procedures such as administration of cancer chemotherapy, joint replacement, and gastrointestinal surgery may be associated with increasing morbidity. The increase in AMR burden correlates with a 65% increase in antimicrobial consumption in humans between 2000 and 2015 in 76 countries and administration of 63,000 tons in animals in 2010, with a projected 67% increase in consumption by 2030. Antimicrobial resistance poses a particularly significant threat to low- and middle-income countries. This is due not only to the health-care challenges these countries face, but also to an increase in small-scale intensive animal production, exacerbated by poor sanitation infrastructure. The findings reported by Hedman et al. in this issue exemplify this problem and the difficulty of understanding the complicated dynamics of AMR transmission between humans and animals sharing the same environment. The researchers investigated the prevalence of CTX-M extended-spectrum beta-lactamases in chickens from small-scale poultry farms and in children living on the farms in rural Ecuador.
CTX-M-mediated cephalosporin resistance was seen in bacteria both in commercially bred "broiler" chickens treated with high levels of antibiotics and in free-grazing animals that had no direct exposure to antibiotics. Resistance was also detected in bacteria from children in the community. After phylogenetic analysis, the authors reported a shared evolutionary history between chicken and human samples. Hedman et al., thus, provide valuable insight into the rise of phenotypic resistance and avian-to-human spillover in areas that have previously reported low AMR levels in both poultry and humans. Altogether, the data provided by Hedman et al. support a familiar narrative: gene exchange is a property of bacteria that efficiently enables the transmission of resistance between animals and humans. Of particular importance to surveillance systems, the study also highlights the pivotal role of the environment in AMR transmission. The ability of the environment to act as a reservoir for resistance is not a new concept and may have promoted the potential spillover event described by Hedman et al. in Ecuador. Indeed, the environmental AMR resistome consists of more than one million distinct bacterial species, which markedly exceeds the number of species that infect human and animal populations. Despite the knowledge of environmental influences on AMR, current surveillance systems often neglect environmental sampling. It is now crucial that we re-emphasize the role that the environment plays as a reservoir and in maintaining AMR genes as we establish surveillance systems to combat AMR. We know that many of the resistance mechanisms we see in veterinary clinics and animal production systems likely have environmental origins. Recently, we have reported horizontal dissemination of resistance determinants in multiple Salmonella serotypes across commercial swine farms following manure application. In addition, numerous studies have reported very little difference in the shedding of drug-resistant bacterial strains between animals raised under organic or antimicrobial-free production systems. Combined with studies such as that conducted by Hedman et al., these findings demonstrate the need to apply a One Health approach and study environmental reservoirs more closely, rather than focusing only on the resistance that arises following antimicrobial administration.

41 citations


Journal ArticleDOI
TL;DR: Addressing the dual needs for innovation and discovery and for building state and local capacities may overcome current challenges in vector-borne disease prevention and control, but will require coordination across a national network of collaborators operating under a national strategy.
Abstract: Reported cases of vector-borne diseases in the United States have more than tripled since 2004, characterized by steadily increasing incidence of tick-borne diseases and sporadic outbreaks of domestic and invasive mosquito-borne diseases. An effective public health response to these trends relies on public health surveillance and laboratory systems, proven prevention and mitigation measures, scalable capacity to implement these measures, sensitive and specific diagnostics, and effective therapeutics. However, significant obstacles hinder successful implementation of these public health strategies. The recent emergence of Haemaphysalis longicornis, the first invasive tick to emerge in the United States in approximately 80 years, serves as the most recent example of the need for a coordinated public health response. Addressing the dual needs for innovation and discovery and for building state and local capacities may overcome current challenges in vector-borne disease prevention and control, but will require coordination across a national network of collaborators operating under a national strategy. Such an effort should reduce the impact of emerging vectors and could reverse the increasing trend of vector-borne disease incidence and associated morbidity and mortality.

40 citations


Journal ArticleDOI
TL;DR: The results point to the importance of continued genomic-based surveillance and prompt urgent vector competence studies to assess the level of vector susceptibility and virus transmission, and the impact this might have on this variant’s epidemic potential and global spread.
Abstract: In May 2016, the Kenyan Ministry of Health (KMoH) reported an outbreak of chikungunya virus (CHIKV) in Mandera County on the border with Somalia. During this time in Somalia, outbreaks of CHIKV were occurring in the neighboring Bula Hawa region, originating from Mogadishu. In Mandera town, 1,792 cases were detected, and an estimated 50% of the health work force was affected by this virus. A cross-border joint response was coordinated between Kenya and Somalia to control the outbreak.1 This was the first reported outbreak of CHIKV in Kenya since 2004. The previous large CHIKV outbreak in Kenya occurred on Lamu Island in 2004, with an estimated 75% of the population infected.2 The disease also spread to the coastal city of Mombasa by the end of 2004, and further to the Comoros and La Reunion islands, causing large outbreaks in 2005–2006. On La Reunion island, unusual clinical complications were reported in association with CHIKV infection, and viral isolate sequences revealed the presence of an alanine-to-valine mutation in the E1 glycoprotein at position 226 (E1:A226V).3 This specific mutation was shown to confer a significant increase in CHIKV infectivity for the Aedes albopictus vector, which was also the dominant mosquito species suspected to be responsible for the transmission of CHIKV on the island of La Reunion.4,5 Since then, the E1:A226V mutation has been observed in many of the genomes in the CHIKV lineage spreading in the East, Central, and South African region (ECSA lineage), and has been shown to have emerged through convergent evolution on at least four different occasions.6 The remarkable emergence and spread of CHIKV adaptation to the Ae. albopictus vector prompted additional studies on the genetic plasticity of this virus, looking for additional biomarkers associated with virus transmission capacity, fitness, and pathogenicity.6–8 Mutations with the ability to enhance infection in this vector are of increased importance, as Ae. albopictus is rapidly expanding throughout the world.9 The Ae. albopictus mosquito is believed to have originated in Asia and is today most commonly found in East Asia. Aedes albopictus is also common in some parts of South and Southeast Asia, India, and Africa, and it has shown increased spread to regions with lower temperatures, such as southern Europe, southern Brazil, northern China, and the northern United States.9 In Europe, this vector has been associated with autochthonous transmission of CHIKV.10 Although some of the more recent CHIKV outbreaks have been transmitted by Ae. albopictus, most of the CHIKV transmissions in the world are associated with Aedes aegypti. Aedes aegypti is a container-breeding, domesticated mosquito mainly found in urban areas and feeding largely on humans during early and late daytime hours. Aedes aegypti originated from the ancestral zoophilic Ae. aegypti formosus native to Africa. Aedes aegypti is now most common in tropical and subtropical regions, such as Africa, India, Southeast Asia, and Australia.9 Recently, two mutations, E1:K211E and E2:V264A, have been reported in CHIKV to be associated with increased fitness for Ae. aegypti vectors.11 These two mutations, in the background of the wild-type E1:226A, are believed to increase virus infectivity, dissemination, and transmission in Ae. aegypti, with no impact on virus fitness for the Ae.
albopictus vector.6,11 E1:K211E was first observed in genomes sampled in 2006 from the Kerala and Puducherry regions of India, and the simultaneous presence of both mutations was first observed in 2009–2010 in Tamil Nadu and Andhra Pradesh, India.12,13 In Kenya, the predominant CHIKV vector is Ae. aegypti, and vector competence studies using local vector populations have shown it is capable of transmitting the virus in this region.14 Given the recent large outbreak of CHIKV in the rural setting of Mandera County, Kenya, we analyzed CHIKV genomes sequenced from this outbreak for the presence of adaptive mutations associated with both Ae. albopictus and Ae. aegypti. Along with estimating the origins and time of emergence of a variant that carried two of these previously reported mutations, we also estimated the time and origins of the Mandera CHIKV outbreak. Our results indicate the spread in Kenya of a CHIKV variant that carries mutations previously associated with increased fitness for Ae. aegypti. We also show that this variant is now connected with new large outbreaks in other regions of the world.

40 citations


Journal ArticleDOI
TL;DR: New research is warranted to identify persistent hotspots (PHS) after just one or a few rounds of MDA, and new adaptive strategies need to be advanced and validated for turning PHSs into responder villages.
Abstract: Control of schistosomiasis presently relies largely on preventive chemotherapy with praziquantel through mass drug administration (MDA) programs. The Schistosomiasis Consortium for Operational Research and Evaluation has concluded five studies in four countries (Cote d'Ivoire, Kenya, Mozambique, and Tanzania) to evaluate alternative approaches to MDA. Studies involved four intervention years, with final evaluation in the fifth year. Mass drug administration given annually or twice over 4 years reduced average prevalence and intensity of schistosome infections, but not all villages that were treated in the same way responded similarly. There are multiple ways by which responsiveness to MDA, or the lack thereof, could be measured. In the analyses presented here, we defined persistent hotspots (PHS) as villages that achieved less than 35% reduction in prevalence and/or less than 50% reduction in infection intensity after 4 years of either school-based or community-wide MDA, either annually or twice in 4 years. By this definition, at least 30% of villages in each of the five studies were PHSs. We found no consistent relationship between PHSs and the type or frequency of intervention, adequacy of reported MDA coverage, and prevalence or intensity of infection at baseline. New research is warranted to identify PHSs after just one or a few rounds of MDA, and new adaptive strategies need to be advanced and validated for turning PHSs into responder villages.
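
The persistent-hotspot definition used above is a simple two-part decision rule; here is a minimal sketch of how it could be codified (the function, its arguments, and the example values are hypothetical, not taken from the SCORE datasets):

```python
def is_persistent_hotspot(prev_baseline, prev_year5, inten_baseline, inten_year5):
    """Apply the persistent-hotspot rule quoted above: a village is a PHS if it
    achieved < 35% reduction in prevalence and/or < 50% reduction in infection
    intensity after 4 years of MDA."""
    prev_reduction = (prev_baseline - prev_year5) / prev_baseline
    inten_reduction = (inten_baseline - inten_year5) / inten_baseline
    return prev_reduction < 0.35 or inten_reduction < 0.50

# Hypothetical village: prevalence 40% -> 30%, mean eggs/gram 120 -> 70.
# Reductions are 25% and ~42%, so both criteria flag it as a PHS.
print(is_persistent_hotspot(0.40, 0.30, 120, 70))  # -> True
```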

40 citations


Journal ArticleDOI
TL;DR: This work aimed at assessing S. stercoralis-associated morbidity through a systematic review and meta-analysis of the available literature and highlighted the appalling knowledge gap about clinical manifestations of this common yet neglected soil-transmitted helminthiasis.
Abstract: Strongyloides stercoralis, a worldwide-distributed soil-transmitted helminth, causes chronic infection which may be life threatening. Limitations of diagnostic tests and nonspecificity of symptoms have hampered the estimation of the global morbidity due to strongyloidiasis. This work aimed at assessing S. stercoralis-associated morbidity through a systematic review and meta-analysis of the available literature. MEDLINE, Embase, CENTRAL, LILACS, and trial registries (WHO portal) were searched. The study quality was assessed using the Newcastle-Ottawa scale. Odds ratios (ORs) of the association between symptoms and infection status and the frequency of infection-associated symptoms were calculated. Six articles from five countries, including 6,014 individuals, were included in the meta-analysis: three were of low quality, one of high quality, and two of very high quality. Abdominal pain (OR 1.74 [CI 1.07-2.94]), diarrhea (OR 1.66 [CI 1.09-2.55]), and urticaria (OR 1.73 [CI 1.22-2.44]) were associated with infection. In 17 eligible studies, these symptoms were reported by a large proportion of the individuals with strongyloidiasis: abdominal pain by 53.1% of individuals, diarrhea by 41.6%, and urticaria by 27.8%. After removing the low-quality studies, urticaria remained the only symptom significantly associated with S. stercoralis infection (OR 1.42 [CI 1.24-1.61]). Limitations of the evidence included the low number and quality of studies. Our findings especially highlight the appalling knowledge gap about clinical manifestations of this common yet neglected soil-transmitted helminthiasis. Further studies focusing on morbidity and on risk factors for dissemination and mortality due to strongyloidiasis are absolutely needed to quantify the burden of S. stercoralis infection and inform public health policies.
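
For readers less familiar with the effect measure reported here, an odds ratio and its log-based (Woolf) 95% CI can be computed from a single 2 × 2 table as below. The counts are hypothetical, chosen only so the point estimate lands near the pooled ORs above, which came from a formal meta-analysis rather than from one table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
        a = symptomatic & infected,   b = symptomatic & uninfected,
        c = asymptomatic & infected,  d = asymptomatic & uninfected.
    Uses the standard log-OR standard error sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for urticaria vs. S. stercoralis infection status:
print(odds_ratio_ci(50, 120, 30, 125))  # OR ~1.74 with its 95% CI
```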

39 citations


Journal ArticleDOI
TL;DR: This study validated a multiplex quantitative reverse transcription–polymerase chain reaction for Plasmodium 18S rRNA and led to biomarker qualification through the U.S. Food and Drug Administration for use in CHMI studies at non-endemic sites, which will facilitate biomarker use for the qualified context of use in drug and vaccine trials.
Abstract: 18S rRNA is a biomarker that provides an alternative to thick blood smears in controlled human malaria infection (CHMI) trials. We reviewed data from CHMI trials at non-endemic sites that used blood smears and Plasmodium 18S rRNA/rDNA biomarker nucleic acid tests (NATs) for time to positivity. We validated a multiplex quantitative reverse transcription-polymerase chain reaction (qRT-PCR) for Plasmodium 18S rRNA, prospectively compared blood smears and qRT-PCR for three trials, and modeled treatment effects at different biomarker-defined parasite densities to assess the impact on infection detection, symptom reduction, and measured intervention efficacy. Literature review demonstrated accelerated NAT-based infection detection compared with blood smears (mean acceleration: 3.2-3.6 days). For prospectively tested trials, the validated Plasmodium 18S rRNA qRT-PCR positivity was earlier (7.6 days; 95% CI: 7.1-8.1 days) than blood smears (11.0 days; 95% CI: 10.3-11.8 days) and significantly preceded the onset of grade 2 malaria-related symptoms (12.2 days; 95% CI: 10.6-13.3 days). Discrepant analysis showed that the risk of a blood smear-positive, biomarker-negative result was negligible. Data modeling predicted that treatment triggered by specific biomarker-defined thresholds can differentiate complete, partial, and non-protective outcomes and eliminate many grade 2 and most grade 3 malaria-related symptoms post-CHMI. Plasmodium 18S rRNA is a sensitive and specific biomarker that can justifiably replace blood smears for infection detection in CHMI trials in non-endemic settings. This study led to biomarker qualification through the U.S. Food and Drug Administration for use in CHMI studies at non-endemic sites, which will facilitate biomarker use for the qualified context of use in drug and vaccine trials.
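
The several-day acceleration of NAT-based detection over microscopy follows from the tests' detection limits: under exponential blood-stage growth, the lead time scales with the logarithm of the ratio of the two limits. A toy sketch of that arithmetic follows; the growth rate, starting density, and both detection limits are rough illustrative assumptions, not estimates from this study:

```python
import math

def days_to_threshold(start_per_ml, lod_per_ml, fold_per_cycle=10.0, cycle_days=2.0):
    """Days for parasitemia to grow from a starting density to a test's limit
    of detection (LOD), assuming `fold_per_cycle` growth every `cycle_days`
    days: a crude model of P. falciparum blood-stage expansion."""
    cycles = math.log(lod_per_ml / start_per_ml, fold_per_cycle)
    return cycles * cycle_days

start = 1.0                                  # parasites/mL after liver release (assumed)
qpcr_days = days_to_threshold(start, 2e1)    # assumed qRT-PCR LOD ~20 parasites/mL
smear_days = days_to_threshold(start, 5e4)   # assumed smear LOD ~50,000 parasites/mL
print(f"qRT-PCR positive ~{qpcr_days:.1f} d, smear ~{smear_days:.1f} d, "
      f"lead time ~{smear_days - qpcr_days:.1f} d")
```

With these placeholder numbers the computed lead time is larger than the roughly 3.4 days observed in the trials; real kinetics (inoculum size, sequestration, net multiplication below tenfold) would compress it.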

Journal ArticleDOI
TL;DR: Dengue was the most common pathogen for AUFI in urban Thailand; however, murine typhus and leptospirosis were not uncommon, and empirical antibiotic treatment using doxycycline or azithromycin might be more appropriate, although cost-benefit studies are required.
Abstract: Acute undifferentiated febrile illness (AUFI) has been a diagnostic dilemma in the tropics. Without accurate point-of-care tests, information on local pathogens and clinical parameters is essential for presumptive diagnosis. A prospective hospital-based study was conducted at the Bangkok Hospital for Tropical Diseases from 2013 to 2015 to determine common etiologies of AUFI. A total of 397 adult AUFI cases, excluding malaria by blood smear, were enrolled. Rapid diagnostic tests for tropical infections were performed on admission, and acute and convalescent samples were tested to confirm the diagnosis. Etiologies could be identified in 271 (68.3%) cases. Dengue was the most common cause, with 157 cases (39.6%), followed by murine typhus (20 cases; 5.0%), leptospirosis (16 cases; 4.0%), influenza (14 cases; 3.5%), and bacteremia (six cases; 1.5%). Concurrent infection by at least two pathogens was reported in 37 cases (9.3%). Furthermore, characteristics of dengue and bacterial infections (including leptospirosis and rickettsioses) were compared to facilitate dengue triage, initiate early antibiotic treatment, and minimize unnecessary use of antibiotics. In conclusion, dengue was the most common pathogen for AUFI in urban Thailand. However, murine typhus and leptospirosis were not uncommon. Empirical antibiotic treatment using doxycycline or azithromycin might be more appropriate, but cost-benefit studies are required. Physicians should recognize common causes of AUFI in their localities and use clinical and laboratory clues for provisional diagnosis to provide appropriate treatment while awaiting laboratory confirmation.

Journal ArticleDOI
TL;DR: In hyperendemic areas, 3–5 years of implementation of SAFE is insufficient to achieve trachoma elimination as a public health problem; additional years of intervention with the WHO-recommended SAFE and several rounds of TIS will be required before trachoma is eliminated.
Abstract: At baseline in 2006, Amhara National Regional State, Ethiopia, was the most trachoma-endemic region in the country. Trachoma impact surveys (TIS) were conducted in all districts between 2010 and 2015, following 3-5 years of intervention with the WHO-recommended SAFE (surgery, antibiotics, facial cleanliness, and environmental improvement) strategy. A multistage cluster random sampling design was used to estimate the district-level prevalence of trachoma. In total, 1,887 clusters in 152 districts were surveyed, from which 208,265 individuals from 66,089 households were examined for clinical signs of trachoma. The regional prevalence of trachomatous inflammation-follicular (TF) and trachomatous inflammation-intense among children aged 1-9 years was 25.9% (95% CI: 24.9-26.9) and 5.5% (95% CI: 5.2-6.0), respectively. The prevalence of trachomatous scarring and trachomatous trichiasis among adults aged ≥ 15 years was 12.9% (95% CI: 12.2-13.6) and 3.9% (95% CI: 3.7-4.1), respectively. Among children aged 1-9 years, 76.5% (95% CI: 75.3-77.7) presented with a clean face; 66.2% (95% CI: 64.1-68.2) of households had access to water within 30 minutes round-trip, 48.1% (95% CI: 45.5-50.6) used an improved water source, and 46.2% (95% CI: 44.8-47.5) had evidence of a used latrine. Nine districts had a prevalence of TF below the elimination threshold of 5%. In hyperendemic areas, 3-5 years of implementation of SAFE is insufficient to achieve trachoma elimination as a public health problem; additional years of SAFE and several rounds of TIS will be required before trachoma is eliminated.

Journal ArticleDOI
TL;DR: Findings confirm the intrinsic link between malaria and economic growth and underscore the importance of malaria control in the agenda for sustainable development.
Abstract: Investing in health has been considered a means to achieve economic growth and reduce poverty since the second half of the 20th century. Until then, economic thinking held that the relationship between wealth and health ran one way: wealth was required to achieve health.1–6 This linear wealth-to-health link was weakened by several econometric studies providing evidence that health is a significant determinant of growth.4,7 In 2001, the WHO Commission on Macroeconomics and Health underscored the importance of health as an instrument for economic development and poverty reduction.8 Bloom and Canning described the process of cumulative causality whereby health improvements promote economic growth, which in turn promotes health. Healthier populations are more productive at work and learn more at school, contributing to increased current and future earnings and savings. Associated savings in health-care spending are hypothesized to increase investment in physical and human capital and attract higher foreign investments. Higher income for individuals or countries improves health through different channels, from better nutrition to better public health infrastructure.9 Furthermore, as per the classical Grossman model of health demand, if individuals expect a longer life, their savings and investment in human capital will be greater.10 The mutual reinforcement between health and wealth is also recognized to exist in reverse, whereby sick people are more likely to become poor and those who are poor are more vulnerable to disease.11 Although empirically there is evidence of a strong correlation between health and income both across and within countries over time, the literature has long debated the magnitude of the effects of other factors that simultaneously influence health outcomes and wealth, notably institutional quality.6,9,12,13 Specifically, a portion of the empirical growth literature has long debated the relative importance of potential determinants of economic growth. In 2001, Gallup and Sachs quantified the association between malaria and the level and growth of per capita income over the period 1965–1995. In a cross-country regression framework controlling for historical, geographical, social, economic, and institutional country characteristics, they found that malaria-endemic countries displayed, ceteris paribus, per capita income levels 70% lower than those of nonendemic countries and that a 10% reduction in their malaria exposure index was associated with a 0.26 percentage point increase in annual per capita growth rates.14 By contrast, Acemoglu, Johnson, and Robinson argued that malaria was not the main determinant of economic performance, pointing instead to the central role of institutions in cross-country growth differences.15 Despite this debate, results from the Gallup and Sachs study are still widely cited to support the case for investing in malaria control and elimination in the context of the Sustainable Development Goals.16–25 Since the publication of these studies, significant progress has been made in the fight against malaria, and estimation methods have evolved. Core malaria control interventions have reached unprecedented coverage levels, and this progress contributed to reducing the malaria case incidence rate by an estimated 37% globally over the period 2000–2015 and by 18% over the more recent 2010–2017 period.26 The availability of data and methodologies for estimating malaria burden has also improved. Although some data on malaria incidence were available from the WHO at the time of the Gallup and Sachs study, the authors preferred using a malaria exposure index, defined as the product of the land area subject to malaria and the fraction of malaria cases attributable to Plasmodium falciparum malaria.14 Since then, malaria case incidence measures have been standardized.27,28 New econometric approaches exploiting panel data structures have also become the norm.29,30 Herein, we take a contemporary look at the seminal work of Gallup and Sachs.14 We focus on the period 2000–2017 and draw on the vastly improved and more recent data available on malaria case incidence and other determinants of economic growth, as well as the much higher econometric standards that are now the professional norm. We then attempt a causal analysis of the effect of malaria on economic growth using two instrumental variables (IVs), namely, antimalarial treatment failure and resistance of malaria mosquitoes to pyrethroid-only insecticides. We supplement this study with a sectoral analysis testing the hypothesis that industry sectors that are more labor intensive will have slower growth rates in countries that have higher malaria incidence. Evidence suggests that in addition to the implications for the total size of the economy, malaria is associated with reduced labor productivity and supply.31–37 To explore this further, we use a common approach in macro-econometric modeling, recently applied to the health sector, to quantify the argument that malaria case incidence affects economic growth through labor productivity.38,39
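
The instrumental-variables step can be illustrated with a bare-bones two-stage least squares (2SLS) estimator. This is a toy sketch with simulated data, not the authors' specification, which would include panel fixed effects, controls, and proper standard errors:

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Minimal 2SLS: regress the endogenous regressors X on instruments Z
    (first stage), then regress y on the fitted values X-hat (second stage).
    X and Z should each include a constant column; Z needs at least as many
    columns as X."""
    beta_fs, *_ = np.linalg.lstsq(Z, X, rcond=None)   # first stage
    X_hat = Z @ beta_fs                               # projection of X onto Z
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)  # second stage
    return beta

# Hypothetical toy data: growth regressed on malaria incidence, instrumented
# by treatment-failure and insecticide-resistance measures (as in the text).
rng = np.random.default_rng(0)
n = 200
z1, z2 = rng.normal(size=n), rng.normal(size=n)   # instruments
malaria = 0.8 * z1 + 0.5 * z2 + rng.normal(size=n)
growth = 2.0 - 0.3 * malaria + rng.normal(size=n)  # true effect: -0.3
Z = np.column_stack([np.ones(n), z1, z2])
X = np.column_stack([np.ones(n), malaria])
print(two_stage_least_squares(growth, X, Z))       # approx [2.0, -0.3]
```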

Journal ArticleDOI
TL;DR: Improved testing and increasing awareness of the disease have contributed to efforts to better understand the general risk factors and modes of transmission present in Hawaii and have also helped improve prevention efforts, although we still do not fully understand why cases are concentrated in certain parts of the state over others.
Abstract: Angiostrongyliasis, also known as rat lungworm disease, is caused by the parasitic nematode Angiostrongylus cantonensis. Its primary hosts include several species of rats, primarily those in the genus Rattus,1–5 in which mature A. cantonensis lay eggs. These eggs hatch into first-stage larvae, which are then expelled in the rat's feces. Intermediate hosts, including snails and slugs, ingest the contaminated feces, and the first-stage larvae enter these hosts and develop into third-stage larvae. If a rat eats an infected intermediate host, the third-stage larvae infect the rat, in which they can continue to develop into mature adults, reproduce, and continue the cycle. Human infections with A. cantonensis occur when individuals ingest third-stage larvae of the parasite. In humans, however, the third-stage larvae are not able to develop into their adult stage and, therefore, eventually die after migrating to the central nervous system. The immune system's reaction to the dead parasites is responsible for most symptoms associated with angiostrongyliasis.6 The primary clinical presentation of angiostrongyliasis is eosinophilic meningitis (EM). Common symptoms include headache, stiff neck, paresthesias, vomiting, and nausea; face or limb paralysis, photophobia, and disturbed vision can sometimes present as well.7,8 Uncommonly, in severe cases, high intracranial pressure caused by the infection can result in unconsciousness, coma, and sometimes even death.7,8 Signs and symptoms generally reflect those areas damaged by the migrating larvae and resulting inflammation. Treatment is mainly supportive; lumbar punctures, analgesics, and especially corticosteroids may be used to treat some of the associated symptoms.8 The use of anthelmintic drugs has been controversial, with unclear benefits.8–10 Traditionally, since the first human Angiostrongylus infection was identified in 1945 in Taiwan,11 infections have been most commonly identified in Southeast Asia and the Pacific basin. However, with increased globalization, this parasite has continued to spread to other parts of the world, including the Americas,4,12–14 with cases often identified in travelers returning from regions where angiostrongyliasis is endemic.15–18 In the United States, angiostrongyliasis has been present in Hawaii since at least 1959.19,20 However, recently, A. cantonensis has been found in both mollusk and rat hosts in the Gulf Coast region of the continental United States,21–23 and sporadic autochthonous cases have been identified in other areas as well.24,25 This suggests the range of the parasite continues to expand, and cases may continue to appear in regions previously unaffected. In Hawaii, angiostrongyliasis is endemic and has been reportable to the Hawaii Department of Health (HDOH) since 2007. Two previous assessments, from 1959 to 1965 and from 2001 to 2005,20,26,27 have examined cases of EM related to angiostrongyliasis in Hawaii. We report updated findings on the number and description of angiostrongyliasis cases in Hawaii from 2007 to 2017.

Journal ArticleDOI
TL;DR: It is indicated that surface waters in an urban Brazilian site can serve as an environmental reservoir of AMR and that improving wastewater treatment and sanitation generally may ameliorate AMR dissemination.
Abstract: Surface waters are an unappreciated reservoir of antimicrobial resistance (AMR). Poor sanitation brings different species of environmental bacteria into contact, facilitating horizontal gene transfer. To investigate the role of surface waters as potential reservoirs of AMR, we studied the point prevalence of fecal contamination, AMR genes, and Enterobacteriaceae in an urban lake and rural river system in Northeast Brazil in comparison with a lake and sewer system in Northeast Ohio in the United States. Surface water samples were examined for evidence of human fecal contamination using microbial source tracking and screened for plasmid-mediated fluoroquinolone resistance and carbapenemase genes. Enterobacteriaceae were detected using selective agar followed by antimicrobial susceptibility testing and detection of AMR genes by microarray, and classified by repetitive sequence-based polymerase chain reaction and multilocus sequence typing. Concentrations of human fecal bacteria in the Brazilian urban lake and sewage in Northeast Ohio were similarly high. Filtered water samples from the Brazilian urban lake, however, showed the presence of blaOXA-48, blaKPC, blaVIM-2, qnrS, and aac(6')-Ib-cr, whereas only blaVIM-2 was identified in raw sewage from Northeast Ohio. From the Brazilian urban lake, 85% of the Enterobacteriaceae (n = 40) cultured were resistant to at least one clinically important antibiotic, including ST131 Escherichia coli harboring the extended-spectrum beta-lactamase CTX-M. Although two isolates demonstrated polymyxin resistance, mcr-1/2 was not detected. Our findings indicate that surface waters in an urban Brazilian site can serve as an environmental reservoir of AMR and that improving wastewater treatment and sanitation generally may ameliorate AMR dissemination.

Journal ArticleDOI
TL;DR: Etanercept appears to contribute to the control of inflammation and facilitate corticosteroid taper in patients with extraparenchymal neurocysticercosis, and was associated with clinical improvement, stable disease, absence of recurrence, and lack of serious side effects.
Abstract: Manifestations of neurocysticercosis (NCC) are primarily due to host inflammatory responses directed at drug-damaged or naturally degenerating metacestodes (cysts) of the tapeworm Taenia solium. Prolonged high-dose corticosteroids are frequently required to control this inflammation in complicated disease, often causing severe side effects. Studies evaluating alternatives to corticosteroids are lacking. Here, we describe the clinical course of NCC in 16 patients prescribed etanercept (ETN), a tumor necrosis factor-alpha inhibitor, to control inflammation resulting from anthelmintic treatment. Twelve patients with extraparenchymal NCC were administered ETN with corticosteroids (11/12, 91.7%) and/or methotrexate (9/12, 75.0%). The median age of the subgroup with extraparenchymal NCC was 40 years (range 26-57 years), and 66.7% were male. They were administered ETN for a median period of 311 days (range 31-461 days) and then followed for a median of 3.4 years (range 0.3-6.6 years). Among nine assessable patients, all improved clinically after starting ETN, and one deteriorated transiently. Of the remaining three, one was lost to follow-up and two patients have improved but had not completed their assigned course. Four additional persons with recurrent perilesional edema (PE) episodes were given ETN for a median of 400.5 days (range 366-854 days) and followed post-ETN for a median of 1.7 years (range 0.2-2.4 years). All PE patients improved and two successfully tapered corticosteroids. Etanercept administration was associated with clinical improvement, stable disease, absence of recurrence, and lack of serious side effects. Etanercept appears to contribute to the control of inflammation and facilitate corticosteroid taper.

Journal ArticleDOI
TL;DR: Investigation of the impact of CHIKV exposure on population susceptibility to other emerging viruses may help predict outbreaks and identify cross-reactive immune responses among alphaviruses, which may lead to the development of vaccines targeting multiple viruses.
Abstract: Most alphaviruses are mosquito-borne and can cause severe disease in domesticated animals and humans. The most notable recent outbreak in the Americas was the 2014 chikungunya virus (CHIKV) outbreak, affecting millions and producing disease highlighted by rash and arthralgia. Chikungunya virus is a member of the Semliki Forest (SF) serocomplex, and before its arrival in the Americas, two other members of the SF complex, Una (UNAV) and Mayaro (MAYV) viruses, were circulating in Central and South America. This study examined whether antibodies from convalescent CHIKV patients could cross-neutralize UNAV and MAYV. Considerable cross-neutralization of both viruses was observed, suggesting that exposure to CHIKV can produce antibodies that may mitigate infection with UNAV or MAYV. Understanding the impact of CHIKV exposure on population susceptibility to other emerging viruses may help predict outbreaks; moreover, identification of cross-reactive immune responses among alphaviruses may lead to the development of vaccines targeting multiple viruses.

Journal ArticleDOI
TL;DR: No evidence was found that the WASH intervention resulted in additional reductions in STH infections beyond those achieved with deworming alone over the 2-year trial period, and the role of WASH on STH infections over a longer period of time and in the absence of deworming remains to be determined.
Abstract: Water, sanitation, and hygiene (WASH) interventions have been proposed as an important complement to deworming programs for sustainable control of soil-transmitted helminth (STH) infections. We aimed to determine whether a community-based WASH program had additional benefits in reducing STH infections compared with community deworming alone. We conducted the WASH for WORMS cluster-randomized controlled trial in 18 rural communities in Timor-Leste. Intervention communities received a WASH intervention that provided access to an improved water source, promoted improved household sanitation, and encouraged handwashing with soap. All eligible community members in intervention and control arms received albendazole every 6 months for 2 years. The primary outcomes were infection with each STH, measured using multiplex real-time quantitative polymerase chain reaction. We compared outcomes between study arms using generalized linear mixed models, accounting for clustering at community, household, and individual levels. At study completion, the integrated WASH and deworming intervention did not have an effect on infection with Ascaris spp. (relative risk [RR] 2.87, 95% confidence interval [CI]: 0.66-12.48, P = 0.159) or Necator americanus (RR 0.99, 95% CI: 0.52-1.89, P = 0.987), compared with deworming alone. At the last follow-up, open defecation was practiced by 66.1% (95% CI: 54.2-80.2) of respondents in the control arm versus 40.2% (95% CI: 25.3-52.6) of respondents in the intervention arm (P = 0.005). We found no evidence that the WASH intervention resulted in additional reductions in STH infections beyond that achieved with deworming alone over the 2-year trial period. The role of WASH on STH infections over a longer period of time and in the absence of deworming remains to be determined.

Journal ArticleDOI
TL;DR: Drinking water testing could become a low-cost approach to determine the presence of typhoidal Salmonella in the environment that can guide informed design of blood culture-based surveillance and thus assist policy decisions on investing to control typhoid.
Abstract: With prequalification of a typhoid conjugate vaccine by the World Health Organization, countries are deciding whether and at what geographic scale to provide the vaccine. Optimal local data to clarify typhoid risk are expensive and often unavailable. To determine whether quantitative polymerase chain reaction (qPCR) can be used as a tool to detect typhoidal Salmonella DNA in the environment and approximate the burden of enteric fever, we tested water samples from urban Dhaka, where enteric fever burden is high, and rural Mirzapur, where enteric fever burden is low and sporadic. Sixty-six percent (38/59) of the water sources of Dhaka were contaminated with typhoidal Salmonella DNA, in contrast to none of 33 samples from Mirzapur. If these results can be replicated at larger scale in Bangladesh and other enteric fever-endemic areas, drinking water testing could become a low-cost approach to determine the presence of typhoidal Salmonella in the environment that can, in turn, guide informed design of blood culture-based surveillance and thus assist policy decisions on investing to control typhoid.
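
The site contrast (38 of 59 sources positive in Dhaka vs. 0 of 33 in Mirzapur) is stark enough to test directly. A quick sketch using SciPy's Fisher exact test; this comparison is ours for illustration and not necessarily the analysis the authors performed:

```python
from scipy.stats import fisher_exact

# Rows: Dhaka, Mirzapur; columns: typhoidal Salmonella DNA detected, not detected.
table = [[38, 59 - 38],
         [0, 33]]
_, p_value = fisher_exact(table, alternative="two-sided")
print(f"Fisher exact p = {p_value:.2e}")  # very small: contamination differs by site
```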

Journal ArticleDOI
TL;DR: The finding of L. infantum amastigotes resistant to miltefosine in isolates from patients who eventually failed treatment strongly suggests natural resistance to this drug, as miltefosine had never been used in Brazil before this trial was carried out.
Abstract: In India, visceral leishmaniasis (VL) caused by Leishmania donovani has been successfully treated with miltefosine with a cure rate of > 90%. To assess the efficacy and safety of oral miltefosine against Brazilian VL, which is caused by Leishmania infantum, a phase II, open-label, dose-escalation study of oral miltefosine was conducted in children (aged 2-12 years) and adolescent-adults (aged 13-60 years). Definitive cure was assessed at a 6-month follow-up visit. The cure rate was only 42% (6 of 14 patients) with a recommended treatment of 28 days and 68% (19 of 28 patients) with an extended treatment of 42 days. The in vitro miltefosine susceptibility profile of intracellular amastigote stages of the pretreatment isolates, from cured and relapsed patients, showed a positive correlation with the clinical outcome. The IC50 mean (SEM) of eventual cures was 5.1 (0.4) µM, whereas that of eventual failures was 12.8 (1.9) µM (P = 0.0002). An IC50 above 8.0 µM predicts failure with 82% sensitivity and 100% specificity. The finding of L. infantum amastigotes resistant to miltefosine in isolates from patients who eventually failed treatment strongly suggests natural resistance to this drug, as miltefosine had never been used in Brazil before this trial was carried out.
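
The reported cutoff behaves like a simple diagnostic threshold: sensitivity is the share of eventual failures with IC50 above 8.0 µM, and specificity is the share of eventual cures at or below it. A sketch with hypothetical per-isolate values (the study's individual IC50s are not reproduced here), chosen so the output roughly matches the reported 82%/100%:

```python
def threshold_performance(ic50s_failures, ic50s_cures, cutoff=8.0):
    """Sensitivity/specificity of the rule 'IC50 > cutoff predicts failure'.
    Sensitivity: fraction of eventual failures flagged by the cutoff.
    Specificity: fraction of eventual cures correctly not flagged."""
    sens = sum(x > cutoff for x in ic50s_failures) / len(ic50s_failures)
    spec = sum(x <= cutoff for x in ic50s_cures) / len(ic50s_cures)
    return sens, spec

# Hypothetical pretreatment IC50 values (µM) for relapsed vs. cured patients:
failures = [12.5, 9.1, 15.0, 7.2, 10.4, 6.9, 13.8, 11.2, 9.9, 8.6, 14.1]
cures = [4.2, 5.5, 3.9, 6.1, 4.8, 5.0, 7.3, 6.8, 5.9, 4.4]
print(threshold_performance(failures, cures))  # ~ (0.82, 1.0)
```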

Journal ArticleDOI
TL;DR: Investigation of an outbreak of Rift Valley fever in Kabale district, southwestern Uganda found that eight of 83 animals were positive for RVFV by IgG serology; one goat from the home of a confirmed case tested positive by RT-PCR.
Abstract: Rift Valley fever (RVF) is caused by RVF virus (RVFV), a single-stranded RNA virus in the Bunyavirales order, Phenuiviridae family, and Phlebovirus genus. It is an emerging epidemic disease of humans and livestock and an important endemic problem in sub-Saharan Africa.1 Rift Valley fever virus is transmitted to livestock and humans by the bite of infected mosquitoes or exposure to tissues or blood of infected animals.2 Interepidemic virus maintenance is thought to occur either transovarially in Aedes species mosquitoes or through cycling of low-level transmission between domestic livestock or wild ungulates and mosquitoes.3 After periods of heavy rainfall, Aedes mosquitoes rapidly emerge, resulting in extensive amplification of the virus through infection of livestock.2 Presentation of RVF in animals can vary among species with a range of clinical severity. Livestock, particularly cattle, sheep, and goats, are highly susceptible to RVFV and present with symptoms of fever, loss of appetite, weakness, low milk production, nasal discharge, vomiting, and diarrhea. During large epizootics, "abortion storms," particularly in sheep and cattle, have been identified. High newborn mortality (80–100%) and adult mortality (5–20%) may also be observed.3 Humans infected with RVFV typically have a mild, self-limited febrile illness, but a small number of cases (< 8%) can present with severe jaundice, rhinitis, encephalitis, and hemorrhagic manifestations. Retinal degeneration (5–10% of cases), hemorrhagic fever (< 1%), or encephalitis (< 1%) may also develop.4,5 Outbreaks of RVF have been reported most frequently in East Africa, especially Kenya, Somalia, and Tanzania, with the last major outbreak in the region recorded in 1997–1998. However, outbreaks have also been reported in other African countries, including Egypt, South Africa, Madagascar, and Senegal.6 In 2000, the first RVF outbreak outside of Africa was reported in Saudi Arabia and Yemen.5,7 More recently, Kenya and Tanzania experienced a large epizootic of RVFV in 2006–2007, and Sudan in 2010.6 The emergence of RVFV in East Africa resulted in widespread livestock morbidity and mortality, with hundreds of laboratory-confirmed human cases and likely thousands of asymptomatic or mild RVFV human infections going undetected.8,9 Rift Valley fever virus in Uganda was first detected in mosquitoes collected in Semliki Forest, Western Uganda, in 1944,10 and has since been detected several times by the East African Virus Research Institute (EAVRI), now the Uganda Virus Research Institute (UVRI). Human cases were recorded during outbreaks occurring near Entebbe in 1960 and 1962.11,12 Since then, serological evidence of human and livestock RVFV infections in Uganda has been intermittently reported,13,14 but the last published description of acute human RVFV infection was seven cases occurring near Entebbe in 1968.15 On March 10, 2016, UVRI and the Uganda Ministry of Health (MOH) received a report of a suspected viral hemorrhagic fever (VHF) case presenting to Kabale Regional Referral Hospital (KRRH) in Kabale district in southwestern Uganda. The initial case was a 48-year-old male butcher who had been working in a local abattoir, where livestock were brought for slaughter. The patient presented with a history of fever, vomiting, diarrhea, headache, and hemorrhagic symptoms (bleeding gums, epistaxis, bloody urine, and stools). A blood sample was taken and sent to the UVRI VHF laboratory for testing.
On March 11, a second suspected VHF case presented to KRRH with similar symptoms; a blood sample was collected for testing. This patient was a 16-year-old male student first reported by Kabale District Health Office from the Uganda–Rwanda border village of Katuna. Both samples were tested for hemorrhagic fever viruses, including Ebola viruses, Marburg virus, Crimean–Congo hemorrhagic fever virus, and RVFV. Both specimens were found to be positive for RVFV by reverse transcriptase polymerase chain reaction (RT-PCR) and IgM serology.

Journal ArticleDOI
TL;DR: It is confirmed that markers of benzimidazole resistance are circulating among hookworms in central Ghana, with unknown potential to impact the effectiveness and sustainability of chemotherapeutic approaches to disease transmission and control.
Abstract: Hookworm infection causes anemia, malnutrition, and growth delay, especially in children living in sub-Saharan Africa. The World Health Organization recommends periodic mass drug administration (MDA) of anthelminthics to school-age children (SAC) as a means of reducing morbidity. Recently, questions have been raised about the effectiveness of MDA as a global control strategy for hookworms and other soil-transmitted helminths (STHs). Genomic DNA was extracted from Necator americanus hookworm eggs isolated from SAC enrolled in a cross-sectional study of STH epidemiology and deworming response in Kintampo North Municipality, Ghana. A polymerase chain reaction (PCR) assay was then used to identify single-nucleotide polymorphisms (SNPs) associated with benzimidazole resistance within the N. americanus β-tubulin gene. Both F167Y and F200Y resistance-associated SNPs were detected in hookworm samples from infected study subjects. Furthermore, the ratios of resistant to wild-type SNP at these two loci were increased in posttreatment samples from subjects who were not cured by albendazole, suggesting that deworming drug exposure may enrich resistance-associated mutations. A previously unreported association between F200Y and a third resistance-associated SNP, E198A, was identified by sequencing of F200Y amplicons. These data confirm that markers of benzimidazole resistance are circulating among hookworms in central Ghana, with unknown potential to impact the effectiveness and sustainability of chemotherapeutic approaches to disease transmission and control.

Journal ArticleDOI
TL;DR: A survey of the clinical records of seven European referral centers for the management of patients with CE retrieved data on the clinical management of 32 patients with a diagnosis of bone CE and found that patients endured chronic debilitating disease with a high rate of complications.
Abstract: Cystic echinococcosis (CE) is a zoonosis caused by the larval stage of the tapeworm Echinococcus granulosus. In humans, the infection induces the formation of parasitic cysts mostly in the liver and lungs, but virtually any organ can be affected. CE of the bone is one of the rarest forms of the disease, yet it is also extremely debilitating for patients and hard to manage for clinicians. Unlike abdominal CE, there is currently no expert consensus on the management of bone CE. In this study, we conducted a survey of the clinical records of seven European referral centers for the management of patients with CE and retrieved data on the clinical management of 32 patients with a diagnosis of bone CE. Our survey confirmed that these patients endured chronic debilitating disease with a high rate of complications (84%). We also found that diagnostic approaches were highly heterogeneous. Surgery was extensively used to treat these patients, as was albendazole, occasionally combined with praziquantel or nitazoxanide. Treatment was curative for only two patients, with one requiring amputation of the involved bone. Our survey highlights the need to conduct systematic studies on bone CE, both retrospectively and prospectively.

Journal ArticleDOI
TL;DR: The present review provides the most recent updates on epidemiology, pathobiology, and clinical and prognostic features pertaining to SACC.
Abstract: Schistosoma japonicum is a digenetic blood fluke that has been implicated in the carcinogenesis of several human malignancies, notably liver and colorectal cancer (CRC). Schistosoma japonicum-associated colorectal cancer (SACC) is a distinct subtype with biological behavior analogous to colitis-induced CRC. The clinicopathological characteristics of SACC include young age at diagnosis, predominance among males, a strong predilection for the sigmoid colon and rectum, multifocal distribution, frequent mucinous histology, and poor prognosis. In addition to chronic inflammation, immunomodulation, and schistosomal toxins, bacterial coinfection appears to play an important role in the carcinogenic process. The present review provides the most recent updates on epidemiology, pathobiology, and clinical and prognostic features pertaining to SACC.

Journal ArticleDOI
TL;DR: It is concluded that M. perstans in Ghana harbor Wolbachia that are effectively depleted by doxycycline with subsequent reduction in MF loads, most likely because of interruption of fertility of adult worms.
Abstract: Mansonella perstans infection, a vector-borne disease transmitted by female midges of the genus Culicoides, is not officially listed by the WHO but should nevertheless be considered a neglected tropical disease. It affects more than 100 million people, mainly in rural areas of Central Africa, the Caribbean, and South America.1,2 Recent reports suggest high prevalence in Ghana.3,4 In the middle belt of Ghana, the overall prevalence was 32%, but some communities had prevalences of up to 75%. Contrary to the assertion that maximal infection rates (up to 90%) occur in children aged 10 to 14 years,5 our studies in Ghana showed prevalence peaks after 20 years of age in the Sene West and Atebubu districts in the middle belt,4 suggesting that age prevalence differs with overall endemicity, as is the case with other filarial infections. After microfilariae (MF) are taken up from M. perstans–infected individuals by the midge, infective larvae develop and are then transmitted to human hosts. Adult filariae persist in serous cavities and retroperitoneal tissues for years. Microfilariae released by female worms circulate in the peripheral blood of infected individuals.5–7 Clinical manifestations of M. perstans infection include a wide range of symptoms, such as arthralgias, serositis, angioedema, pruritus, fever, and headaches, although infections are often subclinical.5 Because M. perstans is often coendemic with other filarial parasites, it is difficult to assign clinical symptoms specifically to M. perstans, and the severity of these manifestations is in most cases not obvious.3,8

Mansonella perstans infection is not covered by large-scale programs for the control of filarial diseases such as onchocerciasis and lymphatic filariasis. Mansonella perstans worms often occur in the same geographical areas9 and could be a confounding agent in relation to diagnosis and compliance assessment. It has been shown that M. perstans infection attenuates immune responses associated with severe malaria and protects against anemia.10 By contrast, we have shown that M. perstans–microfilaremic individuals are characterized by increased TH2 and regulatory cell populations, concomitant with reduced systemic cytokine/chemokine and increased filaria-specific IgG4 levels,11 which might lead to increased susceptibility to, and a worsened disease course of, HIV, tuberculosis (TB), and malaria,12–14 and to lower efficacy of bacillus Calmette–Guérin vaccination against TB.15 Infection with another filarial nematode, Wuchereria bancrofti, has recently been shown to increase the risk of acquiring concomitant HIV infection by a factor of 2–3, depending on the age of the patient.16 Filarial nematodes that polarize host immunity toward humoral and T helper type 2–mediated immunity and toward immune-regulatory responses (involving Treg cell responses) are particular candidates for impeding vaccine-induced protective immunity.11,15,17,18

Whatever the clinical consequences of M. perstans infection, the lack of an effective treatment may ultimately be a drawback for health professionals and patients, particularly in light of the United Nations' recent adoption of the Sustainable Development Goals (e.g., SDG 3: ensure healthy lives and promote well-being for all at all ages). These issues highlight the importance of paying more attention to this neglected infection.
Drugs that are usually used against other filarial parasites—diethylcarbamazine (DEC), ivermectin, and albendazole—have shown very limited efficacy against M. perstans.5,19–21 Therefore, the discovery that M. perstans from Mali22,23 and Gabon24 were positive for Wolbachia is a major breakthrough.6 However, the efficacy of treatment with doxycycline has been addressed only in a study in Mali23 and has not yet been repeated in other countries. Confirmation of the efficacy of doxycycline against M. perstans is critical because data on the distribution of Wolbachia endosymbionts in M. perstans have been conflicting.23–25 Therefore, the aim of this study was to determine whether M. perstans worms in Ghana harbor Wolbachia and to evaluate the efficacy of doxycycline (200 mg/day for 6 weeks) in depleting Wolbachia, with analysis of the subsequent filaricidal effects, to assess the usability of doxycycline for M. perstans treatment.

Journal ArticleDOI
TL;DR: The results support the continued use of temperature-dependent models to predict Ae. albopictus abundance in urban areas.
Abstract: The Asian tiger mosquito, Aedes albopictus, transmits several arboviruses of public health importance, including chikungunya and dengue. Since its introduction to the United States in 1985, the species has invaded more than 40 states, including temperate areas not previously at risk of Aedes-transmitted arboviruses. Mathematical models incorporate climatic variables in predictions of site-specific Ae. albopictus abundances to identify human populations at risk of disease. However, these models rely on coarse resolutions of environmental data that may not accurately represent the climatic profile experienced by mosquitoes in the field, particularly in climatically heterogeneous urban areas. In this study, we pair field surveys of larval and adult Ae. albopictus mosquitoes with site-specific microclimate data across a range of land use types to investigate the relationships between microclimate, density of larval habitat, and adult mosquito abundance and determine whether these relationships change across an urban gradient. We find no evidence for a difference in larval habitat density or adult abundance between rural, suburban, and urban land classes. Adult abundance increases with increasing larval habitat density, which itself is dependent on microclimate. Adult abundance is strongly explained by microclimate variables, demonstrating that theoretically derived, laboratory-parameterized relationships in ectotherm physiology apply to the field. Our results support the continued use of temperature-dependent models to predict Ae. albopictus abundance in urban areas.
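The "theoretically derived, laboratory-parameterized relationships in ectotherm physiology" invoked here are commonly expressed as thermal performance curves. As one hedged illustration of that general functional form (the specific curve and parameters used in the study are not stated here), a Brière-type curve can be sketched as follows; all parameter values are hypothetical.

```python
import numpy as np

def briere(temp, c, t_min, t_max):
    """Briere thermal performance curve: zero outside (t_min, t_max),
    rising with temperature to a peak, then falling sharply near the
    upper thermal limit, as is typical for ectotherm traits."""
    temp = np.asarray(temp, dtype=float)
    perf = c * temp * (temp - t_min) * np.sqrt(np.clip(t_max - temp, 0.0, None))
    return np.where((temp > t_min) & (temp < t_max), perf, 0.0)

# Hypothetical parameters for a mosquito life-history trait, evaluated at
# a few example microclimate temperatures (degrees C).
microclimate_temps = np.array([15.0, 22.0, 28.0, 34.0, 40.0])
print(briere(microclimate_temps, c=2.5e-4, t_min=10.0, t_max=38.0))
```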

Journal ArticleDOI
TL;DR: Seasonality of ITN use given access was observed over all nine rainfall typologies and was most pronounced in arid climates and less pronounced where rainfall was relatively constant throughout the year, suggesting that net use is triggered by mosquito density.
Abstract: Seasonal variation in the proportion of the population using an insecticide-treated net (ITN) is well documented and is widely believed to be dependent on mosquito abundance and heat, driven by rainfall and temperature. However, seasonal variation in ITN use has not been quantified controlling for ITN access. Demographic and Health Survey and Malaria Indicator Survey datasets, their georeferenced data, and public rainfall and climate layers were pooled for 21 countries. Nine rainfall typologies were developed from rainfall patterns in Köppen climate zones. For each typology, the odds of ITN use among individuals with access to an ITN within their households ("ITN use given access") were estimated for each month of the year, controlling for region, wealth quintile, residence, year, temperature, and malaria parasitemia level. Seasonality of ITN use given access was observed over all nine rainfall typologies and was most pronounced in arid climates and less pronounced where rainfall was relatively constant throughout the year. Peak ITN use occurred 1–3 months after peak rainfall and corresponded with peak malaria incidence and the average malaria transmission season. The observed lags between peak rainfall and peak ITN use given access suggest that net use is triggered by mosquito density. In equatorial areas, ITN use is likely to be high year-round, given the presence of mosquitoes and an associated year-round perceived malaria risk. These results can be used to inform behavior change interventions to improve ITN use in specific times of the year and to inform geospatial models of the impact of ITNs on transmission.
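The model structure described (odds of ITN use given access by survey month, controlling for region, wealth, residence, and temperature) can be sketched as a logistic regression. The example below runs on synthetic data; the variable names, values, and covariate coding are invented for illustration and do not reproduce the pooled survey analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the pooled survey data: one row per individual
# with household access to an ITN; all values are random placeholders.
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "used_itn": rng.integers(0, 2, n),       # 1 = slept under an ITN last night
    "month": rng.integers(1, 13, n),         # month of survey interview
    "region": rng.integers(1, 5, n),
    "wealth_quintile": rng.integers(1, 6, n),
    "urban": rng.integers(0, 2, n),          # residence: 1 = urban, 0 = rural
    "mean_temp": rng.normal(25, 3, n),       # local temperature covariate
})

# Logistic regression of ITN use given access on survey month,
# controlling for the other covariates.
model = smf.logit(
    "used_itn ~ C(month) + C(region) + C(wealth_quintile) + urban + mean_temp",
    data=df,
).fit(disp=False)

# Month coefficients exponentiated into odds ratios (January as reference).
print(np.exp(model.params.filter(like="C(month)")))
```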

Journal ArticleDOI
TL;DR: The developed lateral flow dipstick test may improve the serodiagnosis of strongyloidiasis and merit further validation studies.
Abstract: The conventional method of detecting Strongyloides stercoralis in fecal samples has poor diagnostic sensitivity. Detection of Strongyloides-specific antibodies increases the sensitivity; however, most such tests are ELISAs that use parasite extract, which may cross-react with sera from other helminth infections. To improve the serological diagnosis of strongyloidiasis, this study aimed to develop a sensitive and specific lateral flow rapid dipstick test. Two recombinant proteins, recombinant NIE (rNIE) and recombinant Ss1a (rSs1a), were used in preparing the dipstick, with gold-conjugated anti-human IgG4 as the detector reagent. In parallel, the corresponding ELISA was performed. Both assays demonstrated a diagnostic sensitivity of 91.3% (21/23) when tested with serum samples of patients with Strongyloides infection, and 100% specificity with 82 sera from asymptomatic (healthy) individuals and those with other parasitic infections. The ELISA and dipstick test results were positively correlated with each other (r = 0.6114, P = 0.0019). The developed lateral flow dipstick test may improve the serodiagnosis of strongyloidiasis and merits further validation studies.
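For reference, the headline accuracy figures follow directly from the reported counts. The sketch below recomputes them with exact (Clopper-Pearson) binomial confidence intervals; the choice of interval method is an assumption, as the abstract does not state which the authors used.

```python
from scipy.stats import binomtest

# Counts as reported in the abstract: 21 of 23 Strongyloides-positive sera
# detected, and all 82 negative-panel sera correctly classified.
for label, k, n in [("sensitivity", 21, 23), ("specificity", 82, 82)]:
    # Exact (Clopper-Pearson) 95% CI for a binomial proportion; the paper's
    # own CI method is unknown, so this is an illustrative assumption.
    ci = binomtest(k, n).proportion_ci(confidence_level=0.95)
    print(f"{label} = {k / n:.1%} (95% exact CI {ci.low:.1%}-{ci.high:.1%})")
```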

Journal ArticleDOI
TL;DR: The high prevalence of fecal contamination in drinking water highlights the importance of household water treatment methods and the high levels of antibiotic resistance found indicate a need for further research to identify the origins of potential environmental contamination, misuse, or inadequate disposal of antibiotics.
Abstract: Antibiotic resistance in pathogenic bacteria is a serious public health issue. This growing threat is cause for concern and demands action to prevent the emergence of new resistant strains and the spread of existing ones to humans via the environment. This study aimed to identify fecal pathogens in drinking water obtained from rural Andean households in Cajamarca, Peru, and to measure the antibiotic resistance profile of Escherichia coli. The study was embedded within a community-randomized controlled trial among 102 communities in the northern highlands of the Cajamarca region, Peru. Of 314 samples, 55.4% (95% CI [49.7, 61.0], n = 174) were positive for thermotolerant coliforms. Among the samples positive for thermotolerant coliforms, the organisms isolated were E. coli (37.3% of all samples, n = 117), Klebsiella spp. (8.0%, n = 25), Enterobacter spp. (5.1%, n = 16), and Citrobacter spp. (2.5%, n = 8). Of the 117 E. coli isolates, 48.7% (95% CI [39.4, 58.1], n = 57) showed resistance to at least one antibiotic. The E. coli antibiotic resistance profile showed the highest resistance against tetracycline (37.6%), ampicillin (34.2%), sulfamethoxazole-trimethoprim (21.4%), and nalidixic acid (13%). Some 19.7% (95% CI [12.9, 28.0], n = 23) of the E. coli isolates displayed multidrug resistance, defined as resistance to at least three classes of antibiotics. The CTX-M-3 gene, which encodes an extended-spectrum beta-lactamase conferring resistance to beta-lactam antibiotics, was found in one isolate. The high prevalence of fecal contamination in drinking water highlights the importance of household water treatment methods. Likewise, the high levels of antibiotic resistance indicate a need for further research to identify the origins of potential environmental contamination, misuse, or inadequate disposal of antibiotics.
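The abstract's prevalence estimates and intervals can be approximately reproduced from the reported counts. The sketch below uses Wilson score intervals, which is an assumption on my part; the authors' exact CI method is not specified in the abstract, so results may differ slightly from the published values.

```python
from statsmodels.stats.proportion import proportion_confint

# Counts taken directly from the abstract.
results = [
    ("thermotolerant coliforms in water samples", 174, 314),
    ("E. coli resistant to any antibiotic", 57, 117),
    ("multidrug-resistant E. coli", 23, 117),
]

for label, count, nobs in results:
    point = count / nobs
    # Wilson 95% CI for a binomial proportion (method choice assumed).
    lo, hi = proportion_confint(count, nobs, method="wilson")
    print(f"{label}: {point:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```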