
Showing papers in "The Journal of Infectious Diseases in 2013"


Journal ArticleDOI
TL;DR: Within 4 years of vaccine introduction, the vaccine-type HPV prevalence decreased among females aged 14-19 years despite low vaccine uptake, and the estimated vaccine effectiveness was high.
Abstract: Background. Human papillomavirus (HPV) vaccination was introduced into the routine immunization schedule in the United States in late 2006 for females aged 11 or 12 years, with catch-up vaccination recommended for those aged 13-26 years. In 2010, 3-dose vaccine coverage was only 32% among 13-17 year-olds. Reduction in the prevalence of HPV types targeted by the quadrivalent vaccine (HPV-6, -11, -16, and -18) will be one of the first measures of vaccine impact. Methods. We analyzed HPV prevalence data from the vaccine era (2007-2010) and the prevaccine era (2003-2006) that were collected during National Health and Nutrition Examination Surveys. HPV prevalence was determined by the Linear Array HPV Assay in cervicovaginal swab samples from females aged 14-59 years; 4150 provided samples in 2003-2006 and 4253 provided samples in 2007-2010. Results. Among females aged 14-19 years, the vaccine-type HPV prevalence (HPV-6, -11, -16, or -18) decreased from 11.5% (95% confidence interval [CI], 9.2-14.4) in 2003-2006 to 5.1% (95% CI, 3.8-6.6) in 2007-2010, a decline of 56% (95% CI, 38-69). Among other age groups, the prevalence did not differ significantly between the 2 time periods (P > .05). The vaccine effectiveness of at least 1 dose was 82% (95% CI, 53-93). Conclusions. Within 4 years of vaccine introduction, the vaccine-type HPV prevalence decreased among females aged 14-19 years despite low vaccine uptake. The estimated vaccine effectiveness was high.
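As a quick check of the headline figure, the reported 56% decline follows directly from the two prevalence estimates quoted above; the 82% vaccine effectiveness, by contrast, requires individual-level vaccination data and cannot be reproduced from the abstract alone.

```latex
\text{relative decline} \;=\; 1 - \frac{p_{2007\text{-}2010}}{p_{2003\text{-}2006}}
\;=\; 1 - \frac{5.1\%}{11.5\%} \;\approx\; 0.56 \quad (\text{i.e., a } 56\% \text{ decline})
```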

535 citations


Journal ArticleDOI
TL;DR: Low drug AUCs are predictive of clinical outcomes in tuberculosis patients, and low rifampin and isoniazid peak and AUC concentrations preceded all cases of acquired drug resistance.
Abstract: In African countries with a high tuberculosis burden, the 2-month sputum culture conversion rate is only 50%–70%, and acquired drug resistance (ADR) continues to be a major problem [1–4]. In the laboratory, the hollow-fiber model of tuberculosis has demonstrated that the microbial kill and ADR prevention of first-line anti-tuberculosis agents are driven by such drug concentration measures as the 0–24 hour area under the concentration–time curve (AUC) to minimum inhibitory concentration (MIC) ratio and the peak concentration to MIC ratio [5–7]. These data have been used in computer-aided clinical trial simulations in the face of 100% adherence, which led us to hypothesize that between-patient pharmacokinetic variability could explain a large proportion of therapy failure and that adherence plays a relatively minor role [8, 9]. Here, we investigated whether such pharmacokinetic variability would indeed lead to a large proportion of patients failing to attain adequate concentrations and then failing therapy. We also sought to identify the drug concentrations that are predictive of clinical outcome. Several attempts to relate drug concentrations to tuberculosis outcomes have been made with conflicting results [10–16]. The reasons are unclear, but there are several possibilities. First, in some studies, a single measure such as the 2-hour drug concentration (peak) was used to dichotomize patients into those with poor vs good outcomes. However, since drug AUCs are strongly associated with efficacy of first-line anti-tuberculosis agents in preclinical models [5–7, 17], a more intensive multisample schedule that allows AUC identification may be more informative. Second, several studies utilized predetermined peak concentration drug cutoff values to classify patients as having either low or high drug concentrations. These are peak concentrations of 3–5 mg/L for isoniazid, 8–24 mg/L for rifampin, and 20–50 mg/L for pyrazinamide [18]. These concentrations need further validation with regard to clinical outcomes. Third, noncompartmental pharmacokinetic analysis was utilized in some studies; however, pharmacokinetics of some anti-tuberculosis drugs may be best described using multiple compartments. A fourth possible reason may be the type of statistical analysis used. Biological systems such as anti-tuberculosis drug pharmacokinetics and tuberculosis disease itself are best analyzed using nonlinear statistical approaches, since they, like most natural phenomena, are nonlinear systems [19–23]. In linear analysis, complex problems are broken into smaller components that are then solved, after which the solutions are put together (superimposed) and added up to a solution of the whole problem. Nonlinear systems are characterized by discontinuities and relationships of higher-order complexity between components; the total function of the whole system is often more than the linear sum of its components. Therefore, components need to be analyzed in the context of all parameters interacting within the whole system. Here, we utilized classification and regression tree analysis (CART) to examine the role of several clinical factors, including drug concentrations, in toto, as predictors of clinical outcome in our cohort [24–29]. CART uses nonparametric techniques that examine both linear and nonlinear interactions simultaneously in the whole dataset and creates a hierarchy of predictors, ordered from the most predictive to the least predictive.
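The tree-based approach described above can be illustrated with a short, purely hypothetical sketch. The predictor names, thresholds, and simulated data below are invented for illustration, and scikit-learn's CART-style decision tree stands in for the authors' actual analysis pipeline:

```python
# Illustrative sketch only (not the authors' analysis): fit a CART-style
# classification tree on hypothetical drug-exposure predictors and print the
# resulting hierarchy of splits, most predictive variable at the root.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 200
# Hypothetical 0-24 h AUC values (mg*h/L) for three first-line drugs
X = np.column_stack([
    rng.uniform(5, 60, n),    # "pza_auc": pyrazinamide AUC
    rng.uniform(10, 80, n),   # "rif_auc": rifampin AUC
    rng.uniform(5, 40, n),    # "inh_auc": isoniazid AUC
])
# Toy outcome: poor outcome (1) made more likely when either AUC is low
y = ((X[:, 0] < 25) | (X[:, 1] < 30)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
# The printed tree is the hierarchy of predictors CART produces
print(export_text(tree, feature_names=["pza_auc", "rif_auc", "inh_auc"]))
```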

374 citations


Journal ArticleDOI
TL;DR: ART initiation <6 months after infection is associated with lower levels of T-cell activation and smaller HIV DNA and RNA reservoir size during long-term therapy.
Abstract: (See the editorial commentary by Henrich and Gandhi on pages 1189–93 and the major article by Yukl et al on pages 1212–20.) T-cell immune activation, defined by coexpression of CD38 and HLA-DR on CD4+ and CD8+ T cells, has been linked to morbidity and mortality in untreated human immunodeficiency virus (HIV) infection, both AIDS-related and non-AIDS-related [1, 2]. In patients receiving HIV antiretroviral therapy (ART), higher levels of T-cell activation have been linked to diminished CD4+ T-cell count recovery [3–5], surrogate markers of cardiovascular disease [6, 7], and increased mortality [8]. ART-mediated virologic suppression reduces both CD4+ and CD8+ T-cell activation levels [3, 9]. However, patients frequently continue to exhibit T-cell activation levels higher than those seen in HIV-negative controls, indicating that ART reduces but does not fully reverse HIV-related immune activation [3, 9, 10]. A key unanswered question, therefore, is whether ART initiated early (within the first months after HIV infection) can reduce on-therapy immune activation more than ART initiated during chronic HIV infection. A related question is whether early initiation of ART durably limits the size of persistent HIV reservoirs. Cellular reservoirs of latent genomically integrated HIV are established quickly after infection [11–14], but studies investigating whether ART initiated in the first few months after infection can limit the size of the established reservoir have reached inconsistent conclusions. One study found that ART initiated <6 months after HIV infection was associated with substantially decreased cell-associated HIV infectivity [15], but other studies have reached different conclusions [11, 16–18]. Controversy has persisted in part because of the small size of these studies and the lack of direct comparison between early- and later-treated individuals. The growing research effort aimed at eradicating HIV infection will depend on more precisely elucidating the role early ART may play in limiting the growth of the viral reservoir. We hypothesized that ART initiated in the first 6 months after HIV infection, compared with ART initiated during chronic disease, would lead to lower levels of on-therapy T-cell activation and a smaller HIV reservoir size. To test this hypothesis, we studied individuals in whom HIV infection was diagnosed during acute infection and who either started ART ≤6 months after infection or deferred therapy for >2 years. We assessed T-cell activation levels and measured the HIV reservoir (based on both HIV DNA and cell-associated RNA levels) to study the impact of early ART on these outcomes.

290 citations


Journal ArticleDOI
TL;DR: Malaria LAMP dramatically lowers the detection threshold achievable in malaria-endemic settings, providing a new tool for diagnosis, surveillance, and screening in elimination strategies.
Abstract: (See the major article by Polley et al on pages 637–44.) Accurate detection of malaria is of increasing importance as the malaria prevalence declines across much of its range [1, 2], with surveillance and screening becoming increasingly important in program management [3–5]. Microscopy and antigen-detecting rapid diagnostic tests (RDTs), when performed correctly, provide accurate diagnosis for case management [6] but cannot reliably detect lower-density parasitemia that may occur in asymptomatic individuals, who then represent reservoirs of infection. Such low-density infections with Plasmodium species are widely documented [7–12] and may contribute to transmission [13–15]. To eliminate malaria and prevent reintroduction, sustaining the capacity to detect such infections will be critical. The development of field-ready assays that can detect foci of infection in a way timely enough to enable treatment is therefore recognized as a major priority for malaria elimination [3, 16, 17]. Such highly sensitive assays could also benefit antimalarial drug efficacy monitoring, vaccine studies, and screening of vulnerable populations, such as pregnant women, in which low-density infections may have significant clinical consequences [18–20]. Polymerase chain reaction (PCR) detects parasite DNA, can identify infections below the threshold of detection for microscopy and RDTs, and is commonly considered the gold standard to detect malaria infection. However, PCR requires sophisticated laboratory infrastructure and advanced training, making it challenging and costly to implement in most malaria-endemic areas. Although PCR is used in some cases for focal screening and treatment strategies [11, 21], its restriction to central reference laboratories, often far from the sampled population, greatly limits its usefulness. Loop-mediated isothermal amplification (LAMP) may offer a practical alternative. Like PCR, LAMP is a molecular technique that amplifies nucleic acids but uses simpler equipment and is less time intensive. A prototype LAMP assay designed for use in resource-constrained settings was developed through a public/private collaboration between the Foundation for Innovative New Diagnostics, Switzerland, and Eiken Chemical Co., Ltd., Japan. The assay forms the basis for a CE-marked product commercially released in July 2012 as the Loopamp Malaria Pan/Pf Detection Kit, list number LMC562 (Eiken Chemical). It is consistent with a recent description of “an ideal LAMP detection format” [22], and the kit consists of vacuum-dried reagents stable at ambient temperature. The assay's primers target Plasmodium genus or P. falciparum–specific sequences. Assay performance includes either of 2 simple blood-processing methods, a 40-minute reaction time in a closed amplification unit, and a visual readout. The result is essentially qualitative. The relative simplicity and robustness of LAMP opens the potential for sustainable nucleic acid amplification in laboratories and near-patient locations in malaria-endemic countries. An evaluation of the LAMP test kit in a United Kingdom reference laboratory, reported in the article by Polley et al in this issue of the Journal, showed that LAMP sensitivity approximated that of nested PCR [23]. This article presents results of the same kit evaluated in a rural clinic in Uganda.

277 citations


Journal ArticleDOI
TL;DR: These data are the first clinical demonstration of the activity of any integrase inhibitor in subjects with HIV-1 resistant to RAL, and dolutegravir 50 mg twice daily with an optimized background provided greater and more durable benefit than the once-daily regimen.
Abstract: Integrase inhibitors (INIs) represent a class of drugs for the treatment of human immunodeficiency virus (HIV)–infected individuals, blocking HIV genome integration into the host cell DNA [1]. They have been shown to be highly effective for the treatment of antiretroviral-naive and antiretroviral-experienced subjects, as demonstrated first with raltegravir (RAL) and more recently with elvitegravir (EVG) [2–6]. However, these first-generation INIs share common resistance pathways. In clinical studies of RAL, subjects with virologic failure and reduced RAL susceptibility typically harbored virus with 1 of 3 signature mutational pathways (ie, N155H, Q148H/K/R, or Y143C/H/R) in the integrase gene [7]. Continuing RAL treatment in these circumstances may lead to the addition of secondary mutations or pathway evolution; N155H may evolve to Y143 or Q148 pathways [4]. In addition, EVG does not appear to have activity against RAL-resistant isolates, and RAL does not appear to have activity against EVG-resistant isolates [8–10]. Therefore, there is a need for an INI with a high barrier to resistance and activity in subjects with human immunodeficiency virus type 1 (HIV-1) resistant to EVG and RAL. Dolutegravir (DTG) is a new HIV-1 INI that has demonstrated good efficacy and safety in treatment-naive, HIV-infected individuals [11]. In vitro studies demonstrate limited cross-resistance between DTG and RAL or EVG, with no or minimal impact on DTG fold-change against Q148 single mutants or against viruses with Y143 or N155 signature mutations regardless of RAL-associated secondary mutations [12, 13]. However, the DTG fold-change increased for Q148H/K/R as secondary RAL resistance–associated mutations increased. On the basis of these in vitro findings, this phase IIb pilot study was conducted to assess and demonstrate the activity of DTG in HIV-1–infected individuals with RAL-resistant viral isolates.

269 citations


Journal ArticleDOI
TL;DR: Most current fluoroquinolone-resistant E. coli clinical isolates represent a highly clonal subgroup (the ST131 subclone H30) that expanded abruptly after 2000 and likely originated from a single, rapidly disseminated strain.
Abstract: Background. Fluoroquinolone-resistant Escherichia coli are increasingly prevalent. Their clonal origins—potentially critical for control efforts—remain undefined. Methods. Antimicrobial resistance profiles and fine clonal structure were determined for 236 diverse-source historical (1967–2009) E. coli isolates representing sequence type ST131 and 853 recent (2010–2011) consecutive E. coli isolates from 5 clinical laboratories in Seattle, Washington, and Minneapolis, Minnesota. Clonal structure was resolved based on fimH sequence (fimbrial adhesin gene: H subclone assignments), multilocus sequence typing, gyrA and parC sequence (fluoroquinolone resistance-determining loci), and pulsed-field gel electrophoresis. Results. Of the recent fluoroquinolone-resistant clinical isolates, 52% represented a single ST131 subclonal lineage, H30, which expanded abruptly after 2000. This subclone had a unique and conserved gyrA/parC allele combination, supporting its tight clonality. Unlike other ST131 subclones, H30 was significantly associated with fluoroquinolone resistance and was the most prevalent subclone among current E. coli clinical isolates, overall (10.4%) and within every resistance category (11%–52%). Conclusions. Most current fluoroquinolone-resistant E. coli clinical isolates, and the largest share of multidrug-resistant isolates, represent a highly clonal subgroup that likely originated from a single rapidly expanded and disseminated ST131 strain. Focused attention to this strain will be required to control the fluoroquinolone and multidrug-resistant E. coli epidemic.

258 citations


Journal ArticleDOI
TL;DR: The ability of donor cells to engraft without evidence of ongoing HIV-1 infection suggests that HIV-1 replication may be fully suppressed in patients receiving combination antiretroviral therapy (cART) and does not contribute to maintenance of viral reservoirs in peripheral blood.
Abstract: Background. The long-term impact of allogeneic hematopoietic stem cell transplantation (HSCT) on human immunodeficiency virus type 1 (HIV-1) reservoirs in patients receiving combination antiretroviral therapy (cART) is largely unknown. Methods. We studied the effects of a reduced-intensity conditioning allogeneic HSCT from donors with wild-type CCR5+ cells on HIV-1 peripheral blood reservoirs in 2 patients heterozygous for the ccr5Δ32 mutation. In-depth analyses of the HIV-1 reservoir size in peripheral blood, coreceptor use, and specific antibody responses were performed on samples obtained before and up to 3.5 years after HSCT receipt. Results. Although HIV-1 DNA was readily detected in peripheral blood mononuclear cells (PBMCs) before and 2–3 months after HSCT receipt, HIV-1 DNA and RNA were undetectable in PBMCs, CD4+ T cells, or plasma up to 21 and 42 months after HSCT. The loss of detectable HIV-1 correlated temporally with full donor chimerism, development of graft-versus-host disease, and decreases in HIV-specific antibody levels. Conclusions. The ability of donor cells to engraft without evidence of ongoing HIV-1 infection suggests that HIV-1 replication may be fully suppressed during cART and does not contribute to maintenance of viral reservoirs in peripheral blood in our patients. HSCTs with wild-type CCR5+ donor cells can lead to a sustained reduction in the size of the peripheral reservoir of HIV-1.

257 citations


Journal ArticleDOI
TL;DR: In a prospective study of university students, kissing was a significant risk factor for primary EBV infection, 89% of infections were symptomatic, and blood viral load and CD8+ lymphocytosis correlated with disease severity.
Abstract: (See the editorial commentary by Rickinson and Fox, on pages 6–8.) Epstein-Barr virus (EBV) was discovered in 1964 [1] and conclusively linked to infectious mononucleosis in 1968 [2]. More than 4 decades later, however, the incidence and risk factors for acquisition and correlates of severity of primary EBV infection are incompletely understood. The prevalence of EBV antibodies varies widely by age and geographic location [3–5], and the inference has been that the later in life EBV infection is acquired, the more likely it will be symptomatic [6]. However, the proportion of symptomatic infections in any age group is not well described. Furthermore, infectious mononucleosis has been postulated to be an immunopathologic disease whose symptoms are caused by the CD8+ response to the virus rather than to the virus per se. Indeed, CD8+ lymphocytosis—not viremia—was associated with symptomatic disease in a previous study of primary EBV infection [7]. An accurate assessment of the burden of disease attributable to primary EBV infection is an important component in the development of a prophylactic vaccine or antiviral therapy. This information can only be obtained by a prospective study, which we were able to perform because we had access to >4000 resident freshmen on the campus of the University of Minnesota, whose rate of natural infection is relatively high [8]. This report describes our findings in students from 2 freshman classes followed prospectively throughout their undergraduate years with frequent reporting of health histories, clinical monitoring, quantitative virologic testing, and immunologic testing. We measured the following factors to define their range, kinetics, and correlation with disease severity during primary infection: blood and oral viral loads; CD8+, CD4+, and natural killer (NK) cell numbers; and T-cell activation markers.

256 citations


Journal ArticleDOI
TL;DR: Using Poisson regression among human immunodeficiency virus (HIV)-positive persons with initially normal renal function, the authors identified tenofovir, ritonavir-boosted atazanavir, and ritonavir-boosted lopinavir as independent predictors of chronic renal impairment and found that tenofovir discontinuation rates rose as eGFR declined.
Abstract: BACKGROUND: Several antiretroviral agents (ARVs) are associated with chronic renal impairment, but the extent of such adverse events among human immunodeficiency virus (HIV)-positive persons with initially normal renal function is unknown. METHODS: D:A:D study participants with an estimated glomerular filtration rate (eGFR) of ≥ 90 mL/min after 1 January 2004 were followed until they had a confirmed eGFR of ≤ 70 mL/min (the threshold below which we hypothesized that renal interventions may begin to occur) or ≤ 60 mL/min (a value indicative of moderately severe chronic kidney disease [CKD]) or until the last eGFR measurement during follow-up. An eGFR was considered confirmed if it was detected at 2 consecutive measurements ≥ 3 months apart. Predictors and eGFR-related ARV discontinuations were identified using Poisson regression. RESULTS: Of 22 603 persons, 468 (2.1%) experienced a confirmed eGFR of ≤ 70 mL/min (incidence rate, 4.78 cases/1000 person-years of follow-up [95% confidence interval {CI}, 4.35-5.22]) and 131 (0.6%) experienced CKD (incidence rate, 1.33 cases/1000 person-years of follow-up [95% CI, 1.10-1.56]) during a median follow-up duration of 4.5 years (interquartile range [IQR], 2.7-6.1 years). A current eGFR of 60-70 mL/min was associated with significantly higher rates of discontinuation of tenofovir (adjusted incidence rate ratio [aIRR], 1.72 [95% CI, 1.38-2.14]) but not other ARVs compared with a current eGFR of ≥ 90 mL/min. Cumulative tenofovir use (aIRR, 1.18/year [95% CI, 1.12-1.25]) and ritonavir-boosted atazanavir use (aIRR, 1.19/year [95% CI, 1.09-1.32]) were independent predictors of a confirmed eGFR of ≤ 70 mL/min but were not significant predictors of CKD, whereas ritonavir-boosted lopinavir use was a significant predictor for both end points (aIRR, 1.11/year [95% CI, 1.05-1.17] and 1.22/year [95% CI, 1.16-1.28], respectively). Associations were unaffected by censoring for concomitant ARV use but diminished after discontinuation of these ARVs. CONCLUSIONS: Tenofovir, ritonavir-boosted atazanavir, and ritonavir-boosted lopinavir use were independent predictors of chronic renal impairment in HIV-positive persons without preexisting renal impairment. Increased tenofovir discontinuation rates with decreasing eGFR may have prevented further deteriorations. After discontinuation, the ARV-associated incidence rates decreased.
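The incidence rates and adjusted incidence rate ratios (aIRRs) quoted above come from Poisson regression on person-time at risk. A minimal, self-contained sketch of that general approach, using entirely made-up event counts and exposure values rather than D:A:D data, might look like this:

```python
# Minimal sketch of incidence rates and Poisson regression with a person-years
# offset. All numbers are hypothetical; this is not the D:A:D dataset or model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "events":    [30, 42, 55, 61, 70],             # confirmed eGFR <=70 mL/min events
    "pyrs":      [8000, 7500, 7000, 6500, 6000],   # person-years at risk
    "tdf_years": [0.0, 1.0, 2.0, 3.0, 4.0],        # cumulative tenofovir exposure
})
# Crude incidence rate per 1000 person-years of follow-up
df["rate_per_1000"] = 1000 * df["events"] / df["pyrs"]
print(df)

# Poisson model: log(E[events]) = intercept + beta * tdf_years + log(pyrs)
X = sm.add_constant(df[["tdf_years"]])
fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
             offset=np.log(df["pyrs"])).fit()
# exp(beta) is the incidence rate ratio per additional year of exposure
print(np.exp(fit.params))
```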

247 citations


Journal ArticleDOI
TL;DR: Measurement of biosignatures during clinical trials of new drugs could be useful predictors of rapid bactericidal or sterilizing drug activity, and would expedite the licensing of new treatment regimens.
Abstract: Background. Accurate assessment of treatment efficacy would facilitate clinical trials of new antituberculosis drugs. We hypothesized that early alterations in peripheral immunity could be measured by gene expression profiling in tuberculosis patients undergoing successful conventional combination treatment. Methods. Ex vivo blood samples from 27 pulmonary tuberculosis patients were assayed at diagnosis and during treatment. RNA was processed and hybridized to Affymetrix GeneChips, to determine expression of over 47 000 transcripts. Results. There were significant ≥2-fold changes in expression of >4000 genes during treatment. Rapid, large-scale changes were detected, with down-regulated expression of 1261 genes within the first week, including inflammatory markers such as complement components C1q and C2. This was followed by slower changes in expression of different networks of genes, including a later increase in expression of B-cell markers, transcription factors, and signaling molecules. Conclusions. The fast initial down-regulation of expression of inflammatory mediators coincided with rapid killing of actively dividing bacilli, whereas slower delayed changes occurred as drugs acted on dormant bacilli and coincided with lung pathology resolution. Measurement of biosignatures during clinical trials of new drugs could be useful predictors of rapid bactericidal or sterilizing drug activity, and would expedite the licensing of new treatment regimens.

221 citations


Journal ArticleDOI
TL;DR: Cell-based measurements of viral persistence were consistently associated with markers of immune activation and the frequency of PD-1-expressing CD4+ T cells, suggesting that HIV infection in these individuals may be more difficult to cure and may require unique interventions.
Abstract: Despite the effectiveness of long-term therapy, multiple studies have shown that human immunodeficiency virus (HIV) persists indefinitely in plasma [1, 2], peripheral blood mononuclear cells (PBMCs) [3, 4], and tissues [5–7]. Multiple mechanisms contribute to this phenomenon, including the presence of long-lived latently infected CD4+ T cells [8], ongoing cryptic viral replication [9, 10], ineffective HIV-specific responses [11], and persistent immune activation [12–15]. The host immune environment is likely to be a strong determinant of each of these mechanisms. For example, a high density of activated CD4+ T cells in lymphoid tissues could support isolated rounds of de novo infection [10, 16]. In addition, a chronic inflammatory environment might lead to dysfunctional HIV-specific T cells [13, 14] and, hence, to a relative inability to clear infected cells [15]. Finally, the upregulation of certain “negative regulators” of T-cell activation (eg, programmed cell death protein 1 [PD-1]) has been postulated as a mechanism that enables recently infected CD4+ T cells to shift toward a state of persistence rather than one of activation-induced cell death [15]. Expression of PD-1 causes impaired HIV-specific immunity [14, 17], and PD-1high CD4+ T cells are highly enriched in integrated HIV DNA [15]. Given the potential role of the host immune environment in maintaining HIV persistence, we performed a comprehensive assessment of virologic and T-cell immunophenotyping in a large cohort of effectively treated individuals.

Journal ArticleDOI
TL;DR: In a susceptibility study of 28 cell lines, HCoV-EMC showed a broad human tissue tropism; the virus, previously reported to infect primate, porcine, and bat cells and therefore potentially able to jump interspecies barriers, was also found to infect civet lung fibroblast and rabbit kidney cell lines.
Abstract: The emerging novel human betacoronavirus 2c EMC/2012 (HCoV-EMC) was recently isolated from patients with severe pneumonia and renal failure and was associated with an unexplained high crude fatality rate of 56%. We performed a cell line susceptibility study with 28 cell lines. HCoV-EMC was found to infect the human respiratory tract (polarized airway epithelium cell line Calu-3, embryonic fibroblast cell line HFL, and lung adenocarcinoma cell line A549), kidney (embryonic kidney cell line HEK), intestinal tract (colorectal adenocarcinoma cell line Caco-2), liver cells (hepatocellular carcinoma cell line Huh-7), and histiocytes (malignant histiocytoma cell line His-1), as evidenced by detection of high or increasing viral load in culture supernatants, detection of viral nucleoprotein expression by immunostaining, and/or detection of cytopathic effects. Although an infected human neuronal cell line (NT2) and infected monocyte and T lymphocyte cell lines (THP-1, U937, and H9) had increased viral loads, their relatively low viral production was corroborated by absent nucleoprotein expression and cytopathic effects. This range of human tissue tropism is broader than that for all other HCoVs, including SARS coronavirus, HCoV-OC43, HCoV-HKU1, HCoV-229E, and HCoV-NL63, which may explain the high mortality associated with this disease. A recent cell line susceptibility study showed that HCoV-EMC can infect primate, porcine, and bat cells and therefore may jump interspecies barriers. We found that HCoV-EMC can also infect civet lung fibroblast and rabbit kidney cell lines. These findings have important implications for the diagnosis, pathogenesis, and transmission of HCoV-EMC.

Journal ArticleDOI
TL;DR: HCPs within 1.829 m of patients with influenza could be exposed to infectious doses of influenza virus, primarily in small-particle aerosols, which questions the current paradigm of localized droplet transmission during non-aerosol-generating procedures.
Abstract: Background. Defining dispersal of influenza virus via aerosol is essential for the development of prevention measures. Methods. During the 2010–2011 influenza season, subjects with influenza-like illness were enrolled in an emergency department and throughout a tertiary care hospital, nasopharyngeal swab specimens were obtained, and symptom severity, treatment, and medical history were recorded. Quantitative impaction air samples were taken at 0.305 m (1 foot), 0.914 m (3 feet), and 1.829 m (6 feet) from the patient’s head during routine care. Influenza virus was detected by rapid test and polymerase chain reaction. Results. Sixty-one of 94 subjects (65%) tested positive for influenza virus. Twenty-six patients (43%) released influenza virus into room air, with 5 (19%) emitting up to 32 times more virus than others. Emitters surpassed the airborne 50% human infectious dose of influenza virus at all sample locations. Healthcare professionals (HCPs) were exposed to mainly small influenza virus particles (diameter, <4.7 µm), with concentrations decreasing with increasing distance from the patient’s head (P < .05). Influenza virus release was associated with high viral loads in nasopharyngeal samples (shedding), coughing, and sneezing (P < .05). Patients who reported severe illness and major interference with daily life also emitted more influenza virus (P < .05). Conclusions. HCPs within 1.829 m of patients with influenza could be exposed to infectious doses of influenza virus, primarily in small-particle aerosols. This finding questions the current paradigm of localized droplet transmission during non–aerosol-generating procedures.

Journal ArticleDOI
TL;DR: Increasing anti-HA and NA antibody levels in serum and secretions correlated with reduced rates of pH1N1 influenza virus infection and illness in healthy young adults.
Abstract: Background. Serum antibody to the hemagglutinin (HA) of influenza viruses is a correlate and predictor of immunity to influenza in humans; the relative values of other correlates are uncertain. Methods. Serum and nasal secretions (NS) were collected in fall and spring of 2009–2011 from healthy adults who were monitored for acute respiratory illness (ARI). Serum samples were tested for hemagglutination-inhibition (HAI) antibody increase and secretions for virus if ill; enrollment sera were also tested for neuraminidase-inhibiting (NI) antibody and NS for neutralizing (neut), NI, immunoglobulin A (IgA), and immunoglobulin G (IgG) anti-HA antibody. Results. Serum anti-HA and anti-neuraminidase (NA) antibody titers to 2009(H1N1) pandemic influenza virus (pH1N1) correlated with titers in NS (including IgA and IgG antibody). Increasing anti-HA and anti-NA titers in serum and NS all correlated with reduced rates of infection and infection-associated illness. Multivariate analyses indicated serum HAI and NI each independently predicted immunity to infection and infection-associated illness. Only serum NI independently predicted reduced illness among infected subjects. Conclusions. Increasing anti-HA and NA antibody in serum and secretions correlated with reduced rates of pH1N1 influenza virus infection and illness in healthy young adults. Both anti-HA and anti-NA antibody are independent predictors of immunity to influenza; ensuring induction of both by vaccination is desirable.

Journal ArticleDOI
TL;DR: The declining antibody prevalence over time and the consistently high observed prevalence among participants aged 12-19 years support broad use of EBV vaccine before 12 years of age.
Abstract: Background Data on the age-specific prevalence of Epstein-Barr virus (EBV) infection are relevant for determining when to administer a prophylactic vaccine. Comparison of demographic groups could identify factors associated with its acquisition. Methods The National Health and Nutrition Examination Surveys (NHANES) examine a representative sample of the US population. Serum specimens from NHANES participants 6-19 years old were tested for EBV antibody by enzyme immunoassay (EIA). A random portion was also tested by indirect immunofluorescence (IFA). Prevalence estimates and risk-factor comparisons used demographic data and sampling weights in logistic regression models. Results Serum specimens collected between 2003 and 2010 from 9338 individuals participating in NHANES were tested. The concordance between EIA and IFA findings was 96.7%. The overall age-adjusted EBV antibody prevalence declined from 72% in 2003-2004 to 65% in 2009-2010 (P = .027). The prevalence in 2009-2010 by age group was as follows: 6-8 years, 50%; 9-11 years, 55%; 12-14 years, 59%; 15-17 years, 69%; and 18-19 years, 89%. Within each race/ethnicity group, younger age, health insurance coverage, higher household income, and education level were significantly associated with a lower prevalence of EBV antibody. Conclusions The EBV antibody prevalence declined in US individuals aged 6-19 years from 2003-2004 to 2009-2010, mainly because of the decrease among non-Hispanic white participants. The declining antibody prevalence over time and the consistently high observed prevalence among participants aged 12-19 years support broad use of EBV vaccine before 12 years of age.

Journal ArticleDOI
TL;DR: Young, asymptomatic, HIV-infected women demonstrate increased noncalcified coronary plaque and increased immune activation, particularly monocyte activation, which may contribute to cardiovascular disease in this population.
Abstract: (See the editorial commentary by Boccara and Cohen on pages 1729–31.) Cardiovascular disease (CVD) is increased approximately 2-fold in human immunodeficiency virus (HIV) infection [1] and may have a unique relationship to sex in this population. Two large registry studies have recently shown that the relative increases in myocardial infarction rates are higher in HIV-infected vs non–HIV-infected women than in HIV-infected vs non–HIV-infected men [2, 3]. Worldwide, women account for a growing percentage of HIV-infected patients and more than half of all HIV infections [4]. Few studies have assessed CVD exclusively among HIV-infected women [5–7], and, to our knowledge, none have explored sex differences with respect to coronary atherosclerotic plaque. In the context of HIV infection, CVD is probably related to the interplay between traditional CVD risk factors, effects of antiretroviral therapy (ART), and the proinflammatory and immune activation effects of HIV. Among HIV-infected men, Lo et al [8] previously demonstrated a higher prevalence of atherosclerotic plaque and in particular, noncalcified plaque. In more recent data from the same cohort, we also demonstrated that soluble CD163 (sCD163) levels are increased in HIV-infected men and are associated with noncalcified plaque [9]. CD163 is a monocyte-macrophage specific scavenger receptor cleaved from activated monocytes and macrophages during inflammation. Among non–HIV-infected patients, increased levels of sCD163 have been found to be associated with coronary artery disease [10]. These prior studies have not investigated indices of immune activation with respect to atherosclerotic plaque features among HIV-infected women. In the current study, we examined atherosclerotic plaque features and detailed indices of immune activation among HIV-infected women and investigated the relationships of age, sex, and HIV infection to these indices.

Journal ArticleDOI
TL;DR: Although this trial, the first of its kind in dengue, does not support balapiravir as a candidate drug, it does establish a framework for antiviral treatment trials in dengue and provides the field with a clinically evaluated benchmark molecule.
Abstract: Dengue is an acute illness caused by 1 of 4 single-stranded positive-sense RNA viruses and is the commonest arboviral infection of humans. In countries where dengue is endemic, the case burden strains already fragile healthcare systems and has an economic cost [1, 2]. There are currently no licensed vaccines for dengue (although late-stage trials are in progress), and mosquito vector control has been mostly unsuccessful or unsustainable. Clinically apparent dengue manifests with a spectrum of symptoms. High fever, erythema, headache, and myalgia are common symptoms, and laboratory findings of leukopenia and mild thrombocytopenia are typical. The critical phase occurs around the time of defervescence, typically on days 4–6 of illness, during which a transient capillary permeability syndrome manifests in some patients. In children particularly, capillary permeability can be significant enough to precipitate life-threatening circulatory shock, called dengue shock syndrome (DSS). Treatment is supportive, and the mortality rate for DSS in experienced hospital settings is <1% [2]. The magnitude of the early dengue virus (DENV) burden in patients with dengue has been associated with overall clinical outcome. For example, the early plasma viremia and/or NS1 antigenemia levels in pediatric dengue patients who develop clinically significant capillary permeability are higher than in patients without this complication [3–6]. The higher antigenic burden in these patients is believed to trigger a cascade of immunological events that promotes capillary permeability [7]. The association between high viral burdens in the first few days of illness and more severe outcomes has encouraged antiviral discovery efforts for dengue [8, 9], with the rationale that a reduction of the viral burden should result in a reduced incidence of severe complications and a lessening of symptoms and illness duration. Balapiravir is a prodrug of a nucleoside analogue (4′-azidocytidine) called R1479 and was developed for the treatment of chronic hepatitis C virus (HCV) infection by Hoffmann-La Roche [10–12]. Monotherapy twice per day for 14 days reduced plasma HCV levels in a dose- and time-dependent manner and was well tolerated at doses up to 3000 mg in adult male patients [13]. However, the clinical development of balapiravir for HCV infection was stopped when clinical safety signals were detected in patients receiving extended courses (2–3 months) of balapiravir therapy in conjunction with pegylated interferon and ribavirin. Because HCV and DENV possess RNA-dependent RNA polymerases that share a similar overall architecture [14], we explored a new indication for balapiravir by testing the in vitro activity of R1479 against DENV. Subsequently, the safety, tolerability, and antiviral efficacy of balapiravir in adult dengue patients were investigated in a clinical trial.

Journal ArticleDOI
TL;DR: Because antiinflammatory agents are already on the market, further clinical trials should be done to evaluate this effect in humans as soon as possible, to determine their suitability as coadjuvant tuberculosis treatment.
Abstract: C3HeB/FeJ mice infected with Mycobacterium tuberculosis were used in an experimental animal model mimicking active tuberculosis in humans to evaluate the effect of antiinflammatory agents. No other treatment but ibuprofen was given, and it was administered when the animals' health started to deteriorate. Animals treated with ibuprofen had statistically significant decreases in the size and number of lung lesions, decreases in the bacillary load, and improvements in survival, compared with findings for untreated animals. Because antiinflammatory agents are already on the market, further clinical trials should be done to evaluate this effect in humans as soon as possible, to determine their suitability as coadjuvant tuberculosis treatment.

Journal ArticleDOI
TL;DR: No causal role for parechovirus and bocavirus was found in children <5 years of age who had medical visits for AGE and tested negative for rotavirus and norovirus.
Abstract: Background. Although rotavirus and norovirus cause nearly 40% of severe endemic acute gastroenteritis (AGE) in children <5 years of age in the United States, there are limited data on the etiologic role of other enteric viruses in this age group. Methods. We conducted active population-based surveillance in children presenting with AGE to hospitals, emergency departments, and primary care clinics in 3 US counties. Stool specimens from these children and from age-matched healthy controls collected between October 2008 and September 2009 were tested for enteric adenovirus, astrovirus, sapovirus, parechovirus, bocavirus, and aichivirus. Typing was performed by sequencing and phylogenetic analysis. Results. Adenovirus, astrovirus, sapovirus, parechovirus, bocavirus, and aichivirus were detected in the stool specimens of 11.8%, 4.9%, 5.4%, 4.8%, 1.4%, and 0.2% of patients with AGE and 1.8%, 3.0%, 4.2%, 4.4%, 2.4%, and 0% of healthy controls, respectively. Adenovirus (type 41), astrovirus (types 1, 2, 3, 4, and 8), sapovirus (genogroups I and II), parechovirus (types 1, 3, 4, and 5), and bocavirus (types 1, 2, and 3) were found cocirculating. Conclusions. Adenovirus, astrovirus, and sapovirus infections were detected in 22.1% of the specimens from children <5 years of age who had medical visits for AGE and tested negative for rotavirus and norovirus. No causal role for parechovirus and bocavirus was found.

Journal ArticleDOI
TL;DR: During the 2009–2010 influenza A(H1N1) pandemic, early initiation of NAI treatment reduced the likelihood of severe outcomes compared with late or no treatment.
Abstract: (See the editorial commentary by Aoki and Hayden, on pages 547–9.) The neuraminidase inhibitors (NAIs) oseltamivir and zanamivir are licensed for the treatment of influenza A and B. Before the 2009–2010 pandemic, evidence from randomized trials suggested modest reductions in time to alleviation of symptoms and symptom severity [1–4] and possibly a reduction in antibiotic use for secondary complications [5–7]. Further evidence from methodologically weaker observational studies, derived mainly from prepandemic data (seasonal influenza), suggests that oral oseltamivir reduces mortality by about 75%, hospitalization by 25%, and symptom duration, compared with no treatment, with broadly similar findings for zanamivir, based on fewer studies [8]. Despite limited usage since launch, except in Japan, both drugs, especially oseltamivir, were widely stockpiled for pandemic purposes and subsequently deployed during the influenza A(H1N1)pdm09 pandemic. A subsequent analysis of oseltamivir safety data published by F. Hoffman–La Roche estimated that 18.3 million individuals worldwide received the drug during the pandemic period between 1 May and 31 December 2009 [9], and data from the United States show that 97.5% of prescriptions for NAIs during the pandemic period were for oseltamivir [10]. Published studies from the recent pandemic period suggest that early (≤48 hours after symptom onset) versus “late” (delayed >48 hours after symptom onset) treatment of healthy and at-risk adults reduced the likelihood of hospitalization or requirement for critical care [11–15]. Similarly, a small number of studies suggest that increased in-hospital mortality might be related to the late initiation of NAI therapy [16–19]. However, many studies are too small to produce conclusive individual findings; some adjust for possible confounders, but most do not. Considerable uncertainty remains among public health policy-makers and governments regarding the public health benefits of NAI usage during the 2009–2010 pandemic. We therefore present a systematic review and meta-analysis of studies specifically from that period, assessing the impact of NAI treatment in hospitalized patients on mortality, requirement for critical care, and influenza-related pneumonia.

Journal ArticleDOI
TL;DR: This review will outline barriers to HCV care in HCV/HIV coinfection, with a particular emphasis on persons who inject drugs, proposing strategies to enhance HCV treatment uptake and outcomes.
Abstract: The majority of hepatitis C virus (HCV) and human immunodeficiency virus (HIV) coinfection occurs among persons who inject drugs. Rapid improvements in responses to HCV therapy have been observed, but liver-related morbidity rates remain high, given notoriously low uptake of HCV treatment. Advances in HCV therapy will have a limited impact on the burden of HCV-related disease at the population-level unless barriers to HCV education, screening, evaluation, and treatment are addressed and treatment uptake increases. This review will outline barriers to HCV care in HCV/HIV coinfection, with a particular emphasis on persons who inject drugs, proposing strategies to enhance HCV treatment uptake and outcomes.

Journal ArticleDOI
TL;DR: The results of several clinical trials support a predominantly antiviral activity of interferon alfa (approximately 0.5 log decrease in plasma viral load) when it is administered to HIV-1-infected persons in the absence of antiretroviral therapy (ART).
Abstract: (See the editorial commentary by McNamara and Collins, on pages 201–3.) The quest to effect long-term control of human immunodeficiency virus type 1 (HIV-1) in the absence of antiretroviral therapy (ART) has led to numerous therapeutic approaches aimed at increasing host-mediated control of HIV and/or clearance of latent virus reservoirs [1], while maintaining the beneficial effects of immune reconstitution. Pilot strategies currently under investigation include gene therapy [2], therapeutic vaccines [3], cytokines [4], and chemotherapy [5]. Despite intensive investigation, no strategy so far has resulted in sustained control of HIV in the absence of antiretroviral therapy. Interferon α belongs to a family of type 1 interferons produced by dendritic and other cells as part of the host's Toll-like receptor–mediated antiviral response [6]. While interferon-mediated gene expression is increased in advanced HIV disease [7], the role of this innate response (ie, viral control mechanism vs chronic activation mediator contributing to disease progression) remains under debate [8–10]. The results of several clinical trials support a predominantly antiviral activity (ie, approximately 0.5 log decrease in plasma viral load) when interferon alfa is administered to HIV-1–infected persons in the absence of ART [11–14]. However, in this setting interferon alfa is not completely suppressive, possibly because of the deterioration of immune system effectors due to the ongoing viral replication. The degree to which interferon alfa monotherapy may contribute to virus control (eg, suppression of residual replication) after ART-mediated immune reconstitution has not been tested. Given the growing interest in identifying novel strategies aimed at controlling HIV in the absence of ART, we sought to establish a proof of concept that interferon alfa can suppress HIV replication in subjects in whom the detrimental effects of uncontrolled HIV replication on immune function have been partially reversed by ART.

Journal ArticleDOI
TL;DR: This study illustrates the power of combined DNA approaches to generate strong immune responses in humans: electroporation after PV administration provided immunogenicity superior to that observed in the trial without electroporation, despite fewer vaccinations.
Abstract: DNA-based immunization offers several advantages [1]. DNA vaccines contain nonliving, nonreplicating, and nontransmissible material, providing an improved safety profile over live attenuated viral vectors. They do not elicit antivector immunity, retaining potency through multiple boost cycles. In theory, DNA vaccines are simple and relatively inexpensive to construct, readily produced in large quantities, easy to characterize, and stable and can be combined into complex formulations. Despite the early enthusiasm from results of studies in small animals, DNA vaccines have not generated robust immune responses in humans [2–6]. The amount of antigen produced by each transfected cell is low because of the low transcription rate of antigen sequences being driven off the cytomegalovirus promoter [7–9]. One approach to augment the immunogenicity of DNA is to combine the DNA vaccine with a plasmid cytokine adjuvant [10–13]. Interleukin 12 (IL-12) is a key cytokine for the induction of cellular immune responses [14, 15]. Interleukin 15 (IL-15) is a member of the common cytokine receptor γ-chain family [16–18] that fosters development of long-lived memory T-cell responses [19–21]. A newer strategy for increasing immune potency has been to deliver the plasmids with in vivo electroporation. Electroporation enhances uptake of DNA into cells by temporarily generating an electrical field that increases the permeability of cell membranes and moves the macromolecules through the briefly open membrane pores. Clinical applications of electroporation have been tested, especially in cancer treatment and gene therapy [22–24]. Electroporation has elicited HIV-specific cellular immune responses in mice [25] and simian immunodeficiency virus–specific immune responses in macaques [26]. In macaque studies, genetic optimization, electroporation, and IL-12 plasmid adjuvant have improved the immunogenicity of DNA vaccines in vivo [26]. More recently, Vasan et al reported on a trial that showed the potential to increase vaccine-induced cellular responses to a DNA vaccine relative to intramuscular injection alone [27]. However, electroporation remains investigational [28] and has not been licensed by the Food and Drug Administration for clinical use. This is the first report on the combination of these approaches in humans. Here, we summarize the results of 2 trials of an HIV plasmid DNA vaccine, PENNVAX®-B (PV), one investigating HIV consensus clade B Gag, Pol, and Env with IL-12 or IL-15 plasmid cytokine adjuvants delivered by intramuscular injection without electroporation (HIV Vaccine Trials Network [HVTN] study 070) and the other investigating the same vaccine with plasmid IL-12 delivered intramuscularly with electroporation (HVTN study 080). The results illustrate the power of these combined DNA approaches to generate impressive immune responses in humans.

Journal ArticleDOI
TL;DR: Three frequently used tests for HEV detection are compared: the MP Diagnostics HEV immunoglobulin G (IgG) enzyme-linked immunosorbent assay (ELISA), the Axiom Diagnostics HEV IgG enzyme immunoassay (EIA), and the Mikrogen recomLine HEV IgG assay.
Abstract: Hepatitis E virus (HEV) seroprevalences of 0.3%-53% were reported from industrialized countries. Because these estimates may be influenced by detection assays, this study compares 3 frequently used tests for HEV detection: the MP Diagnostics HEV immunoglobulin G (IgG) enzyme-linked immunosorbent assay (ELISA), the Axiom Diagnostics HEV IgG enzyme immunoassay (EIA), and the Mikrogen recomLine HEV IgG assay. Sera from 200 healthy healthcare workers and 30 individuals with acute HEV infection were analyzed. Among the healthy individuals, HEV IgG was found in 4.5% by the MP Diagnostics assay, in 29.5% by the Axiom Diagnostics assay, and in 18% by the Mikrogen assay. Among individuals with acute HEV infection, positive results were obtained for 83.3%, 100%, and 96.7%, respectively. Thus, the 3 assays show clear differences in diagnostic sensitivity.
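For context, diagnostic sensitivity here is simply the fraction of the 30 confirmed acute infections that each assay detected; the quoted percentages correspond to roughly 25, 30, and 29 positive results (an inference from the arithmetic, not stated explicitly in the abstract):

```latex
\text{sensitivity} = \frac{\text{true positives}}{\text{true positives} + \text{false negatives}},\qquad
\frac{25}{30} \approx 83.3\%,\quad \frac{30}{30} = 100\%,\quad \frac{29}{30} \approx 96.7\%
```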

Journal ArticleDOI
TL;DR: Capsular switching has been a regular occurrence among pneumococcal populations throughout the past 7 decades, and this type of recombination has likely been an intrinsic feature throughout the history of pneumococcal evolution.
Abstract: Background. Changes in serotype prevalence among pneumococcal populations result from both serotype replacement and serotype (capsular) switching. Temporal changes in serotype distributions are well documented, but the contribution of capsular switching to such changes is unknown. Furthermore, it is unclear to what extent vaccine-induced selective pressures drive capsular switching. Methods. Serotype and multilocus sequence typing data for 426 pneumococci dated from 1937 through 2007 were analyzed. Whole-genome sequence data for a subset of isolates were used to investigate capsular switching events. Results. We identified 36 independent capsular switch events, 18 of which were explored in detail with whole-genome sequence data. Recombination fragment lengths were estimated for 11 events and ranged from approximately 19.0 kb to ≥58.2 kb. Two events took place no later than 1960, and the imported DNA included the capsular locus and the nearby penicillin-binding protein genes pbp2x and pbp1a. Conclusions. Capsular switching has been a regular occurrence among pneumococcal populations throughout the past 7 decades. Recombination of large DNA fragments (>30 kb), sometimes including the capsular locus and penicillin-binding protein genes, predated both vaccine introduction and widespread antibiotic use. This type of recombination has likely been an intrinsic feature throughout the history of pneumococcal evolution.

Journal ArticleDOI
TL;DR: Diarrhea seemed to be a multipathogen event and a state of enteropathogen excess above a high carriage baseline in this community-based study in Dhaka, Bangladesh.
Abstract: (See the editorial commentary by Ryan on pages 1732–3.) Diarrhea accounts for 26.1% of childhood deaths in South Asia [1], with a peak incidence in the first year of life [2–4]. Beyond this immediate mortality burden, diarrheal episodes contribute to intestinal barrier dysfunction and malnutrition, which underlie additional mortality [5] and disability-adjusted life-years lost [6]. This large burden of disease continues despite improvements from measures such as oral rehydration solution, antibiotics, cleaner water, sanitation, breast-feeding, and rotavirus vaccination [7–10]. The etiology of diarrhea must be understood to accelerate additional preventive measures. Unfortunately, diarrhea is a nonspecific syndrome defined as ≥3 loose stools in a day and can be caused by a diversity of viruses, bacteria, protozoa, helminths, and fungi, as well as noninfectious triggers [11]. Rotavirus is widely accepted as the major diarrheal pathogen in the first year of life [12–14], but the relative importance of enteropathogens thereafter is less clear. Several methods are needed to detect these enteropathogens, including culture, immunoassay, microscopy, and polymerase chain reaction (PCR), yet these are generally applied selectively and vary in their sensitivity [9, 10, 15, 16]. Certain bacteria, such as Campylobacter and Shigella, are difficult to grow, particularly in the global context of widespread antibiotic use. Mixed infections are common but difficult to interpret. For these reasons, we developed a series of quantitative multiplex PCR assays for 32 of the main enteropathogen targets, encompassing the major viruses, bacteria, protozoa, helminths, and fungi [17–21]. In this work we applied these assays to infants in Dhaka, Bangladesh, starting from birth, testing all assays with both monthly surveillance and diarrheal specimens. This detection strategy and knowledge of pathogen history preceding diarrhea allowed for a temporal examination of the etiology of diarrhea not possible with most study designs.

Journal ArticleDOI
TL;DR: This Ad26 vectored vaccine was generally safe and immunogenic at all doses tested and is a promising new vaccine vector for HIV-1.
Abstract: Background. We report the first-in-human safety and immunogenicity assessment of a prototype Ad26 vector-based human immunodeficiency virus (HIV) vaccine in humans. Methods. Sixty Ad26-seronegative, healthy, HIV-uninfected subjects were enrolled in a randomized, double-blinded, placebo-controlled, dose-escalation phase 1 study. Five groups of 12 subjects received 10⁹–10¹¹ vp of the Ad26-EnvA vaccine (N = 10/group) or placebo (N = 2/group) at weeks 0 and 24 or weeks 0, 4, and 24. Safety and immunogenicity were assessed. Results. Self-limited reactogenicity was observed after the initial immunization at the highest (10¹¹ vp) dose. No product-related SAEs were observed. All subjects who received the Ad26-EnvA vaccine developed Ad26 NAb titers, EnvA-specific enzyme-linked immunosorbent assay (ELISA) titers, and EnvA-specific enzyme-linked immunospot assay (ELISPOT) responses. These responses persisted at week 52. At week 28 in the 10⁹, 10¹⁰, and 10¹¹ vp 3-dose groups and the 10¹⁰ and 5 × 10¹⁰ vp 2-dose groups, geometric mean EnvA ELISA titers were 6113, 12 470, 8545, 3470, and 9655 and mean EnvA ELISPOT responses were 397, 178, 736, 196, and 1311 SFC/10⁶ peripheral blood mononuclear cells, respectively. Conclusion. This Ad26 vectored vaccine was generally safe and immunogenic at all doses tested. Reactogenicity was minimal with doses of 5 × 10¹⁰ vp or less. Ad26 is a promising new vaccine vector for HIV-1. Clinical Trials Registration. NCT00618605.
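A brief aside on the summary statistic used above: antibody titers are conventionally summarized as geometric rather than arithmetic means because titers span orders of magnitude. The general definition (not a calculation specific to this trial) is:

```latex
\mathrm{GMT} = \left(\prod_{i=1}^{n} t_i\right)^{1/n} = \exp\!\left(\frac{1}{n}\sum_{i=1}^{n}\ln t_i\right)
```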

Journal ArticleDOI
TL;DR: HZ incidence in vaccinated children was 79% lower than in unvaccinated children, and among vaccinated children about half of HZ cases were due to wild-type VZV rather than the vaccine strain.
Abstract: Background. Vaccine-strain herpes zoster (HZ) can occur after varicella vaccination. This study determined the number and proportion of HZ cases caused by vaccine-strain varicella zoster virus (VZV), assessed the positive predictive value of provider diagnosis of HZ, and computed HZ incidence rates in vaccinated and unvaccinated children. Methods. We used electronic medical records to identify all office visits with an HZ diagnosis for children aged <18 years in a managed care plan. Providers collected skin specimens and completed a questionnaire. Specimens were tested by polymerase chain reaction to identify wild-type or vaccine-strain VZV. Results. From May 2005 to September 2009, we enrolled 322 subjects. VZV was detected in 82% of specimens (84% wild-type, 15% vaccine-strain, 1% possible vaccine–wild-type recombinant). Among the 118 vaccinated subjects, VZV was detected in 70% (52% wild-type). The positive predictive value for provider diagnosis of “definite HZ” was 93% for unvaccinated and 79% for vaccinated children. The incidence of laboratory-confirmed HZ was 48 per 100 000 person-years in vaccinated children (both wild-type and vaccine-strain) and 230 per 100 000 person-years in unvaccinated children (wild-type only). Conclusions. HZ incidence in vaccinated children was 79% lower than in unvaccinated children. Among vaccinated children, half of HZ cases were due to wild-type VZV.
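The 79% figure in the conclusions follows directly from the two laboratory-confirmed incidence rates reported above:

```latex
1 - \frac{48/100\,000\ \text{person-years}}{230/100\,000\ \text{person-years}} = 1 - 0.209 \approx 0.79
```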

Journal ArticleDOI
TL;DR: HIV DNA and RNA were observed at low levels in non-CD4+ T leukocytes, particularly in gut tissues; future studies should determine whether different mechanisms allow HIV to persist in these distinct reservoirs and the degree to which different therapies can affect each reservoir.
Abstract: Even with optimal antiretroviral therapy, human immunodeficiency virus (HIV) persists in plasma, blood cells, and tissues. To develop new therapies, it is essential to know what cell types harbor residual HIV. We measured levels of HIV DNA, RNA, and RNA/DNA ratios in sorted subsets of CD4+ T cells (CCR7+, transitional memory, and effector memory) and non-CD4+ T leukocytes from blood, ileum, and rectum of 8 ART-suppressed HIV-positive subjects. Levels of HIV DNA/million cells in CCR7+ and effector memory cells were higher in the ileum than blood. When normalized by cell frequencies, most HIV DNA and RNA in the blood were found in CCR7+ cells, whereas in both gut sites, most HIV DNA and RNA were found in effector memory cells. HIV DNA and RNA were observed in non-CD4+ T leukocytes at low levels, particularly in gut tissues. Compared to the blood, the ileum had higher levels of HIV DNA and RNA in both CD4+ T cells and non-CD4+ T leukocytes, whereas the rectum had higher HIV DNA levels in both cell types but lower RNA levels in CD4+ T cells. Future studies should determine whether different mechanisms allow HIV to persist in these distinct reservoirs, and the degree to which different therapies can affect each reservoir.

Journal ArticleDOI
TL;DR: Viral evolution in key potential neutralization epitopes likely allowed GII.4-2012 to escape from human herd immunity and emerge as the new predominant strain.
Abstract: Background. GII.4 noroviruses are a significant source of acute gastroenteritis worldwide, causing the majority of human norovirus outbreaks. Evolution of the GII.4 major capsid protein occurs rapidly, resulting in the emergence of new strains that produce successive waves of pandemic disease. A new pandemic isolate, GII.4-2012 Sydney, largely replaced previously circulating strains in late 2012. We compare the antigenic properties of GII.4-2012 Sydney with previously circulating strains. Methods. To determine whether GII.4-2012 Sydney is antigenically different from recently circulating strains GII.4-2006 Minerva and GII.4-2009 New Orleans in previously identified blockade epitopes, we compared reactivity and blockade profiles of GII.4-2006, GII.4-2009, and GII.4-2012 virus-like particles in surrogate neutralization/blockade assays using monoclonal antibodies and human polyclonal sera. Results. Using monoclonal antibodies that map to known blockade epitopes in GII.4-2006 and GII.4-2009 and human outbreak polyclonal sera, we demonstrate either complete loss or significantly reduced reactivity and blockade of GII.4-2012 compared to GII.4-2006 and GII.4-2009. Conclusions. GII.4-2012 Sydney is antigenically different from GII.4-2006 Minerva and GII.4-2009 New Orleans in at least 2 key blockade epitopes. Viral evolution in key potential neutralization epitopes likely allowed GII.4-2012 to escape from human herd immunity and emerge as the new predominant strain.