
Showing papers by "Csaba P. Kovesdy published in 2020"


Journal ArticleDOI
TL;DR: In some regions, the burden of CKD was much higher than expected for the level of development, whereas the disease burden in western, eastern, and central sub-Saharan Africa, east Asia, south Asia, central and eastern Europe, Australasia, and western Europe was lower than expected.

2,370 citations


Journal ArticleDOI
TL;DR: Conference deliberations on potassium homeostasis in health and disease, guidance for evaluation and management of dyskalemias in the context of kidney diseases, and research priorities in each of the above areas are provided.

225 citations


Journal ArticleDOI
TL;DR: The pathophysiology and available diagnostic tests for iron-deficiency anemia in CKD are reviewed, the literature that has informed the current practice guidelines is discussed, and the available oral and intravenous iron formulations for the treatment of IDA in CKD are summarized.
Abstract: Anemia is a complication that affects a majority of individuals with advanced CKD. Although relative deficiency of erythropoietin production is the major driver of anemia in CKD, iron deficiency stands out among the mechanisms contributing to the impaired erythropoiesis in the setting of reduced kidney function. Iron deficiency plays a significant role in anemia in CKD. This may be due to a true paucity of iron stores (absolute iron deficiency) or a relative (functional) deficiency which prevents the use of available iron stores. Several risk factors contribute to absolute and functional iron deficiency in CKD, including blood losses, impaired iron absorption, and chronic inflammation. The traditional biomarkers used for the diagnosis of iron-deficiency anemia (IDA) in patients with CKD have limitations, leading to persistent challenges in the detection and monitoring of IDA in these patients. Here, we review the pathophysiology and available diagnostic tests for IDA in CKD, discuss the literature that has informed the current practice guidelines, and summarize the available oral and intravenous (IV) iron formulations for the treatment of IDA in CKD. Two important issues are addressed: the potential risks of a more liberal approach to iron supplementation, and the potential risks and benefits of IV versus oral iron supplementation in patients with CKD.

112 citations


Journal ArticleDOI
TL;DR: Evidence suggests that a patient-centered plant-dominant low-protein diet (PLADO) of 0.6–0.8 g/kg/day composed of >50% plant-based sources, administered by dietitians trained in non-dialysis CKD care, is promising and consistent with precision nutrition.
Abstract: Chronic kidney disease (CKD) affects >10% of the adult population. Each year, approximately 120,000 Americans develop end-stage kidney disease and initiate dialysis, which is costly and associated with functional impairments, worse health-related quality of life, and high early-mortality rates, exceeding 20% in the first year. Recent declarations by World Kidney Day and the U.S. Government Executive Order seek to implement strategies that reduce the burden of kidney failure by slowing CKD progression and controlling uremia without dialysis. Pragmatic dietary interventions may have a role in improving CKD outcomes and preventing or delaying dialysis initiation. Evidence suggests that a patient-centered plant-dominant low-protein diet (PLADO) of 0.6–0.8 g/kg/day composed of >50% plant-based sources, administered by dietitians trained in non-dialysis CKD care, is promising and consistent with precision nutrition. The scientific premise of PLADO stems from observations that high-protein diets with high meat intake not only result in higher cardiovascular disease risk but also in higher CKD incidence and faster CKD progression, owing to increased intraglomerular pressure and glomerular hyperfiltration. Meat intake increases production of nitrogenous end-products, worsens uremia, and may increase the risk of constipation with resulting hyperkalemia from the typical low fiber intake. A plant-dominant, fiber-rich, low-protein diet may lead to favorable alterations in the gut microbiome, which can modulate uremic toxin generation and slow CKD progression, along with reducing cardiovascular risk. PLADO is a heart-healthy, safe, flexible, and feasible diet that could be the centerpiece of a conservative and preservative CKD-management strategy that challenges the prevailing dialysis-centered paradigm.
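As a back-of-the-envelope illustration of the PLADO targets described above (0.6–0.8 g/kg/day with >50% of protein from plant sources), the short Python sketch below works out the daily protein range and the minimum plant-based share for a given body weight; the function name and the 70 kg example are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the PLADO protein targets described above: 0.6-0.8 g/kg/day
# with >50% of protein from plant-based sources. The function name and the
# example body weight are illustrative, not taken from the paper.

def plado_protein_targets(weight_kg: float) -> dict:
    """Return the daily protein range (g/day) and the minimum plant-based share."""
    low, high = 0.6 * weight_kg, 0.8 * weight_kg
    return {
        "protein_g_per_day": (round(low, 1), round(high, 1)),
        "min_plant_protein_g_per_day": round(0.5 * low, 1),  # >50% of the lower bound
    }

if __name__ == "__main__":
    # Example: a 70 kg patient -> 42-56 g protein/day, of which >21 g from plants
    print(plado_protein_targets(70))
```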

99 citations


Journal ArticleDOI
TL;DR: Urine ACR is the preferred measure of albuminuria; however, if ACR is not available, predicted ACR from PCR or urine dipstick protein may help in CKD screening, staging, and prognosis.
Abstract: Background: Although measuring albuminuria is the preferred method for defining and staging chronic kidney disease (CKD), total urine protein or dipstick protein is often measured instead. Objective: To develop equations for converting urine protein-creatinine ratio (PCR) and dipstick protein to urine albumin-creatinine ratio (ACR) and to test their diagnostic accuracy in CKD screening and staging. Design: Individual participant-based meta-analysis. Setting: 12 research and 21 clinical cohorts. Participants: 919 383 adults with same-day measures of ACR and PCR or dipstick protein. Measurements: Equations to convert urine PCR and dipstick protein to ACR were developed and tested for purposes of CKD screening (ACR ≥30 mg/g) and staging (stage A2: ACR of 30 to 299 mg/g; stage A3: ACR ≥300 mg/g). Results: Median ACR was 14 mg/g (25th to 75th percentile of cohorts, 5 to 25 mg/g). The association between PCR and ACR was inconsistent for PCR values less than 50 mg/g. For higher PCR values, the PCR conversion equations demonstrated moderate sensitivity (91%, 75%, and 87%) and specificity (87%, 89%, and 98%) for screening (ACR ≥30 mg/g) and classification into stages A2 and A3, respectively. Urine dipstick categories of trace or greater, trace to +, and ++ for screening for ACR values of 30 mg/g or greater and classification into stages A2 and A3, respectively, had moderate sensitivity (62%, 36%, and 78%) and high specificity (88%, 88%, and 98%). For individual risk prediction, the estimated 2-year 4-variable kidney failure risk equation using predicted ACR from PCR had discrimination similar to that of using observed ACR. Limitation: Diverse methods of ACR and PCR quantification were used; measurements were not always performed in the same urine sample. Conclusion: Urine ACR is the preferred measure of albuminuria; however, if ACR is not available, predicted ACR from PCR or urine dipstick protein may help in CKD screening, staging, and prognosis. Primary funding source: National Institute of Diabetes and Digestive and Kidney Diseases and National Kidney Foundation.
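To make the screening metrics above concrete, the hedged Python sketch below shows how sensitivity and specificity at the ACR ≥30 mg/g screening threshold could be computed for predicted versus observed ACR; the conversion function and the simulated paired measurements are placeholders, since the published PCR-to-ACR equations are not reproduced in this abstract.

```python
# Illustrative sketch of computing screening sensitivity/specificity for
# predicted ACR against observed ACR >= 30 mg/g. `predict_acr_from_pcr` is a
# stand-in placeholder, not the published conversion equation.
import numpy as np

def predict_acr_from_pcr(pcr_mg_g: np.ndarray) -> np.ndarray:
    # Placeholder monotone mapping; substitute the published equation here.
    return 0.7 * pcr_mg_g

def screening_performance(observed_acr, predicted_acr, threshold=30.0):
    observed_pos = observed_acr >= threshold
    predicted_pos = predicted_acr >= threshold
    tp = np.sum(observed_pos & predicted_pos)
    fn = np.sum(observed_pos & ~predicted_pos)
    tn = np.sum(~observed_pos & ~predicted_pos)
    fp = np.sum(~observed_pos & predicted_pos)
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

# Example with simulated same-day paired measurements (mg/g)
rng = np.random.default_rng(0)
pcr = rng.lognormal(mean=4.0, sigma=1.0, size=1000)
acr = 0.65 * pcr * rng.lognormal(0, 0.3, size=1000)
print(screening_performance(acr, predict_acr_from_pcr(pcr)))
```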

96 citations



Journal Article
TL;DR: In this paper, the authors present an analysis of the relationship between albuminuria and total urine protein or dipstick protein for chronic kidney disease (CKD) diagnosis and staging.
Abstract: Measuring albuminuria is the preferred method for defining and staging chronic kidney disease (CKD), but total urine protein or dipstick protein is often measured instead. This analysis presents equations for converting urine protein-creatinine ratio (PCR) and dipstick protein to urine albumin-creatinine ratio (ACR) and tests their diagnostic accuracy in CKD screening and staging.

60 citations


Journal ArticleDOI
TL;DR: This review outlines the current understanding of the diagnosis, prevalence, etiology, outcome, and treatment of constipation in CKD, and aims to discuss its novel clinical and therapeutic implications.

53 citations


Journal ArticleDOI
TL;DR: The “CKD Patch” can be used to quantitatively enhance ASCVD and CVD mortality risk prediction equations recommended in major US and European guidelines according to CKD measures, when available.

47 citations



Journal ArticleDOI
TL;DR: The real-world utilization results suggest that patiromer is used for the chronic management of hyperkalemia among US Veterans with HK, and that successful management of HK may have contributed to the observed high rate of RAASi therapy continuation.
Abstract: Objective: Patiromer is a sodium-free, non-absorbed, potassium (K+) binder approved for the treatment of hyperkalemia (HK). Among US Veterans with HK, this retrospective, observational cohort study...

Journal ArticleDOI
TL;DR: The data suggest that initial CVC use with later placement of an AVF may be an acceptable option among elderly hemodialysis patients.
Abstract: Author(s): Ko, Gang Jee; Rhee, Connie M; Obi, Yoshitsugu; Chang, Tae Ik; Soohoo, Melissa; Kim, Tae Woo; Kovesdy, Csaba P; Streja, Elani; Kalantar-Zadeh, Kamyar. Background: Arteriovenous fistulas (AVFs) are the preferred vascular access type in most hemodialysis patients. However, the optimal vascular access type in octogenarians and older (≥80 years) hemodialysis patients remains widely debated given their limited life expectancy and lower AVF maturation rates. Methods: Among incident hemodialysis patients receiving care in a large national dialysis organization during 2007-2011, we examined patterns of vascular access type conversion in 1 year following dialysis initiation in patients <80 versus ≥80 years of age. Among a subcohort of patients ≥80 years of age, we examined the association between vascular access type conversion and mortality using multivariable survival models. Results: In the overall cohort of 100 804 patients, the prevalence of AVF/arteriovenous graft (AVG) as the primary vascular access type increased during the first year of hemodialysis, but plateaued thereafter. Among 8356 patients ≥80 years of age and treated for ≥1 year, those with initial AVF/AVG use and placement of AVF from a central venous catheter (CVC) had lower mortality compared with patients with persistent CVC use. When the reference group was changed to patients who had AVF placement from a CVC in the first year of dialysis, those with initial AVF use had similar mortality. A longer duration of CVC use was associated with incrementally worse survival. Conclusions: Among incident hemodialysis patients ≥80 years of age, placement of an AVF from a CVC within the first year of dialysis had similar mortality compared with initial AVF use. Our data suggest that initial CVC use with later placement of an AVF may be an acceptable option among elderly hemodialysis patients.

Journal ArticleDOI
TL;DR: This study describes, for the first time in Kazakhstan, an increase in the prevalence and incidence of ESRD on dialysis during 2014–2018, while the mortality rate decreased over time.
Abstract: The epidemiology of dialysis patients has been little studied in developing countries and economies in transition. We examined the prevalence, incidence and mortality rate of dialysis patients in Kazakhstan, via aggregation and utilization of large-scale administrative healthcare data. The registry data of 8898 patients receiving dialysis therapy between 2014 and 2018 were extracted from the Unified National Electronic Health System (UNEHS) and linked with the national population registry of Kazakhstan. We provide descriptive statistics of demographic, comorbidity and dialysis-related characteristics. Among all patients undergoing maintenance dialysis for end-stage renal disease (ESRD), there were 3941 (44%) females and 4957 (56%) males. 98.7% of patients received hemodialysis and 1.3% peritoneal dialysis. The majority of the patients (63%) were ethnic Kazakhs, 18% were Russians and 19% were of other ethnicities. The prevalence and incidence rate in 2014 were 135.2 and 68.9 per million population (PMP), respectively, and increased to 350.2 and 94.9 PMP, respectively, by 2018. The overall mortality rate among dialysis patients decreased from 1667 per 1000 patient-years (PY) [95% Confidence Interval (CI): 1473–1886] in 2014 to 710/1000 PY [95% CI: 658–767] in 2018. We observed a 13% lower crude survival probability in females compared to males, and lower survival in older patients compared to younger ones. Patients of Russian ethnicity had a 58% higher risk of death, and those of other ethnicities a 34% higher risk, compared to those of Kazakh ethnicity. We describe for the first time in Kazakhstan an increase in the prevalence and incidence of ESRD on dialysis, with a decrease in the mortality rate, during 2014–2018. We observed statistically significantly lower survival probability in female dialysis patients compared to males, in older patients compared to younger ones, and in patients of Russian ethnicity compared to Kazakh.
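For readers unfamiliar with the units above, the short sketch below shows the arithmetic behind rates expressed per million population (PMP) and per 1,000 patient-years; the example counts are illustrative, not the registry's figures.

```python
# Sketch of the rate arithmetic used above: prevalence/incidence per million
# population (PMP) and mortality per 1,000 patient-years. The example counts
# are illustrative, not taken from the Kazakhstan registry.

def per_million_population(cases: int, population: int) -> float:
    return cases / population * 1_000_000

def per_1000_patient_years(events: int, patient_years: float) -> float:
    return events / patient_years * 1_000

# e.g., 1,200 prevalent dialysis patients in a population of 17.8 million
print(round(per_million_population(1200, 17_800_000), 1))   # ~67.4 PMP
# e.g., 250 deaths over 900 patient-years of follow-up
print(round(per_1000_patient_years(250, 900.0), 1))          # ~277.8 per 1,000 PY
```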


Journal ArticleDOI
TL;DR: In this article, the authors evaluated whether β-blocker use and their dialyzability characteristics were associated with early mortality among patients with chronic kidney disease with heart failure who transitioned to dialysis.

Journal ArticleDOI
TL;DR: Post-transplant DSA is significantly associated with worse patient and kidney allograft outcomes in SLKT; further prospective, large cohort studies are warranted to better assess these associations.
Abstract: There is a dearth of published data regarding the presence of post-transplant donor-specific antibodies (DSA), especially C1q-binding DSA (C1q+DSA), and patient and kidney allograft outcomes in simultaneous liver-kidney transplant (SLKT) recipients. We conducted a retrospective cohort study of 85 consecutive SLKT patients transplanted between 2009 and 2018 in our center. Associations between the presence of post-transplant DSA, including persistent and/or newly developed DSA and C1q+DSA, and all-cause mortality and the composite outcome of mortality, allograft kidney loss, and antibody-mediated rejection were examined using unadjusted and age- and sex-adjusted Cox proportional hazards and time-dependent regression models. The mean age at SLKT was 56 years and 60% of the patients were male. Twelve patients (14%) had post-transplant DSA and seven patients (8%) had C1q+DSA. The presence of post-transplant DSA was significantly associated with increased risk of mortality (unadjusted model: Hazard Ratio (HR) = 2.72, 95% confidence interval (CI): 1.06-6.98 and adjusted model: HR = 3.20, 95% CI: 1.11-9.22) and the composite outcome (unadjusted model: HR = 3.18, 95% CI: 1.31-7.68 and adjusted model: HR = 3.93, 95% CI: 1.39-11.10). The risk of these outcomes was also higher in recipients with C1q+DSA compared with those without C1q+DSA. Post-transplant DSA is significantly associated with worse patient and kidney allograft outcomes in SLKT. Further prospective, large cohort studies are warranted to better assess these associations.
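The survival analysis described above uses Cox proportional hazards and time-dependent regression. As a hedged, generic sketch of a time-dependent Cox model (here with the lifelines library, simulated long-format data, and assumed column names rather than the authors' dataset), one could fit post-transplant DSA as a time-varying covariate adjusted for age and sex:

```python
# Hedged sketch of an age- and sex-adjusted time-dependent Cox model with
# post-transplant DSA as a time-varying covariate, using lifelines'
# CoxTimeVaryingFitter. The long-format layout, column names, and simulated
# data are assumptions for illustration only.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(42)
rows = []
for i in range(200):
    age = rng.integers(35, 75)
    male = rng.integers(0, 2)
    end = rng.uniform(200, 1000)            # follow-up end (days)
    event = int(rng.random() < 0.25)        # death or composite outcome
    dsa_time = rng.uniform(100, 600)        # when DSA appears, if it does
    if rng.random() < 0.3 and dsa_time < end:
        rows.append((i, 0.0, dsa_time, 0, age, male, 0))       # DSA-free interval
        rows.append((i, dsa_time, end, 1, age, male, event))   # DSA-positive interval
    else:
        rows.append((i, 0.0, end, 0, age, male, event))

df = pd.DataFrame(rows, columns=["id", "start", "stop", "dsa", "age", "male", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios are exp(coef)
```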

Posted ContentDOI
31 May 2020
TL;DR: Evidence suggests that a patient-centered plant-dominant low-protein diet (PLADO) of 0.6-0.8 g/kg/day composed of >50% plant-based sources, administered by dietitians trained in non-dialysis CKD care, can be promising.
Abstract: Chronic kidney disease (CKD) affects >10% of the adult population. Each year, approximately 120,000 Americans develop end-stage kidney disease and initiate dialysis, which is costly and associated with functional impairments, worse health-related quality of life, and high early-mortality rates, exceeding 20% in the first year. Recent declarations by World Kidney Day and the U.S. Government Executive Order seek to implement strategies that reduce the burden of kidney failure by slowing CKD progression and controlling uremia without dialysis. Pragmatic dietary interventions may have a role in improving CKD outcomes and preventing or delaying dialysis initiation. Evidence suggests that a patient-centered plant-dominant low-protein diet (PLADO) of 0.6–0.8 g/kg/day composed of >50% plant-based sources, administered by dietitians trained in non-dialysis CKD care, is promising and consistent with precision nutrition. The scientific premise of PLADO stems from observations that high-protein diets with high meat intake not only result in higher cardiovascular disease risk but also in higher CKD incidence and faster CKD progression, owing to increased intraglomerular pressure and glomerular hyperfiltration. Meat intake increases production of nitrogenous end-products, worsens uremia, and may increase the risk of constipation with resulting hyperkalemia from the typical low fiber intake. A plant-dominant, fiber-rich, low-protein diet may lead to favorable alterations in the gut microbiome, which can modulate uremic toxin generation and slow CKD progression, along with reducing cardiovascular risk. PLADO is a heart-healthy, safe, flexible, and feasible diet that could be the centerpiece of a conservative and preservative CKD-management strategy that challenges the prevailing dialysis-centered paradigm.

Journal ArticleDOI
TL;DR: Longer predialysis ACEi/ARB exposure was associated with lower postdialysis mortality; prospective studies are needed to evaluate the benefits of strategies enabling uninterrupted predialysis ACEi/ARB use in the setting of acute kidney injury and hyperkalemia.

Journal ArticleDOI
TL;DR: In patients with late-stage CKD who transitioned to dialysis, warfarin use was associated with higher risk of ischemic and bleeding events but a lower risk of mortality.

Journal ArticleDOI
TL;DR: Abnormalities of MBD parameters, including higher phosphorus, intact PTH, and ALP and lower calcium levels, were independently associated with decline in RKF in incident hemodialysis patients.
Abstract: Abnormalities of mineral bone disorder (MBD) parameters have been suggested to be associated with poor renal outcome in predialysis patients. However, the impact of those parameters on decline in residual kidney function (RKF) is uncertain among incident hemodialysis (HD) patients. We performed a retrospective cohort study in 13,772 patients who initiated conventional HD during 2007 to 2011 and survived 6 months of dialysis. We examined the association of baseline serum phosphorus, calcium, intact parathyroid hormone (PTH), and alkaline phosphatase (ALP) with a decline in RKF. Decline in RKF was assessed by the estimated slope of renal urea clearance (KRU) over 6 months from HD initiation. Our cohort had a mean ± SD age of 62 ± 15 years; 64% were men, 57% were white, 65% had diabetes, and 51% had hypertension. The median (interquartile range [IQR]) baseline KRU level was 3.4 (2.0, 5.2) mL/min/1.73 m2. The median (IQR) estimated 6-month KRU slope was -1.47 (-2.24, -0.63) mL/min/1.73 m2 per 6 months. In linear regression models, higher phosphorus categories were associated with a steeper 6-month KRU slope compared with the reference category (phosphorus 4.0 to <4.5 mg/dL). Lower calcium and higher intact PTH and ALP categories were also associated with a steeper 6-month KRU slope compared with their respective reference groups (calcium 9.2 to <9.5 mg/dL; intact PTH 150 to <250 pg/mL; ALP <60 U/L). An increasing number of parameter abnormalities had an additive effect on decline in RKF. Abnormalities of MBD parameters, including higher phosphorus, intact PTH, and ALP and lower calcium levels, were independently associated with decline in RKF in incident HD patients. © 2019 American Society for Bone and Mineral Research.
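The decline in RKF above is summarized as a 6-month slope of KRU. A minimal sketch of how such a per-patient slope could be estimated by ordinary least squares is shown below; the measurement times and values are illustrative, not study data.

```python
# Sketch of estimating a per-patient 6-month slope of renal urea clearance
# (KRU) with ordinary least squares, as a stand-in for the slope estimation
# described above. The measurement times and KRU values are illustrative.
import numpy as np

def kru_slope_per_6_months(months: np.ndarray, kru: np.ndarray) -> float:
    """Least-squares slope of KRU (mL/min/1.73 m2), expressed per 6 months."""
    slope_per_month, _intercept = np.polyfit(months, kru, deg=1)
    return slope_per_month * 6.0

# Example: KRU measured at dialysis initiation and at months 2, 4, and 6
months = np.array([0.0, 2.0, 4.0, 6.0])
kru = np.array([3.4, 3.0, 2.6, 2.1])
print(round(kru_slope_per_6_months(months, kru), 2))   # about -1.29 per 6 months
```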

Journal ArticleDOI
TL;DR: In the second iteration of the Global Kidney Health Atlas, the International Society of Nephrology conducted a cross-sectional global survey between July and October 2018 to explore the coverage and scope of HIS for kidney disease, with a focus on kidney replacement therapy (KRT).
Abstract: BACKGROUND: Health information systems (HIS) are fundamental tools for the surveillance of health services, estimation of disease burden and prioritization of health resources. Several gaps in the availability of HIS for kidney disease were highlighted by the first iteration of the Global Kidney Health Atlas. METHODS: As part of its second iteration, the International Society of Nephrology conducted a cross-sectional global survey between July and October 2018 to explore the coverage and scope of HIS for kidney disease, with a focus on kidney replacement therapy (KRT). RESULTS: Out of a total of 182 invited countries, 154 countries responded to questions on HIS (85% response rate). KRT registries were available in almost all high-income countries, but few low-income countries, while registries for non-dialysis chronic kidney disease (CKD) or acute kidney injury (AKI) were rare. Registries in high-income countries tended to be national, in contrast to registries in low-income countries, which often operated at local or regional levels. Although cause of end-stage kidney disease, modality of KRT and source of kidney transplant donors were frequently reported, few countries collected data on patient-reported outcome measures and only half of low-income countries recorded process-based measures. Almost no countries had programs to detect AKI, and practices to identify CKD targeted individuals with diabetes, hypertension and cardiovascular disease rather than members of high-risk ethnic groups. CONCLUSIONS: These findings confirm significant heterogeneity in the global availability of HIS for kidney disease and highlight important gaps in their coverage and scope, especially in low-income countries and across the domains of AKI, non-dialysis CKD, patient-reported outcomes, process-based measures and quality indicators for KRT service delivery.

Journal ArticleDOI
TL;DR: Recipients who received HCV Ab-positive, but NAT-negative donor kidneys did not experience worse 6-month eGFR than correctly matched HCV Ab−/NAT− recipients, according to combined exact matching and propensity score matching.
Abstract: The kidney donor profile index (KDPI) defines an hepatitis C (HCV) positive donor based on HCV antibody (Ab) and/or nucleic acid amplification test (NAT) positivity, with donors who are not actively infected (Ab+/NAT-) also classified as HCV positive. From Scientific Registry of Transplant Recipients dataset, we identified HCV-negative recipients, who received a kidney transplant from HCV Ab+/NAT- (n = 116) and HCV Ab-/NAT- (n = 25 574) donor kidneys. We then compared recipients' estimated glomerular filtration rate (eGFR) at 6 months in matched cohorts, using combined exact matching (based on KDPI) and propensity score matching. We created two separate matched cohorts: for the first cohort, we used the allocation KDPI, while for the second cohort we used an optimal KDPI, where the HCV component of KDPI was considered negative in Ab+/NAT- patients. The mean ± SD age of the allocation KDPI-matched cohort at baseline was 59 ± 10 years, 69% were male, 61% were white. Recipients' eGFR at 6 months after transplantation was significantly higher in the HCV Ab+/NAT- group compared to the HCV Ab-/NAT- group (61.1 ± 17.9 vs. 55.6 ± 18.8 ml/min/1.73 m2 , P = 0.011) in the allocation KDPI-matched cohort, while it was similar (61.8 ± 19.5 vs. 62.1 ± 20.1 ml/min/1.73 m2 , P = 0.9) in the optimal KDPI-matched cohort. Recipients who received HCV Ab positive, but NAT-negative donor kidneys did not experience worse 6-month eGFR than correctly matched HCV Ab-/NAT- recipients.
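The matching approach above combines exact matching on KDPI with propensity score matching. The sketch below illustrates one generic way to do this (logistic-regression propensity scores and greedy 1:1 nearest-neighbor matching within exact strata) on simulated data; the variable names and covariates are assumptions, not the study's actual matching procedure.

```python
# Hedged sketch of combining exact matching (on a KDPI category) with 1:1
# nearest-neighbor propensity-score matching. Variable names and the
# simulated data are illustrative; this is not the study's matching code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "treated": rng.random(n) < 0.05,      # e.g., received an HCV Ab+/NAT- donor kidney
    "kdpi_cat": rng.integers(0, 10, n),   # exact-match stratum
    "age": rng.normal(59, 10, n),
    "male": rng.integers(0, 2, n),
})

# 1. Propensity score: modeled probability of treatment given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[["age", "male"]], df["treated"])
df["ps"] = ps_model.predict_proba(df[["age", "male"]])[:, 1]

# 2. Within each exact KDPI stratum, match each treated unit to the nearest
#    untreated unit on the propensity score (greedy, without replacement).
matched_pairs = []
for _, stratum in df.groupby("kdpi_cat"):
    treated = stratum[stratum["treated"]]
    controls = stratum[~stratum["treated"]].copy()
    for t_idx, t_row in treated.iterrows():
        if controls.empty:
            break
        c_idx = (controls["ps"] - t_row["ps"]).abs().idxmin()
        matched_pairs.append((t_idx, c_idx))
        controls = controls.drop(index=c_idx)

print(f"{len(matched_pairs)} matched pairs")
```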

Journal ArticleDOI
TL;DR: Analysis of the frequency and risk factors of lactic acid (LA) elevation in ambulatory elderly male US veterans with stable diabetic CKD3 treated with metformin found that metformin therapy is associated with an increased risk of hyperlactatemia in elderly men with diabetic CKD3.
Abstract: The FDA has recently endorsed metformin use in patients with T2D and stage 3 CKD (CKD3). However, metformin safety in elderly individuals is unknown. The aim of this study was to identify the frequency and risk factors of lactic acid (LA) elevation in ambulatory elderly male US veterans with stable diabetic CKD3 treated with metformin. We studied 92 patients with non-diabetic CKD3 (Group 1), diabetic CKD3 not on metformin (Group 2) and diabetic CKD3 on metformin (Group 3). Mean LA levels were similar at 1.3 ± 0.3 and 1.3 ± 0.4 mmol/L in Groups 1 and 2, respectively, while LA was significantly higher in Group 3 (2.1 ± 1.0 mmol/L, P < …). Hyperlactatemia (LA > 2.0 mmol/L) was less common in Groups 1 and 2, as compared with 17 (42.5%) patients in Group 3 (P < …).


Journal ArticleDOI
TL;DR: The findings suggest that the presence of certain comorbidities may be a contributing factor in the decision to prescribe spironolactone; high healthcare resource utilization and costs for patients at later stages of disease, irrespective of spironolactone use, highlight the need for new therapies for DKD.
Abstract: Limited evidence has indicated that addition of a steroidal mineralocorticoid receptor antagonist (MRA) to the standard of care reduces proteinuria in patients with diabetic kidney disease (DKD); however, there are limited data regarding real-world MRA use in these patients. This study aimed to describe the characteristics of spironolactone users and non-users with DKD, and to explore their clinical outcomes. This was a non-interventional, retrospective cohort study using demographic and clinical data from a US claims database (PharMetrics Plus) and the Experian consumer data asset during 2006–2015. Baseline characteristics (e.g. comorbidities) and post-inclusion clinical outcomes were described in matched cohorts of spironolactone users and non-users (n = 5465 per group). Although matching aligned key demographic and clinical characteristics of the cohorts, a significantly greater proportion of spironolactone users than non-users had oedema, proteinuria, and cardiovascular disease at baseline (P < 0.0001). During the post-inclusion period, disease progression and clinical events of interest such as acute kidney injury were more commonly observed in spironolactone users than non-users. Users also had higher healthcare resource utilization and costs than non-users; however, these differences diminished at later stages of disease. In this study, spironolactone users had a greater comorbidity burden at baseline than matched non-users, suggesting that the presence of certain comorbidities may be contributing factors in the decision to prescribe spironolactone. High healthcare resource utilization and costs for patients at later stages of disease, irrespective of spironolactone use, highlight the need for new therapies for DKD.

Journal ArticleDOI
TL;DR: In NDD-CKD patients transitioning to dialysis, pre-ESRD opiate and gabapentin/pregabalin use were associated with higher post-ESRD mortality, whereas non-opiate analgesic use was not associated with death.
Abstract: Background: Population-based studies show a high prevalence of chronic pain among patients with chronic kidney disease (CKD). While opiates are frequently prescribed in non-dialysis-dependent CKD (NDD-CKD) patients, there may be toxic accumulation of metabolites, particularly among those progressing to end-stage renal disease (ESRD). We examined the association of opiate versus other analgesic use during the pre-ESRD period with post-ESRD mortality among NDD-CKD patients transitioning to dialysis. Methods: We examined a national cohort of US Veterans with NDD-CKD who transitioned to dialysis over 2007-14. Among patients who received ≥1 prescription(s) in the Veterans Affairs (VA) Healthcare System within 1 year of transitioning to dialysis, we examined associations of pre-ESRD analgesic status, defined as opiate, gabapentin/pregabalin, other non-opiate analgesic, versus no analgesic use, with post-ESRD mortality using multivariable Cox models. Results: Among 57,764 patients who met eligibility criteria, pre-ESRD opiate and gabapentin/pregabalin use were each associated with higher post-ESRD mortality (reference: no analgesic use), whereas non-opiate analgesic use was not associated with higher mortality in expanded case-mix analyses: HRs (95% CIs) 1.07 (1.05-1.10), 1.07 (1.01-1.13), and 1.00 (0.94-1.06), respectively. In secondary analyses, an increasing number of opiate prescriptions beyond 1 in the 1-year pre-ESRD period was associated with incrementally higher post-ESRD mortality (reference: no analgesic use). Conclusions: In NDD-CKD patients transitioning to dialysis, pre-ESRD opiate and gabapentin/pregabalin use were associated with higher post-ESRD mortality, whereas non-opiate analgesic use was not associated with death. There was a graded association between increasing frequency of pre-ESRD opiate use and incrementally higher mortality.
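As a generic, hedged illustration of the multivariable Cox modeling described above, the sketch below fits post-ESRD mortality against dummy-coded pre-ESRD analgesic groups (reference: no analgesic use) on simulated data using the lifelines library; the covariates and data are illustrative, not the VA cohort.

```python
# Hedged sketch of a multivariable Cox model with pre-ESRD analgesic group
# (reference: no analgesic use) coded as dummy variables, using lifelines.
# The simulated data and covariates are illustrative only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "group": rng.choice(["none", "opiate", "gabapentin_pregabalin", "non_opiate"], size=n),
    "age": rng.normal(65, 10, n),
    "time_months": rng.exponential(36, n),   # post-ESRD follow-up
    "death": rng.integers(0, 2, n),
})

# Dummy-code the analgesic group with "none" as the reference category.
dummies = pd.get_dummies(df["group"], prefix="grp", dtype=float).drop(columns=["grp_none"])
model_df = pd.concat([df[["time_months", "death", "age"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time_months", event_col="death")
cph.print_summary()   # hazard ratios are exp(coef), relative to no analgesic use
```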

Journal ArticleDOI
TL;DR: The incidence of CMV infection was similar in recipients who received HCV Ab D+/R+ and HCV Ab D− KT, and further studies are needed to assess this association in KT from HCV nucleic acid-positive donors.
Abstract: Deceased-donor kidney transplantation (KT) from hepatitis C (HCV)-infected donors into HCV-uninfected recipients (HCV D+/R−) could become standard care in the near future. However, HCV viral replic...

Journal ArticleDOI
31 Jan 2020-Nephron
TL;DR: The presence of pretransplant DSAs was not associated with worse kidney allograft outcomes in the authors' single-center experience, and results were similar when comparing different DSA subclasses (class I and class II DSAs) with recipients without DSAs.
Abstract: Introduction and Objective: The impact of pretransplant donor-specific antibodies (DSAs), especially class II DSAs, on kidney allograft outcomes remains unclear in simultaneous liver-kidney transplantation (SLKT) recipients. Methods: We examined 85 recipients who consecutively underwent SLKT between 2009 and 2018 in our center. Associations between pretransplant DSA and worsening kidney function (WKF), kidney allograft loss, composite kidney outcome (WKF and/or antibody-mediated rejection and/or death-censored kidney allograft loss), death with functioning graft, and overall mortality were examined in survival analysis. WKF was defined as an eGFR decrease of 30% or greater from baseline, or 2 or more episodes of proteinuria, at least 90 days apart from each other. Results: The mean age at SLKT was 56 ± 10 years, and 62% of the recipients were male. More than one quarter (26%) of our recipients were African American. The 2 major causes of end-stage liver disease were hepatitis C (28%) and alcoholic hepatitis (26%). Nineteen recipients (22%) had pretransplant DSAs at the time of SLKT. The DSA(+) group and DSA(−) group had similar risk of WKF (unadjusted model: hazard ratio [HR] = 0.77, 95% confidence interval [CI]: 0.29–2.05 and adjusted model: HR = 0.36, 95% CI: 0.12–1.08); similar risk of composite kidney outcome (unadjusted model: HR = 1.04, 95% CI: 0.45–2.43 and adjusted model: HR = 0.53, 95% CI: 0.20–1.39); and similar risk of overall death (unadjusted model: HR = 1.23, 95% CI: 0.45–3.36 and adjusted model: HR = 1.28, 95% CI: 0.42–3.87). We found similar results when comparing different DSA subclasses (class I and II DSAs) with recipients without DSAs. Conclusions: The presence of pretransplant DSAs was not associated with worse kidney allograft outcomes from our single-center experience. Further prospective larger studies are strongly warranted.
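The WKF definition above (an eGFR decrease of 30% or more from baseline, or two or more proteinuria episodes at least 90 days apart) can be expressed as a simple rule; the sketch below is an illustrative implementation with assumed data structures, not the authors' code.

```python
# Sketch of the WKF definition given above: a >=30% eGFR decline from
# baseline, or two or more proteinuria episodes at least 90 days apart.
# Data structures and the example values are illustrative.
from datetime import date

def worsening_kidney_function(baseline_egfr: float,
                              follow_up_egfr: list[float],
                              proteinuria_dates: list[date]) -> bool:
    egfr_decline = any(e <= 0.7 * baseline_egfr for e in follow_up_egfr)
    dates = sorted(proteinuria_dates)
    proteinuria = len(dates) >= 2 and (dates[-1] - dates[0]).days >= 90
    return egfr_decline or proteinuria

# Example: only a 28% maximal eGFR decline, but two proteinuria episodes 120 days apart
print(worsening_kidney_function(
    baseline_egfr=60.0,
    follow_up_egfr=[55.0, 47.0, 43.2],
    proteinuria_dates=[date(2015, 1, 10), date(2015, 5, 10)],
))  # True
```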

Journal ArticleDOI
20 May 2020-Nephron
TL;DR: Findings indicate that RTRs who use prescription opioids during the first year posttransplant, regardless of the dosage/amount, are less likely to be adherent to tacrolimus.
Abstract: Introduction: Little is known about the effect of posttransplant opioid use on adherence to immunosuppressant therapy (IST) among adult renal transplant recipients (RTRs). Objective: The aim of this study was to examine the relationship between opioid use and IST adherence among adult RTRs during the first year posttransplant. Methods: Longitudinal data were analyzed from a retrospective cohort study examining US veterans undergoing renal transplant from October 1, 2007, through March 31, 2015. Data were collected from the US Renal Data System, Centers for Medicare and Medicaid Services Data (Medicare Part D), and Veterans Affairs pharmacy records. Dose of opioid prescriptions was collected and divided based on annual morphine milligram equivalent within a year of transplant. Proportion of days covered of greater than or equal to 80% indicated adherence to tacrolimus. Unadjusted and multivariable-adjusted logistic regression analyses were performed. Results: A study population of 1,229 RTRs included 258 with no opioid use, while 971 opioid users were identified within the first year after transplantation. Compared to RTRs without opioid usage, RTRs with opioid usage had a lower probability of being adherent to tacrolimus in unadjusted logistic regression (odds ratio [OR] (95% confidence interval [CI]): 0.22 [0.07–0.72]) and adjusted logistic regression (OR [95% CI]: 0.11 [0.03–0.44]). These patterns generally remained consistent in unadjusted and adjusted main and sensitivity analyses. Conclusions: Findings indicate RTRs who use prescription opioids during the first year posttransplant, regardless of the dosage/amount, are less likely to be adherent to tacrolimus. Future studies are needed to better understand underlying causes of the association between opioid use and tacrolimus nonadherence.
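Adherence above is defined as a proportion of days covered (PDC) of at least 80% for tacrolimus. The sketch below shows one simplified way to compute PDC from fill dates and days supplied over a 365-day window; the fill data and the handling of overlapping fills are illustrative assumptions.

```python
# Sketch of a proportion-of-days-covered (PDC) calculation for tacrolimus
# over a 365-day observation window, with the >=80% adherence cutoff used
# above. Fill data are illustrative; overlapping fill days are counted once.
from datetime import date, timedelta

def pdc(fills: list[tuple[date, int]], start: date, window_days: int = 365) -> float:
    """fills: (fill_date, days_supply). Returns covered days / window days."""
    covered: set[date] = set()
    end = start + timedelta(days=window_days)
    for fill_date, days_supply in fills:
        for d in range(days_supply):
            day = fill_date + timedelta(days=d)
            if start <= day < end:
                covered.add(day)
    return len(covered) / window_days

fills = [(date(2014, 1, 1), 90), (date(2014, 4, 5), 90),
         (date(2014, 7, 20), 90), (date(2014, 11, 1), 60)]
value = pdc(fills, start=date(2014, 1, 1))
print(round(value, 2), "adherent" if value >= 0.80 else "non-adherent")
```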

Journal ArticleDOI
13 Jul 2020
TL;DR: Findings suggest a potential contribution of Nrf2 dysfunction to the development of premature biological aging and its related morbidities in ESRD patients.
Abstract: Patients with end-stage renal disease (ESRD) display phenotypic features of premature biological aging, characterized by disproportionately high morbidity and mortality at a younger age. Nuclear factor erythroid 2-related factor 2 (Nrf2) activity, a master regulator of antioxidative responses, declines with age and is implicated in the pathogenesis of age-related disorders; however, little is known about the association between Nrf2 and premature biological aging in ESRD patients. In a cross-sectional pilot cohort of 34 ESRD patients receiving maintenance hemodialysis, we measured the expression of Nrf2 and cyclin-dependent kinase inhibitor 2A (CDKN2A, or p16INK4a, a biomarker of biological aging) genes in whole blood and examined the association of Nrf2 with CDKN2A expression, using Spearman's rank correlation and multivariable linear regression models with adjustment for potential confounders. There was a significant negative correlation between Nrf2 and CDKN2A expression (rho=-0.51, P=0.002); while no significant correlation was found between Nrf2 expression and chronological age (rho=-0.02, P=0.91). After multivariable adjustment, Nrf2 expression remained significantly and negatively associated with CDKN2A expression (β coefficient=-1.51, P=0.01), independent of chronological age, gender, race, and diabetes status. These findings suggest a potential contribution of Nrf2 dysfunction to the development of premature biological aging and its related morbidities in ESRD patients.
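The two analyses described above, Spearman rank correlation and multivariable linear regression with adjustment for age, gender, race, and diabetes, can be sketched as follows on simulated data; the variable names and simulated values are assumptions, not the study dataset.

```python
# Hedged sketch of the analyses described above: Spearman rank correlation
# between Nrf2 and CDKN2A expression, and a multivariable linear model
# adjusting for age, sex, race, and diabetes. Data are simulated.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 34
nrf2 = rng.normal(1.0, 0.3, n)
df = pd.DataFrame({
    "nrf2": nrf2,
    "cdkn2a": 2.0 - 1.5 * nrf2 + rng.normal(0, 0.3, n),  # simulated inverse relation
    "age": rng.normal(60, 12, n),
    "male": rng.integers(0, 2, n),
    "black": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
})

rho, p = spearmanr(df["nrf2"], df["cdkn2a"])
print(f"Spearman rho={rho:.2f}, P={p:.3g}")

model = smf.ols("cdkn2a ~ nrf2 + age + male + black + diabetes", data=df).fit()
print(model.params["nrf2"], model.pvalues["nrf2"])   # adjusted beta for Nrf2
```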