
Showing papers in "Clinical Transplantation in 2016"


Journal ArticleDOI
TL;DR: Voriconazole use has increased since the drug's introduction in 2002, and new and unique adverse effects are emerging as patients undergo prolonged therapy; most concerning is the increased risk of cutaneous malignancies, primarily squamous cell carcinoma.
Abstract: Voriconazole use has increased since the drug's introduction in 2002, and new and unique adverse effects are emerging as patients undergo prolonged therapy. Most concerning is the increased risk of cutaneous malignancies, primarily squamous cell carcinoma (SCC); this risk is duration dependent and the associated malignancies tend to be more aggressive and multifocal. Voriconazole is also associated with phototoxicity (which may be a precursor to malignancy), periostitis, hallucinations and encephalopathy, peripheral neuropathy, alopecia, nail changes, hyponatremia, and other adverse effects. Some toxicities (neuropsychiatric and gastrointestinal including hepatic) are seen in clear association with supratherapeutic serum voriconazole levels; thus, careful monitoring of voriconazole levels is a critical component of safe drug use. Guidelines for screening for adverse effects after long-term voriconazole use may be beneficial and need to be established.

77 citations


Journal ArticleDOI
TL;DR: The identification of factors that predispose kidney transplantation recipients to medication misunderstanding, non‐adherence, and hospitalization could help target appropriate self‐care interventions.
Abstract: We sought to evaluate the prevalence of medication understanding and non-adherence of entire drug regimens among kidney transplantation (KT) recipients and to examine associations of these exposures with clinical outcomes. Structured, in-person interviews were conducted with 99 adult KT recipients between 2011 and 2012 at two transplant centers in Chicago, IL, and Atlanta, GA. Nearly one-quarter (24%) of participants had limited literacy as measured by the Rapid Estimate of Adult Literacy in Medicine test; patients took a mean of 10 (SD=4) medications and 32% had a medication change within the last month. On average, patients knew what 91% of their medications were for (self-report) and demonstrated proper dosing (via observed demonstration) for 83% of medications. Overall, 35% were non-adherent based on either self-report or tacrolimus level. In multivariable analyses, fewer months since transplant and limited literacy were associated with non-adherence (all P<.05). Patients with minority race, a higher number of medications, and mild cognitive impairment had significantly lower treatment knowledge scores. Non-white race and lower income were associated with higher rates of hospitalization within a year following the interview. The identification of factors that predispose KT recipients to medication misunderstanding, non-adherence, and hospitalization could help target appropriate self-care interventions.

73 citations


Journal ArticleDOI
TL;DR: Significant reductions in the toll of DGF in the near future seem unlikely but concentrated research on many levels offers long‐term promise.
Abstract: Delayed graft function (DGF) remains a major barrier to improved outcomes after kidney transplantation. High-risk transplant recipients can be identified, but no definitive prediction model exists. Novel biomarkers to predict DGF in the first hours post-transplant, such as neutrophil gelatinase-associated lipocalin (NGAL), are under investigation. Donor management to minimize the profound physiological consequences of brain death is highly complex. A hormonal resuscitation package to manage the catecholamine “storm” that follows brain death is recommended. Donor pretreatment with dopamine prior to procurement lowers the rate of DGF. Hypothermic machine perfusion may offer a significant reduction in the rate of DGF vs simple cold storage, but costs need to be evaluated. Surgically, reducing warm ischemia time may be advantageous. Research into recipient preconditioning options has so far not generated clinically helpful interventions. Diagnostic criteria for DGF vary, but requirement for dialysis and/or persistent high serum creatinine is likely to remain key to diagnosis until current work on early biomarkers has progressed further. Management centers on close monitoring of graft (non)function and physiological parameters. With so many unanswered questions, substantial reductions in the toll of DGF in the near future seem unlikely but concentrated research on many levels offers long-term promise.

72 citations


Journal ArticleDOI
TL;DR: This work evaluates the extent to which functional impairment/disability is associated with increased risk of postoperative death.
Abstract: BACKGROUND Frail patients are more vulnerable to perioperative stressors of liver transplantation (LT). Program Specific Reports, used in transplant center auditing, risk-adjust for frailty using the Karnofsky Performance Status (KPS) scale. We evaluate the extent to which functional impairment/disability is associated with increased risk of postoperative death. METHODS We included 24 505 first-time LT recipients from the Scientific Registry of Transplant Recipients (2006-2011). We categorized patients as Severe, Moderate, or Normal function/disability using the KPS scale and evaluated risk of 30- and 90-day mortality. Analyses took potential center-specific differences in KPS measurement protocols into account using hierarchical logistic modeling. RESULTS Over one-quarter of our population was Severely impaired/disabled, and 30.5% had no functional limitations. Severely and Moderately impaired/disabled patients had 2.56 (95% CI 1.91-3.44) and 1.40 (95% CI 1.10-1.78) times the odds of 30-day mortality, respectively, after adjusting for key recipient and donor factors. Estimates remained consistent regardless of Model for End-Stage Liver Disease score, medical condition, or clustering analyses by center. Technical/operative complications and multiorgan failure/hemorrhage were more common causes of death among more Severely disabled patients than in higher functioning groups. CONCLUSIONS Pre-transplant functional status, assessed using the KPS scale, is a reliable predictor of post-LT mortality in the United States.

66 citations


Journal ArticleDOI
TL;DR: The results demonstrate the feasibility of multiplexed gene expression quantification from FFPE renal allograft tissue, representing a method for prospective and retrospective validation of molecular diagnostics and its adoption in clinical transplantation pathology.
Abstract: Histopathologic diagnoses in transplantation can be improved with molecular testing. Preferably, molecular diagnostics should fit into standard-of-care workflows for transplant biopsies, that is, formalin-fixed paraffin-embedded (FFPE) processing. The NanoString® gene expression platform has recently been shown to work with FFPE samples. We aimed to evaluate its methodological robustness and feasibility for gene expression studies in human FFPE renal allograft samples. A literature-derived antibody-mediated rejection (ABMR) 34-gene set, comprised of endothelial, NK cell, and inflammation transcripts, was analyzed in different retrospective biopsy cohorts and showed potential to molecularly discriminate ABMR cases, including FFPE samples. NanoString® results were reproducible across a range of RNA input quantities (r = 0.998), with different operators (r = 0.998), and between different reagent lots (r = 0.983). There was moderate correlation between NanoString® with FFPE tissue and quantitative reverse transcription polymerase chain reaction (qRT-PCR) with corresponding dedicated fresh-stabilized tissue (r = 0.487). Better overall correlation with histology was observed with NanoString® (r = 0.354) than with qRT-PCR (r = 0.146). Our results demonstrate the feasibility of multiplexed gene expression quantification from FFPE renal allograft tissue. This represents a method for prospective and retrospective validation of molecular diagnostics and its adoption in clinical transplantation pathology.

57 citations


Journal ArticleDOI
TL;DR: Low body mass index is used to measure muscle wasting but can over- or underestimate muscle mass, and it is unknown whether muscle wasting is important in lung transplantation.
Abstract: Background Frailty in non-transplant populations increases morbidity and mortality. Muscle wasting is an important frailty characteristic. Low body mass index is used to measure wasting, but can over- or underestimate muscle mass. Computed tomography (CT) software can directly measure muscle mass. It is unknown if muscle wasting is important in lung transplantation. Aim and Methods The aim of this single-center, retrospective cohort study was to determine whether pre-transplant low muscle mass (as measured by CT using Slice-O-matic software at L2–L3 interspace) was associated with post-transplantation mortality, hospital and intensive care unit length of stay (LOS), duration of mechanical ventilation, or primary graft dysfunction. Lung transplant recipients from 2000 to 2012 with a CT scan less than six months prior to transplant were included. Univariate, multivariate, and Kaplan–Meier analyses were conducted. Results Thirty-six patients were included. Those with low muscle index (lower 25th percentile) had a worse survival (hazard ratio = 3.83; 95% confidence interval 1.42–10.3; p = 0.007) and longer hospital LOS by an estimated 7.2 d (p = 0.01) when adjusted for age and sex as compared to those with higher muscle index. Conclusion Low muscle index at lung transplantation is associated with worse survival and increased hospital LOS.
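The survival comparison above relies on Kaplan–Meier estimation. As a rough illustration of how such curves are built, here is a minimal product-limit estimator in Python; the follow-up data are invented for illustration, not taken from the study.

```python
# Kaplan-Meier product-limit estimator: S(t) = product over event times
# t_i <= t of (1 - d_i / n_i), where d_i = deaths at t_i and
# n_i = patients still at risk just before t_i.

def kaplan_meier(times, events):
    """times: follow-up durations; events: 1 = death, 0 = censored.
    Returns a list of (time, survival) steps at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, _ in data if tt == t)
        if deaths > 0:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed  # deaths and censorings leave the risk set
        i += removed
    return curve

# Hypothetical follow-up (months) and event indicators:
times = [2, 5, 5, 8, 12, 16, 20, 24]
events = [1, 1, 0, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

Comparing curves between groups (as the study does for low vs. higher muscle index) then uses a log-rank test, which standard survival libraries provide.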

52 citations


Journal ArticleDOI
TL;DR: Isolated recreational MJ use is not associated with poorer patient or kidney allograft outcomes at 1 year, and recreationalMJ use should not necessarily be considered a contraindication to kidney transplantation.
Abstract: As marijuana (MJ) legalization is increasing, kidney transplant programs must develop listing criteria for marijuana users. However, no data exist on the effect of MJ on kidney allograft outcomes, and there is no consensus on whether MJ use should be a contraindication to transplantation. We retrospectively reviewed 1225 kidney recipients from 2008 to 2013. Marijuana use was defined by positive urine toxicology screen and/or self-reported recent use. The primary outcome was death at 1 year or graft failure (defined as GFR<20 mL/min/1.73 m2). The secondary outcome was graft function at 1 year. Using logistic regression analyses, we compared these outcomes between MJ users and non-users. Marijuana use was not associated with worse primary outcomes by unadjusted (odds ratio 1.07, 95% CI 0.45–2.57, P=.87) or adjusted (odds ratio 0.79, 95% CI 0.28–2.28, P=.67) analysis. Ninety-two percent of grafts functioned at 1 year. Among these, the mean creatinine (1.52, 95% CI 1.39–1.69 vs 1.46, 95% CI 1.42–1.49; P=.38) and MDRD GFR (50.7, 95% CI 45.6–56.5 vs 49.5, 95% CI 48.3–50.7; P=.65) were similar between groups. Isolated recreational MJ use is not associated with poorer patient or kidney allograft outcomes at 1 year. Therefore, recreational MJ use should not necessarily be considered a contraindication to kidney transplantation.

49 citations


Journal ArticleDOI
TL;DR: Current measures of obesity do not accurately describe body composition; with cross-sectional imaging, objective measures of musculature and adiposity are possible and may inform efforts to optimize liver transplantation outcomes.
Abstract: BACKGROUND Current measures of obesity do not accurately describe body composition. Using cross-sectional imaging, objective measures of musculature and adiposity are possible and may inform efforts to optimize liver transplantation outcomes. METHODS Abdominal visceral fat area and psoas muscle cross-sectional area were measured on CT scans for 348 liver transplant recipients. After controlling for donor and recipient characteristics, survival analysis was performed using Cox regression. RESULTS Visceral fat area was significantly associated with post-transplant mortality (p < 0.001; HR = 1.06 per 10 cm², 95% CI: 1.04-1.09), as were positive hepatitis C status (p = 0.004; HR = 1.78, 95% CI: 1.21-2.61) and total psoas area (TPA) (p < 0.001; HR = 0.91 per cm², 95% CI: 0.88-0.94). Among patients with smaller TPA, the patients with high visceral fat area had 71.8% one-yr survival compared to 81.8% for those with low visceral fat area (p = 0.15). At five yr, the smaller muscle patients with high visceral fat area had 36.9% survival compared to 58.2% for those with low visceral fat area (p = 0.023). CONCLUSIONS Abdominal adiposity is associated with survival after liver transplantation, especially in patients with small trunk muscle size. When coupled with trunk musculature, abdominal adiposity offers direct characterization of body composition that can aid preoperative risk evaluation and inform transplant decision-making.

48 citations


Journal ArticleDOI
TL;DR: This work sought to understand antimicrobial prescribing practices and identify opportunities for interdisciplinary collaboration among the transplant, antimicrobial stewardship, and infectious diseases teams.
Abstract: Objective The rising incidence of Clostridium difficile and multidrug-resistant organism infections and a dwindling development of new antimicrobials are an impetus for antimicrobial stewardship in organ transplant recipients. We sought to understand antimicrobial prescribing practices and identify opportunities for interdisciplinary collaboration among the transplant, antimicrobial stewardship, and infectious diseases teams. Methods In 2013, two assessors conducted four real-time audits of all antimicrobial therapy in transplant patients, assessing each regimen against stewardship principles established by the Centers for Disease Control and Prevention, supplemented by applicable transplant-specific infection guidelines. The chi-square test was used to compare stewardship-concordant and stewardship-discordant audit results relative to transplant infectious diseases consultation. Results Analysis was performed on 176 audits. Fifty-eight percent (103/176) received at least one antimicrobial, of which 69.9% (72/103) were stewardship-concordant. Infections were confirmed or suspected in 52.3% (92/176). Of those, 98.9% (91/92) received antimicrobials, and 41.8% (38/91) were prescribed by transplant clinicians. Infectious diseases consultation was associated with more stewardship-concordant prescriptions (78.5% vs. 59.6%, p = 0.03). The most common stewardship-discordant categories were lack of de-escalation, an empiric antimicrobial spectrum that was too broad, and therapy duration that was too long. Conclusions Opportunities exist for antimicrobial stewardship in transplant recipients, especially those who do not require infectious diseases consultation.

46 citations


Journal ArticleDOI
TL;DR: Pre-transplant PLA2R-Ab could be a useful tool for the prediction of rMN, and patients with rMN in the absence of PLA2R-Ab should be screened for occult malignancy and/or alternate antigens.
Abstract: Previous studies that have assessed the association of pre-transplant antiphospholipase A2 receptor autoantibody (PLA2R-Ab) concentration with a recurrence of membranous nephropathy (rMN) post-kidney transplant have yielded variable results. We tested 16 consecutive transplant patients with a history of idiopathic membranous nephropathy (iMN) for pre-transplant PLA2R-Ab. Enzyme-linked immunosorbent assay titers (Euroimmun, NJ, USA) >14 RU/mL were considered positive. A receiver operating characteristic (ROC) analysis was performed after combining data from Quintana et al. (n = 21; Transplantation February 2015) to determine a PLA2R-Ab concentration which could predict rMN. Six of 16 (37%) patients had biopsy-proven rMN at a median of 3.2 yr post-transplant. Of these, five of six (83%) had a positive PLA2R-Ab pre-transplant with a median of 82 RU/mL (range = 31-1500). The only patient who had rMN with negative PLA2R-Ab was later diagnosed with B-cell lymphoma. One hundred percent (n = 10) of patients with no evidence of rMN (median follow-up = five yr) had negative pre-transplant PLA2R-Ab. In a combined ROC analysis (n = 37), a pre-transplant PLA2R-Ab > 29 RU/mL predicted rMN with a sensitivity of 85% and a specificity of 92%. Pre-transplant PLA2R-Ab could be a useful tool for the prediction of rMN. Patients with rMN in the absence of PLA2R-Ab should be screened for occult malignancy and/or alternate antigens.
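The reported cutoff works like any binary threshold on a continuous marker: sensitivity is the fraction of recurrences above it, specificity the fraction of non-recurrences at or below it. A minimal sketch in Python (the titers and outcomes below are invented, not the study data):

```python
# Sensitivity/specificity of a continuous-marker cutoff, analogous to the
# pre-transplant PLA2R-Ab > 29 RU/mL rule. All data here are hypothetical.

def sens_spec(titers, recurred, cutoff):
    tp = sum(1 for t, r in zip(titers, recurred) if r and t > cutoff)
    fn = sum(1 for t, r in zip(titers, recurred) if r and t <= cutoff)
    tn = sum(1 for t, r in zip(titers, recurred) if not r and t <= cutoff)
    fp = sum(1 for t, r in zip(titers, recurred) if not r and t > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

titers   = [5, 8, 12, 31, 45, 82, 150, 20, 60, 10]  # RU/mL (hypothetical)
recurred = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1]           # biopsy-proven rMN
sens, spec = sens_spec(titers, recurred, cutoff=29)
print(sens, spec)  # 0.8 0.8 on this toy data
```

An ROC analysis, as in the paper, simply sweeps the cutoff over all observed values and picks the threshold with the best sensitivity/specificity trade-off.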

46 citations


Journal ArticleDOI
TL;DR: Peak oxygen uptake (VO2peak) is the gold standard measure of cardiopulmonary fitness; the authors hypothesized that measures of physical health would predict long-term survival in heart transplant (HTx) recipients.
Abstract: Background Peak oxygen uptake (VO2peak) is known as the gold standard measure of cardiopulmonary fitness. We therefore hypothesized that measures of physical health would predict long-term survival in heart transplant (HTx) recipients. Methods This retrospective study investigated survival in two HTx populations: the cardiopulmonary exercise test (CPET) cohort comprised 178 HTx patients who completed a VO2peak test during their annual follow-up (1990-2003), and the SF-36 cohort comprised 133 patients who completed a quality of life questionnaire, SF-36v1 (1998-2000). Results Mean (SD) age was 52 (12) yr in the CPET cohort and 54 (11) yr in the SF-36 cohort. Mean observation time was, respectively, 11 and 10 yr. Mean (SD) VO2peak was 19.6 (5.3) mL/kg/min, and median (IQR) physical function (PF) score was 90 (30). VO2peak and PF scores were both significant predictors in univariate Cox regression. Multiple Cox regression analyses adjusted for other potential predictors showed that VO2peak, age, and cardiac allograft vasculopathy (CAV) were the most important predictors in the CPET cohort, whereas age, PF score, smoking, and CAV were the most important predictors in the SF-36 cohort. In Kaplan-Meier analysis, VO2peak and PF scores above the median value were related to significantly longer survival time. Conclusion Peak oxygen uptake and self-reported physical health are strong predictors of long-term survival in HTx recipients. VO2peak is a crucial measurement and should be more frequently used after HTx.

Journal ArticleDOI
TL;DR: Mindfulness‐based resilience training holds promise as an intervention to enhance resilience and manage stress for transplant patients and their caregivers.
Abstract: Solid organ and stem cell transplant patients and their caregivers report a substantial level of distress. Mindfulness-based stress reduction has been shown to alleviate distress associated with transplant, but there is limited experience in this population with other mindfulness-based interventions, or with combined transplant patient and caregiver interventions. We evaluated a novel, 6-week mindfulness-based resilience training (MBRT) class for transplant patients and their caregivers that incorporates mindfulness practice, yoga, and neuroscience of stress and resilience. Thirty-one heart, liver, kidney/pancreas, and stem cell transplant patients and 18 caregivers at Mayo Clinic in Arizona participated. Measures of stress, resilience, depression, anxiety, health-related quality of life, positive and negative affect, and sleep were completed at baseline, 6 weeks, and 3 months postintervention. At 6 weeks and 3 months, patients demonstrated significant (P<.005) improvements from baseline in measures of perceived stress, depression, anxiety, and negative affect. Quality-of-life mental component (P=.006) and positive affect (P=.02) also improved at follow-up. Most participants adhered to the program, were satisfied with class length and frequency, and reported improved well-being as a result of the class. MBRT holds promise as an intervention to enhance resilience and manage stress for transplant patients and their caregivers.

Journal ArticleDOI
TL;DR: The 12‐month PROTECT study showed that de novo liver transplant recipients who switched from a calcineurin inhibitor (CNI)‐based immunosuppression to a CNI‐free everolimus (EVR)‐ based regimen showed numerically better renal function.
Abstract: Background The 12-month (M) PROTECT study showed that de novo liver transplant recipients (LTxR) who switched from a calcineurin inhibitor (CNI)-based immunosuppression to a CNI-free everolimus (EVR)-based regimen showed numerically better renal function. Here, we present the five-yr follow-up data. Methods PROTECT was a randomized controlled study in which LTxR received basiliximab and CNI-based immunosuppression ± corticosteroids. Patients were randomized 1:1 to receive EVR or continue CNI. Patients completing the core study could enter the extension study on their randomized treatment. Results A total of 81 patients entered the extension study (41, EVR; 40, CNI). At M59 post-randomization, the adjusted mean eGFR was significantly higher in the EVR group, with a benefit of 12.4 mL/min using Cockcroft–Gault (95% CI: 1.2; 23.6; p = 0.0301). Also, there was a significant benefit for adjusted and unadjusted eGFR using the four-variable Modification of Diet in Renal Disease (MDRD4) or Nankivell formula. During the extension period, treatment failure rates were similar. SAEs occurred in 26 (63.4%) and 28 (70.0%) of the patients in EVR and CNI groups, respectively. Conclusion Compared with the CNI-based treatment, EVR-based CNI-free immunosuppression resulted in significantly better renal function and comparable patient and graft outcomes after five-yr follow-up.
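The renal-function estimates named above (Cockcroft–Gault and MDRD4) are standard published formulas; a quick sketch of both, with illustrative patient values rather than study data, might look like this:

```python
# Two renal-function estimates used in the study. Patient values below
# are illustrative only.

def cockcroft_gault(age, weight_kg, scr_mg_dl, female):
    """Creatinine clearance in mL/min (Cockcroft-Gault)."""
    crcl = (140 - age) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd4(age, scr_mg_dl, female, black):
    """eGFR in mL/min/1.73 m^2 (four-variable MDRD, IDMS-traceable)."""
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

print(round(cockcroft_gault(55, 80, 1.4, female=False), 1))
print(round(mdrd4(55, 1.4, female=False, black=False), 1))
```

The two formulas can diverge by 10 mL/min or more for the same patient, which is why the study reports the treatment benefit under each formula separately.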

Journal ArticleDOI
TL;DR: In conclusion, QTc prolongation appears to be associated with worse outcomes, and although DD did not impact outcomes, it significantly worsened after transplantation.
Abstract: Cirrhotic cardiomyopathy causes variable degree of systolic and diastolic dysfunction (DD) and conduction abnormalities. The primary aim of our study was to determine whether pre-transplant DD and prolonged corrected QT (QTc) predict a composite of mortality, graft failure, and major cardiovascular events after liver transplantation. We also evaluated the reversibility of cirrhotic cardiomyopathy after transplantation. Adult patients who underwent liver transplantation at our institution from January 2007 to March 2009 were included. Data were obtained from institutional registry, medical record review, and evaluation of echocardiographic images. Among 243 patients, 113 (46.5%) had grade 1 DD, 16 (6.6%) had grade 2 DD, and none had grade 3 DD. The mean pre-transplant QTc was 453 milliseconds. After a mean post-transplant follow-up of 5.2 years, 75 (31%) patients satisfied the primary composite outcome. Cox regression analysis did not show any significant association between DD and the composite outcome (P=.17). However, longer QTc was independently associated with the composite outcome (HR: 1.01, 95% confidence interval: 1.00-1.02, P=.05). DD (P<.001) and left ventricular mass index (P=.001) worsened after transplantation. In conclusion, QTc prolongation appears to be associated with worse outcomes. Although DD did not impact outcomes, it significantly worsened after transplantation.
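The abstract does not state which heart-rate correction was applied to QT; Bazett's formula, QTc = QT / √RR, is the most common. A minimal sketch:

```python
import math

# Bazett's heart-rate correction for the QT interval (assumed here; the
# paper does not state which correction formula was used).

def qtc_bazett(qt_ms, heart_rate_bpm):
    rr_s = 60.0 / heart_rate_bpm  # RR interval in seconds
    return qt_ms / math.sqrt(rr_s)

# At 75 bpm (RR = 0.8 s), a measured QT of 405 ms corrects to ~453 ms,
# matching the mean pre-transplant QTc reported above.
print(round(qtc_bazett(405, 75)))  # 453
```

A QTc above roughly 440-450 ms is conventionally considered prolonged, which makes the reported cohort mean of 453 ms notable.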

Journal ArticleDOI
TL;DR: Debate continues about right vs. left donor nephrectomy: the left side is preferred due to its longer renal vein, while the right side has been associated with renal vein thrombosis and shorter vessels.
Abstract: Background Debate continues about right vs. left donor nephrectomy. The left side is preferred due to its longer renal vein, while the right side has been associated with renal vein thrombosis and shorter vessels. Methods A retrospective analysis of the UNOS database for adult living donor transplants between 1 January 2000 and 31 December 2009. Results We identified 58 599 living donor transplants, of which 86.1% were left donor nephrectomy (LDN). There were no significant differences in recipient or donor demographics. There were higher rates of delayed graft function in right donor nephrectomy (RDN) recipients, with a hazard ratio of 1.38 (95% CI 1.24–1.53; p < 0.0001). Primary failure rates were similar. In the RDN group, graft thrombosis as a cause of graft failure was statistically higher, with a hazard ratio of 1.48 (95% CI 1.18–1.86, p = 0.0004), and graft survival was significantly inferior (p = 0.006, log-rank test). For living donor outcomes, conversion from laparoscopic to open surgery was higher in the RDN group, with an odds ratio of 2.02 (95% CI 1.61–2.52; p < 0.00001). There was no significant difference in vascular complications or re-operation required due to bleeding. Re-operations and re-admissions were higher in the LDN group. Conclusion There are statistical differences between left and right kidney donor nephrectomies in recipient outcomes, but the differences are extremely small. Laterality should be based on center and surgeon preference and experience.
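Odds ratios with 95% confidence intervals like those quoted here come from standard formulas; for a 2×2 table, an odds ratio with a Woolf (log-scale) interval can be sketched as follows (the counts below are invented, not the UNOS data):

```python
import math

# Odds ratio with a 95% Woolf (log-scale) confidence interval from a
# 2x2 table. Counts are hypothetical, not taken from the study.

def odds_ratio_ci(a, b, c, d):
    """a, b = events/non-events in group 1; c, d = same in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# E.g., 40/200 conversions in one group vs. 25/300 in the other:
or_, lo, hi = odds_ratio_ci(40, 160, 25, 275)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1.0, as here, corresponds to a statistically significant difference at the 5% level.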

Journal ArticleDOI
TL;DR: This study evaluates the caliber of education mainstream media provides the public about brain death.
Abstract: INTRODUCTION: We sought to evaluate the caliber of education mainstream media provides the public about brain death. METHODS: We reviewed articles published prior to July 31, 2015 on the most shared/heavily trafficked mainstream media websites of 2014 using the names of patients from two highly publicized brain death cases, "Jahi McMath" and "Marlise Munoz." RESULTS: We reviewed 208 unique articles. The subject was referred to as being "alive" or on "life support" in 72% (149) of the articles, 97% (144) of which also described the subject as being brain dead. A definition of brain death was provided in 4% (9) of the articles. Only 7% (14) of the articles noted that organ support should be discontinued after brain death declaration unless a family has agreed to organ donation. Reference was made to well-known cases of patients in persistent vegetative states in 16% (34) of articles and 47% (16) of these implied both patients were in the same clinical state. CONCLUSIONS: Mainstream media provides poor education to the public on brain death. Because public understanding of brain death impacts organ and tissue donation, it is important for physicians, organ procurement organizations, and transplant coordinators to improve public education on this topic.

Journal ArticleDOI
TL;DR: The duration of anticytomegalovirus (CMV) prophylaxis after lung transplantation (LT) varies among transplant centers.
Abstract: Background The duration of anticytomegalovirus (CMV) prophylaxis after lung transplantation (LT) varies among transplant centers. Methods A retrospective review of CMV donor-seropositive/recipient-seronegative (D+/R-) and CMV recipient-seropositive (R+) LT patients between January 2005 and September 2012 was performed. Starting January 2007, valganciclovir prophylaxis was given for at least 12 months (often lifelong) for CMV D+/R- and extended from three to six months for R+ LT patients. Risks of CMV infection and CMV disease, and mortality after LT, were assessed. Results A total of 88 LT patients were studied, including 32 CMV D+/R-, and 56 R+ patients. During the follow-up period, 11 (12.5%) patients had asymptomatic CMV infection, and nine (10.3%) developed CMV disease. CMV disease (HR, 4.189; 95% CI: 1.672-10.495; p = 0.002) and CMV infection and disease (HR, 3.775; 95% CI: 1.729-8.240; p = 0.001) were significant risk factors for mortality. Overall, no significant difference was observed in rates of CMV infection or disease among LT recipients who received shorter vs. extended CMV prophylaxis. Conclusions Despite extended prophylaxis, LT patients remain at risk of CMV infection and disease. CMV remains associated with increased mortality after transplantation.

Journal ArticleDOI
TL;DR: This study explores the effects of race and socioeconomic factors on transplantation for HCC while controlling for stage, resection status, and transplant candidacy.
Abstract: INTRODUCTION Liver transplantation is the most effective treatment for hepatocellular carcinoma (HCC) in eligible patients, but is not accessed equally by all. We explored the effects of race and socioeconomic factors on transplantation for HCC while controlling for stage, resection status, and transplant candidacy. PATIENTS AND METHODS All HCC patients, 2003-2013, were retrospectively analyzed using multivariate analysis to explore differences in transplantation rates among cohorts. RESULTS Of 3078 HCC patients, 754 (24%) were considered transplant eligible. Odds of transplantation were significantly higher for those with commercial insurance (OR = 1.99, 95% CI [1.42, 2.79]) and lower for black patients (OR = 0.55, 95% CI [0.33, 0.91]). Asians were more likely to be resected than white patients with similarly staged tumors and transplant criteria (p < 0.001). Patients not listed for transplantation for non-medical reasons were more likely to be government-insured (p = 0.02) and not white (p = 0.05). No step along the transplantation pathway was identified as the dominant hurdle. DISCUSSION Patients who are black or government-insured are significantly less likely to undergo transplantation for HCC despite controlling for tumor stage, resection status, and transplant eligibility. Asian patients have higher rates of hepatic resection, but also appear to have lower transplantation rates beyond this effect.

Journal ArticleDOI
TL;DR: DQ eplet mismatch represents a significant risk for becoming highly sensitized after allograft failure; studies comparing immunosuppression withdrawal vs. continuation in re-transplant candidates with a high baseline DQ eplet mismatch burden should be performed.
Abstract: Sensitization following renal allograft failure (AF) is highly variable. Some patients remain non-sensitized (NS), while others become highly sensitized (HS). We studied 66 NS patients who experienced AF after initial kidney transplantation. Post-failure, two main groups were identified: NS patients (panel reactive antibody [PRA] class I and II <10%) and HS patients (PRA class I or II ≥80%). The impact of acute rejection (AR), immunosuppression withdrawal (ISW) at AF, allograft nephrectomy, graft intolerance syndrome (GIS), and both standard serologic and eplet-based mismatches (MM) in inducing HS status after failure was examined. Late PRA testing post-failure revealed that 18 patients remained NS and 34 patients became HS. African American race, ISW at AF, DQB1 eplet MM, and presence of GIS were associated with becoming HS. Total zero eplet MM, zero DQA1/B1 eplet MM, continuation of immunosuppression after failure, and a hyporesponsive immune status characterized by recurrent infections were features of NS patients. DQ eplet MM represents a significant risk for becoming HS after AF. Studies comparing ISW vs. continuation in re-transplant candidates with a high baseline DQ eplet MM burden should be performed; this may provide insight into whether sensitization post-AF can be lessened.

Journal ArticleDOI
TL;DR: No randomized controlled trials (RCT) or international guidelines on antifungal prophylaxis (AFP) in the LTX population exist, prompting calls for more research into this issue.
Abstract: Background Lung transplant (LTX) recipients are at high risk of invasive aspergillus infections (IAI). However, no randomized controlled trials (RCT) or international guidelines on antifungal prophylaxis (AFP) in the LTX population exist. Methods A meta-analysis was performed to determine whether AFP reduces the rate of IAI after LTX. Six eligible observational studies (five with no prophylaxis, one with targeted prophylaxis; three studies including heart(-lung) transplantation) with a total of 748 patients were included. Results The pooled odds ratio (OR) for IAI (62 IAI in the intervention arm, 82 in the control group) was 0.234 (95% confidence interval [CI] 0.097-0.564, p = 0.001, z = -3.237). Pooled studies were characterized by substantial heterogeneity (I2 = 66.64%); the number needed to treat was 6.8. A subgroup analysis excluding heart transplant recipients also showed a statistically significant reduction of IAI with AFP (OR 0.183, 95% CI 0.0449-0.744, p = 0.018). Conclusion The present study suggests that universal antifungal prophylaxis reduces the incidence of IAI after LTX. However, the included studies are limited by small sample size, single-center structure without randomization, mixed populations (including heart/heart-lung transplant), and heterogeneity due to variations in immunosuppression and in the type and duration of antifungal prophylaxis. Therefore, there is a clear need for an adequately powered RCT.
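The number needed to treat reported above is the reciprocal of the absolute risk reduction, NNT = 1/(CER − EER). The abstract does not give the per-arm sizes, so the event rates in this sketch are hypothetical, chosen only to land near the reported NNT of 6.8:

```python
# Number needed to treat from control (CER) and experimental (EER)
# event rates: NNT = 1 / (CER - EER). All counts are hypothetical.

def nnt(control_events, control_n, exp_events, exp_n):
    cer = control_events / control_n
    eer = exp_events / exp_n
    return 1.0 / (cer - eer)

# E.g., a 30% infection rate without prophylaxis vs. 15% with it:
print(round(nnt(30, 100, 15, 100), 1))  # 6.7
```

Reading: roughly seven patients would need prophylaxis to prevent one invasive aspergillus infection under these assumed rates.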

Journal ArticleDOI
TL;DR: The aim of this study was to investigate the hypothesis that intraoperative infusion of dexmedetomidine can exert a protective effect against hepatic ischemia–reperfusion injury (IRI) in adult living donor liver transplantation (LDLT).
Abstract: Objective The aim of this study was to investigate the hypothesis that intraoperative infusion of dexmedetomidine can exert a protective effect against hepatic ischemia-reperfusion injury (IRI) in adult living donor liver transplantation (LDLT). Patients and methods Forty recipients were allocated into a control group (group I; n = 20) that received a placebo, and a dexmedetomidine group (group II; n = 20) that received a continuous intraoperative infusion of 0.8 μg/kg/h of dexmedetomidine. Data collected were AST, ALT, bilirubin, INR, and lactate at baseline, immediately post-operatively, and on post-operative days 1, 3, and 5. Intercellular adhesion molecule-1 (ICAM-1) was measured at baseline, 2 and 6 h after reperfusion, and on post-operative day 1. At the end of surgery, a liver biopsy was sent for histopathological assessment. Results No significant difference was noted between the groups in MELD score or baseline AST, ALT, bilirubin, INR, or lactate. Dexmedetomidine tended to decrease blood pressure and heart rate, but the difference was not statistically significant. Group II showed significantly attenuated levels of ICAM-1 and significantly milder histopathological changes. AST, ALT, bilirubin, INR, and lactate were significantly lower in group II. Conclusions Dexmedetomidine exerted protective effects against hepatic IRI during adult LDLT, as indicated by suppression of ICAM-1, better histopathological assessment scores, and improved post-operative liver function tests.

Journal ArticleDOI
TL;DR: This study compares outcomes after LTx utilizing different CPB strategies – elective CPB vs. off‐pump vs. off‐pump with unplanned conversion to CPB.
Abstract: BACKGROUND The risk-benefit for utilizing cardio-pulmonary bypass (CPB) in lung transplantation (LTx) remains debatable. This study compares outcomes after LTx utilizing different CPB strategies - elective CPB vs. off-pump vs. off-pump with unplanned conversion to CPB. METHODS A total of 302 LTx performed over seven yr were divided into three groups: "off-pump" group (n = 86), "elective on-pump" group (n = 162), and "conversion" group (n = 54). The preoperative donor and recipient demographics and baseline characteristics and the postoperative outcomes were analyzed; 1:1 propensity score matching was used to identify patients operated upon using elective CPB who had risk profiles similar to those operated upon off-pump (propensity-matching 1) as well as those emergently converted from off-pump to CPB (propensity-matching 2). RESULTS Preoperative group demographic characteristics were comparable; however, the "off-pump" patient group was significantly older. The "conversion" group had a significantly greater number of patients with primary pulmonary hypertension, pulmonary fibrosis, preoperative mechanical ventilation, and preoperative extracorporeal life support (ECLS). Postoperatively, patients from the "conversion" group had significantly poorer PaO2 /FiO2 ratios upon arrival in the intensive care unit (ICU) and at 24, 48, and 72 h postoperatively; they required more prolonged ventilation and longer ICU admission, and they experienced a greater need for ECLS than the other groups. Overall, cumulative survival at one, two, and three yr was significantly worse in patients from the "conversion" group compared to the "off-pump" and "elective on-pump" groups - 61.9% vs. 94.4% vs. 86.9%, 54.4% vs. 90.6% vs. 79.5%, and 39.8% vs. 78.1% vs. 74.3%, respectively (p < 0.001).
The "off-pump" group had significantly better PaO2 /FiO2 ratios, and a significantly shorter duration of ventilation, ICU stay, and hospital length of stay when compared to the propensity-matched "elective on-pump" group. There were no statistically significant differences in postoperative outcomes and overall survival between the "converted" group and the propensity-matched "elective on-pump" group. CONCLUSIONS Despite segregation of unplanned CPB conversion cases from elective on-pump cases, patients with comparable preoperative demographic/risk profiles demonstrated better early postoperative outcomes and, possibly, an improved early survival with an off-pump strategy. A considerable proportion of high-risk patients require intraoperative conversion from off-pump to CPB, and this seems associated with suboptimal outcomes; however, there is no significant benefit to employing an elective on-pump strategy over emergent conversion in the high-risk group.

Journal ArticleDOI
TL;DR: A lack of research exploring post‐transplant process optimization to reduce readmissions and increasing readmission rates at the authors' center from 2009 to 2013 led to this study, aimed at assessing the effect of patient and process factors on 30‐d readmissions rates after kidney transplantation.
Abstract: A lack of research exploring post-transplant process optimization to reduce readmissions, and increasing readmission rates at our center from 2009 to 2013, led to this study, aimed at assessing the effect of patient and process factors on 30-d readmission rates after kidney transplantation. This was a retrospective case-control study in adult kidney transplant recipients. Univariate and multivariate analyses were utilized to assess patient and process determinants of 30-d readmissions. A total of 384 patients were included; 30-d readmissions were significantly associated with graft loss and death (p = 0.001). Diabetes (p = 0.049), pharmacist identification of poor understanding or adherence, and prolonged time on hemodialysis prior to transplant were associated with an increased risk of 30-d readmissions. After controlling for risk factors, readmissions were independently predicted only by pharmacist identification of a patient's lack of understanding or adherence regarding post-transplant medications (OR 2.3, 95% CI 1.10-4.71, p = 0.026) and dialysis exposure for more than three yr (OR 2.1, 95% CI 1.22-3.70), both of which were significantly modified by history of diabetes. Thirty-d readmissions are attributable to both patient- and process-level factors. These data suggest that a lack of post-transplant medication knowledge in high-risk patients drives early hospital readmission.

Journal ArticleDOI
TL;DR: Maternal and fetal outcomes including change in left ventricular (LV) function and calcineurin inhibitor (CNI) dose in women who became pregnant from the authors' institution are investigated.
Abstract: Purpose Successful pregnancy following cardiac transplantation has been described, although outcome data from individual centers are relatively sparse. We investigated maternal and fetal outcomes, including change in left ventricular (LV) function and calcineurin inhibitor (CNI) dose, in women from our institution who became pregnant. Methods We identified every female patient at least 3 months post-surgery between 1985 and 2014. For those who conceived, medical records and transplant charts were reviewed; those currently alive were interviewed. Results There were 22 pregnancies in 17 women, with 20 live births (91%). Mean time from transplantation was 98±62.4 months. Rejection complicated one pregnancy, and LV function remained normal in all others. Hypertension complicated 3 pregnancies (13.6%), preeclampsia 3 (13.6%), and cholestasis 1 (4.5%). Mean birthweight was 2447±608 grams at 34.1±3.6 weeks. Four women died following pregnancy. A significant increase in the total daily dose of tacrolimus and cyclosporine A was required to maintain therapeutic levels through pregnancy (CyA, P<.001; Tac, P=.001), with no deterioration in serum creatinine. Conclusions We report a 91% live birth rate post-cardiac transplantation. Meticulous individualized care with frequent monitoring of CNI levels and LV function is necessary to optimize maternal and fetal outcomes.

Journal ArticleDOI
TL;DR: Patients with a functioning renal transplant have a high stroke incidence and case fatality and unlike those on hemodialysis, risk factors are similar to the general population, and warfarin use in those with AF did not demonstrate benefit.
Abstract: Stroke incidence is high in end-stage renal disease, and risk factors differ between the dialysis and general populations. However, risk factors and outcomes following renal transplantation remain unclear. We analyzed all adult patients with a functioning renal transplant from 01/01/2007 to 12/31/2012. Data were extracted from the electronic patient record. Variables associated with stroke were identified by survival analyses; demographic, clinical, imaging, and laboratory variables were assessed, and case fatality was determined. Follow-up was until 05/12/2013. A total of 956 patients were identified (median age 40.1 years, 59.9% male). Atrial fibrillation (AF) prevalence was 9.2%, and 38.2% received a transplant during follow-up. A total of 26 patients (2.7%) experienced a stroke during 4409 patient-years of follow-up (84.6% ischemic), for a stroke incidence of 5.96/1000 patient-years. Factors associated with stroke on regression analysis were prior stroke, diabetes, age, systolic hypertension, and hemoglobin. Atrial fibrillation was associated with time to stroke (P<0.001). Warfarin use was not associated with ischemic stroke risk in those with AF. Case fatality was 19.2% at 7 days, 23.1% at 28 days, and 42.3% at 365 days after stroke. Patients with a functioning renal transplant have a high stroke incidence and case fatality. Unlike those on hemodialysis, their risk factors are similar to those of the general population. We did not demonstrate benefit from warfarin use in those with AF.
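The incidence figure above follows from dividing events by person-time at risk. A quick check using the rounded totals reported in the abstract (26 strokes over 4409 patient-years) gives roughly 5.9 per 1000 patient-years, close to the published 5.96, which presumably used unrounded follow-up times:

```python
def incidence_rate_per_1000(events, patient_years):
    """Crude incidence rate per 1000 patient-years of follow-up."""
    return 1000 * events / patient_years

# Rounded totals from the abstract: 26 strokes, 4409 patient-years
rate = incidence_rate_per_1000(26, 4409)  # ~5.9 per 1000 patient-years
```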

Journal ArticleDOI
TL;DR: Dual KT (DKT) may reduce organ discard and optimize the use of kidneys from marginal donors as well as expand the organ donor pool through expanded criteria donors.
Abstract: Background The need to expand the organ donor pool remains a formidable challenge in kidney transplantation (KT). The use of expanded criteria donors (ECDs) represents one approach, but kidney discard rates are high because of concerns regarding overall quality. Dual KT (DKT) may reduce organ discard and optimize the use of kidneys from marginal donors. Study design We conducted a single-center retrospective review of outcomes in adult recipients of DKTs from adult marginal deceased donors (DD) defined by limited renal functional capacity. If the calculated creatinine clearance in an adult DD was <65 mL/min, then the kidneys were transplanted as a DKT. Results Over 11.5 yr, 72 DKTs were performed, including 45 from ECDs, 17 from donation after cardiac death (DCD) donors, and 10 from standard criteria donors (SCD). Mean adult DD and recipient ages were both 60 yr, including 29 DDs and 26 recipients ≥65 yr of age. Mean pre-DKT waiting and dialysis vintage times were 12 months and 25 months, respectively. Actual patient and graft survival rates were 84.7% and 70.8%, respectively, with a mean follow-up of 58 months. One-yr and death-censored graft survival rates were 90% and 80%, respectively. Outcomes did not differ by DD category, recipient age, or presence of delayed graft function (DGF). Eleven patients died at a mean of 32 months post-DKT (eight with functioning grafts), and 13 other patients experienced graft loss at a mean of 33 months. The incidence of DGF was 25%; there were two cases (2.8%) of primary non-function. Mean length of initial hospital stay was 7.2 d. Mean serum creatinine and glomerular filtration rate were 1.5 mg/dL and 53 mL/min/1.73 m² at 12 months, and 1.5 mg/dL and 51 mL/min/1.73 m² at 24 months, respectively. DKT graft survival and function were superior to those of concurrent single-ECD KTs and similar to those of concurrent SCD KTs. Two patients underwent successful kidney retransplantation, so the dialysis-free rate among surviving patients was 87%. The proportion of total renal function transplanted from adult DDs to DKT recipients was 77%, compared to 56% for patients receiving single KTs. Conclusions DKT using kidneys from adult marginal DDs that might otherwise be discarded offers a viable option to counteract the growing shortage of acceptable single kidneys. Excellent medium-term outcomes can be achieved, and waiting times can be reduced, in a predominantly older recipient population.
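The allocation rule above hinges on the donor's calculated creatinine clearance. The sketch below illustrates that decision using the Cockcroft-Gault estimate; note the abstract does not specify which formula the center used, so Cockcroft-Gault is an assumption here, and the 65 mL/min cutoff is taken directly from the study protocol:

```python
def cockcroft_gault_crcl(age, weight_kg, serum_cr_mg_dl, female=False):
    """Cockcroft-Gault estimated creatinine clearance in mL/min.

    CrCl = ((140 - age) * weight) / (72 * serum creatinine), x0.85 if female.
    Assumed formula for illustration; the study only states 'calculated CrCl'.
    """
    crcl = ((140 - age) * weight_kg) / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

def allocate_as_dual(crcl_ml_min, threshold=65.0):
    """Center protocol from the abstract: donor CrCl < 65 mL/min -> dual KT."""
    return crcl_ml_min < threshold

# Hypothetical 60-year-old, 70 kg male donor with serum creatinine 1.4 mg/dL
crcl = cockcroft_gault_crcl(age=60, weight_kg=70, serum_cr_mg_dl=1.4)  # ~55.6 mL/min
dual = allocate_as_dual(crcl)  # True: both kidneys go to one recipient as a DKT
```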

Journal ArticleDOI
TL;DR: A uniform, ethically defensible donor selection protocol would accept older donors with many minor medical abnormalities but protect from donation many currently acceptable younger, black, and/or low GFR candidates.
Abstract: Recent studies from the United States and Norway have suggested an unexpected 8- to 11-fold relative risk of ESRD after kidney donation, but a low long-term absolute risk. Abundant renal epidemiologic data predict that these studies have underestimated long-term risk. The 1% lifetime post-donation risk in the US study requires medical screening to predict ESRD in 96 of 100 candidates. This is particularly unlikely in the 30-35% of candidates under age 35, half of whose lifetime ESRD will occur after age 64. Many experts have attributed the increased relative risks in these studies to loss of GFR at donation, which ultimately means that high-normal pre-donation GFRs will reduce absolute post-donation risks. The 8- to 11-fold relative risks predict implausible risks of uninephrectomy in the general population, but lower estimates still result in very high risks for black donors. Young vs. older age, low vs. high-normal pre-donation GFRs, black race, and an increased relative risk of donation all predict highly variable individual risks, not a single "low" or "1%" risk as these studies suggest. A uniform, ethically defensible donor selection protocol would accept older donors with many minor medical abnormalities but protect from donation many currently acceptable younger, black, and/or low GFR candidates.

Journal ArticleDOI
TL;DR: The rationale behind the method, techniques and protocols, devices available, and clinical experience worldwide are outlined, and the potential of ex vivo lung perfusion in leading a new era of lung preservation is highlighted.
Abstract: Lung transplantation is an established life-saving therapy for patients with end-stage lung disease. Unfortunately, greater success in lung transplantation is hindered by a shortage of lung donors and the relatively poor early-, mid-, and long-term outcomes associated with severe primary graft dysfunction. Ex vivo lung perfusion has emerged as a modern preservation technique that allows for a more accurate lung assessment and improvement in lung quality. This review outlines the: (i) rationale behind the method; (ii) techniques and protocols; (iii) Toronto ex vivo lung perfusion method; (iv) devices available; and (v) clinical experience worldwide. We also highlight the potential of ex vivo lung perfusion in leading a new era of lung preservation.

Journal ArticleDOI
TL;DR: This work examined the impact of PH on outcome after lung transplantation, with special emphasis on pre‐ and post‐capillary PH.
Abstract: Purpose Pulmonary hypertension (PH) is recognized as a risk factor in lung transplantation, as reflected in the lung allocation score (LAS). We examined the impact of PH on outcome after lung transplantation, with special emphasis on pre- and post-capillary PH. Methods Consecutive lung transplant recipients were evaluated according to ISHLT criteria, including right heart catheterization, in the period from 1992 to October 2014. Post-transplant survival was assessed according to hemodynamic characteristics: post-capillary PH (mean pulmonary arterial pressure [mPAP] ≥ 25 mmHg and pulmonary arterial wedge pressure [PAWP] > 15 mmHg), pre-capillary PH (mPAP ≥ 25 mmHg, PAWP ≤ 15 mmHg), and non-PH (mPAP < 25 mmHg). Results Of 518 transplant recipients, 58 (11%) had post-capillary PH, 211 (41%) had pre-capillary PH, and 249 (48%) had no PH. Post-capillary PH and pre-capillary PH were associated with worse 90-d outcomes after transplantation compared to non-PH (p = 0.043 and 0.003, respectively). The negative effect persisted 1 yr post-transplantation in pre-capillary PH (p = 0.037), but not in post-capillary PH (p = 0.447). Long-term survival was unaffected by hemodynamic classification. Conclusion Post-capillary PH was present in 11% and pre-capillary PH in 41% of the transplant cohort. Both were associated with inferior 90-d survival, but long-term survival was unaffected.
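The hemodynamic groups defined above reduce to a simple decision rule on mPAP and PAWP. This is a direct transcription of the thresholds stated in the abstract:

```python
def classify_ph(mpap_mmhg, pawp_mmhg):
    """Classify pulmonary hypertension per the study's hemodynamic criteria:

    post-capillary PH: mPAP >= 25 mmHg and PAWP > 15 mmHg
    pre-capillary PH:  mPAP >= 25 mmHg and PAWP <= 15 mmHg
    non-PH:            mPAP < 25 mmHg
    """
    if mpap_mmhg < 25:
        return "non-PH"
    return "post-capillary PH" if pawp_mmhg > 15 else "pre-capillary PH"
```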

Journal ArticleDOI
TL;DR: The experience with a limited upper midline incision for living donor right hepatectomy is described, which is one way to increase donation by decreasing donor morbidity.
Abstract: Background Living donor liver transplantation is a viable option to increase access to transplantation, and limiting the operative incision is one way to increase donation by decreasing donor morbidity. We describe our experience with a limited upper midline incision (UMI) for living donor right hepatectomy. Study design Prospective data were collected on 58 consecutive living liver donors who underwent right hepatectomy via a UMI. Results Donor median age was 32 years, with a median body mass index of 24.6. The mean incision length was 11.7 cm. Ten liver grafts included the middle hepatic vein. The mean graft volume by preoperative imaging was 940 cc. The mean operative time was 407 minutes; a cell saver was utilized in 35 patients, with a median of 1 unit. Mean peak aspartate transaminase (AST) and alanine transaminase (ALT) were 492 and 469, and mean peak bilirubin and international normalized ratio (INR) were 3.3 and 1.8. The average length of stay was 6 days. There were 10 Clavien grade I and 11 Clavien grade II complications. Three patients developed an incisional hernia requiring surgical repair. Conclusion Living donor hepatectomy can be safely performed through a UMI. This approach consolidates the steps of liver mobilization, hilar dissection, and parenchymal transection in a single-exposure technique, with an incision comparable to the laparoscopic-assisted modality.