
Showing papers in "Clinical Transplantation in 2015"


Journal ArticleDOI
TL;DR: In patients with ESLD awaiting LT, 6MWD appears to be a more useful prognostic indicator than sarcopenia, which failed to predict waitlist mortality and did not correlate with either functional capacity or HRQOL in LT candidates.
Abstract: Sarcopenia, or loss of skeletal muscle mass, is associated with increased mortality and morbidity in liver transplant (LT) candidates. Six-minute walk distance (6MWD) and health-related quality of life (HRQOL) as assessed by short form 36 (SF-36) scores also impact clinical outcomes in these patients. This study explored the relationship between sarcopenia, 6MWD, and HRQOL in LT candidates. Sarcopenia was evaluated based on skeletal muscle mass index (SMI) quantified from abdominal computed tomography. Patients were followed until death, removal from the wait list, or the end of the study period. Two hundred and thirteen patients listed for LT were included. The mean SMI, 6MWD, and mean gait speed were 54.3 ± 9.7, 370.5 m, and 1 m/s, respectively. Sarcopenia was noted in 22.2% of LT candidates. There was no correlation between sarcopenia, 6MWD, and SF-36 scores. The 6MWD, but not sarcopenia, was an independent predictor of mortality (hazard ratio = 2.1 [0.9-4.7]). In summary, sarcopenia did not emerge as a significant predictor of waitlist mortality and also failed to correlate with either functional capacity or HRQOL in LT candidates. In patients with ESLD awaiting LT, 6MWD appears to be a more useful prognostic indicator than the presence of sarcopenia.

111 citations


Journal ArticleDOI
TL;DR: Results suggest LCPT is associated with clinically meaningful improvement of hand tremor and may be an alternative management approach in lieu of further dose reduction of immediate‐release tacrolimus for patients experiencing tremor.
Abstract: Tremor is a common side effect of tacrolimus correlated with peak-dose drug concentration. LCPT, a novel, once-daily, extended-release formulation of tacrolimus, has a reduced Cmax with comparable AUC exposure, requiring a ~30% dose reduction vs. immediate-release tacrolimus. In this phase 3b study, kidney transplant recipients (KTR) on a stable dose of tacrolimus and with a reported clinically significant tremor were offered a switch to LCPT. Tremor was evaluated pre-conversion and seven d post-conversion by independent, blinded movement disorder neurologists using the Fahn-Tolosa-Marin (FTM) scale and by an accelerometry device; patients completed the QUEST (quality of life in essential tremor) questionnaire and the Patient Global Impression of Change. There were 38 patients in the mITT population. Statistically and clinically significant post-conversion improvements were observed in tremor (FTM score and amplitude as measured by the accelerometry device) and in QOL (p-values < 0.05). Change in QUEST was significantly (p = 0.006) correlated (R = 0.44) with change in FTM; 78.9% of patients reported an improvement after switching to LCPT (p < 0.0005). To our knowledge, this is the first trial in KTR that utilizes a sophisticated and reproducible measurement of tremor. Results suggest LCPT is associated with clinically meaningful improvement of hand tremor and may be an alternative management approach in lieu of further dose reduction of immediate-release tacrolimus for patients experiencing tremor.

73 citations


Journal ArticleDOI
TL;DR: IVIG administration appeared to be safe and effective in treating BKV viremia and BkVN and preventing graft loss in patients who had inadequate response to immunosuppression reduction and leflunomide therapy.
Abstract: BK virus-associated nephropathy (BKVN) can cause clinically significant viral infections in renal transplant recipients, leading to allograft dysfunction and loss. The usual management of BKVN involves reduction of immunosuppression and the addition of leflunomide, quinolones, and cidofovir, but the rate of graft loss remains high. The aim of this study was to assess the impact of treatment with intravenous immunoglobulin (IVIG) on the outcome of BKVN in renal transplant recipients. Upon diagnosis of BKVN, patients remained on anti-polyomavirus treatment consisting of reduction of immunosuppression and the use of leflunomide therapy. Treatment with IVIG was given only to patients who did not respond to 8 weeks of the adjustment of immunosuppression and leflunomide. All 30 patients had persistent BK viremia and BKVN, with mean BK viral loads higher than baseline (range 15,000 to 2 million copies/mL). Mean peak BK load was 205,314 copies/mL, compared to 697 copies/mL after one year of follow-up. Twenty-seven patients (90%) had positive responses in clearing viremia. The actuarial patient and graft survival rates after 12 months were 100% and 96.7%, respectively. IVIG administration appeared to be safe and effective in treating BK viremia and BKVN and in preventing graft loss in patients who had inadequate response to immunosuppression reduction and leflunomide therapy.

69 citations


Journal ArticleDOI
TL;DR: IFI continue to occur in LTR, and the eradication of IFI appears to be challenging even with prolonged prophylaxis, while Azole resistance is uncommon despite prolonged AF exposure.
Abstract: Lung transplant recipients (LTR) at our institution receive prolonged and mostly lifelong azole antifungal (AF) prophylaxis. The impact of this prophylactic strategy on the epidemiology and outcome of invasive fungal infections (IFI) is unknown. This was a single-center, retrospective cohort study. We reviewed the medical records of all adult LTR from January 2002 to December 2011. Overall, 16.5% (15 of 91) of patients who underwent lung transplantation during this time period developed IFI. Nineteen IFI episodes were identified (eight proven, 11 probable), 89% (17 of 19) of which developed during AF prophylaxis. LTR with idiopathic pulmonary fibrosis were more likely to develop IFI (hazard ratio [HR]: 4.29; 95% confidence interval [CI]: 1.15-15.91; p = 0.03). A higher hazard of mortality was observed among those who developed IFI, although this was not statistically significant (HR: 1.71; 95% CI: 0.58-4.05; p = 0.27). Aspergillus fumigatus was the most common cause of IFI (45%), with pulmonary parenchyma being the most common site of infection. None of our patients developed disseminated invasive aspergillosis, cryptococcal or endemic fungal infections. IFI continue to occur in LTR, and the eradication of IFI appears to be challenging even with prolonged prophylaxis. Azole resistance is uncommon despite prolonged AF exposure.

49 citations


Journal ArticleDOI
TL;DR: Preferences for the testing and treatment of antibody‐mediated rejection in renal transplant patients vary among programs and individual practitioners and the description of these preferences and identification of commonalities can contribute to creating a standard of care.
Abstract: Introduction Preferences for the testing and treatment of antibody-mediated rejection (AMR) in renal transplant patients vary among programs and individual practitioners. The description of these preferences and identification of commonalities can contribute to creating a standard of care. Methods A survey was distributed through the Transplant Listserv of the American College of Clinical Pharmacy (ACCP) and via email to members of the American Society of Transplantation Community of Pharmacy (AST CoP), collected, and analyzed. Results Most clinicians (26/28) test for donor-specific antibodies (DSAs) when evaluating a patient with possible AMR. Treatments for AMR varied widely among responding clinicians and included intravenous immune globulin (IVIG, n = 25), plasmapheresis (n = 24), rituximab (n = 8), bortezomib (n = 4), rabbit antithymocyte globulin (n = 2), and eculizumab (n = 1). Weight-based dosing of IVIG averaged 1.8 g/kg total dose. Six centers use rituximab as initial therapy, while two use rituximab if other therapy fails. Four centers use bortezomib as initial therapy, while two centers use it for severe/persistent AMR. One center uses eculizumab as initial therapy and one center uses it for severe AMR. Conclusion Methods for the detection of AMR are similar, yet treatment of AMR varies widely. Most centers utilize DSA for detection and a combination of IVIG and plasmapheresis for treatment.

48 citations


Journal ArticleDOI
TL;DR: Data about sarcopenic obesity in liver transplant recipients are limited; in this cohort, the majority of recipients developed sarcopenic obesity and metabolic syndrome despite resuming routine activities, so the role of appropriate nutrition and exercise after transplantation merits further investigation.
Abstract: Objective There are limited data about sarcopenic obesity in liver transplant recipients. Methods Living donor liver transplant recipients with at least 12 months of follow-up were included. Metabolic syndrome (MS) was defined as ≥3 ATP III criteria. Body composition was assessed by bioelectrical impedance. Immunosuppression protocol included short-term steroids, mycophenolate and calcineurin inhibitors (mainly tacrolimus). Data are shown as percentage, mean ± SD, or median (25–75 IQR). Results The study comprised 82 patients (69 males), aged 50.5 ± 10.65 yr, with a median follow-up of 24 (12–38.5) months. Etiology for cirrhosis was alcohol 29%, hepatitis C 22%, hepatitis B 17%, cryptogenic 24%, and others 7%. Post-transplant sarcopenic obesity was present in 72 (88%), and MS was present in 43 (52%) of recipients, with no significant difference among etiologies. There were significant differences between pre- and post-transplant body mass index, triglycerides, high-density lipoprotein, low-density lipoprotein (p < 0.001 for all), prevalence of hypertension (18% vs. 39%), and diabetes (20% vs. 56%). Patients with sarcopenic obesity had significantly higher body mass index, waist circumference, and prevalence of MS (57% vs. 20%, p = 0.041) when compared to patients without sarcopenic obesity. Conclusion Despite resuming routine activities, the majority of liver transplant recipients develop sarcopenic obesity and MS. The importance and role of appropriate nutrition and exercise after transplantation merits further investigation.

47 citations


Journal ArticleDOI
TL;DR: Failure to rescue (FTR), previously shown to be a primary driver of mortality following major general and vascular surgery, is hypothesized to be common in sarcopenic liver transplant recipients.
Abstract: Introduction Sarcopenic liver transplant recipients have higher rates of mortality, but mechanisms underlying these rates remain unclear. Failure to rescue (FTR) has been shown to be a primary driver of mortality following major general and vascular surgery. We hypothesized that FTR is common in sarcopenic liver transplant recipients. Methods We retrospectively reviewed 348 liver transplant recipients with perioperative CT scans. Analytic morphomic techniques were used to assess trunk muscle size via total psoas area (TPA). One-yr major complication and FTR rates were calculated across TPA tertiles. Results The one-yr complication rate was 77% and the FTR rate was 19%. Multivariate regression showed TPA as a significant predictor of FTR (OR = 0.27 per 1000 mm² increase in TPA, p < 0.001). Compared to patients in the largest muscle tertile, patients in the smallest tertile had 1.4-fold higher adjusted complication rates (91% vs. 66%) and 2.8-fold higher adjusted FTR rates (22% vs. 8%). Discussion These results suggest that mortality in sarcopenic liver transplant recipients may be strongly related to FTR. Efforts aimed at early recognition and management of complications may decrease postoperative mortality. Additionally, this work highlights the need for expanded multicenter collaborations aimed at collection and analysis of postoperative complications in liver transplant recipients.
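Failure to rescue is conventionally computed as the share of patients who die among those who have already suffered a major complication, and the abstract above reports it as a rate within each muscle-size tertile. A minimal sketch of that arithmetic, using hypothetical counts rather than the study's data:

```python
# Failure to rescue (FTR): deaths among patients who suffered a major
# complication, expressed as a fraction of those complicated patients.
def ftr_rate(n_with_complication: int, n_died_after_complication: int) -> float:
    """Return the failure-to-rescue rate for one cohort or tertile."""
    if n_with_complication == 0:
        return 0.0
    return n_died_after_complication / n_with_complication

# Illustrative tertile counts (hypothetical, not from the study):
smallest = ftr_rate(n_with_complication=100, n_died_after_complication=22)
largest = ftr_rate(n_with_complication=100, n_died_after_complication=8)
print(f"FTR, smallest-muscle tertile: {smallest:.0%}")  # 22%
print(f"FTR, largest-muscle tertile:  {largest:.0%}")   # 8%
```

Note that FTR conditions on having a complication, so it separates "complications happen more often" from "complications are managed less successfully," which is the distinction the study draws.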

45 citations


Journal ArticleDOI
TL;DR: This carefully selected cohort of older OLT recipients had outcomes comparable with those of younger recipients; the results show the need for better pre-OLT evaluation and optimization, and for closer post-OLT surveillance, of cardiovascular disease among the elderly.
Abstract: With the increasing age of recipients undergoing orthotopic liver transplant (OLT), there is need for better risk stratification among them. Our study aims to identify predictors of poor outcome among OLT recipients ≥ 60 yr of age. All patients who underwent OLT at Cleveland Clinic from January 2004 to April 2010 were included. Baseline patient characteristics and post-OLT outcomes (mortality, graft failure, length of stay, and major post-OLT cardiovascular events) were obtained from a prospectively collected institutional registry. Among patients ≥ 60 yr of age, multivariate regression modeling was performed to identify independent predictors of poor outcome. Of the 738 patients included, 223 (30.2%) were ≥ 60 yr. Hepatic encephalopathy, low platelet counts, serum bilirubin > 3.5 mg/dL, and serum albumin < 2.65 g/dL independently predicted poor short-term outcomes. The presence of pre-OLT coronary artery disease and arrhythmia were independent predictors of poor long-term outcomes. Cardiac causes represented the second most common cause of mortality among the elderly cohort. Despite that, this carefully selected cohort of older OLT recipients had outcomes that were comparable with the younger recipients. Thus, our results show the need for better pre-OLT evaluation and optimization, and for closer post-OLT surveillance, of cardiovascular disease among the elderly.

42 citations


Journal ArticleDOI
TL;DR: It is concluded that after liver transplantation sarcopenia does not progress but is arrested and frequently improves in the absence of confounding conditions.
Abstract: Muscle wasting, sarcopenia, is common in advanced cirrhosis and predicts adverse outcomes while awaiting and following liver transplantation. Frequent post-transplant worsening of sarcopenia has attracted recent interest. It is unknown whether this serious problem is an expected metabolic consequence of transplantation or results from confounding conditions such as recurrent allograft liver disease or avoidable post-transplant complications. To clarify this question, we studied pre- and post-transplant muscle mass in a retrospective cohort of 40 patients transplanted for three diseases - alcoholic cirrhosis, non-alcoholic steatohepatitis cirrhosis, and primary sclerosing cholangitis cirrhosis - in whom allograft disease recurrence was monitored and excluded, and who lacked common post-transplant muscle wasting complications such as sepsis, renal failure, ischemia, and cholestasis. We measured skeletal muscle index (SMI) using computed tomography before and 12-48 months after transplant. SMI as a categorical variable significantly improved, from 18 patients above the normal cutoff pre-transplant to 28 post-transplant (p = 0.008). SMI increases were greatest in patients with the lowest pre-transplant SMI (p < 0.01). As a continuous variable, mean SMI remained stable, with a non-significant trend toward improvement. We conclude that after liver transplantation sarcopenia does not progress but is arrested and frequently improves in the absence of confounding conditions.

41 citations


Journal ArticleDOI
TL;DR: Renal function in female kidney transplant recipients improved slightly during pregnancy and returned to pre-pregnant levels after delivery; an increase in calcineurin inhibitor dose of approximately 20–25% should be considered during the gestational period to maintain optimal blood drug levels.
Abstract: We investigated the effects of pregnancy and delivery on renal function in transplant recipients and the relationship between doses of immunosuppressants and blood drug levels during pregnancy in 75 women with 88 deliveries. Significant serum creatinine elevation (> 0.5 mg/dL) was found in eight deliveries. In the remaining 80 cases, serum creatinine was reduced by an average of 0.14 mg/dL and returned to pre-pregnant levels after delivery. Tacrolimus was used in 28 deliveries and cyclosporine in the others. Tacrolimus blood trough level declined from 5.8 ± 2.8 ng/mL 12 months before delivery to 4.2 ± 1.8 ng/mL at the second trimester; therefore, the drug dose was increased from 4.1 ± 1.9 mg/d at the first trimester to 5.5 ± 2.5 mg/d at delivery. Similarly, cyclosporine levels were 125.1 ± 65.1 ng/mL 12 months before delivery and 75.4 ± 35.0 ng/mL at the second trimester, resulting in dose elevation from 183.0 ± 71.8 mg/d at the first trimester to 225.4 ± 85.1 mg/d at delivery. Renal function in female kidney transplant recipients improved slightly during pregnancy and returned to pre-pregnant levels after delivery. An increase in calcineurin inhibitor dose of approximately 20-25% should be considered during the gestational period to maintain optimal blood drug levels.

40 citations


Journal ArticleDOI
TL;DR: In the United States, African Americans and whites differ in access to the deceased donor renal transplant waitlist, and the extent to which racial disparities in waitlisting differ between United Network for Organ Sharing regions is understudied.
Abstract: Background In the United States, African Americans and whites differ in access to the deceased donor renal transplant waitlist. The extent to which racial disparities in waitlisting differ between United Network for Organ Sharing (UNOS) regions is understudied. Methods The US Renal Data System (USRDS) was linked with US census data to examine time from dialysis initiation to waitlisting for whites (n = 188 410) and African Americans (n = 144 335) using Cox proportional hazards across 11 UNOS regions, adjusting for potentially confounding individual, neighborhood, and state characteristics. Results Likelihood of waitlisting varies significantly by UNOS region, overall and by race. Additionally, African Americans face significantly lower likelihood of waitlisting compared to whites in all but two regions (1 and 6). Overall, 39% of African Americans with ESRD reside in Regions 3 and 4 – regions with a large racial disparity and where African Americans comprise a large proportion of the ESRD population. In these regions, the African American–white disparity is an important contributor to their overall regional disparity. Conclusions Race remains an important factor in time to transplant waitlist in the United States. Race contributes to overall regional disparities; however, the importance of race varies by UNOS region.

Journal ArticleDOI
TL;DR: Both anti‐dHLA DSA and SAFB+/iBeads‐ DSA appear irrelevant, which could explain the good outcome observed in some patients with preformed class I DSA.
Abstract: Class I single-antigen flow beads (SAFB) carry native and denatured human leukocyte antigen (HLA) molecules. Using a cohort of 179 class I HLA-sensitized kidney recipients, we described incidence and clinical relevance of preformed denatured HLA donor-specific antibodies (DSA) using two different assays: an acid-treated SAFB assay (anti-dHLA DSA) and the iBeads assays (SAFB+/iBeads- DSA). Eighty-five class I DSA were found in 67 patients (median mean fluorescence intensity [MFI] of 1729 [range 520-13 882]). Anti-dHLA and SAFB+/iBeads- DSA represented 11% and 18% of class I DSA and were mainly low MFI DSA (500-1000 MFI). Concordance between these two assays was good (90%). None of the patients with only class I anti-dHLA DSA or only SAFB+/iBeads- DSA developed acute clinical antibody-mediated rejection in the first-year post-transplantation, and their five-yr death-censored graft survival was similar to that of patients without DSA. Moreover, all these patients displayed a negative current T-cell flow cytometry cross-match. Therefore, both anti-dHLA DSA and SAFB+/iBeads- DSA appear irrelevant, which could explain the good outcome observed in some patients with preformed class I DSA.

Journal ArticleDOI
TL;DR: The experience with ureteral complications requiring revision surgery after renal transplantation is presented and the results to a matched control population are compared.
Abstract: Background In this study, we present our experience with ureteral complications requiring revision surgery after renal transplantation and compare our results to a matched control population. Methods We performed a retrospective analysis of our database between 1997 and 2012. We divided the cases into early (<60 d) and late repairs. Kaplan–Meier and Cox proportional hazards models were used to compare graft survival between the intervention cohort and controls generated from the Scientific Registry of Transplant Recipients data set. Results Of 2671 kidney transplantations, 51 patients were identified as having undergone 53 ureteral revision procedures; 43.4% of cases were performed within 60 d of the transplant and were all associated with urinary leaks, and 49% demonstrated ureteral stenosis. Reflux allograft pyelonephritis and ureterolithiasis were each the indication for intervention in 3.8%; 15.1% of the lesions were located at the anastomotic site, 37.7% in the distal segment, 7.5% in the middle segment, 5.7% in the proximal ureter, and 15.1% had a long segmental stenosis. In 18.9%, the location was not specified. Techniques used included ureterocystostomy (30.2%), ureteroureterostomy (34%), ureteropyelostomy (30.1%), pyeloileostomy (1.9%), and ureteroileostomy (3.8%). No difference in overall graft survival (HR 1.24, 95% CI 0.33–4.64, p = 0.7) was detected when compared to the matched control group. Conclusion Using a variety of techniques designed to re-establish effective urinary flow, we have been able to salvage a high percentage of these allografts. When performed by an experienced team, a ureteric complication does not significantly impact graft survival or function as compared to a matched control group.

Journal ArticleDOI
TL;DR: The objectives were to analyze the attitude of citizens, born in Latin America and living in Spain, toward living kidney donation and to determine the psychosocial variables affecting this attitude.
Abstract: INTRODUCTION The Latin American (LA) population in Spain is ever increasing in size and is perfectly integrated into the social structure. The objectives were to analyze the attitude of citizens, born in Latin America and living in Spain, toward living kidney donation (LKD) and to determine the psychosocial variables affecting this attitude. MATERIAL AND METHODS A sample of LA residents living in Spain was obtained randomly in 2010 and stratified according to the respondent's nationality (n = 1314). Attitude was evaluated using a validated questionnaire ("Proyecto Colaborativo Internacional Donante sobre Donacion de Vivo Renal" Rios). The survey was self-administered and completed anonymously. RESULTS The questionnaire completion rate was 86% (n = 1132). A total of 89% (n = 1003) were in favor of related living donation, and 30% were in favor of unrelated donation. The variables associated with attitude toward LKD were as follows: sex (p = 0.043); marital status (p = 0.013); previous experience of organ donation (p = 0.009); attitude toward deceased organ donation (p < 0.001); a respondent's belief that he or she could be a possible recipient of a future transplant (p < 0.001); knowledge of a partner's opinion (p = 0.021); family discussion about organ donation (p = 0.001); knowledge of the view of one's religion toward donation (p < 0.001); concern about "mutilation" after donation (p = 0.004); and evaluation of the risk from living donation (p = 0.036). CONCLUSIONS The attitude of LA citizens residing in Spain was favorable toward both related LKD and unrelated living donation.

Journal ArticleDOI
TL;DR: The univariate and multivariate analyses identified two main risk factors associated with development of GVHD in OLT recipients, a difference between recipient and donor age of >20 yr, and any human leukocyte antigen class I matches.
Abstract: Graft-versus-host disease (GVHD) is a rare, fatal complication following orthotopic liver transplantation (OLT). To date, several risk factors have been proposed, but reports on these factors have been inconclusive. This is a retrospective, case-control study of prospectively collected data from 2775 OLTs performed at our institution. Eight cases of GVHD after OLT were diagnosed on the basis of the patient's clinical characteristics, and the findings were confirmed with skin and colonic biopsies. Each case was matched to three controls based on the diagnosis of liver disease, recipient's age, and blood group. Univariate and multivariate analyses were performed to identify risk factors associated with the development of GVHD after OLT. The univariate and multivariate analyses identified two main risk factors associated with the development of GVHD in OLT recipients: a difference between recipient and donor age of >20 yr, and any human leukocyte antigen class I match. Taking these two risk factors into consideration while matching prospective donors and recipients may reduce the incidence of GVHD in OLT patients. However, further studies are recommended to validate these findings.

Journal ArticleDOI
TL;DR: It is important to recognize which factors may lead to PCT increases in the post‐transplantation period, which in turn will help understand the kinetics and utility of this biomarker in this important patient population.
Abstract: Procalcitonin (PCT) has been increasingly used as a biomarker of bacterial infection and as a tool to guide antimicrobial therapy, especially in lower respiratory tract and bloodstream infections. Despite its increased use, data in patients with solid organ transplants are limited. Even without the presence of infection, PCT increases as a result of surgical procedures during transplantation, implantation of devices, and use of induction immunosuppressive therapy. The risk of infection is also higher in solid organ transplant recipients when compared to the general population. Monitoring PCT in the early post-transplant period seems to be a promising method for early detection of infectious complications. It has been shown that elevated PCT levels after one wk of transplantation are correlated with infectious complications. PCT may be a useful adjunctive biomarker that may improve early identification and guide appropriate treatment of infection or rejection, with the potential to further improve clinical outcomes. The use of serial PCT measurements may be more reliable than single values. It is important to recognize which factors may lead to PCT increases in the post-transplantation period, which in turn will help understand the kinetics and utility of this biomarker in this important patient population.

Journal ArticleDOI
TL;DR: IVIG/RTX treatment for severe TG during chronic antibody‐mediated rejection does not seem to change the natural history of TG and is associated with a high incidence of adverse events.
Abstract: Outcome of patients with transplant glomerulopathy (TG) is poor. Using B-cell targeting molecules represents a rational strategy to treat TG during chronic antibody-mediated rejection. In this pilot study, 21 patients with this diagnosis received four doses of intravenous immunoglobulins and two doses of rituximab (IVIG/RTX group). They were retrospectively compared with an untreated control group of 10 patients. At 24 months post-biopsy, graft survival was similarly poor in the treated and untreated groups (47% vs. 40%, respectively, p = 0.69). This absence of response to IVIG/RTX treatment was observed regardless of the phenotype of TG. Baseline estimated glomerular filtration rate (eGFR) and decline in eGFR during the first six months after the treatment were risk factors associated with 24-month graft survival. The IVIG/RTX therapy had a modest effect on the kinetics of donor-specific alloantibodies at M24, compared to the untreated group, not associated with an improvement in graft survival. The mean number of adverse events per patient was higher in the IVIG/RTX group than in the control group (p = 0.03). Taken together, IVIG/RTX treatment for severe TG during chronic antibody-mediated rejection does not seem to change the natural history of TG and is associated with a high incidence of adverse events.

Journal ArticleDOI
TL;DR: HALD is a safe procedure for the donor with good recipient outcomes.
Abstract: Hand-assisted laparoscopic donor (HALD) nephrectomy has been performed at our institution since December 1999. Through May 2014, a total of 1500 HALD procedures have been performed. We have evaluated the outcomes of HALD. The HALD procedure consists of a hand-port incision as well as two 12-mm ports. Mean donor age was 40.8 ± 10.8 yr and mean BMI was 27.9 ± 5.0; 541 donors were male, 1271 were Caucasian, and the left kidney was removed in 1236 cases. All procedures were successfully completed. Four donors (0.27%) were converted to an open technique due to bleeding. Four donors required blood transfusions. Fifty-three donors (3.5%) were readmitted in the first month post-donation; almost half were due to gastrointestinal complaints. Six donors required reoperation: three for small bowel obstruction and three for wound dehiscence. Twenty-seven donors (1.8%) developed incisional hernias. Seven donors (0.47%) developed bowel obstruction. All donors recovered well, with a mean hospital stay after donation of 2.1 ± 0.3 d. All kidneys except one were successfully implanted. Twenty-one recipients (1.4%) experienced DGF. Ureter complications occurred in 17 (1.1%) recipients. Early graft loss occurred in 13 patients (0.9%). In conclusion, HALD is a safe procedure for the donor with good recipient outcomes.

Journal ArticleDOI
TL;DR: The data suggest that donor artery multiplicity is an independent risk factor for urologic complications following KTX; donation after cardiac death, non-mandatory national share kidneys, donor peak serum creatinine > 1.5 mg/dL or creatine phosphokinase > 1000 IU/L, and donor down time were not associated with urologic complications.
Abstract: Urologic complications are the most frequent technical adverse events following kidney transplantation (KTX). We evaluated traditional and novel potential risk factors for urologic complications following KTX. Consecutive KTX recipients between December 1, 2006 and December 31, 2010 with at least six-month follow-up (n = 635) were evaluated for overall urologic complications accounting for donor, recipient, and transplant characteristics using univariate and multivariate logistic regression. Urologic complications occurred in 29 cases (4.6%) at a median of 40 d (range 1-999) post-transplantation and included 17 ureteral strictures (2.6%), five (0.8%) ureteral obstructions due to donor-derived stones or intraluminal thrombus, and seven urine leaks (1.1%). All except two complications occurred within the first year of transplantation. Risk factors for urologic complications on univariate analysis were dual KTX (p = 0.04) and renal artery multiplicity (p = 0.02). On multivariate analysis, only renal artery multiplicity remained significant (aHR 2.4, 95% confidence interval 1.1–5.1, p = 0.02). Donation after cardiac death, non-mandatory national share kidneys, donor peak serum creatinine > 1.5 mg/dL or creatine phosphokinase > 1000 IU/L, and donor down time were not associated with urologic complications. Our data suggest that donor artery multiplicity is an independent risk factor for urologic complications following KTX.
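The univariate screen described above can be illustrated with an unadjusted odds ratio from a 2×2 table; the multivariate aHR of 2.4 then adjusts such an estimate for confounders. A sketch of the 2×2 arithmetic with hypothetical counts (not the study's data):

```python
# Unadjusted odds ratio from a 2x2 table, the kind of estimate a
# univariate logistic screen produces before multivariate adjustment.
def odds_ratio(exposed_events, exposed_nonevents,
               unexposed_events, unexposed_nonevents):
    """OR = (a/b) / (c/d) = (a*d) / (b*c) for a 2x2 exposure/outcome table."""
    return (exposed_events * unexposed_nonevents) / (
        exposed_nonevents * unexposed_events)

# Hypothetical counts: urologic complication yes/no among recipients of
# kidneys with multiple renal arteries vs. a single artery.
print(round(odds_ratio(10, 90, 19, 516), 2))  # 3.02
```

An OR of 1.0 would indicate no association; values above 1.0 indicate higher odds of the complication in the exposed group.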

Journal ArticleDOI
TL;DR: The authors' series provides little evidence that RD-SLKT would have yielded substantial short-term survival benefit to RD-LTA recipients; evaluation of the cause and timing of death relative to native renal recovery revealed that only four RD-LTA recipients might have derived survival benefit from RD-SLKT.
Abstract: It is unclear whether a concomitant kidney transplant grants survival benefit to liver transplant (LT) candidates with renal dysfunction (RD). We retrospectively studied LT candidates without RD (n = 714) and LT candidates with RD who underwent either liver transplant alone (RD-LTA; n = 103) or simultaneous liver–kidney transplant (RD-SLKT; n = 68). RD was defined as renal replacement therapy (RRT) requirement or modification of diet in renal disease (MDRD)–glomerular filtration rate (GFR) <25 mL/min/1.73 m². RD-LTAs had worse one-yr post-transplant survival compared to RD-SLKTs (79.6% vs. 91.2%, p = 0.05). However, RD-LTA recipients more often had hepatitis C (60.2% vs. 41.2%, p = 0.004) and more severe liver disease (MELD 37.9 ± 8.1 vs. 32.7 ± 9.1, p = 0.0001). Twenty RD-LTA recipients died in the first post-transplant year. Evaluation of the cause and timing of death relative to native renal recovery revealed that only four RD-LTA recipients might have derived survival benefit from RD-SLKT. Overall, 87% of RD-LTA patients recovered renal function within one month of transplant. One yr after RD-LTA or RD-SLKT, serum creatinine (1.5 ± 1.2 mg/dL vs. 1.4 ± 0.5 mg/dL, p = 0.63) and prevalence of stage 4 or 5 chronic kidney disease (CKD; 5.9% vs. 6.8%, p = 0.11) were comparable. Our series provides little evidence that RD-SLKT would have yielded substantial short-term survival benefit to RD-LTA recipients.

Journal ArticleDOI
TL;DR: In conclusion, old age should not preclude ESRD patients from kidney transplantation, however, specific differences that have to do with immunosuppression and other aspects of managing elderly transplant recipients should be considered.
Abstract: Kidney transplantation is the best renal replacement therapy option and is superior to dialysis in elderly end-stage renal disease (ESRD) patients. Furthermore, the outcome of transplantation in the elderly is comparable to younger patients in terms of allograft survival. The exact nature of this phenomenon is not completely clear. As the elderly population continues to grow, it becomes more important to identify specific issues associated with kidney transplantation. In particular, elderly transplant recipients might have a lower chance of acute rejection as their immune systems seem to be less reactive. This might predispose elderly recipients to greater risk of post-transplant infectious complications or malignancies. Furthermore, due to differences in pharmacokinetics, elderly recipients might require lower doses of immunosuppressive medication. As the main cause of graft failure in the elderly is death with a functioning graft and also considering the scarcity of donor organs, it might make sense to recommend transplanting elderly recipients with extended criteria donor kidneys. This approach would account for the shorter expected patient survival of elderly recipients relative to younger ones. In conclusion, old age should not preclude ESRD patients from kidney transplantation. However, specific differences that have to do with immunosuppression and other aspects of managing elderly transplant recipients should be considered.

Journal ArticleDOI
TL;DR: In patients with hepatocellular carcinoma, the outcome after liver transplantation (LT) is excellent if tumor characteristics are within the Milan criteria (MC) and expanded Asan criteria (AC) have not yet been validated in Western countries.
Abstract: Background: In patients with hepatocellular carcinoma (HCC), the outcome after liver transplantation (LT) is excellent if tumor characteristics are within the Milan criteria (MC). Expanded Asan criteria (AC) have not yet been validated in Western countries. Methods: A total of 76 patients with HCC underwent LT. Patients were divided and compared according to Milan, UCSF, and Asan criteria. Differences between pre- and post-operative assessment were evaluated. Overall survival (OS) and disease-free survival (DFS) were compared between groups. Predictors of recurrence were investigated. Results: Asan criteria provided 26% and 15% more criteria-fitting patients than MC and UCSF pre-operatively, and 49% and 35% more at pathological evaluation. Discrepancy between pre- and post-operative evaluation was 32% for MC, 33% for UCSF, and 18% for AC (p = 0.06). After a median follow-up of 70.5 months, patients exceeding MC but fulfilling AC had comparable 5-yr OS and DFS to patients fulfilling MC (p = 0.17; p = 0.29). Patients exceeding UCSF but fulfilling AC had comparable 5-yr OS and DFS to patients fulfilling UCSF (p = 0.26; p = 0.32). Number of nodules, macro-vascular invasion, capsular invasion, and exceeding AC predicted recurrence at multivariate analysis (p = 0.01, 0.03, 0.01, and 0.02, respectively). Conclusions: The extension to AC increases the number of patients eligible for LT without affecting OS and DFS.

Journal ArticleDOI
TL;DR: Evaluated the effect of contemporary induction immunosuppression agents in heart transplant recipients with the primary endpoint of survival, utilizing national registry data.
Abstract: Introduction The impact of induction immunosuppression on long-term survival in heart transplant recipients is unclear. Over the past three decades, practices have varied as induction agents have changed and experience has grown. We sought to evaluate the effect of contemporary induction immunosuppression agents in heart transplant recipients with the primary endpoint of survival, utilizing national registry data. Methods We queried the United Network for Organ Sharing (UNOS) data registry for all heart transplants from 1987 to 2012. We restricted our analysis to adult (≥18 yr) recipients transplanted from 2001 to 2011 (to allow for a minimum of 12 months of post-transplant follow-up) who received either no antibody-based induction (NONE) or one of the contemporary agents (INDUCED): basiliximab/daclizumab (IL-2Rab), alemtuzumab, or ATG/ALG/thymoglobulin. Kaplan-Meier estimates of the survival function as well as Cox proportional hazards models were utilized. Results Of the 17 857 heart transplants that met the inclusion criteria, there were 4635 (26%) reported deaths during the follow-up period. There were 8216 (46%) patients who were INDUCED. Of the induction agents, 55% were IL-2Rab, 4% alemtuzumab, and 40% ALG/ATG/thymoglobulin. Donor and recipient characteristics were evaluated. Overall, being INDUCED did not significantly affect survival in univariable (p = 0.522) and multivariable (p = 0.130) Cox models as well as in a propensity score-adjusted model (p = 0.733). Among those induced, ATG/ALG/thymoglobulin appeared to have superior survival compared with IL-2Rab (log-rank p = 0.007; univariable hazard ratio [HR] = 0.886; 95% CI: 0.811–0.968). However, in a multivariable Cox model that adjusted for recipient age, VAD, BMI, steroid use, CMV match, and ischemic time, the hazard ratio for ALG/ATG/thymoglobulin vs. IL-2Rab was no longer statistically significant (HR = 0.948; 95% CI: 0.850–1.058; p = 0.341).
Conclusion In a contemporary analysis of heart transplant recipients, an overall analysis of induction agents does not appear to impact survival, as compared to no induction immunosuppression. While ALG/ATG/thymoglobulin appeared to have a beneficial effect on survival compared to IL-2Rab in the univariable model, this difference was no longer statistically significant once we adjusted for clinically relevant covariates.
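Several abstracts in this issue rely on Kaplan-Meier estimates of the survival function. As an illustration of the underlying product-limit calculation (not the authors' actual analysis code), a minimal implementation on a toy cohort might look like this:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimator.

    times  : follow-up time for each subject
    events : 1 if the event (e.g., death) was observed, 0 if censored
    Returns a list of (time, S(t)) pairs at each observed event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at time t
        n_tied = sum(1 for tt, _ in data if tt == t)   # subjects leaving the risk set
        if deaths:
            surv *= 1.0 - deaths / n_at_risk           # product-limit step
            curve.append((t, surv))
        n_at_risk -= n_tied
        i += n_tied
    return curve

# Toy cohort: deaths at t = 1, 3, 4; censored observations at t = 2 and 5.
# The survival estimate drops only at event times; censoring shrinks the
# risk set without producing a step.
for t, s in kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0]):
    print(t, round(s, 3))
```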

Journal ArticleDOI
TL;DR: There is a paucity of information on fractures after kidney transplantation outside the United States, and fractures are associated with high morbidity and economic costs.
Abstract: Background Fractures are associated with high morbidity and economic costs. There is a paucity of information on fractures after kidney transplantation outside the United States. Methods Data were obtained from the Hospital Episode Statistics database on kidney transplants performed in England between 2001 and 2013 and post-transplant fracture-related hospitalization. Mortality data were obtained from the Office for National Statistics. Results In total, 21 769 first kidney transplant procedures were analyzed with 112 512 patient-years follow-up. Overall, 836 (3.8%) kidney allograft recipients developed a fracture requiring hospitalization. Event rate was 9.99 for any fracture and 1.54 for a hip fracture per 1000 patient-years. Accounting for the competing risk of mortality, increasing age, female gender, white ethnicity, and a history of pre-transplant diabetes mellitus or previous fracture were associated with increased fracture risk post-kidney transplantation. Death occurred in 2407 (11.1%) kidney allograft recipients, with 173 deaths occurring post-fracture. In an extended Cox model, hip fracture as a time-varying factor was independently associated with an increased risk of death (hazard ratio, 3.288; 95% confidence intervals, 2.513–4.301; p < 0.001). Conclusions Fracture rates in English kidney transplant recipients are lower than previously reported in US cohorts. Sustaining a hip fracture is associated with an increased mortality risk. Our results can be used to power future fracture prevention trials.
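The fracture rates above follow the standard incidence-rate calculation (events per 1000 patient-years). The event count below is back-calculated from the reported rate and follow-up, not stated in the abstract, so treat it as an illustrative assumption; note that an event count can exceed the 836 recipients hospitalized, since one patient can fracture more than once.

```python
def rate_per_1000_py(n_events, patient_years):
    """Crude incidence rate per 1000 patient-years of follow-up."""
    return 1000.0 * n_events / patient_years

# Back-calculated: with 112 512 patient-years, the reported 9.99/1000 py
# implies roughly 1124 fracture events (assumption for illustration only).
print(round(rate_per_1000_py(1124, 112512), 2))  # → 9.99
```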

Journal ArticleDOI
TL;DR: HLT may provide improved outcomes in critically ill IPAH patients admitted to the ICU at time of transplantation, and the overall survival after HLT or DLT is comparable.
Abstract: Patients with idiopathic pulmonary arterial hypertension (IPAH) have improved survival after heart-lung transplantation (HLT) and double-lung transplantation (DLT). However, the optimal procedure for patients with IPAH undergoing transplantation remains unclear. We hypothesized that critically ill IPAH patients, defined by admission to the intensive care unit (ICU), would demonstrate improved survival with HLT vs. DLT. All adult IPAH patients (>18 yr) in the Scientific Registry of Transplant Recipients (SRTR) database who underwent either HLT or DLT between 1987 and 2012 were included. Baseline characteristics, survival, and adjusted survival were compared between the HLT and DLT groups. Similar analyses were performed for subgroups defined by the recipients' hospitalization status. A total of 928 IPAH patients (667 DLT, 261 HLT) were included in this analysis. The HLT recipients were younger, more likely to be admitted to the ICU, and more likely to have received their transplant in earlier eras. Overall, adjusted survival after HLT or DLT was similar. For recipients who were hospitalized in the ICU, DLT was associated with worse outcomes (HR 1.827; 95% CI 1.018-3.279). In IPAH patients, overall survival after HLT or DLT is comparable. HLT may provide improved outcomes in critically ill IPAH patients admitted to the ICU at the time of transplantation.

Journal ArticleDOI
TL;DR: For all assessed domains, patients reported a significant improvement in HRQOL after PKT, and maintenance of the two grafts functioning predicted higher improvement ofHRQOL scores.
Abstract: Pancreas-kidney transplantation (PKT) may significantly improve health-related quality of life (HRQOL) in patients with type 1 diabetes. We assessed the changes felt by PKT patients using the Gastrointestinal Quality of Life Index (GIQLI) and EuroQol-5D questionnaires. Patients were asked to compare how their HRQOL had changed from pre-transplantation to the last visit. The 60 men and 66 women enrolled had a mean follow-up of five yr; 84.1% had both grafts and 15.9% had one graft functioning. Scores in all domains of EuroQol-5D improved after PKT, as did the visual analogue scale health state (from 38% to 84%, p < 0.001; effect size 3.34). In GIQLI, physical function was felt to be better after PKT than before (14.83 ± 3.86 vs. 7.86 ± 4.43, p < 0.001; effect size 1.68); the same was observed for psychological status, social function, and GI complaints. Concerning the burden of medical treatment, the score significantly improved (from 1.31 to 3.63, p < 0.001; effect size 2.02). The rate of unemployed patients decreased after PKT (from 50.8% to 36.5%, p < 0.001). Multivariate analysis showed that having only one functioning graft was associated with worse HRQOL scores (B = 5.157, p = 0.015). In conclusion, for all assessed domains, patients reported a significant improvement in HRQOL after PKT. Maintenance of both grafts functioning predicted greater improvement of HRQOL scores.
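The effect sizes quoted above appear consistent with Cohen's d computed against the pooled standard deviation of the two assessments; the abstract does not state the exact definition used, so the sketch below is one plausible reconstruction that happens to reproduce the reported 1.68 for physical function.

```python
import math

def cohens_d(mean_pre, sd_pre, mean_post, sd_post):
    """Cohen's d effect size using the pooled SD of the two assessments."""
    pooled_sd = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2.0)
    return (mean_post - mean_pre) / pooled_sd

# GIQLI physical function: 7.86 ± 4.43 before PKT vs. 14.83 ± 3.86 after
print(round(cohens_d(7.86, 4.43, 14.83, 3.86), 2))  # → 1.68, matching the report
```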

Journal ArticleDOI
TL;DR: In 405 KTx patients treated with immunosuppressive regimens defined by their clinical and immunological risk profile, the surgeon was demonstrated to have a significant impact on severe complications, especially those of the ureter.
Abstract: The population of kidney transplant (KTx) recipients often has complex medical and immunological conditions. Surgical complications (SCs) contribute to the increasing morbidity and costs in these patients. We analyzed the risk factors for SC in 405 KTx patients treated using defined immunosuppressive regimens according to their clinical and immunological risk profile: (1) standard immunosuppression (SIS) with IL-2 receptor mAb, CNI, and (a) mycophenolic acid (MPA) or (b) mTOR inhibitor; and (2) more intense immunosuppression (IIS) with (a) ATG or (b) the additional use of plasma exchange and B- and T-cell-depleting agents. In a mixed effects logistic regression model, we identified the following risk factors for SC: male gender, diabetes, and post-operative dialysis. No difference was found between the patients who received SIS with MPA and those who received mTOR inhibitors. The risk of suffering complications with IIS increases with age. In addition to IIS, diabetes was a risk for wound healing disorders. Therapeutic anticoagulation and a third or subsequent retransplantation increased the rate of bleeding. We did not identify immunosuppression or patient demographics as risk factors for lymphoceles or ureter complications; however, we demonstrated that the surgeon had a significant impact on severe complications, especially those of the ureter.

Journal ArticleDOI
TL;DR: Left ventricular assist devices as a bridge to transplant (BTT) have been known to cause allosensitization, as measured by panel‐reactive antibody (PRA) levels, but the goal of this study was to measure the impact of this allosensing on outcomes.
Abstract: Background Left ventricular assist devices (LVADs) as a bridge to transplant (BTT) have been known to cause allosensitization, as measured by panel-reactive antibody (PRA) levels. The goal of this study was to measure the impact of this allosensitization on outcomes. Methods Panel-reactive antibodies were analyzed in BTT patients, with sensitization defined as peak PRAs ≥ 10%. Baseline characteristics and outcomes in the two patient groups were evaluated using descriptive statistics, Kaplan–Meier, and regression analysis. Results Thirty-eight patients were included in the study (17 sensitized vs. 21 non-sensitized). There were more women in the sensitized group (47% vs. 10%, p = 0.023). There was no difference in mean times to high-grade acute cellular rejection (ACR; 18.3 months in sensitized vs. 36.9 months in non-sensitized). Five patients in the sensitized groups developed antibody-mediated rejection (AMR) vs. 0 in the non-sensitized, and all five patients died (Kaplan–Meier log-rank p = 0.024). There was also a significant difference in the incidence of infection at the one- to six-month stage (52.9% vs. 19.0%, p = 0.03). Conclusion Sensitization appears to have a negative effect on mortality. This mortality appears to be concentrated in patients with AMR, and we postulate that the development of AMR in a sensitized patient may be a predictor of mortality.

Journal ArticleDOI
TL;DR: Heart transplant patients have risk factors that place them at higher risk for acute venous thromboembolism and pulmonary embolism than the general population, and rate of VTE and incidence of PE‐related mortality among heart transplant patients are assessed.
Abstract: Introduction Heart transplant patients have risk factors that place them at higher risk for acute venous thromboembolism (VTE), which includes deep vein thrombosis (DVT) and pulmonary embolism (PE), than the general population. We assessed for rate of VTE and incidence of PE-related mortality among heart transplant patients. Materials and Methods A total of 1258 heart transplant patients were evaluated for the development of VTE. The diagnosis of DVT was made by Duplex ultrasonography, and PE was diagnosed by computerized tomography pulmonary angiography or ventilation–perfusion radionuclide scan. PE-related mortality was assessed at one yr, three yr, and five yr post-transplant. Results A total of 117 (9.3%) patients were diagnosed with DVT, including 65 of 117 (55.5%) with lower extremity DVT (LEDVT) and 52 of 117 (44.4%) with upper extremity DVT (UEDVT). A total of 24 (1.9%) patients experienced PE with seven (29.2%) resulting deaths. The rate of LEDVT and UEDVT was similar (55.5% vs. 44.4%); however, the incidence of PE was greater for those with LEDVT (23.1% vs. 7.7%; p = 0.04). Patients with PE had lower survival over the five-yr follow-up period compared to those with DVT only (67% vs. 81%; p = 0.51). Conclusion Heart transplant patients have a high incidence of VTE despite current best practice, indicating a need for a more aggressive approach to thromboprophylaxis.

Journal ArticleDOI
TL;DR: Causal path analyses reveal that protein intake restriction should not be advised to RTR, and low protein intake is associated with increased risk of mortality and graft failure in RTR.
Abstract: The effect of a low protein intake on survival in renal transplant recipients (RTR) is unknown. A low protein intake may increase the risks of malnutrition, low muscle mass, and death. We aimed to study associations of protein intake with mortality and graft failure and to identify potential intermediate factors. Protein intake was estimated from 24-h urinary urea excretion (24-h UUE). Graft failure was defined as return to dialysis or retransplantation. We used Cox regression analyses to analyze associations with outcome and potential intermediate factors in the causal path. In 604 RTR, mean ± SD 24-h UUE was 380 ± 114 mmol/24-h. During a median follow-up of 7.0 yr (interquartile range: 6.2-7.5 yr), 133 RTR died and 53 developed graft failure. In univariate analyses, 24-h UUE was associated with lower risk of mortality (HR (95% CI) = 0.80 (0.69-0.94)) and graft failure (HR (95% CI) = 0.72 (0.56-0.92)). These associations were independent of potential confounders. In causal path analyses, the association of 24-h UUE with mortality disappeared after adjustment for muscle mass. Low protein intake is associated with increased risk of mortality and graft failure in RTR. Causal path analyses reveal that the association with mortality is explained by low muscle mass. These findings suggest that protein intake restriction should not be advised to RTR.
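The abstract estimates protein intake from 24-h urinary urea excretion but does not state the conversion used. A widely used approach is the Maroni formula, sketched below; the 70-kg body weight is an assumption for illustration, and whether the authors used exactly this formula is not confirmed by the abstract.

```python
def maroni_protein_intake(uue_mmol_per_day, weight_kg):
    """Estimated dietary protein intake (g/day) from 24-h urinary urea
    excretion via the Maroni formula (sketch, not the authors' method).

    Each mmol of urea carries 28 mg (0.028 g) of nitrogen; non-urea
    nitrogen losses are approximated as 0.031 g N/kg body weight/day,
    and protein is ~16% nitrogen (factor 6.25).
    """
    uun_g = 0.028 * uue_mmol_per_day               # urinary urea nitrogen, g/day
    return 6.25 * (uun_g + 0.031 * weight_kg)      # total nitrogen × 6.25

# At the cohort's mean 24-h UUE of 380 mmol and an assumed 70-kg patient:
print(round(maroni_protein_intake(380, 70), 1))  # → ~80 g/day under these assumptions
```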