
Showing papers in "Transplantation in 2012"


Journal ArticleDOI
TL;DR: Identifying the molecular pathways that trigger tissue injury, signal transduction and rejection facilitates the identification of targets for the development of immunosuppressive drugs.
Abstract: Rejection is the major barrier to successful transplantation. The immune response to an allograft is an ongoing dialogue between the innate and adaptive immune systems that, if left unchecked, will lead to the rejection of transplanted cells, tissues, or organs. Activation of elements of the innate immune system, triggered as a consequence of tissue injury sustained during cell isolation or organ retrieval and ischemia reperfusion, will initiate and amplify the adaptive response. T cells require a minimum of two signals for activation: antigen recognition and costimulation. The activation requirements of naive T cells are more stringent than those of memory T cells. Memory T cells are present in the majority of transplant recipients as a result of heterologous immunity. The majority of B cells require help from T cells to initiate antibody production. Antibodies reactive to donor human leukocyte antigen molecules, minor histocompatibility antigens, endothelial cells, RBCs, or autoantigens can trigger or contribute to rejection early and late after transplantation. Antibody-mediated rejection triggered by alloantibody binding and complement activation is increasingly recognized as a significant contributor to graft loss. Even though one component of the immune system may dominate and lead to rejection being described in shorthand as T cell or antibody mediated, rejection is usually multifactorial, resulting from the integration of multiple mechanisms. Identifying the molecular pathways that trigger tissue injury, signal transduction, and rejection facilitates the identification of targets for the development of immunosuppressive drugs.

320 citations


Journal ArticleDOI
TL;DR: The results support the use and dissemination of ABOi transplantation when a compatible live donor is not available, but caution that the highest period of risk is immediately posttransplant.
Abstract: Background—ABO-incompatible (ABOi) kidney transplantation is an important modality to facilitate living donor transplantation for incompatible pairs. To date, reports of the outcomes from this practice in the United States have been limited to single-center studies. Methods—Using the Scientific Registry of Transplant Recipients, we identified 738 patients who underwent live-donor ABOi kidney transplantation between January 1, 1995 and March 31, 2010. These were compared with matched controls who underwent ABO-compatible (ABOc) live-donor kidney transplantation. Subgroup analyses among ABOi recipients were performed according to donor blood type, recipient blood type, and transplant center ABOi volume. Results—When compared with ABOc matched controls, long-term patient survival of ABOi recipients was not significantly different between the cohorts (p=0.2). However, graft loss was significantly higher, particularly in the first 14 days posttransplant (SHR 2.34, 95% CI 1.43–3.84, p=0.001), with little to no difference beyond day 14 (SHR 1.28, 95% CI 0.99–1.54, p=0.058). In subgroup analyses among ABOi recipients, no differences in survival were seen by donor blood type, recipient blood type, or transplant center ABOi volume. Conclusions—These results support the use and dissemination of ABOi transplantation when a compatible live donor is not available, but caution that the highest period of risk is immediately posttransplant.

224 citations


Journal ArticleDOI
TL;DR: The importance of establishing the DQ match before transplantation to define immunologic risk is shown, as DQ DSAbs are associated with inferior allograft outcomes.
Abstract: BackgroundThe importance of human leukocyte antigen (HLA) matching in renal transplantation is well recognized, with HLA-DR compatibility having the greatest influence. De novo DQ donor-specific antibodies (DSAbs) are the predominant HLA class II DSAb after transplantation. The aim of this study was

223 citations


Journal ArticleDOI
TL;DR: HEV-associated glomerulonephritis appears to be an extrahepatic manifestation of HEV infection in solid-organ transplant patients; kidney function improved and proteinuria decreased after HEV clearance.
Abstract: Background Hepatitis E virus (HEV) infection is an emerging disease in industrialized countries. Few data exist regarding extrahepatic manifestations of genotype 3 HEV. Methods We assessed kidney function and histology in solid-organ transplant patients during HEV infection. In all, 51 cases of genotype 3 HEV infection were diagnosed (34 kidney, 14 liver, and 3 kidney-pancreas transplant patients). Of these, 43.2% cleared the virus spontaneously within 6 months of infection, whereas 56.8% evolved to chronic hepatitis. Twelve of these patients completed a 3-month antiviral therapy and were followed up for 6 months posttreatment. Kidney function (estimated glomerular filtration rate [eGFR] obtained by the Modification of Diet in Renal Disease equation) and proteinuria were assessed before infection, during HEV infection, and during follow-up. Kidney biopsies were obtained from patients with high proteinuria and decreased eGFR levels. Results During HEV infection, there was a significant decrease in eGFR in both kidney- and liver-transplant patients. Glomerular diseases were observed in kidney biopsies obtained during the acute and chronic phases, including membranoproliferative glomerulonephritis and relapses of IgA nephropathy. The majority of patients had cryoglobulinemia that became negative after HEV clearance. Kidney function improved and proteinuria decreased after HEV clearance. Conclusion HEV-associated glomerulonephritis appears to be an HEV-related extrahepatic manifestation. Further studies are required to confirm these observations.

186 citations


Journal ArticleDOI
TL;DR: PVT affects survival when it is complete, at least in the short term after transplant; screening for this condition is therefore essential, alongside adequate treatment strategies to attempt repermeation of the PV and prevent thrombosis extension.
Abstract: Background Nonneoplastic portal vein thrombosis (PVT) is frequent in patients with cirrhosis who undergo liver transplantation (LT); however, data on its impact on outcome and strategies of management are sparse. Methods A systematic review of the literature was performed by analyzing studies that report on PVT in LT recipients and were published between January 1986 and January 2012. Results Of 25,753 liver transplants, 2004 were performed in patients with PVT (7.78%), and approximately half presented with complete thrombosis. Thrombectomy/thromboendovenectomy was employed in 75% of patients; other techniques included venous graft interposition and portocaval hemitransposition. Overall, the presence of PVT significantly increased 30-day (10.5%) and 1-year (18.8%) post-LT mortality compared with patients without PVT (7.7% and 15.4%, respectively). However, only complete PVT accounted for this increased mortality. Rethrombosis occurred in up to 13% of patients with complete PVT in whom no preventative strategies were used, and was associated with increased morbidity and mortality. Conclusions PVT is common in patients with cirrhosis undergoing LT, and it affects survival when it is complete, at least in the short term after transplant. Therefore, screening for this condition is essential, alongside adequate treatment strategies to attempt repermeation of the PV and prevent thrombosis extension.

183 citations


Journal ArticleDOI
TL;DR: With a greater supply of hepatocytes, wider use of HT and evaluation in different liver conditions should be possible, and methods to increase cell engraftment based on portal embolization or irradiation of the liver are being assessed for clinical application.
Abstract: Hepatocyte transplantation (HT) has been performed in patients with liver-based metabolic disease and acute liver failure as a potential alternative to liver transplantation. The results are encouraging in genetic liver conditions where HT can replace the missing enzyme or protein. However, there are limitations to the technique that need to be overcome. Unused donor livers for hepatocyte isolation are in short supply and are often steatotic, although the addition of N-acetylcysteine improves the quality of the cells obtained. Hepatocytes are cryopreserved for later use, which is detrimental to metabolic function on thawing. Improved cryopreservation protocols exist, but these need further refinement. Hepatocytes are usually infused into the hepatic portal vein, with many cells rapidly cleared by the innate immune system; this clearance needs to be prevented. It is difficult to detect engraftment of donor cells in the liver, and methods to track cells labeled with iron oxide magnetic resonance imaging contrast agents are being developed. Methods to increase cell engraftment based on portal embolization or irradiation of the liver are being assessed for clinical application. Encapsulation of hepatocytes allows cells to be transplanted intraperitoneally in acute liver failure, with the advantage of avoiding immunosuppression. Alternative sources of hepatocytes, which could be derived from stem cells, are needed. Mesenchymal stem cells are currently being investigated, particularly for their hepatotropic effects. Other sources of cells may be better if the potential for tumor formation can be avoided. With a greater supply of hepatocytes, wider use of HT and evaluation in different liver conditions should be possible.

182 citations


Journal ArticleDOI
TL;DR: Cancer incidence differs among solid organ transplantations, and rates may be higher than those in an older (55- to 59-year-old) general population.
Abstract: Background De novo posttransplant malignancy (PTM) is a serious complication of transplantation. Incidences may vary among solid organ transplantations (SOTs) and may lead to particular screening recommendations and posttransplantation care. Methods Adult recipients, from the US Organ Procurement Transplant Network/United Network for Organ Sharing database (data as of September 3, 2010), of a primary kidney transplantation (KT), liver transplantation (LT), heart transplantation (HT), or lung transplantation (LuT) performed in the United States between 1999 and 2008 were selected. Multiple-organ recipients and those whose grafts failed within 2 weeks after transplantation were excluded. The incidence of PTM (per 1000 person-years) was estimated using the Kaplan-Meier product-limit method and compared across SOTs and with the general population. Results The cohort included 193,905 recipients (123,380 KT; 43,106 LT; 16,511 HT; and 10,908 LuT). PTM incidence was 8.03, 11.0, 14.3, and 19.8 per 1000 person-years in KT, LT, HT, and LuT, respectively. In general, PTM recipients were 3 to 5 years older, mostly white, and male in all SOTs. In KT, the type of cancer with the highest incidence was posttransplant lymphoproliferative disorder (PTLD, 1.58%), followed by lung (1.12%), prostate (0.82%), and kidney (0.79%) cancers; in LT, PTLD (2.44%), lung and bronchial (2.18%), primary hepatic (0.91%), and prostate (0.88%) cancers; in HT, lung and bronchial (3.24%) and prostate (3.07%) cancers, and PTLD (2.24%); and in LuT, lung and bronchial cancers (5.94%), PTLD (5.72%), and colorectal cancer (1.38%). PTLD, Kaposi sarcoma, and lung and bronchial cancers were increased in all SOTs when compared with an older (55- to 59-year-old) population. Conclusions Cancer incidence differs among solid organ transplantations, and rates may be higher than those in the 55- to 59-year-old population.
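The Kaplan-Meier product-limit method named in the abstract estimates survival (here, remaining malignancy-free) from censored follow-up data. A minimal sketch of the estimator and of a crude per-1000-person-year rate, on toy data rather than the registry cohort; all function names are illustrative:

```python
def kaplan_meier(times, events):
    """Product-limit estimator: S(t) = prod over event times t_i <= t of
    (1 - d_i / n_i), where d_i is the number of events at t_i and n_i the
    number still at risk. `events` holds 1 for an observed event, 0 for
    censoring. Returns (time, survival) pairs at each event time."""
    data = sorted(zip(times, events))
    n = len(data)
    survival, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        tied = [e for (tt, e) in data if tt == t]
        d = sum(tied)                     # events at this time point
        if d > 0:
            survival *= 1 - d / (n - i)   # n - i subjects remain at risk
            curve.append((t, survival))
        i += len(tied)                    # skip past ties (events or censored)
    return curve

def incidence_per_1000_py(n_events, person_years):
    """Crude incidence rate per 1000 person-years of follow-up."""
    return 1000.0 * n_events / person_years

# Toy cohort of five subjects: events at t=1, 2, 4; censoring at t=3, 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
```

On this toy cohort the survival steps are 0.8, 0.6, and 0.3, and censored subjects reduce the risk set without producing a step, which is the property that makes the estimator suitable for registry data with incomplete follow-up.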

169 citations


Journal ArticleDOI
TL;DR: Data describing the efficacy of treatments for AMR in renal allografts are of low or very low quality, and larger randomized controlled trials and dose-response studies are required.
Abstract: Background: Antibody-mediated rejection (AMR) is a recognized cause of allograft loss in kidney transplant recipients. A range of therapies targeting removal of circulating donor-specific antibodies (DSAs), blocking their effect, or reducing their production have been reported. Methods: We conducted a systematic review to determine the efficacy of treatments for acute AMR in renal allografts. Electronic databases, reference lists, and conference proceedings were searched for controlled trials. Nonrandomized publications were reviewed for the purpose of discussion. Results: We identified 10,388 citations, including five randomized and seven nonrandomized controlled trials. The randomized studies were small (median, 13 patients/arm; range, 5-23): four examined plasmapheresis (one suggested benefit) and one examined immunoadsorption (also suggesting benefit). Marked heterogeneity was evident, including in the definition and severity of AMR and the treatment regimen. The end point of graft survival was common to all studies. Small, nonrandomized controlled studies suggested benefit from rituximab or bortezomib. The effects of dose and regimen on the clinical response to any of the current treatments were not apparent from the available data. Conclusions: Data describing the efficacy of treatments for AMR in renal allografts are of low or very low quality. Larger randomized controlled trials and dose-response studies are required.

165 citations


Journal ArticleDOI
TL;DR: Oral valganciclovir prophylaxis significantly reduced CMV infection and disease compared with preemptive therapy, particularly for D+/R+ patients; the study supports routine prophylaxis for all D+/R+ recipients.
Abstract: Background Cytomegalovirus (CMV) prevention can be achieved by prophylaxis or preemptive therapy. We performed a prospective randomized trial to determine whether renal transplant recipients with a positive CMV serostatus (R+) had a higher rate of CMV infection and disease after transplantation when treated preemptively for CMV infection, compared with primary valganciclovir prophylaxis. Methods Prophylaxis was 2 × 450 mg oral valganciclovir/day for 100 days; preemptive patients were monitored by CMV-polymerase chain reaction (PCR), and after a positive PCR test received 2 × 900 mg valganciclovir/day for at least 14 days, followed by secondary prophylaxis. Valganciclovir dosage was adjusted according to renal function. Patients will be followed up for 5 years; initial 12-month data are presented here. Two hundred ninety-six recipients were analyzed (168 donor/recipient seropositive [D+/R+], 128 donor seronegative/recipient seropositive [D-/R+]; 146 receiving prophylaxis and 150 preemptive therapy). Results Overall, CMV infection (asymptomatic CMV viral load ≥ 400 CMV DNA copies/mL proven by CMV-PCR) was significantly higher in recipients under preemptive therapy (38.7% vs. 11.0%, P<0.05). Tolerability was similar for both treatment groups. Conclusions Oral valganciclovir prophylaxis significantly reduces CMV infection and disease, particularly for D+/R+ patients. Hence, our study supports routine prophylaxis for all D+/R+ recipients.

141 citations


Journal ArticleDOI
TL;DR: CMI assessment shortly after the onset of CMV viremia may be useful to predict progression versus spontaneous viral clearance, thereby helping guide the need for antiviral therapy and refining current preemptive strategies.
Abstract: Background A CD8+ T-cell response to cytomegalovirus (CMV) has been associated with control of viral replication. Assessment shortly after the onset of asymptomatic viremia could help significantly refine preemptive strategies. Methods We conducted a prospective study of organ transplant recipients who developed asymptomatic low-level viremia not initially requiring antiviral therapy. Cell-mediated immunity (CMI) was measured shortly after viremia onset and longitudinally using the Quantiferon-CMV assay. The primary outcome was the ability to predict spontaneous clearance versus virologic and/or clinical progression. Results We enrolled 42 transplant patients, of whom 37 were evaluable. Viral load at onset was 1140 copies/mL (interquartile range, 655-1542). Spontaneous viral clearance occurred in 29 of 37 (78.4%) patients, and 8 of 37 (21.6%) had clinical and/or virologic progression requiring antivirals. At baseline, a positive CMI test (interferon-γ ≥ 0.2 IU/mL) was present in 26 of 37 (70.3%) patients. In patients with a positive CMI, the incidence of subsequent spontaneous viral clearance was 24 of 26 (92.3%), compared with 5 of 11 (45.5%) in patients with a negative CMI at onset (P=0.004). The absolute interferon-γ production was higher in patients with spontaneous clearance versus progression at all time points tested. Analysis of different cutoffs for defining a positive test suggested that the best threshold was 0.1 or 0.2 IU/mL of interferon-γ. Conclusions CMI assessment shortly after the onset of CMV viremia may be useful to predict progression versus spontaneous viral clearance, thereby helping guide the need for antiviral therapy and refining current preemptive strategies.

131 citations


Journal ArticleDOI
TL;DR: Reviews whether modern mTOR inhibitor–based immunosuppressive regimens exert an effect on wound healing after kidney transplantation, drawing on randomized clinical trials and the information available from the U.S. Food and Drug Administration database.
Abstract: Surgical complications, including events such as lymphocele and urological complications that affect wound healing, are reported with an incidence of 15% to 32% after kidney transplantation. The experience of the surgeon and comorbidities play an important role in determining the risk of such complications occurring. Since the introduction of the inosine 5'-monophosphate dehydrogenase inhibitors (mycophenolate mofetil) to the immunosuppressive armamentarium, replacing the antimetabolite prodrug azathioprine, reports have associated certain forms of wound healing complications (wound dehiscence, impaired healing, lymphocele, and incisional hernia) with the use of these agents. When mammalian target of rapamycin (mTOR) inhibitors (sirolimus, everolimus) became available, these findings were observed increasingly, particularly in direct comparisons with inosine 5'-monophosphate dehydrogenase inhibitors. The purpose of this article was to review the reported incidence of wound healing complications from randomized clinical trials that investigated the use of sirolimus- and everolimus-based treatment regimens in de novo kidney transplantation and the information available from the U.S. Food and Drug Administration database. The clinical trials included were primarily identified using biomedical literature database searches, with additional studies added at the authors' discretion. This review summarizes these studies to consider whether modern mTOR inhibitor-based immunosuppressive regimens exert an effect on wound healing after kidney transplantation.

Journal ArticleDOI
TL;DR: CMV infection data from randomized clinical trials that investigated the use of sirolimus- and everolimus-based treatment regimens in de novo renal transplantation are reviewed to discuss whether mTOR inhibitor-based immunosuppressive therapy can reduce the magnitude of CMV-related complications in the de noVO kidney transplantation setting.
Abstract: Cytomegalovirus (CMV) infection and disease are major complications in the renal transplant recipient. The occurrence of CMV is associated with acute rejection, allograft dysfunction, significant end-organ disease, and mortality. Several clinical studies have indicated that the use of certain immunosuppressive drugs can delay the reconstitution of CMV-specific cell-mediated immune responses, thereby leading to uncontrolled CMV replication. Accumulating evidence indicates, however, that the use of the mammalian target of rapamycin (mTOR) inhibitors, sirolimus, and everolimus, may decrease the incidence and severity of CMV infection in renal transplant recipients. The purpose of this article is to review CMV infection data from randomized clinical trials that investigated the use of sirolimus- and everolimus-based treatment regimens in de novo renal transplantation. The mTOR inhibitor clinical trials included were primarily identified using biomedical literature database searches, with additional studies added at the authors' discretion. This review will summarize these studies to discuss whether mTOR inhibitor-based immunosuppressive therapy can reduce the magnitude of CMV-related complications in the de novo renal transplantation setting.

Journal ArticleDOI
TL;DR: Transplantation of patients bridged on ECMO to LTX is feasible and results in acceptable outcomes; the institutional experience with ECMO as a bridge to LTX was reviewed.
Abstract: Background The introduction of the lung allocation score has brought lung transplantation (LTX) of patients on extracorporeal membrane oxygenation (ECMO) bridge into the focus of interest. We reviewed our institutional experience with ECMO as a bridge to LTX. Methods Between 1998 and 2011, 38 patients (median age 30.1 years, range 13-66 years) underwent ECMO support with the intention to bridge to primary LTX. The underlying diagnosis was cystic fibrosis (n=17), pulmonary hypertension (n=4), idiopathic pulmonary fibrosis (n=9), adult respiratory distress syndrome (n=4), hemosiderosis (n=1), bronchiolitis obliterans (n=1), sarcoidosis (n=1), and bronchiectasis (n=1). The type of extracorporeal bridge was venovenous (n=18), venoarterial (n=15), interventional lung assist (n=1), or a stepwise combination of them (n=4). The median bridging time was 5.5 days (range, 1-63 days). The type of transplantation was double LTX (n=7), size-reduced double LTX (n=8), lobar LTX (n=16), split LTX (n=2), and lobar LTX after ex vivo lung perfusion (n=1). Results Four patients died before transplantation. Thirty-four patients underwent LTX, of whom eight died in the hospital after a median stay of 24.5 days (range, 1-180 days). Twenty-six patients left the hospital and returned to normal life (median hospital stay, 47.5 days; range, 21-90 days). The 1-, 3-, and 5-year survival for all transplanted patients was 60%, 60%, and 48%, respectively. The 1-, 3-, and 5-year survival conditional on 3-month survival for patients bridged with ECMO to LTX (78%, 78%, and 63%) was not worse than for other LTX patients within the same period of time (90%, 80%, and 72%, respectively; P=0.09, 0.505, and 0.344). Conclusion Transplantation of patients bridged on ECMO to LTX is feasible and results in acceptable outcomes.

Journal ArticleDOI
TL;DR: Despite some long-term complications, which are similar to those reported after solid organ transplantation, the patient is satisfied with her new face and has normal social interaction.
Abstract: BACKGROUND: The first human facial allotransplantation, in a 38-year-old woman, was performed on November 27, 2005. The aesthetic aspect, functional recovery, and risk-to-benefit ratio are evaluated 5 years later. MATERIALS AND METHODS: The facial transplantation included the nose, chin, part of the cheeks, and lips. The immunosuppressive protocol included tacrolimus, mycophenolate mofetil, prednisone, and antithymocyte globulins. In addition, donor bone marrow cells were infused on days 4 and 11 after transplantation. RESULTS: The aesthetic aspect is satisfying. The patient has normal protective and discriminative sensibility. She showed a rapid motion recovery, which has remained stable for 3 years posttransplantation. She can smile, chew, swallow, and blow normally, whereas pouting and kissing are still difficult. Phonation recovery was impressive, and the patient can talk normally. Two episodes of acute rejection developed during the first year. Donor-specific anti-human leukocyte antigen antibodies were never detected. Five-year mucosal biopsy showed a slight perivascular inflammatory infiltrate, while skin biopsy was normal. The main side effect of the immunosuppressive treatment was a progressive decrease in renal function, which improved after switching from tacrolimus to sirolimus. Moreover, she developed arterial hypertension, an increase in lipid levels, and in situ cervix carcinoma treated by conization. Since 2008, she has shown mild cholangitis, possibly caused by sirolimus. In September 2010, bilateral pneumopathy occurred and was successfully treated with antibiotics. CONCLUSION: Despite some long-term complications, which are similar to those reported after solid organ transplantation, the patient is satisfied with her new face and has normal social interaction.

Journal ArticleDOI
TL;DR: Transplant candidates are ill equipped to seek live donors; by separating the advocate from the patient, understandable concerns about initiating conversations are reduced.
Abstract: BACKGROUND Lack of education and reluctance to initiate a conversation about live donor kidney transplantation is a common barrier to finding a donor. Although transplant candidates are often hesitant to discuss their illness, friends or family members are often eager to spread awareness and are empowered by advocating for the candidates. We hypothesized that separating the advocate from the patient is important in identifying live donors. METHODS We developed an intervention to train a live donor champion (LDC; a friend, family member, or community member willing to advocate for the candidate) for this advocacy role. We compared outcomes of 15 adult kidney transplant candidates who had no prospective donors and underwent the LDC intervention with 15 matched controls from our waiting list. RESULTS Comfort in initiating a conversation about transplantation increased over time for LDCs. Twenty-five potential donors contacted our center on behalf of LDC participants; four participants achieved live donor kidney transplantation and three additional participants have donors in evaluation, compared with zero among matched controls (P < 0.001). CONCLUSIONS Transplant candidates are ill equipped to seek live donors; by separating the advocate from the patient, understandable concerns about initiating conversations are reduced.

Journal ArticleDOI
TL;DR: A low intraoperative transfusion rate could be maintained throughout 500 consecutive OLTs, and Bleeding did not correlate with the severity of recipient’s disease.
Abstract: Background Orthotopic liver transplantation (OLT) has been associated with major blood loss and the need for blood product transfusions. During the last decade, improved surgical and anesthetic management has reduced intraoperative blood loss and blood product transfusions. A first report from our group published in 2005 described a mean intraoperative transfusion rate of 0.3 red blood cell (RBC) unit per patient for 61 consecutive OLTs. Of these patients, 80.3% did not receive any blood product. The interventions leading to those results were a combination of fluid restriction, phlebotomy, liberal use of vasopressor medications, and avoidance of preemptive transfusions of fresh frozen plasma. This is a follow-up observational study, covering 500 consecutive OLTs. Methods Five hundred consecutive OLTs were studied. The transfusion rate of the first 61 OLTs was compared with the last 439 OLTs. Furthermore, multivariate logistic regression was used to determine the main predictors of intraoperative blood transfusion. Results A mean (SD) of 0.5 (1.3) RBC unit was transfused per patient for the 500 OLTs, and 79.6% of them did not receive any blood product. There was no intergroup difference except for the final hemoglobin (Hb) value, which was higher for the last 439 OLTs compared with the previously reported smaller study (94 [20] vs. 87 [20] g/L). Two variables, starting Hb value and phlebotomy, correlated with OLT without transfusion. Conclusions In our center, a low intraoperative transfusion rate could be maintained throughout 500 consecutive OLTs. Bleeding did not correlate with the severity of recipient's disease. The starting Hb value showed the strongest correlation with OLT without RBC transfusion.

Journal ArticleDOI
TL;DR: Elevated levels of AT1R and ETAR Abs are associated with cellular and Ab-mediated rejection and early onset of microvasculopathy and should be routinely monitored after heart transplantation.
Abstract: Background Non-human leukocyte antigen antibodies (Abs) targeting vascular receptors are implicated in the pathogenesis of renal allograft vascular rejection and in progressive vasculopathy in patients with systemic sclerosis. Methods In 30 heart transplant recipients, we prospectively tested the impact of Abs directed against endothelin-1 type A (ET(A)R) and angiotensin II type 1 (AT(1)R) receptors (cell-enzyme-linked immunosorbent assay), measured at the time of transplantation and during the first posttransplantation year, on cellular and Ab-mediated rejection (immunohistochemistry, C3d, and immunoglobulins) and microvasculopathy in endomyocardial biopsies. Results Cellular rejection, Ab-mediated rejection, and microvasculopathy were found in 40% and 13%, 57% and 18%, and 37% and 40% of biopsies at 1 month and 1 year posttransplantation, respectively. Maximum levels of AT(1)R and ET(A)R Abs were higher in patients with cellular (16.5±2.6 vs. 9.4±1.3; P=0.021 and 16.5±2.5 vs. 9.9±1.9; P=0.041) and Ab-mediated rejection (19.0±2.6 vs. 10.0±1.3; P=0.004 and 19.4±2.7 vs. 9.0±1.7; P=0.002), as compared with patients who had no rejection. Patients with elevated AT(1)R Abs (53% [16/30]) or ET(A)R Abs (50% [15/30]; pretransplantation prognostic rejection cutoff >16.5 U/L) presented more often with microvasculopathy (both, 67% vs. 23%; P=0.048) than patients without. Conclusions Elevated levels of AT(1)R and ET(A)R Abs are associated with cellular and Ab-mediated rejection and early onset of microvasculopathy and should be routinely monitored after heart transplantation.

Journal ArticleDOI
TL;DR: In this article, the authors evaluated agreement within the CARGO II pathology panel and between the panel (acting by majority) and the collaborating centers (treated as a single entity), regarding the ISHLT grades of 937 EMBs (with all grades ≥ 2R merged because of small numbers).
Abstract: BACKGROUND There has been no large evaluation of the ISHLT 2004 acute cellular rejection grading scheme for heart graft endomyocardial biopsy specimens (EMBs). METHODS We evaluated agreement within the CARGO II pathology panel and between the panel (acting by majority) and the collaborating centers (treated as a single entity), regarding the ISHLT grades of 937 EMBs (with all grades ≥2R merged because of small numbers). RESULTS Overall all-grade agreement was almost 71% both within the panel and between the panel and the collaborating centers but, in both cases, was largely because of agreement on grade 0: for the average pair of pathologists, fewer than a third of the EMBs assigned grade ≥2R by at least one were assigned this grade by both. CONCLUSION The 2004 revision has done little to improve agreement on the higher ISHLT grades. An EMB grade ≥2R is not by itself sufficient as a basis for clinical decisions or as a research criterion. Steps should be taken toward greater uniformity in EMB grading, and efforts should be made to replace the ISHLT classification with diagnostic criteria--EMB based or otherwise--that correspond better with the pathophysiology of the transplanted heart.

Journal ArticleDOI
TL;DR: A Kidney Donor Risk Index based on five donor variables provides a clinically useful tool that may help with organ allocation and informed consent and be prognostic of outcome in a validation cohort.
Abstract: BACKGROUND We sought to determine the deceased donor factors associated with outcome after kidney transplantation and to develop a clinically applicable Kidney Donor Risk Index. METHODS Data from the UK Transplant Registry on 7620 adult recipients of adult deceased donor kidney transplants between 2000 and 2007 inclusive were analyzed. Donor factors potentially influencing transplant outcome were investigated using Cox regression, adjusting for significant recipient and transplant factors. A United Kingdom Kidney Donor Risk Index was derived from the model and validated. RESULTS Donor age was the most significant factor predicting poor transplant outcome (hazard ratio for 18-39 and 60+ years relative to 40-59 years was 0.78 and 1.49, respectively, P<0.001). A history of donor hypertension was also associated with increased risk (hazard ratio 1.30, P=0.001), and increased donor body weight, longer hospital stay before death, and use of adrenaline were also significantly associated with poorer outcomes up to 3 years posttransplant. Other donor factors including donation after circulatory death, history of cardiothoracic disease, diabetes history, and terminal creatinine were not significant. A donor risk index based on the five significant donor factors was derived and confirmed to be prognostic of outcome in a validation cohort (concordance statistic 0.62). An index developed in the United States by Rao et al., Transplantation 2009; 88: 231-236, included 15 factors and gave a concordance statistic of 0.63 in the UK context, suggesting that our much simpler model has equivalent predictive ability. CONCLUSIONS A Kidney Donor Risk Index based on five donor variables provides a clinically useful tool that may help with organ allocation and informed consent.
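A Cox-derived risk index of this kind is multiplicative: exponentiating the summed log-hazard coefficients (equivalently, multiplying the hazard ratios) yields a relative risk against a reference donor. A minimal sketch using only the two factors whose hazard ratios the abstract reports (age band: 0.78 for 18-39 years, 1.49 for 60+ years; hypertension: 1.30); the remaining three factors, the function names, and the interface are illustrative placeholders, not the published UK index:

```python
import math

def age_band_hr(age):
    """Hazard ratio for donor age relative to the 40-59-year reference band
    (values taken from the abstract)."""
    if age < 40:
        return 0.78
    if age >= 60:
        return 1.49
    return 1.0  # reference band

def donor_risk_index(age, hypertension, other_log_hazard=0.0):
    """Relative risk index: exp of the summed Cox coefficients, i.e. the
    product of the hazard ratios. 1.0 corresponds to the reference donor.
    `other_log_hazard` is a placeholder for the three remaining factors
    (body weight, hospital stay, adrenaline use), whose coefficients the
    abstract does not report."""
    log_hazard = math.log(age_band_hr(age))
    if hypertension:
        log_hazard += math.log(1.30)  # hypertension HR from the abstract
    log_hazard += other_log_hazard
    return math.exp(log_hazard)

# A 65-year-old hypertensive donor vs. a 45-year-old normotensive one:
high_risk = donor_risk_index(65, True)    # 1.49 * 1.30
reference = donor_risk_index(45, False)   # 1.0
```

Working in log-hazard space and exponentiating at the end is the standard way such indices are assembled, which is why adding factors to the model simply multiplies the index by each factor's hazard ratio.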

Journal ArticleDOI
TL;DR: There is a case for developing comprehensive, methodologically robust, and regularly updated guidelines on wait-listing for kidney transplantation, based on life expectancy, comorbidities, lifestyle, and psychosocial factors.
Abstract: Background Apparent variability in wait-listing criteria globally has raised concern about inequitable access to kidney transplantation. This study aimed to compare the quality, the scope, and the consistency of international guidelines on wait-listing for kidney transplantation. Methods Electronic dat

Journal ArticleDOI
TL;DR: It is found that colectomy can be performed with excellent survival in selected patients, and significant risk factors for CDAD and predictors of progression to CCDC are identified.
Abstract: Background Clostridium difficile-associated diarrhea (CDAD) is an increasingly important diagnosis in solid organ transplant recipients, with rising incidence and mortality. We describe the incidence, risk factors, and outcomes of colectomy for CDAD after solid organ transplantation. Methods Patients with CDAD were identified from a prospective transplant database. Complicated Clostridium difficile colitis (CCDC) was defined as CDAD associated with graft loss, total colectomy, or death. Results From 1999 to 2010, we performed solid organ transplants for 1331 recipients at our institution. The incidence of CDAD was 12.4% (165 patients); it rose from 4.5% in 1999 to a peak of 21.1% in 2005, then fell to 9.5% in 2010. The peak frequency of CDAD was between 6 and 10 days posttransplantation. Age more than 55 years (hazard ratio [HR]: 1.47, 95% confidence interval [CI]=1.16-1.81), induction with antithymocyte globulin (HR: 1.43, 95% CI=1.075-1.94), and transplant other than kidney alone (liver, heart, pancreas, or combined kidney organ) (HR: 1.41, 95% CI=1.05-1.92) were significant independent risk factors for CDAD. CCDC occurred in 15.8% of CDAD cases. Independent predictors of CCDC were white blood cell count more than 25,000/μL (HR: 1.08, 95% CI=1.025-1.15) and evidence of pancolitis on computed tomography scan (HR: 2.52, 95% CI=1.195-5.35). Six patients with CCDC underwent colectomy with 83% patient survival and 20% graft loss. Of the medically treated patients with CCDC (n=20), the patient survival was 35% with 100% graft loss. Conclusions We have identified significant risk factors for CDAD and predictors of progression to CCDC. Furthermore, we found that colectomy can be performed with excellent survival in selected patients.

Journal ArticleDOI
TL;DR: A new enzyme mixture composed of intact C1 and C2 collagenases and ChNP in place of thermolysin recovers higher islet yield from deceased and pancreatitis pancreases while retaining islet quality and function.
Abstract: BACKGROUND The optimal enzyme blend that maximizes human islet yield for transplantation remains to be determined. In this study, we evaluated eight different enzyme combinations (ECs) in an attempt to improve islet yield. The ECs consisted of purified, intact or truncated class 1 (C1) and class 2 (C2) collagenases from Clostridium histolyticum (Ch), and neutral protease (NP) from Bacillus thermoproteolyticus Rokko (thermolysin) or Ch (ChNP). METHODS We report the results of 249 human islet isolations, including 99 deceased donors (research n=57, clinical n=42) and 150 chronic pancreatitis pancreases. We prepared a new enzyme mixture (NEM) composed of intact C1 and C2 collagenases and ChNP in place of thermolysin. The NEM was first tested in split pancreas (n=5) experiments and then used for autologous islet transplantation (n=21) and allogeneic transplantation (n=10). Islet isolation outcomes from eight different ECs were statistically compared using multivariate analysis. RESULTS The NEM consistently achieved higher islet yields from pancreatitis (P<0.003) and deceased donor pancreases (P<0.001) than other standard ECs. Using the NEM, islet products met release criteria for transplantation from 8 of 10 consecutive pancreases, averaging 6510 ± 2150 islet equivalent number/gram (IEQ/g) pancreas and 694,681 ± 147,356 total IEQ/transplantation. In autologous isolation, the NEM yielded more than 200,000 IEQ from 19 of 21 pancreases (averaging 422,893 ± 181,329 total IEQ and 5979 ± 1469 IEQ/kg recipient body weight) regardless of the severity of fibrosis. CONCLUSIONS A NEM composed of ChNP with CIzyme high intact C1 collagenase recovers higher islet yield from deceased and pancreatitis pancreases while retaining islet quality and function.

Journal ArticleDOI
TL;DR: Differentially expressed miRNAs and their predicted targets identified by deep sequencing are candidates for further investigation to decipher the mechanism and management of kidney allograft fibrosis.
Abstract: Background MicroRNA (miRNA) alterations accompanying interstitial fibrosis and tubular atrophy (IFTA) in kidney allografts may point toward pathologic mechanisms. Small-RNA sequencing provides information on total miRNA abundance and specific miRNA expression, and allows analysis of differential expression based on read counts.
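As a rough illustration of "differential expression based on read counts," the sketch below normalizes raw miRNA read counts to counts per million (CPM) and computes a per-miRNA log2 fold change between two biopsy groups. It is a minimal sketch under stated assumptions, not the authors' small-RNA sequencing pipeline, which would additionally model count dispersion and correct for multiple testing.

```python
import math

def cpm(counts: dict) -> dict:
    """Normalize one sample's raw miRNA read counts to counts per million."""
    total = sum(counts.values())
    return {mirna: 1e6 * n / total for mirna, n in counts.items()}

def log2_fold_changes(group_a: dict, group_b: dict,
                      pseudocount: float = 0.5) -> dict:
    """Per-miRNA log2(group_a / group_b) on CPM-normalized counts.

    The pseudocount keeps the ratio finite for miRNAs detected in only
    one of the two groups.
    """
    a, b = cpm(group_a), cpm(group_b)
    return {m: math.log2((a.get(m, 0.0) + pseudocount) /
                         (b.get(m, 0.0) + pseudocount))
            for m in set(a) | set(b)}
```

Positive values mark miRNAs more abundant in the first group (e.g., IFTA biopsies), negative values the reverse; real pipelines would rank these by statistical evidence rather than raw fold change.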

Journal ArticleDOI
TL;DR: DTC is rare but frequently results in graft loss and death, and explantation/excision is likely to benefit recipients with localized cancer, but in transplants other than kidney/pancreas, the benefits should be balanced against the risks of retransplantation.
Abstract: Background Donor origin cancer (DOC) in transplant recipients may be transmitted with the graft (donor-transmitted cancer [DTC]) or develop subsequently from the graft (donor-derived cancer [DDC]). Methods Recipients with DOC between January 1, 2001, and December 31, 2010, were identified from the United Kingdom Transplant Registry and database search at transplantation centers. Results Of 30,765 transplants from 14,986 donors, 18 recipients developed DOC from 16 donors (0.06%): 3 were DDC (0.01%) and 15 were DTC (0.05%). Of the 15 DTCs, 6 were renal cell cancer; 5, lung cancer; 2, lymphoma; 1, neuroendocrine cancer; and 1, colon cancer. Recipients with DTC underwent explant/excision (11), chemotherapy (4), and radiotherapy (1). Of the 15 recipients with DTC, 3 (20%) died as a direct consequence of cancer. Early DTC (diagnosed ≤6 weeks of transplantation) showed a better outcome (no DTC-related deaths in 11 cases) as opposed to late DTC (DTC-related deaths in 3 of 4 cases). Five-year survival was 83% for kidney recipients with DTC compared with 93% for recipients without DTC (P=0.077). None of the donors resulting in cancer transmission was known to have cancer at donation. Conclusions DTC is rare but frequently results in graft loss and death. The risk of cancer transmission cannot be eliminated because, in every case, the presence of cancer was not known at donation. This information will allow informed consent for prospective recipients. Explantation/excision is likely to benefit recipients with localized cancer, but in transplants other than kidney/pancreas, the benefits should be balanced against the risks of retransplantation.

Journal ArticleDOI
TL;DR: It is shown that cotransplantation with myeloid-derived suppressor cells (MDSC) effectively protects islet allografts from rejection without requiring immunosuppression; this approach holds great potential for clinical application and may overcome the limitation of requiring chronic administration of immunosuppression in cell transplants.
Abstract: Background Side effects of lifetime immunosuppression for cell transplants often outweigh the benefits; therefore, induction of transplant tolerance is needed. We have shown that cotransplantation with myeloid-derived suppressor cells (MDSC) effectively protects islet allografts from rejection without requirement of immunosuppression. This study was designed to investigate the underlying mechanisms. Methods MDSC were generated by addition of hepatic stellate cells from mice of various strains into dendritic cell (DC) culture. The quality of MDSC was monitored by phenotype and function analyses. MDSC mixed with islet allografts were transplanted into diabetic recipients. T-cell response was analyzed after transplantation by using flow cytometric and histochemical analyses, and was compared with islet-alone and islet/DC transplant groups. B7-H1 knockout mice were used to determine the role of B7-H1 on MDSC in regulation of T-cell response. Results Cotransplantation with MDSC (not DC) effectively protected islet allografts without requirement of immunosuppression. This was associated with attenuation of CD8 T cells in the grafts and marked expansion of regulatory T (Treg) cells, which contributed to MDSC-induced T-cell hyporesponsiveness. Antigen-specific Treg cells were prone to accumulate in lymphoid organs close to the grafts. Both in vitro and in vivo data demonstrated that B7-H1 was absolutely required for MDSC to exert immune regulatory activity and induce Treg cells. Conclusion The described approach holds great potential for clinical application and may overcome the limitation of requiring chronic administration of immunosuppression in cell transplants. Understanding the underlying mechanisms will facilitate the development of this novel therapeutic strategy.

Journal ArticleDOI
TL;DR: In cirrhotic patients with RF, in particular those with hepatorenal syndrome, CLKT is preferable to LTA because it improves liver allograft and patient survival.
Abstract: BACKGROUND The role of combined liver-kidney transplantation (CLKT) for cirrhotic patients with renal failure (RF) is controversial. Since the start of the model for end-stage liver disease (MELD) era, there has been a rise in the number of CLKT. Using the Organ Procurement Transplant Network/United Network for Organ Sharing database, this study was undertaken to compare outcomes of cirrhotic patients with RF who received either liver transplant alone (LTA) or CLKT between 2002 and 2008. METHODS Analysis was limited to cirrhotic patients 18 years old or older, with serum creatinine level 2.5 mg/dL or higher at the time of orthotopic liver transplantation (OLT) or who received dialysis at least twice during the week before OLT. Patients who received CLKT were categorized based on the cause of their underlying RF. RESULTS Overall liver allograft and patient survival rates of LTA patients were significantly lower compared with CLKT patients (P<0.001). CLKT patients with hepatorenal syndrome showed significantly higher patient and liver allograft survival rates. Liver allograft survival was superior among CLKT patients irrespective of whether they received dialysis. Prevalence of posttransplantation RF was higher for LTA patients at 6 months and 3 years of follow-up (P<0.001). LTA was a significant risk factor both for graft loss and mortality. Recipient hepatitis C virus seropositivity, donor age, donor cause of death, and life support at the time of OLT were also risk factors for graft loss and death. CONCLUSIONS In cirrhotic patients with RF, in particular those with hepatorenal syndrome, CLKT is preferable to LTA because it improves liver allograft and patient survival.

Journal ArticleDOI
TL;DR: In renal transplants with a protocol biopsy performed within the first 6 months posttransplant, subclinical rejection is associated with late appearance of chronic humoral rejection.
Abstract: Background Subclinical rejection and interstitial fibrosis and tubular atrophy (IF/TA) in protocol biopsies are associated with outcome. We study the relationship between histologic lesions in early protocol biopsies and histologic diagnoses in late biopsies for cause. Materials and methods Renal transplants with a protocol biopsy performed within the first 6 months posttransplant between 1988 and 2006 were reviewed. Biopsies were evaluated according to Banff criteria, and C4d staining was available in biopsies for cause. Results Of the 517 renal transplants with a protocol biopsy, 109 had a subsequent biopsy for cause which showed the following histological diagnoses: chronic humoral rejection (CHR) (n=44), IF/TA (n=42), recurrence of the primary disease (n=11), de novo glomerulonephritis (n=7), T-cell-mediated rejection (n=4), and polyoma virus nephropathy (n=1). The proportion of retransplants (15.9% vs. 2.3%, P=0.058) and the prevalence of subclinical rejection were higher in patients with CHR than in patients with IF/TA (52.3% vs. 28.6%, P=0.0253). Demographic donor and recipient characteristics and clinical data at the time of protocol biopsy were not different between groups. Logistic regression analysis showed that subclinical rejection (relative risk, 2.52; 95% confidence interval, 1.1-6.3; P=0.047) but not retransplantation (relative risk, 6.7; 95% confidence interval, 0.8-58.8; P=0.085) was associated with CHR. Conclusion Subclinical rejection in early protocol biopsies is associated with late appearance of CHR.
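The reported association can be illustrated with a crude (unadjusted) odds ratio from a 2x2 table. The counts below are reconstructed from the percentages in the abstract (23 of 44 CHR patients ≈ 52.3%, and 12 of 42 IF/TA patients ≈ 28.6%, with subclinical rejection) and are therefore an assumption; the crude estimate (~2.74) is close to, but not the same as, the adjusted estimate of 2.52 from the logistic regression.

```python
def odds_ratio(events_a: int, n_a: int, events_b: int, n_b: int) -> float:
    """Crude odds ratio for an outcome across two groups (2x2 table)."""
    a = events_a        # group A with the outcome (CHR with subclinical rejection)
    b = n_a - events_a  # group A without the outcome
    c = events_b        # group B with the outcome (IF/TA with subclinical rejection)
    d = n_b - events_b  # group B without the outcome
    return (a * d) / (b * c)

# Counts reconstructed from the reported prevalences (hypothetical exact values).
crude_or = odds_ratio(23, 44, 12, 42)  # approximately 2.74
```

The adjusted estimate differs from the crude one because the logistic regression also conditioned on retransplantation, which was more frequent in the CHR group.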

Journal ArticleDOI
TL;DR: It is suggested that non-Gal antibody-induced chronic endothelial cell activation coupled to possible hemostatic incompatibilities may be the primary stimulus for delayed xenograft rejection of GTKO hearts.
Abstract: Background Transgenic expression of human complement regulatory proteins reduces the frequency of hyperacute rejection (HAR) in Gal-positive cardiac xenotransplantation. In this study, we examined the impact of human CD55 (hCD55) expression on a Gal knockout (GTKO) background using pig-to-primate heterotopic cardiac xenotransplantation. Methods Cardiac xenotransplantation was performed with GTKO (group 1; n=6) and GTKO.hCD55 (group 2; n=5) donor pigs using similar immunosuppression. Cardiac biopsies were obtained 30 min after organ reperfusion. Rejection was characterized by histology and immunohistology. Intragraft gene expression, serum non-Gal antibody, and antibody recovered from rejected hearts were analyzed. Results HAR of a GTKO heart was observed. Remaining grafts developed delayed xenograft rejection. Median survival was 21 and 28 days for groups 1 and 2, respectively. Vascular antibody deposition was uniformly detected 30 min after organ reperfusion and at explant. A higher frequency of vascular C5b deposition was seen in GTKO organs at explant. Serum non-Gal antibody, antibody recovered from the graft, and intragraft gene expression were similar between the groups. Conclusion HAR of GTKO hearts without hCD55 may occur. Expression of hCD55 seemed to restrict local complement activation but did not improve graft survival. Chronic vascular antibody deposition with evidence of protracted endothelial cell activation was seen. These observations suggest that non-Gal antibody-induced chronic endothelial cell activation coupled to possible hemostatic incompatibilities may be the primary stimulus for delayed xenograft rejection of GTKO hearts. To avoid possible HAR, future clinical studies should use donors expressing human complement regulatory proteins in the GTKO background.

Journal ArticleDOI
TL;DR: The third edition of the joint British Transplantation Society/Renal Association guidelines for living donor kidney transplantation was published in May 2011 and has used the GRADE system to rate the strength of evidence and recommendations.
Abstract: The third edition of the joint British Transplantation Society/Renal Association guidelines for living donor kidney transplantation was published in May 2011. The guideline has been extensively revised since the previous edition in 2005 and has used the GRADE system to rate the strength of evidence and recommendations. This article summarizes the statements of recommendation contained in the guideline, which provide a framework for the delivery of living kidney donation in the United Kingdom and may be of wide international interest. It is recommended that the full guideline document is consulted for details of the relevant references and evidence base. This may be accessed at http://www.bts.org.uk/transplantation/standards-and-guidelines/ and http://www.renal.org/clinical/OtherGuidelines.aspx

Journal ArticleDOI
TL;DR: Posttransplantation hyperglycemia is a strong independent risk factor for major cardiovascular events (MCVE) and death, mainly from CV causes, independent of the presence of CV disease identified before transplantation.
Abstract: Since the early days of transplantation, it has been recognized that diabetes can develop de novo after transplantation (new-onset diabetes after transplantation [NODAT]) (1). NODAT is associated with increased cardiovascular (CV) risk and reduced patient survival (2, 3). However, the nature of this association is unclear. Possibly, this increase in CV risk reflects preexisting CV disease because NODAT develops preferentially in older recipients, in patients with high CV burden, and in those with an unfavorable CV risk profile (1–6). If that is the case, NODAT may be mainly a marker of high CV risk, and treatment of hyperglycemia may not reduce risk. Consistent with this hypothesis, CV risk and mortality increases soon after the development of NODAT, and the magnitude of the risk is similar to that observed in patients with diabetes before transplantation (2). However, it is also possible that hyperglycemia itself or the series of biochemical and physiologic anomalies associated with elevated glucose levels contribute to CV risk. This alternative or complementary hypothesis seems to be supported by the fact that all levels of fasting glucose after transplantation are associated with increased risk of CV events and mortality (2, 3). Furthermore, previous studies suggested that reversal of posttransplantation hyperglycemia may be associated with a reduction in risk (2). Finally, it should be considered that not all risk factors for NODAT relate to CV risk. For example, previous studies have shown an association between hepatitis C and NODAT (7, 8). In this study, we analyzed a large cohort of first-time kidney transplant recipients followed up for long periods of time to examine the nature of the association between posttransplantation hyperglycemia, CV risk, and survival. We hypothesized that posttransplantation hyperglycemia is associated with increased CV risk independent of pretransplantation CV risk factors. 
Clarifying the variables that contribute to the increased CV risk and mortality of patients with this common complication could have significant implications for patient management after transplantation.