
Showing papers in "Transplantation in 2008"


Journal ArticleDOI
TL;DR: The results of this study provide reliable and feasible candidate-selection and prognostic criteria for LT in HCC patients, establishing a new set of criteria (the Hangzhou criteria) for patient selection and prognosis prediction.
Abstract: Introduction. Liver transplantation (LT) has been the treatment of choice for patients with hepatocellular carcinoma (HCC). This study was designed to summarize our experience in LT for HCC patients and establish a new set of criteria for patient selection and prognosis prediction. Materials and methods. Data from 195 patients with HCC were retrospectively analyzed, and various clinical and pathological factors for survival and tumor-free survival were examined by univariate and multivariate analyses. Results. Macrovascular invasion, preoperative serum alpha-fetoprotein (AFP) level, tumor size, multifocality, histopathologic grading, distribution, and cirrhosis background were significant factors for survival and tumor-free survival by univariate analysis. Multivariate analysis identified macrovascular invasion, tumor size, preoperative AFP level, and histopathologic grading as prognostic factors independently associated with patient survival or tumor-free survival (RR=1.688-2.779, P=0.000-0.034). Based on the prognostic stratification of different risk groups of patients without macrovascular invasion, the Hangzhou criteria were established, comprising one of the two following items: (a) total tumor diameter less than or equal to 8 cm; or (b) total tumor diameter more than 8 cm, with histopathologic grade I or II and preoperative AFP level less than or equal to 400 ng/mL, simultaneously. The difference between survival curves of patients fulfilling the Milan criteria (n=72) and patients fulfilling the Hangzhou criteria (n=99) did not achieve statistical significance (5-year survival rates: 78.3% vs. 72.3%, P>0.05). Of the patients exceeding the Milan criteria (n=123), those who fulfilled the Hangzhou criteria (n=26) also had a better prognosis than the others (n=97) (P=0.000). Conclusion. The results of this study provide reliable and feasible candidate-selection and prognostic criteria for LT in HCC patients.
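
The Hangzhou rule above is a two-branch decision procedure. A minimal sketch of it follows, assuming only the thresholds quoted in the abstract (8 cm total diameter, histopathologic grade I/II, AFP 400 ng/mL); the function and parameter names are hypothetical.

```python
# Illustrative encoding of the Hangzhou criteria decision rule described above.

def meets_hangzhou_criteria(total_tumor_diameter_cm: float,
                            afp_ng_per_ml: float,
                            histologic_grade: int,
                            macrovascular_invasion: bool) -> bool:
    """Return True if a candidate satisfies the Hangzhou criteria."""
    if macrovascular_invasion:
        # The criteria were derived in patients without macrovascular invasion.
        return False
    if total_tumor_diameter_cm <= 8.0:
        # Item (a): total tumor diameter <= 8 cm.
        return True
    # Item (b): diameter > 8 cm allowed only with grade I/II AND AFP <= 400 ng/mL.
    return histologic_grade in (1, 2) and afp_ng_per_ml <= 400.0


# Example: 10 cm total tumor burden, grade II, AFP 150 ng/mL -> eligible.
print(meets_hangzhou_criteria(10.0, 150.0, 2, False))  # True
```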

397 citations


Journal ArticleDOI
TL;DR: Bortezomib represents the first effective antihumoral therapy with activity in humans that targets plasma cells and provides effective treatment of AMR and ACR with minimal toxicity and sustained reduction in iDSA and non-iDSA levels.
Abstract: Background. Current antihumoral therapies in transplantation and autoimmune disease do not target the mature antibody-producing plasma cell. Bortezomib is a first-in-class proteasome inhibitor that is Food and Drug Administration approved for the treatment of the plasma cell-derived tumor multiple myeloma. We report the first clinical experience with plasma cell-targeted therapy (bortezomib) as an antirejection strategy. Methods. Eight episodes of mixed antibody-mediated rejection (AMR) and acute cellular rejection (ACR) in six transplant recipients were treated with bortezomib at labeled dosing. Monitoring included serial donor-specific antihuman leukocyte antigen antibody (DSA) levels and repeated allograft biopsies. Results. Six kidney transplant patients received bortezomib for AMR and concomitant ACR. In each case, bortezomib therapy provided (1) prompt rejection reversal, (2) marked and prolonged reductions in DSA levels, (3) improved renal allograft function, and (4) suppression of recurrent rejection for at least 5 months. Moreover, immunodominant DSA (iDSA) (i.e., the antidonor human leukocyte antigen antibody with the highest levels) levels were decreased by more than 50% within 14 days and remained substantially suppressed for up to 5 months. One or more additional DSA were present at lower concentrations (non-iDSA) in each patient and were also reduced to nondetectable levels. Bortezomib-related toxicities (gastrointestinal toxicity, thrombocytopenia, and paresthesias) were all transient. Conclusions. Bortezomib therapy (1) provides effective treatment of AMR and ACR with minimal toxicity and (2) provides sustained reduction in iDSA and non-iDSA levels. Bortezomib represents the first effective antihumoral therapy with activity in humans that targets plasma cells.

377 citations


Journal ArticleDOI
TL;DR: Normal males were found to have HLA antibodies to infrequent HLA specificities, most likely produced to cross-reactive epitopes found in microorganisms, ingested proteins and allergens—making them natural antibodies.
Abstract: Background Human leukocyte antigen (HLA) antibodies are normally not found in subjects who have not been immunized by pregnancies, transfusions, or transplants. But with new methodology, we now see that HLA antibodies are often found in nonalloimmunized males. Methods The sera of 424 healthy male donors were tested with single antigen Luminex beads. Results HLA antibodies were detected in 63% of the 424 male blood donors when a fluorescence value of more than 1000 was used as the cutoff. Antibodies to class I were found in 42%, to class II in 11%, and to both in 12%. Five males who were tested eight times over a 6-month period consistently had the same specificities at similar strength levels at each testing. The antibodies reacted with specificities that are rare in the general population: 18.9% had antibodies to A*3002; more than 10% had antibodies to A*3101, B76, B*8201, and Cw*1701. About half of the donors with antibodies had one or two specificities; the other half had three or more specificities. Among those with class II specificities, 20.5% had antibodies to DPA1*0201/DPB1*0101, and 10.8% to DQA1*0503/DQB1*0301. Because the above data were obtained by testing sera of 424 Mexican donors, as a check, 29 males in Los Angeles were tested and shown to have similar specificities at roughly similar frequencies. Conclusions Normal males were found to have HLA antibodies to infrequent HLA specificities. It is likely that these HLA antibodies are produced to cross-reactive epitopes found in microorganisms, ingested proteins, and allergens, making them natural antibodies.
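
A minimal sketch of the positivity call described above: a serum is scored positive for a specificity when its single-antigen bead fluorescence exceeds the 1000 cutoff. Only that cutoff comes from the abstract; the bead values, helper function, and class I/II prefix heuristic are invented for illustration.

```python
# Illustrative classification of a serum from single-antigen bead fluorescence.

CUTOFF = 1000  # fluorescence cutoff used in the study

def classify_serum(bead_mfi: dict) -> dict:
    """Classify a serum as positive for HLA class I, class II, or both."""
    pos = {spec for spec, mfi in bead_mfi.items() if mfi > CUTOFF}
    class_i = any(s.startswith(("A*", "B", "Cw")) for s in pos)   # crude heuristic
    class_ii = any(s.startswith(("DP", "DQ", "DR")) for s in pos)
    return {"positive_specificities": sorted(pos),
            "class_I": class_i, "class_II": class_ii}

# Invented bead readings: B76 falls below the cutoff, the other two do not.
print(classify_serum({"A*3002": 2400, "B76": 800, "DQA1*0503/DQB1*0301": 1500}))
```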

285 citations


Journal ArticleDOI
TL;DR: The applied protocol of MMF dose adjustments based on target MPA exposure was not successful, partly because physicians seemed reluctant to implement substantial dose changes.
Abstract: Background. Fixed-dose mycophenolate mofetil (MMF) reduces the incidence of acute rejection after solid organ transplantation. The Fixed-Dose Concentration Controlled trial assessed the feasibility and potential benefit of therapeutic drug monitoring in patients receiving MMF after de novo renal transplant. Methods. Patients were randomized to a concentration-controlled (n=452; target exposure 45 mg hr/L) or a fixed-dose (n=449) MMF-containing regimen. The primary endpoint was treatment failure (a composite of biopsy-proven acute rejection [BPAR], graft loss, death, or MMF discontinuation) by 12 months posttransplantation. Results. Mycophenolic acid (MPA) exposures for both groups were similar at most time points and were below 30 mg hr/L in 37.3% of patients at day 3. There was no difference in the incidence of treatment failure (25.6% vs. 25.7%, P=0.81) or BPAR (14.9% vs. 15.5%, P>0.05) between the concentration-controlled and the fixed-dose groups, respectively. We did find a significant relationship between the MPA area under the concentration-time curve on day 3 and the incidence of BPAR in the first month (P=0.009) and in the first year posttransplantation (P=0.006). For later time points (day 10, month 1), there was no significant relationship between area under the concentration-time curve and BPAR (P=0.2572 and P=0.5588, respectively). Conclusions. There was no difference in the incidence of treatment failure between the concentration-controlled and the fixed-dose groups. The applied protocol of MMF dose adjustments based on target MPA exposure was not successful, partly because physicians seemed reluctant to implement substantial dose changes. Current initial MMF doses underexpose more than 35% of patients early after transplantation, increasing the risk for BPAR.
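
The exposure metric monitored in this trial is the MPA area under the concentration-time curve (AUC). The sketch below shows how such an AUC is computed by the trapezoidal rule, using hypothetical sampling times and concentrations, and checks it against the 30 and 45 mg hr/L values quoted in the abstract.

```python
# Trapezoidal-rule AUC for a hypothetical 12-hour MPA concentration profile.
import numpy as np

times_h = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 9.0, 12.0])   # hours postdose
conc_mg_L = np.array([1.8, 12.0, 8.5, 5.0, 3.2, 2.6, 2.0, 1.7])  # mg/L (invented)

# AUC0-12 = sum of trapezoid areas between successive samples (mg*hr/L).
auc_0_12 = float(np.sum((conc_mg_L[1:] + conc_mg_L[:-1]) / 2 * np.diff(times_h)))

print(f"MPA AUC0-12 = {auc_0_12:.1f} mg hr/L")      # ~41.8 for these values
print("below 30 mg hr/L (underexposed)?", auc_0_12 < 30)
print("at/above the 45 mg hr/L target?", auc_0_12 >= 45)
```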

260 citations


Journal ArticleDOI
TL;DR: The cumulative results of the registry confirm the inarguably positive impact of islet transplantation on metabolic control in T1 diabetes, although clinical islet transplantation needs to be evaluated using the most clinically relevant endpoints, such as glucose stabilization and severe hypoglycemia prevention.
Abstract: Background This report summarizes the primary efficacy and the safety outcomes of islet transplantation reported to the NIDDK- and JDRF-funded Collaborative Islet Transplant Registry (CITR), currently the most comprehensive collection of human-to-human islet transplant data. Methods CITR has collected and monitored comprehensive data on allogeneic islet transplantation in North America, Europe, and Australia since 1999. Results As of April 2008, the CITR registry comprised 325 adult recipients of 649 islet infusions derived from 712 donors. At 3 years post-first infusion, 23% of islet-alone recipients were insulin independent (II for ≥2 weeks), 29% were insulin dependent with detectable C-peptide, 26% had lost function, and 22% had missing data. Seventy percent achieved II at least once, of whom 71% were still II 1 year later and 52% at 2 years. A higher number of infusions, a greater number of total islet equivalents infused, lower pretransplant HbA1c levels, processing centers related to the transplant center, and larger islet size were factors that favored the primary outcomes. Protocols with daclizumab or etanercept during induction had higher rates of II and lower rates of function loss, endorsing the current approaches. Infusion-related adverse event incidence was 0.71 events/person-year (EPY) in year 1, whereas immunosuppression-related adverse event incidence was 0.87 EPY, both declining to less than 0.21 EPY thereafter. Conclusions Clinical islet transplantation needs to be evaluated using the most clinically relevant endpoints, such as glucose stabilization and severe hypoglycemia prevention. The cumulative results of the registry confirm the inarguably positive impact of islet transplantation on metabolic control in T1 diabetes.

237 citations


Journal ArticleDOI
TL;DR: The evidence supports a causal connection between human leukocyte antigen antibodies and chronic rejection and it is hoped that this review will stimulate centers to begin the one remaining task of showing that antibody removal will indeed prevent chronic failure.
Abstract: Considerable research has established an association between human leukocyte antigen antibodies and chronic rejection. Two new major developments now provide evidence that this relationship is in fact causative. First, recent studies of serial serum samples from 346 kidney transplant patients at four transplant centers show that de novo antibodies can be detected before rejection. Moreover, serial testing revealed that when antibodies were not present, 528 patient-years of good function were demonstrable in 149 patients. Second, among 90 patients whose grafts chronically failed, 86% developed antibodies before failure. To assess the likelihood of a causal link, we applied the nine widely accepted Bradford Hill criteria and conclude that the evidence supports a causal connection between human leukocyte antigen antibodies and chronic rejection. The clinical implication is significant, and we hope this review will stimulate centers to begin the one remaining task: showing that antibody removal will indeed prevent chronic failure.

233 citations


Journal ArticleDOI
TL;DR: It is concluded that de novo colonization of the lung allograft by Pseudomonas is strongly associated with the subsequent development of BOS.
Abstract: Long-term survival after lung transplantation remains limited by the development of bronchiolitis obliterans syndrome (BOS). Allograft colonization with Pseudomonas aeruginosa is common, particularly in recipients with BOS, but a possible etiological relationship remains unexplored. In 155 consecutive lung transplants, the development of allograft colonization with Pseudomonas was strongly associated with the development of BOS within 2 years of transplant (23.4% vs. 7.7% in those colonized and not colonized, respectively; P=0.006). Freedom from BOS was significantly shorter in patients without any pretransplant bacterial reservoir who developed de novo allograft pseudomonal colonization than in those remaining free of colonization (Kaplan-Meier log-rank P=0.014). The isolation of Pseudomonas preceded the diagnosis of BOS in 14 of 18 (78%) patients developing both complications, by a median of 204 days (95% confidence interval 115-492). We conclude that de novo colonization of the lung allograft by Pseudomonas is strongly associated with the subsequent development of BOS.
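
The comparison above is a standard Kaplan-Meier/log-rank analysis. A sketch with the lifelines Python library follows; the simulated times-to-BOS are invented, and only the method (censoring at 2 years, log-rank comparison of colonized vs. non-colonized groups) mirrors the paper.

```python
# Kaplan-Meier freedom-from-BOS curves and a log-rank test on simulated data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
t_col = rng.exponential(600, 40)    # days to BOS, colonized (shorter, invented)
t_no = rng.exponential(1200, 60)    # days to BOS, not colonized (invented)
e_col = t_col < 730                 # events observed within 2 years
e_no = t_no < 730
t_col, t_no = np.minimum(t_col, 730), np.minimum(t_no, 730)  # censor at 2 years

kmf = KaplanMeierFitter()
kmf.fit(t_col, e_col, label="Pseudomonas-colonized")
print(kmf.survival_function_.tail(1))   # estimated freedom from BOS at 2 years

result = logrank_test(t_col, t_no, event_observed_A=e_col, event_observed_B=e_no)
print(f"log-rank p = {result.p_value:.3f}")
```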

226 citations


Journal ArticleDOI
TL;DR: The short-term survival of penetrating corneal transplants is excellent, but the eventual attrition rate appears inexorable and many factors that influence graft survival significantly are not amenable to change.
Abstract: Background Our aims were to examine graft survival and visual outcome after full-thickness corneal transplantation. Methods Records of 18,686 penetrating corneal grafts, 14,622 with archival follow-up from 1 to 22 years, were examined within a national database. Kaplan-Meier survival analysis indicated variables of interest for Cox proportional hazards regression analysis. A model clustered by patient to control intereye or intergraft dependence was constructed to identify variables best predicting penetrating corneal graft failure. Visual acuity in the grafted eye was measured by Snellen acuity. Results Probability of corneal graft survival was 0.87, 0.73, 0.60, and 0.46 at 1, 5, 10, and 15 years, respectively. Reasons for graft failure included irreversible rejection (34%), corneal endothelial cell failure including cases of glaucoma (24%), and infection (14%). Variables predicting graft failure in multivariate analysis included transplant center, location and volume of surgeon's case-load, graft era, indication for graft, number of previous ipsilateral grafts, lens status, corneal neovascularization at transplantation, a history of ocular inflammation or raised intraocular pressure, graft diameter, and postoperative events including graft neovascularization and rejection. Best-corrected Snellen acuity of 6/12 or better was achieved by 45%, and of less than 6/60 by 26%, of grafted eyes at last follow-up. Conclusions The short-term survival of penetrating corneal transplants is excellent, but the eventual attrition rate appears inexorable and many factors that influence graft survival significantly are not amenable to change. Most penetrating grafts are performed for visual improvement, and excellent acuity will be achieved by approximately half of all grafts.
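
The model described above is a Cox proportional hazards regression clustered by patient so that two eyes or sequential grafts in one patient are not treated as independent. The sketch below shows one way to fit such a model with robust clustered errors in lifelines; the data and covariate names are invented, not the registry's.

```python
# Cox proportional hazards with patient-level clustering on invented data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 40
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(20), 2),          # two grafts per patient
    "neovascularization": rng.integers(0, 2, n),
    "graft_diameter_mm": rng.normal(7.5, 0.8, n).round(1),
})
# Simulate shorter survival when neovascularization is present.
base = rng.exponential(12, n)
df["years_to_failure"] = (base / (1 + df["neovascularization"])).round(2)
df["failed"] = (df["years_to_failure"] < 10).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_failure", event_col="failed",
        cluster_col="patient_id")    # robust sandwich errors by patient
cph.print_summary()
```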

212 citations


Journal ArticleDOI
TL;DR: Azithromycin can improve airflow limitation in a significant proportion of patients with even long-standing BOS; the results indicate the predictive value of BAL neutrophilia for treatment response and of the pretreatment course of FEV1 as a variable for disease progression.
Abstract: Background. Bronchiolitis obliterans syndrome (BOS) is a major cause of morbidity and mortality after lung transplantation (LTx). Macrolides are a promising treatment option for BOS. The objective of this study was to determine long-term results of azithromycin treatment in patients with BOS. Variables to predict treatment response were evaluated. Methods. An observational study in a single center was performed. Eighty-one adult LTx recipients (single, double, combined, and re-do) with at least BOS stage 0p (mean forced expiratory volume in 1 second [FEV1] 55±19%) were included. For treatment, 250 mg of oral azithromycin was administered three times per week. Results. Twenty-four of 81 (30%) patients showed improvement in FEV1 after 6 months, 22 of 24 already after 3 months of treatment. By univariate analysis, responders at 6 months had higher pretreatment bronchoalveolar lavage (BAL) neutrophils (51±29% vs. 21±24%). A cutoff value of <20% in pretreatment BAL had a negative predictive value of 0.91 for treatment response. Thirty-three patients (40%) showed disease progression during follow-up (491±165 days). Cox regression analysis identified a rapid pretreatment decline in FEV1 and comedication with a mammalian target of rapamycin inhibitor as positive predictors, and proton pump inhibitor comedication and a treatment response at 3 months as negative predictors, for disease progression (FEV1 <90% of baseline). Conclusions. Azithromycin can improve airflow limitation in a significant proportion of patients with even long-standing BOS. The majority of responders were identified after 3 months of treatment. Results indicate the predictive value of BAL neutrophilia for treatment response and of the pretreatment course of FEV1 as a variable for disease progression. Beneficial effects on gastroesophageal reflux disease may be a mechanism of action.
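
The 0.91 negative predictive value quoted above is simple 2x2-table arithmetic. The worked example below uses invented counts chosen to roughly reproduce the reported figures (24 of 81 responders, NPV about 0.9 for the <20% BAL neutrophil cutoff).

```python
# Predictive-value arithmetic for the BAL neutrophil cutoff (invented counts).
# 2x2 table: rows = BAL neutrophils <20% vs >=20%, columns = responder or not.
low_bal_responders, low_bal_nonresponders = 4, 41     # cutoff "negative"
high_bal_responders, high_bal_nonresponders = 20, 16  # cutoff "positive"

# NPV: fraction of low-BAL ("test-negative") patients who indeed do not respond.
npv = low_bal_nonresponders / (low_bal_nonresponders + low_bal_responders)
# Sensitivity: fraction of all responders flagged by high BAL neutrophils.
sensitivity = high_bal_responders / (high_bal_responders + low_bal_responders)

print(f"NPV = {npv:.2f}")            # ~0.91, matching the reported value
print(f"sensitivity = {sensitivity:.2f}")
```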

200 citations


Journal ArticleDOI
TL;DR: In T-cell crossmatch-negative patients, higher anti-HLA class II antibody levels are related to an increased risk of developing TG, and the presence of C4d in peritubular capillaries in TG biopsies is related to reduced graft survival.
Abstract: Background. Transplant glomerulopathy (TG) is a histopathologic entity of kidney allografts related to anti-human leukocyte antigen (HLA) antibodies. The goal of this study was to determine the relationships among antibody characteristics (level and specificity), risk for TG, and graft survival. Methods. The presence and characteristics of anti-HLA antibody were assessed by single antigen beads assays in stored pretransplant sera from 598 kidney recipients with negative T-cell crossmatch. Transplant glomerulopathy was diagnosed by surveillance and clinical biopsies. Results. Thirty-nine percent of patients presented with anti-HLA antibodies pretransplant. Transplant glomerulopathy was diagnosed in 73 patients (12%) during 54±19 months of follow-up. The risk of TG increased with higher anti-HLA class II antibody levels (HR=1.890, 95% CI 1.42-2.52; P<0.0001), donor specificity of the antibodies (HR=3.524 [1.67-7.44]; P=0.001), and a history of antibody-mediated rejection (HR=4.985 [2.77-8.97]; P<0.0001, multivariate Cox). Graft survival during the follow-up period was 95% without TG and 62% with TG (P<0.0001). The presence of C4d in peritubular capillaries was an independent risk factor for graft failure after TG diagnosis. Thus, 25% of TG/C4d- and 80% of TG/C4d+ grafts failed (P<0.0001). Of interest, higher anti-HLA class II levels were related to the presence of C4d (HR=3.216 [1.376-7.517]; P=0.007). Conclusions. In T-cell crossmatch-negative patients, higher anti-HLA class II antibody levels are related to an increased risk of developing TG. Higher antibody levels are also related to the presence of C4d in peritubular capillaries in TG biopsies. Furthermore, the presence of C4d in TG is related to reduced graft survival.

188 citations


Journal ArticleDOI
TL;DR: ABOi kidney transplantation using antigen-specific immunoadsorption and rituximab is equivalent to ABOc living donor kidney transplantation, and ABOi transplantation after this protocol does not have a negative impact on long-term graft function.
Abstract: Background. In 2001 a protocol for ABO-incompatible (ABOi) kidney transplantation based on antigen-specific immunoadsorption and rituximab was introduced at our center, with short-term results comparable to those of ABO-compatible (ABOc) living donor kidney transplantation. Of greater importance, however, is long-term graft function, thus far not evaluated. The aim of this study was therefore to assess long-term results of this protocol. Methods. Twenty ABOi kidney recipients with more than 12 months of follow-up were included in the study: all adult crossmatch-negative ABOi kidney recipients (n=15) were compared with an adult ABOc living donor recipient control group (n=30), and all pediatric ABOi kidney recipients (<16 years of age) (n=5) were compared with a group of pediatric ABOc kidney recipients (n=18). Results. Mean follow-up was three years. There was no significant difference in patient survival, graft survival, or the incidence of acute rejection in any of the groups. In the adult kidney recipients, mean glomerular filtration rate was equivalent at all time points (79-83 mL/min), as was s-creatinine. In the pediatric groups, s-creatinine was similar but glomerular filtration rate was lower among the ABOi kidney recipients. There was a significant reduction (P<0.0001) without rebound in A/B antibody titers after transplantation (median IgG 1:2 and median IgM 1:1 at 1 year posttransplant) compared with pretransplant levels (median IgG 1:32 and IgM 1:16). Conclusion. We conclude that ABOi kidney transplantation using antigen-specific immunoadsorption and rituximab is equivalent to ABOc living donor kidney transplantation. ABOi transplantation after this protocol does not have a negative impact on long-term graft function.

Journal ArticleDOI
TL;DR: IAT outcomes provide a minimum theoretical standard to work toward in allotransplantation; several factors unique to allotransplantation are likely responsible for the differences, including donor brain death, longer cold ischemia time, diabetogenic immunosuppression, and auto- and alloimmunity.
Abstract: Introduction Islet allografts are currently associated with a high rate of early insulin independence, but after 1 year insulin-independence rates rapidly decline for unclear reasons. In contrast, as shown here, islet autotransplants (IATs) show durable function and extended insulin-independence rates, despite a lower beta-cell mass. Methods IAT function was determined in 173 patients after total pancreatectomy at our center. Islet function was considered full in insulin-independent patients, partial when euglycemic on once-daily long-acting insulin (all tested were C-peptide positive), and failed if on a standard diabetic regimen. Outcomes for autoislet recipients, assessed by Kaplan-Meier survival analysis, were compared with those of alloislet recipients in the Collaborative Islet Transplant Registry. Results IAT function (full/partial combined) and insulin independence correlated with islet yield. Overall, only 65% functioned within the first year and only 32% were insulin independent, but of IATs that functioned initially (n=112), 85% remained so 2 years later, in contrast to 66% of allografts (n=262). Of IAT recipients who became insulin independent (n=55), 74% remained so 2 years later versus 45% of initially insulin-independent allograft recipients (n=154). Of IATs that functioned or induced insulin independence, the corresponding rates at 5 years were 69% and 47%, respectively. Conclusion Islet function is more resilient in autografts than allografts. Indeed, the 5-year insulin-independence persistence rate for IATs is similar to the 2-year rate for allografts. Several factors unique to allotransplant cases are likely responsible for the differences, including donor brain death, longer cold ischemia time, diabetogenic immunosuppression, and auto- and alloimmunity. IAT outcomes provide a minimum theoretical standard to work toward in allotransplantation.

Journal ArticleDOI
TL;DR: Shortening CIT will help to decrease not only DGF rates but also AR incidence and hence graft loss; patients with prolonged CIT should receive adequate immunosuppression, possibly with antilymphocyte preparations, to prevent AR.
Abstract: Background The aim of our study was to examine, in a recent cohort of kidney transplant recipients who have received modern immunosuppressive therapy, the respective roles of cold ischemia time (CIT) and delayed graft function (DGF) on acute rejection (AR) rates and long-term graft survival. Methods We retrospectively reviewed the charts of 611 renal transplantations performed between 1996 and 2005. Most patients received a calcineurin inhibitor as maintenance therapy, either cyclosporine (43%) or tacrolimus (52%), and 76% of the patients received antilymphocyte induction therapy. Study endpoints were DGF, first-year AR, and long-term graft survival. Uni- and multivariate analyses were performed to determine factors that may have influenced the study outcomes. Results DGF was observed in 16.2% of patients. Both older donor age and longer CIT were significant risk factors for DGF. DGF rates were similar whether patients received a calcineurin inhibitor before transplantation or not. AR occurred in 16.5% of grafts during the first year. Independent predictors of AR by multivariate analysis were duration of dialysis, CIT, current panel-reactive lymphocytotoxic antibody more than 5%, and the number of human leukocyte antigen-A, B, and DR mismatches. Each hour of cold ischemia increases the risk of rejection by 4%. With respect to death-censored graft survival, three pretransplant parameters emerged as independent predictors of graft loss: younger recipient age, peak panel-reactive lymphocytotoxic antibody more than 5%, and longer CIT. The detrimental effect of CIT on graft survival was entirely due to its propensity to trigger AR. When AR was added to the multivariate Cox model, CIT was no longer significant, whereas first-year AR became the most important predictor of graft loss (hazard ratio, 4.6). Conclusion Shortening CIT will help to decrease not only DGF rates but also AR incidence and hence graft loss. Patients with prolonged CIT should receive adequate immunosuppression, possibly with antilymphocyte preparations, to prevent AR occurrence.
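
The "4% per hour" figure above compounds multiplicatively under a proportional-hazards model, so the cumulative effect of long cold ischemia grows quickly. A short worked calculation makes this concrete.

```python
# Compounding a per-hour hazard ratio of 1.04 over longer cold ischemia times.
hr_per_hour = 1.04

for extra_hours in (6, 12, 24, 36):
    # Under proportional hazards, per-hour effects multiply: HR(h) = 1.04**h.
    print(f"{extra_hours:>2} h extra cold ischemia -> "
          f"relative risk of AR x{hr_per_hour ** extra_hours:.2f}")

# For example, 24 extra hours multiplies the rejection hazard by
# 1.04**24 ~= 2.56 under this model.
```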

Journal ArticleDOI
TL;DR: First evidence is provided that the increase of CD4+CD25high T cells and FoxP3 transcripts is associated with operational tolerance in liver transplanted patients during IS withdrawal.
Abstract: Background. Human liver allografts sometimes survive in a recipient after withdrawal of immunosuppression (IS), commonly referred to as "operational tolerance." Preliminary clinical data have suggested an increase in the frequency of CD4+CD25high regulatory T cells (Treg) and in FoxP3 expression in operationally tolerant liver transplant recipients (Gr-T). In the context of human liver transplantation, the dynamics of Treg have not been studied. We designed a prospective study to ascertain the profile of the Treg population and FoxP3 expression during IS withdrawal. Methods. To identify such parameters, we analyzed peripheral blood mononuclear cell populations and FoxP3 mRNA expression in 12 liver allograft recipients under cyclosporine A-based IS who showed stable function of the allograft for more than 2 years. Results. An increase was observed in the frequency of CD4+CD25high cells when the IS was withdrawn in Gr-T patients (n=5). These patients exhibited a 3.5-fold increase in relative FoxP3 mRNA expression before the complete IS withdrawal, and this continued when IS therapy was stopped. In patients who suffered rejection (n=7), there was no increase in CD4+CD25high cells or FoxP3 expression. Conclusions. The present study provides the first evidence that the increase of CD4+CD25high T cells and FoxP3 transcripts is associated with operational tolerance in liver transplant patients during IS withdrawal.

Journal ArticleDOI
TL;DR: A key recommendation of this group is to establish national intestinal failure databases that can support multicenter studies and lead to the adoption of universally accepted standards of patient care with the goal of improving outcomes in all long-term intestinal failure patients including those requiring intestinal transplantation.
Abstract: UNLABELLED: Intestinal transplant wait-list mortality is higher than for other organ transplants. The objective of this workshop was to identify the main problems contributing to high mortality among adult and pediatric candidates for intestinal transplantation and to provide recommendations on how to correct them. OUTCOME: To facilitate this, 63 relevant articles identified from the medical literature from 1987 to 2007 were reviewed. Consensus was achieved on several important definitions relevant to this review. For children and adults on parenteral nutrition (PN), the main mortality risk factors were identified, as were the main risks of mortality for those on the waiting list for intestinal transplants. RECOMMENDATIONS: (1) Primary caregivers managing intestinal failure patients should establish a link with an intestinal failure program early, and collaboration with intestinal failure programs should be initiated for patients whose PN requirements are anticipated to be more than 50% 3 months after initiating PN; (2) intestinal failure programs should include both intestinal rehabilitation and intestinal transplantation or have active collaborative relationships with centers performing intestinal transplantation; (3) national registries for intestinal failure patients should be established, and organizations that provide home PN solutions should be expected to participate. CONCLUSION: There are many unresolved issues in adults and children with PN-dependent intestinal failure. To address these, a key recommendation of this group is to establish national intestinal failure databases that can support multicenter studies and lead to the adoption of universally accepted standards of patient care, with the goal of improving outcomes in all long-term intestinal failure patients, including those requiring intestinal transplantation.

Journal ArticleDOI
TL;DR: These novel findings suggest a possible functional relationship between the enhanced incidence of precursor plasmacytoid DC, their comparatively high relative expression of the coinhibitory molecule PD-L1, and the elevated frequency of Treg in operational liver transplant tolerance.
Abstract: Background Both dendritic cells (DC) and T-regulatory cells (Treg) have been implicated in regulation of alloimmune responses and transplant tolerance. Methods We analyzed B7 coregulatory molecule expression on circulating DC subset precursors, together with CD4+CD25(hi) Foxp3+ Treg, by rare-event flow cytometric analysis in operationally tolerant pediatric liver transplant recipients (TOL), those undergoing prospective immunosuppressive drug weaning (PW) or maintenance immunosuppression (MI), and normal healthy individuals (controls). Results Use of DC subset-specific monoclonal antibodies confirmed elevated precursor plasmacytoid DC/myeloid DC ratios in TOL and PW compared with MI. In addition, Treg frequencies were higher in TOL than in PW and MI, but not controls. While there was no difference in levels of costimulatory and coinhibitory molecules on precursor myeloid DC between the groups, the programmed death ligand-1 (PD-L1=B7-H1):CD86 (B7-2) ratio on precursor plasmacytoid DC was significantly higher in TOL than MI and correlated with the Treg frequency. There was no relation between prednisone or tacrolimus dose or tacrolimus trough level and either the PD-L1/CD86 ratio on plasmacytoid DC or the Treg frequency. Moreover, clinically relevant concentrations of dexamethasone or tacrolimus did not affect these values in short-term culture. Conclusion These novel findings suggest a possible functional relationship between the enhanced incidence of precursor plasmacytoid DC, their comparatively high relative expression of the coinhibitory molecule PD-L1, and the elevated frequency of Treg in operational liver transplant tolerance.

Journal ArticleDOI
TL;DR: Islet transplantation yields improved HbA1c and less progression of retinopathy compared with intensive medical therapy during 3 years follow-up.
Abstract: BACKGROUND We hypothesized that transplantation of islets into type 1 diabetics could improve outcomes of glucose metabolism, renal function, retinopathy, and neuropathy compared with intensive medical therapy. METHODS We conducted a prospective, crossover, cohort study of intensive medical therapy (group 1) versus islet cell transplantation (group 2) in 42 patients. All were enrolled in group 1; 31 then crossed over to group 2 when islet donation became available. Transplantation was performed by portal venous embolization of more than 12,000 islet equivalents/kg body weight under cover of immunosuppression with antithymocyte globulin, tacrolimus, and mycophenolate. Outcome measures were HbA1c, change in glomerular filtration rate (GFR), progression of retinopathy, and change in nerve conduction velocity. This report details an interim analysis of outcomes after 34±18 months (group 1) and 38±18 months (group 2). RESULTS HbA1c (%) in group 1 was 7.5±0.9 versus 6.6±0.7 in group 2 (P<0.01). GFR declined in both groups (group 1: -0.45±0.7 vs. group 2: -0.12±0.7 mL/min/month; P=0.1). The slope of the GFR decline in group 1 was significantly different from 0. Retinopathy progressed in 10 of 82 eyes in group 1 versus 0 of 51 in group 2 (P<0.01). Nerve conduction velocity remained stable in group 1 (47.8±5 to 47.1±5 m/sec) and group 2 (47.2±4.5 to 47.7±3.5 m/sec). CONCLUSION Islet transplantation yields improved HbA1c and less progression of retinopathy compared with intensive medical therapy during 3 years of follow-up.

Journal ArticleDOI
TL;DR: Extracellular matrix proteins adsorbed to microporous scaffolds can enhance the function of transplanted islets, with collagen IV maximizing graft function relative to the other proteins tested.
Abstract: Type 1 diabetes mellitus (T1DM) affects an estimated 1.5 million Americans (1) and is characterized by autoimmune-mediated destruction of pancreatic β-cells, which results in absolute insulin deficiency (2–5). Although careful glucose monitoring combined with exogenous insulin administration can effectively control acute glycemia, secondary microvascular and macrovascular complications eventually afflict most type 1 diabetic subjects (6–8). β-cell replacement via transplantation of allogeneic islets has been explored as a potential curative treatment but clinical islet transplantation has thus far yielded disappointing results, with less than 10% of those transplanted remaining insulin independent after 5 years (9). Moreover, the stringent inclusion criteria for and shortage of donors, coupled with the requirement for two to four donor pancreata per recipient, limit the potential of this approach (10–12). Reasons for the limited success of islet transplantation are multifactorial and likely related to the loss of vascular connections (13, 14) and disruption of cell-matrix contacts that occur during the isolation procedure (10). Basement membrane proteins present between intraislet endothelial and endocrine islet cells are primarily collagen IV, laminin, and fibronectin. These proteins engage integrins on the surface of islet cells to mediate adhesion, provide structural support, and activate intracellular chemical signaling pathways (15–17). During enzymatic digestion of the exocrine pancreas, these extracellular matrix (ECM) proteins are degraded, which interrupts cell-matrix interactions (18–20). Early islet cell death after transplantation may be related to a lack of integrin signaling resulting in apoptosis (20). Islets cultured on matrices containing ECM components, on the other hand, exhibited improved survival in vitro (21). Therefore, the provision of a matrix to support islet attachment may be an important requirement for maintaining the function and viability of transplanted islets. As previously reported, microporous, biocompatible, biodegradable scaffolds fabricated from poly(lactide-co-glycolide) (PLG) were successfully used as platforms for islet transplantation in mice (22). This type of scaffold offers distinct advantages, including (i) a high surface area/volume ratio to enable nutrient and waste transport, (ii) an interconnected internal pore structure to allow for cell and blood vessel infiltration, (iii) sufficient mechanical rigidity to provide a platform for cell attachment and ease of implantation, and (iv) the ability to degrade over time, allowing for complete integration into the surrounding tissue. In addition to providing structural support, the scaffold surface can be modified with nondiffusible molecules, such as ECM components, to mediate cellular interactions that are necessary for cell attachment, growth, and proliferation (23). This surface modification enables manipulation of the local microenvironment so that the impact of factors in isolation or combination on graft efficacy can be determined. In the present study, we investigated the ability and specificity of ECM proteins to promote the long-term function of islets transplanted onto microporous scaffolds coated with collagen IV, laminin or fibronectin, and implanted into a mouse model of diabetes. 
Consistent with a previous study, the epididymal fat pad was selected as the site of implantation due to its surgical accessibility, vascularization, and structural similarity to the greater omentum in humans (a potential extrahepatic site for clinical islet transplantation) (22, 24). Nonfasting and dynamic blood glucose data, weight measurements and immunohistochemistry results suggest that the composition of the local microenvironment surrounding transplanted islets is a key factor in promoting their long-term survival and function.

Journal ArticleDOI
TL;DR: It is suggested that SLK may be overused in the MELD era and that the current prioritization of kidney grafts to these liver failure patients results in wasting of limited resources.
Abstract: BACKGROUND When the United Network for Organ Sharing changed its algorithm for liver allocation to the model for end-stage liver disease (MELD) system in 2002, highest priority shifted to patients with renal insufficiency as a major component of their end-stage liver disease. An unintended consequence of the new system was a rapid increase in the number of simultaneous liver-kidney transplants (SLK) being performed yearly. METHODS Adult recipients of deceased donor liver transplants (LT, n=19,137), kidney transplants (n=33,712), and SLK transplants (n=1,032) between 1987 and 2006 were evaluated based on United Network for Organ Sharing data. Recipients were stratified by donor subgroup, MELD score, pre- versus post-MELD era, and length of time on dialysis. Matched-control analyses were performed, and graft and patient survival were analyzed by Kaplan-Meier and Cox proportional hazards analyses. RESULTS MELD era outcomes demonstrate a decline in patient survival after SLK. Using matched-control analysis, we were unable to demonstrate a benefit in the SLK cohort compared with LT, despite the fact that higher quality allografts are being used for SLK. Subgroup analysis of the SLK cohort did demonstrate an increase in overall 1-year patient and liver graft survival only in those patients on long-term dialysis (≥3 months) compared with LT (84.5% vs. 70.8%, P=0.008; hazard ratio 0.57 [95% CI 0.34, 0.95], P=0.03). CONCLUSION These findings suggest that SLK may be overused in the MELD era and that the current prioritization of kidney grafts to these liver failure patients results in wasting of limited resources.

Journal ArticleDOI
TL;DR: The avoidance of plasma transfusion was associated with a decrease in RBC transfusions during liver transplantation, and this work supports the practice of lowering central venous pressure with phlebotomy to reduce blood loss during liver dissection, without any deleterious effect.
Abstract: Background. In our experience, correction of coagulation defects with plasma transfusion does not decrease the need for intraoperative red blood cell (RBC) transfusions during liver transplantation. On the contrary, it leads to a hypervolemic state that results in increased blood loss. A previous study has shown that plasma transfusion is associated with a decreased 1-year survival rate. The aim of this prospective study was to evaluate whether anesthesiologists could reduce RBC transfusion requirements during liver transplantation by eliminating plasma transfusion. Methods. Two hundred consecutive liver transplantations were prospectively studied over a 3-year period. Patients were divided into two groups: low starting international normalized ratio (INR <1.5) and high (INR ≥1.5). Low central venous pressure was maintained in all patients before the anhepatic phase. Coagulation parameters were not corrected preoperatively or intraoperatively in the absence of uncontrollable bleeding. Phlebotomy and autotransfusion of salvaged blood were used following our protocol. Independent variables were analyzed in both univariate and multivariate fashion to find a link with RBC transfusions or a decreased survival rate. Results. The mean number of intraoperative RBC units transfused was 0.3±0.8. Plasma, platelets, albumin, and cryoprecipitate were not transfused. In 81.5% of the patients, no blood product was used during their transplantation. The average final hemoglobin (Hb) value was 91.2±15.0 g/L. There were no differences in transfusion rate, final Hb, or bleeding between the two groups (low or high INR values). The overall 1-year survival rate was 85.6%. Logistic regression showed that avoidance of plasma transfusion, phlebotomy, and starting Hb value were significantly linked to liver transplantation without RBC transfusion. The need for intraoperative RBC transfusion and Pugh's score were linked to a decreased 1-year survival rate. Conclusion. The avoidance of plasma transfusion was associated with a decrease in RBC transfusions during liver transplantation. There was no link between coagulation defects and bleeding or RBC or plasma transfusions. Previous reports indicating that it is neither useful nor necessary to correct coagulation defects with plasma transfusion before liver transplantation seem further corroborated by this study. We believe that this work also supports the practice of lowering central venous pressure with phlebotomy to reduce blood loss during liver dissection, without any deleterious effect.

Journal ArticleDOI
TL;DR: This study shows active lifestyle modification benefits high-risk transplant recipients with glucose intolerance and should be aggressively pursued.
Abstract: Introduction. Lifestyle modification is recommended as first-line therapy to manage new-onset diabetes after transplantation (NODAT) and impaired glucose tolerance (IGT). No data currently demonstrate the efficacy of this approach specifically for transplant recipients. This study aimed to assess the benefit of intensive lifestyle modification in this high-risk group and to contrast this with the natural evolution of glucose metabolism after transplantation. Methods. Baseline oral glucose tolerance test (OGTT) stratified 115 patients into two groups. Group 1 had glucose intolerance, IGT (n=28) or NODAT (n=8), and received intensive lifestyle modification (dietician referral, exercise program, weight loss advice). Group 2 had normal glucose tolerance (n=79) and received lifestyle modification leaflets. Both groups had a follow-up OGTT after 6 months to assess change in glycemic status. Results. Excluding all patients who received steroid weaning or withdrawal as part of their management, 111 patients were included in the analysis. Lifestyle modification in group 1 resulted in a 15% improvement in 2-hr postprandial glucose versus a 12% deterioration in group 2. In group 1, 44% (n=11) of IGT patients developed normal glucose tolerance, whereas only 4% (n=1) developed NODAT. Fifty-eight percent (n=4) of NODAT patients showed improvement (29% to IGT and 29% to normal). Glucose metabolism deteriorated in group 2, with 14% (n=10) developing IGT and 3% (n=2) developing NODAT. Conclusions. Glucose metabolism can deteriorate in transplant recipients despite passive lifestyle modification advice. This study shows that active lifestyle modification benefits high-risk transplant recipients with glucose intolerance and should be aggressively pursued.

Journal ArticleDOI
TL;DR: The occurrence of fatal and nonfatal cardiovascular events after successful renal transplantation not only relates to baseline cardiovascular risk factors present at transplantation, but also to immunosuppressive drugs and posttransplantation traditional and nontraditional risk factors.
Abstract: Background. Cardiovascular disease is a frequent cause of morbidity after renal transplantation. The aims of this study were to evaluate the incidence of cardiovascular events and to identify the main risk factors for cardiovascular complications and mortality in 2071 white adults with a renal transplant functioning for at least 1 year. Methods. Clinical events, routine biochemistry, and prescribed drugs at month 1, month 6, and yearly after transplantation were analyzed. Results. The incidence of cardiovascular events increased over time. At 15 years after transplantation, only 47% of surviving patients had not experienced any cardiovascular event. Risk factors associated with cardiovascular complications were male gender (P=0.04), age (P<0.0001), arterial hypertension before transplantation (P<0.0001), longer pretransplant dialysis (P<0.0001), cardiovascular event before transplantation (P<0.0001), older era of transplantation (P=0.0009), center-specific effect (P=0.003), posttransplant diabetes mellitus (P=0.01), increased pulse pressure after transplantation (P=0.02), intake of corticosteroids (P=0.016), intake of azathioprine (P=0.016), lower serum albumin after transplantation (P=0.004), and higher serum triglyceride levels after transplantation (P=0.007). The risk of death was increased in patients with low or elevated hematocrit, while it was minimal with values around 38%. Conclusions. The occurrence of fatal and nonfatal cardiovascular events after successful renal transplantation not only relates to baseline cardiovascular risk factors present at transplantation, but also to immunosuppressive drugs and posttransplantation traditional and nontraditional risk factors.

Journal ArticleDOI
TL;DR: In certain patients, the inflammatory microenvironment provides a BAFF-dependent paracrine survival signal to B cells in TLOs, allowing them to escape rituximab-induced apoptosis and thereby thwarting therapeutic efficacy.
Abstract: mediated rejection. Clinical characteristics and circulating B cell count were recorded for these two patients. The composition and the microarchitecture of the inflammatory infiltrate were analyzed by flow cytometry and immunohistochemistry. Organotypic cultures were performed to evaluate the intragraft production of alloantibody. Levels of expression of BAFF (BLyS, CD257) were evaluated by quantitative reverse transcriptase-polymerase chain reaction. Results. Despite the complete depletion of circulating B cells in peripheral blood, TLOs were evidenced in the interstitium of both explanted grafts. Their functionality was assessed by the demonstration of a persistent local production of alloantibody. BAFF, a potent survival factor for B cells, was found to be overexpressed (both at the gene and the protein levels) in chronically rejected grafts when compared with normal kidneys and lymph nodes. Conclusions. In certain patients, the inflammatory microenvironment provides a BAFF-dependent paracrine survival signal to B cells in TLOs, allowing them to escape rituximab-induced apoptosis and thereby thwarting therapeutic efficacy.

Journal ArticleDOI
TL;DR: Based on administrative data, the risk of cardiovascular disease was unchanged in the first decade after kidney donation and the observed increase in diagnosed hypertension may be due to nephrectomy or more blood pressure measurements received by donors in follow-up and requires prospective study.
Abstract: Background. Knowledge of any harm associated with living kidney donation guides informed consent and living donor follow-up. Risk estimates in the literature are variable, and most studies did not use a healthy control group to assess outcomes attributable to donation. Methods. We observed a retrospective cohort using health administrative data for donations that occurred in Ontario, Canada between the years 1993 and 2005. There were a total of 1278 living donors and 6359 healthy adults who acted as a control group. Individuals were followed for a mean of 6.2 years (range, 1-13 years) after donation. The primary outcome was a composite of time to death or first cardiovascular event (myocardial infarction, stroke, angioplasty, and bypass surgery). The secondary outcome was time to a diagnosis of hypertension. Results. There was no significant difference in death or cardiovascular events between donors and controls (1.3% vs. 1.7%; hazard ratio 0.7, 95% confidence interval 0.4-1.2). Donors were more frequently diagnosed with hypertension than controls (16.3% vs. 11.9%; hazard ratio 1.4, 95% confidence interval 1.2-1.7) but were also seen more often by their primary care physicians (median [interquartile range] 3.6 [1.9-6.1] vs. 2.6 [1.4-4.3] visits per person-year, P<0.001). Conclusions. Based on administrative data, the risk of cardiovascular disease was unchanged in the first decade after kidney donation. The observed increase in diagnosed hypertension may be due to nephrectomy or to more blood pressure measurements received by donors in follow-up, and requires prospective study.

Journal ArticleDOI
TL;DR: PML is rare in the renal transplant population; there was no significant association between PML and MMF, but MMF use in this cohort was too prevalent to accurately assess an association.
Abstract: Mycophenolate mofetil (MMF) use may be associated with progressive multifocal leukoencephalopathy (PML). We conducted a retrospective cohort study of 32,757 renal transplant recipients using the United States Renal Data System kidney transplant files to determine the incidence, prognosis, and clinical features associated with PML occurring after kidney transplant. Subjects were transplanted from January 1, 2000 to July 31, 2004 and followed through December 31, 2004. The incidence density of PML in MMF users was 14.4 cases/100,000 person-years at risk versus 0 for non-MMF users (P=0.11 by log-rank test). Factors significantly associated with PML were BK virus infection (22.2% vs. 1.1%), pretransplant transfusion (75% vs. 34%), panel reactive antibody more than 20% (56% vs. 14%), and use of antirejection medications in the first year (33% vs. 9.2%), all P less than 0.05. PML is rare in the renal transplant population. There was no significant association between PML and MMF, but MMF use in this cohort was too prevalent to accurately assess an association.
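
Incidence density is the case count divided by person-time at risk. The sketch below reproduces the reported 14.4 cases/100,000 person-years from hypothetical case and person-year counts; the true denominators are not given in the abstract.

```python
# Incidence-density arithmetic with hypothetical counts chosen to match ~14.4.
cases_in_mmf_users = 4            # hypothetical case count among MMF users
person_years_at_risk = 27_800     # hypothetical total person-time at risk

rate = cases_in_mmf_users / person_years_at_risk * 100_000
print(f"incidence density = {rate:.1f} cases/100,000 person-years")  # ~14.4
```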

Journal ArticleDOI
TL;DR: It was shown that in the AM program, DDA detected by SA, and not by less-sensitive methods, may be related to acute rejection episodes but are not detrimental to long-term graft outcome, raising questions about the increasing use of more-sensitive screening techniques for the allocation of organs.
Abstract: Background Highly sensitized (HS) patients (>85% panel-reactive antibodies) have a lower chance of receiving a donor kidney. Within Eurotransplant, the Acceptable Mismatch (AM) program was developed to increase the chances that HS patients receive a crossmatch-negative donor kidney. The standard crossmatch in the AM program is based on complement-dependent cytotoxicity. Methods In this study, we sought to determine the clinical relevance of human leukocyte antigen donor-directed antibodies (DDA) detected by the single antigen (SA) bead technique in the pretransplant sera of HS patients transplanted in our center through the Eurotransplant AM program. Results Of 34 AM patients, 27 were transplanted with 1 to 5 mismatches and 7 received a 0-mismatched graft. Of the mismatched patients, 13 retrospectively proved to possess pretransplant DDA by SA whereas 14 did not. No antibodies were found in the 0-mismatched group. Comparison of the DDA+ and DDA- patients in the human leukocyte antigen-mismatched donor/recipient combinations revealed a trend toward earlier and more numerous rejection episodes in DDA+ patients (P=0.08). No detrimental effect of DDA on graft survival was observed. Conclusions This single-center study showed that in the AM program, DDA detected by SA, and not by less-sensitive methods, may be related to acute rejection episodes but are not detrimental to long-term graft outcome. These findings question the increasing use of more-sensitive screening techniques for the allocation of organs.

Journal ArticleDOI
TL;DR: The current data regarding therapeutic monitoring of MMF are of limited quality, and the most promising results to date come from limited sampling strategies, with benefit seen in one prospective randomized trial.
Abstract: Background The use of mycophenolate mofetil (MMF) as a primary immunosuppressant after transplantation is increasing. A number of factors interact to produce variability in blood levels of mycophenolic acid (MPA), increasing the risk of toxicity. This has led to interest in the application of therapeutic drug monitoring to optimize its use. Methods A systematic literature search was performed using Medline, Embase, the Cochrane Central Registry of Clinical Trials, the Transplant Library, and clinical trial registries for studies investigating the clinical role of MMF pharmacokinetic drug monitoring. Studies relating monitoring regimens to clinical outcomes were included. Results The majority of studies are retrospective in nature, demonstrating good correlation between the full total MPA area-under-the-curve and the risk of acute rejection, but not toxicity. Free MPA levels may better predict toxicity. Single-point parameters, in particular trough levels, show poor correlation with the risk of acute rejection and toxicity, and in prospective studies do not improve clinical outcomes. Limited sampling strategies using samples from the first few hours postdose allow good prediction of the full area-under-the-curve, and monitoring using these strategies may improve clinical outcomes. Conclusions The current data regarding therapeutic monitoring of MMF are of limited quality. The most promising results to date come from limited sampling strategies, with benefit seen in one prospective randomized trial. Further prospective trials and longer follow-up are required to investigate the optimum sampling strategy and the subsets of patients who may benefit from monitoring, but the current evidence in favor of monitoring is weak.
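
Limited sampling strategies of the kind reviewed above typically take the form of a linear equation predicting the full AUC from a few early postdose concentrations. The sketch below fits such an equation by least squares on invented training data; the coefficients are not taken from any published LSS equation.

```python
# Fitting a limited-sampling estimator of the full MPA AUC (invented data).
import numpy as np

# Hypothetical training set: concentrations (mg/L) at 0, 0.5, and 2 h postdose,
# with the "true" full AUC0-12 (mg*hr/L) measured by rich sampling.
C = np.array([[1.5, 10.0, 4.8],
              [2.2, 14.5, 6.1],
              [1.1,  7.9, 3.5],
              [3.0, 18.2, 8.0],
              [1.9, 11.4, 5.2]])
auc_full = np.array([38.0, 52.5, 29.0, 66.0, 44.0])

X = np.column_stack([np.ones(len(C)), C])      # intercept + three samples
coef, *_ = np.linalg.lstsq(X, auc_full, rcond=None)

# Predict the full AUC for a new patient from three early samples only.
new_patient = np.array([1.0, 1.7, 9.6, 4.4])   # intercept, C0, C0.5, C2
print(f"predicted AUC0-12 = {new_patient @ coef:.1f} mg hr/L")
```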

Journal ArticleDOI
TL;DR: The results show that patients with DSA more than 10^5 SFI and FCM more than 200 MCS are at higher risk for AMR.
Abstract: BACKGROUND The aims of this study were to determine the level of donor-specific antibody (DSA) that allows for successful transplantation after desensitization with IVIG and rituximab and to identify patients at risk for antibody-mediated rejection (AMR). METHODS Pre- and posttransplant sera from 16 patients with DSA before desensitization were tested. Strength of DSA was determined by single antigen Luminex bead assay and results expressed as standard fluorescence intensity (SFI). T-cell flow crossmatch results were expressed as mean channel shifts (MCS). AMR was determined by biopsy and C4d deposition. RESULTS Six patients had negative pretransplant flow crossmatches with a mean DSA of 8,805 SFI. Five had positive flow crossmatches (78-192 MCS) with a mean DSA of 55,869 SFI. No patients in either group had AMR. Five had positive flow crossmatches (222-266 MCS) with a mean DSA of 118,063 SFI; three experienced AMR. The MCS and DSA levels for patients with AMR were significantly higher than those for patients without (P≤0.001). For patients without complications (n=7), DSA remained less than 10^5 SFI and usually decreased to approximately 10^4 SFI posttransplant for both class I and II. For patients with AMR (n=3), predominant increases in class II DSA to more than 10^5 SFI were observed. All three patients continue to have DSA of approximately 10^5 SFI with stable creatinine after treatment for AMR. CONCLUSIONS Approximately 63% of patients were transplanted with a positive flow crossmatch. The results show that patients with DSA more than 10^5 SFI and FCM more than 200 MCS are at higher risk for AMR. Treatment of AMR improves renal function without significant changes in DSA.

Journal ArticleDOI
TL;DR: Extracorporeal photopheresis reduces the rate of lung function decline in recipients with BOS and is well tolerated; however, the underlying mechanism of ECP remains subject to further research.
Abstract: We report the largest single-center experience with extracorporeal photopheresis (ECP) for bronchiolitis obliterans syndrome (BOS) and recurrent acute rejection (AR) after lung transplantation. Lung transplant recipients undergoing ECP for BOS and recurrent AR were included (1997-2007). The rate of forced expiratory volume in 1 second (FEV1) decline was used as the primary measure and graft survival post-ECP as the secondary measure of efficacy. Twenty-four transplant recipients were included (BOS, n=12; recurrent AR, n=12). In recipients with BOS, decline in FEV1 was 112 mL/month before the start of ECP and 12 mL/month after 12 ECP cycles (P=0.011); the mean (95% CI) change in the rate of decline was 100 (28-171) mL/month. Median patient survival was 7.0 (range, 3.0-13.6) years, and median patient survival post-ECP was 4.9 (range, 0.5-8.4) years. No ECP-related complications occurred. Extracorporeal photopheresis reduces the rate of lung function decline in recipients with BOS and is well tolerated. Furthermore, recipients with recurrent AR experience clinical stabilization. However, the underlying mechanism of ECP remains subject to further research.

Journal ArticleDOI
TL;DR: Data show that in the context of a CDC-negative crossmatch, the presence of D0 DSA has little impact on any early graft parameters, however, DSA are associated with poorer longer-term graft outcomes in kidney transplantation.
Abstract: Background. The corresponding antigens of alloantibodies identified in patients awaiting kidney transplantation are often listed as unacceptable for transplantation. The use of solid phase testing, being more sensitive and accurate than conventional complement-dependent cytotoxicity (CDC) assays, has resulted in increased identification of alloantibodies. We aimed to study the clinical importance of alloantibodies defined solely by solid phase techniques. Methods. All patients transplanted between 1999 and 2001 at our center with available day-of-transplant sera (D0) were included (121 patients). All had negative CDC crossmatches. Results. Thirty-eight patients (31%) had detectable alloantibodies using high-definition assays, 16 with donor-specific antibodies (DSA) and 22 with non-DSA. There were no cases of hyperacute rejection in any of the groups. Biopsy-proven acute rejection rates in the DSA and non-DSA groups were similar to those in the unsensitized group. Delayed graft function and 1-year graft survival rates were also similar for the three groups, as were median 1-year serum creatinine levels. Multivariate analysis, however, showed that DSA were associated with an increased relative risk of longer-term graft failure (relative risk, 6.5; P<0.05). Conclusions. These data show that in the context of a CDC-negative crossmatch, the presence of D0 DSA has little impact on any early graft parameters. DSA, however, are associated with poorer longer-term graft outcomes in kidney transplantation.