
Showing papers in "Transplantation in 2009"


Journal ArticleDOI
TL;DR: The graded impact of KDRI on graft outcome makes it a useful decision-making tool at the time of the deceased donor kidney offer, and it is likely that there is a considerable overlap in the KDRI distribution by expanded and nonexpanded criteria donor classification.
Abstract: Background. We propose a continuous kidney donor risk index (KDRI) for deceased donor kidneys, combining donor and transplant variables to quantify graft failure risk. Methods. By using national data from 1995 to 2005, we analyzed 69,440 first-time, kidney-only, deceased donor adult transplants. Cox regression was used to model the risk of death or graft loss, based on donor and transplant factors, adjusting for recipient factors. The proposed KDRI includes 14 donor and transplant factors, each found to be independently associated with graft failure or death: donor age, race, history of hypertension, history of diabetes, serum creatinine, cerebrovascular cause of death, height, weight, donation after cardiac death, hepatitis C virus status, human leukocyte antigen-B and DR mismatch, cold ischemia time, and double or en bloc transplant. The KDRI reflects the rate of graft failure relative to that of a healthy 40-year-old donor. Results. Transplants of kidneys in the highest KDRI quintile (>1.45) had an adjusted 5-year graft survival of 63%, compared with 82% and 79% in the two lowest KDRI quintiles (<0.79 and 0.79-<0.96, respectively). There is a considerable overlap in the KDRI distribution by expanded and nonexpanded criteria donor classification. Conclusions. The graded impact of KDRI on graft outcome makes it a useful decision-making tool at the time of the deceased donor kidney offer.
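As a rough illustration of how such a Cox-model-based index works (this is not the published KDRI formula; the coefficients and donor values below are hypothetical placeholders), the index is the exponentiated sum of coefficient-weighted deviations of the donor's covariates from those of the reference donor:

```python
# Sketch of a continuous donor risk index in the spirit of the KDRI:
# relative hazard = exp( sum_i beta_i * (x_i - x_ref_i) ).
# All coefficients and covariate values are HYPOTHETICAL, for illustration only.
import math

# name: (beta, donor value, reference-donor value)
covariates = {
    "age_years":        (0.013, 55, 40),    # per year of donor age
    "creatinine_mgdl":  (0.220, 1.5, 1.0),  # per mg/dL serum creatinine
    "hypertension":     (0.126, 1, 0),      # history of hypertension (0/1)
    "dcd":              (0.133, 0, 0),      # donation after cardiac death (0/1)
    "cold_ischemia_hr": (0.005, 20, 20),    # per hour of cold ischemia time
}

log_rel_hazard = sum(b * (x - x_ref) for b, x, x_ref in covariates.values())
index = math.exp(log_rel_hazard)
print(f"Graft-failure hazard relative to the reference donor: {index:.2f}")
```

A value above 1 indicates a higher expected graft-failure rate than the healthy 40-year-old reference donor; in the study, the highest quintile corresponded to KDRI greater than 1.45.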

823 citations


Journal ArticleDOI
TL;DR: In this article, the efficacy and safety of converting maintenance renal transplant recipients from calcineurin inhibitors (CNIs) to sirolimus (SRL) was evaluated.
Abstract: Background The efficacy and safety of converting maintenance renal transplant recipients from calcineurin inhibitors (CNIs) to sirolimus (SRL) was evaluated. Methods Eight hundred thirty renal allograft recipients, 6 to 120 months posttransplant and receiving cyclosporine or tacrolimus, were randomly assigned to continue CNI (n=275) or convert from CNI to SRL (n=555). Primary endpoints were calculated Nankivell glomerular filtration rate (GFR; stratified at baseline: 20-40 vs. >40 mL/min) and the cumulative rates of biopsy-confirmed acute rejection (BCAR), graft loss, or death at 12 months. Enrollment in the 20 to 40 mL/min stratum was halted prematurely because of a higher incidence of safety endpoints in the SRL conversion arm. Results Intent-to-treat analyses at 12 and 24 months showed no significant treatment difference in GFR in the baseline GFR more than 40 mL/min stratum. On-therapy analysis of this cohort showed significantly higher GFR at 12 and 24 months after SRL conversion. Rates of BCAR, graft survival, and patient survival were similar between groups. Median urinary protein-to-creatinine ratios (UPr/Cr) were similar at baseline but increased significantly after SRL conversion. Malignancy rates were significantly lower at 12 and 24 months after SRL conversion. Post hoc analyses identified a subgroup with baseline GFR more than 40 mL/min and UPr/Cr less than or equal to 0.11, whose risk-benefit profile was more favorable after conversion than that for the overall SRL conversion cohort. Conclusions At 2 years, SRL conversion among patients with baseline GFR more than 40 mL/min was associated with excellent patient and graft survival, no difference in BCAR, increased urinary protein excretion, and a lower incidence of malignancy compared with CNI continuation. Superior renal function was observed among patients who remained on SRL through 12 to 24 months, particularly in the subgroup of patients with baseline GFR more than 40 mL/min and UPr/Cr less than or equal to 0.11.

546 citations


Journal ArticleDOI
TL;DR: It is confirmed that HLAab produced even late after transplantation are detrimental to graft outcome, and DSA were proven to have a strong adverse impact on graft survival.
Abstract: BACKGROUND Although the incidence of early acute rejection has been reduced in recent years, long-term renal allograft survival has not benefited from the introduction of more effective immunosuppressive regimens, which mainly target cellular rejection mechanisms. The cause of chronic rejection is still controversial. Here, we demonstrate to what extent posttransplant human leukocyte antigen (HLA) antibodies (HLAab) contribute to late graft outcome. METHODS A total of 1014 deceased kidney transplant recipients transplanted at the Charité hospital were monitored in a cross-sectional manner for the development of HLAab using Luminex Single Antigen beads. Patients with stable kidney function at a median of 5 years posttransplant were tested once for HLAab and monitored for 5.5 years after testing. RESULTS Thirty percent of recipients showed HLAab. Donor-specific antibodies (DSA) were found in 31% of antibody-positive patients. The presence of DSA was associated with a significantly lower graft survival of 49% vs. 83% in the HLAab-negative group (P≤0.0001). Non-DSA also had an adverse effect on graft survival (70% vs. 83%; P=0.0001). In a prospective analysis of 195 patients with repeatedly no detectable HLAab, the survival probability was 94%, as opposed to 79% among patients who developed HLAab de novo after the first testing (P=0.05). CONCLUSIONS We confirmed that HLAab produced even late after transplantation are detrimental to graft outcome. DSA were proven to have a strong adverse impact on graft survival. The results indicate that routine posttransplant HLAab monitoring could be appropriate to improve long-term results.

331 citations


Journal ArticleDOI
TL;DR: Preemptive reduction in immunosuppression is most effective in presumptive PVAN as defined by surrogate markers (i.e., high BK virus viremia), and preservation of graft function can be considered the rule.
Abstract: In the last 10 years, better immunosuppression drugs have decreased the rates of acute rejection in kidney transplantation but have also led to the emergence of polyomavirus-associated nephropathy (PVAN). This occurs in 1% to 10% of kidney transplant recipients and is caused by BK virus in more than 95% of cases. Less than 5% of cases are attributed to the JC virus. Initially, lack of recognition or late diagnosis of PVAN resulted in rapid loss of graft function in more than 50% of patients. In recent years, it has become clear that early diagnosis and timely reduction in immunosuppression is the only proven measure that significantly affects the outcome of PVAN. Diverse interventions have been explored, including the adjunctive use of cidofovir, leflunomide, fluoroquinolones, and intravenous immunoglobulins. Allograft histology is needed to definitively establish the diagnosis of PVAN, but is of limited sensitivity in the early stage of disease. Well-established techniques and protocols for systematic screening by urine cytology and quantitative molecular-genetic techniques now allow for timely intervention before irreversible parenchymal changes occur. Moreover, preemptive reduction in immunosuppression is most effective in presumptive PVAN as defined by surrogate markers (i.e., high BK virus viremia). In this setting, preservation of graft function can be considered the rule. Nevertheless, the recovery of BK virus-specific T-cell immunity may require prolonged periods during which cytopathic damage may continue to accumulate. Despite remarkable progress in the field, important challenges remain, such as the rare patient with PVAN refractory to any intervention and the newly recognized association of PVAN with urogenital tumors.

248 citations


Journal ArticleDOI
TL;DR: The results support the utility of SAFB for pretransplant risk assessment and organ allocation, and suggest that improvement of the positive predictive value of HLA-DSA defined by SAFB will require an enhanced definition of pathogenic factors of HLA-DSA.
Abstract: BACKGROUND: Defining the clinical relevance of donor-specific HLA-antibodies detected by single-antigen flow-beads (SAFB) is important because these assays are increasingly used for pretransplant risk assessment and organ allocation. The aims of this study were to investigate to which extent HLA-DSA detected by SAFB represent a risk for antibody-mediated rejection (AMR) and diminished allograft survival, and to define HLA-DSA characteristics predictive for AMR. METHODS: In this retrospective study of 334 patients with negative complement-dependent cytotoxicity crossmatches, day-of-transplant sera were analyzed by SAFB, HLA-DSA determined by virtual crossmatching, and the results correlated with the occurrence of AMR and allograft survival. RESULTS: Sixty-seven of 334 patients (20%) had HLA-DSA. The incidence of clinical/subclinical AMR at day 200 posttransplant was significantly higher in patients with HLA-DSA than in patients without HLA-DSA (55% vs. 6%; P<0.0001). Notably, 30 of 67 patients with HLA-DSA (45%) did not experience clinical/subclinical AMR. Death-censored 5-year allograft survival was equal in patients without HLA-DSA and patients with HLA-DSA but no AMR (89% vs. 87%; P=0.95), whereas it was 20% lower in patients with HLA-DSA and AMR (68%; P=0.002). The number, class, and cumulative strength of HLA-DSA determined by SAFB, and prior sensitizing events, were not predictive for the occurrence of AMR. CONCLUSIONS: These results support the utility of SAFB for pretransplant risk assessment and organ allocation, and suggest that improvement of the positive predictive value of HLA-DSA defined by SAFB will require an enhanced definition of pathogenic factors of HLA-DSA.

244 citations


Journal ArticleDOI
TL;DR: This is the first prospective study demonstrating that selected pre-TX psychosocial factors predict post-TX NA and poor clinical outcome, implying that pre- TX screening should include this set of factors in addition to traditional medical criteria.
Abstract: INTRODUCTION: There is growing awareness, yet scant prospective evidence, that pretransplant (TX) psychosocial factors may predict post-TX outcome. We examined which pre-TX psychosocial factors predict post-TX nonadherence with immunosuppression (NA) and clinical outcomes in heart, liver, and lung TX. METHODOLOGY: We prospectively followed 141 patients (28 heart, 61 liver, and 52 lung) from pre-TX until 1 year post-TX. Multivariable analyses determined which pre-TX factors (i.e., anxiety, depression, personality traits, social support, adherence with medication, and smoking status) predict poor post-TX outcome (i.e., NA, late acute rejection, graft loss, and resource utilization), controlling for medical predictors of poor outcome. RESULTS: Pre-TX self-reported medication nonadherence (odds ratio [OR]=7.9), lower received social support (OR=0.9), higher education (OR=2.7), and lower "conscientiousness" (OR=0.8) were independent predictors of post-TX NA. Not living in a stable relationship predicted graft loss (OR=4.9). Pre-TX medication NA was the only predictor of late acute rejection (OR=4.4). No other pre-TX predictors of poor outcome were found. CONCLUSION: This is the first prospective study demonstrating that selected pre-TX psychosocial factors predict post-TX NA and poor clinical outcome, implying that pre-TX screening should include this set of factors in addition to traditional medical criteria.

226 citations


Journal ArticleDOI
TL;DR: It is suggested that donor-specific anti-HLA antibodies are associated with a high rate of graft rejection in patients undergoing haploidentical stem-cell transplantation and should be evaluated routinely in hematopoietic stem- cell transplantation with HLA mismatched donors.
Abstract: BACKGROUND: Although donor-specific anti-human leukocyte antigen (HLA) antibodies (DSA) have been implicated in graft rejection in solid organ transplantation, their role in hematopoietic stem-cell transplantation remains unclear. METHODS: To address the hypothesis that the presence of DSA contributes to the development of graft failure, we tested 24 consecutive patients for the presence of anti-HLA antibodies determined by a sensitive and specific solid-phase/single-antigen assay. The study included a total of 28 haploidentical transplants, each with 2 to 5 HLA allele mismatches, at a single institution, from September 2005 to August 2008. RESULTS: DSA were detected in five patients (21%). Three of four (75%) patients with DSA before the first transplant failed to engraft, compared with 1 of 20 (5%) without DSA (P=0.008). All four patients who experienced primary graft failure had second haploidentical transplants. One patient developed a second graft failure with persistent high DSA levels, whereas three engrafted, two of them in the absence of DSA. No other known factors that could negatively influence engraftment were associated with the development of graft failure in these patients. CONCLUSIONS: These results suggest that donor-specific anti-HLA antibodies are associated with a high rate of graft rejection in patients undergoing haploidentical stem-cell transplantation. Anti-HLA sensitization should be evaluated routinely in hematopoietic stem-cell transplantation with HLA mismatched donors.

215 citations


Journal ArticleDOI
TL;DR: The findings of this study and the relatively simple therapeutic regimen used should facilitate widespread application of ABOi kidney transplants resulting in one of the most rapid escalations in access to organs in the modern era of kidney transplantation.
Abstract: The requirements for potent immunosuppression coupled with the formidable risk of irreversible antibody-mediated rejection (AMR) have thus far limited the expansion of ABO incompatible (ABOi) kidney transplantation. We present a retrospective review of our single-center experience with 60 consecutive ABOi kidney transplants and describe the evolution of our treatment protocol to one that consists only of a brief escalation in immunosuppression without long-term B-cell suppression from splenectomy or anti-CD20. The 1-, 3-, and 5-year graft survival rates for the cohort were 98.3%, 92.9%, and 88.7%, respectively, which is comparable with United Network for Organ Sharing data for compatible live donor transplants. No instances of hyperacute rejection were observed, and no grafts were lost secondary to AMR. In fact, fewer than 15% of the patients experienced a clinical episode of AMR, and rejections were mild. Elimination of B-cell ablative therapies did not result in an increased incidence of AMR. Excellent graft function persists with a current median creatinine clearance of 60 mL/min. The findings of this study and the relatively simple therapeutic regimen used should facilitate widespread application of ABOi kidney transplantation resulting in one of the most rapid escalations in access to organs in the modern era of kidney transplantation.

215 citations


Journal ArticleDOI
TL;DR: These nonadherence rates provide benchmarks for clinicians to use to estimate patient risk and the identified psychosocial correlates ofnonadherence are potential targets for intervention.
Abstract: Background. Adherence to the medical regimen after pediatric organ transplantation is important for maximizing good clinical outcomes. However, the literature provides inconsistent evidence regarding prevalence and risk factors for nonadherence posttransplant. Methods. A total of 61 studies (30 kidney, 18 liver, 8 heart, 2 lung/heart-lung, and 3 with mixed recipient samples) were included in a meta-analysis. Average rates of nonadherence to six areas of the regimen, and correlations of potential risk factors with nonadherence, were calculated. Results. Across all types of transplantation, nonadherence to clinic appointments and tests was most prevalent, at 12.9 cases per 100 patients per year (PPY). The immunosuppression nonadherence rate was six cases per 100 PPY. Nonadherence to substance use restrictions, diet, exercise, and other healthcare requirements ranged from 0.6 to 8 cases per 100 PPY. Only the rate of nonadherence to clinic appointments and tests varied by transplant type: heart recipients had the lowest rate (4.6 cases per 100 PPY vs. 12.7–18.8 cases per 100 PPY in other recipients). Older age of the child, family functioning (greater parental distress and lower family cohesion), and the child’s psychological status (poorer behavioral functioning and greater distress) were among the psychosocial characteristics significantly correlated with poorer adherence. These correlations were small to modest in size (r=0.12–0.18). Conclusions. These nonadherence rates provide benchmarks for clinicians to use to estimate patient risk. The identified psychosocial correlates of nonadherence are potential targets for intervention. Future studies should focus on improving the prediction of nonadherence risk and on testing interventions to reduce risk.
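For readers unfamiliar with the rate unit used throughout this abstract, the sketch below shows how a "cases per 100 patients per year" figure is obtained, assuming it is computed as events per 100 patient-years of follow-up (the numbers are hypothetical, not the pooled study data):

```python
def cases_per_100_ppy(n_cases: int, total_patient_years: float) -> float:
    """Nonadherence rate expressed as cases per 100 patients per year (PPY),
    computed as events per 100 patient-years of follow-up."""
    return 100.0 * n_cases / total_patient_years

# Hypothetical pooled sample: 30 nonadherent patients observed over
# 250 patient-years of follow-up across studies.
print(f"{cases_per_100_ppy(30, 250):.1f} cases per 100 PPY")  # -> 12.0
```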

203 citations


Journal ArticleDOI
TL;DR: Rates of BK virus allograft nephropathy were significantly higher in more recent transplant years, and a TBKV report was associated with higher risk of subsequent graft loss.
Abstract: Introduction. Published data for BK virus allograft nephropathy, a recently emerged graft-threatening complication of kidney transplantation, are from limited-center series. Since June 30, 2004, the Organ Procurement Transplant Network national registry in the United States started collecting data on treatment of BK virus (TBKV) on the kidney follow-up forms. This study determined the rates of TBKV within 24 months posttransplant and elucidated the risk factors for TBKV from this multicenter database. Methods. We queried the database for all primary and solitary kidney transplant recipients transplanted between January 1, 2003 and December 31, 2006, followed through July 18, 2008, and who were reported to have TBKV. Cumulative incidence of TBKV over time was estimated using the Kaplan-Meier (K-M) method to reduce potential underreporting. A Cox proportional hazards regression model was fitted to determine risk factors for TBKV development, and a time-dependent Cox model was fitted to determine whether TBKV was associated with higher risk of graft loss. Results. We included 48,292 primary and solitary kidney transplants from the US Organ Procurement Transplant Network database. The cumulative K-M incidence of BKVAN kept rising over time (0.70% at 6 months posttransplant to 2.18% at 1 year, 3.45% at 2 years, and 6.6% at 5 years). Risk for BKVAN was higher with certain immunosuppressive regimens that included rabbit antithymocyte globulin or tacrolimus/mycophenolate combinations. Higher center volume and living kidney donation exerted a protective effect. Of concern, TBKV rates were significantly higher in more recent transplant years. A TBKV report was associated with higher risk of subsequent graft loss (adjusted hazard ratio=1.69, P<0.001).
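A minimal sketch of the survival-analysis workflow described in Methods, using the lifelines Python library; the data frame below is hypothetical toy data, not the 48,292-record OPTN cohort:

```python
# Kaplan-Meier cumulative incidence of TBKV and a Cox model for risk factors,
# as described in Methods. All data below are HYPOTHETICAL toy values.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "years":        [0.4, 1.2, 2.0, 5.0, 3.1, 0.8, 4.2, 1.6],  # follow-up time
    "tbkv":         [1,   0,   1,   0,   0,   1,   0,   1],    # 1 = TBKV reported
    "living_donor": [0,   1,   1,   1,   0,   0,   1,   0],    # example covariate
})

kmf = KaplanMeierFitter().fit(df["years"], event_observed=df["tbkv"])
# Cumulative incidence = 1 - Kaplan-Meier survival probability
print(1 - kmf.survival_function_at_times([0.5, 1, 2, 5]))

cph = CoxPHFitter().fit(df, duration_col="years", event_col="tbkv")
cph.print_summary()  # hazard ratios for candidate risk factors
```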

197 citations


Journal ArticleDOI
TL;DR: These analyses indicate that, for several common cancers, transplant patients experience worse outcomes than the general population and suggest that cancers in transplant recipients are more aggressive biologically at the time of diagnosis.
Abstract: Background Transplant recipients are at increased risk of malignancy; however, the influence of transplantation on cancer outcomes has not been rigorously defined. The purpose of this study was to examine the influence of transplantation on the outcomes of individual cancers. Methods De novo nonsmall cell lung cancer, colon cancer, breast cancer, prostate cancer, bladder cancer, renal cell cancer (RCC), and malignant melanoma data in 635 adult (>18 years of age) transplant recipients (from the Israel Penn International Transplant Tumor Registry) were compared with data from 1,282,984 adults in the general population (from the Surveillance, Epidemiology, and End Results database). Results Compared with the general population, transplant patients were more likely to have early stage (AJCC stage 0-II) RCC, but more advanced (AJCC stage >II) colon cancer, breast cancer, bladder cancer, and malignant melanoma. Compared with the general population, disease-specific survival was worse in the transplant population for colon cancer (all stages), nonsmall cell lung cancer (stage II), breast cancer (stage III), prostate cancer (stage II, III, and IV), bladder cancer (stage III), and RCC (stage IV). Multivariate analyses demonstrated transplantation to be a negative risk factor for survival for each cancer studied, and transplantation and cancer stage at diagnosis to be the most profound negative survival predictors. Conclusions These analyses indicate that, for several common cancers, transplant patients experience worse outcomes than the general population. The data also suggest that cancers in transplant recipients are more aggressive biologically at the time of diagnosis.

Journal ArticleDOI
TL;DR: Overall, human leukocyte antigens antibody development within 1-year posttransplantation markedly lowers allograft survival compared with later antibody development, and monitoring early antibodies is useful.
Abstract: BACKGROUND Evidence shows posttransplant antibodies lead to renal allograft failure, but does the time elapsed between transplantation and antibody development impact allograft survival? This is the first study showing the importance of when antibodies appear. METHODS Serial sera were collected over 17 years (1991-2008) from two groups of patients: a failure group of 25 patients (230 sera) whose allografts failed due to chronic rejection, and a control group of 25 patients (305 sera) with functioning grafts, each matched by transplant date to a patient whose graft failed. RESULTS The median follow-up was 7.1±4.8 years for failure patients and 11.8±4.4 years for controls. Human leukocyte antigen alloantibodies appeared in 24 of 25 (96%) of the failed patients and 48% of the controls (P<0.0001). Time to antibodies also differed between groups. Fifteen (60%) patients from the failure group developed antibodies by 1 year compared with none in the control group. In multivariate analysis, the hazard ratio for allograft loss when antibodies were present within 1 year posttransplant was 7.77 (P<0.001). Ten-year renal allograft survival was 27% in early antibody developers (<1 year) vs. 80% in late antibody developers. CONCLUSIONS Overall, human leukocyte antigen antibody development within 1 year posttransplant markedly lowers allograft survival compared with later antibody development. Therefore, monitoring early antibodies is useful.

Journal ArticleDOI
TL;DR: The results indicate that patients at high risk for FSGS recurrence can be identified and may benefit from carefully planned peritransplant interventions.
Abstract: Background. For a subset of adults and children with primary focal segmental glomerulosclerosis (FSGS), proteinuria and renal dysfunction recur after kidney transplantation (KTx). Predicting recurrence and response to plasmapheresis (PP) or other interventions remains problematic. Methods. The prevalence, recurrence rate, outcomes, and treatment responses of patients with FSGS were determined among 1573 KTx recipients. Although 5.0% carried some diagnosis of FSGS, only 1.9% (n=30) met strict diagnostic criteria for primary FSGS including biopsy-proven FSGS, lack of secondary factors, negative family history, and progression to end-stage renal disease within 10 years. Results. Of these, 47% had recurrent FSGS compared with 8% of those not meeting strict criteria (P<0.001). Recurrence was more common in children compared with adults (86% vs. 35%, P=0.01). Graft survival was lower for recipients with primary FSGS compared with all others and inferior graft survival was attributable to recurrent FSGS. Fourteen patients received PP preemptively (pre-KTx) or therapeutically (post-KTx) for recurrent disease. Four pediatric patients additionally received anti-CD20 (rituximab) therapy. Of the different treatment approaches, only PP combined with rituximab was associated with prolonged remission of proteinuria. Conclusions. The results indicate that patients at high risk for FSGS recurrence can be identified and may benefit from carefully planned peritransplant interventions.

Journal ArticleDOI
TL;DR: The current data suggest roles for indoleamine 2,3-dioxygenase, prostaglandin E2, nitric oxide, histocompatibility locus antigen-G, insulin-like growth factor-binding proteins, and tolerogenic antigen-presenting cells.
Abstract: Mesenchymal stem cells (MSC) are a type of multipotent progenitor cell, originally isolated from the bone marrow. In addition to multilineage differentiation and participation in the hematopoietic niche, they exert powerful immunomodulatory effects, which include inhibition of proliferation and function of T cells, B cells, and natural killer cells. These unique properties make MSC of great interest for clinical applications in tissue engineering and immunosuppression. Underlying the MSC-mediated immunomodulatory mechanisms is a nonspecific antiproliferative effect, which is the consequence of cyclin D2 inhibition. Of special interest are the molecular mechanisms, by which MSC influence their target cells. Several studies have been conducted in this field, and the current data suggest roles for indoleamine 2,3-dioxygenase, prostaglandin E2, nitric oxide, histocompatibility locus antigen-G, insulin-like growth factor-binding proteins, and tolerogenic antigen-presenting cells. Understanding these mechanisms is crucial for future use of MSC in research and clinical applications.

Journal ArticleDOI
TL;DR: Although induction with one dose of rituximab induced a complete depletion of B cells, there was no increase in the incidence of infectious complications or leukopenia; it therefore seems safe to conduct further studies on the use of rituximab in transplantation.
Abstract: We performed a prospective, double-blind, randomized, placebo-controlled multicenter study on the efficacy and safety of rituximab as induction therapy, together with tacrolimus, mycophenolate mofetil, and steroids. The primary endpoint was defined as acute rejection, graft loss, or death during the first 6 months. Secondary endpoints were creatinine clearance, incidence of infections, and incidence of rituximab-related adverse events. Results. We enrolled 140 patients (44 living donor and 96 deceased donor), and of those, 68 rituximab and 68 placebo patients completed the study. In all the patients receiving rituximab, there was a complete depletion of CD19/CD20 cells, whereas there was no change in the number of CD19/CD20 cells in the placebo group. There were 10 treatment failures in the rituximab group versus 14 in the placebo group (P=0.348). There were eight rejection episodes in the rituximab group versus 12 in the placebo group (P=0.317). Creatinine clearance was 66±22 mL/min in the study group and 67±23 mL/min in the placebo group. There was no difference in the number of bacterial infections, cytomegalovirus infections, BK virus infections, or fungal infections. Conclusion. We performed a placebo-controlled study of rituximab induction in renal transplantation. There was a tendency toward fewer and milder rejections during the first 6 months in the rituximab group. Although induction with one dose of rituximab induced a complete depletion of B cells, there was no increase in the incidence of infectious complications or leukopenia; it therefore seems safe to conduct further studies on the use of rituximab in transplantation.

Journal ArticleDOI
TL;DR: It is shown that MMF used with a calcineurin inhibitor does indeed confer a clinical benefit over AZA by reducing the risk of acute rejection and also possibly reducing graft loss, independent of whether MMF is used in combination with sandimmune, neoral or tacrolimus.
Abstract: BACKGROUND: Mycophenolate mofetil (MMF) has increasingly replaced azathioprine (AZA) as the antimetabolite of choice in immunosuppressive protocols. Initial trials comparing MMF with AZA in patients receiving cyclosporine A (Sandimmune) showed a clinical benefit in reducing the incidence of acute rejections. It has been questioned whether this benefit remains significant when using newer formulations of cyclosporine A (Neoral) and tacrolimus. METHODS: Literature searches were performed using the Transplant Library, Cochrane Library, Medline, and Embase for all randomized controlled trials directly comparing MMF with AZA in renal transplant recipients. Trials were assessed for quality using the Jadad scoring system. Trials were pooled using meta-analysis software. Confidence intervals were set at 95%. RESULTS: Nineteen relevant studies were identified, including a total of 3143 patients. MMF significantly reduces the risk of acute rejection when used in combination with any calcineurin inhibitor (relative risk 0.62, 0.55-0.87, P<0.00001). The hazard for graft loss, including death with a functioning graft, is also significantly reduced in patients treated with MMF (hazard ratio 0.76, 0.59-0.98, P=0.037). There is no significant difference in patient survival or renal transplant function between groups. Risk of adverse events, including cytomegalovirus infection, anemia, leukopenia, or rates of malignancy, does not differ significantly. A greater risk of diarrhea is seen in MMF-treated patients. CONCLUSIONS: We have shown that MMF used with a calcineurin inhibitor does indeed confer a clinical benefit over AZA by reducing the risk of acute rejection and also possibly reducing graft loss. This effect is independent of whether MMF is used in combination with Sandimmune, Neoral, or tacrolimus.
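The pooling step can be illustrated with a minimal fixed-effect, inverse-variance meta-analysis on the log scale (the per-trial relative risks and standard errors below are hypothetical, not values from the 19 trials analyzed here):

```python
# Fixed-effect inverse-variance pooling of log relative risks.
# Trial-level inputs are HYPOTHETICAL, for illustration only.
import numpy as np

rr = np.array([0.55, 0.70, 0.62, 0.80])  # per-trial RR of acute rejection (MMF vs. AZA)
se = np.array([0.15, 0.20, 0.10, 0.25])  # standard errors of log(RR)

log_rr = np.log(rr)
w = 1.0 / se**2                           # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```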

Journal ArticleDOI
TL;DR: Bortezomib therapy effectively abrogates anti-HLA antibodies in transplantation in an attempt to improve long-term allograft survival and may have benefit in autoimmune-related disease.
Abstract: Background. Current treatments for autoantibody-mediated diseases (i.e., systemic lupus erythematosus) and alloantibodies (in transplant) are minimally effective. Although they deplete naive B cells and plasmablasts and transiently reduce antibody concentrations, they are minimally effective against long-lived, antibody-producing plasma cells. In transplantation, plasma cells produce antibodies directed against human leukocyte antigen (HLA) antigens, causing poor allograft survival. We report the first clinical experience with a plasma cell depleting therapy, bortezomib, to abrogate anti-HLA antibodies in transplantation (outside of rejection) in an attempt to improve long-term allograft survival. Methods. Eleven patients with anti-HLA alloantibodies were treated with bortezomib. All patients underwent plasmapheresis to aid in removal of antibodies and to determine the effect of bortezomib. Serial measurements of anti-HLA antibody levels were conducted weekly by single antigen bead on the Luminex platform. Results. Bortezomib treatment elicited substantial reduction in both donor-specific antibody (DSA) and non-DSA levels. Antibodies were directed against DSA in 8 of 11 cases. Mean time to antibody appearance was 2 months posttransplant. Within 22 days (median) from treatment initiation, antibody levels in 9 of 11 patients dropped to less than 1000 mean fluorescence intensity. Both patients without successful depletion had peak mean fluorescence intensity of more than 10,000. At a mean follow-up of approximately 4 months posttreatment, all patients have stable graft function. Minimal transient side effects were noticed with bortezomib in the form of gastrointestinal toxicity, thrombocytopenia, and paresthesias. Conclusions. Bortezomib therapy effectively abrogates anti-HLA antibodies. Hence, removal of antibodies by proteasome inhibition represents a new treatment strategy for transplantation and may have benefit in autoimmune-related disease.

Journal ArticleDOI
TL;DR: Comparison to published clinical trials of BS in populations without kidney disease indicates comparable weight loss but higher post-BS mortality in the USRDS sample; given the substantial contributions of obesity to excess morbidity and mortality, BS warrants prospective study as a strategy for improving outcomes before and after kidney transplantation.
Abstract: Limited data exist on the safety and efficacy of bariatric surgery (BS) in patients with kidney failure. We examined Medicare billing claims within USRDS registry data (1991–2004) to identify BS cases among renal allograft candidates and recipients. Of 188 cases, 72 were performed pre-listing, 29 on the waitlist, and 87 post-transplant. Roux-en-Y gastric bypass was the most common procedure. Thirty-day mortality after BS performed on the waitlist and post-transplant was 3.5%, and one transplant recipient lost their graft within 30 days after BS. BMI data were available for a subset and suggested median excess body weight loss of 31%-61%. Comparison to published clinical trials of BS in populations without kidney disease indicates comparable weight loss but higher post-BS mortality in the USRDS sample. Given the substantial contributions of obesity to excess morbidity and mortality, BS warrants prospective study as a strategy for improving outcomes before and after kidney transplantation.
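For context on the excess-body-weight-loss metric reported above, a small sketch of the conventional calculation (the patient values are hypothetical):

```python
def percent_ebwl(pre_kg: float, post_kg: float, ideal_kg: float) -> float:
    """Percent excess body weight loss: weight lost / excess weight * 100,
    where excess weight = preoperative weight - ideal body weight."""
    return 100.0 * (pre_kg - post_kg) / (pre_kg - ideal_kg)

# Hypothetical patient: 130 kg pre-surgery, 100 kg post, 70 kg ideal weight.
print(f"{percent_ebwl(130, 100, 70):.0f}% excess body weight loss")  # -> 50%
```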

Journal ArticleDOI
TL;DR: This work reports on the safety and efficacy of a unique autologous SC transfer technique that utilizes a Food and Drug Administration-approved contact lens (CL) as the SC substrate and carrier for patients with LSCD, and reveals a promising new technique capable of achieving ocular surface rehabilitation.
Abstract: BACKGROUND: A healthy cornea is reliant on a distinct population of stem cells (SC) that replace damaged or aging epithelium throughout life. Depletion of the SC pool or damage to the niche can result in a blinding and painful condition known as limbal-SC deficiency (LSCD). Although current treatment strategies for reconstituting the ocular surface of patients suffering LSCD are promising, they are complicated by transferring autologous or allogeneic progenitors in the presence of animal, human, and synthetic products. We report on the safety and efficacy of a unique autologous SC transfer technique that utilizes a Food and Drug Administration-approved contact lens (CL) as the SC substrate and carrier for patients with LSCD. METHODS: Three patients with LSCD due to aniridia (n=1) or posttreatment for recurrent ocular surface melanoma (n=2) were included. Limbal (n=2) or conjunctival biopsies (n=1) were harvested and progenitors expanded ex vivo on therapeutic CLs in the presence of autologous serum. Cell-laden CLs were transferred to the patient's corneal surface and clinical outcome measures were recorded (follow-up range, 8-13 months). RESULTS: A stable transparent corneal epithelium was restored in each patient. There was no recurrence of conjunctivalization or corneal vascularization, and a significant improvement in symptom score occurred in all patients. Best-corrected visual acuity was increased in all eyes after the procedure. CONCLUSION: Ex vivo expansion of ocular surface epithelium in the presence of autologous serum and transplantation with the aid of a soft CL is a promising new technique capable of achieving ocular surface rehabilitation.

Journal ArticleDOI
Valeria Sordi
TL;DR: The mechanisms by which MSCs are recruited to tissues and cross the endothelial cell layer are not yet fully understood, but it is probable that chemokines and their receptors are involved, as they are important factors known to control cell migration.
Abstract: Mesenchymal stem cells (MSCs) are the stromal component of bone marrow (BM) and, at the moment, the most promising prospect for tissue regeneration and repair. MSCs are easily obtained from BM, have the potential to differentiate into several cell types, and show immunomodulatory properties. The use of MSCs for cell therapies relies on the capacity of these cells to home and engraft long term into the appropriate target tissue. During the past decade, MSC homing capacity to BM and other organs has been reported. Although the mechanisms by which MSCs are recruited to tissues and cross the endothelial cell layer are not yet fully understood, it is probable that chemokines and their receptors are involved, as they are important factors known to control cell migration. The CXCR4-CXCL12 and CX3CR1-CX3CL1 axes, for instance, drive the crosstalk between MSCs and pancreatic islets.

Journal ArticleDOI
TL;DR: Given the poor prognosis of UCD with conservative therapy, LCT caused considerable beneficial effects, and periods of hyperammonemia and clinically relevant crises could be reduced during an observation period of up to 13 months.
Abstract: Background Urea cycle disorders (UCD) have a poor prognosis despite dietary and pharmacologic therapy, especially if the onset of the disease is within the neonatal period. They are promising target diseases for liver cell transplantation (LCT), which may be a less invasive alternative or supplement to orthotopic liver transplantation. Methods Cryopreserved hepatocytes were isolated under good manufacturing practice conditions. Four children with severe neonatal UCD (age 1 day to 3 years) received multiple intraportal infusions of cryopreserved hepatocytes from a single donor, a 9-day-old neonate. Portal vein access was achieved surgically in two children, whereas the umbilical vein was suitable for interventional catheter placement in two neonates. Cell applications were carefully monitored by means of Doppler ultrasound and portal vein pressure. Results LCT was feasible in all children. No signs of portal vein thrombosis or extrahepatic shunting were observed. All children showed metabolic stabilization during observation periods of 4 to 13 months. One child with prenatally diagnosed ornithine transcarbamylase deficiency died after 4 months from a fatal metabolic decompensation. Conclusions Given the poor prognosis of UCD under conservative therapy, LCT produced considerable beneficial effects: periods of hyperammonemia and clinically relevant crises were reduced during an observation period of up to 13 months. Though cell therapy is not a permanent therapeutic option, bridging to liver transplantation may be substantially improved.

Journal ArticleDOI
TL;DR: A combination of two factors, namely the tumor size and the DCP level, was found to be useful for expanding the selection of LDLT candidates for HCC.
Abstract: Background. Because many patients who did not meet the Milan criteria have survived long after undergoing living donor liver transplantation (LDLT), extended criteria for recipients with hepatocellular carcinoma (HCC) are considered necessary. Methods and Results. A total of 90 consecutive adult LDLT recipients with HCC between 1996 and 2007 were reviewed. The recurrence-free survival rates of all 90 patients were 86.0%, 81.3%, and 81.3% at 1, 3, and 5 years, respectively. Fourteen of 90 patients developed a recurrence of tumor after the LDLT. The tumor recurrences were diagnosed within 1 year after the LDLT in 11 (78.6%) patients. In a multivariate analysis, both a tumor size of less than 5 cm (P=0.0202) and a des-gamma-carboxy prothrombin (DCP) level of less than 300 mAU/mL (P=0.0001) were found to be independent factors favoring freedom from recurrence of HCC after LDLT. Therefore, the authors devised new selection criteria for HCC patients (a tumor size of <5 cm or a DCP of <300 mAU/mL). The 1-, 3-, and 5-year overall or recurrence-free survival rates of the 85 patients who met the new criteria were 92.3%, 85.9%, and 82.7%, or 90.5%, 87.0%, and 87.0%, respectively, which were significantly different from those of the five patients who did not meet the new criteria (P<0.0001). Conclusions. A combination of two factors, namely the tumor size and the DCP level, was found to be useful for expanding the selection of LDLT candidates for HCC.

Journal ArticleDOI
TL;DR: Symptomatic CARV-infection increases the risk for new onset of BOS, but not progression, and risk to develop BOS was especially increased after paramyxovirus infection.
Abstract: Background The impact of community-acquired respiratory virus (CARV) infections on bronchiolitis obliterans syndrome (BOS) and outcome after lung transplantation (LTx) was prospectively evaluated, along with diagnostic techniques. Methods A single-center prospective cohort study was performed in LTx outpatients between October 31, 2005 and April 30, 2006. Symptoms of respiratory tract infections were recorded, and nasopharyngeal and oropharyngeal swabs were obtained. Lower respiratory sampling was performed when indicated. Immunofluorescence testing, cultures, and polymerase chain reaction for 12 different CARV were applied. Patients were followed up until December 31, 2007. New-onset BOS and BOS stage were recorded 1 year after presentation. Results Three hundred eighty-eight LTx recipients were screened. Fifty-one percent reported symptoms of respiratory tract infection. Seven hundred seventy upper and 180 lower respiratory samples were obtained. Thirty-four CARV were detected in 30 patients (7.7%): 12 parainfluenza, 7 respiratory syncytial virus, 6 metapneumovirus, 5 coronavirus, 3 rhinovirus, and 1 influenza virus. At 1 year, 43 new cases of BOS developed. One-year incidence of BOS was 25.0% in CARV-positive versus 9.0% in CARV-negative patients (log-rank P=0.01). Symptomatic CARV infection proved to be a significant covariate for 1-year BOS-free survival in multivariate analysis (P=0.002, adjusted hazard ratio 4.13). CARV infection did not influence BOS progression in 88 patients with prior BOS (P=0.45). After paramyxovirus infection, 8 of 24 patients developed new-onset BOS, whereas no case was recorded after rhinovirus and coronavirus infection. Discussion Surveillance detected CARV in LTx outpatients infrequently. Symptomatic CARV infection increases the risk for new onset of BOS, but not progression. Risk of developing BOS was especially increased after paramyxovirus infection.

Journal ArticleDOI
TL;DR: The accumulated experience indicates a reduction in the incidence of donor complications, especially for right lobe resection, during 18 years of living donor liver transplantation experience in Japan.
Abstract: Background The Japanese Liver Transplantation Society presented its first report on donor morbidity in 2003. The Society has been continuing to survey outcomes in living liver donors in Japan. Methods By using a uniform comprehensive medical record review process, data were collected on 3565 living liver donors who had donated grafts by the end of December 2006 at 38 Japanese centers. Results Preoperative problems were reported in 2 donors, intraoperative problems in 27, and postoperative complications in 270. In total, 299 donors (8.4%) suffered complications related to liver donation. Postoperative complications included biliary complications in 3.0%, reoperation in 1.3%, severe after-effects in two (0.06%), and death (apparently related to donor surgery) in one donor (0.03%). The incidence of postoperative complications in left and right lobe donors was 8.7% and 9.4%, respectively. Conclusions The accumulated experience indicates a reduction in the incidence of donor complications, especially for right lobe resection. One donor death and two cases of severe after-effects related to liver donation have been reported during 18 years of living donor liver transplantation experience in Japan.

Journal ArticleDOI
TL;DR: It is demonstrated for the first time that human β-cell function is compatible with encapsulation in a durable, immunoprotective device, suggesting that encapsulation of β-cells before terminal differentiation will be a successful approach for new cell-based therapies for diabetes, such as those derived from stem cells.
Abstract: The success of islet transplantation for the treatment of type I diabetes is hindered by the need for chronic immunosuppression. There is evidence that immunosuppressive drugs not only increase patient morbidity from infectious diseases and malignancy, but also exert dysregulatory effects on the process of β-cell regeneration (1). Encapsulation of cellular transplants has the potential to reduce or eliminate the need for immunosuppression. The technology can be divided into two major categories: microencapsulation and macroencapsulation. Both types consist of semipermeable membranes that allow for the diffusion of nutrients and therapeutic molecules, such as insulin, while preventing the free exchange of cells (2–4). The majority of islet encapsulation studies to date have been performed with microcapsules, which contain one or a few islets, thereby providing a beneficial surface-to-volume ratio for diffusion (5). Multiple issues, however, impact the choice of an encapsulation modality for β-cell replacement. The paucity of available tissue for islet transplantation has stimulated efforts to derive human β-cells from alternate sources such as stem cells. Concern that stem cell-derived tissue may harbor undifferentiated cells with tumorigenic potential, or that cells expanded ex vivo may acquire tumorigenicity, has been raised as an objection to their clinical use (6). This issue was highlighted by evidence that embryonic stem cells cultured in vitro undergo selection for growth-promoting genetic events such as c-myc amplification (7). Microencapsulation seems an unsuitable treatment option because the microcapsules are synthesized from semi-solid materials such as alginate and polyethylene glycol (PEG) (8, 9), with unavoidable capsule breakage over time (10). In contrast, a durable immunoprotective device could serve as a platform for safely administering ex vivo-derived cell therapies. The TheraCyte macroencapsulation device is a planar pouch featuring a bilaminar polytetrafluorethylene membrane system. The outer layer promotes tissue integration, whereas an inner, cell-impermeable membrane has a 0.4 µm pore size (11). The durability of this encapsulation device has been exploited to sequester transformed cells in cancer vaccine studies (12, 13). Moreover, its subcutaneous placement allows cells to be transplanted in a minimally invasive manner and retrieved if necessary (12). The device is biologically inert; when transplanted into human patients for a year, it caused no adverse effects (14). The immunoprotective qualities of the device in allograft and autoimmune settings have been examined in limited, qualitative studies in which the degree of tissue survival was not measured (14–16). When a transformed murine β-cell line was encapsulated and transplanted into the NOD mouse model of type I (autoimmune) diabetes, cells survived, but it is unclear whether immunoprotection was complete or whether the proliferative rate of the cell line simply outpaced destruction by the immune system. Therefore, it is important that the survival and function of encapsulated primary β-cells, which exhibit limited proliferative capacity, be quantitated in allo- and autoimmune environments. Early reports that the device provided xenograft protection (17–19) were not confirmed by others, including the device manufacturers (20–22, and Pamela Itkin-Ansari, personal communication).
Human β-cell encapsulation poses a particular challenge because of the extreme sensitivity of islets to hypoxic environments, such as exist in the early posttransplant period before new blood vessels form (23–25). Previously, we reported that a cell line derived from human islets is capable of long-term survival inside the TheraCyte device (26). Thus, we wanted to extend those studies to primary human β-cells. Interestingly, islet-like cell clusters (ICCs) from 18- to 24-week human fetal pancreas, rich in endocrine progenitors, frequently function better in transplantation models than mature human islets (27, 28). This study was designed to test the hypothesis that macroencapsulated human β-cell precursors transplanted into severe combined immunodeficiency (SCID) mice can survive and mature into functional β-cells in vivo. In addition, we sought to apply bioluminescent imaging (BLI) to the measurement of encapsulated murine islet survival in real time in both allo- and autoimmune settings. The goal of the study is to identify a platform for the safe administration of stem cell-derived therapies.

Journal ArticleDOI
TL;DR: In vitro studies showed that lymphoblasts and endothelial cells derived from HLA-E/huβ2m transgenic pigs are effectively protected against human NK cell-mediated cytotoxicity, depending on the level of CD94/NKG2A expression on the NK cells.
Abstract: BACKGROUND: Natural killer (NK) cells participate in pig-to-primate xenograft rejection both by antibody-dependent and -independent mechanisms. A majority of human NK cells express the inhibitory receptor CD94/NKG2A, which binds specifically to human leukocyte antigen (HLA)-E, a trimeric complex consisting of the HLA-E heavy chain, β2-microglobulin (β2m), and a peptide derived from the leader sequence of some major histocompatibility complex class I molecules. METHODS: To use this mechanism for protection of pig tissues against human NK cell-mediated cytotoxicity, we generated transgenic pigs by pronuclear microinjection of genomic fragments of HLA-E with an HLA-B7 signal sequence and of human β2-microglobulin (huβ2m) into zygotes. RESULTS: Three transgenic founder pigs were generated. Northern blot analysis of RNA from peripheral blood mononuclear cells revealed the presence of the expected transcript sizes for both transgenes in two of the three founders. The founder with the highest expression and his offspring were characterized in detail. Fluorescence-activated cell sorting (FACS) and Western blot analyses demonstrated consistent expression of HLA-E and huβ2m in peripheral blood mononuclear cells. Immunohistochemistry revealed the presence of HLA-E and huβ2m on endothelial cells of many organs, including heart and kidney. In vitro studies showed that lymphoblasts and endothelial cells derived from HLA-E/huβ2m transgenic pigs are effectively protected against human NK cell-mediated cytotoxicity, depending on the level of CD94/NKG2A expression on the NK cells. Further, HLA-E/huβ2m expression on porcine endothelial cells inhibited the secretion of interferon (IFN)-γ by co-cultured human NK cells. CONCLUSIONS: This novel approach against cell-mediated xenogeneic responses has important implications for the generation of multitransgenic pigs as organ donors for clinical xenotransplantation.

Journal ArticleDOI
TL;DR: This is the first study comparing the in vivo behavior of both cell types in the infarcted heart, and it shows that ASC and MSC do not tolerate the cardiac environment well, resulting in acute donor cell death and a subsequent loss of cardiac function similar to control groups.
Abstract: Background Mesenchymal stem cells hold promise for cardiovascular regenerative therapy. Derivation of these cells from adipose tissue might be easier than from bone marrow. However, the in vivo fate and function of adipose stromal cells (ASC) in the infarcted heart has never been compared directly with bone marrow-derived mesenchymal stem cells (MSC).

Journal ArticleDOI
TL;DR: Data is reviewed supporting a new MSC immunoregulation pathway, in which the key molecule is the human leukocyte antigen-G protein, which was initially found on trophoblasts, where it contributes to tolerance at the materno-fetal interface.
Abstract: Adult bone marrow-derived mesenchymal stem cells (MSCs) are multipotential cells capable of regenerating injured tissues. In addition to their multipotency, MSCs inhibit natural killer cell cytotoxicity and T-lymphocyte alloproliferation. Several immunosuppressive mechanisms have been described, including indoleamine 2,3-dioxygenase-induced depletion of tryptophan from the lymphocyte environment, and the secretion of prostaglandin E2 and other immunosuppressive factors. Here, we review data supporting a new MSC immunoregulation pathway, in which the key molecule is the human leukocyte antigen-G protein. This nonclassical human leukocyte antigen class I molecule was initially found on trophoblasts, where it contributes to tolerance at the materno-fetal interface. Because trophoblasts are also able to express indoleamine 2,3-dioxygenase and prostaglandin E2, MSC immunomodulatory properties are similar to those of trophoblasts. These mechanisms should be explored in relation to induction of tolerance to alloantigens for the prevention of graft rejection after transplantation.

Journal ArticleDOI
TL;DR: The data suggest that irrespective of the presence or absence of the adaptive immune response, early or late xenograft rejection is associated with activation of the innate immune system and porcine endothelial cell activation and primate TF expression by recipient innate immune cells may both contribute to the development of TM.
Abstract: Background. The role of the innate immune system in the development of thrombotic microangiopathy (TM) after α1,3-galactosyltransferase gene-knockout (GTKO) pig organ transplantation in primates is uncertain. Methods. Twelve organs (nine hearts, three kidneys) from GTKO pigs were transplanted into baboons that received no immunosuppressive therapy, partial regimens, or a full regimen based on costimulation blockade. After graft failure, histologic and immunohistologic examinations were carried out. Results. Graft survival of less than 1 day was prolonged to 2 to 12 days with partial regimens (acute humoral xenograft rejection) and to 5 and 8 weeks with the full regimen (TM). Clinical or laboratory features of consumptive coagulopathy occurred in 7 of 12 baboons. Immunohistochemistry demonstrated IgM, IgG, and complement deposition in most cases. Histopathology demonstrated neutrophil and macrophage infiltrates, intravascular fibrin deposition, and platelet aggregation (TM). Grafts showed expression of primate tissue factor (TF), with increased mRNA levels, and TF was also expressed on baboon macrophages/monocytes infiltrating the graft. Conclusions. Our data suggest that (1) irrespective of the presence or absence of the adaptive immune response, early or late xenograft rejection is associated with activation of the innate immune system; and (2) porcine endothelial cell activation and primate TF expression by recipient innate immune cells may both contribute to the development of TM.

Journal ArticleDOI
TL;DR: Reduced incidence of de novo posttransplant malignancies in organ transplant recipients given mammalian target of rapamycin inhibitors (mTORi), and remission/regression of the commonest posttransplant tumors with mTOR therapy, are strong reasons to expand the use of mTORi.
Abstract: Organ transplant recipients given mammalian target of rapamycin inhibitors (mTORi) have a reduced incidence of de novo posttransplant malignancies (dNPTM). Posttransplant Kaposi's sarcoma and nonmelanotic skin malignancies (NMSC) frequently undergo remission/regression after conversion to mTORi immunosuppression (IS), especially early, small, and low-grade lesions, whereas larger, aggressive, and metastatic skin tumors are less likely to respond. mTORi-based IS is effective and well tolerated in orthotopic liver transplant patients with hepatocellular carcinoma (HCC), achieving excellent survival and disease-free intervals, particularly with extended criteria tumors, although the evidence that mTORi prevents HCC recurrence after orthotopic liver transplantation is only suggestive. Regression of metastatic HCC and other tumors and various forms of posttransplant lymphoproliferative disease have occurred after mTORi conversion. Documentation of regression/remission of other solid-organ dNPTM (colon, stomach, breast, etc.) after mTORi conversion is essentially absent, with only anecdotal reports lacking follow-up data. Unfortunately, there is not a single reported prospective clinical trial powered to look at the effect of mTORi IS in transplant recipients. Nevertheless, the reduced incidence of dNPTM and the remission/regression of the commonest posttransplant tumors with mTORi therapy are strong reasons to expand the use of mTORi.