
Showing papers in "Transplantation in 1999"


Journal ArticleDOI
TL;DR: The use of small-for-size grafts (less than 1% of recipient body weight) leads to lower graft survival, probably through enhanced parenchymal cell injury and reduced metabolic and synthetic capacity.
Abstract: Background. Although living donor liver transplantation for small pediatric patients is increasingly accepted, its expansion to older/larger patients is still in question because of the lack of sufficient information on the impact of graft size mismatching. Methods. A total of 276 cases of living donor liver transplantation, excluding ABO-incompatible, auxiliary, or secondary transplants, were reviewed with respect to graft size matching. Forty-three were highly urgent cases receiving intensive care preoperatively. Cases were categorized into five groups by graft-to-recipient weight ratio (GRWR): extra-small-for-size (XS; GRWR<0.8%, 17 elective and 4 urgent cases), small (S; 0.8
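
A minimal Python sketch of the graft-size arithmetic this abstract describes; the 0.8% extra-small cutoff is from the abstract and the ~1% small-for-size threshold from the TL;DR, while the function names, example values, and the upper category bounds (truncated in the listing) are illustrative assumptions.

```python
def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio (GRWR), expressed in percent."""
    return graft_weight_g / (recipient_weight_kg * 1000.0) * 100.0

def size_category(grwr: float) -> str:
    # Cutoffs from the entry: XS below 0.8%; the TL;DR treats <1% as small-for-size.
    if grwr < 0.8:
        return "extra-small-for-size (XS)"
    if grwr < 1.0:
        return "small (S)"
    return "medium or larger"  # further bounds are truncated in the abstract

# Example: a 450 g graft in a 62 kg recipient -> GRWR ~0.73% -> XS
print(size_category(grwr_percent(450, 62)))
```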

878 citations


Journal ArticleDOI
TL;DR: Sirolimus (rapamycin) is a potent immunosuppressant with a mechanism of action different from cyclosporine (CsA) or tacrolimus as discussed by the authors.
Abstract: Background. Sirolimus (rapamycin) is a potent immunosuppressant with a mechanism of action different from cyclosporine (CsA) or tacrolimus. Methods. In 11 European centers, first cadaveric renal allograft recipients were randomized to CsA (n=42) or sirolimus (n=41). Dosing of these agents was concentration-controlled and open-labeled. All patients received corticosteroids and azathioprine. Results. At 12 months, graft survival (98% sirolimus vs. 90% CsA), patient survival (100% vs. 98%), and incidence of biopsy-confirmed acute rejection (41% vs. 38%) were similar. Serum creatinine was lower with sirolimus, significantly (P<0.05) so at 3 and 4 months, and serum uric acid and magnesium were normal. Laboratory abnormalities reported significantly more often with sirolimus included hypertriglyceridemia (51% vs. 12%), hypercholesterolemia (44% vs. 14%), thrombocytopenia (37% vs. 0%), leukopenia (39% vs. 14%), and, of lesser importance, increased liver enzymes and hypokalemia. These abnormalities improved 2 months after transplantation when the sirolimus target trough level was lowered from 30 to 15 ng/ml. Occurrence of cytomegalovirus was comparable (14% vs. 12%); incidences of herpes simplex (24% vs. 10%, P=0.08) and pneumonia (17% vs. 2%, P=0.03) were higher with sirolimus. No gingival hyperplasia was seen with sirolimus, tremor was rare, and hypertension was less frequent (17% vs. 33%). Two malignancies were observed with CsA and none with sirolimus. Conclusions. Results at 12 months suggest that sirolimus can be used as base therapy in the prophylaxis of acute renal transplant rejection, and has a safety profile that differs from CsA.

878 citations


Journal ArticleDOI
TL;DR: Polyoma virus tubulo-interstitial nephritis-associated graft dysfunction usually calls for judicious decrease in immunosuppression and monitoring for acute rejection, and development of methods to serially quantify the viral load in individual patients could potentially improve clinical outcome.
Abstract: Background. Asymptomatic polyoma virus infection documented by urine cytology or serology is well known, but the clinical course of biopsy-proven interstitial nephritis is not well defined. Methods. Twenty-two cases were identified by histology, immunostaining, in situ hybridization, electron microscopy, or polymerase chain reaction. Results. The clinical features mimicked acute rejection (n=19), chronic rejection with incidental diagnosis at nephrectomy (n=2), or drug toxicity (n=1). Histology showed homogenous intranuclear inclusions. In situ hybridization showed BK virus (BKV) to be the predominant species, but polymerase chain reaction documented JC virus co-infection in one of five cases so tested. Electron microscopy in seven cases showed 20-51-nm virions. The two cases diagnosed at nephrectomy received no therapy. Initial antirejection therapy in 12 cases led to clearance of the virus in 1/12 (8%), partial therapeutic response in 3/12 (25%), and graft loss in 8/12 (67%) cases. The last recorded creatinine in patients with functional grafts ranged from 1.9 to 7.0 (median: 4.5) mg/dl, 0.4-45 (median: 4.0) months after initial diagnosis. The remaining eight cases treated by reduction of immunosuppression at the outset have been free of graft loss for 0.2-10.0 (median: 4.8) months since diagnosis, and clearance of virus has been documented in three of six (50%) cases. The serum creatinine in these patients is 1.7-6.0 (median: 2.4) mg/dl, 0.2-10 (median: 4.8) months after diagnosis. Follow-up biopsies performed 1-23.5 months after diagnosis show chronic allograft nephropathy. Conclusions. Polyoma virus tubulo-interstitial nephritis-associated graft dysfunction usually calls for judicious decrease in immunosuppression and monitoring for acute rejection. Development of methods to serially quantify the viral load in individual patients could potentially improve clinical outcome.

492 citations


Journal ArticleDOI
TL;DR: This review summarizes the current knowledge of EBV-PTLD and, as a result of two separate international meetings on this topic, provides recommendations for future areas of study.
Abstract: Epstein-Barr virus-induced posttransplant lymphoproliferative disease (EBV-PTLD) continues to be a major complication after solid organ transplantation in high-risk patients. Despite the identification of risk factors that predispose patients to develop EBV-PTLD, limitations in our knowledge of its pathogenesis, variable criteria for establishing the diagnosis, and lack of randomized studies addressing the prevention and treatment of EBV-PTLD hamper the optimal management of this transplant complication. This review summarizes the current knowledge of EBV-PTLD and, as a result of two separate international meetings on this topic, provides recommendations for future areas of study.

483 citations


Journal ArticleDOI
TL;DR: Prophylactic basiliximab therapy is well tolerated, has an adverse event profile comparable to placebo, and significantly reduces the number of acute rejection episodes in renal allograft patients within the first year after transplantation.
Abstract: Background A double-blind, placebo-controlled phase III study was performed to assess whether basiliximab, a chimeric anti-interleukin-2 receptor monoclonal antibody, reduced the incidence of acute rejection episodes in renal allograft recipients. Methods A total of 348 patients were randomized into two demographically matched, equally sized groups treated with either basiliximab or placebo. The dose of basiliximab (20-mg infusions on day 0 and day 4) was selected to block detection of interleukin-2 receptor on 97% of peripheral blood lymphocytes for 30-45 days. All patients received immunosuppressive therapy with cyclosporine microemulsion (Neoral) and steroids. An intent-to-treat analysis of 1-year data assessed the incidence of posttransplant acute rejection episodes, patient and graft survival rates, and the safety and tolerability of basiliximab. Results Among the 346 eligible patients equally divided into the two treatment groups, basiliximab reduced the proportion of patients who experienced biopsy-confirmed acute rejection episodes by 28%: 61 (35.3%) basiliximab vs. 85 (49.1%) placebo (P=0.009). Graft losses occurred in 9 (5.2%) basiliximab-treated and 12 (6.9%) placebo-treated patients. Five (2.9%) deaths in the basiliximab group and seven (4.0%) in the placebo group occurred. Compared with placebo, a higher fraction of basiliximab patients produced urine in the operating room, and a significantly lower fraction had renal dysfunction in the first month (serum creatinine > or =5 mg/dl) and between 1 and 12 months (serum creatinine > or =3 mg/dl). During the first 12 months, 94 (54%) basiliximab-treated patients experienced serious adverse events, compared with 106 (61%) who received placebo. Conclusions Prophylactic basiliximab therapy is well tolerated, has an adverse event profile comparable to placebo, and significantly reduces the number of acute rejection episodes in renal allograft patients within the first year after transplantation.
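
The 28% figure is a relative, not absolute, reduction in biopsy-confirmed rejection; it can be reproduced from the reported counts. A quick check in Python, taking each arm as 173 of the 346 eligible patients:

```python
rejection_basiliximab = 61 / 173   # 35.3%
rejection_placebo = 85 / 173       # 49.1%
relative_reduction = (rejection_placebo - rejection_basiliximab) / rejection_placebo
print(f"{relative_reduction:.0%}")  # -> 28%
```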

472 citations


Journal ArticleDOI
TL;DR: Recurrent rejection episodes and high dose immunosuppressive therapy, including tacrolimus, are risk factors for manifest PV kidney graft infection, which has an ominous prognosis.
Abstract: Background. Manifest polyomavirus (PV) renal graft infection is a rare complication. We diagnosed 5 cases among 70 kidney recipients undergoing transplants since December 1995; however, there were no cases at our institution before December 1995. Methods. To identify risk factors promoting manifest PV graft infection, we compared those 5 patients with kidney recipients who had signs of PV replication but no manifest graft infection (n=23, control group). PV replication was judged by the presence of intranuclear inclusion cells in the urine. Results. Before the infection, five of five patients had recurrent rejection episodes. All were switched from cyclosporine A to high dose tacrolimus as rescue therapy. Infection was diagnosed histologically 9±2 months posttransplantation; it persisted and led to graft loss in four of five patients. In control patients, graft function was stable, 1 of 23 patients was switched to tacrolimus as rescue therapy, and graft loss occurred in 4 of 23 patients. Conclusion. Recurrent rejection episodes and high dose immunosuppressive therapy, including tacrolimus, are risk factors for manifest PV kidney graft infection, which has an ominous prognosis.

439 citations


Journal ArticleDOI
TL;DR: Living donor liver transplantation (LDLT) is currently performed at about 30 centers in the United States, and recipient survival after adult LDLT in the United States is approximately 88%.
Abstract: Background. The shortage of livers for transplantation has prompted transplant centers to seek alternatives to conventional cadaveric liver transplantation. Left lateral segmentectomy from living donors has proven to be a safe operation for the donor, with excellent results in the pediatric population. Left lobectomy, conceived to supply more tissue, still provides insufficient liver mass for an average size adult patient. Right lobectomy could supply a graft of adequate size. Methods. Donors were considered only after recipients were listed according to United Network for Organ Sharing (UNOS) criteria. Donor evaluation included liver biopsy, magnetic resonance imaging, and celiac and mesenteric angiography. The donor operation consisted of a right lobectomy uniformly performed throughout the series as described herein. Results. Twenty-five right lobe living donor liver transplants were performed between adults, with no significant complications in donors. Recipient and graft survival was 88%, with three recipient deaths secondary to uncontrolled sepsis in patients at high risk for liver transplant; all three had functioning grafts. Conclusions. Right lobe living donor liver transplantation poses challenges that require a meticulous surgical technique to minimize morbidity in the recipient. Right lobectomies for living donation can be performed safely, with minimal risk to both donor and recipient, while providing adequate liver mass for an average size adult patient.

410 citations


Journal ArticleDOI
TL;DR: This is the first report of the deliberate induction of mixed lymphohematopoietic chimerism after a nonmyeloablative preparative regimen to treat a hematological malignancy and to provide allotolerance for a solid organ transplant.
Abstract: Background Experimental and clinical evidence has demonstrated that the establishment of allogeneic chimerism after bone marrow transplantation may provide donor-specific tolerance for solid organ allografts. Methods Based on the preliminary results of a clinical trial using nonmyeloablative preparative therapy for the induction of mixed lymphohematopoietic chimerism, we treated a 55-year-old woman with end-stage renal disease secondary to multiple myeloma with a combined histocompatibility leukocyte antigen-matched bone marrow and renal transplant after conditioning with cyclophosphamide, antithymocyte globulin, and thymic irradiation. Results The posttransplant course was notable for early normalization of renal function, the absence of acute graft-versus-host disease, and the establishment of mixed lymphohematopoietic chimerism. Cyclosporine, which was the only posttransplant immunosuppressive therapy, was tapered and discontinued on day +73 posttransplant. No rejection episodes occurred, and renal function remains normal on day +170 posttransplant (14 weeks after discontinuing cyclosporine). Although there is presently no evidence of donor hematopoiesis, there is evidence of an ongoing antitumor response, with a recent staging evaluation showing no measurable urine kappa light chains. The patient remains clinically well and is off all immunosuppressive therapy. Conclusion This is the first report of the deliberate induction of mixed lymphohematopoietic chimerism after a nonmyeloablative preparative regimen to treat a hematological malignancy and to provide allotolerance for a solid organ transplant.

394 citations


Journal ArticleDOI
TL;DR: The data suggest that a target AUC0-12 of 9500-11500 or AUC0-4 of 4400-5500 microg x h/L may provide optimal Neoral immunosuppression, and that early AUC based on PK0-4 is more closely associated with AR and CsANT than is C0.
Abstract: Background Cyclosporine (CsA) dosing is traditionally based on trough blood levels (C0) rather than area under the concentration-time curve (AUC), although AUC correlates better with posttransplantation clinical events. For Neoral, AUC based on limited sampling correlates closely with full 12-hr AUC. The purpose of our study was to correlate C0 with AUC based on CsA levels at 0, 1, 2, 3, and 4 hr after dose (PK0-4) and to compare this AUC with C0 in predicting acute rejection (AR) and acute cyclosporine nephrotoxicity (CsANT) in de novo first kidney transplant patients. Methods PK0-4 was done 2-4 days after starting Neoral for 156 patients. All received CsA-based triple-drug immunosuppression without antibody induction. AUC was calculated as projected 12-hr (AUC0-12) and actual 4-hr (AUC0-4) values from the PK0-4 using the trapezoidal rule. Neoral dosing was based on C0, not AUC. AUC was retrospectively compared with C0 as a predictor of AR and CsANT during the first 90 days. Results C0 correlated poorly with AUC0-12 and AUC0-4 (r=0.61 and r=0.42). C0 (mean+/-SEM) levels were not significantly different in 34 patients with and 109 without AR (293+/-21 vs. 294+/-11 microg/L, P=0.95). AUC0-12 and AUC0-4 were significantly lower in patients with than without AR (AUC0-12 9090+/-598 vs. 10608+/-336 microg x h/L, P=0.01; AUC0-4 3934+/-306 vs. 4802+/-166 microg x h/L, P=0.006). In stepwise regression analysis, only AUC0-12 or AUC0-4 (P=0.03/P=0.02) and delayed graft function (P=0.007) predicted AR. AUC0-12, AUC0-4, and C0 were all significantly higher in patients with CsANT than without CsANT (AUC0-12 11746+/-650 vs. 10023+/-301 microg x h/L, P=0.01; AUC0-4 5270+/-358 vs. 4474+/-150 microg x h/L, P=0.01; C0 343+/-18 vs. 287+/-10 microg/L, P=0.01), but in stepwise regression analysis C0 was not an independent predictor of CsANT. Patients with AUC0-12 in the range of 9500 to 11500 microg x h/L or AUC0-4 between 4400 and 5500 microg x h/L had the lowest incidence of AR (13% and 7%, respectively) without significantly higher risk for CsANT. Conclusion C0 correlates poorly with AUC based on PK0-4. Early AUC based on PK0-4 is more closely associated with AR and CsANT than is C0. Our data suggest that a target AUC0-12 of 9500-11500 or AUC0-4 of 4400-5500 microg x h/L may provide optimal Neoral immunosuppression.
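
A minimal sketch of the limited-sampling AUC0-4 computation by the trapezoidal rule, as used in this study; the concentration profile below is invented for illustration, and the projection to AUC0-12 is omitted because the abstract does not give the authors' regression:

```python
def auc_trapezoidal(times_hr, conc_ug_per_L):
    """AUC by the linear trapezoidal rule, in microg x h/L."""
    return sum((t1 - t0) * (c0 + c1) / 2.0
               for (t0, c0), (t1, c1) in zip(zip(times_hr, conc_ug_per_L),
                                             zip(times_hr[1:], conc_ug_per_L[1:])))

times = [0, 1, 2, 3, 4]                # PK0-4 sampling times (hr)
levels = [290, 1400, 1600, 1100, 800]  # hypothetical CsA levels (microg/L)
auc_0_4 = auc_trapezoidal(times, levels)
# The abstract's proposed target window for AUC0-4 is 4400-5500 microg x h/L.
print(auc_0_4, 4400 <= auc_0_4 <= 5500)  # -> 4645.0 True
```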

378 citations


Journal ArticleDOI
TL;DR: In transplant patients, MCC probably proved to be more aggressive than in the general population in that 68% of patients developed lymph node metastases and 56% died of their malignancies.
Abstract: In the general population, Merkel’s cell carcinoma (MCC) is an aggressive neuroendocrine skin cancer. More than 600 cases have been reported. MCC seems to be common in transplant recipients, with 41 cases being reported to the Cincinnati Transplant Tumor Registry, and another 11 in the transplant literature. In the general population, it is a disease of older adults, with only 5% of cases occurring below the age of 50 years. In transplant patients, the mean age at diagnosis was 53 (range 33-78) years, and 29% of recipients were <50 years old. The tumor appeared from 5 to 286 (mean 91.5) months after the transplant. Of 44 lesions that occurred in 41 patients, the distribution was similar to that seen in the general population, with 36% occurring on the head and neck, 32% on the upper extremities, 16% on the trunk, 9% at unknown sites, and 7% on the lower extremities. Twenty of the patients (49%) had 22 other malignancies, the great majority of which (91%) were other skin cancers. Treatment depended on the stage of the disease and included wide surgical excision, radical lymph node dissection, radiation therapy, and chemotherapy. In transplant patients, MCC probably proved to be more aggressive than in the general population in that 68% of patients developed lymph node metastases and 56% died of their malignancies. Furthermore, one third of surviving patients still have active cancers from which they may die. Also, follow-up of survivors has been relatively short, with a mean of only 18 (range 0-135) months.

367 citations


Journal ArticleDOI
TL;DR: Right lobe grafting is a safe and effective procedure that expands the indication for LDLT to larger-size recipients; attention should be paid throughout the procedure to possible variation in the anatomy of the right lobe graft.
Abstract: Background. For the sake of donor safety in living donor liver transplantation (LDLT), the left lobe is currently being used most often for the graft. However, size mismatch has been a major obstacle to expanding the indication for LDLT to larger-size recipients, because a left lobe graft is not safe enough for them. Methods. In 1998, LDLT using a right lobe graft was introduced and performed on 26 recipients to overcome the small-for-size problem. The right lobe, which does not include the middle hepatic vein of the donor, was used. Initially, the indication for right lobe LDLT was basically defined as an estimated left lobe graft volume/recipient body weight ratio (GRWR) of less than 0.8%. Results. Right lobe grafts with a GRWR of more than 0.8% were implanted in all recipients, except for two, who received relatively smaller right lobes (GRWR of 0.68% and 0.66%). In one of these two, the right lobe from the donor was used as the orthotopic auxiliary graft. Postoperative transitory increases in total bilirubin and aspartate aminotransferase for right lobe donors were higher than those for left lateral segmentectomy donors. Nineteen recipients (73.1%) were successfully treated with this procedure. The causes of death were not specific to right lobe LDLT, except for one patient with a graft that had multiple hepatic venous orifices. These multiple and separate anastomoses of the hepatic veins caused an outflow block as a result of a positional shift of the graft, which finally led to graft loss. Conclusion. Our experience suggests that right lobe grafting is a safe and effective procedure, resulting in the expansion of the indication for LDLT to larger-size recipients. How to deal with the possible variation in the anatomy of the right lobe graft should be given attention throughout the procedure.

Journal ArticleDOI
TL;DR: Brief (7-day) induction with Thymoglobulin resulted in less frequent and less severe rejection, a better event-free survival, less cytomegalovirus disease, fewer serious adverse events, but more frequent early leukopenia than induction with Atgam.
Abstract: Background The aim of this study was to compare the efficacy and safety of Thymoglobulin (a rabbit-derived polyclonal antibody) to Atgam (a horse-derived polyclonal antibody) for induction in adult renal transplant recipients. Methods Transplant recipients (n=72) were randomized 2:1 in a double-blinded fashion to receive Thymoglobulin (n=48) at 1.5 mg/kg intravenously or Atgam (n=24) at 15 mg/kg intravenously, intraoperatively, then daily for at least 6 days. Recipients were observed for at least 1 year of follow-up. Results By 1 year after transplantation, 4% of Thymoglobulin-treated patients experienced acute rejection compared with 25% of Atgam-treated patients (P=0.014). The rate of acute rejection was lower with Thymoglobulin than Atgam (relative risk=0.09; P=0.009). Rejection was less severe with Thymoglobulin than Atgam (P=0.02). No recurrent rejection occurred with Thymoglobulin compared with 33% with Atgam (P=NS). Patient survival was not different, but the composite end point of freedom from death, graft loss, or rejection, the event-free survival, was superior with Thymoglobulin (94%) compared with Atgam (63%; P=0.0005). Fewer adverse events occurred with Thymoglobulin (P=0.013). Leukopenia was more common with Thymoglobulin than Atgam (56% vs. 4%; P<0.0001) during induction. The mean absolute lymphocyte count remained below baseline with Thymoglobulin throughout the study (P<0.007), but with Atgam, significant lymphocyte reductions occurred only at day 7. The incidence of cytomegalovirus disease was less with Thymoglobulin than Atgam at 6 months (10% vs. 33%; P=0.025). Conclusions Brief (7-day) induction with Thymoglobulin resulted in less frequent and less severe rejection, a better event-free survival, less cytomegalovirus disease, and fewer serious adverse events, but more frequent early leukopenia than induction with Atgam. These results may in fact be explained by a more profound and durable beneficial lymphopenia.

Journal ArticleDOI
TL;DR: In this article, the authors show that if new keratinocyte culture technologies and/or delivery systems are proposed, a careful evaluation of epidermal stem cell preservation is essential for the clinical performance of this lifesaving technology.
Abstract: Background. Cell therapy is an emerging therapeutic strategy aimed at replacing or repairing severely damaged tissues with cultured cells. Epidermal regeneration obtained with autologous cultured keratinocytes (cultured autografts) can be life-saving for patients suffering from massive full-thickness burns. However, the widespread use of cultured autografts has been hampered by poor clinical results that have been consistently reported by different burn units, even when cells were applied on properly prepared wound beds. This might arise from the depletion of epidermal stem cells (holoclones) in culture. Depletion of holoclones can occur because of (i) incorrect culture conditions, (ii) environmental damage of the exposed basal layer of cultured grafts, or (iii) use of new substrates or culture technologies not pretested for holoclone preservation. The aim of this study was to show that, if new keratinocyte culture technologies and/or "delivery systems" are proposed, a careful evaluation of epidermal stem cell preservation is essential for the clinical performance of this lifesaving technology.

Journal ArticleDOI
TL;DR: It is suggested that apoptosis of endothelial cells followed by hepatocytes is an important mechanism of cell death after ischemia/reperfusion injury in the liver.
Abstract: BACKGROUND Ischemic injury of the liver is generally considered to result in necrosis, but it has recently been recognized that mediators of apoptosis are activated during ischemia/reperfusion. This study was designed to characterize the extent and the type of cells within the liver that undergo apoptosis at different periods of ischemia and reperfusion. METHODS Male Wistar rats were subjected to 30 or 60 min of normothermic ischemia. Liver sections were evaluated at the end of ischemia and at 1, 6, 24, and 72 hr after reperfusion. Apoptosis was determined by DNA fragmentation as evaluated by laddering on gel electrophoresis, in situ staining for apoptotic cells using TdT-mediated dUTP-digoxigenin nick-end labeling (TUNEL), and morphology on electron microscopy. RESULTS In situ staining of liver biopsy specimens using TUNEL showed significant apoptosis after reperfusion. Sinusoidal endothelial cells (SEC) showed evidence of apoptosis earlier than hepatocytes. For example, at 1 hr of reperfusion after 60 min of ischemia, 22+/-4% of the SEC stained TUNEL positive compared with 2+/-1% of the hepatocytes (P<0.001). With a longer duration of ischemia, a greater number of SEC and hepatocytes became TUNEL positive. An increase in TUNEL-positive cells was also noted with an increasing duration of reperfusion. The presence of apoptotic SEC and hepatocytes was supported by DNA laddering on gel electrophoresis and cell morphology on electron microscopy. Several Kupffer cells were seen containing apoptotic bodies but did not show evidence of apoptosis. Only rare hepatocytes showed features of necrosis after 60 min of ischemia and 6 hr of reperfusion. CONCLUSION These results suggest that apoptosis of endothelial cells followed by hepatocytes is an important mechanism of cell death after ischemia/reperfusion injury in the liver.

Journal ArticleDOI
TL;DR: The structural damage to beta cells demonstrated in this study is similar to morphological and functional abnormalities previously described in experimental animal models and can at least partially account for the glucose metabolism abnormalities seen in patients receiving these drugs.
Abstract: Background The introduction of the potent immunosuppressive drugs tacrolimus (FK) and cyclosporine (CSA) has markedly improved the outcome of solid organ transplantation. However, these drugs can cause posttransplantation diabetes mellitus. Abnormalities in glucose metabolism are of particular significance in pancreas transplantation. Methods We studied 26 pancreas allograft biopsies, performed 1-8 months posttransplantation, from 20 simultaneous kidney-pancreas transplant recipients randomized to receive either FK or CSA. The biopsies were studied by light microscopy, immunoperoxidase stains for insulin and glucagon, in situ DNA-end labeling for detection of apoptosis, and electron microscopy. The islet morphology was correlated with the mean and peak levels of CSA and FK in serum, with corticosteroid administration, and with glycemia. Results On light microscopy, cytoplasmic swelling, vacuolization, apoptosis, and abnormal immunostaining for insulin were seen in biopsies from patients receiving either FK or CSA. The islet cell damage was more frequent and severe in the group receiving FK than in the group receiving CSA (10/13 and 5/13, respectively), but the differences were not statistically significant. Significant correlation was seen between the presence of islet cell damage and serum levels of CSA or FK during the 15 days previous to the biopsy, as well as with the peak level of FK. Toxic levels of CSA or FK and administration of pulse steroids were associated with hyperglycemia when these occurred concurrently (P=0.005). Toxic levels of CSA or FK by themselves were associated with hyperglycemia in a minority of cases (8 and 26%, respectively). Electron microscopy showed cytoplasmic swelling and vacuolization, and marked decrease or absence of dense-core secretory granules in beta cells; the changes were more pronounced in patients on FK. Serial biopsies from two hyperglycemic patients receiving FK and showing evidence of islet cell damage demonstrated reversibility of the damage when FK was discontinued. Conclusions The structural damage to beta cells demonstrated in this study is similar to morphological and functional abnormalities previously described in experimental animal models and can at least partially account for the glucose metabolism abnormalities seen in patients receiving these drugs. Toxic levels of CSA or FK and higher steroid doses potentiate each other's diabetogenic effects.

Journal ArticleDOI
TL;DR: CsA induces partial CN inhibition that varies directly with the blood and tissue levels, and may be greater in some tissues due to higher drug accumulation, relevant to nephrotoxicity.
Abstract: Background Cyclosporine (CsA) acts by inhibiting the phosphatase calcineurin (CN), but the time course and extent of inhibition in vivo are unknown. We examined the effect of single oral CsA doses on CN activity in humans and mice in vivo. Methods In humans, blood CsA levels were determined and CN activity was measured in whole blood and in blood leukocytes of patients up to 12 hr after CsA dosing (just before the second dose). Samples were collected from patients receiving a first single dose (2.5 mg/kg), and up to 14 days later after repeated dosing. In mice, after CsA dosing (12.5-200 mg/kg) by oral gavage, CsA levels in blood and tissue (spleen, kidney) were determined and CN activity was measured in spleen and kidney. Results In humans, peak CsA levels of 800-2285 microg/L at 1-2 hr produced 70-96% CN inhibition. Inhibition correlated closely with the rise and fall of CsA levels with no observable lag at the times sampled. Repeated doses showed similar CN inhibition to first dose, with no significant adaptation. In mice, CsA peaked at 1 hr in blood, spleen, and kidney, with higher concentrations in spleen and kidney than in blood. CN inhibition closely followed CsA concentrations/doses, and was greater in kidney than spleen. Conclusion Thus CsA induces partial CN inhibition that varies directly with the blood and tissue levels, and may be greater in some tissues due to higher drug accumulation. The high CsA concentrations and CN inhibition in kidney may be relevant to nephrotoxicity.

Journal ArticleDOI
TL;DR: In virtually all experimental studies of organ transplantation, young, healthy living animals are used as donors; in clinical practice, in contrast, a relatively low percentage of organs comes from living donors, as cadavers remain the primary source of supply.
Abstract: Transplantation has evolved as the treatment of choice for many patients with end-stage organ disease. However, despite the 80% one-year functional survival rate of most transplanted organs at the present time, the ultimate goal—to provide long-term treatment for an irreversible process—has not been achieved; the rate of attrition over time has not changed appreciably throughout the entire experience (1). Although recurrent disease, de novo infections, malignancies, and other factors may contribute to late graft deterioration, chronic rejection remains the most important etiologic factor (2). Despite well-characterized functional and morphological changes, the mechanisms leading to this progressive state remain poorly understood. Its pathophysiology has been conceptualized as stemming from both antigen-dependent and -independent risk factors (3). Although immune-mediated events are considered to be primarily responsible for the late graft changes, it seems increasingly that the influence of nonimmunological events has been underestimated. This concept has been emphasized by recent pooled United Network of Organ Sharing data that show that the survival rates of kidneys from living-unrelated and one haplotype-matched living-related donors are identical despite potentially important differences in genetic relationship with the given recipient (4). In addition, organs from all living donors demonstrate consistently superior results to those from cadaver sources over both the short- and long-term. Various nonimmunological factors that might explain these discrepancies include the effects of initial ischemia/reperfusion injury, inadequate functioning nephron mass, viral infections, and drug toxicity. Brain death is a rarely considered risk factor uniquely relevant to the cadaver donor. Multivariate analysis has emphasized that both initial and long-term results of engrafted cadaver organs may be dependent upon donor demographics and the etiology of the central injury (5). In virtually all experimental studies of organ transplantation, young, healthy living animals are used as donors; in clinical practice, in contrast, a relatively low percentage of organs comes from living donors, as cadavers remain the primary source of supply. Amongst other variables, the difference between the two donor populations implies the effect of profound physiological and structural derangements that may occur during and subsequent to brain death and before the actual engraftment procedure.

Journal ArticleDOI
TL;DR: Transplantation has become a lifesaving procedure for (1) patients with intestinal failure who cannot be maintained on total parenteral nutrition and (2) patients who require abdominal evisceration to completely remove locally aggressive tumors.
Abstract: Background. Small bowel transplantation is an evolving procedure. We reviewed the world experience since 1985 to determine the current status of this procedure. Methods. All of the known intestinal transplant programs were invited to contribute to an international registry using a standardized report form. Results. Thirty-three intestinal transplant programs provided data on 273 transplants in 260 patients who underwent transplantation on or before February 28, 1997. The number of procedures per year has increased at a linear rate since 1990, with 58 transplants performed in 1996. Two-thirds of the recipients were children or teenagers. Short gut syndrome was the most common indication for transplantation. The types of transplants included the small bowel with or without the colon (41%); the intestine and liver (48%); and multivisceral grafts (11%). The 1-year graft/patient survival for transplants performed after February 1995 was 55%/69% for intestinal grafts; 63%/66% for small bowel and liver grafts; and 63%/63% for multivisceral grafts. Transplants since 1991 and programs that had performed at least 10 transplants had significantly higher graft survival rates. Seventy-seven percent of the current survivors had stopped total parenteral nutrition (TPN) and resumed oral nutrition. Conclusions. Transplantation has become a lifesaving procedure for (1) patients with intestinal failure who cannot be maintained on total parenteral nutrition and (2) patients who require abdominal evisceration to completely remove locally aggressive tumors. In large series, the 5-year survival rate of intestinal transplantation is comparable to that of lung transplantation.

Journal ArticleDOI
TL;DR: Multivariate analysis identified risk factors for poor psychosocial outcome after donation: relatives other than first degree, and donors whose recipient died within 1 year of transplant, were more likely to say they would not donate again if it were possible.
Abstract: The University of Minnesota has been a strong advocate of living donor kidney transplants. The benefits for living donor recipients have been well documented, as has the relatively low risk of physical complications during donation. Less well understood is the psychosocial risk to donors. Most published reports have indicated an improved sense of well-being and a boost in self-esteem for living kidney donors. However, there have been some reports of depression and disrupted family relationships after donation, even suicide after a recipient's death. To determine the quality of life of our donors, we sent a questionnaire to 979 people who had donated a kidney between August 1, 1984, and December 31, 1996. Of the 60% who responded, the vast majority had an excellent quality of life. As a group, they scored higher than the national norm on the SF-36, a standardized quality of life health questionnaire. However, 4% were dissatisfied and regretted the decision to donate. Further, 4% found the experience extremely stressful and 8% very stressful. We used multivariate analysis to identify risk factors for this poor psychosocial outcome and found that relatives other than first degree (odds ratio=3.5, P=0.06) and donors whose recipient died within 1 year of transplant (odds ratio=3.3, P=0.014) were more likely to say they would not donate again if it were possible. Further, donors who had perioperative complications (odds ratio=3.5, P=0.007) and female donors (odds ratio=1.8, P=0.1) were more likely to find the overall experience more stressful. Overall, the results of this study are overwhelmingly positive and have encouraged us to continue living donor kidney transplants.
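
For readers parsing the reported effect sizes: an odds ratio of 3.5 means the odds of the outcome are about 3.5 times higher in the exposed group. The study's ORs are adjusted (multivariate), so the crude 2x2 computation below, with invented counts, only illustrates the measure itself:

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Crude odds ratio for a 2x2 table:
    a=exposed with event, b=exposed without, c=unexposed with event, d=unexposed without."""
    return (a / b) / (c / d)

# Hypothetical counts: donors whose recipient died within a year (exposed)
# vs. other donors, by whether they would not donate again.
print(odds_ratio(12, 48, 30, 420))  # -> 3.5
```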

Journal ArticleDOI
TL;DR: This novel therapy is of equal efficacy compared to conventional triple therapy, but allows the patient to be steroid-free and to be maintained on very-low-dose immunosuppressive monotherapy, which resulted in acceptable outcomes in this group of renal allograft recipients.
Abstract: Background. Campath 1H is a depleting, humanized anti-CD52 monoclonal antibody that has now been used in 31 renal allograft recipients. The results have been very encouraging and are presented herein. Methods. Campath 1H was administered, intravenously, in a dose of 20 mg, on day 0 and day 1 after renal transplant. Low-dose cyclosporine (Neoral) was then initiated at 72 hr after transplant. These patients were maintained on low-dose monotherapy with cyclosporine. Results. At present, the mean follow-up is 21 months (range: 15-28 months). All but one patient are alive and 29 have intact functioning grafts. There have been six separate episodes of steroid-responsive rejection. One patient has had a recurrence of her original disease. Two patients have suffered from opportunistic infections, which responded to therapy. One patient has died secondary to ischemic cardiac failure. Conclusions. Campath 1H has resulted in acceptable outcomes in this group of renal allograft recipients. This novel therapy is of equal efficacy compared to conventional triple therapy, but allows the patient to be steroid-free and to be maintained on very-low-dose immunosuppressive monotherapy.

Journal ArticleDOI
B. Nashan, S. Light, I. R. Hardie, Amy Lin, J. R. Johnson
TL;DR: Administration of daclizumab in addition to dual immunosuppression therapy significantly reduced biopsy-proven acute rejection after renal transplantation, improved patient survival, and did not add to the toxicity of the immunOSuppressive regimen.
Abstract: Background Acute rejection is still a major problem in renal transplantation and is one of the most important causes of chronic graft dysfunction and late graft loss. Selective immunosuppression with a humanized antibody against the alpha-chain of the interleukin (IL)-2 receptor (CD25) was evaluated to demonstrate the efficacy of this type of immunoprophylaxis in combination with dual immunosuppression. Methods We studied the effect of daclizumab, a humanized monoclonal antibody against the alpha-chain of the IL-2 receptor, in a randomized double-blind, prospective phase III clinical trial in 275 patients receiving a first cadaveric renal allograft. Among them 111 (83%) in the placebo arm and 116 (82%) in the daclizumab arm received the full regimen of five doses (1.0 mg/kg) every other week. Baseline immunosuppression consisted of cyclosporine and corticosteroids. Results At 6 months, 39 (28%) of the patients in the daclizumab group had biopsy-proven rejections, as compared with 63 (47%) in the placebo group (P=0.001). The need for additional antilymphocyte therapy, antithymocyte globulin, antilymphocyte globulin (ATG, ALG, OKT3) was also lower in the daclizumab group (8% vs. 16%, P=0.02), and they required significantly lower mean (+/- SD) cumulative doses of prednisone (3750+/-1981 mg vs. 4438+/-2667 mg in the placebo group, P=0.01). Graft function was significantly better (P=0.02) in the daclizumab group (graft function rate: 58 vs. 51 ml/min, mean) as was patient survival (P=0.01, 99% vs. 94%). No specific adverse events were observed in daclizumab-treated patients. Patients receiving daclizumab experienced fewer cytomegalovirus infections (18% vs. 25%), and none died from severe infectious complications, compared to four patients in the placebo arm. No patient in the daclizumab group had a lymphoproliferative disorder or any other form of immunosuppression-related tumor during the first year after transplant. Conclusions Administration of daclizumab in addition to dual immunosuppression therapy significantly reduced biopsy-proven acute rejection after renal transplantation, improved patient survival, and did not add to the toxicity of the immunosuppressive regimen.

Journal ArticleDOI
TL;DR: Although SRL produced more frequent, but reversible, hematological and lipid abnormalities, it had no apparent nephrotoxic effects to exacerbate CsA-induced renal dysfunction and may permit CsA sparing, at least among Caucasian patients, without an increased risk of rejection.
Abstract: Background. The novel agent sirolimus (SRL; Rapamune; rapamycin) inhibits the immune response by a mechanism distinct from those of calcineurin antagonists or antimetabolites. This randomized, controlled, multicenter, single-blind, phase II trial examined the combination of SRL, steroids, and full versus reduced doses of cyclosporine (CsA) for prophylaxis of acute renal allograft rejection. Methods. A total of 149 recipients of mismatched cadaveric- or living-donor primary renal allografts were randomized into six groups. Three groups received placebo or 1 or 3 mg/m2/day SRL, as well as steroids and full-dose CsA (Sandimmune). Three groups received steroids, reduced-dose CsA (target trough level 50% of the full-dose range), and 1, 3, or 5 mg/m2/day SRL. Results. The incidence of biopsy-proven acute rejection episodes within the first 6 months after transplant was reduced from 32.0% in the control group to 8.5% in patients receiving SRL (1 or 3 mg/m2/day) and full-dose Sandimmune CsA (P=0.018). Similar low rates of acute rejection episodes were observed among non-African-Americans, but not African-Americans, treated with SRL and reduced-dose Sandimmune CsA. Despite the augmented immunosuppression, 1-year patient and graft survival rates did not differ significantly across groups. Adverse effects attributable to CsA, including hypertension and new-onset diabetes mellitus, were not exacerbated by SRL. Except for an increased incidence of pneumonia among patients receiving full-dose CsA and 3 mg/m2/day SRL, the incidences of opportunistic infections were similar in all treatment groups. Although SRL produced more frequent, but reversible, hematological and lipid abnormalities, it had no apparent nephrotoxic effects to exacerbate CsA-induced renal dysfunction. Conclusions. SRL in combination with CsA and steroids not only lowers the incidence of biopsy-proven acute renal allograft rejection episodes, but also may permit CsA sparing, at least among Caucasian patients, without an increased risk of rejection.

Journal ArticleDOI
TL;DR: Preoperative computed tomographic measurement of the liver size of a living donor is essential, and a graft that represents 40% or less of the recipient's standard liver weight should be regarded as a marginal graft with a lower success rate.
Abstract: Background. The extension of living donor liver transplantation to adult recipients is limited by the adequacy of the size of the graft. We evaluated the effect of graft size on the survival of the recipient in order to establish a clinical guide for the minimum requirement. Methods. The clinical records of 14 adults and 11 children (body weight 6.1-100 kg) who underwent living donor liver transplantation for chronic or acute liver failure were reviewed. The effect of the graft weight ratio (graft weight divided by standard liver weight of the recipient) on graft function and survival was studied. Results. The graft weight ratio ranged from 31 to 203%. The overall graft and patient survival rates were 84% at a median follow-up of 29 months. The survival rate was 95% for recipients with a graft weight ratio >40%, and only 40% for those with a ratio ≤40% (P=0.016). It was 88% (7/8) when the ratio was >100%, 100% (5/5) when the ratio was 71 to 100%, 100% (7/7) when the ratio was 41 to 70%, and only 40% (2/5) when the ratio was ≤40%. When the graft weight ratio was ≤40%, early graft dysfunction was evident and contributed to the causes of death in three patients. Conclusions. Preoperative computed tomographic measurement of the liver size of a living donor is essential. A graft that represents 40% or less of the recipient's standard liver weight should be regarded as a marginal graft with a lower success rate.
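
A sketch of the marginal-graft rule from the conclusion. The 40% cutoff comes from the abstract; the standard liver weight would in practice come from the preoperative CT and a body-size formula the abstract does not specify, so it is passed in directly here:

```python
def graft_weight_ratio_percent(graft_weight_g: float, standard_liver_weight_g: float) -> float:
    """Graft weight divided by the recipient's standard liver weight, in percent."""
    return graft_weight_g / standard_liver_weight_g * 100.0

def is_marginal_graft(graft_weight_g: float, standard_liver_weight_g: float) -> bool:
    # Abstract: a ratio <= 40% predicted 40% survival vs. 95% above the cutoff.
    return graft_weight_ratio_percent(graft_weight_g, standard_liver_weight_g) <= 40.0

print(is_marginal_graft(420, 1200))  # 35% -> True: regard as marginal
```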

Journal ArticleDOI
TL;DR: For recipients on cyclosporine/mycophenolate mofetil/P with no AR at 90 days, the chance of developing subsequent AR is small; if P is tapered and withdrawn, the risk increases, but the majority remain free of acute and chronic rejection.
Abstract: BACKGROUND Prospective randomized trials have shown a reduced rate of acute rejection (AR) in mycophenolate mofetil-treated kidney transplant recipients. We hypothesized that this increased protection from AR could allow successful prednisone (P) withdrawal in cyclosporine/mycophenolate mofetil/P-treated recipients. METHODS A multicenter, prospective, randomized, double-blind trial of P withdrawal at 3 months posttransplant was initiated. Entry criteria were: primary transplant, adult, no AR by 90 days, mycophenolate mofetil dose > or =2 g/day, cyclosporine dose = 5-15 mg/kg/day, P dose = 10-15 mg/day. Study participants were randomized to have P tapered over 8 weeks (beginning at 3 months posttransplant) to 0 vs. 10 mg/day. Prestudy power analysis determined 500 recipients should be randomized for 80% statistical power to test equivalence of the primary endpoint, AR or treatment failure at 1 year posttransplant. By design, the study was to be stopped if interim data precluded reaching equivalence. An established data safety monitoring board monitored the study. RESULTS After 266 patients were enrolled, patient enrollment was stopped (after safety monitoring board review) because of excess rejection in the P withdrawal group. The Kaplan-Meier estimate of the cumulative incidence of rejection or treatment failure within 1 year posttransplant (95% confidence interval) was 9.8% (4.4-14.9%) for the maintenance group and 30.8% (21.0-39.3%) for the withdrawal group. Treatment differences in the distribution of time to event were highly significant (P=0.0007). Of note, risk was higher in blacks (39.6%) versus nonblacks (16.0%) (P<0.001). At 1 year posttransplant, there was no difference between groups in patient or graft survival. For the patients with functioning grafts at 6 months posttransplant, withdrawal patients had lower cholesterol (P=0.0005), had higher creatinine (P=0.03), and were less likely to use antihypertensives (P=0.001). These differences persist to 1 year posttransplant. CONCLUSIONS We conclude that for recipients on cyclosporine/mycophenolate mofetil/P with no AR at 90 days, the chance of developing subsequent AR is small; if P is tapered and withdrawn, the risk increases (but the majority remain free of acute and chronic rejection). After withdrawal, the risk of AR is different for blacks versus nonblacks. Withdrawal patients had a lower cholesterol level and less need for antihypertensives.
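
The primary endpoint above is summarized as a Kaplan-Meier cumulative incidence. A minimal product-limit estimator with invented follow-up data, to show how such an estimate is formed (cumulative incidence is one minus the survival estimate):

```python
def kaplan_meier(times, events):
    """Product-limit survival curve; events[i] is True for an event, False for censoring."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:                 # an observed event shrinks the estimate
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1                  # events and censorings both leave the risk set
    return curve

# Hypothetical follow-up in days posttransplant; False = censored at last contact.
times  = [30, 90, 120, 200, 365, 365, 365, 365]
events = [True, True, False, True, False, False, False, False]
for t, s in kaplan_meier(times, events):
    print(f"day {t}: cumulative incidence {1 - s:.3f}")
```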

Journal ArticleDOI
TL;DR: Both renal biopsy and EM of urine samples are useful in the diagnosis and monitoring of polyomavirus infections.
Abstract: Background. Interstitial nephritis caused by BK polyomavirus is a recognized complication of renal transplantation. A study of renal transplant recipients at Duke University Medical Center was undertaken to evaluate diagnostic modalities and assess clinical outcomes in transplant polyomavirus infections. Methods. Polyomavirus nephritis was identified in 6 of 240 patients who received renal transplants between January 1996 and June 1998 and an additional patient who underwent transplantation in 1995. The clinical records of these seven patients were reviewed, as were all renal biopsy and nephrectomy specimens. Electron microscopy (EM) was performed on negatively stained urine samples from 6 patients with polyomavirus infection and 23 patients with other diagnoses. Results. Patients with polyomavirus infection shared several clinical features, including ureteral obstruction (5/7 patients), lymphocele (3/7), bacterial urinary tract infection (3/7), hematuria (3/7), cytomegalovirus infection (3/7), and immunosuppression with mycophenolate mofetil (6/7). All patients experienced elevations in serum creatinine, which stabilized or decreased in four patients with altered or decreased immunosuppression. The diagnosis of polyomavirus infection was established by renal biopsy and EM of urine in five patients, by biopsy alone in one, and by EM alone in one. Sequential examinations of urine by EM were used to monitor the course of infection in six patients. Conclusions. Interstitial nephritis due to BK polyomavirus occurred in 2.5% of patients receiving renal transplants at our center since 1996. Polyomavirus infection can cause transplant dysfunction and graft loss, but progression of the infection can frequently be abrogated with alterations in immunosuppressive therapy. Both renal biopsy and EM of urine samples are useful in the diagnosis and monitoring of polyomavirus infections.

Journal ArticleDOI
TL;DR: During liver CI/WR injury: selective apoptosis of endothelial cells occurs; caspase 3 is activated only in endothelial cells; and a caspase inhibitor reduces endothelial cell apoptosis and prolongs animal survival after OLT, which could prove useful in clinical transplantation.
Abstract: Cold ischemia/warm reperfusion (CI/WR) liver injury remains a problem in liver transplants. Sinusoidal endothelial cells (SEC) are a target of CI/WR injury, during which they undergo apoptosis. Because caspase proteases have been implicated in apoptosis, our aim was to determine whether liver CI/WR injury induces a caspase-dependent apoptosis of SEC. Rat livers were stored in the University of Wisconsin (UW) solution for 24 hr at 4°C and reperfused for 1 hr at 37°C in vitro. Apoptosis was quantitated using the TUNEL assay, and caspase 3 activation determined by immunohistochemical analysis. Rat liver orthotopic liver transplants (OLT) were also performed using livers stored for 30 hr. Terminal deoxynucleotide transferase-mediated dUTP nick end labeling (TUNEL) positive hepatocytes were rare and did not increase during CI/WR injury. In contrast, TUNEL positive SEC increased 6-fold after reperfusion of livers stored under cold ischemic conditions, compared with controls or livers stored but not reperfused. Immunohistochemical analysis demonstrated active caspase 3 only in endothelial cells after CI/WR injury. When IDN-1965, a caspase inhibitor, was given i.v. to the donor animal and added to UW solution and the reperfusion media, TUNEL positive endothelial cells were reduced 63±11% (P<0.05). Similarly, the duration of survival after OLT was significantly increased in the presence of the inhibitor. During liver CI/WR injury: 1) selective apoptosis of endothelial cells occurs; 2) caspase 3 is activated only in endothelial cells; and 3) a caspase inhibitor reduces endothelial cell apoptosis and prolongs animal survival after OLT. The pharmacologic use of caspase inhibitors could prove useful in clinical transplantation.

Journal ArticleDOI
TL;DR: A substantial reduction in mortality in IDDM patients 10 years after successful combined pancreas and kidney transplantation is found and it is speculated that the decrease in mortality was due to the beneficial effect of long-term normoglycemia on diabetic late complications.
Abstract: Background The purpose of pancreatic transplantation in insulin-dependent diabetic patients is to restore normoglycemia and thereby prevent the secondary complications of diabetes. However, uncertainty remains as to whether the mortality rate in diabetic patients can be affected by this procedure. Method We followed 14 patients with insulin-dependent diabetes mellitus (IDDM) and end-stage diabetic nephropathy for 10 years after successful combined kidney and pancreas transplantation. Fifteen diabetic patients subjected to kidney transplantation alone have served as controls. The glycemic control has been studied annually for 10 years and diabetic polyneuropathy has been assessed in both groups after 2, 4, and 8 years. Results In recipients of pancreas-kidney grafts, metabolic control was maintained throughout the observation period, with values of glycated hemoglobin in the normal range. In contrast, glucose metabolism was impaired in the control group, with glycated hemoglobin values around 10%. Nerve conduction and parasympathetic autonomic dysfunction improved in both groups after 2 years; there was no difference between the groups. After 4 years, we found a significant difference between the study group and the control group, and after 8 years it had widened. At the 4-year evaluation, there was no difference in mortality between the groups. At 8 years, however, a significant difference was noted, which was further substantiated at 10 years with a 20% mortality rate in the pancreas-kidney group versus an 80% mortality in the kidney alone group. Conclusions We found a substantial reduction in mortality in IDDM patients 10 years after successful combined pancreas and kidney transplantation. We speculate that the decrease in mortality was due to the beneficial effect of long-term normoglycemia on diabetic late complications and suggest therefore that combined pancreas and kidney transplantation, rather than kidney transplantation alone, should be offered to IDDM patients with end-stage diabetic nephropathy.

Journal ArticleDOI
TL;DR: Although laparoNx allografts have slower initial function compared with openNx, there was no significant difference in longer term renal function.
Abstract: Background. Laparoscopic donor nephrectomy (laparoNx) has the potential to increase living kidney donation rates by reducing the pain and suffering of the donor. However, renal function outcomes of a large series of recipients of laparoNx have not been studied. Methods. We retrospectively reviewed the records of 132 recipients of laparoNx done at our center between 3/96 and 11/97 and compared them to 99 recipients of kidneys procured by the open technique (openNx) done between 10/93 and 3/96. Results. Significantly more patients in the laparoNx group (25.2%) were taking tacrolimus within the first month than those in the openNx group (2.1%). Mean serum creatinine was higher in laparoNx compared with openNx at 1 week (2.8±0.3 and 1.8±0.2 mg/dl, respectively; P=0.005) and at 1 month (2.0±0.1 and 1.6±0.1 mg/dl, P=0.05) after transplant. However, by 3 and 6 months, the mean serum creatinine was similar in the two groups (1.7±0.1 versus 1.5±0.05 mg/dl, and 1.7±0.1 versus 1.7±0.1, respectively). By 1 year posttransplant, the mean serum creatinine for laparoNx was actually less than that for openNx (1.4±0.1 and 1.7±0.1 mg/dl, P=0.03). Although patients in the laparoNx compared to the openNx group were more likely to have delayed graft function (7.6 versus 2.0%) and ureteral complications (4.5 versus 1.0%), the rate of other complications, as well as hospital length of stay, patient and graft survival rates, were similar in the two groups. Conclusion. Although laparoNx allografts have slower initial function compared with openNx, there was no significant difference in longer term renal function.

Journal ArticleDOI
TL;DR: There was no difference in the rate of recurrent and de novo disease according to the transplant type (living related donor vs. cadaver, P=NS), and demographic findings were not significantly different.
Abstract: INTRODUCTION Short-term and long-term results of renal transplantation have improved over the past 15 years. However, there has been no change in the prevalence of recurrent and de novo diseases. A retrospective study was initiated through the Renal Allograft Disease Registry, to evaluate the prevalence and impact of recurrent and de novo diseases after transplantation. MATERIALS AND METHODS From October 1987 to December 1996, a total of 4913 renal transplants were performed on adults at the Medical College of Wisconsin, University of Cincinnati, University of California at San Francisco, University of Louisville, University of Washington, Seattle, and Washington University School of Medicine. The patients were followed for a minimum of 1 year. A total of 167 (3.4%) cases of recurrent and de novo disease were diagnosed by renal biopsy. These patients were compared with other patients who did not have recurrent and de novo disease (n=4746). There were more men (67.7% vs. 59.8%, P<0.035) and a higher number of re-transplants (17% vs. 11.5%, P<0.005) in the recurrent and de novo disease group. There was no difference in the rate of recurrent and de novo disease according to the transplant type (living related donor vs. cadaver, P=NS). Other demographic findings were not significantly different. Common forms of glomerulonephritis seen were focal segmental glomerulosclerosis (FSGS), 57; immunoglobulin A nephritis, 22; membranoproliferative glomerulonephritis (GN), 18; and membranous nephropathy, 16. Other diagnoses include: diabetic nephropathy, 19; immune complex GN, 12; crescentic GN (vasculitis), 6; hemolytic uremic syndrome-thrombotic thrombocytopenic purpura (HUS/TTP), 8; systemic lupus erythematosus, 3; anti-glomerular basement membrane disease, 2; oxalosis, 2; and miscellaneous, 2. The diagnosis of recurrent and de novo disease was made after a mean period of 678 days after the transplant. During the follow-up period, there were significantly more graft failures in the recurrent disease group, 55% vs. 25%, P<0.001. The actuarial 1-, 2-, 3-, 4-, and 5-year kidney survival rates for patients with recurrent and de novo disease were 86.5%, 78.5%, 65%, 47.7%, and 39.8%. The corresponding survival rates for patients without recurrent and de novo disease were 85.2%, 81.2%, 76.5%, 72%, and 67.6%, respectively (Log-rank test, P<0.0001). The median kidney survival for patients with and without recurrent and de novo disease was 1360 vs. 3382 days (P<0.0001). Multivariate analysis using the Cox proportional hazard model for graft failure was performed to identify various risk factors. Cadaveric transplants, prolonged cold ischemia time, elevated panel reactive antibody, and recurrent disease were identified as risk factors for allograft failure. The relative risk (95% confidence interval) for graft failure because of recurrent and de novo disease was 1.9 (1.57-2.40), P<0.0001. The relative risk for graft failure because of posttransplant FSGS was 2.25 (1.6-3.1), P<0.0001; for membranoproliferative glomerulonephritis, 2.37 (1.3-4.2), P<0.003; and for HUS/TTP, 5.36 (2.2-12.9), P<0.0002. There was higher graft failure (64.9%) and shorter half-life (1244 days) in patients with recurrent FSGS. CONCLUSION In conclusion, recurrent and de novo disease are associated with poorer long-term survival, and the relative risk of allograft loss is double. Significant impact on graft survival was seen with recurrent and de novo FSGS, membranoproliferative glomerulonephritis, and HUS/TTP.
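
As a rough consistency check on the registry's headline numbers (computation ours): the crude failure ratio from the reported proportions is close to, though not identical with, the Cox model's adjusted relative risk of 1.9, which controls for donor source, cold ischemia time, and panel reactive antibody:

```python
failure_with_disease = 0.55   # graft failure with recurrent/de novo disease
failure_without = 0.25        # graft failure without
print(round(failure_with_disease / failure_without, 2))  # crude ratio 2.2 vs. adjusted 1.9
```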

Journal Article
TL;DR: Analysis of 3-year data for patient and graft survival, and safety in the MMF-treated patients indicates that MMF treatment not only reduces the incidence of acute rejection but also reduces late allograft loss.
Abstract: Mycophenolate mofetil in renal transplantation: 3-year results from the placebo-controlled trial. European Mycophenolate Mofetil Study Group.