
Showing papers in "Transplantation in 1998"


Journal ArticleDOI
TL;DR: The transplant evaluation rating scale, a revision of the psychosocial levels system for evaluating organ transplant candidates, and the influence of posttransplantation psychosocial factors on heart transplantation outcome are reviewed.
Abstract:
12. Paris W, Muchmore J, Pribil A, Zuhdi N, Cooper DK. Study of the relative incidences of psychosocial factors before and after transplantation and the influence of posttransplantation psychosocial factors on heart transplantation outcome. J Heart Lung Transplant 1994; 13: 424.
13. Chacko RC, Harper RG, Kunik M, Young J. Relationship of psychiatric morbidity and psychosocial factors in organ transplant candidates. Psychosomatics 1996; 37: 100.
14. Frazier P, et al. Correlates of non-compliance among renal transplant recipients. Clin Transplant 1994; 8: 550.
15. Chacko RC, Harper RG, Gotto J, Young J. Psychiatric interview and psychometric predictors of cardiac transplant survival. Am J Psychiatry 1996; 153: 1607.
16. Twillman RK, Manetto C, Wellisch DK, Wolcott DL. The transplant evaluation rating scale: a revision of the psychosocial levels system for evaluating organ transplant candidates. Psychosomatics 1993; 34: 144.
17. Olbrisch ME, Levenson JL, Hamer R. The PACT: a rating scale for the study of clinical decision-making in psychosocial screening of organ transplant candidates. Clin Transplant 1989; 3: 164.
18. Hecker J, Norvell N, Hills H. Psychologic assessment of candidates for heart transplantation: toward a normative data base. J Heart Transplant 1989; 8(2): 171.
19. Levenson JL, Olbrisch ME. Psychosocial evaluation of organ transplant candidates: a comparative survey of process, criteria, and outcomes in heart, liver, and kidney transplantation. Psychosomatics 1993; 34(4): 314.

802 citations


Journal ArticleDOI
TL;DR: The production of TGF-beta1 is under genetic control, which in turn influences the development of lung fibrosis; the TGF-beta1 genotype therefore has prognostic significance in transplant recipients.
Abstract: Background. Transforming growth factor (TGF)-β1 is a profibrogenic cytokine that has been implicated in the development of fibrosis in transplanted tissues. In this study, we have analyzed the genetic regulation of TGF-β1 production in lung transplant recipients. Methods. A polymerase chain reaction-single-stranded conformational polymorphism technique was used to detect polymorphisms in the TGF-β1 gene from genomic DNA. Polymorphisms were shown to correlate with in vitro TGF-β1 production by stimulated lymphocytes. A sequence-specific oligonucleotide probe hybridization method was devised to screen for these polymorphisms in lung transplant groups and controls. Results. We have identified five polymorphisms in the TGF-β1 gene: two in the promoter region at positions -800 and -509, one at position +72 in a nontranslated region, and two in the signal sequence at positions +869 and +915. The polymorphism at position +915 in the signal sequence, which changes codon 25 (arginine→proline), is associated with interindividual variation in levels of TGF-β1 production. Stimulated lymphocytes of homozygous genotype (arginine/arginine) from control individuals produced significantly more TGF-β1 in vitro (10037±745 pg/ml) compared with heterozygous (arginine/proline) individuals (6729±883 pg/ml; P<0.02). In patients requiring lung transplantation for a fibrotic lung condition, there was an increase in the frequency of the high-producer TGF-β1 allele (arginine). This allele was significantly associated with pretransplant fibrotic pathology (P<0.02) (n=45) when compared with controls (n=107) and with pretransplant nonfibrotic pathology (P<0.004) (n=50). This allele was also associated with allograft fibrosis in transbronchial biopsies when compared with controls (P<0.03) and with nonallograft fibrosis (P<0.01). Conclusion. The production of TGF-β1 is under genetic control, and this in turn influences the development of lung fibrosis. Hence, the TGF-β1 genotype has prognostic significance in transplant recipients.

648 citations
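As a generic illustration of the kind of allele-frequency comparison reported above (the high-producer arginine allele in fibrotic patients versus controls), such a 2x2 comparison can be run as a Fisher's exact test; the counts below are hypothetical and are not taken from the study:

# Illustrative only: hypothetical allele counts, not the study's genotyping data.
from scipy.stats import fisher_exact

# 2x2 table of allele counts: rows = patient group, columns = allele.
#                   arginine (high producer), proline
fibrotic_patients = [72, 18]
controls          = [150, 64]

odds_ratio, p_value = fisher_exact([fibrotic_patients, controls])
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.4f}")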


Journal ArticleDOI
TL;DR: Substitution of MMF for azathioprine may reduce mortality and rejection in the first year after cardiac transplantation.
Abstract: BACKGROUND After heart transplantation, 1-year and 5-year survival rates are 79% and 63%, respectively, with rejection, infection, and allograft coronary artery disease accounting for the majority of deaths. Mycophenolate mofetil (MMF), an inhibitor of the de novo pathway for purine biosynthesis, decreases rejection in animals and in human renal transplantation. METHODS In a double-blind, active-controlled trial, 28 centers randomized 650 patients undergoing their first heart transplant to receive MMF (3000 mg/day) or azathioprine (1.5-3 mg/kg/day), in addition to cyclosporine and corticosteroids. Rejection and survival data were obtained for 6 and 12 months, respectively. Because 11% of the patients withdrew before receiving study drug, data were analyzed on all randomized patients (enrolled patients) and on patients who received study medications (treated patients). RESULTS Survival and rejection were similar in enrolled patients (MMF, n=327; azathioprine, n=323). In treated patients (MMF, n=289; azathioprine, n=289), MMF was associated with a significant reduction in mortality at 1 year compared with azathioprine (18 [6.2%] versus 33 deaths [11.4%]; P=0.031) and a significant reduction in the requirement for rejection treatment (65.7% versus 73.7%; P=0.026). There was a trend for fewer MMF patients to have ≥ grade 3A rejection (45.0% versus 52.9%; P=0.055) or to require the murine monoclonal anti-CD3 antibody or antithymocyte globulin (15.2% versus 21.1%; P=0.061). Opportunistic infections, mostly herpes simplex, were more common in the MMF group (53.3% versus 43.6%; P=0.025). CONCLUSIONS Substitution of MMF for azathioprine may reduce mortality and rejection in the first year after cardiac transplantation.

579 citations


Journal ArticleDOI
TL;DR: MPA predose concentration (Cpredose) and MPA AUC are significantly related to the incidence of biopsy-proven rejection after kidney transplantation, whereas MMF dose is significantly related to the occurrence of adverse events.
Abstract: Background. Adding a fixed dose of 1 g b.i.d. of mycophenolate mofetil (MMF) to an immunosuppressive regimen consisting of cyclosporine and prednisone results in a 50% reduction in the incidence of acute rejection after kidney transplantation. This study was designed to investigate the relationship between pharmacokinetic data (mycophenolic acid area under the curve; MPA AUC) and the prevention of rejection after kidney transplantation. Methods. A total of 154 adult recipients of a primary or secondary cadaveric kidney graft were randomly allocated, in this double-blind trial, to receive MMF treatment aimed at three predefined target MPA AUC values (16.1, 32.2, and 60.6 μg·hr/ml). During the first 6 months after transplantation, plasma samples for nine AUCs were collected. After analysis of the samples, a coded dose adjustment advice was generated using a Bayesian algorithm, maintaining the double blinding. Immunosuppressive therapy further consisted of cyclosporine and prednisone. The primary end point of this study was the occurrence of biopsy-proven acute rejection within the 6-month study period. Results. A total of 150 patients were eligible for analysis. Although the mean MMF dose was reduced after day 21, the mean MPA AUC gradually increased and target MPA AUC values were exceeded in all three groups. The incidences of biopsy-proven acute rejection in the low, intermediate, and high target MPA AUC groups were 14 of 51 (27.5%), 7 of 47 (14.9%), and 6 of 52 (11.5%), respectively. The incidences of premature withdrawal from the study due to adverse events in the three groups were 4 of 51 (7.8%), 11 of 47 (23.4%), and 23 of 52 (44.2%), respectively. Logistic regression analysis showed a highly statistically significant relationship between median ln(MPA AUC) and the occurrence of a biopsy-proven rejection (P<0.001). The logistic regression using median ln(Cpredose) was also statistically significant for this relationship (P=0.01), whereas it was not when using mean MMF dose (P=0.082). In contrast, the logistic regression using mean MMF dose for comparison of patients who successfully completed the study versus patients experiencing premature withdrawal due to adverse events was highly significant (P<0.001), whereas this was not significant when using median ln(Cpredose) (P=0.512) or median ln(MPA AUC) (P=0.434). Conclusion. MPA Cpredose and MPA AUC are significantly related to the incidence of biopsy-proven rejection after kidney transplantation, whereas MMF dose is significantly related to the occurrence of adverse events.

515 citations
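The logistic regression described above relates the probability of biopsy-proven rejection to ln(MPA AUC). A minimal sketch of such a model, using statsmodels and made-up exposure data rather than the trial's dataset or analysis code, might look like this:

# Illustrative only: hypothetical data, not the trial's dataset.
import numpy as np
import statsmodels.api as sm

# Hypothetical per-patient median MPA AUC (ug*hr/ml) and whether biopsy-proven
# acute rejection occurred within the 6-month study period (1 = yes).
mpa_auc = np.array([15.0, 18.0, 22.0, 30.0, 35.0, 40.0, 45.0, 55.0, 60.0, 70.0])
rejection = np.array([1, 1, 1, 0, 1, 0, 0, 0, 0, 0])

# Model P(rejection) as a function of ln(MPA AUC), mirroring the paper's use of
# median ln(MPA AUC) as the predictor.
X = sm.add_constant(np.log(mpa_auc))
fit = sm.Logit(rejection, X).fit(disp=False)
print(fit.summary())
print("odds ratio per unit increase in ln(AUC):", np.exp(fit.params[1]))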


Journal ArticleDOI
TL;DR: It is hypothesized that irreversible central nervous system injury may up-regulate proinflammatory mediators and cell surface molecules in peripheral organs to be engrafted, making them more prone to host inflammatory and immunological responses.
Abstract: Background: The success rate of transplanted organs from brain-dead cadaver donors is consistently inferior to that of living sources. As cadaver and living unrelated donors are equally genetically disparate with a given recipient, the difference must lie within the donor himself and/or the effects of organ preservation and storage. We have hypothesized that irreversible central nervous system injury may up-regulate proinflammatory mediators and cell surface molecules in peripheral organs to be engrafted, making them more prone to host inflammatory and immunological responses. Methods. Rats undergoing surgically induced acutely increased intracranial pressure (explosive brain death) were followed for 6 hr. Their peripheral tissues were examined by reverse transcriptase polymerase chain reaction and immunohistology, serum factors were assessed by enzyme-linked immunosorbent assay, and the influence of inflammatory molecules in the blood stream was determined by cross-circulation experiments with normal animals. Results. mRNA expression of both lymphocyte- and macrophage-associated products increased dramatically in all tissues. Similar factors in serum were coincidentally increased; these were shown to be active in vivo by cross-circulation with normal animals. The organs of all control groups, including animals with important ischemic injury and with hemorrhagic shock, were negative. Up-regulation of MHC class I and II antigens and the co-stimulatory molecule B7 suggests increased immunogenicity of the peripheral organs. These changes could be inhibited by: (i) administration of a recombinant soluble P-selectin glycoprotein ligand-Ig, a P- and E-selectin antagonist; and (ii) a fusion protein, cytotoxic T lymphocyte antigen 4-Ig, which blocks B7-mediated T-cell co-stimulation. Conclusions. Activation of peripheral organs following explosive brain death may be caused by various interrelated events, including the effects of massive acute central injury, hypotension, and circulating factors. Almost complete suppression of these changes could be produced by biological agents. Such interventions, if reproducible in humans, could improve the quality of organs from marginal donors, broadening the criteria for donor acceptance.

421 citations


Journal ArticleDOI
TL;DR: DGF is an important independent predictor of poor graft survival, and newer immunosuppressive strategies must minimize nonimmune and immune renal injury if long-term graft survival is to improve.
Abstract: Background. In cadaveric renal transplantation, delayed graft function (DGF) correlates with poor long-term graft survival; however, whether its effects are independent of acute rejection is controversial. We wished to study the effect of DGF on graft survival, controlling for acute rejection, discharge creatinine, and human leukocyte antigen match. Methods. We analyzed 27,096 first cadaveric donor renal transplants reported to the UNOS Scientific Renal Transplant Registry between January 1994 and November 1997. DGF was defined as dialysis need in the first week. Acute rejection was recorded for initial hospitalization and within 6 months. Kaplan-Meier survival curves were analyzed with the log-rank test. Results. DGF increased the incidence of acute rejection before discharge (8% without DGF; 25% with DGF, P<0.01) and of any acute rejection by 6 months (25% without DGF, 42% with DGF, P<0.01). Without early rejection, DGF reduced 1-year graft survival from 91% to 75% (P<0.0001) and graft half-life from 12.9 to 8.0 years. In kidneys with acute rejection within 6 months, DGF decreased 3-year graft survival from 77% to 60% and graft half-life from 9.4 to 6.2 years (P<0.001). With a discharge creatinine of less than 2.5 mg/dl, the difference in graft half-life between no DGF with no rejection (13.4 years) and DGF with rejection (9.8 years) was significant (P<0.001). Increased donor age and cold ischemia time additionally decreased graft survival, whereas a good human leukocyte antigen match could not overcome the deleterious effects of DGF or acute rejection. Conclusions. DGF is an important independent predictor of poor graft survival. Newer immunosuppressive strategies must minimize nonimmune and immune renal injury if long-term graft survival is to improve.

385 citations
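The survival comparison described above (Kaplan-Meier curves compared with the log-rank test) can be sketched as follows with the lifelines library; the data here are synthetic and are not the UNOS registry cohort:

# Illustrative sketch with synthetic data; not the registry analysis itself.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Synthetic graft survival times (years) for kidneys with and without DGF,
# with administrative censoring at 5 years of follow-up.
t_no_dgf = np.minimum(rng.exponential(scale=12.0, size=500), 5.0)
t_dgf = np.minimum(rng.exponential(scale=7.0, size=200), 5.0)
e_no_dgf = (t_no_dgf < 5.0).astype(int)  # 1 = graft loss observed, 0 = censored
e_dgf = (t_dgf < 5.0).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(t_no_dgf, event_observed=e_no_dgf, label="no DGF")
print(kmf.survival_function_at_times([1.0, 3.0]))
kmf.fit(t_dgf, event_observed=e_dgf, label="DGF")
print(kmf.survival_function_at_times([1.0, 3.0]))

# Log-rank test for a difference between the two graft survival curves.
result = logrank_test(t_no_dgf, t_dgf, event_observed_A=e_no_dgf, event_observed_B=e_dgf)
print("log-rank p-value:", result.p_value)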


Journal ArticleDOI
TL;DR: MMF significantly reduced the incidence of rejection in the first 6 months, but there was not a significant improvement in graft survival throughout the 3 years after cadaver kidney transplantation.
Abstract: Background. Three large-scale clinical trials conducted in North America, Europe, and Australia showed that mycophenolate mofetil (MMF) decreases the incidence of acute renal allograft rejection in the first 6 months after transplant compared with placebo or azathioprine. This study extends the randomized, prospective, double-blind trial of MMF conducted by the Tricontinental Mycophenolate Mofetil Renal Transplantation Study Group. Methods. Patients (n=503) were randomized to receive 100-150 mg of azathioprine (AZA) (n=166), 2 g of MMF (n=173), or 3 g of MMF (n=164) per day, in conjunction with cyclosporine and prednisone from the time of transplantation. Results. During the first 6 months, the incidence of biopsy-proven acute graft rejection (BPR) was reduced by approximately 50% in the MMF 2 g (19.7%) and MMF 3 g (15.9%) groups compared with the AZA group (35.5%). The incidence of treatment failure during the first 6 months, including BPR, death, graft loss, and early withdrawal without prior BPR, was significantly decreased: AZA, 50%, compared with MMF 2 g, 38.2% (P=0.0287), and MMF 3 g, 34.8% (P=0.0045). At 3 years after transplant, both intent-to-treat and on-study (censoring at 90 days after treatment) analyses of graft and patient survival showed a trend toward an advantage for MMF 2 g and 3 g vs. AZA (intent-to-treat: 81.9% and 84.8% vs. 80.2%; on-study: 84.0% and 86.4% vs. 82.7%), although this trend did not reach statistical significance. Rejection was the principal cause of graft loss in all groups: AZA, 9.9%; MMF 2 g, 5.8%; and MMF 3 g, 3.0%. Graft function (intent-to-treat and on-study) was comparable in all three groups at 3 years. Gastrointestinal toxicity, leukopenia, and tissue-invasive cytomegalovirus disease were more common in the MMF 3 g group both during and after the first posttransplant year. Lymphoproliferative disorders were diagnosed in one AZA (0.6%), two MMF 2 g (1.2%), and three MMF 3 g (1.8%) patients. Other (nonlymphoproliferative, noncutaneous) malignancies occurred in six AZA (3.7%), four MMF 2 g (2.3%), and nine MMF 3 g (5.5%) patients. Mortality was comparable in all three groups (AZA, 8.6%; MMF 2 g, 4.7%; MMF 3 g, 9.1%) by 3 years of follow-up. Conclusion. MMF significantly reduced the incidence of rejection in the first 6 months, but there was not a significant improvement in graft survival throughout the 3 years after cadaver kidney transplantation.

380 citations


Journal ArticleDOI
TL;DR: The occurrence of histologic acute rejection was rare at 2 years, confirming the absence of subclinical acute rejection in these late biopsies, and the findings suggest that nonimmunologic factors, such as drug-induced toxicity, may play an important role in chronic allograft nephropathy.
Abstract: Background. This paper reports the histopathologic results of 2-year protocol biopsies from patients who were enrolled in the U.S. FK506 kidney transplant study . Methods. Recipients of cadaveric kidney transplants were randomized to tacrolimus or cyclosporine therapy. Patients active in the trial at 2 years after transplantation were approached for a protocol biopsy. Biopsies were scored by the Banff classification in a blinded fashion by one pathologist. Results. A total of 144 patients (41.3% of those active at 2 years) had a 2-year protocol biopsy performed; 79 patients were treated with tacrolimus and 65 patients were treated with cyclosporine. Evidence of acute rejection was found in seven (8.9%) of the 2-year biopsies in tacrolimus-treated patients and six (9.2%) cyclosporine-treated patients. Chronic allograft nephropathy was found in 49 (62.0%) tacrolimus biopsies and 47 (72.3%) cyclosporine biopsies (P=0.155). There were no apparent histopathologic differences between the tacrolimus and cyclosporine biopsies. The occurrence of chronic allograft nephropathy was significantly higher in patients who received a graft from an older donor (P<0.01), who experienced presumed cyclosporine or tacrolimus nephrotoxicity (P<0.001), who developed a cytomegalovirus infection (P=0.038), or who experienced acute rejection in the first year after transplantation (P=0.045). A multivariate analysis showed that nephrotoxicity and acute rejection were the most significant predictors for chronic allograft nephropathy. Conclusions. The occurrence of histologic acute rejection was rare at 2 years, confirming the absence of subclinical acute rejection in these late biopsies. A majority of the biopsies showed features consistent with chronic allograft nephropathy that was associated with acute rejection (particularly in cyclosporine-treated patients), nephrotoxicity, and cytomegalovirus infection in the first year. This suggests that nonimmunologic factors, such as drug-induced toxicity, may play an important role in chronic allograft nephropathy.

351 citations


Journal ArticleDOI
TL;DR: The overall incidence of PTLD has fallen from 10% to 5% for children receiving primary tacrolimus therapy after OLT, and serial monitoring of peripheral blood for Epstein-Barr virus (EBV) by polymerase chain reaction (PCR) after pediatric OLT is recommended.
Abstract: Background. We have previously reported a 10% incidence of posttransplant lymphoproliferative disease (PTLD) in pediatric patients receiving first liver grafts and primarily immunosuppressed with tacrolimus. To decrease the incidence of PTLD, we developed a protocol utilizing preemptive intravenous ganciclovir in high-risk recipients (i.e., donor positive, recipient negative; D+R-), combined with serial monitoring of peripheral blood for Epstein-Barr virus (EBV) by polymerase chain reaction (PCR). Methods. Consecutive pediatric recipients of a first liver graft were immunosuppressed with oral tacrolimus (both induction and maintenance) and low-dose prednisone. EBV serologies were obtained at the time of orthotopic liver transplant in recipients and donors. Recipients were divided into groups: group 1, high-risk (D+R-), and group 2, low-risk (D+R+; D-R-; D-R+). In group 1 (high-risk), all patients received a minimum of 100 days of intravenous ganciclovir (6-10 mg/kg/day), while, in group 2 (low-risk), patients received intravenous ganciclovir during their initial hospitalization and then were converted to oral acyclovir (40 mg/kg/day) at discharge. Semiquantitative EBV-PCR determinations were made at 1-2-month intervals. In both groups, patients with an increasing viral copy number by EBV-PCR had tacrolimus levels decreased to 2-5 ng/ml; for PTLD, tacrolimus was stopped and intravenous ganciclovir was reinstituted. A positive EBV-PCR with symptoms, but negative histology, was defined as EBV disease; PTLD was defined as histologic evidence of polyclonal or monoclonal B cell proliferation. Results. Forty children who had survived greater than 2 months were enrolled. There were 18 children in group 1 (high-risk; mean age of 14±15 months and mean follow-up time of 243±149 days) and 22 children in group 2 (low-risk; mean age of 64±65 months and follow-up time of 275±130 days). In group 1 (high-risk), there was no PTLD and one case of EBV disease (mononucleosis-like syndrome), which resolved. In group 2 (low-risk), there were two cases of PTLD; both resolved when tacrolimus was stopped. Both children were 8 months old at time of transplant. Neither received OKT3, and they had one and two episodes of steroid-sensitive rejection, respectively. One child had EBV disease (mild hepatitis), which resolved. Conclusions. Since instituting this protocol, the overall incidence of PTLD has fallen from 10% to 5% for children receiving primary tacrolimus therapy after OLT. No high-risk pediatric liver recipient treated preemptively with intravenous ganciclovir developed PTLD. Both children with PTLD were less than 1 year old at OLT and considered low-risk. However, their positive EBV antibody titers may have been maternal in origin and may not have offered long-term protection. Serial monitoring of EBV-PCR after pediatric OLT is recommended to decrease the risk of PTLD by allowing early detection of EBV infection, which is then managed by decreasing immunosuppression and continuing intravenous ganciclovir.

321 citations


Journal ArticleDOI
TL;DR: The initial experience suggests that this technique is a safe and reliable option for adults with chronic end-stage liver disease, and a conservative application of this procedure in the adult population could significantly reduce the mortality on the adult waiting list.
Abstract: Background Living donor liver transplantation has gained wide acceptance as an alternative for children with end-stage liver disease. The standard left lateral segment used in this operation does not provide adequate parenchymal mass to broaden its application to larger children or adults. Methods We report two cases of adult-to-adult living donor liver transplantation using a right hepatic lobe in patients with chronic liver disease. Results Both recipients experienced excellent initial graft function and have normal liver function 4 and 9 months postoperatively. Both donors are alive and well and returned to normal life 4 weeks postoperatively. Conclusions Our initial experience suggests that this technique is a safe and reliable option for adults with chronic end-stage liver disease. A conservative application of this procedure in the adult population could significantly reduce the mortality on the adult waiting list.

314 citations


Journal ArticleDOI
TL;DR: With improved survival of liver transplant recipients, chronic renal failure has become an important cause of morbidity and is associated with high mortality; treatment regimens that avoid or prevent cyclosporine-induced nephrotoxicity are urgently required.
Abstract: BACKGROUND Liver transplant recipients are at risk of chronic renal disease, principally as a result of the nephrotoxicity of the commonly used immunosuppressive agents cyclosporine and tacrolimus. We have investigated the incidence of chronic renal failure and its risk factors in our transplant population, which was treated predominantly with cyclosporine. METHODS A single-center retrospective study was done of 883 consecutive adult patients receiving a first liver transplant between 1982 and 1996. Potential risk factors for the development of chronic renal failure were recorded, including serial measurements of cyclosporine therapy and renal function. RESULTS Severe chronic renal failure (serum creatinine level >250 μmol/L for at least 6 months) developed in 25 patients, representing 4% of patients surviving 1 year or more. Twelve of these patients developed end-stage renal failure, and mortality was 44%. The predominant cause of renal failure was cyclosporine nephrotoxicity. Serum creatinine as early as 3 months after surgery was strongly associated with the eventual development of severe chronic renal failure (P=0.001), and this group could be further subdivided into two groups with differing risk factors. The first group had early (<1 year) renal dysfunction, with cyclosporine levels at 1 month after surgery (P=0.007) and daily and cumulative cyclosporine dosage at 5 years (P=0.01 for both) as risk factors. CONCLUSIONS With improved survival of liver transplant recipients, chronic renal failure has become an important cause of morbidity and is associated with a high mortality. Many patients at risk of severe chronic renal failure may be identified at an early stage. Treatment regimens that avoid or prevent cyclosporine-induced nephrotoxicity are urgently required for this population.

Journal ArticleDOI
TL;DR: Posttransplant diabetes mellitus is associated with impaired long-term renal allograft survival and function; complications similar to those in non-transplant-associated diabetes may occur in posttransplant diabetes mellitus, and hence, as in non-transplant-associated diabetes, tight glycemic control may also be warranted in patients with posttransplant diabetes.
Abstract: Background. Despite use of lower doses of corticosteroid hormones after renal allotransplantation in the era of cyclosporine and tacrolimus, posttransplant diabetes mellitus remains a common clinical problem. Methods. We prospectively investigated the effect of posttransplant diabetes on long-term (mean follow-up, 9.3±1.5 years) graft and patient survival in the 11.8% of our renal transplant population (n=40) who developed diabetes after kidney transplantation, and we compared outcome in 38 randomly chosen nondiabetic control patients who had received transplants concurrently. Results. Twelve-year graft survival in diabetic patients was 48%, compared with 70% in control patients (P=0.04), and Cox's regression analysis revealed diabetes to be a significant predictor of graft loss (P=0.04, relative risk=3.72) independent of age, sex, and race. Renal function at 5 years as assessed by serum creatinine level was inferior in diabetic patients compared with control patients (2.9±2.6 vs. 2.0±0.07 mg/dl, P=0.05). Two diabetic patients who experienced graft loss had a clinical course and histological features consistent with diabetic nephropathy; other diabetes-related morbidity in patients with posttransplant diabetes included ketoacidosis, hyperosmolar coma or precoma, and sensorimotor peripheral neuropathy. Patient survival at 12 years was similar in diabetic and control patients (71% vs. 74%). Conclusions. Posttransplant diabetes mellitus is associated with impaired long-term renal allograft survival and function; complications similar to those in non-transplant-associated diabetes may occur in posttransplant diabetes, and hence, as in non-transplant-associated diabetes, tight glycemic control may also be warranted in patients with posttransplant diabetes.

Journal ArticleDOI
TL;DR: Thymoglobulin was found to be superior to Atgam in reversing acute rejection and preventing recurrent rejection after therapy in renal transplant recipients.
Abstract: Background. Thymoglobulin, a rabbit anti-human thymocyte globulin, was compared with Atgam, a horse anti-human thymocyte globulin, for the treatment of acute rejection after renal transplantation. Methods. A multicenter, double-blind, randomized trial with enrollment stratification based on standardized histology (Banff grading) was conducted. Subjects received 7-14 days of Thymoglobulin (1.5 mg/kg/day) or Atgam (15 mg/kg/day). The primary end point was rejection reversal (return of serum creatinine level to or below the day 0 baseline value). Results. A total of 163 patients were enrolled at 25 transplant centers in the United States. No differences in demographics or transplant characteristics were noted. Intent-to-treat analysis demonstrated that Thymoglobulin had a higher rejection reversal rate than Atgam (88% versus 76%, P=0.027, primary end point). Day 30 graft survival rates (Thymoglobulin 94% and Atgam 90%, P=0.17), day 30 serum creatinine levels as a percentage of baseline (Thymoglobulin 72% and Atgam 80%; P=0.43), and improvement in posttreatment biopsy results (Thymoglobulin 65% and Atgam 50%; P=0.15) were not statistically different. T-cell depletion was maintained more effectively with Thymoglobulin than Atgam both at the end of therapy (P=0.001) and at day 30 (P=0.016). Recurrent rejection, at 90 days after therapy, occurred less frequently with Thymoglobulin (17%) versus Atgam (36%) (P=0.011).

Journal ArticleDOI
John F. Neylan
TL;DR: Tacrolimus is more effective than cyclosporine in preventing acute rejection in both African-American and Caucasian patients. However, tacrolimus was associated with an increased risk of posttransplant diabetes mellitus, particularly in African-Americans, which was reversible in some patients.
Abstract: Background Results of a multicenter, randomized, clinical trial demonstrated that tacrolimus was more effective than cyclosporine in preventing acute rejection in cadaveric renal transplant patients. As African-Americans comprised approximately 25% of the study population, their outcome was analyzed relative to the experience of Caucasian patients. Methods. Of the 205 patients randomized to tacrolimus, 56 (27.3%) were African-American and 114 (55.6%) were Caucasian. Of the 207 patients randomized to cyclosporine, 48 (23.2%) were African-American and 123 (59.4%) were Caucasian. The efficacy variables were 1-year patient survival, graft survival, and incidence of acute rejection. Results. The incidence of acute rejection was significantly lower in African-American and Caucasian patients treated with tacrolimus than with cyclosporine. Additionally, no African-American patient who was treated with tacrolimus experienced moderate or severe acute rejection, as determined by blinded independent review. The incidence of nephrotoxicity, cardiovascular and gastrointestinal events, malignancies, and opportunistic infections was similar between treatments and race groups. However, there was an increased incidence of posttransplant diabetes mellitus in tacrolimus-treated patients, particularly in African-Americans, and tacrolimus was associated with significantly lower lipid levels in both Caucasians and African-Americans. African-American patients required a 37% mean higher dose of tacrolimus than Caucasian patients to achieve comparable blood concentrations. Conclusions. Tacrolimus is more effective than cyclosporine in preventing acute rejection in both African-American and Caucasian patients. However, tacrolimus was associated with an increased risk of posttransplant diabetes mellitus, particularly in African-Americans, which was reversible in some patients.

Journal ArticleDOI
TL;DR: This study demonstrates that a kidney from an hDAF transgenic pig can support the life of a primate for up to 35 days, shows the basic physiological compatibility between the pig and nonhuman primate, and indicates that the presence of hDAF on the kidney confers some protection against acute vascular rejection.
Abstract: Background. In order to circumvent the complement-mediated hyperacute rejection of discordant xenografts, a colony of pigs transgenic for the human regulator of complement activity, human decay-accelerating factor (hDAF), has been produced. Methods. Seven kidneys from hDAF transgenic pigs and six kidneys from nontransgenic control pigs were transplanted into cynomolgus monkeys; both native kidneys were removed during the same operation. The recipient animals were immunosuppressed with cyclosporine, steroids, and cyclophosphamide. Results. In the transgenic group, the median survival time was 13 days (range, 6-35 days); the median survival time in the control group was 6.5 days (range, 0.3-30 days). There were no cases of hyperacute rejection in the transgenic group, and the two longest-surviving kidneys in this group showed no evidence of rejection on histological examination. In contrast, all control kidneys underwent antibody-mediated rejection, one demonstrating hyperacute rejection and the others acute vascular rejection. Conclusion. This study demonstrates that (i) a kidney from an hDAF transgenic pig can support the life of a primate for up to 35 days (and also shows the basic physiological compatibility between the pig and non-human primate); (ii) nontransgenic kidneys are not routinely hyperacutely rejected; and (iii) the presence of hDAF on the kidney confers some protection against acute vascular rejection. Improved immunosuppression and immunological monitoring may enable extended survival.

Journal ArticleDOI
TL;DR: Sirolimus potentiates the immunosuppressive effects of a cyclosporine-based regimen by reducing the rate of acute rejection episodes, without augmenting the nephrotoxic or hypertensive proclivities of cyclosporine.
Abstract: Background. Sirolimus, a novel immunosuppressant that inhibits cytokine-driven cell proliferation and maturation, prolongs allograft survival in animal models. After a phase I trial in stable renal transplant recipients documented that cyclosporine and sirolimus have few overlapping toxicities, we conducted an open-label, single-center, phase I/II dose-escalation trial to examine the safety and efficacy of this drug combination. Methods. Forty mismatched living-donor renal transplant recipients were sequentially assigned to receive escalating initial doses of sirolimus (0.5-7.0 mg/m2/day), in addition to courses of prednisone and a concentration-controlled regimen of cyclosporine. We conducted surveillance for drug-induced side effects among sirolimus-treated patients and compared their incidence of acute rejection episodes as well as mean laboratory values with those of a historical cohort of 65 consecutive, immediately precedent, demographically similar recipients treated with the same concentration-controlled regimen of cyclosporine and tapering doses of prednisone. Results. The addition of sirolimus reduced the overall incidence of acute allograft rejection episodes to 7.5%, from 32% in the immediately precedent cyclosporine/prednisone-treated patients. At 18- to 47-month follow-up periods, both treatment groups displayed similar rates of patient and graft survival, as well as morbid complications. Although sirolimus-treated patients displayed comparatively lower platelet and white blood cell counts and higher levels of serum cholesterol and triglycerides, sirolimus did not augment the nephrotoxic or hypertensive proclivities of cyclosporine. The degree of change in the laboratory values was more directly associated with whole blood trough drug concentrations than with doses of sirolimus. Conclusions. Sirolimus potentiates the immunosuppressive effects of a cyclosporine-based regimen by reducing the rate of acute rejection episodes.

Journal ArticleDOI
TL;DR: These studies demonstrate that mitotically expanded bone marrow cells can serve as an abundant source of osteoprogenitor cells that are capable of repairing craniofacial skeletal defects in mice without the addition of growth or morphogenetic factors.
Abstract: Background. Techniques used to repair craniofacial skeletal defects parallel the accepted surgical therapies for bone loss elsewhere in the skeleton and include the use of autogenous bone and alloplastic materials. Transplantation of a bone marrow stromal cell population that contains osteogenic progenitor cells may be an additional modality for the generation of new bone. Methods. Full thickness osseous defects (5 mm) were prepared in the cranium of immunocompromised mice and were treated with gelatin sponges containing murine alloplastic bone marrow stromal cells derived from transgenic mice carrying a type I collagen-chloramphenicol acetyltransferase reporter gene to follow the fate of the transplanted cells. Control surgical sites were treated with spleen stromal cells or gelatin sponges alone, or were left untreated. The surgical defects were analyzed histologically for percent closure of the defect at 2, 3, 4, 6, and 12 weeks. Results. Cultured bone marrow stromal cells transplanted within gelatin sponges resulted in osteogenesis that repaired greater than 99.0±2.20% of the original surgical defect within 2 weeks. In contrast, cranial defects treated with splenic fibroblasts, vehicle alone, or sham-operated controls resulted in minimal repair that was limited to the surgical margins. Bone marrow stromal cells carrying the collagen transgene were immunodetected only in the newly formed bone and thus confirmed the donor origin of the transplanted cells. Conclusions. These studies demonstrate that mitotically expanded bone marrow cells can serve as an abundant source of osteoprogenitor cells that are capable of repairing craniofacial skeletal defects in mice without the addition of growth or morphogenetic factors.

Journal ArticleDOI
TL;DR: HBV is thought to be transmitted to recipients by liver grafts from HBcAb(+) donors at a significantly high rate; prevention of viral activation and clinical disease by means of passive immunization with HBIG seems promising, although the follow-up period in the study may be too short for any definitive conclusions.
Abstract: Background In order to clarify the risk of hepatitis B virus (HBV) transmission from hepatitis B core antibody-positive (HBcAb(+)) donors and to evolve a new strategy to counter such a risk, we undertook a retrospective (1990-1995) and prospective (1995-1996) analysis of our experience with living related liver transplantation involving HBcAb(+) donors. Methods Between June 15, 1990, and June 30, 1995, HBcAb(+) individuals were not excluded as donor candidates at our institutions. For 171 liver transplants, 16 donors were HBcAb(+). Between July 1, 1995, and June 30, 1996, HBcAb(+) individuals were generally excluded as donor candidates; however, three recipients were given liver grafts from HBcAb(+) donors because other donor candidates presented even higher risks. In the latter period, recipients with transplants from HBcAb(+) donors underwent prophylactic passive immunization with hyperimmune hepatitis B immunoglobulin (HBIG). The serum of 10 HBcAb(+) donors was examined by nested polymerase chain reaction for the presence of HBV-DNA, but it was not detected in any of them. However, the same examination of the liver tissue of five such donors yielded positive results in all cases. Results In the first 5-year period, out of 16 recipients with HBcAb(+) donors, 15 became hepatitis B surface antigen-positive after transplant. The three recipients with HBcAb(+) donors during the second 1-year period, who were treated by prophylactic passive immunization with HBIG, remained hepatitis B surface antigen-negative and negative for serum HBV-DNA after transplant. Conclusions HBV exists in the liver of healthy HBcAb(+) individuals, but not in the blood. Therefore, HBV is thought to be transmitted to recipients by liver grafts from the HBcAb(+) donors at a significantly high rate. The prevention of viral activation and clinical disease development by means of passive immunization with HBIG seems promising, although the follow-up period in our study may be too short for any definitive conclusions.

Journal ArticleDOI
TL;DR: Liver transplantation may be justified in selected patients to provide immediate relief of otherwise intractable pain or hormone-related symptoms. In older patients with extrahepatic disease requiring extended operations, long-term results are discouraging, and the small benefit achieved by liver transplantation must be weighed against medical treatment options and the natural course of often slowly progressing disease.
Abstract: Background. Patients with neuroendocrine carcinoma often present with liver metastases not amenable to hepatic resection. For them, liver transplantation has been considered a viable treatment option, especially if hormonal symptoms and pain cannot be controlled medically. Still, little is known regarding potential prognostic factors and long-term survival after liver transplantation for neuroendocrine tumors. Methods. A search of English, French, and German literature identified patients with liver transplantation for extensive metastases from neuroendocrine carcinoma for whom follow-up data were available. Results. Overall, 2-year and 5-year survival for all 103 patients was 60% and 47%, respectively, but recurrence-free 5-year survival did not exceed 24%. Univariate analysis identified age less than 50 years, primary tumor location in lung or bowel, and pretransplant somatostatin therapy as favorable prognostic factors, whereas extended operations combining liver transplantation with upper abdominal exenteration or Whipple's procedure were associated with poor prognosis. Multivariate analysis identified age greater than 50 years (P<0.03) and transplantation combined with upper abdominal exenteration or Whipple's operation (P<0.001) as adverse prognostic factors. Conclusions. Liver transplantation may be justified in selected patients to provide immediate relief of otherwise intractable pain or hormone-related symptoms. Transplantation with curative intent appears worthwhile in young patients with only hepatic disease. In older patients with extrahepatic disease requiring extended operations, long-term results are discouraging, and the small benefit achieved by liver transplantation must be weighed against medical treatment options and the natural course of often slowly progressing disease.

Journal ArticleDOI
TL;DR: I.v.IG appears to be an effective therapy to control posttransplant AR episodes in heart and kidney transplant recipients, including patients who have had no success with conventional therapies.
Abstract: Background. Intravenous gammaglobulin (i.v.IG) contains anti-idiotypic antibodies that are potent inhibitors of HLA-specific alloantibodies in vitro and in vivo. In addition, highly HLA-allosensitized patients awaiting transplantation can have HLA alloantibody levels reduced dramatically by i.v.IG infusions, and subsequent transplantation can be accomplished successfully with a crossmatch-negative, histoincompatible organ. Methods. In this study, we investigated the possible use of i.v.IG to reduce donor-specific anti-HLA alloantibodies arising after transplantation and its efficacy in treating antibody-mediated allograft rejection (AR) episodes. We present data on 10 patients with severe allograft rejection, four of whom developed AR episodes associated with high levels of donor-specific anti-HLA alloantibodies. Results. Most patients showed rapid improvements in AR episodes, with resolution noted within 2-5 days after i.v.IG infusions in all patients. i.v.IG treatment also rapidly reduced donor-specific anti-HLA alloantibody levels after i.v.IG infusion. All AR episodes were reversed. Freedom from recurrent rejection episodes was seen in 9 of 10 patients, some with up to 5 years of follow-up. Results of protein G column fractionation studies from two patients suggest that the potential mechanism by which i.v.IG induces in vivo suppression is a sequence of events leading from initial inhibition due to passive transfer of IgG to eventual active induction of an IgM or IgG blocking antibody in the recipient. Conclusion. I.v.IG appears to be an effective therapy to control posttransplant AR episodes in heart and kidney transplant recipients, including patients who have had no success with conventional therapies. Vascular rejection episodes associated with the development of donor-specific cytotoxic antibodies appear to be particularly responsive to i.v.IG therapy.

Journal ArticleDOI
TL;DR: First cadaver kidney transplant recipients with anti-HCV antibodies had significantly shorter long-term patient and graft survival than recipients without anti-HCV antibodies, suggesting that HCV infection has a harmful long-term impact on the survival of kidney transplant recipients.
Abstract: Background. The long-term impact of hepatitis C virus (HCV) infection in renal transplant recipients remains controversial. We report here our experience, in a homogeneous single center, of 499 patients with a fairly long follow-up. Methods. We retrospectively studied 499 hepatitis B virus-negative patients who received an initial cadaver donor kidney transplantation at Necker Hospital between January 1, 1979 and December 31, 1994, with a graft or patient survival of at least 6 months. Anti-HCV antibodies were detected at time of transplantation in 112 patients (22%). Patient survival and causes of death were compared among anti-HCV-positive and -negative patients. Results. Our results clearly indicate that first cadaver kidney transplant recipients with anti-HCV antibodies had a significantly shorter patient and graft long-term survival than recipients without anti-HCV antibodies (P<0.01 and P<0.0001 respectively). Mean follow-up time after transplantation was 79±2 months in the former group and 81±5 months in the latter (NS). Increased mortality was primarily caused by liver disease (P<0.001) and sepsis (P<0.01). In a multivariate analysis, HCV infection significantly affected the mortality rate (odds ratio: 2.8). Conclusions. These results suggest that HCV infection has a harmful long-term impact on the survival of kidney transplant recipients.


Journal ArticleDOI
TL;DR: The world experience of both pig-to-human and pig-to-nonhuman primate organ transplantation is reviewed, finding that the future of xenotransplantation may lie in the judicious combination of current approaches.
Abstract: The pig-to-primate model is increasingly being utilized as the final preclinical means of assessing therapeutic strategies aimed at allowing discordant xenotransplantation. We review here the world experience of both pig-to-human and pig-to-nonhuman primate organ transplantation. Eight whole organ transplants using discordant mammalian donors have been carried out in human recipients; only one patient was reported (in 1923) to have survived for longer than 72 hr. Therapeutic approaches in the experimental laboratory setting have included pharmacologic immunosuppression, antibody and/or complement depletion or inhibition, the use of pig organs transgenic for human complement regulatory proteins, and conditioning regimens aimed at inducing a state of tolerance or specific immunologic hyporesponsiveness. The greatest success to date has been obtained with methods that inhibit complement-mediated injury, either by the administration of cobra venom factor or soluble complement receptor I to the recipient (with organ survival up to 6 weeks) or by the use of donor organs transgenic for human decay-accelerating factor (with organ survival up to 2 months). The future of xenotransplantation may lie in the judicious combination of current approaches.

Journal ArticleDOI
TL;DR: Quercetin and curcumin reduce ischemia-reperfusion injury and its inflammatory sequelae and hold promise as agents that can reduce immune and nonimmune renal injury, the key risk factors in chronic graft loss.
Abstract: BACKGROUND Nonimmune renal injury plays an important role in acute and chronic rejection by triggering an injury response through cytokine and chemokine release. Bioflavonoids are agents with potential immunosuppressive and renoprotective properties. We studied the effects of quercetin and curcumin, two bioflavonoids, on ischemia-reperfusion in the rat. METHODS Rats underwent 30 min of left renal pedicle occlusion with simultaneous right nephrectomy and were pretreated with quercetin or curcumin. Serial serum creatinine was measured, and renal expression of the chemokines regulated upon activation, normal T-cell expressed and secreted (RANTES), monocyte chemoattractant protein-1 (MCP-1), and allograft inflammatory factor (AIF) was quantified by polymerase chain reaction. RESULTS Pretreatment with quercetin or curcumin resulted in preservation of histological integrity, with a decrease in tubular damage and interstitial inflammation. On day 2 after ischemia-reperfusion, quercetin pretreatment decreased the mean serum creatinine level from 6.5+/-1.4 to 3.3+/-0.5 mg/dl (P=0.06). On day 7, the creatinine level for control animals was 7.5+/-1.5 mg/dl, which was significantly decreased by pretreatment with quercetin, curcumin, or both together (creatinine levels: 1.6+/-1.3, 1.8+/-0.2, and 2.0+/-0.4 mg/dl, respectively; all P<0.05 vs. untreated). By semiquantitative polymerase chain reaction, RANTES, MCP-1, and AIF were detected at high levels in kidneys on day 2 but not in normal kidneys. Pretreatment with quercetin or curcumin strongly attenuated this expression. CONCLUSION Quercetin and curcumin reduce ischemia-reperfusion injury and its inflammatory sequelae. The bioflavonoids hold promise as agents that can reduce immune and nonimmune renal injury, the key risk factors in chronic graft loss.

Journal ArticleDOI
TL;DR: Recipients who develop EAD have longer ICU and hospital stays and greater mortality than those without, and a logistic regression model combining donor, graft, and recipient factors predicted EAD better than models examining these factors in isolation.
Abstract: Background. Poor graft function early after liver transplantation is an important cause of morbidity and mortality. We defined early allograft dysfunction (EAD) using readily available indices of function and identified donor, graft, and pretransplant recipient factors associated with this outcome. Methods. This study examined 710 adult recipients of a first, single-organ liver transplantation for non-fulminant liver disease at three United States centers. EAD was defined by the presence of at least one of the following between 2 and 7 days after liver transplantation: serum bilirubin >10 mg/dl, prothrombin time (PT) ≥17 sec, and hepatic encephalopathy. Results. EAD incidence was 23%. Median intensive care unit (ICU) and hospital stays were longer for recipients with EAD than those without (4 days vs. 3 days, P=0.0001; 24 vs. 15 days, P=0.0001, respectively). Three-year recipient and graft survival were worse in those with EAD than in those without (68% vs. 83%, P=0.0001; 61% vs. 79%, P=0.0001). A logistic regression model combining donor, graft, and recipient factors predicted EAD better than models examining these factors in isolation. Pretransplant recipient elevations in PT and bilirubin, awaiting a graft in hospital or the ICU, donor age ≥50 years, donor hospital stay >3 days, preprocurement acidosis, and cold ischemia time ≥15 hr were independently associated with EAD. Conclusion. Recipients who develop EAD have longer ICU and hospital stays and greater mortality than those without. Donor, graft, and recipient risk factors all contribute to the development of EAD. Results of these analyses identify factors that, if modified, may alter the risk of EAD.
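The EAD definition used above is a simple rule over three readily available indices. Encoded as a hypothetical helper (names and structure are illustrative, not taken from the study), it might look like this:

# Hypothetical helper encoding the EAD definition stated in the abstract;
# names and structure are illustrative, not taken from the study.
from dataclasses import dataclass

@dataclass
class PostTransplantLabs:
    """Worst values observed between days 2 and 7 after liver transplantation."""
    bilirubin_mg_dl: float        # serum bilirubin, mg/dl
    prothrombin_time_sec: float   # prothrombin time, seconds
    hepatic_encephalopathy: bool

def has_early_allograft_dysfunction(labs: PostTransplantLabs) -> bool:
    # EAD = bilirubin >10 mg/dl, PT >=17 sec, or hepatic encephalopathy,
    # any one criterion between days 2 and 7 being sufficient.
    return (
        labs.bilirubin_mg_dl > 10.0
        or labs.prothrombin_time_sec >= 17.0
        or labs.hepatic_encephalopathy
    )

# Example: an isolated PT of 18 sec meets the definition.
print(has_early_allograft_dysfunction(
    PostTransplantLabs(bilirubin_mg_dl=6.2, prothrombin_time_sec=18.0,
                       hepatic_encephalopathy=False)))  # True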

Journal ArticleDOI
TL;DR: ABO-incompatible transplant recipients experienced a significantly higher rate of early graft loss up to 3 years but showed equivalent graft loss by year 4, while ABO-incompatible patients with end-stage renal failure experienced a significant increase in graft survival.
Abstract: Background. Despite great efforts to promote the donation of cadaveric organs, the number of organ transplantations in Japan is not increasing and a serious shortage of cadaveric organs exists. These circumstances have forced a widening of indications for kidney transplantation. For this purpose, ABO-incompatible living kidney transplantations (LKTs) have been performed. Although we have already reported the short-term results of ABO-incompatible LKT, there is no report of long-term results in such cases; anti-A and anti-B antibodies could cause antibody-induced chronic rejection and result in poor long-term graft survival. In this study, we have reviewed the long-term results of ABO-incompatible LKT and tried to identify the most important factors for long-term renal function in ABO-incompatible LKT. Methods. Sixty-seven patients with end-stage renal failure underwent ABO-incompatible living kidney transplantation at our institute between January, 1989, and December, 1995. The mean age was 34.9 years (range, 8-58 years), with 38 males and 29 females. Incompatibility in ABO blood group antigens was as follows: A1

Journal ArticleDOI
TL;DR: The loss of a primary renal allograft was associated with significant mortality, especially in recipients with type I DM, and repeat transplantation was associated with a substantial improvement in 5-year patient survival.
Abstract: Background Survival of transplant recipients after primary renal allograft failure has not been well studied. Methods. A cohort of 19,208 renal transplant recipients with primary allograft failure between 1985 and 1995 were followed from the date of allograft loss until death, repeat transplantation, or December 31, 1996. The mortality, wait-listing, and repeat transplantation rates were assessed. The mortality risks associated with repeat transplantation were estimated with a time-dependent survival model. Results. In total, 34.5% (n=6,631) of patients died during follow-up. Of these deaths, 82.9% (n=5,498) occurred in patients not wait-listed for repeat transplantation, 11.9% (n=789) occurred in wait-listed patients, and 5.2% (n=344) occurred in second transplant recipients. Before repeat transplantation, the adjusted 5-year patient survival was 36%, 49%, and 65% for type I diabetes mellitus (DM), type II DM, and nondiabetic end-stage renal disease, respectively (P<0.001; DM vs. nondiabetics). The adjusted 5-year patient survival was lower in Caucasians (57%, P<0.001) compared with African-Americans (67%) and other races (64%). The 5-yr repeat transplantation rate was 29%, 15%, and 19%, whereas the median waiting time for a second transplant was 32, 90, and 81 months for Caucasians, African-Americans, and other races, respectively (P<0.0001 each). Repeat transplantation was associated with 45% and 23% reduction in 5-year mortality for type I DM and nondiabetic end-stage renal disease, respectively, when compared with their wait-listed dialysis counterparts with prior transplant failure. Conclusions. The loss of a primary renal allograft was associated with significant mortality, especially in recipients with type I DM. Repeat transplantation was associated with a substantial improvement in 5-year patient survival. Recipients with type I DM achieved the greatest proportional benefit from repeat transplantation.
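The "time-dependent survival model" mentioned above treats repeat transplantation as a covariate that switches on at the time of retransplant. A minimal sketch of that idea, using lifelines' CoxTimeVaryingFitter on made-up follow-up intervals rather than the registry data, might look like this:

# Illustrative sketch only: synthetic follow-up intervals, not the registry cohort.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Each row is an interval of follow-up after primary graft failure.
# 'retransplanted' switches from 0 to 1 at the time of repeat transplantation,
# so its coefficient estimates the mortality hazard associated with retransplant.
records = pd.DataFrame([
    {"id": 1, "start": 0.0, "stop": 2.0, "death": 0, "retransplanted": 0},
    {"id": 1, "start": 2.0, "stop": 6.5, "death": 0, "retransplanted": 1},
    {"id": 2, "start": 0.0, "stop": 3.1, "death": 1, "retransplanted": 0},
    {"id": 3, "start": 0.0, "stop": 1.0, "death": 0, "retransplanted": 0},
    {"id": 3, "start": 1.0, "stop": 4.2, "death": 1, "retransplanted": 1},
    {"id": 4, "start": 0.0, "stop": 5.0, "death": 0, "retransplanted": 0},
    {"id": 5, "start": 0.0, "stop": 2.7, "death": 1, "retransplanted": 0},
    {"id": 6, "start": 0.0, "stop": 0.8, "death": 0, "retransplanted": 0},
    {"id": 6, "start": 0.8, "stop": 6.0, "death": 0, "retransplanted": 1},
])

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="death", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for 'retransplanted' vs. remaining wait-listed on dialysis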

Journal ArticleDOI
TL;DR: The results reported here demonstrate that hDAF transgenic pig hearts are not hyperacutely rejected when transplanted into baboon recipients and are capable of maintaining cardiac output in baboons.
Abstract: Background. Previous studies demonstrated that hearts from transgenic pigs expressing human decay-accelerating factor (hDAF) were not hyperacutely rejected when transplanted heterotopically into the abdomen of cynomolgus monkeys. This study examines orthotopic transplantation of hDAF transgenic pig hearts into baboon recipients. Methods. Orthotopic xenogeneic heart transplantation was performed using piglets, transgenic for hDAF, as donors. Ten baboons were used as recipients and were immunosuppressed with a combination of cyclophosphamide, cyclosporine, and steroids. Results. Five grafts failed within 18 hr without any histological signs of hyperacute rejection. Pulmonary artery thrombosis induced by a size mismatch was observed in two of these animals. The other three recipients died because of failure to produce even a low cardiac output and/or dysrhythmia. The remaining five animals survived between four and nine days. One animal died of bronchopneumonia on day 4. Three xenografts stopped beating on day 5 due to acute vascular rejection. The longest survivor was killed on day 9 with a beating, histologically normal xenograft, because of pancytopenia. Conclusions. The results reported here demonstrate that hDAF transgenic pig hearts are not hyperacutely rejected when transplanted into baboon recipients. Orthotopically transplanted transgenic pig hearts are capable of maintaining cardiac output in baboons. An optimum immunosuppressive regimen is the subject of ongoing research.

Journal ArticleDOI
TL;DR: Results of this large, multicenter survey, designed to identify variables that affect the likelihood of compliance with immunosuppressive medication regimens and distinguish among noncompliant patients, can be used by clinicians to identify patients likely to become noncompliant, by researchers to develop randomized, prospective clinical trials of interventions designed to increase compliance, and by educators to tailor patient education programs.
Abstract: Background. Noncompliance with medication is a major cause of renal allograft failure among adult renal transplant patients. We summarize previous studies of noncompliance and report results of a large, multicenter survey designed to identify variables that (1) affect the likelihood of compliance with immunosuppressive medication regimens and (2) distinguish among noncompliant patients. Methods. Questionnaires were distributed to 2500 patients at 56 U.S. transplant centers. Compliance was determined by patient responses to questions concerning whether, within the previous 4 weeks, one or more doses of immunosuppressive medications had been missed. Independent variables included patient and transplant characteristics, memories of dialysis, posttransplant symptoms and beliefs, and beliefs concerning the efficacy and importance of immunosuppressants. Results. The incidence of noncompliance reported by the 1402 respondents was 22.4%. A logistic regression model that included age, occupation, time since transplant, and three medication-related beliefs was most predictive of the likelihood of compliance. Donor type and histories of diabetes and of infection entered the multivariate model when belief-related variables were excluded. Cluster analyses identified three distinct profiles of noncompliers: accidental noncompliers, invulnerables, and decisive noncompliers. Conclusions. Results of this study, which included nearly three times more patients than the largest previously reported study, can be used by clinicians to identify patients likely to become noncompliant, by researchers to develop randomized, prospective clinical trials of interventions designed to increase compliance, and by educators to tailor patient education programs.

Journal ArticleDOI
Russell H. Wiesner
TL;DR: Tacrolimus is a safe and effective long-term maintenance immunosuppressive agent in primary liver transplantation, demonstrating an acceptable safety profile with maintenance of adequate renal and liver function and a low incidence of malignancy/lymphoproliferative disease and serious infections.
Abstract: Background. The long-term (5-year) efficacy and safety of tacrolimus (FK506) and cyclosporine were compared in primary liver transplant recipients who participated in a 1-year randomized, multicenter trial and a 4-year follow-up extension study. Methods. A total of 529 patients (263 tacrolimus group, 266 cyclosporine group) were randomized to study drug. Patients were evaluated at 3-month intervals. Patient and graft survival rates, incidence of adverse events, and changes in laboratory and clinical profiles were determined. Results. Cumulative 5-year patient and graft survival rates were comparable for the tacrolimus (79.0%, 71.8%) and cyclosporine (73.1%, 66.4%) groups. However, the patient survival half-life was longer for tacrolimus-treated patients (25.1±5.1 years versus 15.2±2.5 years; P=0.049). Improved patient survival with tacrolimus was also observed for hepatitis C-positive patients (78.9% tacrolimus group versus 60.5% cyclosporine group; P=0.041). Both treatments were associated with a low incidence of late acute rejection, late steroid-resistant rejection, and death or graft loss related to rejection. Both treatments demonstrated an acceptable safety profile with maintenance of adequate renal and liver function and a low incidence of malignancy/lymphoproliferative disease and serious infections. Conclusions. Tacrolimus is a safe and effective long-term maintenance immunosuppressive agent in primary liver transplantation.