
Showing papers in "Clinical Transplantation in 2002"


Journal Article
TL;DR: After reduction of immunosuppression, the course of BKAN in most patients followed one of 2 pathways: 1) clearance of the infection and disappearance of the viral cytopathic changes in biopsies and urine, or 2) persistence of viral replication with continuing associated tubular damage.
Abstract: The first case of BK virus allograft nephropathy (BKAN) at the University of Maryland Renal Transplant Program was diagnosed in 1997. Since then more than 100 cases have been identified. The incidence of BKAN has increased from 1% for patients transplanted in 1997 to 5.8% for patients transplanted in 2001. BKAN is an important cause of premature kidney graft loss at the University of Maryland Transplant Program. One-third of the patients diagnosed with BKAN since 1997 have already lost graft function, and a third of the remaining patients have creatinine levels over 3 mg/dl. We could not determine that a specific immunosuppressive drug increased the incidence of BKAN. Older patients had an increased risk of developing the disease. The histological diagnosis of BKAN was made at a mean time of 14.4 months after transplantation (range 1.2-53 months). BKAN occurred in 4.3% of all patients biopsied during the period described. The diagnosis of BK allograft nephropathy was based on a combination of renal biopsy to demonstrate viral cytopathic changes, urine cytology and quantitative viral load in plasma. A threshold of >10,000 copies of BK virus per ml of plasma is proposed as an indication of BKAN. Following diagnosis of BKAN, patients on a single immunosuppressive drug (FK506, CsA, sirolimus or MMF) in addition to prednisone had less graft loss and higher viral clearance in comparison with patients on prednisone and 2 immunosuppressant drugs (FK506, CsA or sirolimus, plus MMF). There was no difference in the rate of acute allograft rejection among different immunosuppression reduction protocols. Three patients who lost their grafts to BKAN were retransplanted; to date, none has shown evidence of recurrence of BKAN.
After reduction of immunosuppression, the course of BKAN in most patients followed one of 2 pathways: 1) clearance of the infection and disappearance of the viral cytopathic changes in biopsies and urine (20%); or 2) persistence of viral replication with continuing associated tubular damage (70%). Renal transplant patients should be routinely screened with urine cytology. The presence of decoy cells in the urine is an indication for quantitative measurement of viral load in plasma. Patients with any evidence of BK viral reactivation should be followed closely. In patients biopsied early due to persistence of BK virus-infected cells in urine, there is a higher rate of conversion from positive to negative urine cytology after reduction of immunosuppression.
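The screening pathway described above (routine urine cytology, then quantitative plasma viral load once decoy cells appear, with the proposed >10,000 copies/mL threshold) can be sketched as a simple decision rule. This is an illustrative sketch only: the function name and the returned action strings are hypothetical, while the threshold is the one proposed in the abstract.

```python
from typing import Optional

# Threshold proposed in the abstract as an indication of BKAN.
BKAN_THRESHOLD_COPIES_PER_ML = 10_000

def bk_screening_step(decoy_cells_in_urine: bool,
                      plasma_viral_load: Optional[float] = None) -> str:
    """Return an illustrative next action in the BK screening pathway."""
    if not decoy_cells_in_urine:
        # No decoy cells: stay on routine urine cytology screening.
        return "continue routine urine cytology screening"
    if plasma_viral_load is None:
        # Decoy cells present: this is the indication for plasma testing.
        return "measure quantitative BK viral load in plasma"
    if plasma_viral_load > BKAN_THRESHOLD_COPIES_PER_ML:
        return "viral load above proposed BKAN threshold: biopsy / close follow-up"
    # Reactivation without BKAN-level viraemia still warrants monitoring.
    return "BK reactivation without BKAN-level viraemia: follow closely"
```

The real clinical pathway of course involves biopsy confirmation and clinical judgment; this encodes only the ordering of screening steps stated in the abstract.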

185 citations


Journal Article
TL;DR: Humar et al. performed a multivariate analysis to determine risk factors for slow graft function after kidney transplants, which had not been well defined (in contrast to risk factors for DGF).
Abstract: Humar A, Ramcharan T, Kandaswamy R, Gillingham K, Payne WD, Matas AJ. Risk factors for slow graft function after kidney transplants: a multivariate analysis. Clin Transplant 2002: 16: 425–429. © Blackwell Munksgaard, 2002 Background: We previously defined an intermediate group of cadaver kidney transplant recipients who do not have immediate graft function (IGF), but do not have sufficient graft dysfunction to be classified as having delayed graft function (DGF). We showed that this group with slow graft function (SGF) had an increased risk of rejection and inferior long-term results vs. recipients with IGF. The aim of our current study was to determine risk factors for SGF, which have not been well defined (in contrast to risk factors for DGF). Methods: Between January 1, 1984 and September 30, 1999, we performed 896 adult cadaver kidney transplants at the University of Minnesota. Recipients were analysed in three groups based on initial graft function: IGF [creatinine (Cr) < 3 mg/dL by POD no. 5], SGF (Cr > 3 mg/dL on POD no. 5, but no need for dialysis), and DGF (need for dialysis in the first week post-transplant). A multivariate analysis looked specifically at risk factors for SGF, as compared with risk factors for DGF. Outcomes with regard to graft survival and acute rejection (AR) rates were determined for the three groups. Results: Of the 896 recipients, 425 had IGF, 238 had SGF, and 233 had DGF. A multivariate analysis of risk factors for SGF showed donor age >50 yr (RR=3.3, p=0.0001) and kidney preservation time >24 h (RR=1.6, p=0.01) to be the most significant risk factors. A multivariate analysis of risk factors for DGF showed similar findings, although high panel-reactive antibodies (PRA) and donor Cr >1.7 mg/dL were also significant risk factors for DGF.
Initial function of the graft significantly influenced the subsequent risk of AR: at 12 months post-transplant, the incidence of AR was 28% for those with IGF, 38% for those with SGF, and 44% for those with DGF (p=0.04 for SGF vs. DGF). Initial graft function also significantly influenced graft survival: the 5-yr death-censored graft survival rate was 89% for recipients with IGF, 72% for those with SGF, and 67% for those with DGF (p=0.01 for IGF vs. SGF; p=0.03 for SGF vs. DGF). Conclusions: SGF represents part of the spectrum of graft injury and post-transplant graft dysfunction. Risk factors for SGF are similar to those seen for DGF. Even mild to moderate graft dysfunction post-transplant can have a negative impact on long-term graft survival.
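The three-way grouping used in this study reduces to two observations: whether dialysis was needed in the first post-transplant week, and the creatinine on postoperative day 5. A minimal sketch, assuming the study's definitions (DGF = dialysis in the first week; SGF = Cr > 3 mg/dL on POD 5 without dialysis; IGF = neither); the function name is illustrative:

```python
def classify_graft_function(cr_pod5_mg_dl: float,
                            dialysis_first_week: bool) -> str:
    """Classify initial kidney graft function as IGF, SGF, or DGF.

    Thresholds follow the study's definitions:
    - DGF: dialysis needed in the first post-transplant week
    - SGF: creatinine > 3 mg/dL on postoperative day 5, no dialysis
    - IGF: neither of the above
    """
    if dialysis_first_week:
        return "DGF"
    if cr_pod5_mg_dl > 3.0:
        return "SGF"
    return "IGF"
```

This ordering matters: dialysis takes precedence, so a patient dialysed in the first week is DGF regardless of the day-5 creatinine.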

140 citations


Journal Article
TL;DR: This report examines the longitudinal relationship between adverse effects and QOL, with particular attention to the relative impact of adverse effects associated with immunosuppression.
Abstract: Introduction: Previous cross-sectional analyses have identified significant associations between quality of life (QOL), comorbidities and adverse effects in renal transplant recipients. This report examines the longitudinal relationship between adverse effects and QOL, with particular attention to the relative impact of adverse effects associated with immunosuppression. Methods: The Transplant Learning Center (TLC) is a program designed to improve QOL and preserve graft function in transplant recipients. Self-selected enrollees filled out questionnaires at roughly 3-month intervals. Each questionnaire included QOL scales developed for the program. Repeated measures multiple regression analysis was used to examine the relationship between the QOL scales, comorbidities, adverse effects, adjusting for other factors. Results: A total of 4247 TLC enrollees were included in the analysis, with a mean time since transplant of 5.1 yr. Comorbidities and adverse effects were common, with high blood pressure reported by 87% of respondents and unusual hair growth reported by 69.6%. In bivariate analysis, emotional/psychological problems and headaches had the largest impact on QOL. In multivariate analysis, emotional/psychological problems decreased sexual interest or ability, and headache had the largest adverse QOL effect. Conclusions: We have identified QOL issues that have been previously underemphasized in transplant recipients. These findings open new areas of research to further explore and define these issues. They provide new opportunities for interventions to address factors adversely impacting QOL and to develop strategies to improve QOL in these patients. Clinicians should actively solicit information about adverse effects of medications, particularly information about sexual and relationship issues, when evaluating renal recipients. These issues should be taken into account when making therapeutic decisions.

115 citations


Journal Article
TL;DR: As of October 10, 2002, nearly 19,000 pancreas transplants had been reported to the IPTR, approximately 14,000 in the US and approximately 5,000 outside the US, with progressive improvement in outcome.
Abstract: As of October 10, 2002, nearly 19,000 pancreas transplants had been reported to the IPTR, approximately 14,000 in the US and approximately 5,000 outside the US. An era analysis of US cases from 1987-2002 showed a progressive improvement in outcome. The proportion of recipients > 45 years old increased from 9-11% for 1987-92 to 25-30% for 1999-2002 cases, and the improved outcomes encompassed the older patients as well. Contemporary pancreas transplant outcomes were calculated separately for 1996-2002 US and non-US cases. The results of the US analysis are summarized first. US patient survival rates at one year were > or = 94% in each recipient category, with one-year primary pancreas GSRs of 84% for SPK (n = 5,784), 76% for PAK (n = 1,033), and 77% for PTA (n = 470); in the non-US analysis, one-year GSRs were > or = 80% in all 3 recipient categories. An analysis of GSRs according to type of anti-T-cell antibody agents (depleting vs. non-depleting vs. none) for induction therapy in TAC+MMF treated recipients did not detect significant differences in any of the categories. The absolute numbers and the proportions of US pancreas grafts that were retransplants in the SPK and PTA categories (1% and 10%, respectively) were relatively small, while a large number of the PAK grafts were retransplants (n = 346; 32% of the total). The majority of the latter were done after isolated failure of a pancreas graft in SPK recipients. In the SPK (n = 78) and PTA (n = 53) categories, the retransplant GSRs were significantly (p < or = 0.06) lower than those for primary transplants. In the PAK category, however, GSRs were not significantly different (p = 0.14) for retransplant versus primary cases (72% and 76%, respectively, at one year). Known causes of death in 1996-2002 US pancreas recipients were tabulated for each category.
Most were from cardio-cerebro-vascular accidents (1.3-2.6% incidence) or from infections (1.2-1.4% incidence), while death from malignancy or PTLD was reported only rarely. In the non-US analysis, GSRs were > or = 80% at one year in all categories of recipients (SPK, PAK, PTA). The solitary pancreas transplant outcomes continue to improve as more are done.

110 citations


Journal Article
TL;DR: In this paper, the authors found that DGF was associated with increased EAR (odds ratio = 1.7) within 6 months of transplant, whereas EAR, but not DGF, was associated with increased LAR from 6 months to one year; DGF, EAR and LAR were all independent risk factors for long-term graft loss.
Abstract: 1. From 1991 to 1998, the incidence of DGF remained at 21% of all kidney grafts (n = 86,682) reported to the UNOS Scientific Transplant Registry. In contrast, percentages of early acute rejection (EAR) and late acute rejection (LAR) have dropped precipitously to half their starting values. (EAR started at 37% and dropped to 18%, and LAR started at 11% and dropped to 5%.) 2. Among discharged recipients, DGF was associated with increased EAR (odds ratio = 1.7) within 6 months of transplant; whereas, EAR (odds ratio = 4.7) but not DGF (odds ratio = 1.1) was associated with increased LAR for recipients from 6 months to one year after transplantation. 3. Non-immune factors (e.g., duration of pretransplant dialysis, donor age, and cold ischemia time) primarily influenced the risk of DGF, and immune factors (e.g., recipient race, recipient age, HLA) mainly determined the risk of EAR and LAR. 4. DGF, EAR and LAR were independent risk factors for long-term graft loss. DGF and LAR exhibited the strongest influences, reducing half-lives by 30% and 50%, respectively. 5. Some long-term risk factors demonstrated consistent effects regardless of DGF and/or LAR. For example, Black recipients always had poor long-term GS. On the other hand, some risk factors, mostly immune-type factors, exhibited effects only in the absence of DGF (e.g., recipient sex, age and HLA matching). Many non-immune factors exhibited long-term effects only in the absence of LAR (e.g., donor age, cause of donor death). 6. Strategies aimed at reducing both DGF and AR are necessary to improve the long-term outcome of kidney transplants.

103 citations


Journal Article
TL;DR: Mortality and graft loss after renal transplantation are influenced by hepatitis virus infection; replicating HBV infection, concomitant HBV and HCV infection, and HCV infection acquired after transplantation carry the worst prognosis.
Abstract: Background: Mortality or graft loss after renal transplantation might be influenced by hepatitis virus infection. Methods: Sera from the time of transplantation of 927 renal transplant recipients were tested for hepatitis C (HCV) and hepatitis B virus (HBV) in order to investigate the impact of hepatitis virus infection on graft loss and mortality over an observation period of 20 yr. Results: One hundred and twenty-three of 927 patients were HCV positive, 30 patients HBV positive and seven patients HBV and HCV positive. The observation period was 9.2 ± 4.4 yr. Mortality was significantly higher in patients with hepatitis B (p = 0.0005), as well as in patients with concomitant B and C hepatitis (p < 0.0001) and in those who acquired HCV infection after transplantation (n = 30, p = 0.0192), compared with non-infected patients. Patients with replicating HBV infection (HBeAg positive) had the worst prognosis (p < 0.0001). In the multivariate analysis, the presence of HBeAg (p < 0.0001), patients' age (p < 0.0001) and HCV infection after transplantation (p = 0.0453) were predictors of death. Graft survival was significantly shorter in patients with concomitant hepatitis B and C (p = 0.0087) as well as in HBeAg-positive patients (p = 0.002). HCV infection or HBs antigenemia did not have a significant impact on graft survival compared with non-infected patients. Conclusion: HCV infection acquired after transplantation is associated with high mortality, whereas chronic HCV infection before transplantation does not have a significant impact on mortality. Patients with replicating HBV infection or concomitant HBV and HCV infection have a high risk of graft loss and mortality.

96 citations


Journal Article
TL;DR: This analysis of paired cadaver kidneys indicated that obesity is not a risk factor for DGF, acute rejection, and 1‐yr graft survival, however, a decreased medium‐ and long‐term graft survival trend, which reached statistical significance at 5 yr, was observed in obese recipients.
Abstract: Background: Previous studies indicate that obesity is a risk factor in renal transplantation. However, these analyses did not control for variable donor factors that may strongly influence outcome. To control for donor variables such as age, cause of death, procurement techniques, preservation methods, cold ischaemia time and implantation technique, we analysed patient and graft survival in recipients of paired kidneys, derived from the same procurement procedure, preserved in the same manner, subjected to similar cold ischaemia time and implanted by the same surgical team. Between June 1992 and August 1999, 28 procurement procedures provided kidneys which were transplanted into one obese and one non-obese recipient. Body mass index (BMI) was calculated as kg/m2. Recipients were classified as obese (BMI > 30) or non-obese (BMI < 30). Immunotherapy for all recipients consisted of a triple therapy regimen of cyclosporine or Prograf, azathioprine or CellCept, and prednisone. Patients with delayed graft function (DGF), defined as the need for dialysis within 72 h of the transplant procedure, were treated with anti-thymocyte globulin (ATG) or thymoglobulin (TMG) induction for 5-7 d. The rates of DGF (7.1 versus 10.7%) and acute rejection (39.3 versus 35.7%) were similar in the obese and non-obese recipient groups. Patient survival was similar at 1, 3 and 5 yr in both groups. In addition, graft survival was similar at 1 yr. However, a trend toward decreased medium-term graft survival, which reached significance at 5 yr, was observed in the obese group. Furthermore, mean serum creatinine at 1 yr was higher in the obese group (2.0) compared with the non-obese group (1.4) (p = 0.12). This analysis of paired cadaver kidneys indicated that obesity is not a risk factor for DGF, acute rejection, or 1-yr graft survival. However, a decreased medium- and long-term graft survival trend, which reached statistical significance at 5 yr, was observed in obese recipients.
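The grouping above rests on the standard BMI formula, weight in kilograms divided by the square of height in metres, with BMI > 30 defining the obese group. A minimal sketch of that calculation; the function names are illustrative, not from the paper:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    """Obese group as defined in the study: BMI > 30 kg/m2."""
    return bmi(weight_kg, height_m) > 30.0
```

Note the study's cut-offs (BMI > 30 obese, BMI < 30 non-obese) leave BMI exactly 30 unassigned; the sketch follows the abstract and places it in the non-obese group.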

94 citations


Journal Article
TL;DR: The optimal doses of tacrolimus (FK) needed to reach and maintain a target blood level vary among cases of living donor liver transplantation (LDLT) and equations for predicting ID or SD were derived.
Abstract: The optimal doses of tacrolimus (FK) needed to reach and maintain a target blood level vary among cases of living donor liver transplantation (LDLT). One hundred and twenty four LDLTs in 122 patients were included in this study. Tacrolimus was administered by continuous intravenous infusion at a rate of 2.5 microg/kg/h just after the operation. The time needed to reach the target blood level and the dose needed to maintain this level for 1 week (17-18 ng/mL) were defined as the initial duration (ID) and secondary dose (SD), respectively. In the first 100 LDLTs, the correlations between ID or SD and some clinical factors were examined and equations for predicting ID or SD were derived. In the latest 24 LDLTs, FK was administered using these equations and the actual and calculated ID and SD values were compared. A multiple regression analysis revealed that only the graft weight/recipient standard liver volume (GW/SLV) ratio (%) correlated with ID or SD. Stepwise regression analysis led to the equations ID (h) = 0.4 x GW/SLV ratio + 0.2; SD (microg/kg/h) = 0.02 x GW/SLV ratio - 0.4. Simple regression analysis revealed a significant correlation between the actual and calculated ID and SD values (p < 0.0001). Initial duration and SD can be estimated from equations using the GW/SLV ratio.
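The two regression equations reported above depend only on the GW/SLV ratio (in percent), so they translate directly into code. A sketch of those equations; the function names are illustrative, and the coefficients are exactly those stated in the abstract:

```python
def initial_duration_h(gw_slv_ratio_pct: float) -> float:
    """Predicted hours (ID) to reach the target tacrolimus blood level
    (17-18 ng/mL), per the study's equation ID = 0.4 x GW/SLV + 0.2."""
    return 0.4 * gw_slv_ratio_pct + 0.2

def secondary_dose_ug_kg_h(gw_slv_ratio_pct: float) -> float:
    """Predicted infusion rate (SD, microg/kg/h) to maintain the target
    level for 1 week, per SD = 0.02 x GW/SLV - 0.4."""
    return 0.02 * gw_slv_ratio_pct - 0.4
```

For example, a graft with a GW/SLV ratio of 50% would give a predicted ID of about 20.2 h and an SD of about 0.6 microg/kg/h; the study validated such predictions against actual values in its last 24 LDLTs.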

87 citations


Journal Article
TL;DR: A prospective study using psychotherapeutic principles to understand and intervene in emotional issues in adult recipients of first cadaver kidney transplants.
Abstract: Background: Negative emotional states are the single most influential factor in determining quality of life after a successful kidney transplant. We designed a prospective study using psychotherapeutic principles to understand and intervene in emotional issues in adult recipients of first cadaver kidney transplants. Methods: Forty-nine recipients of first cadaver kidney transplants underwent 12 sessions (at weekly intervals) of psychotherapy within 3 months of receiving their transplant. The Beck Depression Inventory (BDI) was utilized as a measure of change in emotional state, pre-therapy and at 3, 6, 9 and 12 months. A higher score on the BDI was suggestive of psychological dysfunction. In the first instance, data were analysed within a quantitative framework, by virtue of the BDI. In the second instance, data were considered in terms of recurring themes described by patients during psychotherapy and were analysed qualitatively. In the third instance, both qualitative and quantitative data were considered in terms of individual patients' ability to achieve some feeling of having implemented some social, relational and vocational equilibrium in their everyday life. Recipients of live-donor kidneys, paediatric transplants and patients who received more than one transplant were excluded, as emotional issues are different in this cohort of patients. All patients have completed 1 yr of follow-up. None of the patients were on antidepressant medication before or after therapy. Results: This is an ongoing study in which we are comparing individual vs. group therapy vs. controls (who receive no therapy). The total number of patients recruited will be 120 and the final report will be available in 2003–04. The results reported in this paper are from the 49 patients in the individual arm of the study. All the patients in our study happened to be white. There was significant improvement in the BDI scores following therapy.
The mean score was 26.3 ± 7.9 before and 20.5 ± 8.8 after therapy (p = 0.001); the lowering of the scores remained sustained at 12 months. Multivariate analysis showed that age, gender, employment status, duration of dialysis (whether more than 3 yr) and psychotherapy given before transplantation did not affect the results of our study. For the qualitative aspect of the study, we grouped the emotional problems as expressed by the patients into three recurring themes: (i) fear of rejection, (ii) feelings of paradoxical loss post-transplant despite having received a successful transplant and (iii) the psychological integration of the newly acquired kidney. Conclusions: Psychotherapeutic intervention was an effective means of addressing emotional problems in recipients of kidney transplants. The recurring themes identified above provided a baseline for psychotherapeutic exploration and resolution of these issues. Successful resolution of these issues was associated with lower BDI scores and the redefinition of normality in daily living post-transplant.

82 citations


Journal Article
TL;DR: Infections and drugs were the most frequent causes of diarrhoea in the authors' series of renal transplant recipients; patients with diarrhoea had significantly higher creatinine and significantly lower albumin levels than recipients without diarrhoea.
Abstract: In this study, we retrospectively evaluated all attacks of diarrhoea in our renal transplant recipients that came to our medical attention between 1985 and 2000. The clinical features of patients with diarrhoea were also compared with those of recipients without diarrhoea. We diagnosed 41 attacks of diarrhoea in 39 (12.6%) of 308 renal transplant recipients during this time period. An aetiology was detected in 33 (80.5%) of all diarrhoeal episodes, and in seven (17.1%) of those the specific agent was diagnosed with the help of stool microscopy. The most frequent causes of diarrhoeal attacks were infectious agents (41.5%) and drugs (34%). Six (14.6%) episodes of diarrhoea were chronic and six were nosocomial. About two-thirds of diarrhoeal episodes developed within the late post-transplant period (>6 months). When recipients with diarrhoea were compared with those without, diarrhoeal patients had significantly higher creatinine and significantly lower albumin levels (p < 0.05). Also, the frequency of antibiotic usage was significantly higher in diarrhoeal patients than in the control group (p < 0.05). Four (10.2%) patients with diarrhoea died despite institution of appropriate therapy. Two of these deaths were primarily related to diarrhoea, and the aetiological agent was Clostridium difficile in both cases. During the 15-yr study period, 3.6% of all deaths and 5.1% of infection-related deaths in transplant recipients were secondary to diarrhoea. In conclusion, infections and drugs were the most frequent causes of diarrhoea in our series of renal transplant recipients, and diarrhoea was an important cause of mortality in this patient population.

78 citations


Journal Article
TL;DR: Lamivudine monotherapy for recipients receiving anti‐HBc(+) liver grafts is a simple, relatively inexpensive and effective prophylactic regimen for prevention of de novo HBV infection.
Abstract: Exclusion of liver grafts from hepatitis B core antibody (anti-HBc) positive donors to prevent de novo hepatitis B virus (HBV) infection after liver transplantation is not feasible in areas highly endemic for HBV virus like Taiwan, where approximately 80% of adults are anti-HBc(+). The efficacy of lamivudine monotherapy to prevent de novo HBV infection after living donor liver transplantation (LDLT) using grafts from anti-HBc(+) donors remains to be elucidated. From June 1994 to August 2000, LDLT was performed in 42 recipients. Twenty-four of the 42 donors were anti-HBc(+) (57%). Pre-transplant HBV vaccination was given to all recipients irrespective of anti-HBc status at monthly intervals for 3 months. Until December 1997, eight recipients received liver grafts from anti-HBc(+) donors without prophylaxis. Since January 1998, prophylaxis with lamivudine monotherapy was given to 16 recipients receiving liver grafts from anti-HBc(+) donors. De novo HBV infection occurred in three of the eight recipients (37.5%) who did not receive prophylaxis, while none of the 16 recipients given lamivudine developed de novo HBV infection after a mean follow-up of 25 months. Two of the three recipients with de novo HBV infection were anti-HBs(-) and one recipient was anti-HBs(+). Lamivudine was well tolerated, and no side effects were noted. These results suggest that lamivudine monotherapy for recipients receiving anti-HBc(+) liver grafts is a simple, relatively inexpensive and effective prophylactic regimen for prevention of de novo HBV infection. The additive protection provided by vaccine-induced or natural immunity is uncertain.


Journal Article
TL;DR: Fungal brain abscess is an unusual but serious complication associated with solid organ and hematopoietic stem cell transplantation and outcome was poor, suggesting that early recognition of this disease might be helpful.
Abstract: Fungal brain abscess is an unusual but serious complication associated with solid organ and hematopoietic stem cell transplantation. To examine the epidemiology and clinical features of fungal brain abscess in transplant recipients, we reviewed retrospectively all cases of fungal brain abscess diagnosed during a 3-yr period among 1,620 adult patients who underwent allogeneic or autologous stem cell, liver, heart, lung, or renal transplantation at one institution. Seventeen cases of fungal brain abscess were identified and occurred a median of 140 d post-transplantation. Fungal brain abscess was more common among allogeneic stem cell transplant recipients (p < 0.01). Aspergillus species were most commonly isolated, but unusual, opportunistic molds were also identified. Altered mental status was present in 65% of patients, and multiple brain lesions were commonly seen on imaging studies. Although fungal brain abscess is an uncommon disease in this population, outcome was poor, suggesting that early recognition of this disease might be helpful.

Journal Article
TL;DR: This study examined whether endothelial function can be improved in HTX patients participating in a regular physical training program, as has been demonstrated in patients with chronic heart failure, hypertension and coronary artery disease.
Abstract: BACKGROUND: Impaired endothelial function is detectable in heart transplant (HTX) recipients and is regarded as a risk factor for coronary artery disease. We studied whether endothelial function can be improved in HTX patients participating in a regular physical training program, as demonstrated in patients with chronic heart failure, hypertension and coronary artery disease. METHODS: Male HTX patients and healthy, age-matched controls were studied. Seven HTX patients (age: 60 +/- 6 yr; 6 +/- 2 yr after HTX) participated in an outpatient training program; six HTX patients (age: 63 +/- 8 yr; 7 +/- 1 yr after HTX) had maintained a sedentary lifestyle without regular physical exercise since transplantation. A healthy control group comprised six subjects (age: 62 +/- 6 yr). Vascular function was assessed by flow-mediated dilation of the brachial artery (FMD). Systemic haemodynamic responses to intravenous infusion of the endothelium-independent vasodilator sodium nitroprusside (SNP) and to NG-monomethyl-L-arginine (L-NMMA), an inhibitor of constitutive nitric oxide synthase, were also measured. RESULTS: Resting heart rate was significantly lower (p < 0.05) in healthy controls (66 +/- 13) than in the HTX training group (83 +/- 11) and in non-training HTX patients (91 +/- 9); baseline blood pressure also tended to be lower in healthy subjects and in the training HTX patients. FMD was significantly higher (p < 0.05) in the control group (8.4 +/- 2.2%) and in the training group (7.1 +/- 2.4%), compared with non-training HTX patients (1.4 +/- 0.8%). The responses of systolic blood pressure (p = 0.08) and heart rate (p < 0.05) to L-NMMA were reduced in sedentary HTX patients compared with healthy controls, and the heart rate response to SNP was also impaired in sedentary HTX patients. DISCUSSION: Regular aerobic physical training restores vascular function in HTX patients, who are at considerable risk for developing vascular complications.
This effect is demonstrable in conduit and systemic resistance arteries.

Journal Article
TL;DR: Prior analyses of renal retransplants reported to the UNOS Registry were updated by estimating the compound effects of 22 covariates on regraft survival within 2 consecutive posttransplant risk periods, demonstrating that donor age was the dominant factor governing the 5-year survival rates among regrafts, accounting for 30% of long-term variation.
Abstract: 1. GENERAL: We updated prior analyses of renal retransplants reported to the UNOS Registry by estimating the compound effects of 22 covariates on regraft survival within 2 consecutive posttransplant risk periods. During an early risk period, 9,126 kidney-only regraft recipients were followed through one year, and, in a second risk period, 7,798 recipients whose regrafts survived beyond one year were followed for 5 years posttransplant. The study sample represented a unique set of patients whose first renal transplants were also recorded by the registry. 2. RELATIVE INFLUENCE OF TRANSPLANT FACTORS AND CENTER: From a multivariate log-linear analysis, the top 5 factors influencing one-year regraft survival rates were ranked as follows: 1) transplant center (accounted for 24% of the variation in short-term outcomes); 2) duration of first graft (19%); 3) donor age (15%); 4) recipient's body mass index (7%); and 5) year of transplant (6%). Ranking long-term outcomes demonstrated that donor age was the dominant factor governing the 5-year survival rates among regrafts, accounting for 30% of long-term variation. Transplant center, recipient age and race, and donor relationship accounted for another 16%, 14%, 10% and 8% of changes in long-term regraft survival, respectively. Despite center effects, a center's volume did not appear to be associated with outcome, and a center's short-term effect did not predict its long-term results, as the correlation between one- and 5-year center-specific rates was small (R = 0.11) and statistically insignificant (P = 0.15). 3. RECIPIENT FACTORS RELATED TO FIRST TRANSPLANT: Among 4 recipient factors related to a first transplant, only the first graft's survival duration significantly influenced both short- and long-term outcomes of regrafts. If the first graft survived for more than 2 years, a regraft had an approximate 90% chance of surviving to one year as compared with an 80% chance if the first graft failed within 2 years. 
Regrafts among recipients whose first graft lasted more than 4 3/4 years exhibited better 5-year survival rates (82.2%) versus the less-than-average rates for the other groups. 4. RECIPIENT FACTORS: Six of the 7 recipient factors selected for analysis significantly influenced short- or long-term regraft outcomes: 1) female recipients had significantly higher long-term regraft survival rates; 2) Black recipients of regrafts had poor long-term results; 3) children (0-12 yr) exhibited markedly diminished one-year regraft survival rates, and teenage recipients exhibited poor long-term regraft function; 4) obese recipients (body mass index > 30 kg/m2) had poor one-year and 5-year regraft survival rates; 5) impaired functional status immediately pre-retransplant significantly reduced both short- and long-term rates; and 6) regraft recipients whose PRA was above 0% exhibited diminished one-year and 5-year regraft survival rates. 5. DONOR FACTORS: Regraft recipients receiving a living donor's kidney had 87% one-year graft survival, outperforming cadaveric regrafts by 8 percentage points during the initial period. At 5 years, survival rates for patients receiving living related (except parents) or unrelated donor regrafts enjoyed above average 5-year survival rates, but parental and cadaver types of donors demonstrated poor long-term values. The strong short-term effect of donor age emanated from poorer regraft functions from both younger and older donors, whereas the long-term effect arose primarily as a result of the poor regraft functions from older donors. After 24 hours of cold ischemia time, early cadaveric regraft survival rates were significantly impaired. 6. TRANSPLANT FACTORS: This study clarified the effects of HLA mismatches and re-exposure to mismatched antigens on regraft survival rates. Generally, receiving a zero mismatched regraft was beneficial. 
Specifically, class I incompatible regrafts with repeat AB mismatches demonstrated superior long-term rates, even surpassing the 5-year results for 0 AB mismatches. Incompatible class II regrafts with re-exposure to DR antigens had marginal reductions in short- and long-term outcomes. Increasing numbers of HLA-AB mismatches did not seriously impact regraft survival, but any DR mismatch seriously reduced the one-year regraft survival rate.
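The one- and 5-year regraft survival rates cited throughout this abstract are typically computed from censored follow-up data with the Kaplan-Meier estimator. As a hedged sketch (the numbers below are synthetic, not registry data), a minimal implementation might look like:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time for each graft (years)
    events: 1 if the graft failed at that time, 0 if censored
    Returns a list of (time, survival probability) step points."""
    data = sorted(zip(times, events))  # order subjects by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # count failures (d) and censorings (c) tied at this time
        d = sum(e for tt, e in data if tt == t)
        c = sum(1 - e for tt, e in data if tt == t)
        if d > 0:
            surv *= (1 - d / n_at_risk)
            curve.append((t, surv))
        n_at_risk -= d + c
        i += sum(1 for tt, _ in data if tt == t)
    return curve

# Synthetic example: 6 regrafts, failure or censoring times in years
times = [0.5, 1.2, 2.0, 3.0, 4.0, 5.0]
events = [1, 0, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```

Cohort comparisons (e.g. by donor age band) would compute one curve per subgroup and read off the survival probability at 1 and 5 years.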

Journal ArticleDOI
TL;DR: Assessment of the relative importance of insulin secretion (ISec) and insulin sensitivity (IS) in the pathogenesis of post‐transplant diabetes mellitus, impaired glucose tolerance (IGT) and impaired fasting glucose (IFG) after renal transplantation.
Abstract: The current knowledge of the pathogenesis of post-transplant glucose intolerance is sparse. This study was undertaken to assess the relative importance of insulin secretion (ISec) and insulin sensitivity (IS) in the pathogenesis of post-transplant diabetes mellitus (PTDM), impaired glucose tolerance (IGT) and impaired fasting glucose (IFG) after renal transplantation. An oral glucose tolerance test (OGTT) was performed in 167 non-diabetic recipients 10 wk after renal transplantation. Fasting, 1-h and 2-h insulin and glucose levels were used to estimate the insulin secretory response and IS. One year after transplantation 89 patients were re-examined with an OGTT including measurements of fasting and 2 h glucose. Ten weeks after transplantation the PTDM-patients had significantly lower ISec and IS than patients with IGT/IFG, who again had lower ISec and IS than those with normal glucose tolerance (NGT). One year later, a similar difference in baseline ISec was observed between the three groups, whereas baseline IS did not differ significantly. Patients who improved their glucose tolerance during the first year were mainly characterized by a significantly greater baseline ISec, and they received a significantly higher median prednisolone dose at baseline with a subsequent larger dose reduction during the first year than the patients whose glucose tolerance remained unchanged or worsened. In conclusion, both impaired ISec and IS characterize patients with PTDM and IGT/IFG in the early course after renal transplantation. The presence of defects in insulin release, rather than insulin action, indicates a poor prognosis regarding later normalization of glucose tolerance.
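The abstract does not specify which surrogate indices of ISec and IS were computed from the OGTT samples. Purely as an illustration, two widely used indices derivable from such data are HOMA-IR (insulin resistance) and an insulinogenic index (early insulin response); the 1-h adaptation below is an assumption to match this protocol's sampling times, not the authors' method:

```python
def homa_ir(fasting_glucose_mmol, fasting_insulin_uU):
    # HOMA-IR = (fasting glucose [mmol/L] x fasting insulin [uU/mL]) / 22.5
    return fasting_glucose_mmol * fasting_insulin_uU / 22.5

def insulinogenic_index(ins_0, ins_1h, glu_0, glu_1h):
    # Ratio of the insulin increment to the glucose increment during the
    # OGTT; classically computed at 30 min, adapted here to the 1-h sample
    # available in this protocol (an assumption, not the authors' index).
    return (ins_1h - ins_0) / (glu_1h - glu_0)

# Hypothetical patient: fasting glucose 5.6 mmol/L, insulin 12 uU/mL;
# 1-h glucose 9.0 mmol/L, insulin 60 uU/mL
print(round(homa_ir(5.6, 12), 2))                       # 2.99
print(round(insulinogenic_index(12, 60, 5.6, 9.0), 1))  # 14.1
```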

Journal ArticleDOI
TL;DR: HHV-6 viremia in HCV-positive liver transplant recipients identified a subgroup of patients at increased risk for early fibrosis upon HCV recurrence.
Abstract: A role of tumor necrosis factor-alpha (TNF-alpha) in the immunopathogenesis of hepatitis C virus (HCV) infection has been proposed. The novel herpes virus, human herpes virus-6 (HHV-6), is amongst the most potent inducers of cytokines, including TNF-alpha. The impact of HHV-6 viremia on the progression of recurrent HCV hepatitis was assessed in 51 HCV-positive liver transplant recipients. The frequency of recurrent HCV hepatitis did not differ between patients with HHV-6 viremia (47.6%, 10/21) as compared with those without HHV-6 viremia (46.7%, 14/30, p = 0.9). However, the patients with HHV-6 viremia had a significantly higher fibrosis score upon HCV recurrence than those without HHV-6 viremia (mean 1.5 vs. 0.3, p = 0.01). An association between cytomegalovirus (CMV) viremia and HCV recurrence was not documented; 50% (15/30) of the patients with CMV viremia and 42.8% (9/21) of those without CMV viremia had recurrent HCV hepatitis (p > 0.5). Receipt of ganciclovir (administered upon the detection of CMV viremia) was associated with lower total Knodell score (mean 5.2 vs. 6.9, p = 0.05) and a trend towards lower fibrosis score (mean 0.44 vs. 1.00, p = 0.12) in patients with recurrent HCV hepatitis. Thus, HHV-6 viremia in HCV-positive liver transplant recipients identified a subgroup of patients at increased risk for early fibrosis upon HCV recurrence.

Journal ArticleDOI
TL;DR: Two-hour post-dose cyclosporin level is a better predictor than trough level of acute rejection of renal allografts (Simulect US01 Study Group).
Abstract: Cyclosporine (CyA) trough concentrations are poor predictors for acute rejection post-transplant. Patients were part of a randomized trial of basiliximab (n=70) vs. anti-thymocyte globulin (ATGAM) (n=65), both in combination with Neoral, mycophenolate mofetil, and steroids, undergoing first or second, cadaveric or live donor renal transplants. Whole blood samples were collected just before (C0) and at 2 h after CyA dosing on day 4 and at the end of weeks 1, 2, 4, and 8. The CyA was measured by fluorescence polarization immunoassay (TDx). Mean CyA C0 and C2 concentrations were calculated. Logistic regression analysis revealed that mean C2 level was the only predictor of acute rejection (P < or = 0.001). Higher mean C2 levels predicted lower rejection probabilities. Linear regression analysis revealed that higher mean C2 levels were not related to higher serum creatinine levels at either week 4 or 24 or to incidence of headache or tremor. The CyA C2 levels predict the frequency of rejection post renal transplant. Target C2 levels are in the range of 1500 ng/mL.
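As a toy illustration of C2-based monitoring (the ~1500 ng/mL target is taken from the abstract; the helper names and the sample values are hypothetical, not the trial's protocol logic):

```python
def mean_level(samples):
    """Mean of a series of whole-blood CyA levels (ng/mL)."""
    return sum(samples) / len(samples)

def c2_in_target(c2_samples, target=1500.0):
    """Flag whether the mean 2-h post-dose (C2) level meets the
    ~1500 ng/mL target suggested by the study (hypothetical helper)."""
    return mean_level(c2_samples) >= target

# C2 levels (ng/mL) drawn at day 4 and the end of weeks 1, 2, 4, 8
c2 = [1100, 1350, 1500, 1650, 1700]
print(mean_level(c2))    # 1460.0
print(c2_in_target(c2))  # False
```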

Journal ArticleDOI
TL;DR: The psychosocial outcome of living donors after living donor liver transplantation: a pilot study shows promising results in both the social and physical aspects of the treatment.
Abstract: In view of the scarcity of organ resources available for transplantation, living donor liver transplantation (LDLT) is gaining growing importance in the treatment of chronically terminal liver diseases. In the period between December 1999 and October 2000, 47 potential living liver donors were evaluated and 24 right hepatic lobes and two left lateral segments were transplanted at the Virchow-Klinikum of the Charite Hospital in Berlin. The present study looks into biomedical and psychosocial parameters of 23 donors before and 6 months after LDLT. Our aims were to investigate the development of psychosocial parameters after donation and the relationship between psychosocial findings and post-operative complications. Most donors showed an improved quality of life (QoL) after LDLT when compared with pre-operative results. Twenty-six percent of donors showed high values for 'tiredness', 'fatigue' and 'limb pain' following donation. The post-operative complications had no influence on the psychosocial outcome. In this pilot study the resection of the right hepatic lobe proved to be a safe operation for donors and holds promise of a good psychosocial outcome for most donors, irrespective of donation-related complications. These pronounced complaints appear to indicate psychological tension and distress in some donors following donation.

Journal ArticleDOI
TL;DR: By using the strategies suggested, transplant patients' adherence to medication therapy may be enhanced and patients' quality of life improved.
Abstract: Despite the importance of proper medication use, many transplant patients do not take their medications correctly. Non-adherence to medication therapy leads to adverse consequences, and practitioners should encourage adherence by transplant patients. This manuscript discusses several aspects of medication taking behavior including: (1) methods of identifying medication non-adherence; (2) models used to identify possible determinants of medication taking behavior; (3) strategies to educate patients concerning their therapy; (4) factors promoting adherence and non-adherence; and (5) practical interventions that we, as practitioners, can employ to enhance adult and pediatric transplant patients' adherence to therapy. By using the strategies suggested, transplant patients' adherence to medication therapy may be enhanced and patients' quality of life improved.

Journal ArticleDOI
TL;DR: A survey of the practices surrounding specific infectious diseases at US renal transplant centers found that approaches to most infectious diseases following renal transplantation differ among transplant centers.
Abstract: Definitive approaches to most infectious diseases following renal transplantation have not been established, leading to different approaches at different transplant centers. To study the extent of these differences, we conducted a survey of the practices surrounding specific infectious diseases at US renal transplant centers. A survey containing 103 questions covering viral, bacterial, mycobacterial and protozoal infections was developed. Surveys were sent to program directors at all U.S. renal transplant centers. Responses were received from 147 of 245 (60%) transplant centers and proportionately represented all centers with respect to program size and geographical location. Pre-transplant donor and recipient screening for hepatitis B virus (HBV), hepatitis C virus (HCV), human immunodeficiency virus (HIV) and cytomegalovirus (CMV) is uniform, but great discrepancy exists in the testing for other agents. HCV seropositive donors are used in 49% of centers. HIV seropositivity remains a contraindication to transplantation, although 13% of centers indicated they have experience with such patients. Post-transplant, there is wide variation in the approach to CMV and Pneumocystis carinii (PCP) prophylaxis. Similarly divergent practices affect post-transplant vaccinations, with 54% of centers routinely vaccinating all patients according to customary guidelines in non-transplant populations. In contrast, 22% of centers indicated they do not recommend vaccination in any patients. We believe an appreciation of the differences in approaches to post-transplant infectious complications may encourage individual centers to analyse the results of their own practices. Such analysis may assist in the design of studies to answer widespread and important questions regarding the care of patients following renal transplantation.

Journal ArticleDOI
TL;DR: A retrospective analysis of 325 patients who had their first cadaver renal transplant between January 1991 and December 1996, followed up for a mean of 61 ± 26 months, examined the effect of haemodialysis or peritoneal dialysis on acute rejection, delayed graft function (DGF), and graft and patient survival after cadaveric renal transplantation.
Abstract: INTRODUCTION: We examined the effect of haemodialysis (HD) or peritoneal dialysis (PD) on acute rejection, delayed graft function (DGF), graft and patient survival after cadaveric renal transplantation. MATERIALS AND METHODS: We carried out a retrospective analysis of 325 patients (cyclosporin [CyA]-based therapy) who had their first cadaver renal transplant between January 1991 and December 1996 and followed up for a mean of 61 +/- 26 months. They were divided into three groups: HD, PD and CD (where both PD and HD were used for at least 3 months). Delayed graft function was diagnosed if the patient needed dialysis in the first week post-transplant, while primary non-function (PNF) was diagnosed if the kidney never achieved function. Graft rejection was confirmed by biopsy; early acute rejection (EAR) was defined as acute rejection occurring before 90 d and late acute rejection (LAR) as acute rejection occurring after 90 d. RESULTS: A total of 183 patients had PD, 117 HD and 25 CD. The mean time on dialysis was 24 months for the PD group, 34.5 months for HD and 50.6 months for CD (p < 0.01). The recipients were matched for age and gender. The donor variables (age, gender and cold ischaemia time) did not differ between the groups. The mean time to the development of first acute rejection following renal transplant in each group was as follows: PD group: 68.8 d, HD group: 81.3 d and CD group: 105 d (p = 0.08). The number of patients who developed EAR was 90 (49.2%) in the PD group, 51 (43.6%) in the HD group and 11 (56%) in the CD group (p = 0.6); the number who developed LAR was nine in the PD group (4.9%), six in the HD group (5.1%) and one in the CD group (4%) (p = 0.9). Fifty-six patients with PD had DGF compared with 58 with HD (p = 0.01). There was no difference in the number and severity of rejection episodes or DGF based on the duration of dialysis. 
The 5-yr survival of patients was 79% for PD, 81% HD and 78% CD groups (p = n.s), while the graft survival for PD group was 61%, HD group 63% and CD group 74% (p = n.s). SUMMARY: We could find no difference in the patient or graft survival between patients who had pre-transplant HD, PD or CD. There was no difference in the incidence of acute rejection episodes between the three groups of patients as well. However, we found a significantly higher rate of DGF in the HD versus PD patients.

Journal ArticleDOI
TL;DR: Rapamycin is an effective replacement agent as primary immunosuppressive therapy following withdrawal of CNIs in liver transplant patients with CNI-induced chronic nephrotoxicity, which can be reversed upon withdrawal of the CNI.
Abstract: The purpose of this study was to determine whether calcineurin inhibitor (CNI)-induced chronic nephrotoxicity in liver transplant patients is reversible by replacement of the CNI with rapamycin as the primary immunosuppressive agent. CNIs, while providing potent immunosuppression for liver transplant patients, exhibit nephrotoxicity as a major side-effect. Whereas acute CNI-induced nephrotoxicity is reversible by withdrawal of the CNI, chronic nephrotoxicity due to CNIs is a progressive process thought to be irreversible. Eight liver transplant patients with CNI-induced chronic nephrotoxicity were converted to rapamycin as the primary immunosuppressive agent. The CNI was either discontinued (four patients) or the dosage lowered to maintain a subtherapeutic level (four patients). Renal function as assessed by serum creatinine was measured before and after conversion to rapamycin. Two patients progressed to dialysis dependence following conversion to rapamycin. These two patients had been on CNIs for a mean of 112 months (range 93-131 months) prior to conversion to rapamycin. Five patients experienced improvement in renal function. These patients had been on calcineurin inhibitors for a mean of 60 months (range 42-75 months) prior to conversion. One patient with chronic nephrolithiasis as a contributing factor to his renal dysfunction has progressed to dialysis dependence despite conversion to rapamycin following exposure to a CNI for 24 months. In the five patients with improved renal function, serum creatinine levels decreased significantly (2.4 +/- 0.3 mg/dL to 1.5 +/- 0.1 mg/dL, p < 0.05) within a mean of 7.2 months (range 5-10 months) after conversion from CNI to rapamycin-based immunosuppression. Liver function remained stable after conversion to rapamycin. CNI-induced chronic nephrotoxicity can be reversed upon withdrawal of the CNI. 
Rapamycin is an effective replacement agent as primary immunosuppressive therapy following withdrawal of CNIs in liver transplant patients with CNI-induced chronic nephrotoxicity.

Journal ArticleDOI
TL;DR: Because post-transplant TB infection develops mostly within the first year after transplantation, clinicians should be vigilant to ensure early and rapid diagnosis, and treatment should be started immediately.
Abstract: Tuberculosis (TB) is an unusual infection in transplant recipients. We evaluated (i) the frequency of TB, (ii) the time to development of TB infection, and (iii) the clinical consequences, in 380 solid-organ recipients from January 1995 to December 2000. A total of 10 (2.63%) patients (eight renal, two liver transplant recipients) were found to have post-transplantation TB. The frequency of TB in this patient population is 8.5-fold higher than the prevalence in the general Turkish population. Tuberculosis developed within 2-33 months after transplantation, with a median of 15 months. In all 10 patients, Mycobacterium tuberculosis (MTB) was isolated from culture. All the patients continued to receive low-dose immunosuppressive treatment, and quadruple antituberculosis treatment [isoniazid (INH), rifampin (RIF), pyrazinamide (PRZ) and ethambutol (ETB)] was given. Two recipients died of the disseminated form of TB. Relapse was detected in one patient 6 months after completion of treatment. As post-transplant TB infection develops mostly within the first year after transplantation, clinicians should be vigilant to ensure early and rapid diagnosis, and treatment should be started immediately.
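The reported 2.63% frequency and 8.5-fold excess are simple ratios; as a sketch of the arithmetic (the implied general-population prevalence is back-calculated from the abstract's own figures, not a reported value):

```python
def frequency(cases, cohort):
    """Proportion of a cohort affected."""
    return cases / cohort

post_tx_freq = frequency(10, 380)          # 10 TB cases among 380 recipients
print(round(post_tx_freq * 100, 2))        # 2.63 (%)
# General-population prevalence implied by the 8.5-fold excess:
print(round(post_tx_freq / 8.5 * 100, 2))  # 0.31 (%)
```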

Journal ArticleDOI
TL;DR: The purpose of this study was to evaluate the efficacy of the anti‐CD25 monoclonal antibody basiliximab (BSLIX) started post‐operatively in patients at high risk for DGF combined with low dose tacrolimus (TAC).
Abstract: Background: Patients who develop delayed graft function (DGF) following cadaveric renal transplantation have inferior survival to those who do not. Calcineurin inhibitors (CNI) may prolong recovery from DGF. Patients with DGF are therefore routinely treated with either polyclonal antilymphocyte preparations or monoclonal anti-CD3 monoclonal antibodies and delayed introduction of CNI. The purpose of this study was to evaluate the efficacy of the anti-CD25 monoclonal antibody basiliximab (BSLIX) started post-operatively in patients at high risk for DGF combined with low dose tacrolimus (TAC). Methods: Patients who received a primary cadaveric renal transplant only after August 1998 were included in this retrospective study (n=143). All patients received TAC and mycophenolate mofetil (MMF) pre-operatively. At 6 h post-operatively, graft function was assessed clinically by urine output and serum creatinine. Those patients who had a urine output <300 cc/6 h or a rising serum creatinine were presumed to be at risk for DGF (n=46). These patients were treated with 20 mg BSLIX and had TAC dose reduced to maintain a trough blood level of <5 ng/mL. Basiliximab was repeated at day 5. Patients not felt to be at risk for DGF were treated with standard TAC dose with trough level target of 9–12 ng/mL. Patients at risk were classified as DGF if they needed dialysis or as slow graft function (SGF) if they did not. The combined group (SGF/DGF) were analysed together. Patients with SGF/DGF had their TAC dose increased to achieve trough levels of 9–12 ng/mL when renal function improved. Patient groups were compared for demographics, need for dialysis, serum creatinine, glomerular filtration rate (GFR), TAC trough levels, MMF dosage, complications and 1- and 2-yr actuarial graft survival. Results: Patients with SGF/DGF had a longer length of stay (8 vs. 5.7 d), were more likely to be black (41.3 vs. 25.7%), and required more post-operative haemodialysis (HD) (52.2 vs. 4.1%). 
SGF/DGF and non-SGF/DGF patients had similar rates of rejection (28.2 vs. 19.6%, p=0.28) and steroid resistant rejection (SRR) (6.5 vs. 2.1%, p=0.32). There were no differences in the rate of cytomegalovirus (CMV) infection (4.3 vs. 6.1%). Serum creatinine was higher and GFR lower at all time points in the SGF/DGF patients. The 1 and 2 yr actuarial survival in the non-SGF/DGF patients was 97.6 and 97.6% compared with 1 and 2 yrs actuarial survival of 94.1% and 80.0% in the SGF/DGF patients, p=0.04. There were no differences in patient survival. There were no differences in actuarial survival for the SGF/DGF patients who received dialysis compared with those who did not receive dialysis. Comparison of patients who received HD (n=28) to those who did not (n=115), regardless of group demonstrated no difference in 1 and 2 yrs actuarial survival, 100 and 94.1% in HD patients vs. 98.2 and 92.5% in non-HD patients. Conclusions: The clinical diagnosis of SGF/DGF can be made 6 h post-operatively based on urine output and serum creatinine. Basiliximab can be started post-operatively in these patients and decreased levels of TAC can be used to achieve acceptably low rates of rejection in these patients. However, SGF/DGF patients, regardless of their need for dialysis, have worse function at 1 yr and lower 2-yr actuarial graft survival compared with non-SGF/DGF patients. Most of the poor survival can be attributed to the SGF group. Further strategies to either prevent SGF/DGF or to optimize treatment in these patients are needed.
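The 6-h screening rule described in the Methods can be sketched as a small decision function (thresholds and labels are taken from the abstract; this is an illustration, not the centre's actual protocol code):

```python
def at_risk_for_dgf(urine_ml_6h, creat_pre, creat_6h):
    """6-h post-op screen from the abstract: urine output < 300 cc/6 h
    or a rising serum creatinine flags risk of delayed graft function."""
    return urine_ml_6h < 300 or creat_6h > creat_pre

def classify_graft_function(urine_ml_6h, creat_pre, creat_6h, needed_dialysis):
    if not at_risk_for_dgf(urine_ml_6h, creat_pre, creat_6h):
        return "immediate function"  # standard TAC, trough 9-12 ng/mL
    # At-risk patients received basiliximab + low-dose TAC (trough < 5 ng/mL);
    # DGF if dialysis was required, SGF otherwise.
    return "DGF" if needed_dialysis else "SGF"

print(classify_graft_function(450, 8.0, 7.1, False))  # immediate function
print(classify_graft_function(150, 8.0, 8.4, True))   # DGF
print(classify_graft_function(250, 8.0, 7.5, False))  # SGF
```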

Journal ArticleDOI
TL;DR: Takatsuka H, Wakae T, Mori A, Okada M, Okamoto T, Kakishita E. Effects of total body irradiation on the vascular endothelium.
Abstract: Total body irradiation (TBI) is used as conditioning for stem cell transplantation. We studied its effects on the vascular endothelium in 55 consecutive patients undergoing stem cell transplantation with TBI (TBI group: n=35) or without TBI (non-TBI group: n=20). Fifty patients underwent bone marrow transplantation and five underwent peripheral blood stem cell transplantation. The levels of thrombomodulin, plasminogen activator inhibitor-1, and cyclic GMP were measured before and after TBI. At both times, the thrombomodulin and plasminogen activator inhibitor-1 levels were within the normal range in all patients from the two groups, without any significant differences between the groups. The cyclic GMP level was increased after TBI in six of 35 patients. Five of these six patients died as a result of complications of transplantation, while one patient survived in whom the cyclic GMP level rapidly returned to normal. In contrast, the cyclic GMP level remained normal in all patients not receiving TBI. These results suggest that conditioning with TBI stimulates vascular endothelial cells, even if it does not cause immediate direct injury. Such stimulation may be related to vascular endothelial dysfunction, the development of which may be mediated by nitric oxide.

Journal ArticleDOI
TL;DR: The influence of ethnic miscegenation on tacrolimus pharmacokinetics and trough concentrations during the first 6 months after transplantation is determined.
Abstract: The impact of ethnic miscegenation on tacrolimus clinical pharmacokinetics and therapeutic drug monitoring. We sought to determine the influence of ethnic miscegenation on tacrolimus pharmacokinetics and trough concentrations during the first 6 months after transplantation. Methods: Tacrolimus concentrations were measured in blood samples obtained from 22 transplant recipients during the first week of transplant, within pharmacokinetic profiles, and throughout the first 6 months post-transplant, using the Pro Tac II ELISA method. Pharmacokinetic parameters and between- and within-subject blood concentration variability were compared stratifying the total population into two distinct ethnic groups of white (W) and non-white (NW) patients, according to a stringent criterion. Results: Between-subject variability in dose-adjusted concentrations during the dosing interval varied from 38.8 to 69.5%. Compared with W patients, NW patients showed higher variability in blood tacrolimus concentrations during the dosing interval (37.40 ± 5.64 vs. 56.95 ± 11.49, p < 0.001) and lower drug exposures (AUC: 229.4 ± 55.5 vs. 66.9 ± 67.1 ng × h/mL, p=0.036). The correlation coefficients (r2) between C0, C12 or Cmax and AUC were 0.83, 0.91 and 0.5, respectively. An equation derived from early time concentrations (C0, C1.5 and C4) accounted for 94% of the variability observed in AUC. Compared with W patients, a higher proportion of tacrolimus blood determinations during the first week were below 10 ng/mL in NW patients (24% vs. 62%, p=0.028). Tacrolimus absorption increased from week 1–4 (1.1 ± 0.53 vs. 1.73 ± 0.97 ng/mL/mg, p < 0.0001) but still showed high between- (41.6–70.4%) and within-subject (18.2–32.5%) variability, regardless of ethnicity, after stabilization. Conclusion: Non-white patients show higher tacrolimus variability and lower drug exposures after transplantation compared with W patients. 
Therefore, higher initial tacrolimus doses and intensive monitoring are recommended when administering tacrolimus-based immunosuppressive therapy to NW patients of this transplant population.
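The abstract reports that an equation in C0, C1.5 and C4 explained 94% of AUC variability, but its coefficients are not given, so it cannot be reproduced here. As a generic illustration, the full exposure over a 12-h dosing interval is usually computed with the linear trapezoidal rule (the sampling times and concentrations below are hypothetical):

```python
def auc_trapezoid(times_h, concs_ng_ml):
    """Area under the concentration-time curve by the linear
    trapezoidal rule, in ng x h / mL."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concs_ng_ml[i] + concs_ng_ml[i - 1]) / 2
    return auc

# Hypothetical 12-h tacrolimus profile: time (h) and concentration (ng/mL)
times = [0, 1, 1.5, 2, 4, 6, 8, 12]
concs = [8, 20, 25, 22, 15, 12, 10, 8]
print(round(auc_trapezoid(times, concs), 1))  # 159.0
```

A limited-sampling estimate such as the study's would regress this full AUC on a few early concentrations across a training cohort, then apply the fitted coefficients prospectively.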

Journal ArticleDOI
TL;DR: The clinical outcomes of transplant recipients who experienced adverse effects from tacrolimus and were converted to cyclosporine‐microemulsion‐based (Neoral® [cyclosporin, USP] MODIFIED) therapy are characterized.
Abstract: Background. The calcineurin inhibitors, cyclosporine and tacrolimus, are the mainstay of current immunosuppressive regimens for the prevention of acute rejection in organ transplantation. The choice of the individual agent used often depends on the preference of the Transplant Center and patient type. Adverse effects associated with tacrolimus may impact its clinical utility in many patients. This study characterizes the clinical outcomes of transplant recipients who experienced adverse effects from tacrolimus and were converted to cyclosporine-microemulsion-based (Neoral® [cyclosporine, USP] MODIFIED) therapy. Methods. Hepatic or renal allograft recipients unable to maintain adequate immunosuppression with a tacrolimus-based regimen for reasons of toxicity or efficacy were recruited for this study and converted to cyclosporine-microemulsion-based therapy. Data were collected on drug dosing, trough concentrations, and treatment duration, as well as detailed information on tacrolimus-associated toxicities that prompted rescue with cyclosporine-microemulsion. Furthermore, clinical and laboratory data related to the clinical course of the patients after conversion to cyclosporine-microemulsion were recorded for up to 1 yr following conversion. Results. One hundred and fifty-seven transplant recipients were enrolled in this study. Predominant reasons for discontinuation of tacrolimus were neurotoxicity (55%), diabetes (24%), nephrotoxicity (15%), and gastrointestinal intolerance (24%). Patients frequently had multiple symptoms prompting rescue therapy with cyclosporine-microemulsion. Over 70% of subjects had improvement or resolution of their tacrolimus-associated adverse symptoms within 3 months post-conversion. Acute rejection episodes occurred in 27% of patients converted to cyclosporine-microemulsion. Conclusions. 
Cyclosporine-microemulsion rescue therapy in patients experiencing adverse clinical effects associated with tacrolimus is an effective treatment option which leads to resolution of these adverse effects in the majority of patients, and allows for satisfactory clinical outcomes.

Journal ArticleDOI
TL;DR: Of 16 malignancies developing in renal transplant recipients within 15 yr, 10 were transitional cell carcinoma (TCC) of the urinary tract; among these 10 patients, five maintained normal renal function, three returned to hemodialysis without tumor recurrence, and two died of cancer.
Abstract: At our clinic we followed 320 renal transplant recipients, 16 of whom developed malignancies within 15 yr. Ten of the 16 malignancies were transitional cell carcinoma (TCC) of the urinary tract. The modalities of treatment included standard nephroureterectomy with bladder cuff removal for upper tract tumor, transurethral resection for superficial bladder tumor and partial cystectomy for one case of invasive bladder tumor, as requested by the patient. Post-operative intravesical chemotherapy with epirubicin, or immunotherapy with bacillus Calmette-Guerin (BCG) were carried out for superficial bladder tumor. Cyclosporine (CsA) used as post-transplant immunosuppressant was switched to low dose azathioprine (Aza) at the initial diagnosis of TCC. Four patients experienced tumor recurrence despite conversion of immunosuppressant from CsA to Aza. Among these 10 patients, five maintained normal renal function, three returned to hemodialysis without tumor recurrence, and two patients died of cancer.

Journal ArticleDOI
TL;DR: A negative or positive FCXM (when the FlowPRA against donor antigens is positive or negative, respectively) is not always straightforward to interpret, and both types of antibodies described are suspected to have clinical relevance.
Abstract: In 1969, a study by Patel and Terasaki persuaded the renal transplant community that a pre-transplant cross-match should always be performed between donor and recipient to detect HLA antibodies and prevent hyperacute allograft rejection. Although the role of the cross-match among nonsensitized patients is controversial, its importance among sensitized recipients is undeniable. Over the past 30 years, more sensitive techniques, such as the flow cytometric cross-match (FCXM), were developed to identify low levels of antibodies undetectable by other approaches. The clinical relevance of a positive FCXM, however, has been hotly disputed, with some investigators maintaining that the FCXM is 'too sensitive' and rules out acceptable donor-recipient combinations. An alternative explanation is that the FCXM is non-specific, and, at least in certain situations, identifies non-HLA antibodies that are clinically irrelevant. Recently, a solid phase immunoassay utilizing purified HLA Class I or Class II molecules bound to microparticles (FlowPRA) was developed. Ideally, use of the FlowPRA for the identification of HLA antibodies in recipient sera would help ascertain whether a positive FCXM with donor cells was truly the result of an HLA-specific antibody. As shown here, this may not always be true. In this study, two unexpected serum patterns were observed. Pattern 1: FlowPRA beads were positive (with an associated HLA Class I specificity) and the FCXM with cells expressing the HLA antigen(s) to which the antibody was directed, was negative. Sequence analysis of the HLA antigens reactive with this unexpected antibody suggests that the epitope recognized resides on the floor of the groove, a site generally not expected to generate antibody activity. Pattern 2: FlowPRA beads were negative yet the FCXM was T and B cell positive. 
Further analysis of the FlowPRA negative/FCXM positive sera using a flow cytometric cell-based panel reactive antibody (PRA) approach revealed those sera to have specific anti-HLA Class I activity. We suspect that both types of antibodies described above have clinical relevance. Thus, a negative or positive FCXM (when the FlowPRA against donor antigens is positive or negative, respectively) is not always a straightforward interpretation.
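The four FlowPRA/FCXM combinations discussed above can be summarized as a small lookup (the labels are illustrative shorthand for the patterns in the abstract, not a clinical decision algorithm):

```python
def interpret_pattern(flowpra_positive, fcxm_positive):
    """Toy lookup for the FlowPRA/FCXM combinations discussed in the
    abstract (illustrative labels only, not a clinical algorithm)."""
    if flowpra_positive and fcxm_positive:
        return "concordant: HLA antibody likely donor-reactive"
    if not flowpra_positive and not fcxm_positive:
        return "concordant: no HLA antibody detected"
    if flowpra_positive and not fcxm_positive:
        # Pattern 1: antibody to an epitope (e.g. on the floor of the
        # groove) not accessible on intact donor cells
        return "pattern 1: FlowPRA+/FCXM- discordance"
    # Pattern 2: T- and B-cell positive FCXM with negative FlowPRA;
    # cell-based PRA later revealed HLA Class I specificity
    return "pattern 2: FlowPRA-/FCXM+ discordance"

print(interpret_pattern(True, False))  # pattern 1 discordance
print(interpret_pattern(False, True))  # pattern 2 discordance
```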