
Showing papers in "Transplantation in 2000"


Journal ArticleDOI
TL;DR: PVT patients undergo more difficult surgery, have more postoperative complications, higher in-hospital mortality, and reduced 5-year survival; grade 1 patients, however, do as well as controls, whereas patients with grades 2, 3, and 4 PVT have poorer outcomes.
Abstract: Background Portal vein thrombosis (PVT) has been seen as an obstacle to liver transplantation (LTx). Recent data suggest that favorable results may be achieved in this group of patients, but only limited information from small series is available. The present study was conducted in an effort to review the surgical options in patients with PVT and to assess the impact of PVT on LTx outcome. Risk factors for PVT and the value of screening tools are also analyzed. Methods Adult LTx performed from 1987 through 1996 were reviewed. PVT was retrospectively graded according to the operative findings: grade 1: <50% PVT; grade 2: >50% PVT; grade 3: complete PV and proximal SMV thrombosis; grade 4: complete PV and entire SMV thrombosis. Results Of 779 LTx, 63 had operatively confirmed PVT (8.1%): 24 had grade 1, 23 grade 2, 6 grade 3, and 10 grade 4 PVT. Male sex, treatment for portal hypertension, Child-Pugh class C, and alcoholic liver disease were associated with PVT. Sensitivity of ultrasound (US) in detecting PVT increased with PVT grade and was 100% in grades 3-4. In patients with US-diagnosed PVT, an angiogram was performed and ruled out a false-positive US diagnosis in 13%. In contrast with US, angiograms differentiated grade 1 from grade 2, and grade 3 from grade 4 PVT. Grade 1 and 2 PVT were managed by low dissection and/or a thrombectomy; in grade 3 the distal SMV was directly used as an inflow vessel, usually through an interposition donor iliac vein; in grade 4 a splanchnic tributary was used or a thrombectomy was attempted. Transfusion requirements in PVT patients (10 U) were significantly higher than in non-PVT patients (5 U). Conclusions The value of US diagnosis in patients with PVT depends on the PVT grade, and false-negative diagnoses occur only in incomplete forms of PVT (grades 1-2). The degree of PVT dictates the surgical strategy to be used: thrombectomy/low dissection in grades 1-2, a mesoportal jump graft in grade 3, and a splanchnic tributary in grade 4. Taken altogether, PVT patients undergo more difficult surgery, have more postoperative complications, have higher in-hospital mortality rates, and have reduced 5-year survival rates. Analysis by PVT grade, however, reveals that grade 1 PVT patients do as well as controls; only grades 2 to 4 PVT patients have poorer outcomes. With increased experience, results of LTx in PVT patients have improved and, even in severe forms of PVT, a 5-year survival rate >60% can now be achieved.

567 citations
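
The ultrasound figures in the abstract above are standard screening-test arithmetic. As a hedged illustration (the per-grade counts below are invented, not the study's data), sensitivity and the angiographically overturned false-positive fraction can be computed as follows:

```python
# Illustrative screening-test arithmetic; counts are hypothetical.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of operatively confirmed PVT cases that ultrasound detected."""
    return true_pos / (true_pos + false_neg)

# Hypothetical (US-detected, US-missed) counts by PVT grade.
by_grade = {1: (10, 14), 2: (18, 5), 3: (6, 0), 4: (10, 0)}
for grade, (tp, fn) in by_grade.items():
    print(f"grade {grade}: sensitivity = {sensitivity(tp, fn):.0%}")

# False positives: share of US-diagnosed PVT overturned by angiography,
# e.g., 2 of 15 US diagnoses ruled out gives roughly the 13% quoted above.
print(f"false-positive fraction = {2 / 15:.0%}")
```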


Journal ArticleDOI
TL;DR: This protocol shows promise for eliminating DSA preemptively among patients with low-titer positive antihuman globulin-enhanced, complement-dependent cytotoxicity cross-matches, allowing the successful transplantation of these patients using a live donor without any cases of HAR.
Abstract: Background Hyperacute rejection (HAR) and acute humoral rejection (AHR) remain recalcitrant conditions without effective treatments and usually result in graft loss. Plasmapheresis (PP) has been shown to remove HLA-specific antibody (Ab) in many different clinical settings. Intravenous gamma globulin (IVIG) has been used to suppress alloantibody and modulate immune responses. Our hypothesis was that a combination of PP and IVIG could effectively and durably remove donor-specific anti-HLA Ab, rescuing patients with established AHR and preemptively desensitizing recipients who had positive crossmatches with a potential live donor. Methods The study patients consisted of seven live donor kidney transplant recipients who experienced AHR and had donor-specific Ab (DSA) for one or more mismatched donor HLA antigens. The patients segregated into two groups: three patients were treated for established AHR (rescue group) and four cross-match-positive patients received therapy before transplantation (preemptive group). Results Using PP/IVIG we successfully reversed established AHR in three patients. Four patients who were cross-match-positive (3 by flow cytometry and 1 by cytotoxic assay) and had DSA before treatment underwent successful renal transplantation utilizing their live donor. The overall mean creatinine for both treatment groups is 1.4±0.8 with a mean follow-up of 58±40 weeks (range 17-116 weeks). Conclusions In this study, we present seven patients for whom the combined therapies of PP/IVIG were successful in reversing AHR mediated by Ab specific for donor HLA antigens. Furthermore, this protocol shows promise for eliminating DSA preemptively among patients with low-titer positive antihuman globulin-enhanced, complement-dependent cytotoxicity (AHG-CDC) cross-matches, allowing the successful transplantation of these patients using a live donor without any cases of HAR.

563 citations


Journal ArticleDOI
TL;DR: Patient and graft survival and the incidence of biopsy-proven acute rejection at 12 months were comparable between sirolimus and CsA, whereas safety profiles differed, suggesting that sirolimus may be used as primary therapy for the prevention of acute rejection.
Abstract: Introduction. A previous trial in renal transplantation comparing sirolimus (rapamycin) to cyclosporine (CsA) in a triple-drug therapy regimen with azathioprine and corticosteroids found that the incidence of acute rejection was similar (approximately 40%), with a trend toward better renal function with sirolimus. Methods. In 14 European centers, first cadaveric renal allograft recipients were randomized to receive sirolimus (n=40) or CsA (n=38) in an open-label design. All patients received corticosteroids and mycophenolate mofetil 2 g/day. Sirolimus and CsA were concentration controlled; trough levels of mycophenolic acid and prednisolone were also measured. Results. At 12 months, graft survival (92.5% sirolimus vs. 89.5% CsA), patient survival (97.5% sirolimus vs. 94.7% CsA), and the incidence of biopsy-proven acute rejection (27.5% sirolimus vs. 18.4% CsA) were not statistically different. The use of antibodies to treat suspected rejection episodes was also similar (7.5% sirolimus vs. 5.3% CsA). More sirolimus patients received bolus steroid therapy (20 vs. 11, P=0.068). From month 2 onward, the calculated glomerular filtration rate was consistently higher in sirolimus-treated patients. The adverse events reported more frequently with sirolimus were thrombocytopenia (45% vs. 8%) and diarrhea (38% vs. 11%). In the CsA group, increased creatinine (18% vs. 39%), hyperuricemia (3% vs. 18%), cytomegalovirus infection (5% vs. 21%), and tremor (5% vs. 21%) were observed significantly more often. Discussion. Patient and graft survival and the incidence of biopsy-proven acute rejection at 12 months were comparable between sirolimus and CsA, whereas safety profiles were different. These data suggest that sirolimus may be used as primary therapy for the prevention of acute rejection.

541 citations


Journal ArticleDOI
TL;DR: Because of the high rate of recurrent tumor and lack of positive prognostic variables, transplantation should seldom be used as a treatment for cholangiocarcinoma, and more effective adjuvant therapies are necessary.
Abstract: Background. Because of the high incidence of recurrent tumor, many surgeons have become disenchanted with transplantation as a treatment for cholangiocarcinoma. Methods. The Cincinnati Transplant Tumor Registry database was used to examine 207 patients who underwent liver transplantation for otherwise unresectable cholangiocarcinoma or cholangiohepatoma. Specific factors evaluated included tumor size, presence of multiple nodules, evidence of tumor spread at surgery, and treatment with adjuvant chemotherapy and/or radiation therapy. Incidentally found tumors were compared to tumors that were known or suspected to be present before transplantation. Results. The 1-, 2-, and 5-year survival estimates using life-table analysis were 72%, 48%, and 23%, respectively. Fifty-one percent of patients had recurrence of their tumors after transplantation, and 84% of recurrences occurred within 2 years of transplantation. Survival after recurrence was rarely more than 1 year. Forty-seven percent of recurrences occurred in the allograft and 30% in the lungs. Tumor recurrence and evidence of tumor spread at the time of surgery were negative prognostic variables. There were no positive prognostic variables. Patients with incidentally found cholangiocarcinomas did not have improved survival over patients with known or suspected tumors. A small number of patients survived for more than 5 years without recurrence. However, this group had no variable in common that would aid in the selection of similar patients in the future. Conclusions. Because of the high rate of recurrent tumor and lack of positive prognostic variables, transplantation should seldom be used as a treatment for cholangiocarcinoma. For transplantation to be a viable treatment in the future, more effective adjuvant therapies are necessary. Cholangiocarcinoma is a rare malignant tumor of the biliary system with a poor prognosis. Total hepatectomy and liver transplantation held promise as a possible curative treatment for unresectable cholangiocarcinoma in the early days of liver transplantation. However, because of the high incidence of recurrent tumor, high postoperative morbidity and mortality, and the international organ shortage, many surgeons became disenchanted with transplantation as a treatment for this malignancy (1-5). We examined the results of a large number of transplantations (207) for cholangiocarcinoma and attempted to find prognostic variables that would lead to survival results encouraging enough to justify the use of a limited resource.

433 citations
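
The 1-, 2-, and 5-year figures in the abstract above are actuarial (life-table) survival estimates from censored follow-up data. A minimal sketch of how such landmark estimates are typically produced, using the open-source lifelines library and invented data (not the registry's):

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Invented follow-up data: years from transplant to death or censoring.
df = pd.DataFrame({
    "years": [0.4, 0.9, 1.2, 2.5, 3.0, 4.2, 5.1, 6.0],
    "died":  [1,   1,   1,   0,   1,   0,   0,   1],  # 1 = death observed
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["years"], event_observed=df["died"])

# Survival probabilities at the landmark times reported in the abstract.
print(kmf.predict([1, 2, 5]))
```

(The registry used life-table analysis; the Kaplan-Meier estimator shown here is the closely related product-limit method.)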


Journal ArticleDOI
TL;DR: Mycophenolate mofetil therapy decreased the relative risk of developing chronic allograft failure (CAF) by 27%, an effect independent of its impact on acute rejection.
Abstract: Background. Mycophenolate mofetil (MMF) has been shown to significantly decrease the number of acute rejection episodes in renal transplant recipients during the 1st year. A beneficial effect of MMF on long-term graft survival has been more difficult to demonstrate and has not been detected, despite the impact of acute rejection on the development of chronic allograft nephropathy and experimental evidence that MMF may have a salutary effect on chronic allograft nephropathy independent of that of rejection. Methods. Data on 66,774 renal transplant recipients from the U.S. renal transplant scientific registry were analyzed. Patients who received a solitary renal transplant between October 1, 1988 and June 30, 1997 were studied. Cox proportional hazards regression was used to estimate relevant risk factors. Kaplan-Meier analysis was performed for censored graft survival. Results. MMF decreased the relative risk for development of chronic allograft failure (CAF) by 27% (risk ratio [RR] 0.73, P<0.001). This effect was independent of its outcome on acute rejection. Censored graft survival using MMF versus azathioprine was significantly improved by Kaplan-Meier analysis at 4 years (85.6% vs. 81.9%). The effect of an acute rejection episode on the risk of developing CAF appears to have increased over time (RR=1.9, 1988-91; RR=2.9, 1992-94; RR=3.7, 1995-97). Conclusion. MMF therapy decreases the risk of developing CAF. This improvement is only partly caused by the decrease in the incidence of acute rejection observed with MMF; it is also caused by an effect independent of acute rejection.

398 citations
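
The 27% figure above is a Cox proportional-hazards risk ratio (RR=0.73). A minimal sketch of such a model, with an invented miniature dataset standing in for the registry extract:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented data: follow-up (years), graft-failure indicator, and a 0/1
# flag for MMF (vs. azathioprine) therapy.
df = pd.DataFrame({
    "years":  [1.0, 2.2, 3.1, 4.0, 4.5, 5.0, 2.8, 3.9, 1.5, 4.8],
    "failed": [1,   1,   0,   1,   0,   0,   1,   0,   1,   0],
    "mmf":    [0,   0,   0,   1,   1,   1,   0,   1,   0,   1],
})

cph = CoxPHFitter(penalizer=0.1)  # small penalty stabilizes the toy fit
cph.fit(df, duration_col="years", event_col="failed")

# A hazard ratio near 0.73 for "mmf" would correspond to the reported
# 27% reduction in the relative risk of chronic allograft failure.
print(cph.hazard_ratios_)
```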


Journal ArticleDOI
TL;DR: It is predicted that recently developed ELISA and flow cytometry techniques using purified HLA antigen will increase the clinical relevance of posttransplantation HLA antibody monitoring by allowing the detection of low levels of donor antibody and easily distinguishing the isotype and target (HLA class I or class II) of the antibodies.
Abstract: We have cited more than 23 studies showing that de novo development of anti-HLA antibodies is associated with increased acute and chronic rejection and decreased graft survival in kidney, heart, lung, liver, and corneal transplants. Antibodies to both HLA class I and class II antigens seem to be detrimental. Antibodies of the IgG isotype and possibly the IgM isotype were clinically relevant. Most studies showed that donor-specific antibodies were associated with rejection and graft loss. Therefore, HLA antibodies provide a clinical readout for patient alloreactivity that may have the ability to distinguish graft dysfunction due to immunologic and nonimmunologic causes. Antibody may act as a critical trigger for rejection of allografts and may serve as an early indicator of a slowly smoldering chronic rejection that is not manifested at a given time by biochemical measures such as serum creatinine levels. The effectiveness of various drugs on chronic rejection should be evaluable by their effects on HLA antibody production. We predict that recently developed ELISA and flow cytometry techniques using purified HLA antigen will increase the clinical relevance of posttransplantation HLA antibody monitoring by (1) allowing the detection of low levels of donor antibody; (2) easily distinguishing the isotype and target (HLA class I or class II) of the antibodies; and (3) correlating the antibody with specific graft pathology.

374 citations


Journal ArticleDOI
TL;DR: MRI accurately determines right lobe mass; most liver regeneration occurs in the first week after resection or transplantation, and the time course does not differ significantly between donors and recipients.
Abstract: Background. Regeneration of the liver to a predetermined size after resection or transplantation is a well described phenomenon, but the time course over which these events occur has not been well defined. It is not clear how initial liver mass, reperfusion, immunosuppression, or steatosis influence this process. Methods. Liver regeneration was assessed prospectively by volumetric magnetic resonance imaging (MRI) in living right lobe liver donors and the recipients of these grafts. Imaging was performed at regular intervals through 60 days after resection/transplantation, and liver mass was determined. Liver function tests and synthetic function were monitored throughout the study period in donors and recipients of these grafts as well as recipients of cadaveric grafts. Results. MRI consistently overestimated liver mass by a mean of 45±65 g (range 10-123 g). Donor liver mass increased by 101%, 110%, 115%, and 144% at 7, 14, 30, and 60 days after resection, respectively. Recipient liver mass increased by 87%, 101%, 119%, and 99% at 7, 14, 30, and 60 days after transplantation, respectively. Steatosis did not influence the degree of regeneration or graft function, nor was there a functional difference between grafts with a graft-to-recipient body weight ratio of >1% or <1%. Conclusions. MRI accurately determines right lobe mass. Most liver regeneration occurs in the 1st week after resection or transplantation, and the time course does not differ significantly in donors or recipients. The mass of the graft or remnant segment affects the duration of the regeneration process, with a smaller initial liver mass prolonging the course. Steatosis of <30% had no bearing on liver function or regeneration and, therefore, should not be an absolute criterion for exclusion of donors. A calculated graft-to-recipient body weight ratio of 0.8% is adequate for right lobe living donor liver transplantation.

327 citations
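
The 0.8% criterion in the conclusion above is a graft-to-recipient body weight ratio. A small worked example (my own helper function, not code from the paper):

```python
def grwr(graft_mass_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio (%): graft mass (g) / body weight (g) * 100."""
    return graft_mass_g / (recipient_weight_kg * 1000.0) * 100.0

# An 800 g right lobe into a 70 kg recipient clears the 0.8% threshold:
print(f"{grwr(800, 70):.2f}%")  # -> 1.14%
```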


Journal ArticleDOI
TL;DR: Fibrin matrices have been shown to have considerable advantages over plastic for the culture of skin cells for grafting, and it is now possible to generate and transplant enough cultured epithelium from a small skin biopsy to restore completely the epidermis of an adult human in 16 days, as mentioned in this paper.
Abstract: Background. Extensive third-degree burn wounds can be permanently covered by the transplantation of autologous cultured keratinocytes. Many modifications to Green and colleagues' original technique have been suggested, including the use of a fibrin matrix. However, the properties of the cultured cells must be assessed using suitable criteria before a modified method of culture for therapeutic purposes is transferred to clinical use, because changes in culture conditions may reduce keratinocyte lifespan and result in the loss of the transplanted epithelium. Methods. To evaluate the performance of human keratinocytes grown on a fibrin matrix, we assayed their colony-forming ability, their growth potential, and their ability to generate an epidermis when grafted onto athymic mice. The results of these experiments allowed us to compare side by side, for third-degree burn treatment, autologous cultured epithelium grafts grown according to Rheinwald and Green on fibrin matrices with grafts grown directly on plastic surfaces. Results. We found that human keratinocytes cultured on a fibrin matrix had the same growth capacity and transplantability as those cultured on plastic surfaces and that the presence of a fibrin matrix greatly facilitated the preparation, handling, and surgical transplantation of the grafts, which did not need to be detached enzymatically. The rate of take of grafts grown on fibrin matrices was high and similar to that of conventionally cultured grafts. The grafted autologous cells are capable of generating a normal epidermis for many years and favor the regeneration of a superficial dermis. Conclusion. We have demonstrated that: 1) fibrin matrices have considerable advantages over plastic for the culture of skin cells for grafting, and it is now possible to generate and transplant enough cultured epithelium from a small skin biopsy to restore completely the epidermis of an adult human in 16 days; and 2) the generated epidermis renews itself for years. The use of fibrin matrices thus significantly improves the transplantation of cultured epithelium grafts for extensive burns, as recently demonstrated in a follow-up work.

315 citations


Journal ArticleDOI
TL;DR: All regimens yielded similar acute rejection rates and graft survival, but the tacrolimus + MMF regimen was associated with the lowest rate of steroid resistant rejection requiring antilymphocyte therapy.
Abstract: Background Our clinical trial was designed to investigate the optimal combination of immunosuppressants for renal transplantation. Methods A randomized three-arm, parallel group, open label, prospective study was performed at 15 North American centers to compare three immunosuppressive regimens: tacrolimus + azathioprine (AZA) versus cyclosporine (Neoral) + mycophenolate mofetil (MMF) versus tacrolimus + MMF. All patients were recipients of first cadaveric kidney transplants and received the same maintenance corticosteroid regimen. Only patients with delayed graft function (32%) received antilymphocyte induction. A total of 223 patients were randomized, transplanted, and followed for 1 year. Results There were no significant differences in baseline demography between the three treatment groups. At 1 year the results are as follows: acute rejection 17% (95% confidence interval 9%, 26%) with tacrolimus + AZA; 20% (confidence interval 11%, 29%) with cyclosporine + MMF; and 15% (confidence interval 7%, 24%) with tacrolimus + MMF. The incidence of steroid-resistant rejection requiring antilymphocyte therapy was 12% in the tacrolimus + AZA group, 11% in the cyclosporine + MMF group, and 4% in the tacrolimus + MMF group. There were no significant differences in overall patient or graft survival. Tacrolimus-treated patients had a lower incidence of hyperlipidemia through 6 months posttransplant. The incidence of posttransplant diabetes mellitus requiring insulin was 14% in the tacrolimus + AZA group and 7% in both the cyclosporine + MMF and tacrolimus + MMF groups. Conclusions All regimens yielded similar acute rejection rates and graft survival, but the tacrolimus + MMF regimen was associated with the lowest rate of steroid-resistant rejection requiring antilymphocyte therapy.

282 citations
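
The rejection rates above are quoted with 95% confidence intervals. A hedged sketch of how such an interval for a proportion is obtained (the arm size of 75 is an assumption, since the abstract gives only the 223-patient total; statsmodels' normal-approximation interval is used):

```python
from statsmodels.stats.proportion import proportion_confint

# ~17% acute rejection in an assumed arm of 75 patients (13/75 = 17.3%).
low, high = proportion_confint(count=13, nobs=75, alpha=0.05, method="normal")
print(f"17% (95% CI {low:.0%}, {high:.0%})")  # close to the reported 9%, 26%
```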


Journal ArticleDOI
TL;DR: Donor selection limits the application of living donor liver transplantation in the adult population, and genetically unrelated individuals increase the size of the donor pool.
Abstract: Background. The shortage of cadaveric livers has sparked an interest in adult-to-adult living donor transplantation. Right lobe donor hepatectomy is frequently required to obtain a graft of adequate size for adult recipients. Careful donor selection is necessary to minimize complications and assure a functional graft. Methods. A four-step evaluation protocol was used for donor selection, and satisfactory results of all tests in each step were required before proceeding to the next. Donors were selected based on a battery of laboratory studies chosen to exclude unrecognized infection, liver disease, metabolic disorders, and conditions representing undue surgical risk. Imaging studies included ultrasonography, angiography, magnetic resonance imaging, and intraoperative cholangiography and ultrasonography. The information obtained from liver biopsy was used to correct the estimated graft mass for the degree of steatosis. Results. From March 1998 to August 1999, 126 candidates were evaluated for living donation. A total of 35 underwent donor right lobectomy with no significant complications. Forty percent of all donors that came to surgery were genetically unrelated to the recipient. A total of 69% of those evaluated were excluded. ABO incompatibility was the primary reason for exclusion after the first step (71%), and the presence of steatosis yielding an inadequate estimated graft mass after the second step (20%). Conclusions. Donor selection limits the application of living donor liver transplantation in the adult population. Unrelated individuals increase the size of the donor pool. Right lobe hepatectomy can be performed safely in healthy adult liver donors. Preoperative liver biopsy is an essential part of the evaluation protocol, particularly when the estimated graft mass is marginal.

276 citations


Journal ArticleDOI
TL;DR: An analysis of long-term complications in liver recipients surviving ≥5 years after transplant, assessing their medical condition and comparing findings to the general population, found that liver transplantation is being performed with excellent 5-year survival, although significant comorbidities exist.
Abstract: Background Short-term outcomes of liver transplantation are well reported. Little is known, however, about long-term results in liver recipients surviving > or =5 years. We sought to analyze long-term complications in liver recipients surviving > or =5 years after transplant, to assess their medical condition and to compare findings to the general population. Methods We analyzed the chart and database records of all patients (n=139) who underwent liver transplantation at a major transplant center before January 1, 1991. Outcome measures included the presence of diabetes, hypertension, heart, renal or neurological disease, osteoporosis, incidence of de novo malignancy or fracture, or other pathology, body mass index, serum cholesterol and glucose, liver function, blood pressure, frequency of laboratory and clinic follow-up, current pharmacological regimen, and late rejection episodes. Results Ninety-six patients (70%) survived > or =5 years. Compared to numbers expected based on U.S. population rates, transplant recipients had significantly higher overall prevalences of hypertension (standardized prevalence ratio [SPR]=3.07, 95% confidence interval [CI], 2.35-3.93) and diabetes (SPR=5.99, 95% CI, 4.15-8.38), and higher incidences of de novo malignancy (standardized incidence ratio [SIR]=3.94, 95% CI, 2.09-6.73), non-Hodgkin's lymphoma (SIR=28.56, 95% CI, 7.68-73.11), non-melanoma skin cancer (estimated SIR> or =3.16) and fractures in women (SIR=2.05, 95% CI, 1.12-3.43). Forty-one of 87 (47.1%) patients were obese, and 23 patients (27.4%) had elevated serum cholesterol levels (> or =240 mg/dl, 6.22 mmol/L), compared to 33% and 19.5% of U.S. adults, respectively. Prevalences of heart or peptic ulcer disease were not significantly higher. Conclusions Liver transplantation is being performed with excellent 5-year survival. Significant comorbidities exist, however, which appear to be related to long-term immunosuppression.

Journal ArticleDOI
TL;DR: Pregnancy in tacrolimus-treated transplant recipients resulted in a favourable outcome, and complications of the mother and neonate were similar to those previously described with other immunosuppressants.
Abstract: BACKGROUND The increasing use of tacrolimus as a primary immunosuppressant is paralleled by a growing number of pregnancies occurring in mothers receiving tacrolimus systemically. METHODS In this retrospective analysis covering 1992-1998, data sources were case reports from clinical studies, spontaneous reports from health care professionals, routine surveys by transplant registries, and the published literature. RESULTS One hundred pregnancies in 84 mothers were recorded. Mean maternal age was 28 years. All except one mother (autoimmune disease) were solid organ transplant recipients (66% liver and 27% kidney). Mean time from transplantation to conception was 26 months. The mean daily dose of tacrolimus (range 11.7-12.8 mg/day) and the mean tacrolimus whole blood level (range 8.5-11.5 ng/ml) remained fairly constant from preconception through the third trimester. The most frequent maternal complications were graft rejection followed by preeclampsia, renal impairment, and infection. All cases of rejection were successfully treated with corticosteroids and did not result in graft loss. Of 100 pregnancies, 71 progressed to delivery (68 live births, 2 neonatal deaths, and 1 stillbirth), 24 were terminated (12 spontaneous and 12 induced), 2 pregnancies were ongoing, and 3 were lost to follow-up. Mean gestation period was 35 weeks, with 59% of deliveries being premature (<37 weeks). The birth weight (mean 2573 g) was appropriate for gestational age in 90% of cases. The most common complications in the neonate were hypoxia, hyperkalemia, and renal dysfunction; these were transient in nature. Four neonates presented with malformations, without any consistent pattern of affected organs. CONCLUSION Pregnancy in tacrolimus-treated transplant recipients resulted in a favourable outcome. Complications of the mother and neonate were similar to those previously described with other immunosuppressants.

Journal ArticleDOI
TL;DR: The literature overview shows the necessity of preoperative psychosocial screening regarding predictors for posttransplant noncompliance in organ transplant patients.
Abstract: BACKGROUND: Many studies confirm that noncompliance or poor compliance is one of the great problems in health care, as it results in waste of resources and funds. METHODS: This overview includes literature on heart, liver, and kidney transplants, with emphasis on heart transplantation in adult and pediatric transplant patients, and addresses the following variables as potential predictors of postoperative compliance problems: demographic variables (age, marital status, gender), psychological variables (anxiety, denial), psychiatric disorders (major depression, anxiety, and personality disorders), poor social support, pretransplant noncompliance, obesity, substance abuse, and health-related variables (distance from transplant center, indication for transplantation, required pretransplant assist device). Relevant studies on these topics conducted up to 1999 are included and discussed in this overview. The most important results are presented in tables. RESULTS: Unfortunately, there has not been any systematic and comprehensive review of the literature on predictors of noncompliance in organ transplant patients so far. With organ transplantation, noncompliance impairs both quality of life and life span, as it is a major risk factor for graft rejection episodes and is responsible for up to 25% of deaths after the initial recovery period. It might therefore be assumed that well-informed transplant patients are a highly motivated group whose compliance is just as high. This is not the case: even when graft loss means loss of life, as in heart or liver transplantation, noncompliance occurs. To best select potential organ recipients, it would be ideal if patients who are very likely to show noncompliant behavior could be identified before being transplanted. CONCLUSION: The literature overview shows the necessity of preoperative psychosocial screening regarding predictors of posttransplant noncompliance.

Journal ArticleDOI
TL;DR: In renal transplant recipients treated with mycophenolate mofetil and cyclosporine, reduction and early withdrawal of the prophylactic corticosteroid dose is feasible without an unacceptable increase in serious rejection episodes.
Abstract: BACKGROUND Renal transplant recipients experience adverse events attributed to corticosteroid therapy. METHODS This was a multicenter, randomized, double-blind, 6-month, controlled steroid dose-reduction study in renal transplant recipients with an unblinded 6-month follow-up. In the low/stop arm, corticosteroids were given at half the dosage of control for 3 months from the date of transplantation, and then withdrawn. Both arms received mycophenolate mofetil and cyclosporine. The primary endpoint was the incidence of biopsy-proven acute rejection at 6 months posttransplantation. RESULTS There were 248 patients in the control group and 252 in the low/stop group. At 6 months the low/stop group had more biopsy-proven acute rejection episodes than the control (23% vs. 14%; P=0.008). At 12 months this increased to 25% vs. 15%. Most rejections were Banff grade I. Twelve-month graft loss was 5% in the low/stop group vs. 4% in the control. At 6 and 12 months serum cholesterol (P<0.01, P<0.01), triglycerides (P<0.01, P<0.01), and systolic blood pressure (P<0.001, P<0.001) were lower in the low/stop group. Diastolic pressure was lower (P<0.01) and lumbar spine bone density was greater (P<0.01) in the low/stop group at 12 months. CONCLUSIONS In renal transplant recipients treated with mycophenolate mofetil and cyclosporine, reduction and early withdrawal of the prophylactic corticosteroid dose is feasible without an unacceptable increase in serious rejection episodes. This is accompanied by a significant reduction of steroid-related adverse events.
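
The 23% vs. 14% comparison (P=0.008) above is a two-sample test of proportions. A minimal sketch with counts back-calculated from the reported percentages (58/252 vs. 35/248; the trial's exact test statistic is not stated, so this is illustrative):

```python
from statsmodels.stats.proportion import proportions_ztest

# Back-calculated counts: 23% of 252 ~ 58 rejections, 14% of 248 ~ 35.
stat, p = proportions_ztest(count=[58, 35], nobs=[252, 248])
print(f"z = {stat:.2f}, p = {p:.3f}")  # in the vicinity of the reported P=0.008
```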

Journal ArticleDOI
TL;DR: Strong expression of CD55 and CD59 completely protected porcine kidneys from hyperacute rejection and allowed a detailed analysis of xenograft rejection in the absence of immunosuppression.
Abstract: Background The genetic modification of pigs is a powerful strategy that may ultimately enable successful xenotransplantation of porcine organs into humans. Methods Transgenic pigs were produced by microinjection of gene constructs for human complement regulatory proteins CD55 and CD59 and the enzyme alpha1,2-fucosyltransferase (H-transferase, HT), which reduces expression of the major xenoepitope galactose-alpha1,3-galactose (alphaGal). Kidneys from CD55/HT and CD55/CD59/HT transgenic pigs were transplanted into nephrectomised, nonimmunosuppressed adult baboons. Results In several lines of transgenic pigs, CD55 and CD59 were expressed strongly in all tissues examined, whereas HT expression was relatively weak and did not significantly reduce alphaGal. Control nontransgenic kidneys (n=4) grafted into baboons were hyperacutely rejected within 1 hr. In contrast, kidneys from CD55/HT pigs (n=2) were rejected after 30 hr, although kidneys from CD55/CD59/HT pigs (n=6) maintained function for up to 5 days. In the latter grafts, infiltration by macrophages, T cells, and B cells was observed at days 3 and 5 posttransplantation. The recipients developed thrombocytopenia and abnormalities in coagulation, manifested in increased clotting times and an elevation in the plasma level of the fibrin degradation product D-dimer, within 2 days of transplantation. Treatment with low molecular weight heparin prevented profound thrombocytopenia but not the other aspects of coagulopathy. Conclusions Strong expression of CD55 and CD59 completely protected porcine kidneys from hyperacute rejection and allowed a detailed analysis of xenograft rejection in the absence of immunosuppression. Coagulopathy appears to be a common feature of pig-to-baboon renal transplantation and represents yet another major barrier to its clinical application.

Journal ArticleDOI
TL;DR: Porcine islets of Langerhans were found to be protected by sCR1 and heparin after exposure to human blood in vitro or after intraportal transplantation into cynomolgus monkeys.
Abstract: Damage to porcine islets of Langerhans after exposure to human blood in vitro, or after intraportal transplantation to cynomolgus monkeys: protective effects of sCR1 and heparin.

Journal ArticleDOI
TL;DR: The intense nonimmune inflammation produced in isografts after donor BD may represent the initial stages of a continuum between an initial nonspecific and later immune reactivity, when placed in the context of allotransplantation.
Abstract: BACKGROUND: Brain death (BD) has been thought to influence the early course of transplanted organs by triggering a series of nonspecific inflammatory events that in turn may increase the kinetics and intensity of the immunological host responses. In this study early nonspecific, cellular, and molecular changes occurring in kidney isografts from BD donors are compared with those from normal anesthetized, ventilated controls. METHODS: After induction of brain death, the animals were mechanically ventilated for 6 hr before organ removal. Only rats with stable blood pressure (mean arterial pressure >80 mmHg) were included. Serum creatinines were measured daily. Representative grafts were harvested 6 hr after brain death and between 1 hr and 5 days after engraftment for morphology, immunohistology, and reverse transcriptase-polymerase chain reaction. The presence of serum cytokines was assessed by enzyme-linked immunosorbent assay. RESULTS: Serum creatinine levels rose slightly in recipients from BD donors. Serum interleukin-1beta levels increased within 6 hr versus controls (P<0.05). mRNA levels of interleukin-1beta and macrophage inflammatory protein-1 in the kidneys were up-regulated transiently before engraftment (6 hr after BD) and 1 hr after revascularization (P<0.05). By immunohistology, numbers of infiltrating polymorphonuclear leukocytes peaked at 24 hr in parallel with intragraft induction of P- and E-selectin, complement, and other proinflammatory chemokines and cytokines. At 5 days, the isografts from BD donors were highly infiltrated by host leukocyte populations associated with intense up-regulation of their products. In contrast, those from control donors remained relatively normal through this initial follow-up period. CONCLUSIONS: The intense nonimmune inflammation produced in isografts after donor BD may represent the initial stages of a continuum between an initial nonspecific and later immune reactivity, when placed in the context of allotransplantation.

Journal ArticleDOI
TL;DR: Independently of known confounding variables, the impact of AR on CAF has significantly increased from 1988 to 1997, which may in part explain the relative lack of improvements in long-term renal allograft survival, despite a decline in AR rates.
Abstract: Background. Acute rejection (AR) remains a major risk factor for the development of chronic renal allograft failure (CAF), which is a major cause of late graft loss. With the introduction of several newer immunosuppressive agents (e.g., mycophenolate mofetil, tacrolimus, and Neoral), acute rejection rates have been steadily decreasing. However, the incidence of CAF has not decreased as dramatically as the incidence of acute rejection. One possible explanation is that the impact of AR on CAF is changing. The goal of this study was to analyze the relative impact of AR era on the development of CAF. Methods. We evaluated 63,045 primary renal transplant recipients reported to the USRDS from 1988 to 1997. CAF was defined as graft loss after 6 months posttransplantation, censored for death, acute rejection, thrombosis, infection, surgical complications, or recurrent disease. A Cox proportional hazards model correcting for 15 possible confounding factors evaluated the relative impact of AR on CAF. The era effect (years 1988-1989, 1990-1991, 1992-1993, 1994-1995, and 1996-1997) was evaluated by an era versus AR interaction term. Results. An AR episode within the first 6 months after transplantation was the most important risk factor for subsequent CAF (RR=2.4, CI 2.3-2.5). Compared with the reference group (1988-89 with no rejection), having an AR episode in 1988-1989, 1990-1991, 1992-1993, 1994-1995, and 1996-1997 conferred a 1.67, 2.35, 3.4, 4.98, and 5.2-fold relative risk for the subsequent development of CAF (P<0.001). Conclusions. Independently of known confounding variables, the impact of AR on CAF has significantly increased from 1988 to 1997. This effect may in part explain the relative lack of improvements in long-term renal allograft survival, despite a decline in AR rates.
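
The era-dependent rejection effect above comes from an interaction term in the Cox model. A hedged sketch of how such a term can be coded (toy data invented for illustration; lifelines again):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented toy data: acute-rejection flag, transplant era (0 = 1988-89
# through 4 = 1996-97), follow-up years, and graft-failure indicator.
df = pd.DataFrame({
    "years":  [2.0, 3.5, 1.2, 4.1, 2.8, 5.0, 3.3, 1.9, 4.6, 2.4],
    "failed": [1,   0,   1,   1,   0,   0,   1,   1,   0,   1],
    "ar":     [1,   0,   1,   0,   1,   0,   1,   1,   0,   1],
    "era":    [0,   0,   1,   1,   2,   2,   3,   3,   4,   4],
})
df["ar_x_era"] = df["ar"] * df["era"]  # the era-versus-AR interaction term

cph = CoxPHFitter(penalizer=0.1)  # small penalty keeps the toy fit stable
cph.fit(df, duration_col="years", event_col="failed")
# A positive ar_x_era coefficient would indicate that the hazard
# associated with AR grows in later eras, as the abstract reports.
print(cph.summary[["coef", "exp(coef)"]])
```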

Journal ArticleDOI
TL;DR: These studies are the first to consistently demonstrate prevention of a secondary humoral response after cell or organ transplantation in a pig-to-primate model and suggest that T cell costimulatory blockade may facilitate induction of mixed hematopoietic chimerism and, consequently, of tolerance to pig organs and tissues.
Abstract: In pig-to-primate organ transplantation, hyperacute rejection can be prevented, but the organ is rejected within days by acute vascular rejection, in which induced high-affinity anti-Gal alpha1-3Gal (alphaGal) IgG and possibly antibodies directed against new porcine (non-alphaGal) antigenic determinants are considered to play a major role. We have explored the role of an anti-CD40L monoclonal antibody in modifying the humoral response to porcine hematopoietic cells in baboons pretreated with a nonmyeloablative regimen.

Journal Article
TL;DR: Normalization of renal function (urea and creatinine) in primate recipients of porcine renal xenografts suggests that pig kidneys may be suitable for future clinical xenotransplantation.
Abstract: Background. Recently, there has been a resumed interest in clinical xenotransplantation using pig organs. However, no data are available yet regarding the capacity of porcine organs to sustain the life of a primate beyond the first month. We have attempted to obtain long-term survival of nonhuman primates using human decay-accelerating factor (hDAF) transgenic pig organs and an immunosuppressive strategy particularly aimed at neutralizing the humoral component of the immune response. Methods. hDAF transgenic or control kidneys were transplanted into 14 bilaterally nephrectomized cynomolgus monkeys (Macaca fascicularis) that underwent splenectomy and were immunosuppressed with cyclosporine A, cyclophosphamide, and steroids. All animals also received recombinant erythropoietin. Postoperatively, the primates were monitored daily. Laboratory evaluations included serum biochemistry, hematology, and measurements of hemolytic antipig antibodies. To assess the role of splenectomy in the control of humoral response, historical data were also used from a group of monkeys (n=7) that received the same immunosuppressive regimen and an hDAF transgenic porcine kidney but did not have splenectomy or receive recombinant erythropoietin. Results. This immunosuppressive approach obtained the longest survival time (78 days) described to date of a primate receiving a life-supporting porcine renal xenograft. Furthermore, four of nine animals in this series survived for 50 days or more. Most biochemical measurements in this study (including plasma urea, creatinine, sodium, and potassium concentrations) remained within normal ranges for several weeks in all of the longest-surviving animals. Conclusions. Normalization of renal function (urea and creatinine) in primate recipients of porcine renal xenografts suggests that pig kidneys may be suitable for future clinical xenotransplantation. Additional immunosuppressive approaches, specifically designed to prevent humorally mediated immunological damage, should be explored to further prolong survival of primates that have received porcine xenografts.

Journal ArticleDOI
TL;DR: The data demonstrate that for ex vivo expansion of human BMSCs, medium with FBS remains most effective; incubating BMSCs in serum-free medium before transplantation stimulates subsequent bone formation, which increases the practicality of using culture-expanded BMSCs for autologous human transplantation and suggests the presence of osteogenic inhibitors in serum.
Abstract: Background. Bone marrow stromal cell (BMSC) transplantation may offer an efficacious method for the repair of bone defects. This approach has been developed using BMSCs expanded ex vivo in medium with fetal bovine serum (FBS). For clinical applications, however, contact of BMSCs with FBS should be minimized. We studied the effect of FBS substitutes on both human BMSC proliferation in vitro and subsequent bone formation in vivo. Methods. BMSC proliferation was measured by colony forming efficiency (CFE) and by cell numbers at consecutive passages. Bone formation was studied in 6- to 8-week-old transplants of human BMSCs in immunocompromised mice. Results. Medium with FBS was more effective in stimulating BMSC proliferation than medium with either human serum (HS) or rabbit serum (RS). Compared to bone formed by BMSCs cultured continuously with FBS, bone formed by cells cultured with HS, or with FBS switched to HS, was considerably less extensive, while bone formed by cells cultured with FBS switched to serum-free medium (SFM) was considerably more extensive. The increase in bone formation was due to neither the SFM components nor to the proliferation status of BMSCs prior to transplantation. Conclusions. Our data demonstrate that for ex vivo expansion of human BMSCs, medium with FBS remains most effective. However, incubation of human BMSCs in SFM prior to in vivo transplantation significantly stimulates subsequent bone formation. This finding increases the practicality of using culture-expanded BMSCs for autologous human transplantation and suggests the presence of osteogenic inhibitors in serum.

Journal ArticleDOI
TL;DR: The data demonstrate that a large number of patients deemed non-sensitized by cytotoxicity-based antibody assessment are, in fact, sensitized; therefore, if a transplant center chooses to forego a prospective final crossmatch, the decision to do so should be based on methods more sensitive than AHG-CDC.
Abstract: BACKGROUND Since the landmark studies of Patel and Terasaki in the late 1960s, pretransplant cross-matching has been performed by HLA laboratories on a 24-hr/7-day basis. In fact, regulating agencies such as the American Society for Histocompatibility and Immunogenetics and the United Network for Organ Sharing have mandated prospective crossmatching for selected solid organ transplants. However, two recent publications (Transplantation 1998; 66: 1833; and Transplantation 1998; 66: 1835) have suggested a change to this approach. Specifically, those authors advocate the transplantation of non-sensitized individuals without a final prospective cross-match as a means to reduce cold ischemia time and the incidence of delayed graft function. Such considerations were predicated upon results generated by cytotoxicity-based antibody screening. We and others, however, have reported that a flow cytometric-based assay is a more sensitive method to detect alloantibodies than cytotoxicity. Furthermore, an increasing number of reports document that graft survival is improved among patients whose final flow cytometric crossmatches were negative compared to patients with positive flow cytometric crossmatches. Although we agree that it is reasonable to transplant truly non-sensitized patients without a prospective final crossmatch, our data demonstrate that a large number of patients deemed non-sensitized by cytotoxicity-based antibody assessment are, in fact, sensitized. METHODS Panel-reactive antibody (PRA) testing was performed with 703 sera from 527 patients. The patient population consisted of individuals awaiting either renal or cardiac transplantation. PRA evaluations were performed using lymphocyte cytotoxicity (antiglobulin-enhanced, complement-dependent cytotoxicity [AHG-CDC]) or solid-phase assays (enzyme-linked immunosorbent assay [ELISA] and flow cytometry) in which solubilized HLA molecules were affixed to solid-phase matrices. RESULTS PRA activity in 264 sera from 88 patients was evaluated by AHG-CDC, ELISA, and flow cytometry. Results among the three methods were concordant for 83% of these sera. Discordant results occurred with 32 samples and demonstrated a distinct hierarchy in the sensitivity of the three techniques to detect alloantibodies. None of the 32 sera were positive by AHG-CDC, 20/32 were positive by ELISA, and 32/32 were positive by flow cytometry. Subsequent studies revealed that, among 527 patients, 302 (57%) exhibited 0% PRA by AHG-CDC. Of these 302 AHG-CDC-negative patients, 76 (25%) had class I or class II antibodies detectable using a flow cytometric approach. Within the AHG-CDC-negative/flow cytometric-positive patients, PRA values exhibited a wide range (6-99%) for both class I and class II antibodies. The average PRA was 27% and 38% for class I and II, respectively. Retrospective flow cytometric crossmatches performed for 30 recipients of cardiac allografts whose AHG-CDC PRA was 0% revealed that 11/30 crossmatches were positive. CONCLUSIONS The concept of transplanting non-sensitized patients without a prospective final crossmatch is appealing and, if bona fide, clearly makes sense. However, our data demonstrate that how a patient is deemed non-sensitized is critical. The difference between AHG- and flow cytometric-based PRA testing is significant and can result in transplantation of alloimmunized patients considered to be non-sensitized. Therefore, we recommend that, if a transplant center chooses to forego a prospective final crossmatch, the decision to do so should be based on methods more sensitive than AHG-CDC.
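
PRA in this setting is simply the fraction of a test panel with which a serum reacts. A small illustrative helper (the 40-member panel is an assumption; the numbers echo the abstract's point that flow cytometry uncovers sensitization that AHG-CDC scores as 0%):

```python
def pra_percent(positive_reactions: int, panel_size: int) -> float:
    """Panel-reactive antibody (%) = reactive panel members / panel size * 100."""
    return 100.0 * positive_reactions / panel_size

# The same serum scored against an assumed 40-member panel by two methods:
print(pra_percent(0, 40))   # 0.0  -> "non-sensitized" by AHG-CDC
print(pra_percent(11, 40))  # 27.5 -> sensitized by flow cytometry
```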

Journal ArticleDOI
TL;DR: The data suggest that ligation of surface-expressed CD154 provides an important signal that modulates T cell activation and thereby contributes to the effects of CD154 mAb, in addition to previously recognized actions involving blockade of CD40/CD154-dependent cell activation and activation-induced cell death.
Abstract: Background Recent experimental data indicate that targeting of the costimulatory molecule CD40-ligand (CD154) may well offer an opportunity for tolerance induction in transplant recipients and patients with autoimmune diseases, although the optimal therapeutic strategy for clinical application of CD154 monoclonal antibody (mAb) is unclear. Methods We undertook vascularized heterotopic cardiac allograft transplantation in completely MHC-mismatched mice, treated recipients with CD154 mAb plus various immunosuppressive agents, and performed flow cytometric analysis of CD154 expression by T cells activated in vitro in the presence of the corresponding immunosuppressive agents. We also tested the extent to which CD154 induction was NF-kappaB-dependent by using NF-kappaB/p50-deficient mice as allograft recipients and as a source of cells for in vitro studies of CD154 induction, and through use of proteasome inhibitors to block IkappaBalpha degradation and NF-kappaB activation in wild-type mice. Results Concomitant use of cyclosporin A or methylprednisolone, but not rapamycin or mycophenolate, inhibited CD154 mAb-induced allograft survival. The differential effects of these agents on CD154 mAb-induced tolerance correlated with their capacity to inhibit activation-induced CD154 expression on CD4+ T cells. Full CD154 expression was found to require NF-kappaB activation, and CD154 mAb was ineffective in NF-kappaB/p50-deficient allograft recipients or control mice in which NF-kappaB activation was blocked by proteasome inhibition. Conclusions Strategies to use CD154 mAb clinically must take into account the effects of immunosuppressive agents on CD154 induction, which seems to be at least partially NF-kappaB dependent. Our data suggest that ligation of surface-expressed CD154 provides an important signal that modulates T cell activation and thereby contributes to the effects of CD154 mAb, in addition to previously recognized actions involving blockade of CD40/CD154-dependent cell activation and activation-induced cell death.

Journal ArticleDOI
TL;DR: Routine ureteric stenting is unnecessary in kidney transplantation in patients at low risk for urological complications, and careful surgical technique with selective stenting of problematic anastomoses yields similar results.
Abstract: Background Whether routine ureteric stenting in low-urological-risk patients reduces the risk of urological complications in kidney transplantation is not established. Methods Eligible patients were recipients of single-organ renal transplants with normal lower urinary tracts. Patients were randomized intraoperatively to receive either routine stenting or stenting only in the event of technical difficulties with the anastomosis. All patients underwent Lich-Gregoire ureteroneocystostomy. Results Between June 1994 and December 1997, 331 kidney transplants were performed at a single center; 305 patients were eligible, and 280 patients were enrolled and randomized. Donor and recipient age, sex, donor source, whether first or subsequent grafts, ureteric length, native renal disease, and immunosuppression were similar in each group. In the no-routine-stenting group, 6 of 137 patients (4.4%) received stents after randomization for intraoperative events that in the surgeon's opinion required use of a stent. In an intention-to-treat analysis there was no difference between groups in the primary outcome cluster of obstruction or leak [routine stenting 5 of 143 (3.5%) vs. no routine stenting 9 of 137 (6.6%); P=0.23], or in either of these complications analyzed separately. All urological complications were successfully managed without major morbidity. Living donor organs and shorter ureteric length (after trimming) were univariate risk factors for leaks, whereas increasing donor age was associated with obstruction. Conclusions Routine ureteric stenting is unnecessary in kidney transplantation in patients at low risk for urological complications. Careful surgical technique with selective stenting of problematic anastomoses yields similar results. Urological complications after kidney transplantation can cause significant morbidity and mortality. Although an early report described a 29% incidence of ureteric complications (1), improvements in surgical techniques have reduced this figure to 5-10% (2-4). Although placement of ureteric stents is common practice for ureteric anastomoses in general urology, their routine use in renal transplantation is controversial. Retrospective studies have suggested benefit (5, 6), but randomized controlled trials have produced conflicting results (7-10). We conducted a randomized controlled trial of routine ureteric stenting in low-urological-risk kidney transplantation.

Journal ArticleDOI
TL;DR: NAPRTCS data show that graft survival is improved in patients receiving PTx, compared with those receiving PD and HD, and graft loss resulting from vascular thrombosis is more common in children who receive PD than in those receiving HD.
Abstract: Background. There are no large studies of the effect of pretransplant dialysis status on the outcome of renal transplantation (Tx) in children. This study evaluated the North American Pediatric Renal Transplant Cooperative Study (NAPRTCS) registry data for the outcome of Tx in pediatric patients who either (1) received their transplants preemptively or (2) were maintained on dialysis before receiving their transplants. Methods. We compared graft survival and patient survival rates, incidence of acute tubular necrosis (ATN), acute rejection episodes, and causes of graft failure in peritoneal dialysis (PD) patients with those maintained on hemodialysis (HD) and those undergoing preemptive Tx (PTx). Results. Primary Tx was performed in 2495 children (59% male; 61% Caucasian; 1090 PD, 780 HD, 625 PTx) between 1/1/1992 and 12/31/1996. The overall graft survival rates of the PD and HD groups were similar, but were less than that of the PTx group (3-year: 82% PD and HD, 89% PTx; overall P=0.0003). Improved graft survival in the PTx group was present only in recipients of grafts from living donors. There was no difference in the overall patient survival rate at 3 years, or in time to first acute-rejection episodes, in the three groups. The incidence of ATN in the first 7 days post-Tx was higher in PD and HD patients than in PTx patients (11% PD and 12% HD vs. 2% PTx, P<0.001; HD vs. PD, P=NS). The major single cause of graft failure in each group was: PD, vascular thrombosis (20%); HD, chronic rejection (27%); PTx, acute and chronic rejection (21% each). Conclusion. NAPRTCS data show that graft survival is improved in patients receiving PTx compared with those receiving PD and HD. Graft loss resulting from vascular thrombosis is more common in children who receive PD than in those receiving HD. Several renal replacement therapies are available for children with end-stage renal disease (ESRD), but the selection for an individual patient is modified by many variables, including the child's size, underlying disease, availability of organs, and physician and family preference. According to NAPRTCS data, 63% of children with ESRD had undergone some form of peritoneal dialysis in 1995 (1). Up to 75% of children undergo some kind of dialysis therapy before primary renal transplantation, whereas the rest undergo preemptive Tx.

Journal ArticleDOI
TL;DR: Renal transplant recipients can safely be given deferred ganciclovir therapy for CMV disease if they are intensively monitored for CMV infection; the study also identifies subgroups of patients likely to benefit from CMV prophylaxis or preemptive therapy.
Abstract: BACKGROUND Cytomegalovirus (CMV) infection is the single most frequent infectious complication in renal transplant recipients. Because no CMV prophylaxis is given and ganciclovir is used only as deferred therapy for CMV disease at our center, we have been able to study the natural course of CMV infections. The aim was to assess risk factors for CMV infection and disease and thus identify subgroups of patients likely to benefit from CMV prophylaxis or preemptive therapy. METHODS Between October 1994 and July 1997, 477 consecutive renal transplant recipients (397 first transplants and 80 retransplants) were included in the study. The patients were followed prospectively for 3 months with serial measurements of CMV pp65 antigen for monitoring activity of CMV infections. RESULTS The incidence of CMV infections in first transplants was 68% in the D+R- and D+/-R+ serostatus groups, whereas the incidence of CMV disease was higher in D+R- (56%) than in D+/-R+ (20%, P<0.001). No difference in severity of CMV disease between D+R- and D+/-R+ was seen except for an increased incidence of hepatitis in primary infections. One of 14 deaths could be associated with CMV disease in a seropositive recipient. Cox regression analysis showed that rejection (RR 2.5, P<0.01) and serostatus group D+R- (RR 3.9, P<0.001) were significant risk factors for development of CMV disease. The maximum CMV pp65 antigen count had a significant correlation with disease only in CMV-seropositive recipients (P<0.001). CONCLUSIONS Renal transplant recipients can safely be given deferred ganciclovir therapy for CMV disease if they are intensively monitored for CMV infection. Patients with primary CMV infection (D+R-), CMV-infected patients undergoing anti-rejection therapy, and R+ patients with high CMV pp65 counts seem to have a particular potential for benefit from preemptive anti-CMV therapy.

Journal ArticleDOI
TL;DR: Both pig cell transplantation (PCTx) and pig organ transplantation (POTx) lead to profound alterations in hemostasis and coagulation parameters that must be overcome if discordant xenotransplantation of hematopoietic cells and organs is to be fully successful.
Abstract: Efforts to achieve tolerance to transplanted pig organs in nonhuman primates by the induction of a state of mixed hematopoietic chimerism have been associated with disorders of coagulation and thrombosis. Activation of recipient vascular endothelium and platelets by porcine hematopoietic cells and/or activation of donor organ vascular endothelium and/or molecular differences between the species may play roles. Irradiation or drug therapy could possibly potentiate endothelial cell activation and/or injury.

Journal Article
TL;DR: Deep-seated fungal infections (FI) were relatively rare in this series, although their mortality rate is still very high; three additional patients were diagnosed with deep mould infection by histology alone.
Abstract: Background Fungal infections (FI) after solid organ transplantation (Tx) remain a major cause of morbidity and mortality. Aspergillus and Candida account for more than 80% of FI. Methods One thousand nine hundred and sixty-three patients undergoing thoracic organ Tx [1,852 heart and 111 lung (35 heart-lung Tx, 30 double-lung Tx, 46 single-lung Tx)] in 12 Italian centers between November 1985 and January 1997 were included in the study. Results Fifty-one patients (41 heart Tx, 2.2%; 9 heart-lung Tx, 25.7%; 1 single-lung Tx, 2.2%) developed 53 invasive FI at a median of 58 days (range 6-2479) after Tx. Aspergillosis was the most frequent FI in our series, accounting for 64.1% (34/53) of all FI [A. fumigatus, n=29 (85.3%); A. nidulans, n=2 (5.9%); A. niger, n=2 (5.9%); A. terreus, n=1 (2.9%)]; 30 (88.2%) patients developed invasive lung aspergillosis, 2 (5.9%) a tracheobronchitis, 1 (2.9%) a skin infection, and 1 (2.9%) a sternal wound infection. Twelve patients (22.6%) developed candidiasis [C. albicans, n=8 (66.6%); C. krusei, n=1 (8.3%); C. glabrata, n=1 (8.3%); C. parapsilosis, n=1 (8.3%); C. sake, n=1 (8.3%)]. There were seven episodes (58.3%) of candidemia, two (16.7%) of esophagitis, two (16.7%) of gastritis, and one (8.3%) of tracheobronchitis. Mortality was 29.4% for patients developing aspergillosis and 33.3% for those experiencing candidiasis. Furthermore, four patients developed the following: one C. neoformans meningitis, one Sporothrix cyanescens pneumonia, one Rhizopus spp. tracheobronchitis, and one Trichosporon beigelii disseminated infection. Three additional patients were diagnosed with deep mould infection by histology alone. Conclusions Deep-seated FI were relatively rare in our series, although their mortality rate is still very high.

Journal ArticleDOI
TL;DR: Late-occurring lymphomas could be considered an entity distinct from PTLD occurring within 1 year of transplant, because they show a histological and clinical presentation similar to lymphomas of immunocompetent subjects, are frequently negative for the EBV genome, are invariably clonal, and may rearrange the c-myc oncogene.
Abstract: Background. Solid organ transplant patients undergoing long-term immunosuppression have a high risk of developing lymphomas. The pathogenesis of late-occurring posttransplantation lymphoproliferative disorders (PTLD) has not yet been extensively investigated. Methods. We studied 15 patients who developed PTLD after a median of 79 months (range 22-156 months) after organ transplant. Clonality, presence of the Epstein-Barr virus (EBV) genome, and genetic lesions were evaluated by Southern blot analysis or polymerase chain reaction. Results. All monomorphic PTLD and two of three polymorphic PTLD showed a monoclonal pattern. Overall, 44% of samples demonstrated the presence of the EBV genome. Within monomorphic PTLD, the proportion of EBV-positive lymphomas was even lower (31%). A c-myc gene rearrangement was found in two cases (13%), whereas none of the 15 samples so far investigated showed bcl-1, bcl-2, or bcl-6 rearrangement. The modulation of immunosuppression was ineffective in all patients with monomorphic PTLD, independent of the presence of the EBV genome. The clinical outcome after chemotherapy was poor because of infectious complications and resistant disease. With a median follow-up of 4 months, the median survival time of these patients was 7 months. Conclusions. Late-occurring lymphomas could be considered an entity distinct from PTLD occurring within 1 year of transplant, because they show a histological and clinical presentation similar to lymphomas of immunocompetent subjects, are frequently negative for the EBV genome, are invariably clonal, and may rearrange the c-myc oncogene. New therapeutic strategies are required to reduce the mortality rate, and new modalities of long-lasting immunosuppression are called for.

Journal ArticleDOI
TL;DR: Tacrolimus in combination with an initial dose of mycophenolate mofetil 2 g/day is a very effective and safe regimen in cadaveric kidney transplant recipients.
Abstract: BACKGROUND Tacrolimus (FK506) is a safe and effective treatment for the prevention of rejection of renal allografts. Mycophenolate mofetil (MMF) has been used as adjunct immunosuppressive therapy with cyclosporine and corticosteroids for the same purpose. The objective of this study was to investigate the safety and efficacy of FK506 and MMF in renal transplant recipients. METHODS After cadaveric renal transplant, patients were randomized to receive tacrolimus in combination with either azathioprine (AZA, n=59), MMF 1 g/day (n=59), or MMF 2 g/day (n=58). Patients were followed for 1 year posttransplant for the incidence of biopsy-confirmed acute rejection, patient and graft survival, and adverse events. RESULTS Tacrolimus doses and trough concentrations were similar between treatment groups at all time points; 80% of patients were maintained within a range of 5.0-13.9 ng/ml at 12 months posttransplant. The mean dose of MMF decreased in the 2 g/day group to 1.5 g/day by 6 months posttransplant, primarily due to gastrointestinal (GI)-related disorders. The incidence of biopsy-confirmed acute rejection at 1 year was 32.2%, 32.2%, and 8.6% in the AZA, MMF 1 g/day, and MMF 2 g/day groups, respectively (P<0.01). The use of antilymphocyte antibodies for the treatment of rejection was comparable across treatment groups. The incidence of most adverse events was similar across treatment groups and comparable with previous reports. The overall incidence of posttransplant diabetes mellitus was 11.9%, with the lowest rate observed in the MMF 2 g/day group (4.7%), and was reversible in 40% of patients. The incidence of malignancies and opportunistic infections was low and not different across treatment groups. CONCLUSION Tacrolimus in combination with an initial dose of MMF 2 g/day is a very effective and safe regimen in cadaveric kidney transplant recipients.