
Showing papers in "Transplantation in 2001"


Journal ArticleDOI
TL;DR: The addition of either 2 mg/day sirolimus or 5 mg/day sirolimus to CsA/corticosteroid therapy significantly reduces the incidence of acute rejection episodes in primary mismatched renal allograft recipients, without an increase in immunosuppressant-related side effects, at 6 months and at 1 year after transplantation.
Abstract: Background. Despite the various immunosuppressive regimens presently in use, acute rejection in the early postoperative period continues to occur in 20 to 40% of renal transplant patients. In a double-blind, multicentred study, we investigated the ability of two different doses of sirolimus (rapamycin, RAPAMUNE), a new class of immunosuppressant that blocks cell cycle progression, to prevent acute rejection in recipients of primary mismatched renal allografts when added to a regimen of cyclosporine (cyclosporin A, CsA) and corticosteroids. Methods. Between October 1996 and September 1997, 576 recipients of primary mismatched cadaveric or living donor renal allografts were randomly assigned in a 2:2:1 ratio (before the transplant operation) to receive an initial loading dose of either 6 or 15 mg of orally administered sirolimus, followed by a daily dose of either 2 or 5 mg/day, or to receive a matched placebo. All groups received cyclosporine (microemulsion formula, CsA) and corticosteroids. The primary endpoint was a composite of first occurrence of biopsy-confirmed acute rejection, graft loss, or death during the first 6 months after transplantation. Safety data were monitored by an independent drug safety monitoring board. Results. Based on an intention-to-treat analysis of 576 patients, there were no significant differences in patient demographic or baseline characteristics among treatment groups. The overall rate of the primary composite endpoint for the 6-month period after transplantation was 30.0% (68/227) in the 2 mg/day sirolimus group and 25.6% (56/219) in the 5 mg/day sirolimus group, significantly lower than the 47.7% (62/130) in the placebo group (P=0.002, P<0.001, respectively). During this period, the incidence of biopsy-confirmed acute rejection was 24.7% (56/227) in the 2 mg/day sirolimus group and 19.2% (42/219) in the 5 mg/day sirolimus group, compared with 41.5% (54/130) in the placebo group (P=0.003, P<0.001, respectively), representing a significant reduction in acute rejection of 40.5 and 53.7%, respectively. The need for antibody therapy to treat the first episode of biopsy-confirmed acute rejection was significantly reduced in the 5 mg/day sirolimus group (3.2%) compared to the placebo group (8.5%; P=0.044). The results 1 year after transplantation were similar for the efficacy parameters studied. Adverse events and infections occurred in all groups. Conclusions. The addition of either 2 mg/day sirolimus or 5 mg/day sirolimus to CsA/corticosteroid therapy significantly reduces the incidence of acute rejection episodes in primary mismatched renal allograft recipients, without an increase in immunosuppressant-related side effects, including infections and malignancy, at 6 months and at 1 year after transplantation.
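The group-versus-placebo P values quoted above come from comparing event proportions between arms. A minimal sketch of how such a two-proportion comparison can be reproduced, assuming the scipy package (the trial's actual statistical methodology is not detailed in this abstract):

```python
# Compare two event proportions via a chi-square test on a 2x2 table,
# e.g., the 6-month composite endpoint: 68/227 (2 mg/day) vs. 62/130 (placebo).
from scipy.stats import chi2_contingency

def compare_proportions(events_a, n_a, events_b, n_b):
    table = [[events_a, n_a - events_a],
             [events_b, n_b - events_b]]
    chi2, p, dof, expected = chi2_contingency(table)
    return p

print(compare_proportions(68, 227, 62, 130))  # same order of magnitude as the reported P=0.002
```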

612 citations


Journal ArticleDOI
TL;DR: Preservation of limbal stem cells in culture gives new perspectives on the treatment of ocular disorders characterized by complete limbal stem cell deficiency, and the handiness and ease of long-distance transportation of the fibrin-cultured epithelial sheets suggest that this technology can now be widely applied.
Abstract: Background. Ocular burns cause depletion of limbal stem cells, which leads to corneal opacification and visual loss. Autologous cultured epithelial cells can restore damaged corneas, but this technology is still developing. We sought to establish a culture system that allows preservation of limbal stem cells and preparation of manageable epithelial sheets and to investigate whether such cultures can permanently restore total limbal stem cell deficiency. Methods. We selected a homogeneous group of patients whose limbal cell deficiency was evaluated by scoring the gravity of the clinical picture and the keratin expression pattern. Stem cells, obtained from the limbus of the contralateral eye, were cultivated onto a fibrin substrate and their preservation was evaluated by clonal analysis. Fibrin cultures were grafted onto damaged corneas. Results. Fibrin-cultured limbal stem cells were successful in 14 of 18 patients. Re-epithelialization occurred within the first week. Inflammation and vascularization regressed within the first 3-4 weeks. By the first month, the corneal surface was covered by a transparent, normal-looking epithelium. At 12-27 months follow-up, corneal surfaces were clinically and cytologically stable. Three patients had a penetrating keratoplasty approximately 1 year after restoration of their corneal surface. Their visual acuity improved from light perception or counting fingers to 0.8-1.0. Conclusions. Preservation of limbal stem cells in culture gives new perspectives on the treatment of ocular disorders characterized by complete limbal stem cell deficiency. The multicenter nature of this study and the handiness and ease of long-distance transportation of the fibrin-cultured epithelial sheets suggest that this technology can now be widely applied.

499 citations


Journal ArticleDOI
TL;DR: Patients who are more than 10 years post-OLTX have CRF and ESRD at a high rate and new strategies for long-term immunosuppression may be needed to decrease this complication.
Abstract: Background The calcineurin inhibitors cyclosporine and tacrolimus are both known to be nephrotoxic. Their use in orthotopic liver transplantation (OLTX) has dramatically improved success rates. Recently, however, we have had an increase of patients who are presenting after OLTX with end-stage renal disease (ESRD). This retrospective study examines the incidence and treatment of ESRD and chronic renal failure (CRF) in OLTX patients. Methods Patients receiving an OLTX only from June 1985 through December of 1994 who survived 6 months postoperatively were studied (n=834). Our prospectively collected database was the source of information. Patients were divided into three groups: Controls, no CRF or ESRD, n=748; CRF, sustained serum creatinine >2.5 mg/dl, n=41; and ESRD, n=45. Groups were compared for preoperative laboratory variables, diagnosis, postoperative variables, survival, type of ESRD therapy, and survival from onset of ESRD. Results At 13 years after OLTX, the incidence of severe renal dysfunction was 18.1% (CRF 8.6% and ESRD 9.5%). Compared with control patients, CRF and ESRD patients had higher preoperative serum creatinine levels, a greater percentage of patients with hepatorenal syndrome, higher percentage requirement for dialysis in the first 3 months postoperatively, and a higher 1-year serum creatinine. Multivariate stepwise logistic regression analysis using preoperative and postoperative variables identified that increases of serum creatinine relative to the average at 1 year, 3 months, and 4 weeks postoperatively were independent risk factors for the development of CRF or ESRD, with odds ratios of 2.6, 2.2, and 1.6, respectively. Overall survival from the time of OLTX was not significantly different among groups, but by year 13, the survival of the patients who had ESRD was only 28.2% compared with 54.6% in the control group. Patients developing ESRD had a 6-year survival after onset of ESRD of 27% for the patients receiving hemodialysis versus 71.4% for the patients developing ESRD who subsequently received kidney transplants. Conclusions Patients who are more than 10 years post-OLTX have CRF and ESRD at a high rate. The development of ESRD decreases survival, particularly in those patients treated with dialysis only. Patients who develop ESRD have a higher preoperative and 1-year serum creatinine and are more likely to have hepatorenal syndrome. However, an increase of serum creatinine at various times postoperatively is more predictive of the development of CRF or ESRD. New strategies for long-term immunosuppression may be needed to decrease this complication.
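The odds ratios of 2.6, 2.2, and 1.6 above are the exponentiated coefficients of a logistic regression. A minimal sketch on synthetic data, assuming numpy and statsmodels (this is not the study's dataset or exact model):

```python
# Illustrate how a logistic-regression coefficient maps to an odds ratio.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
creat_rise = rng.normal(0, 1, n)               # standardized 1-year creatinine rise
logit = -2.0 + 0.95 * creat_rise               # exp(0.95) ~ 2.6, the reported odds ratio
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)  # simulated CRF/ESRD outcome

model = sm.Logit(y, sm.add_constant(creat_rise)).fit(disp=0)
print(np.exp(model.params[1]))                 # fitted odds ratio, roughly 2.6
```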

474 citations


Journal ArticleDOI
TL;DR: Sirolimus allows early cyclosporine withdrawal in renal transplantation, resulting in improved renal function and lower blood pressure.
Abstract: Sirolimus allows early cyclosporine withdrawal in renal transplantation, resulting in improved renal function and lower blood pressure.

424 citations


Journal ArticleDOI
TL;DR: Porcine extracellular matrix elicits an immune response that is predominantly Th2-like, consistent with a remodeling reaction and graft acceptance rather than rejection.
Abstract: Background Porcine small intestinal submucosa (SIS) is an acellular, naturally derived extracellular matrix (ECM) that has been used for tissue remodeling and repair in numerous xenotransplantations. Although a vigorous immune response to xenogeneic extracellular matrix biomaterials is expected, to date there has been evidence for only normal tissue regeneration without any accompanying rejection. The purpose of this study was to determine the reason for a lack of rejection. Methods Mice were implanted s.c. with xenogeneic tissue, syngeneic tissue, or SIS, and the graft site analyzed histologically for rejection or acceptance. Additionally, graft site cytokine levels were determined by reverse transcriptase polymerase chain reaction and SIS-specific serum antibody isotype levels were determined by ELISA. Results Xenogeneically implanted mice showed an acute inflammatory response followed by chronic inflammation and ultimately graft necrosis, consistent with rejection. Syngeneically or SIS implanted mice, however, showed an acute inflammatory response that diminished such that the graft ultimately became indistinguishable from native tissue, observations that are consistent with graft acceptance. Graft site cytokine analysis showed an increase in interleukin-4 and an absence of interferon-gamma. In addition, mice implanted with SIS produced a SIS-specific antibody response that was restricted to the IgG1 isotype. Reimplantation of SIS into mice led to a secondary anti-SIS antibody response that was still restricted to IgG1. Similar results were observed with porcine submucosa derived from urinary bladder. To determine if the observed immune responses were T cell dependent, T cell KO mice were implanted with SIS. These mice expressed neither interleukin-4 at the implant site nor anti-SIS-specific serum antibodies but they did accept the SIS graft. Conclusions Porcine extracellular matrix elicits an immune response that is predominantly Th2-like, consistent with a remodeling reaction rather than rejection.

415 citations


Journal ArticleDOI
TL;DR: CsA does not appear to be a major human teratogen and may be associated with increased rates of prematurity, although more research is needed to evaluate whether cyclosporine increases teratogenic risk.
Abstract: Background. Cyclosporine (CsA) therapy must often be continued during pregnancy to maintain maternal health in such conditions as organ transplantation and autoimmune disease. This meta-analysis was performed to determine whether CsA exposure during pregnancy is associated with an increased risk of congenital malformations, preterm delivery, or low birthweight. Methods. Various health science databases were searched to identify relevant articles. Articles selected for inclusion in the study were required to be free of any apparent selection bias and report outcomes in at least 10 newborns exposed to CsA in utero, specifically commenting on the presence or absence of congenital malformations. Article selection and data extraction were performed by two independent reviewers, with adjudication in cases of disagreement. To assess risks of CsA exposure, a summary odds ratio was calculated. Prevalence of malformations was calculated as a rate for all cyclosporine-exposed live births and for the subgroups identified. Ninety-five percent confidence intervals were constructed for both the odds ratio and prevalence rates. Results. Fifteen studies (6 with control groups of transplant recipients without use of cyclosporine; total patients: 410) met the inclusion criteria for major malformations, 10 for preterm delivery (4 with control groups; total patients: 379), and 5 for low birth weight (1 with a control group; total number of patients: 314). The calculated odds ratio of 3.83 for malformations did not achieve statistical significance (CI 0.75-19.6). The overall prevalence of major malformations in the study population (4.1%) also did not vary substantially from that reported in the general population. The OR for prematurity [1.52 (CI 1.00-2.32)] did not reach statistical significance, although the overall prevalence rate was 56.3%. The OR for low birth weight was 1.5 (CI 0.95-2.44, based on 1 study). Conclusions. CsA does not appear to be a major human teratogen. It may be associated with increased rates of prematurity. More research is needed to evaluate whether cyclosporine increases teratogenic risk.
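A "summary odds ratio" across studies, as computed in this meta-analysis, is commonly obtained with the Mantel-Haenszel fixed-effect estimator. A minimal sketch with placeholder 2x2 counts (not the meta-analysis's actual data):

```python
# Mantel-Haenszel pooled odds ratio across k studies.
# Each study contributes (a, b, c, d): exposed cases, exposed non-cases,
# unexposed cases, unexposed non-cases.
def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

studies = [(3, 47, 1, 49), (2, 38, 1, 39)]  # hypothetical malformation counts
print(mantel_haenszel_or(studies))          # pooled OR for these made-up tables
```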

392 citations


Journal ArticleDOI
TL;DR: Most cases with DSA at the time of rejection had widespread C4d deposits in peritubular capillaries, suggesting a pathogenic role of the circulating alloantibody.
Abstract: Background. Acute rejection (AR) associated with de novo production of donor-specific antibodies (DSA) is a clinicopathological entity that carries a poor prognosis (acute humoral rejection, AHR). The aim of this study was to determine the incidence and clinical characteristics of AHR in renal allograft recipients, and to further analyze the antibodies involved. Methods. During a 4-year period, 232 renal transplants (Tx) were performed at our institution. Assays for DSA included T and B cell cytotoxic and/or flow cytometric cross-matches and cytotoxic antibody screens (PRA). C4d complement staining was performed on frozen biopsy tissue. Results. A total of 81 patients (35%) suffered at least one episode of AR within the first 3 months: 51 had steroid-insensitive AR whereas the remaining 30 had steroid-sensitive AR. No DSA were found in patients with steroid-sensitive AR. In contrast, circulating DSA were found in 19/51 patients (37%) with steroid-insensitive AR, and widespread C4d deposits in peritubular capillaries were present in 18 of these 19 (95%). In at least three cases, antibodies were against donor HLA class II antigens. DSA were not found in the remaining 32 patients but C4d staining was positive in 2 of 32. The DSA/C4d positive (n=18) and DSA/C4d negative (n=30) groups differed in pre-Tx PRA levels, percentage of re-Tx patients, refractoriness to antilymphocyte therapy, and outcome. Plasmapheresis and tacrolimus-mycophenolate mofetil rescue reversed rejection in 9 of 10 recipients with refractory AHR. Conclusion. More than one-third of the patients with steroid-insensitive AR had evidence of AHR, often resistant to antilymphocyte therapy. Most cases (95%) with DSA at the time of rejection had widespread C4d deposits in peritubular capillaries, suggesting a pathogenic role of the circulating alloantibody. Combined DSA testing and C4d staining provides a useful approach for the early diagnosis of AHR, a condition that often necessitates a more intensive therapeutic rescue regimen.

346 citations


Journal ArticleDOI
TL;DR: The profound effect of the IL-2 gene polymorphism in homozygous individuals may serve as a marker for those that could mount the most vigorous allo- or autoimmune responses, or perhaps become tolerant more easily.
Abstract: BACKGROUND Genetic variations in cytokine genes are thought to regulate cytokine protein production. However, studies using T cell mitogens have not always demonstrated a significant relationship between cytokine polymorphisms and in vitro protein production. Furthermore, the functional consequence of a polymorphism at position -330 in the IL-2 gene has not been described. We associated in vitro protein production with cytokine gene polymorphic genotypes after costimulation of cultured peripheral blood lymphocytes. METHODS PBL were isolated from forty healthy volunteers. Cytokine protein production was assessed by enzyme-linked immunosorbent assay. Polymorphisms in interleukin- (IL) 2, IL-6, IL-10, tumor necrosis factor (TNF-alpha), transforming growth factor (TGF-beta), and interferon-gamma (IFN-gamma) were determined by polymerase chain reaction (PCR). RESULTS Statistical difference between protein production and cytokine polymorphic variants in the IL-10, IFN-gamma, and TNF-alpha genes was not evident after 48-hour stimulation with concanavalin-A. In contrast, after anti-CD3/CD28 stimulation significant differences (P<0.05) were found among high and low producers for IL-2, IL-6, and among high, intermediate, and low producers for IFN-gamma, and IL-10. Augmented levels of IL-2 in individuals that were homozygous for the polymorphic IL-2 allele were due to an early and sustained enhancement of IL-2 production. No association was found among TNF-alpha and TGF-beta genotypes and protein production. CONCLUSION Polymorphisms in IL-2, IL-6, IL-10, and IFN-gamma genes are associated with their protein production after anti-CD3/CD28 stimulation. The profound effect of the IL-2 gene polymorphism in homozygous individuals may serve as a marker for those that could mount the most vigorous allo- or autoimmune responses, or perhaps become tolerant more easily.

339 citations


Journal ArticleDOI
TL;DR: Reduction in immunosuppression is an effective initial therapy for posttransplant lymphoproliferative disorder, and clinical prognostic factors may allow clinicians to identify which patients are likely to respond to reduction in immunosuppression.
Abstract: Background Posttransplant lymphoproliferative disorder (PTLD) is an Epstein-Barr virus-associated malignancy that occurs in the setting of pharmacologic immunosuppression after organ transplantation. With the increased use of organ transplantation and intensive immunosuppression, this disease is becoming more common. We explore reduction in immunosuppression as an initial therapy for PTLD. Methods We analyzed our organ transplant patient database to identify patients with biopsy-proven PTLD who were initially treated with reduction of their immunosuppressive medications with or without surgical resection of all known disease. Results Forty-two adult patients were included in this study. Thirty patients were treated with reduction in immunosuppression alone. Twelve patients were treated with both reduction in immunosuppression and surgical resection of all known disease. Thirty-one of 42 patients (73.8%) achieved a complete remission. Of those patients who were treated with reduction in immunosuppression alone, 19 of 30 (63%) responded with a median time to documentation of response of 3.6 weeks. Multivariable analysis showed that elevated lactate dehydrogenase (LDH) ratio, organ dysfunction, and multi-organ involvement by PTLD were independent prognostic factors for lack of response to reduction in immunosuppression. In patients with none of these poor prognostic factors, 16 of 18 (89%) responded to reduction in immunosuppression in contrast to three of five (60%) with one risk factor and zero of seven (0%) with two to three factors present. The analysis also showed that increased age, elevated LDH ratio, severe organ dysfunction, presence of B symptoms (fever, night sweats, and weight loss), and multi-organ involvement by PTLD at the time of diagnosis are independent prognostic indicators for poor survival. With median follow-up of 147 weeks, 55% of patients are alive with 50% in complete remission. Conclusions Reduction in immunosuppression is an effective initial therapy for PTLD. Clinical prognostic factors may allow clinicians to identify which patients are likely to respond to reduction in immunosuppression.

315 citations


Journal ArticleDOI
TL;DR: Among patients with type 1 DM with end-stage nephropathy, SPK transplantation before the age of 50 years was associated with long-term improvement in survival compared to solitary cadaveric renal transplantation or dialysis.
Abstract: Background. Simultaneous pancreas-kidney transplantation (SPK) ameliorates the progression of microvascular diabetic complications but the procedure is associated with excess initial morbidity and an uncertain effect on patient survival when compared with solitary cadaveric or living donor renal transplantation. We evaluated mortality risks associated with SPK, solitary renal transplantation, and dialysis treatment in a national cohort of type 1 diabetics with end-stage nephropathy. Methods. A total of 13,467 adult type 1 diabetics enrolled on the renal and renal-pancreas transplant waiting list between 10/01/88 and 06/30/97 were followed until 06/30/98. Time-dependent mortality risks and life expectancy were calculated according to the treatment received subsequent to wait-list registration: SPK; cadaveric kidney only (CAD); living donor kidney only (LKD) transplantation; and dialysis [wait-listed, maintenance dialysis treatment (WLD)]. Results. Adjusted 10-year patient survival was 67% for SPK vs. 65% for LKD recipients (P=0.19) and 46% for CAD recipients (P<0.001). The excess initial mortality normally associated with renal transplantation and the risk of early infectious death was 2-fold higher in SPK recipients. The time to achieve equal proportion of survivors as the WLD patients was 170, 95, and 72 days for SPK, CAD, and LKD recipients, respectively (P<0.001). However, the adjusted 5-year mortality risk (RR) using WLD as the reference and the expected remaining life years were 0.40, 0.45, and 0.75 and 23.4, 20.9, and 12.6 years for SPK, LKD, and CAD, respectively. There was no survival benefit in SPK recipients ≥50 years old (RR=1.38, P=0.81). Conclusions. Among patients with type 1 DM with end-stage nephropathy, SPK transplantation before the age of 50 years was associated with long-term improvement in survival compared to solitary cadaveric renal transplantation or dialysis.
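The adjusted mortality risks (RR) above are the output of a proportional-hazards style survival model. A minimal sketch on synthetic data, assuming the lifelines package (the registry analysis itself used time-dependent methods not reproduced here):

```python
# Fit a Cox proportional-hazards model comparing two treatment groups.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years": [2.1, 5.0, 7.3, 1.2, 9.8, 4.4, 6.1, 3.3],  # follow-up time
    "died":  [1, 0, 1, 1, 0, 0, 1, 0],                  # 1 = death observed
    "spk":   [1, 1, 0, 0, 1, 0, 0, 1],                  # 1 = SPK, 0 = wait-listed dialysis
})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.hazard_ratios_)  # analogous in spirit to the RR=0.40 reported for SPK vs. WLD
```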

307 citations


Journal ArticleDOI
TL;DR: Laparoscopic donor nephrectomy is associated with a briefer, less intense, and more complete convalescence compared with the open surgical approach.
Abstract: Background. Laparoscopic live donor nephrectomy for renal transplantation is being performed in increasing numbers with the goals of broadening organ supply while minimizing pain and duration of convalescence for donors. Relative advantages in terms of recovery provided by laparoscopy over standard open surgery have not been rigorously assessed. We hypothesized that laparoscopic as compared with open surgical live donor nephrectomy provides briefer, less intense, and more complete convalescence. Methods. Of 105 volunteer, adult, potential living-renal donors interested in the laparoscopic approach, 70 were randomly assigned to undergo either hand-assisted laparoscopic or open surgical live donor nephrectomy at a single referral center. Objective data and subjective recovery information obtained with telephone interviews and validated questionnaires administered 2 weeks, 6 weeks, and 6–12 months postoperatively were compared between the 23 laparoscopic and 27 open surgical patients. Results. There was 47% less analgesic use (P =0.004), 35% shorter hospital stay (P =0.0001), 33% more rapid return to nonstrenuous activity (P =0.006), 23% sooner return to work (P =0.037), and 73% less pain 6 weeks postoperatively (P =0.004) in the laparoscopy group. Laparoscopic patients experienced complete recovery sooner (P =0.032) and had fewer long-term residual effects (P =0.0015). Conclusions. Laparoscopic donor nephrectomy is associated with a briefer, less intense, and more complete convalescence compared with the open surgical approach.

Journal ArticleDOI
TL;DR: In this article, five right liver grafts without a middle hepatic vein (MHV) were transplanted into patients including two with hepatitis B virus cirrhosis, two with fulminant hepatic failure, and one with secondary biliary cirrhosis.
Abstract: Background A left liver graft from a small donor will not meet the metabolic demands of a larger adult recipient. One solution to this problem is to use a right liver graft without a middle hepatic vein (MHV). However, the need for drainage from the MHV tributaries has not yet been described. Methods Five right liver grafts without an MHV were transplanted into patients including two with hepatitis B virus cirrhosis, two with fulminant hepatic failure, and one with secondary biliary cirrhosis. The graft weight ranged from 650 to 1,000 g, corresponding to 48 to 83% of the standard liver volume of the recipients. Results Two of the five recipients developed severe congestion of the right median sector immediately after reperfusion, followed by prolonged massive ascites and severe liver dysfunction. One of the patients died of sepsis with progressive hepatic dysfunction 20 days after the operation. Conclusions Preservation and reconstruction of the MHV tributaries is recommended to prevent congestion of a right liver graft without the MHV.

Journal ArticleDOI
TL;DR: A majority of long-term patients undergoing living donor liver transplantation may be potential candidates to be successfully weaned from immunosuppression, although the mechanism of graft acceptance in these patients has yet to be elucidated.
Abstract: Background Some reported studies have indicated the possibility of immunosuppression withdrawal in cadaveric liver transplantation. The aim of this study was to evaluate the possibility and feasibility of weaning living donor liver transplant recipients from immunosuppression. Methods From June of 1990 to October of 1999, 63 patients were considered to be weaned from immunosuppression. They consisted of 26 electively weaned patients and 37 either forcibly or incidentally weaned patients (nonelective weaning) due to various causes but mainly due to infection. Regarding elective weaning, we gradually reduced the frequency of tacrolimus administration for patients who survived more than 2 years after transplantation, maintained a good graft function, and had no rejection episodes in the preceding 12 months. The frequency of administration was reduced from the conventional b.i.d. before weaning to q.d., then 4 times a week, 3 times a week, twice a week, once a week, twice a month, and once a month, until the patients were completely weaned off, with each weaning step lasting from 3 to 6 months. The reduction method of nonelective weaning depended on the clinical course of each individual case. When the patients were clinically diagnosed to develop rejection during weaning, such patients were treated by a reintroduction of tacrolimus or an additional steroid bolus when indicated. Results Twenty-four patients (38.1%) achieved a complete withdrawal of tacrolimus with a median drug-free period of 23.5 months (range, 3-69 months). Twenty-three patients (36.5%) are still being weaned at various stages. Sixteen patients (25.4%) encountered rejection while weaning at a median period of 9.5 months (range, 1-63 months) from the start of weaning. All 16 were easily treated with the reintroduction of tacrolimus or additional steroid bolus therapy. Conclusions We were able to achieve a complete withdrawal of immunosuppression in some selected patients. Although the mechanism of graft acceptance in these patients has yet to be elucidated, we believe that a majority of long-term patients undergoing living donor liver transplantation may, thus, be potential candidates to be successfully weaned from immunosuppression.

Journal ArticleDOI
TL;DR: The results indicate that T-cell depletion is achieved rapidly and primarily in peripheral lymphoid tissues at high ATG dosage, and short ATG treatments could therefore be clinically evaluated when major peripheral T-cell depletion is required.
Abstract: Background. The mechanisms of action of polyclonal antithymocyte globulins (ATGs) are still poorly understood and the selection of doses used in different clinical applications (prevention or treatment of acute rejection in organ allografts, treatment of graft-versus-host disease, or conditioning for allogeneic stem cell transplantation) remains empirical. Low T-cell counts are usually achieved in peripheral blood during ATG treatment but the extent of T-cell depletion in lymphoid tissues is unknown. Methods. Experiments were conducted in cynomolgus monkeys using Thymoglobuline at low (1 mg/kg), high (5 mg/kg), and very high (20 mg/kg) doses. Results. ATG treatment induced a dose-dependent lymphocytopenia in the blood and a dose-dependent T-cell depletion in spleen and lymph nodes but not in the thymus, indicating a limited access of ATG to this organ. T-cell apoptosis in peripheral lymphoid tissues was the main mechanism of depletion. Remaining T cells in peripheral lymphoid organs were coated by antibodies and had down-modulated surface expression of CD2, CD3, CD4, and CD8 molecules, whereas their responsiveness in mixed leukocyte reaction was impaired. The survival of MHC-mismatched skin and heart allografts was prolonged in a dose-dependent fashion, despite the occurrence of a strong anti-ATG antibody response resulting in the rapid clearance of circulating ATGs. Conclusion. The results indicate that T-cell depletion is achieved rapidly and primarily in peripheral lymphoid tissues at high ATG dosage. Short ATG treatments could therefore be clinically evaluated when major peripheral T-cell depletion is required.

Journal ArticleDOI
TL;DR: Sirolimus is very probably responsible for interstitial pneumonitis and should be added to the list of possible causes of pulmonary complications after renal transplantation, as cessation or dose reduction of sirolimus led to complete and lasting resolution of symptoms.
Abstract: Background. Sirolimus, a promising new immunosuppressive drug for organ transplantation, is currently associated with side effects, such as thrombocytopenia and hyperlipidemia. Methods. Eight renal transplant recipients, who developed unexplained interstitial pneumonitis during sirolimus therapy, were ...

Journal ArticleDOI
TL;DR: On average, the remaining renal function of kidney donors did not deteriorate more rapidly than what may be expected from ageing, supporting the continued use of living kidney donors if strict criteria are used for acceptance.
Abstract: BACKGROUND There is a lack of kidneys available for kidney transplantation, and living donors are increasingly used. It is important to examine the possible long-term adverse effect on the renal function and blood pressure of the donors. METHODS We have made a comprehensive follow-up of all living kidney donors at our center from 1964 to 1995. Of 402 donors still alive, we were able to obtain information on serum creatinine, urinary proteins and blood cells in urine (using reagent strips), and blood pressure from 87%. The glomerular filtration rate (GFR) was estimated using a formula and was measured with iohexol clearance in 43 of the donors. Individual data on GFR and the prevalence of hypertension were compared with the age- and gender-expected values. RESULTS The mean age of the examined donors was 61 years (SD:13) at follow-up, and the time since donation was 12 years (SD:8). The average estimated GFR was 72% (SD:18) of the age-predicted value. The ratio of the estimated to the predicted GFR showed no correlation to the time since donation, indicating that there is no accelerated loss of renal function after donation. GFR below 30 ml/min was found in five donors. No donor died in uremia or had dialysis treatment before death. However, three donors developed renal disease, and one was in dialysis treatment. In two of these cases, hereditary factors were possibly involved. Hypertension was present in 38% of the donors but the age-adjusted prevalence of hypertension among donors was not higher than in the general population. Significant proteinuria (> or =1.0 g/L) was found in 3% and slight proteinuria (<1.0 g/L) in 9% of the donors. Proteinuria was associated with hypertension and a lower GFR. CONCLUSIONS On average, the remaining renal function of kidney donors did not deteriorate more rapidly than what may be expected from ageing. However, one-third of the female and half of the male donors developed hypertension and approximately 10% displayed proteinuria. Nevertheless, our study supports the continued use of living kidney donors if strict criteria are used for acceptance.
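The abstract says GFR was "estimated using a formula" from serum creatinine without naming it; the Cockcroft-Gault creatinine-clearance estimate is one widely used example and is sketched here purely for illustration:

```python
# Cockcroft-Gault estimated creatinine clearance (SI units), a common
# creatinine-based proxy for GFR; not necessarily the formula used in the paper.
def cockcroft_gault(age_years, weight_kg, serum_creatinine_umol_l, female):
    ccl = (140 - age_years) * weight_kg * 1.23 / serum_creatinine_umol_l
    return ccl * 0.85 if female else ccl  # ~15% lower for women

# e.g., a 61-year-old, 70 kg female donor with serum creatinine 110 umol/L
print(cockcroft_gault(61, 70, 110, female=True))  # ml/min
```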

Journal ArticleDOI
TL;DR: A high prevalence of PTDM was found in HCV (+) recipients, and PTDM after OLT was associated with significantly increased mortality; HCV infection and methylprednisolone boluses were found to be independent risk factors for the development of PTDM.
Abstract: Background. Recent studies suggest an association between diabetes mellitus and hepatitis C virus (HCV) infection. Our aim was to determine (1) the prevalence and determinants of new onset posttransplant diabetes mellitus (PTDM) in HCV (+) liver transplant (OLT) recipients, (2) the temporal relationship between recurrent allograft hepatitis and the onset of PTDM, and (3) the effects of antiviral therapy on glycemic control. Methods. Between January of 1991 and December of 1998, of 185 OLTs performed in 176 adult patients, 47 HCV (+) cases and 111 HCV (-) controls were analyzed. We reviewed and analyzed the demographics, etiology of liver failure, pretransplant alcohol abuse, prevalence of diabetes mellitus, and clinical characteristics of both groups. In HCV (+) patients, the development of recurrent allograft hepatitis and its therapy were also studied in detail. Results. The prevalence of pretransplant diabetes was similar in the two groups, whereas the prevalence of PTDM was significantly higher in HCV (+) than in HCV (-) patients (64% vs. 28%, P=0.0001). By multivariate analysis, HCV infection (hazard ratio 2.5, P=0.001) and methylprednisolone boluses (hazard ratio 1.09 per bolus, P=0.02) were found to be independent risk factors for the development of PTDM. Development of PTDM was found to be an independent risk factor for mortality (hazard ratio 3.67, P<0.0001). The cumulative mortality in HCV (+) PTDM (+) versus HCV (+) PTDM (-) patients was 56% vs. 14% (P=0.001). In HCV (+) patients with PTDM, we could identify two groups based on the temporal relationship between the allograft hepatitis and the onset of PTDM: 13 patients developed PTDM either before or in the absence of hepatitis (group A), and 12 concurrently with the diagnosis of hepatitis (group B). In group B, 11 of 12 patients received antiviral therapy. Normalization of liver function tests with improvement in viremia was achieved in 4 of 11 patients, who also demonstrated a marked improvement in their glycemic control. Conclusion. We found a high prevalence of PTDM in HCV (+) recipients. PTDM after OLT was associated with significantly increased mortality. HCV infection and methylprednisolone boluses were found to be independent risk factors for the development of PTDM. In approximately half of the HCV (+) patients with PTDM, the onset of PTDM was related to the recurrence of allograft hepatitis. Improvement in glycemic control was achieved in the patients who responded to antiviral therapy.

Journal ArticleDOI
TL;DR: Early tubulointerstitial damage at 3 months profoundly influenced graft survival beyond 10 years and CAN was predicted by kidney ischemia, 3-month chronic intimal vascular thickening, tubular injury, proteinuria, and late rejection.
Abstract: Background. Chronic renal allograft failure remains a major challenge to overcome. Factors such as donor quality, delayed graft function (DGF), acute rejection, and immunosuppression are known to affect long-term outcome, but their relationship to histological damage to graft outcome is unclear. Methods. Protocol kidney biopsies (n=112) obtained at 3 months after transplantation yielded 102 with adequate tissue. Histology was scored by the Banff schema, and compared with implantation biopsies (n=91), repeat 12-month histology (n=39), decline in serum creatinine and serial isotopic glomerular filtration rate, onset of chronic allograft nephropathy (CAN), and actuarial graft survival censored for death with a functioning graft. Results. At a median follow-up of 9.3 years, 20 patients had graft failure and 26 died with a functioning graft. Banff chronic nephropathy was present in 24% of 3-month biopsies, and was predicted by microvascular disease in the donor, cold ischemia, DGF, and acute vascular rejection (P<0.001). Acute glomerulitis at 3 months correlated with segmental glomerulosclerosis at 12 months, subsequent recurrent glomerulonephritis, and graft failure (P<0.01). Subclinical rejection at 3 months occurred in 29% of biopsies, correlated with prior acute rejection and HLA mismatch, and led to chronic histological damage by 12 months (r=0.25-0.67, P<0.05-0.001). Subclinical rejection, arteriolar hyalinosis, and tubulitis present at 3 months had resolved by 12 months. The 10-year survival rates for Banff chronic nephropathy were 90.4% for grade 0, 81.0% for grade 1, and 57.9% for grades 2 or greater (P<0.01). Early tubulointerstitial damage at 3 months profoundly influenced graft survival beyond 10 years. CAN was predicted by kidney ischemia, 3-month chronic intimal vascular thickening, tubular injury, proteinuria, and late rejection. Chronic fibrointimal thickening of the small arteries and chronic interstitial fibrosis at 3 months independently predicted graft loss and decline in renal function (P<0.05-0.001). Conclusions. Early transplant damage occurs in the tubulointerstitial compartment from preexisting donor kidney injury and discrete events such as vascular rejection and DGF. Subsequent chronic damage and graft failure reflect accumulated previous injury and chronic interstitial fibrosis, vascular impairment, subclinical rejection, and injury from late rejection. CAN may be conceptualized as the sequelae of incremental and cumulative damage to the transplanted kidney. The duration of graft survival is dependent and predicted by the quality of the transplanted donor kidney combined with the intensity, frequency, and irreversibility of these damaging insults.
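The 10-year survival rates by Banff grade above are actuarial (Kaplan-Meier) estimates censored for death with a functioning graft. A minimal sketch on synthetic data, assuming the lifelines package:

```python
# Kaplan-Meier graft survival, censoring death with a functioning graft.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "years":      [1.5, 4.0, 9.3, 2.2, 10.1, 6.7],
    "graft_lost": [1, 0, 0, 1, 0, 1],  # death with a functioning graft -> 0 (censored)
})
kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["graft_lost"])
print(kmf.survival_function_)  # step estimates, analogous to the 10-year rates quoted
```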

Journal ArticleDOI
TL;DR: This CNI avoidance study in immunologic low-risk patients, while only partially successful in preventing acute rejection, provided benefits to a sizable minority of patients who have not required chronic CNI therapy.
Abstract: Background The adoption of calcineurin inhibitors (CNI) as the mainstay of immunosuppression has resulted in a significant decrease of acute rejection and improvement of short-term graft survival. However, because of the irreversible nephrotoxicity associated with the chronic use of the CNI, the magnitude of the improvement of long-term graft survival has been more modest. Therefore, an effective immunosuppression regimen that does not rely on CNI may result in improvement of long-term outcome and simplification of the management of transplant recipients. Methods Ninety-eight recipients of primary cadaver or living donor kidneys at low immunologic risk were enrolled in a CNI avoidance study. The immunosuppression regimen consisted of daclizumab, a humanized monoclonal antibody that binds to the alpha chain of the interleukin-2 receptor (IL-2Ralpha), administered for a total of five doses at biweekly intervals; 3 gm/day mycophenolate mofetil for the first 6 months and 2 gm thereafter; and conventional corticosteroid therapy. Patients who underwent rejection episodes could be started on CNI. The primary efficacy end-point was biopsy-proven rejection during the first 6 months posttransplant. Results Biopsy-proven rejection was diagnosed in 48% of patients during the first 6 months after transplantation. The majority of rejection episodes were Banff grade I and IIA and were fully reversed with corticosteroid therapy. The median time to the first biopsy-proven rejection among patients who experienced this event during the first 6 months was 39 days. In 22 patients with delayed graft function, the proportion of patients with biopsy-proven rejection was 50% at 6 months. However, in the first 2 weeks posttransplant, only 1 of 22 patients with delayed graft function developed biopsy-proven rejection. At 1 year, patient survival was 97% and graft survival was 96%. Only two grafts were lost secondary to rejection. At 1-year posttransplant, 62% of patients had received CNI for more than 7 days. At 1-year posttransplant, the mean serum creatinine in the nonrejectors with no CNI use was 113 micromol/L (95% confidence interval [CI], 100.7 to 125.3 micromol/L) and in the rejectors or patients with CNI use (more than 7 days) was 154 micromol/L (95% CI, 135.0 to 173.0 micromol/L). In selected patients with rejection, analysis of circulating and intragraft lymphocytes revealed complete IL-2Ralpha saturation. Conclusions This CNI avoidance study in immunologic low-risk patients, while only partially successful in preventing acute rejection, provided benefits to a sizable minority of patients who have not required chronic CNI therapy. However, wide acceptance of a CNI-sparing immunosuppression regimen may require a lower rate of acute rejection, possibly through the addition of a non-nephrotoxic dose of CNI. However, because complete IL-2Ralpha blockade was present during rejection, it can be assumed that alternative pathways, such as IL-15, may be responsible for the rejection; thus, the incorporation of non-nephrotoxic immunosuppressive agents, such as sirolimus, may provide a more strategic approach.

Journal ArticleDOI
TL;DR: The experimental design allows the identification of patients with sufficient, insufficient, or absent T-cell activity and can serve as diagnostic tool to facilitate decisions on antiviral therapy.
Abstract: Background. Immunosuppressive treatment in transplant patients frequently causes infectious complications with cytomegalovirus (CMV). The extent of CMV replication can be followed by a number of diagnostic methods. There is, however, no simple diagnostic tool to assess the quality of the cellular antiviral immune response of an individual patient. This would be of particular importance for therapy decisions, as patients with detectable virus load do not necessarily develop CMV-related disease. Using a rapid whole blood assay, the frequencies of CMV-reactive CD4 and CD8 T cells were followed after renal transplantation to characterize their relative contribution in the containment of CMV infection. Methods. T cells from transplant patients and healthy control persons were stimulated with CMV antigen in vitro. Based on specific cellular activation and induction of intracellular cytokines, the frequency of CMV-reactive CD4 and CD8 T cells was determined using flow cytometry. Viral load was quantified using the “hybrid-capture” assay. Results. The absence of CMV complications in long-term transplant recipients is reflected by stable virus-specific T-cell frequencies, which do not differ from healthy CMV-positive controls. In contrast, during the first months after transplantation, clinical symptoms are preceded by a decrease in CMV-reactive CD4 T-cell frequencies and an increase in CMV load. Conclusions. The individual immune response and CMV replication are critically balanced and can be characterized by assessing both viral load and antiviral T cells. Our experimental design allows the identification of patients with sufficient, insufficient, or absent T-cell activity and can serve as diagnostic tool to facilitate decisions on antiviral therapy.

Journal ArticleDOI
TL;DR: Antivimentin antibodies are an independent predictor of TxCAD and can be used to identify some of the patients who are at high risk of developing this complication.
Abstract: Background. Transplant-associated coronary artery disease (TxCAD) is the most serious long-term complication after cardiac transplantation. Anti-endothelial antibodies are associated with disease, and one of the major endothelial antigens recognized in the sera of patients has been shown to be the protein filament vimentin. In this study, we investigated whether antivimentin antibodies are associated with TxCAD and whether their presence can be used to identify patients at high risk of developing angiographically detectable TxCAD. Methods. Up to 5 years after transplantation, 880 sequential sera (7.07+/-1.8 samples/patient) were collected retrospectively from 109 patients; the majority were collected in the first 2 years. Sera were assessed for antivimentin antibodies using ELISA. TxCAD was assessed by annual angiography. Results. Mean titres of antivimentin antibodies, calculated up to 1, 2, and 5 years, were significantly higher in patients who developed TxCAD than in those who remained disease free. Using a high 1-year mean titre (greater than or equal to 120) as a risk factor produced a test with 63% sensitivity and 76% specificity. Inclusion of persistent rejection or high 1-year mean titre (greater than or equal to 270) as a risk factor produced a test with 66% sensitivity and 82% specificity. Multivariate analysis of time to occurrence of transplant vasculopathy showed that mean titre at 1 or 2 years was an independent predictor of time until disease in the presence of all other variables. Conclusions. Antivimentin antibodies are an independent predictor of TxCAD and can be used to identify some of the patients who are at high risk of developing this complication.
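Sensitivity and specificity figures like the 63%/76% above follow directly from dichotomizing titres at a cutoff. A minimal sketch with hypothetical titres and disease labels:

```python
# Sensitivity/specificity of an antibody-titre cutoff for predicting TxCAD.
def sens_spec(titres, has_disease, cutoff):
    tp = sum(t >= cutoff and d for t, d in zip(titres, has_disease))
    fn = sum(t < cutoff and d for t, d in zip(titres, has_disease))
    tn = sum(t < cutoff and not d for t, d in zip(titres, has_disease))
    fp = sum(t >= cutoff and not d for t, d in zip(titres, has_disease))
    return tp / (tp + fn), tn / (tn + fp)

titres = [80, 150, 300, 60, 210, 95, 400, 130]                 # hypothetical titres
txcad  = [False, True, True, False, True, False, True, False]  # hypothetical outcomes
print(sens_spec(titres, txcad, cutoff=120))  # (sensitivity, specificity)
```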

Journal ArticleDOI
TL;DR: This overview seeks to familiarize the reader with the clinical information that provided the bases for drug approval and with the single-center reports that document alternate approaches to optimize the outcomes of treatment with this immunosuppressive agent.
Abstract: Sirolimus (rapamycin; RAPA) is a macrocyclic lactone with a novel mechanism of immunosuppressive action (1). During the past 7 years, the drug has undergone clinical trials progressing from Phase I safety, tolerability, and pharmacokinetic investigation to Phase II dose-finding studies and limited-sized, multicenter evaluations of drug combination regimens. The completion of Phase III large randomized national and international trials led to approval of the drug to achieve augmented acute rejection prophylaxis in combination with cyclosporine (CsA) and steroids by the Food and Drug Administration of the United States in September 1999. In November 2000, the drug was approved by the European Agency as an alternate to calcineurin antagonists for long-term maintenance therapy. This overview seeks to familiarize the reader with the clinical information that provided the bases for drug approval and with the single-center reports that document alternate approaches to optimize the outcomes of treatment with this immunosuppressive agent.

Journal ArticleDOI
TL;DR: Management interventions such as antihypertensives, lipid‐lowering agents, antidiabetic therapy, aspirin prophylaxis, and smoking cessation have a positive impact on known risk factors for cardiovascular disease, and their use may decrease cardiovascular morbidity and mortality in renal transplant recipients.
Abstract: A large proportion of late renal allograft failures are attributable to patient death with a functioning graft, with almost half of the deaths related to cardiovascular events. Using data from the Framingham Heart Study and our own renal transplant population, risk factors for cardiovascular disease in the general population, such as hypertension, hyperlipidemia, and cigarette smoking, were found to be predictive in renal transplant recipients. However, diabetes mellitus dramatically elevated the risk in renal transplantation. Also, two or more acute rejection episodes in the first year after transplantation were associated with a greater risk, whereas pretransplantation nephrectomy and higher serum albumin levels reduced the risk for ischemic heart events. Pretransplantation screening assists identification of patients who are at risk of, or who have preexisting, cardiovascular disease. Management interventions such as antihypertensives, lipid-lowering agents, antidiabetic therapy, aspirin prophylaxis, and smoking cessation have a positive impact on known risk factors for cardiovascular disease, and their use may decrease cardiovascular morbidity and mortality in renal transplant recipients.

Journal ArticleDOI
TL;DR: In this paper, a single infusion of Recombinant Factor VIIa (rFVIIa) (80 mug/kg) was administered at the start of the operation, to be repeated according to predefined criteria, Packed red blood cells (RBC), fresh-frozen plasma, and platelet concentrates were administered according to pre-defined criteria.
Abstract: Background. Large transfusion requirements, i.e., excessive blood loss, during orthotopic liver transplantation (OLT) are correlated with increased morbidity and mortality. Recombinant factor VIIa (rFVIIa) has been shown to improve hemostasis in a variety of conditions, but has never been studied in liver transplantation. Methods. We performed a single-center, open-label, pilot study in adult patients undergoing OLT for cirrhosis Child-Pugh B or C, to assess efficacy and safety of rFVIIa. rFVIIa (80 μg/kg) was administered at the start of the operation, to be repeated according to predefined criteria. Packed red blood cells (RBC), fresh-frozen plasma, and platelet concentrates were administered according to predefined criteria. Perioperative transfusion requirements in study patients were compared with matched controls. Results. Six patients were enrolled in the study. All received a single dose of rFVIIa. Transfusion requirements (given as median, with range in parentheses) were lower in the study group than in matched controls: 1.5 (0-5) vs. 7 (2-18) units of allogeneic RBC (P=0.006), 0 (0-2) vs. 3.5 (0-23) units of autologous RBC (P=0.043), total amount of RBC 3 (0-5) vs. 9 (4-40) units (P=0.002). Transfused fresh-frozen plasma was 1 (0-7) vs. 8 (2-35) units (P=0.011). Blood loss was 3.5 L (1.4-5.3) vs. 9.8 L (3.7-35.0) (P=0.004). One study patient developed a hepatic artery thrombosis at day 1 postoperatively. Conclusions. A single dose of 80 μg/kg rFVIIa significantly reduced transfusion requirements during OLT. Further study is needed to establish the optimally effective and safe dose of rFVIIa in orthotopic liver transplantation.
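With six study patients compared against matched controls on medians and ranges, a nonparametric two-sample test is a plausible choice for the P values above, though the abstract does not name the test used. A sketch with hypothetical unit counts, assuming scipy:

```python
# Nonparametric comparison of transfusion requirements between two small groups.
from scipy.stats import mannwhitneyu

study_rbc   = [0, 1, 1, 2, 3, 5]        # hypothetical RBC units, rFVIIa group
control_rbc = [2, 4, 6, 7, 9, 12, 18]   # hypothetical RBC units, matched controls
print(mannwhitneyu(study_rbc, control_rbc).pvalue)
```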

Journal ArticleDOI
TL;DR: Despite immunosuppression including chronic steroids, the incidence of wound infections, incisional hernias, and fascial dehiscence is low in kidney recipients, and the main risk factors are obesity, reoperation, and increased age.
Abstract: Background The most common surgical complication after a kidney transplant is likely related to the wound. The purpose of this analysis was to determine the incidence of, and risk factors for, wound complications (e.g., infections, hernias) in kidney recipients and to assess whether newer immunosuppressive drugs increase the risk for such complications. Methods Between January 1, 1984 and September 30, 1998, we performed 2013 adult kidney transplants. Of these 2013 recipients, 97 (4.8%) developed either a superficial or a deep wound infection. Additionally, 73 (3.6%) recipients developed either a fascial dehiscence or a hernia of the wound. We used univariate and multivariate techniques to determine significant risk factors and outcomes. Results Mean time to development of a superficial infection (defined as located above the fascia) was 11.9 days posttransplant; to development of a deep infection (defined as located below the fascia), 39.2 days; and to development of a hernia or fascial dehiscence, 12.8 months. By multivariate analysis, the most significant risk factor for a superficial or deep wound infection was obesity (defined as body mass index >30 kg/m2) (RR=4.4, P=0.0001). Other significant risk factors were a urine leak posttransplant, any reoperation through the transplant incision, diabetes, and the use of mycophenolate mofetil (MMF) (vs. azathioprine) for maintenance immunosuppression (RR=2.43, P=0.0001). Significant risk factors for a hernia or fascial dehiscence were any reoperation through the transplant incision, increased recipient age, obesity, and the use of MMF (vs. azathioprine) for maintenance immunosuppression (RR=3.54, P=0.0004). Use of antibody induction and treatment for acute rejection were not significant risk factors for either infections or hernias. Death-censored graft survival was lower in recipients who developed a wound infection (vs. those who did not); it was not lower in recipients who developed an incisional hernia or fascial dehiscence (vs. those who did not). Conclusions Despite immunosuppression including chronic steroids, the incidence of wound infections, incisional hernias, and fascial dehiscence is low in kidney recipients. As with other types of surgery, the main risk factors for postoperative complications are obesity, reoperation, and increased age. However, in kidney recipients, use of MMF (vs. azathioprine) is an additional risk factor, one that potentially could be altered, especially in high-risk recipients.

Journal ArticleDOI
TL;DR: The right kidney can be procured with LLDN; however, a rational approach to preoperative angiographic imaging, donor operation, and recipient operation is crucial.
Abstract: Background. The left kidney is preferred for live donation. In open live donor nephrectomy, the right kidney is selected if the left kidney has multiple renal arteries or anomalous venous drainage. With laparoscopic live donor nephrectomy (LLDN), there is reluctance to procure the right kidney because of the more difficult exposure and further shortening of the right renal vein (RRV) after a stapled transection. An experience with LLDN is reviewed to determine whether the right kidney should be procured laparoscopically. Methods. From February 1995 to November 1999, 227 patients underwent live donor renal transplants with allografts procured by LLDN. The results of these transplants were analyzed. Results. Of the 227 kidneys transplanted, 17 (7.5%) were right kidneys. In the early experience, three (37.5%) of the eight right renal allografts developed venous thrombosis, two of which had duplicated RRV. Based on these initially unacceptable results, donor evaluation and LLDN techniques were modified. Spiral computerized tomography (CT) replaced conventional angiography to define better the venous anatomy. LLDN was modified in one of three ways: (1) changing the stapler port placement such that the RRV was transected in a plane parallel to the inferior vena cava, (2) relocation of the incision for open division of RRV, or (3) lengthening of the donor RRV with a panel graft constructed of recipient greater saphenous vein. Finally, the recipient operation enjoined complete mobilization of the left iliac vein with transposition lateral to the iliac artery. With these modifications, there were no vascular complications with the subsequent nine right renal allografts (P<0.05). Of the left kidneys transplanted, 31 had multiple renal arteries, 14 had retroaortic or circumaortic veins, 4 had both multiple arteries and venous anomalies, and 1 had a duplicated IVC draining the left renal vein. There were no vascular complications with left renal allografts that had multiple arteries or venous anomalies. Conclusions. LLDN of the left kidney is technically easier. Left kidneys with multiple arteries or anomalous venous drainage are not problematic. The right kidney can be procured with LLDN; however, a rational approach to preoperative angiographic imaging, donor operation, and recipient operation is crucial.

Journal ArticleDOI
TL;DR: There is a trend towards increasing incidence and earlier occurrence of PTLD in the pediatric renal transplant population, and white race and cadaver donor sources are risk factors not reported before.
Abstract: Background. Posttransplant lymphoproliferative disorder (PTLD) is an important complication of transplantation. The North American Pediatric Renal Transplant Cooperative Study (NAPRTCS) database has documented 56 cases of PTLD, the largest such series to date. Methods. We analyzed the available longitudinal and multicenter data in the NAPRTCS database to evaluate the demographic and therapeutic risk factors and the temporal trends for PTLD in children after renal transplantation. Results. The overall incidence of PTLD was 1.2% of all patients or 298/100,000 posttransplantation years of follow-up. However, this incidence increased from 254/100,000 years between 1987 and 1991 to 395/100,000 years from 1992 onwards. In the same periods, the time to PTLD decreased from a median of 356 days (range 64-3048) to a median of 190 days (range 42-944). PTLD occurred with greater frequency in white children (P=0.003) and in cadaver donor transplants (P=0.019), but there was no significant predilection for gender, younger children (0-5 years), or primary diagnosis. No significant difference was found in the use of anti-T-cell antibodies or in doses of CsA, azathioprine, or prednisone at 1 month, 6 months, and 1 year. Between 1996 and 1997, 69 patients were initiated with tacrolimus. Eight cases of PTLD were identified in these recipients to date (prevalence rate 11.5%), compared with 46/4084 (1.1%) where cyclosporine was used (P<0.0001). Conclusions. There is a trend towards increasing incidence and earlier occurrence of PTLD in the pediatric renal transplant population. White race and cadaver donor sources are risk factors not reported before. Continued monitoring of tacrolimus immunosuppression is important.
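The incidence figure of 298/100,000 posttransplantation years is simply cases divided by total follow-up time, rescaled. A minimal sketch of the arithmetic (the total follow-up is back-calculated here for illustration):

```python
# Incidence rate per 100,000 patient-years.
def incidence_per_100k_years(cases, total_followup_years):
    return cases / total_followup_years * 100_000

followup_years = 56 / 298 * 100_000   # ~18,800 patient-years, back-calculated
print(incidence_per_100k_years(56, followup_years))  # recovers ~298 per 100,000 years
```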

Journal ArticleDOI
TL;DR: This is the first report to describe generation of transgenic pigs that express human CD46, and the first in vivo demonstration of the ability of humanCD46 expressed on pig organs to regulate complement activation and overcome hyperacute rejection upon transplantation of a vascularized organ into nonhuman primates.
Abstract: Background The chronic shortage in the supply of human organs available for allotransplantation has turned attention toward the use of animals as potential donors, with pigs as the most likely species under consideration. Hyperacute rejection, the initial and immediate barrier to a pig-to-primate xenograft, has been addressed by generation of transgenic pigs that express the human membrane-bound complement-regulatory proteins CD59 and/or CD55. Difficulty has been encountered in generation of transgenic animals that express a third membrane-bound complement-regulatory protein, CD46. Methods We generated transgenic animals using a large genomic construct that encompasses the entire human CD46 gene. Results We report the first description of transgenic mice and pigs that express high levels of human CD46 in a cell- and tissue-type-specific manner, resembling patterns of endogenous CD46 expression observed in human tissues. Furthermore, when human CD46 transgenic porcine hearts were transplanted into baboons, the grafts did not succumb to hyperacute rejection, and survival extended for up to 23 days. Under the same conditions, nontransgenic grafts underwent hyperacute rejection within 90 min. Conclusions This is the first report to describe generation of transgenic pigs that express human CD46, and the first in vivo demonstration of the ability of human CD46 expressed on pig organs to regulate complement activation and overcome hyperacute rejection upon transplantation of a vascularized organ into nonhuman primates.

Journal ArticleDOI
TL;DR: In plasma of nonimmunosuppressed individuals, even a qualitative detection of EBV-related sequences was sensitive and specific for the diagnosis of primary EBV infection, whereas for analysis of PBMC DNA a quantitative parameter had to be considered to differentiate healthy individuals from patients with primaryEBV infection.
Abstract: Background. Early diagnosis of Epstein-Barr virus (EBV)-associated posttransplant lymphoproliferative disorder (PTLD) is required to detect a stage of disease that is more likely to respond to treatment. Elevated levels of EBV DNA were found in peripheral blood of patients at the onset of PTLD. Methods. To compare plasma and peripheral blood mononuclear cells (PBMCs) as material for real-time quantitative polymerase chain reaction (RQ-PCR) measurement of Epstein-Barr viral load, we used two sets of primers and probes specific for the BamHI-K or BamHI-W region of the EBV genome. Results. Patients with PTLD had a median viral load of 19,200 EBV genomes/μg DNA (n=9) or 3,225 EBV genomes/100 μl plasma (n=5), significantly higher than in immunosuppressed patients with primary (n=9) or reactivated (n=20) EBV infection or immunosuppressed patients without serological signs of active EBV infection (n=67) (P<0.05). In nonimmunosuppressed individuals, even qualitative detection of EBV-related sequences in plasma was sensitive and specific for the diagnosis of primary EBV infection, whereas for analysis of PBMC DNA a quantitative cutoff had to be considered to differentiate healthy individuals from patients with primary EBV infection (>100 EBV genomes/μg PBMC DNA). Conclusion. Although both PBMCs and plasma were useful as material for EBV-specific RQ-PCR in immunosuppressed patients and nonimmunosuppressed individuals, the specificity of analysis seemed to be higher if plasma was taken for analysis.
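
For readers unfamiliar with how a viral load such as "19,200 EBV genomes/μg DNA" is derived from RQ-PCR output, the sketch below shows the usual standard-curve arithmetic and applies the >100 genomes/μg PBMC cutoff reported above. The slope, intercept, and sample values are hypothetical placeholders, not the authors' calibration; only the cutoff comes from the abstract.

```python
# Illustrative sketch of standard-curve quantification in real-time PCR.
# SLOPE and INTERCEPT are assumed values, NOT from the paper; only the
# >100 genomes/ug PBMC cutoff is taken from the abstract above.

SLOPE = -3.32      # assumed: cycles per 10-fold dilution (~100% efficiency)
INTERCEPT = 40.0   # assumed: Ct extrapolated for one genome copy per reaction

def copies_from_ct(ct: float) -> float:
    """Invert the standard curve Ct = SLOPE * log10(copies) + INTERCEPT."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

def genomes_per_ug(ct: float, input_dna_ug: float) -> float:
    """Normalize the estimated copy number to the mass of input PBMC DNA."""
    return copies_from_ct(ct) / input_dna_ug

# Hypothetical PBMC sample: Ct of 31.5 measured on 0.5 ug of input DNA.
load = genomes_per_ug(31.5, 0.5)
verdict = "above cutoff (suggests primary EBV infection)" if load > 100 else "within healthy-carrier range"
print(f"{load:.0f} EBV genomes/ug PBMC DNA -> {verdict}")
```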

Journal ArticleDOI
TL;DR: Complete steroid-free immunosuppression is efficacious and safe in this selected group of children, with no early clinical acute rejection episodes, and may play a future critical role in avoiding noncompliance while optimizing renal function and growth.
Abstract: Background Corticosteroids have been a cornerstone of immunosuppression for four decades despite their adverse side effects. Past attempts at steroid withdrawal in pediatric renal transplantation have had little success. This study tests the hypothesis that a complete steroid-free immunosuppressive protocol avoids steroid dependency for suppression of the immune response, with its accompanying risk of acute rejection on steroid withdrawal. Methods An open-label prospective study of a complete steroid avoidance immunosuppressive protocol was undertaken in 10 unsensitized pediatric recipients (ages 5-21 years; mean 14.4 years) of first renal allografts. Steroids were replaced with extended daclizumab use, in combination with tacrolimus and mycophenolate mofetil. Protocol biopsies were performed in the steroid-free group at 0, 1, 3, 6, and 12 months posttransplantation. Clinical outcomes were compared with those of a steroid-based group of 37 matched historical controls. Results Graft and patient survival were 100% in both groups. Clinical acute rejection was absent in the steroid-free group at a mean follow-up time of 9 months (range 3-13.7 months). Protocol biopsies in the steroid-free group (10 patients at 3 months, 7 at 6 months, and 4 at 12 months) revealed only two instances of mild (Banff 1A) subclinical rejection (reversed by only a nominal increase in immunosuppression) and no chronic rejection. At 6 months the steroid-free group had no hypertension requiring treatment (P=0.003), no hypercholesterolemia (P=0.007), and essentially no body disfigurement (P=0.0001). Serum creatinine levels, Schwartz GFR, and mean delta height Z scores trended better in the steroid-free group. In the steroid-free group, one patient had cytomegalovirus disease at 1 month and three had easily treated herpes simplex stomatitis, but with no significant increase in bacterial infections or rehospitalizations over the steroid-based group. The steroid-free group was more anemic early posttransplantation (P=0.004), suggesting an early role of steroids in erythropoiesis; erythropoietin use normalized hematocrits by 6 months. Conclusions Complete steroid-free immunosuppression is efficacious and safe in this selected group of children, with no early clinical acute rejection episodes. This protocol avoids the morbid side effects of steroids without increasing infection and may play a future critical role in avoiding noncompliance while optimizing renal function and growth.
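
The "Schwartz GFR" reported above is a bedside estimate computed from height and serum creatinine. A minimal sketch follows, assuming the original Schwartz coefficients in common use at the time (k of roughly 0.45 for term infants, 0.55 for children and adolescent girls, 0.70 for adolescent boys); the abstract does not state which constants the authors applied.

```python
# Minimal sketch of the original Schwartz estimate of GFR used in pediatric
# nephrology: eGFR (mL/min/1.73 m^2) = k * height (cm) / serum creatinine (mg/dL).
# The k values below are the commonly cited original coefficients; they are an
# assumption here, since the abstract does not specify the constants used.

K_INFANT = 0.45          # term infants under 1 year
K_CHILD_OR_GIRL = 0.55   # children 1-12 years and adolescent girls
K_ADOLESCENT_BOY = 0.70  # adolescent boys

def schwartz_gfr(height_cm: float, scr_mg_dl: float, k: float) -> float:
    """Estimated GFR in mL/min per 1.73 m^2 of body surface area."""
    return k * height_cm / scr_mg_dl

# Example: a 150-cm adolescent girl with a serum creatinine of 0.9 mg/dL.
print(f"eGFR = {schwartz_gfr(150, 0.9, K_CHILD_OR_GIRL):.0f} mL/min/1.73 m^2")  # ~92
```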