
Showing papers in "Transplantation in 2004"


Journal ArticleDOI
TL;DR: A novel cell-sheet manipulation technology using temperature-responsive culture surfaces generates functional, cultivated corneal epithelial cell-sheet grafts that retain stem cells from limbal stem cells expanded ex vivo, indicating highly promising clinical capabilities for this bioengineered corneal epithelial sheet.
Abstract: Background Limbal stem-cell deficiency by ocular trauma or diseases causes corneal opacification and visual loss. Recent attempts have been made to fabricate corneal epithelial graft constructs, but the technology is still evolving. We have developed a novel cell-sheet manipulation technology using temperature-responsive culture surfaces to generate functional, cultivated corneal epithelial cell sheet grafts. Methods Human or rabbit limbal stem cells were cocultured with mitomycin C-treated 3T3 feeder layers on temperature-responsive culture dishes at 37 degrees C. Cell sheets were harvested from the dishes after 2 weeks by reducing temperature to 20 degrees C. Histologic analyses, immunoblotting, and colony-forming assay were performed to characterize the cell sheets. Autologous transplantation was undertaken to reconstruct the corneal surfaces of rabbits with experimentally induced limbal stem cell deficiencies. Results Multilayered corneal epithelial sheets were harvested intact simply by reducing the temperature, without the use of proteases. Cell-cell junctions and extracellular matrix on the basal side of the sheet, critical to sheet integrity and function, remained intact. A viable population of corneal progenitor cells, close in number to that originally seeded, was found in the sheets. Harvested sheets were easily manipulated, transplantable without any carriers, and readily adhesive to corneal stroma so that suturing was not required. Corneal surface reconstruction in rabbits was highly successful. Conclusions Cell sheet engineering technology allows us to create intact, transplantable corneal epithelial cell sheets that retain stem cells from limbal stem cells expanded ex vivo. Our research indicates highly promising clinical capabilities for our bioengineered corneal epithelial sheet.

564 citations


Journal ArticleDOI
TL;DR: Nonadherence to immunosuppressants is shown to be common and to have a large impact on transplant survival; therefore, significant improvements in graft survival could be expected from effective interventions to improve adherence.
Abstract: Nonadherence to immunosuppressants is recognized to occur after renal transplantation, but the size of its impact on transplant survival is not known. A systematic literature search identified 325 studies (in 324 articles) published from 1980 to 2001 reporting the frequency and impact of nonadherence in adult renal transplant recipients. Thirty-six studies meeting the inclusion criteria for further review were grouped into cross-sectional and cohort studies and case series. Meta-analysis was used to estimate the size of the impact of nonadherence on graft failure. Only two studies measured adherence using electronic monitoring, which is currently thought to be the most accurate measure. Cross-sectional studies (n=15) tended to rely on self-report questionnaires, but these were poorly described; a median (interquartile range) of 22% (18%-26%) of recipients were nonadherent. Cohort studies (n=10) indicated that nonadherence contributes substantially to graft loss; a median (interquartile range) of 36% (14%-65%) of graft losses were associated with prior nonadherence. Meta-analysis of these studies showed that the odds of graft failure increased sevenfold (95% confidence interval, 4-12) in nonadherent subjects compared with adherent subjects. Standardized methods of assessing adherence in clinical populations need to be developed, and future studies should attempt to identify the level of adherence that increases the risk of graft failure. However, this review shows nonadherence to be common and to have a large impact on transplant survival. Therefore, significant improvements in graft survival could be expected from effective interventions to improve adherence.
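The pooled estimate above combines study-level odds ratios. As a rough, generic illustration of that kind of calculation (not the authors' actual method or data), the sketch below pools invented per-study 2x2 counts using fixed-effect inverse-variance weighting of log odds ratios.

```python
import math

# Hypothetical 2x2 counts per cohort study (invented for illustration only):
# (failures among nonadherent, nonadherent without failure,
#  failures among adherent, adherent without failure)
studies = [
    (12, 18, 10, 110),
    (8, 22, 15, 155),
    (20, 30, 25, 225),
]

weights, weighted_log_ors = [], []
for a, b, c, d in studies:
    log_or = math.log((a * d) / (b * c))   # log odds ratio for one study
    var = 1 / a + 1 / b + 1 / c + 1 / d    # Woolf variance of the log odds ratio
    w = 1 / var                            # inverse-variance weight
    weights.append(w)
    weighted_log_ors.append(w * log_or)

pooled_log_or = sum(weighted_log_ors) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = math.exp(pooled_log_or - 1.96 * se), math.exp(pooled_log_or + 1.96 * se)
print(f"pooled OR = {math.exp(pooled_log_or):.1f} (95% CI {lo:.1f}-{hi:.1f})")
```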

490 citations


Journal ArticleDOI
TL;DR: CsA is unsuitable as a universal, long-term immunosuppressive agent for kidney transplantation, and strategies to ameliorate or avoid nephrotoxicity are thus urgently needed.
Abstract: BACKGROUND The role and burden of cyclosporine (CsA) nephrotoxicity in long-term progressive kidney graft dysfunction is poorly documented. METHODS The authors evaluated 888 prospective protocol kidney biopsy specimens from 99 patients taken regularly until 10 years after transplantation for evidence of CsA nephrotoxicity. RESULTS The most sensitive histologic marker of CsA nephrotoxicity was arteriolar hyalinosis, predicted by CsA dose and functional CsA nephrotoxicity. Striped fibrosis was associated with early initiation of CsA and the need for posttransplant dialysis (both P < 0.05). The 10-year cumulative Kaplan-Meier prevalence of arteriolar hyalinosis, striped fibrosis, and tubular microcalcification was 100%, 88.0%, and 79.2% of kidneys, respectively. Beyond 1 year, 53.9% had two or more lesions of CsA nephrotoxicity. Structural CsA nephrotoxicity occurred in two phases, with different clinical and histologic characteristics. The acute phase occurred with a median onset 6 months after transplantation, was usually reversible, and was associated with functional CsA nephrotoxicity (P < 0.05), high CsA levels (P < 0.05), and mild arteriolar hyalinosis (P < 0.001). The chronic phase of CsA nephrotoxicity persisted over several biopsies, occurred at a median onset of 3 years, and was associated with lower CsA doses and trough levels (both P < 0.05). It was largely irreversible and accompanied by severe arteriolar hyalinosis and progressive glomerulosclerosis (both P < 0.001). A threshold CsA dose of 5 mg/kg/day predicted worsening of arteriolar hyalinosis on sequential histology. CONCLUSIONS Pathologic changes of CsA nephrotoxicity were virtually universal by 10 years and exacerbated chronic allograft nephropathy. CsA is unsuitable as a universal, long-term immunosuppressive agent for kidney transplantation. Strategies to ameliorate or avoid nephrotoxicity are thus urgently needed.
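The cumulative prevalences quoted above are Kaplan-Meier estimates; the cumulative prevalence of a lesion at time t is 1 minus the probability of remaining lesion-free to t. The sketch below is a generic Kaplan-Meier estimator with invented follow-up times, offered only to illustrate the calculation, not to reproduce the study's data.

```python
# Minimal Kaplan-Meier sketch: S(t) = probability of remaining lesion-free to time t;
# the cumulative prevalence reported in the abstract corresponds to 1 - S(t).
def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 if the lesion was observed, 0 if censored."""
    data = sorted(zip(times, events))
    event_times = sorted({t for t, e in data if e == 1})
    survival, curve = 1.0, []
    for t in event_times:
        n_at_risk = sum(1 for tt, _ in data if tt >= t)            # still under observation
        n_events = sum(1 for tt, e in data if tt == t and e == 1)  # lesions first seen at t
        survival *= 1 - n_events / n_at_risk
        curve.append((t, survival))
    return curve

# Invented follow-up data for ten biopsied grafts.
times  = [0.5, 1, 1, 2, 3, 4, 5, 6, 7, 10]
events = [1,   1, 0, 1, 1, 0, 1, 1, 1, 1]
for t, s in kaplan_meier(times, events):
    print(f"{t:>4} yr: lesion-free {s:.2f}, cumulative prevalence {1 - s:.2f}")
```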

469 citations


Journal ArticleDOI
TL;DR: Human amnion and chorion cells from term placenta can successfully engraft neonatal swine and rats and are suggested to represent an advantageous source of progenitor cells with potential applications in a variety of cell therapy and transplantation procedures.
Abstract: BACKGROUND: Fetal membranes are tissues of particular interest for several reasons, including their role in preventing rejection of the fetus and their early embryologic origin, which may entail progenitor potential. The immunologic reactivity and the transplantation potential of amnion and chorion cells, however, remain to be elucidated. METHODS: Amnion and chorion cells were isolated from human term placenta and characterized by immunohistochemistry, flow cytometric analysis, and expression profile of relevant genes. The immunomodulatory characteristics of these cells were studied in allogeneic and xenogeneic mixed lymphocyte reactions and their engraftment potential analyzed by transplantation into neonatal swine and rats. Posttransplant chimerism was determined by polymerase chain reaction analysis with probes specific for human DNA. RESULTS: Phenotypic and gene expression studies indicated mesenchymal stem cell-like profiles in both amnion and chorion cells that were positive for neuronal, pulmonary, adhesion, and migration markers. In addition, cells isolated from both amnion and chorion induced neither allogeneic nor xenogeneic lymphocyte proliferation responses and were able to actively suppress lymphocyte responsiveness. Transplantation in neonatal swine and rats resulted in human microchimerism in various organs and tissues. CONCLUSIONS: Human amnion and chorion cells from term placenta can successfully engraft neonatal swine and rats. These results may be explained by the peculiar immunologic characteristics and mesenchymal stem cell-like phenotype of these cells. These findings suggest that amnion and chorion cells may represent an advantageous source of progenitor cells with potential applications in a variety of cell therapy and transplantation procedures.

372 citations


Journal ArticleDOI
TL;DR: The current literature regarding the pro- and anti-neoplastic effects of immunosuppressive agents on cancer growth and development is presented.
Abstract: Development of cancer is a feared, and increasingly apparent, complication of long-term immunosuppressive therapy in transplant recipients. In addition to the need to reduce cancer occurrence in these patients, therapeutic protocols are lacking to simultaneously attack the malignancy and protect the allograft when neoplasms do occur. In this overview, we present the current literature regarding the pro- and anti-neoplastic effects of immunosuppressive agents on cancer growth and development. Recent experimental findings are paving the way for new therapeutic strategies aimed at both protecting an allograft from immunologic rejection and addressing the problem of cancer in this high-risk population.

336 citations


Journal ArticleDOI
TL;DR: The use of sirolimus-based immunosuppressive regimens leads to a higher incidence of wound-healing complications and will require new approaches to patient selection and management to decrease their incidence.
Abstract: BACKGROUND Sirolimus has been associated with an increased risk of wound-healing complications in several retrospective analyses. The authors compared the rates of wound-healing complications in renal allograft recipients in a prospective, randomized trial of sirolimus-mycophenolate mofetil-prednisone versus tacrolimus-mycophenolate mofetil-prednisone. METHODS All patients received antithymocyte globulin induction. In the first phase of the study, patients (n = 77) were included regardless of body mass index (BMI). In the second phase (n = 46 patients), the authors excluded patients with a BMI greater than 32 kg/m2, and the target trough sirolimus level was lowered to 10 to 15 ng/mL (previously 15-20 ng/mL). Multivariate logistic regression analyses were performed to identify predictors of wound complications. RESULTS Fifty-nine patients received tacrolimus and 64 received sirolimus and were included in subsequent analyses. The incidence of complications was 8% (5 of 59) in the tacrolimus group and 47% (30 of 64) in the sirolimus group (P < 0.0001). Rates of perigraft fluid collections, superficial wound infections, and incisional herniae were significantly higher in the sirolimus group. Multivariate logistic regression showed only sirolimus (P = 0.0001) and BMI (P = 0.0021) to independently correlate with complications. In the first phase of the study, the wound complication rate in the sirolimus group was 55% (21 of 38 patients). After excluding obese recipients and decreasing the target sirolimus level, the wound complication rate in the sirolimus group was 35% (9 of 26 patients; P = 0.1040). CONCLUSIONS The use of sirolimus-based immunosuppressive regimens leads to a higher incidence of wound-healing complications and will require new approaches to patient selection and management to decrease their incidence.

299 citations


Journal ArticleDOI
TL;DR: The results suggest that Flk1+ mMSCs might initiate endogenous hepatic tissue regeneration, engraft into host liver in response to CCl4 injury, and ameliorate its fibrogenic effects.
Abstract: Background. Fibrosis is the common end stage of most liver diseases, for which, unfortunately, no effective treatment is currently available. It has been shown that mesenchymal stem cells (MSCs) from bone marrow (BM) can engraft in the lung after bleomycin exposure and ameliorate its fibrotic effects. This study was designed to evaluate the effect of Flk1+ MSCs from murine BM (termed here Flk1+ mMSCs) on fibrosis formation induced by carbon tetrachloride (CCl4). Methods. A CCl4-induced hepatic fibrosis model was used. Flk1+ mMSCs were systemically infused immediately or 1 week after mice were challenged with CCl4. Control mice received only saline infusion. Fibrosis index and donor-cell engraftment were assessed 2 or 5 weeks after CCl4 challenge. Results. We found that Flk1+ mMSC transplantation immediately, but not 1 week, after exposure to CCl4 significantly reduced CCl4-induced liver damage and collagen deposition. In addition, levels of hepatic hydroxyproline and serum fibrosis markers in mice receiving immediate Flk1+ mMSC transplantation after CCl4 challenge were significantly lower than those of control mice. More importantly, histologic examination suggested that hepatic damage recovery was much better in the mice treated immediately with Flk1+ mMSCs. Immunofluorescence, polymerase chain reaction, and fluorescence in situ hybridization analyses revealed that donor cells engrafted into the host liver, had epithelium-like morphology, and expressed albumin, although at low frequency. Conclusion. These results suggest that Flk1+ mMSCs might initiate endogenous hepatic tissue regeneration, engraft into host liver in response to CCl4 injury, and ameliorate its fibrogenic effects.

285 citations


Journal ArticleDOI
TL;DR: The temporal relationship between sirolimus exposure and the onset of pulmonary symptoms, in the absence of infectious causes and other alternative pulmonary disease, and the associated clinical and radiologic improvement after its cessation suggest a causal relationship.
Abstract: Background Pulmonary toxicity has recently been recognized as a potentially serious complication associated with sirolimus therapy. We further detail this condition on the basis of our own cases and those reported in the literature. Methods We report three cases of suspected sirolimus-induced pulmonary toxicity that occurred in three renal transplant recipients and searched PubMed for all previously reported cases. Results Including our current cases, 43 patients with sirolimus-induced pulmonary toxicity have now been reported. Clinical data were incomplete in 28 cases. Analysis of available data for 15 patients revealed that the most common presenting symptoms were dyspnea on exertion and dry cough, followed by fatigue and fever. Chest radiographs and high-resolution computed tomography scans commonly revealed bilateral patchy or diffuse alveolo-interstitial infiltrates. Bronchoalveolar lavage fluid analysis and lung biopsy in selected case reports revealed several distinct histologic features, including lymphocytic alveolitis, lymphocytic interstitial pneumonitis, bronchiolitis obliterans organizing pneumonia, focal fibrosis, pulmonary alveolar hemorrhage, or a combination thereof. The diagnosis of sirolimus-associated pulmonary toxicity was made after an exhaustive work-up to exclude infectious causes and other pulmonary disease. Sirolimus discontinuation or dose reduction resulted in clinical and radiologic improvement in all 15 patients within 3 weeks. Conclusion The temporal relationship between sirolimus exposure and the onset of pulmonary symptoms, in the absence of infectious causes and other alternative pulmonary disease, and the associated clinical and radiologic improvement after its cessation suggest a causal relationship. Because the use of sirolimus in organ transplantation has become more widespread, clinicians must remain vigilant for this potential pulmonary complication.

272 citations


Journal ArticleDOI
TL;DR: The analysis of causes leading to graft failure in patients with HCV showed that HCV recurrence is responsible for one in three deaths in HCV-positive patients, and new antiviral treatments, as well as adapted immunosuppressive protocols, will be necessary to further improve the outcome of HCV-positive patients after liver transplantation.
Abstract: Background. Recurrence of hepatitis C (HCV) infection after orthotopic liver transplantation (OLT) in HCV-positive patients is almost universal. Severity of graft hepatitis increases during the long-term follow-up, and up to 30% of patients develop severe graft hepatitis and cirrhosis. However, there are still no clear predictors for severe recurrence. The aim of this study was to examine the 10-year outcome and risk factors for graft failure caused by HCV recurrence. Methods. In a prospective analysis, 234 OLTs in 209 HCV-positive patients with a median age of 53 years were analyzed. Immunosuppression was based on cyclosporine A or tacrolimus in different protocols. Predictors for outcome were genotype, viremia, donor variables, recipient demographics, postoperative immunosuppression, and human leukocyte antigen (HLA) compatibilities. Results. Actuarial 5- and 10-year patient survival was 75.8% and 68.8%, respectively. Eighteen of 209 (8.7%) patients died because of HCV recurrence, which was responsible for 35.9% of the total 53 deaths. Significant risk factors for HCV-related graft failure in a univariate analysis were multiple steroid pulses, use of OKT3, and donor age greater than 40 years. However, in a multivariate analysis, multiple rejection treatments with steroids and OKT3 treatment proved to be significantly associated with HCV-related graft loss. Conclusions. The analysis of causes leading to graft failure in patients with HCV showed that HCV recurrence is responsible for one in three deaths in HCV-positive patients. Rejection treatment contributed significantly to an enhanced risk for HCV-related graft loss. New antiviral treatments, as well as adapted immunosuppressive protocols, will be necessary to further improve the outcome of HCV-positive patients after liver transplantation.

269 citations


Journal ArticleDOI
TL;DR: RTX can be safely administered and may be an effective agent to reduce high-titer anti-HLA Abs in patients with end-stage renal failure awaiting kidney transplantation.
Abstract: Background Preformed HLA antibodies (Ab), reported as panel-reactive antibody (PRA), prolong patient waiting time for kidney transplantation. We hypothesized that rituximab (RTX) could reduce PRA via B-cell depletion. This initial study reports the safety, pharmacokinetics, and pharmacodynamics of RTX in patients with end-stage renal failure. Methods The study was an investigator-initiated single-dose, dose-escalation phase I trial of RTX in chronic dialysis patients (PRA >50%). It was approved by the Institutional Review Board and the Food and Drug Administration. Nine subjects were treated with a single dose of RTX (n=3 per group) at 50, 150, or 375 mg/m2. Peripheral lymphocyte cell surface markers and HLA Ab levels (%PRA and titers) were tested using flow cytometry. Results There were four significant adverse events: a suspected histoplasmosis infection; two Tenckhoff dialysis catheter infections; and fever (38.7 degrees C) during infusion. At 2 days after RTX therapy, there was depletion of CD19 cells (pre-RTX 181+/-137 vs. post-RTX 12+/-5.6, P=0.006). In 2 (22%) of 9 subjects, there was no appreciable change in PRA. Among the other seven patients, one had a decrease in PRA from 87% to 51% with a concurrent decrease in fluorescence intensity; five patients had changes in histogram architecture suggesting loss of antibody specificity; and one patient had a fourfold decrease in PRA titer from 1:64 to 1:16 at 6 months after treatment. In addition, one of the seven patients converted a donor-specific crossmatch to negative and underwent a successful living donor kidney transplantation. Conclusions RTX can be safely administered and may be an effective agent to reduce high-titer anti-HLA Abs in subjects awaiting kidney transplantation.

264 citations


Journal ArticleDOI
TL;DR: The FAPWTR has become a valuable tool that will help to accurately evaluate the potential risks and benefits of OLT in patients with FAP and promote a fruitful collaboration between centers engaged in this field.
Abstract: Background Transthyretin (TTR) amyloidosis is a group of systemic amyloidoses caused by an amyloidogenic TTR variant. Untreated, it slowly leads to severely disabling symptoms that relentlessly progress until the death of the patient. Because the mutant form of TTR is produced mainly in the liver, successful orthotopic liver transplantation (OLT) results in the elimination of the source of the variant TTR molecule and is presently the only known curative treatment. OLT in patients with familial amyloidotic polyneuropathy (FAP) was first performed in 1990 at the Karolinska Institute in Sweden, and because the results were promising, other centers took up the procedure. Methods To gain as great an experience as possible regarding this treatment, the Familial Amyloidotic Polyneuropathy World Transplant Registry (FAPWTR) was initiated in 1995, and this article presents the 10-year registry results. Results A total of 54 centers in 16 countries have performed OLT for FAP, and today approximately 60 OLTs are performed annually worldwide. During the last decade, a total of 539 patients have undergone 579 OLTs. Patient survival is excellent (overall 5-year patient survival 77%) and comparable to the survival with OLT performed for other chronic liver disorders, but longer follow-up is needed to compare the outcome after OLT with the natural course of the disease. The main cause of death was cardiac related (39%). Conclusions We believe that the FAPWTR has become a valuable tool that will help to accurately evaluate the potential risks and benefits of OLT in patients with FAP and promote a fruitful collaboration between centers engaged in this field.

Journal ArticleDOI
TL;DR: It is postulated that conversion from cyclosporine to sirolimus in patients with KS could favor regression of KS lesions without increasing the risk of graft rejection.
Abstract: The increased incidence of Kaposi's sarcoma (KS) in organ transplantation has been related to the KS herpesvirus and the permissive effect of immunosuppressive therapy. Calcineurin inhibitors are the cornerstone of immunosuppression in organ transplantation, although they could promote tumor progression. In contrast, sirolimus, a new immunosuppressive agent, exhibits potent antitumor activity. We postulated that conversion from cyclosporine to sirolimus in patients with KS could favor regression of KS lesions without increasing the risk of graft rejection. Two renal transplant recipients with KS underwent conversion from cyclosporine to sirolimus. Both patients showed complete regression of KS lesions and excellent clinical and functional results. Sirolimus offers a new and promising approach to the management of posttransplantation KS and probably to other types of malignancies in organ transplant recipients.

Journal ArticleDOI
TL;DR: Histologic evidence of acute rejection in the absence of clinical suspicion resulted in significant tubulointerstitial damage to transplanted kidneys and contributed to CAN.
Abstract: Background. Subclinical rejection (SCR) is defined as histologically proven acute rejection in the absence of immediate functional deterioration. Methods. We evaluated the impact of SCR in 961 prospective protocol kidney biopsies from diabetic recipients of a kidney-pancreas transplant (n=119) and one kidney transplant alone taken regularly up to 10 years after transplantation. Results. SCR was present in 60.8%, 45.7%, 25.8%, and 17.7% of biopsies at 1, 3, 12, and greater than 12 months after transplantation. Banff scores for acute interstitial inflammation and tubulitis declined exponentially with time. SCR was predicted by prior acute cellular rejection and type of immunosuppressive therapy (P<0.05-0.001). Tacrolimus reduced interstitial infiltration (P<0.001), whereas mycophenolate reduced tubulitis (P<0.05), and the combination effectively eliminated SCR (P<0.001). Persistent SCR of less than 2 years' duration on sequential biopsies occurred in 29.2% of patients and was associated with prior acute interstitial rejection (P<0.001) and requirement for antilymphocyte therapy (P<0.05). It resolved by 0.49±0.33 years and resulted in higher grades of chronic allograft nephropathy (CAN, P<0.05). True chronic rejection, defined as persistent SCR of 2 years or more duration and implying continuous immunologic activation, was found in only 5.8% of patients. The presence of SCR increased chronic interstitial fibrosis, tubular atrophy, and CAN scores on subsequent biopsies (P<0.05-0.001). SCR preceded and was correlated with CAN (P<0.001) on sequential analysis. Conclusions. Histologic evidence of acute rejection in the absence of clinical suspicion resulted in significant tubulointerstitial damage to transplanted kidneys and contributed to CAN.

Journal ArticleDOI
TL;DR: The results support the use of MELD for liver allocation and indicate that statistical modeling, such as reported in this article, can be used to identify futile cases in which expected outcome is too poor to justify transplantation.
Abstract: Background The Model for End-Stage Liver Disease (MELD) has been found to accurately predict pretransplant mortality and is a valuable system for ranking patients in greatest need of liver transplantation. It is unknown whether a higher MELD score also predicts decreased posttransplant survival. Methods. We examined a cohort of patients from the United Network for Organ Sharing (UNOS) database for whom the critical pretransplant recipient values needed to calculate the MELD score were available (international normalized ratio of prothrombin time, total bilirubin, and creatinine). In these 2,565 patients, we analyzed whether the MELD score predicted graft and patient survival and length of posttransplant hospitalization. Results. In contrast with its ability to predict survival in patients with chronic liver disease awaiting liver transplant, the MELD score was found to be poor at predicting posttransplant outcome except for patients with the highest 20% of MELD scores. We developed a model with four variables not included in MELD that had greater ability to predict 3-month posttransplant patient survival, with a c-statistic of 0.65, compared with 0.54 for the pretransplant MELD score. These pretransplant variables were recipient age, mechanical ventilation, dialysis, and retransplantation. Recipients with any two of the three latter variables showed a markedly diminished posttransplant survival rate. Conclusions. The MELD score is a relatively poor predictor of posttransplant outcome. In contrast, a model based on four pretransplant variables (recipient age, mechanical ventilation, dialysis, and retransplantation) had a better ability to predict outcome. Our results support the use of MELD for liver allocation and indicate that statistical modeling, such as reported in this article, can be used to identify futile cases in which expected outcome is too poor to justify transplantation.
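For orientation, the MELD score referenced here is computed from INR, total bilirubin, and creatinine. The sketch below uses the commonly published UNOS formula; the capping conventions shown are the usual ones and are assumed rather than taken from this paper.

```python
import math

def meld_score(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> int:
    """Classic (pre-MELD-Na) UNOS MELD formula; capping rules follow common usage
    and may differ slightly from the conventions used in this study."""
    # Lab values below 1.0 are set to 1.0; creatinine is capped at 4.0 mg/dL.
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(crea) + 6.43
    return round(score)

# Example with illustrative labs: bilirubin 3.2 mg/dL, INR 1.8, creatinine 1.4 mg/dL.
print(meld_score(3.2, 1.8, 1.4))  # about 21
```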

Journal ArticleDOI
TL;DR: Infusion of isolated human hepatocytes improved the coagulation defect and markedly decreased the requirement for exogenous recombinant activated factor VII (rFVIIa) to approximately 20% of that before cell transplantation.
Abstract: Hepatocyte transplantation has been investigated in patients with liver-based metabolic disorders and acute liver failure. We report the first use of hepatocyte transplantation in two brothers with severe inherited coagulation factor VII deficiency. Patient 1 received a total of 1.09×10⁹ cryopreserved…

Journal ArticleDOI
TL;DR: Kidney transplant recipients with the CYP3A5*1 allele required a higher daily tacrolimus dose compared with those with the CYP3A5*3/*3 genotype to maintain both the target trough level and AUC0–12, suggesting that this polymorphism is useful for determining the appropriate dose of tacrolimus.
Abstract: BACKGROUND A body-weight-based dose of tacrolimus often results in marked individual diversity of blood drug concentration. Tacrolimus is a substrate for cytochrome P450 (CYP) 3A5 and P-glycoprotein, encoded by CYP3A5 and MDR1 (ABCB1), respectively, both of which have multiple single-nucleotide polymorphisms. In this study, we genotyped CYP3A5 A6986G, MDR1 G2677(A/T), and C3435T polymorphisms and investigated the association between these polymorphisms and the pharmacokinetics of tacrolimus in renal transplant recipients. METHODS Thirty consecutive recipients were enrolled in this study. The pharmacokinetics of tacrolimus was analyzed on day 28 after transplant, when the daily dose was adjusted to the target trough level of 10-15 ng/mL. The polymerase chain reaction-restriction fragment length polymorphism and direct sequence methods were used for genotyping the CYP3A5 and MDR1 polymorphisms, respectively. RESULTS The single tacrolimus dose per body weight was significantly higher in CYP3A5*1 carriers than in CYP3A5*3/*3 carriers (0.143+/-0.050 vs. 0.078+/-0.031 mg/kg, P<0.001). The dose-adjusted trough level and the area under the concentration-time curve (AUC0-12) were significantly lower in CYP3A5*1 carriers than in CYP3A5*3/*3 carriers (0.040+/-0.014 vs. 0.057+/-0.024 ng/mL/mg/kg, P=0.015 and 0.583+/-0.162 vs. 0.899+/-0.319 ng·hr/mL/mg/kg, P=0.004), respectively. The MDR1 polymorphism was not associated with any pharmacokinetic parameters. CONCLUSIONS Kidney transplant recipients with the CYP3A5*1 allele required a higher daily tacrolimus dose compared with those with the CYP3A5*3/*3 genotype to maintain both the target trough level and AUC0-12, suggesting that this polymorphism is useful for determining the appropriate dose of tacrolimus.

Journal ArticleDOI
TL;DR: The data from this study suggest that more patients develop skin malignancies than previously reported from Europe, and it is important to advise patients before transplantation in regard to skin complications, provide regular dermatological follow-up, and tailor the immunosuppressive regimen to the minimum doses compatible with good graft function.
Abstract: The data from this study suggest that more patients develop skin malignancies than previously reported from Europe. It is important to advise patients before transplantation in regard to skin complications, provide regular dermatological follow-up, and tailor the immunosuppressive regimen to the minimum doses compatible with good graft function.

Journal ArticleDOI
TL;DR: Thrombosis was the most common cause of TF in all three transplant categories, and for isolated pancreas transplants, TF is second only to rejection as a cause of graft loss.
Abstract: Background. Technical failure (TF) rates remain high after pancreas transplants; while rates have decreased over the last decade, more than 10% of all pancreas grafts continue to be lost due to technical reasons. We performed a multivariate analysis to determine causes and risk factors for TF of pancreas grafts. Results. Between 1994 and 2003, 937 pancreas transplants were performed at our center in the following transplant categories: simultaneous pancreas-kidney (SPK) (n=327), pancreas after kidney (PAK) (n=399), and pancreas transplant alone (PTA) (n=211). Of these, 123 (13.1%) grafts were lost due to technical reasons (thrombosis, leaks, infections). TF rates were higher for SPK (15.3%) versus PAK (12.2%) or PTA (11.4%), though this was not statistically significant. Thrombosis accounted for 52.0% of all TFs. Other causes were infections (18.7%), pancreatitis (20.3%), leaks (6.5%), and bleeding (2.4%). Thrombosis was the most common cause of TF in all three transplant categories. By multivariate analysis, the following were significant risk factors for TF of the graft: recipient body mass index (BMI) >30 kg/m2 (relative risk [RR]=2.42, P=0.0003), preservation time >24 hr (RR=1.87, P=0.04), cause of donor death other than trauma (RR=1.58, P=0.04), enteric versus bladder drainage (RR=1.68, P=0.06), and donor BMI >30 kg/m2 (RR=1.66, P=0.06). Not significant were donor or recipient age, a retransplant, and the category of transplant. Conclusions. TFs remain significant after pancreas transplants. In SPK recipients, TF represents the most common cause of pancreas graft loss. For isolated pancreas transplants, TF is second only to rejection as a cause of graft loss. Increased preservation times and donor or recipient obesity seem to be risk factors. Minimizing these risk factors would be important to try to decrease TF.

Journal ArticleDOI
TL;DR: A standardized technique of islet isolation is presented that applies novel means to enhance enzymatic digestion, minimize mechanical forces during the digestion process, and meet cGMP standards.
Abstract: Background. The procedure of human islet isolation needs further optimization and standardization. Here, we describe techniques to enhance enzymatic digestion and minimize mechanical forces during the digestion process. The isolation protocol has also been modified to meet current GMP (cGMP) standards. Moreover, the impact of donor- and process-related factors was correlated with the use of islets for clinical transplantation. Methods. One hundred twelve standardized consecutive islet isolations were evaluated. Methylthioninium chloride (methylene blue) and Indermil (topical tissue adhesive) were applied to detect leakage of the injected collagenase and to repair damaged pancreatic glands. The effects of dye and glue were evaluated in terms of islet yield, islet function using the perifusion assay, and success rate of the isolation. To analyze key factors for successful isolations, both univariate and multivariate regression analyses were performed. Results. Both methylthioninium chloride and Indermil were effective in preventing leakage of enzyme solutions from the pancreatic glands. Both islet yield and success rate were higher when these tools were applied (4,516.1 +/- 543.0 vs. 3,447.7 +/- 323.5, P=0.02; 50.0% vs. 21.3%, P=0.02, respectively). No adverse effects on islet function or collagenase activity were observed. Multivariate regression analysis identified a maximal recorded amylase >100 U/L (P=0.026), BMI (P=0.03), and the use of catecholamines (P=0.04) as crucial donor-related factors. In addition, cold ischemia time (P=0.005), the dissection procedure using whole glands with duodenum (P=0.02), and the local procurement team (P=0.03) were identified as crucial isolation-related variables. Conclusions. A standardized technique of islet isolation is presented, applying novel means to improve enzymatic digestion and to meet cGMP standards.

Journal ArticleDOI
TL;DR: This study demonstrates that rapamycin simultaneously protects allografts from rejection and attacks tumors in a complex transplant-tumor situation; in vitro experiments further showed that CsA promotes angiogenesis by a transforming growth factor-β-related mechanism and that this effect is abrogated by rapamycin.
Abstract: Cancer is an increasingly recognized problem associated with immunosuppression. Recent reports, however, suggest that the immunosuppressive agent rapamycin has anti-cancer properties that could address this problem. Thus far, rapamycin's effects on immunity and cancer have been studied separately. Here we tested the effects of rapamycin, versus cyclosporine A (CsA), on established tumors in mice simultaneously bearing a heart allograft. In one tumor-transplant model, BALB/c mice received subcutaneous syngenic CT26 colon adenocarcinoma cells 7 days before C3H ear-heart transplantation. Rapamycin or CsA treatment was initiated with transplantation. In a second model system, a B16 melanoma was established in C57BL/6 mice that received a primary vascularized C3H heart allograft. In vitro angiogenic effects of rapamycin and CsA were tested in an aortic ring assay. Results show that CT26 tumors grew for 2 weeks before tumor complications occurred. However, rapamycin protected allografts, inhibited tumor growth, and permitted animal survival. In contrast, CsA-treated mice succumbed to advancing tumors, albeit with a functioning allograft. Rapamycin's antitumor effect also functioned in severe combined immunodeficient BALB/c mice. Similar effects of the drugs occurred with B16 melanomas and primary vascularized C3H allografts in C57BL/6 mice. Furthermore, in this model, rapamycin inhibited the tumor growth-enhancing effects of CsA. Moreover, in vitro experiments showed that CsA promotes angiogenesis by a transforming growth factor-beta-related mechanism, and that this effect is abrogated by rapamycin. This study demonstrates that rapamycin simultaneously protects allografts from rejection and attacks tumors in a complex transplant-tumor situation. Notably, CsA protects allografts from rejection, but cancer progression is promoted in transplant recipients.

Journal ArticleDOI
TL;DR: Despite technical modifications and application of various surgical techniques, biliary complications remain frequent after RL LDLT and patients with multiple biliary reconstructions had a higher incidence of bile leaks.
Abstract: Background. Biliary reconstruction represents one of the most challenging parts of right lobe (RL) living donor liver transplantations (LDLTs). Different causes, surgical techniques, and treatments have been suggested but are incompletely defined. Methods. Between June 1999 and January 2002, 96 RL LDLTs were performed in our center. We reviewed the incidence of biliary complications in all the recipients. Results. Roux-en-Y reconstruction was performed in 53 cases (55.2%) and duct-to-duct was performed in 39 cases (40.6%). Both procedures were performed in 4 cases (4.2%). Multiple ducts (≥2) were found in 58 grafts (60.4%). Thirty-nine recipients (40.6%) had 43 biliary complications: 21 had bile leaks, 22 had biliary strictures, and 4 had both complications. Patients with multiple ducts had a higher incidence of bile leaks than those patients with a single duct (P=0.049). No significant differences in complications were found between Roux-en-Y or duct-to-duct reconstructions. Freedom from biliary complications was 59% at 1 year and 55% at 2 years. The overall 1-year and 2-year survival rates for patients were 86% and 81%, respectively. The overall 1-year and 2-year survival rates for grafts were 80% and 77%, respectively. Occurrence of bile leaks affected patient and graft survival (76% and 65% 2-year patient and graft survival, respectively, vs. 89% and 85% for those without biliary leaks, P=0.07). Conclusions. Despite technical modifications and application of various surgical techniques, biliary complications remain frequent after RL LDLT. Patients with multiple biliary reconstructions had a higher incidence of bile leaks. Patients who developed leaks had lower patient and graft survival rates.

Journal ArticleDOI
TL;DR: Given a 40% risk of rejection, seven patients would need treatment with IL-2Ra in addition to standard therapy to prevent one patient from undergoing rejection, with no definite improvement in graft or patient survival.
Abstract: Background: Interleukin 2 receptor antagonists (IL-2Ra) are increasingly used to treat renal transplant recipients. This study aims to systematically identify and summarize the effects of using IL-2Ra as induction immunosuppression, as an addition to standard therapy, or as an alternative to other antibody therapy. Methods: Databases, reference lists, and abstracts of conference proceedings were searched extensively to identify relevant randomized controlled trials in all languages. Data were synthesized using the random effects model. Results are expressed as relative risk (RR), with 95% confidence intervals (CI). Results: A total of 117 reports from 38 trials involving 4,893 participants were included. When IL-2Ra were compared with placebo (17 trials; 2,786 patients), graft loss was not significantly different at 1 year (14 trials: RR 0.84; CI 0.64–1.10) or 3 years (4 trials: RR 1.08; CI 0.71–1.64). Acute rejection was significantly reduced at 6 months (12 trials: RR 0.66; CI 0.59–0.74) and at 1 year (10 trials: RR 0.67; CI 0.60–0.75). At 1 year, cytomegalovirus infection (7 trials: RR 0.82; CI 0.65–1.03) and malignancy (9 trials: RR 0.67; CI 0.33–1.36) were not significantly different. When IL-2Ra were compared with other antibody therapy, no significant differences in treatment effects were demonstrated, but IL-2Ra had significantly fewer side effects. Conclusions: Given a 40% risk of rejection, seven patients would need treatment with IL-2Ra in addition to standard therapy, to prevent one patient from undergoing rejection, with no definite improvement in graft or patient survival. There is no apparent difference between basiliximab and daclizumab.
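The "seven patients" figure is a number needed to treat, which follows from the assumed 40% baseline rejection risk and the pooled 1-year relative risk of 0.67 quoted above. A minimal check of that arithmetic:

```python
baseline_risk = 0.40   # assumed 40% risk of acute rejection without IL-2Ra
relative_risk = 0.67   # pooled 1-year RR for acute rejection with IL-2Ra (from the review)

risk_with_treatment = baseline_risk * relative_risk
absolute_risk_reduction = baseline_risk - risk_with_treatment
nnt = 1 / absolute_risk_reduction

print(f"ARR = {absolute_risk_reduction:.3f}, NNT = {nnt:.1f}")  # ARR = 0.132, NNT about 7.6
```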

Journal ArticleDOI
TL;DR: A new cultured bioengineered skin based on keratinocytes and fibroblasts obtained from a single skin biopsy and a dermal matrix based on human plasma is described, demonstrating that this new dermal equivalent allows for generation of large bioengineered skin surfaces, restoration of both the epidermal and dermal skin compartments, and functional epidermal stem-cell preservation.
Abstract: Background Keratinocyte cultures have been used for the treatment of severe burn patients. Here, we describe a new cultured bioengineered skin based on (1) keratinocytes and fibroblasts obtained from a single skin biopsy and (2) a dermal matrix based on human plasma. A high expansion capacity achieved by keratinocytes grown on this plasma-based matrix is reported. In addition, the results of successful preclinical and clinical tests are presented. Methods Keratinocytes and fibroblasts were obtained by a double enzymatic digestion (trypsin and collagenase, respectively). In this setting, human fibroblasts are embedded in a clotted plasma-based matrix that serves as a three-dimensional scaffold. Human keratinocytes are seeded on the plasma-based scaffold to form the epidermal component of the skin construct. Regeneration performance of the plasma-based bioengineered skin was tested on immunodeficient mice as a preclinical approach. Finally, this skin equivalent was grafted on two severely burned patients. Results Keratinocytes seeded on the plasma-based scaffold grew to confluence, allowing a 1,000-fold cultured-area expansion after 24 to 26 days of culture. Experimental transplantation of human keratinocytes expanded on the engineered plasma scaffold yielded optimum epidermal architecture and phenotype, including the expression of structural intracellular proteins and basement-membrane components. In addition, we report here the successful engraftment and stable skin regeneration in two severely burned patients at 1 and 2 years follow-up. Conclusions Our data demonstrate that this new dermal equivalent allows for (1) generation of large bioengineered skin surfaces, (2) restoration of both the epidermal and dermal skin compartments, and (3) functional epidermal stem-cell preservation.

Journal ArticleDOI
TL;DR: This report advocates selection of crossmatch-negative donors on the basis of the Acceptable Mismatch Program as the first and best option for enabling highly sensitized patients to undergo transplantation.
Abstract: There are many highly sensitized patients on the kidney waiting lists of organ exchange organizations because it is difficult to find a crossmatch-negative cadaver kidney for these patients. Recently, several protocols have been developed to remove the donor-specific human leukocyte antigen (HLA) antibodies from the serum of these patients before transplantation. These approaches, including the use of intravenous immunoglobulins, plasmapheresis and immunoglobulins (plasmapheresis-cytomegalovirus-immunoglobulin), and immunoabsorption, seem to lead to a certain success rate, although the additional immunosuppression necessary to remove and control the production of donor-specific alloantibodies may have an impact on the short-term (infections) and long-term (incidence of cancer) immune surveillance. Furthermore, some of these therapies represent a considerable financial burden for patients and society. In the present report, we advocate selection of crossmatch-negative donors on the basis of the Acceptable Mismatch Program as the first and best option for highly sensitized patients to undergo transplantation. No additional immunosuppression is necessary, and graft survival in this group of "difficult" patients is identical to that of nonsensitized recipients. Because the nature of the HLA polymorphism does not allow all patients to profit from this approach, removal of circulating HLA antibodies can be considered as a rescue therapy for those patients for whom the Acceptable Mismatch Program does not give a solution.

Journal ArticleDOI
TL;DR: The liver-to-spleen CT attenuation ratio (L/S ratio) calculated from preoperative CT can be a useful tool to discriminate hepatic macrovesicular steatosis, outperforming other parameters, including body mass index (BMI) and serum liver function tests, in receiver operating characteristic (ROC) analysis.
Abstract: BACKGROUND Hepatic steatosis affects graft function as well as postoperative recovery of donors in living donor liver transplantation. Liver macrovesicular steatosis in living donors was assessed using quantitative X-ray computed tomography (CT) analysis and histological examination of intraoperative liver biopsy. METHODS A total of 266 living donors with complete pretransplant CT data and intraoperative "time 0" biopsy were included in the study. Liver biopsy specimens obtained during the donor operation were examined for macrovesicular steatosis and classified as none; mild (<30%); moderate (30%-60%); or severe (>60%). The liver-to-spleen CT attenuation ratio (L/S ratio) on noncontrast CT was evaluated for its usefulness as an index of hepatic steatosis in comparison with other parameters, including body mass index (BMI) and serum liver function tests (gamma-glutamyl transpeptidase, alanine aminotransferase, aspartate aminotransferase, cholinesterase, and total cholesterol), using receiver operating characteristic (ROC) analysis. RESULTS Histological grade of macrovesicular steatosis was none in 198 patients (74.4%), mild in 50 (18.8%), moderate in 15 (5.7%), and severe in 3 (1.1%). The median L/S ratios for the respective histological grades were 1.20 (range: 1.00-1.46), 1.12 (0.83-1.37), 1.01 (0.74-1.21), and 0.90 (0.70-0.99) (P<0.0001). The ROC curve for the L/S ratio was located closest to the upper left corner, and the area under the curve for the L/S ratio was significantly larger than that of any other preoperative variable. CONCLUSION The L/S ratio calculated from preoperative CT can be a useful tool to discriminate hepatic macrovesicular steatosis. Based on the present results, the optimal cut-off value for the L/S ratio to exclude more than moderate steatosis would be 1.1.
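The screening rule described in the conclusion reduces to a ratio and a threshold. The sketch below applies the 1.1 cutoff proposed in this abstract to invented Hounsfield-unit values; the region-of-interest numbers and donor labels are assumptions for illustration only.

```python
def liver_spleen_ratio(liver_hu: float, spleen_hu: float) -> float:
    """L/S ratio: mean liver attenuation divided by mean spleen attenuation on noncontrast CT."""
    return liver_hu / spleen_hu

# Invented region-of-interest means in Hounsfield units for two hypothetical donors.
candidates = {"donor A": (62.0, 50.0), "donor B": (48.0, 52.0)}
CUTOFF = 1.1  # cut-off proposed in this study to exclude more than moderate steatosis

for name, (liver, spleen) in candidates.items():
    ratio = liver_spleen_ratio(liver, spleen)
    verdict = "acceptable by L/S criterion" if ratio >= CUTOFF else "suspect moderate/severe steatosis"
    print(f"{name}: L/S = {ratio:.2f} -> {verdict}")
```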

Journal ArticleDOI
TL;DR: Both aerosol AmBd and ABLC appear to be associated with a low rate of invasive pulmonary fungal infection in the early posttransplant period, and patients receiving ABLC were less likely to experience a treatment-related adverse event.
Abstract: Background Aerosolized administrations of amphotericin B deoxycholate (AmBd) and amphotericin B lipid complex (ABLC) in lung transplant recipients were compared for safety and tolerability. The incidence of invasive fungal infections in patients receiving aerosolized amphotericin B formulations as sole prophylaxis was determined. Methods A prospective, randomized (1:1), double-blinded trial was conducted with 100 subjects. AmBd and ABLC were administered postoperatively by nebulizer at doses of 25 mg and 50 mg, respectively, which were doubled in mechanically ventilated patients. The planned treatment was once every day for 4 days, then once per week for 7 weeks. Treatment-related adverse events and invasive fungal infections were quantitated for 2 months after study drug initiation. Results Intent-to-treat analysis revealed study drug was discontinued for intolerance in 6 of 49 (12.2%) and 3 of 51 (5.9%) patients in the AmBd- and ABLC-treated groups, respectively (p=0.313). Subjects receiving AmBd were more likely to have experienced an adverse event (odds ratio 2.16, 95% confidence interval 1.10, 4.24, p=0.02). Primary prophylaxis failure within 2 months of study drug initiation was observed in 7 of 49 (14.3%) AmBd-treated patients and 6 of 51 (11.8%) ABLC-treated patients. No fungal pneumonias were observed. Only two (2%) patients experienced documented primary prophylaxis failure with Aspergillus infections within the follow-up period. Conclusions Both aerosol AmBd and ABLC appear to be associated with a low rate of invasive pulmonary fungal infection in the early posttransplant period. Patients receiving ABLC were less likely to experience a treatment-related adverse event.

Journal ArticleDOI
TL;DR: In de novo renal-transplant recipients, the regimen of everolimus plus RDN was well tolerated, with low efficacy failure and better renal function in comparison with everolimus plus FDN.
Abstract: Background. Everolimus and cyclosporine (CsA) exhibit synergistic immunosuppressive activity when used in combination. We explored the use of everolimus with a CsA-sparing strategy in de novo renal-transplant recipients. Methods. A phase II, randomized, open-label 3-year study was performed in 111 patients to compare the efficacy and tolerability of everolimus (3 mg/day) in combination with basiliximab, steroids, and either full-dose Neoral (FDN) or reduced-dose Neoral (RDN) (CsA trough levels 125–250 ng/mL and 50–100 ng/mL, respectively). Efficacy failure (biopsy-proven acute rejection, death, graft loss, or loss to follow-up), safety, and renal function were evaluated at 6, 12, and 36 months. A protocol amendment allowed further reduction of CsA exposure after 12 months. Results. Efficacy failure was significantly higher in FDN than in the RDN group at 6 (15.1% vs. 3.4%; P=0.046), 12 (28.3% vs. 8.6%; P=0.012), and 36 (35.8% vs. 17.2%; P=0.032) months. Mean creatinine clearance was higher in the RDN group at 6 (59.7 mL/min vs. 51.1 mL/min; P=0.009), 12 (60.9 mL/min vs. 53.5 mL/min; P=0.007), and 36 (56.6 mL/min vs. 51.7 mL/min; P=0.436) months. Discontinuations and serious adverse events were more frequent in the FDN group. Reduction of CsA exposure for 6 months during the amendment improved renal function in the FDN group. Conclusions. In de novo renal-transplant recipients, the regimen of everolimus plus RDN was well tolerated, with low efficacy failure and better renal function in comparison with everolimus plus FDN.

Journal ArticleDOI
TL;DR: There was short-term histologic benefit to the use of this regimen, even in those patients without viral clearance, and post-OLT HCV recurrence can be safely treated with PEG-IFN and RIB.
Abstract: BACKGROUND Hepatitis C virus (HCV) recurrence after orthotopic liver transplantation (OLT) is universal. We aimed to evaluate the efficacy and safety of pegylated interferon (PEG-IFN) and ribavirin (RIB) in the treatment of post-OLT HCV recurrence. METHODS Thirty-seven patients with recurrent HCV after OLT were screened and began treatment. Nineteen patients have completed therapy. PEG-IFN was started at a dose of 0.5 microg/kg per week and titrated toward a maximum dose of 1.5 microg/kg per week. RIB was started at a dose of 400 mg per day and titrated toward a maximum of 1000 mg per day, as tolerated. Therapy continued for 1 year after HCV replication was undetectable by reverse transcriptase-polymerase chain reaction and was discontinued if there was no virologic clearance at 48 weeks. RESULTS Twelve patients (63%) completed the combination regimen. Therapy was discontinued in seven (37%) patients. Seven patients (37%) had undetectable viral load at the end of treatment. Of those, five patients (26%) had sustained viral response 6 months after discontinuation of therapy. Five patients (26%) had no virologic response. Necro-inflammatory score declined from 5.22 to 2.89 (P=0.05) in nonresponders versus 6.8 to 2.6 (P<0.01) in responders. Fibrosis stage did not change in either group. Genotype 1-infected patients had a lower likelihood of attaining end of treatment or sustained viral response (P<0.05 for both). CONCLUSIONS Post-OLT HCV recurrence can be safely treated with PEG-IFN and RIB. Bone marrow toxicity, depression, and rejection are limiting factors that require aggressive management. There was short-term histologic benefit to the use of this regimen, even in those patients without viral clearance.

Journal ArticleDOI
TL;DR: Self-report at a confidential interview was the best measure of adherence for the detection of both missed doses and erratic timing of medication in renal transplant recipients.
Abstract: Nonadherence to immunosuppressants in renal transplant recipients is a major factor affecting graft survival, but it is difficult to detect accurately in clinical practice. Adherence was measured in 153 adult renal transplant recipients using self-report questionnaires and interview, clinician rating, and cyclosporine levels. The sensitivity and specificity of these measures were determined by comparison with electronic monitoring in a randomly selected subsample of 58 subjects. Measures of adherence in current clinical use do not perform well when tested against electronic monitoring. Self-report at a confidential interview was the best measure of adherence for the detection of both missed doses and erratic timing of medication. However, the use of a confidential interview is not directly applicable to a clinical setting. Further research on how best to facilitate disclosure in clinical settings may be the best way to develop adherence measures for use in routine practice.
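Sensitivity and specificity here are calculated against electronic monitoring as the reference standard. The sketch below shows that calculation from a 2x2 classification table; the counts are invented and do not reproduce the study's results.

```python
def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Candidate adherence measure vs. reference standard (electronic monitoring):
    tp = nonadherent by both, fp = nonadherent by the candidate measure only,
    fn = nonadherence missed by the candidate measure, tn = adherent by both."""
    sensitivity = tp / (tp + fn)   # proportion of truly nonadherent patients detected
    specificity = tn / (tn + fp)   # proportion of adherent patients correctly classified
    return sensitivity, specificity

# Invented counts for illustration: a self-report interview vs. electronic monitoring in 58 patients.
sens, spec = sensitivity_specificity(tp=14, fp=3, fn=6, tn=35)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```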

Journal ArticleDOI
TL;DR: Both ATG and basiliximab, when used for IT in a sequential protocol, are equally effective in terms of graft and patient survival as well as at preventing acute rejection; however, basiliximab is associated with a lower incidence of certain key adverse events, namely CMV infection, leukopenia, and thrombocytopenia.
Abstract: Background Sequential anti-thymocyte globulin (ATG)/cyclosporine immunosuppression has two main advantages: delayed introduction of the nephrotoxic drug cyclosporine and prevention of acute rejection. Basiliximab, a recently developed chimeric monoclonal antibody that selectively depletes the minor subpopulation of activated T lymphocytes, has been shown to reduce the incidence of acute rejection when used with cyclosporine introduced on day 1. Methods This open, randomized, multicenter study was undertaken to compare the safety and efficacy of ATG versus basiliximab induction therapy (IT) with delayed introduction of cyclosporine microemulsion (Neoral) in 105 low-immunologic-risk renal-transplant patients receiving mycophenolate mofetil and steroids. Results One-year patient and graft survival rates were 98.1% and 94.2%, respectively, in the basiliximab group (n = 52), and 98.1% and 96.2% in the ATG group (n = 53). The incidence of biopsy-confirmed acute rejection was comparable (basiliximab 9.6%, ATG 9.4%), as were key parameters of renal function, notably serum creatinine levels, time-to-nadir serum creatinine, and the number of patients requiring posttransplantation dialysis (basiliximab 28.8%, ATG 30.2%). However, significantly fewer patients in the basiliximab group experienced cytomegalovirus (CMV) infection, leukopenia, and thrombocytopenia, without any significant difference in other key safety parameters (including the incidences of serum sickness, fever, lymphoma, and infections in general). Conclusions Both ATG and basiliximab, when used for IT in a sequential protocol, are equally effective in terms of graft and patient survival as well as at preventing acute rejection. However, basiliximab is associated with a lower incidence of certain key adverse events, namely CMV infection, leukopenia, and thrombocytopenia.