
Showing papers in "Clinical Transplantation in 1995"


Journal Article
TL;DR: The number of heart transplant operations performed in the United States grew modestly, with a 12% increase from 1990 to 1995, while lung transplant procedures rose sharply over the same period; no meaningful variance from the US recipient demographic profile was noted for non-US recipients.
Abstract: 1. The number of heart transplant operations performed in the United States grew modestly as indicated by a 12% increase from 1990 (n = 2,108) to 1995 (n = 2,360). From 1990 (n = 203) to 1995 (n = 871), lung transplant procedures increased by 329%. This trend has continued with 723 procedures performed in 1994 and 871 (21% increase from 1994) reported for 1995. As in the US, the number of non-US heart transplants has leveled during recent years. 2. The number of heart transplant programs in the United States has remained relatively constant over the last 3 years with a decrease of 5 heart programs from 1995 to 1996. The number of centers performing lung transplantation has also leveled during the last 3 years with an increase of only 4 programs from 1994 to 1996. Non-US lung programs increased 90% from 1994 to 1995. 3. The most frequently reported indication for heart transplantation in the US has changed from coronary artery disease (40.9%) in previous registry reports to all cardiomyopathies (44.7%). For other thoracic transplants, the most frequently reported indications included cystic fibrosis (36.7%) for double-lung, emphysema/COPD (46.8%) for single-lung and congenital lung disease (41.2%) for heart-lung transplants. The most frequently reported diagnoses for thoracic transplantation outside the US included cardiomyopathy (48.5%) for heart, cystic fibrosis (36.0%) for double-lung, idiopathic pulmonary fibrosis (26.5%) for single-lung and primary pulmonary hypertension (25.0%) for heart-lung transplants. 4. US heart transplant recipients were predominantly male (77.8%), 50-64 years old (51.1%) and white (82.7%). In contrast, US lung transplant recipients were predominantly female (52.9%), 35-64 years old (73.1%) and white (89.9%). No significant variance from the US recipient demographic profile was noted for non-US recipients in these analyses. 5. The one-year survival rate for US heart transplant recipients during recent years was fairly consistent, with only a 0.4% increase from 1990-1995. Improvement in the one-year US lung transplant survival rate was demonstrated by a rise from 35.3% in 1987 to 74.0% in 1995. The one-year survival rates at non-US centers were 76.0% for heart recipients and 64.5% for lung recipients in 1995. 6. The long-term thoracic patient survival rates in the United States were: 33.3% at 12 years for heart, 43.7% at 5 years for lung and 27.6% at 10 years for heart-lung recipients. Long-term survival rates for non-US cases were: 30.3% at 12 years for heart, 44.8% at 6 years for lung and 19.8% at 10 years for heart-lung. 7. The most important risk factor for US heart recipients at 1 month, 1 year, 3 years and 5 years after transplantation was receipt of a previous heart transplant. Other substantial long-term risk factors included recipient age less than 1 year, donor aged 45-54, and non-white recipient. 8. The most important mortality risk factor in US lung recipients was the order of the transplant (primary or repeat). Diagnosis and ventilator use remained highly influential risk factors for mortality.

170 citations


Journal Article
TL;DR: Older patients, particularly those with pre-existing but compensated heart disease, are at greatest risk for a major cardiac event and may require more extensive pre-operative risk assessment.
Abstract: Background: As the indications for liver transplantation broaden to include older and more critically ill patients, the likelihood of encountering unsuspected cardiovascular disease increases. Purpose: This study examined the frequency, type, and subsequent outcome of intra- and postoperative cardiovascular complications that occurred during the first 6 months following liver transplantation. Methods: The records of 146 consecutive patients who underwent primary liver transplantation were reviewed retrospectively to determine the occurrence of major (myocardial infarction or reversible ischemia, pulmonary edema, cardiogenic shock, symptomatic rhythm disturbances, or pulmonary embolism) and minor (transient hypertension, hypotension, atrial or ventricular premature beats) cardiac events. The relation between such events and actuarial patient survival was evaluated. Stepwise logistic regression analysis was also employed to identify those pre-operative variables that predicted an increased risk of postoperative events or mortality. Results: Cardiac events directly caused or contributed to 4 deaths (2.7%). Ventricular tachycardia/fibrillation was the most frequent intra-operative cardiac complication (3.4%); transient hypotension (post-reperfusion syndrome) was the most common minor event (20%). Thirty-four recipients (23%) developed a major postoperative cardiac complication including pulmonary edema (9%), myocardial ischemia or infarction (5.4%), new dilated cardiomyopathy (3.4%), and ventricular tachycardia (2.7%). Pre-existing cardiac disease and older age (mean age 49 +/- 8 years) at transplantation were the only independent predictors of a major complication. Major cardiac events did not affect 6 month survival but were associated with a lower 5-year survival rate (event: 32% vs event-free: 52%; p = 0.04). The frequency of major intraoperative (21% vs 2%; p = 0.0005) and postoperative (57% vs 17%; p = 0.0001) cardiac complications was significantly higher for recipients with known heart disease (Group A) compared with those without pre-existing heart disease (Group B). Five-year survival in Group A patients was 36% versus 50% for Group B patients; p = 0.45. Conclusion: One or more cardiovascular complications occurred in over 70% of liver transplant recipients. Major events were associated with a lower likelihood of long-term survival. Older patients, particularly those with pre-existing but compensated heart disease, are at greatest risk for a major cardiac event and may require more extensive pre-operative risk assessment.

123 citations


Journal Article
TL;DR: The incidence of PTLD reported here in patients receiving intravenous ganciclovir followed by high-dose oral acyclovir antiviral prophylaxis is lower than previously recorded when consideration is given to patients' EBV status and the use of antilymphocyte preparations.
Abstract: Epstein-Barr virus (EBV) infection has been associated with the post-transplant lymphoproliferative disorder (PTLD) in up to 8% of transplant recipients. Primary EBV infection and the use of antilymphocyte preparations appear to increase the incidence of PTLD. Experimental evidence suggests that the antiviral prophylaxis used by many transplant programs may influence the development of this post-transplant complication. In order to investigate the influence of antiviral prophylaxis (intravenous ganciclovir followed by high-dose oral acyclovir) on the development of PTLD in kidney-pancreas and liver allograft recipients from the University of Washington Medical Center, records were reviewed for pretransplant EBV status, antilymphocyte preparation use and for histologic documentation of PTLD. Two of 83 kidney-pancreas recipients (1 EBV-seronegative, 1 EBV-seropositive) and 1 of 123 liver recipients (EBV-seropositive) have developed PTLD. Six of 83 kidney-pancreas patients were EBV-seronegative prior to transplantation and 4 of these patients received at least two courses of an antilymphocyte preparation. Thirty-eight (49%) of the 77 EBV-seropositive kidney-pancreas recipients received at least two courses of an antilymphocyte globulin without the development of PTLD. Both the EBV-seronegative kidney-pancreas recipient and the liver recipient who developed PTLD had received multiple courses of antilymphocyte globulins. One EBV-seropositive kidney-pancreas recipient had only received one course of OKT3 1 year prior to the development of PTLD. The incidence of PTLD reported here in patients receiving intravenous ganciclovir followed by high-dose oral acyclovir antiviral prophylaxis is lower than previously recorded when consideration is given to patients' EBV status and the use of antilymphocyte preparations.

100 citations


Journal Article
TL;DR: There was a strong association between the CGD score at 6 months and the risk of graft loss up to 2 and 3 years following transplantation, and in patients with a functioning graft at 2 years, the CGD score at 6 months correlated with graft function at 2 years.
Abstract: In order to assess the prognostic value of renal transplant biopsies for identifying patients at risk for chronic vascular rejection (CVR), 99 biopsies performed at 6 months after renal transplantation were evaluated as part of a prospective study. A chronic graft damage (CGD) score was calculated from the scores of vascular intimal hyperplasia, glomerular mesangial changes, focal lymphocytic infiltration, focal and diffuse interstitial fibrosis, and tubular atrophy, features compatible with CVR. The mean score for the whole patient population was 4.7 ± 2.9 (range 0-11). There was a strong association between the CGD score at 6 months and the risk of graft loss up to 2 and 3 years following transplantation. Patients with a CGD score of ≥ 6 had a higher graft loss rate at 2 years than those with a score of < 6 (6/35 vs 2/54; p = 0.037). In patients with a functioning graft at 2 years, the CGD score at 6 months correlated with graft function at 2 years. In addition to higher serum creatinine (p = 0.003) and lower GFR (p = 0.01), patients with a CGD score of ≥ 6 at 6 months also had a higher degree of albuminuria (p = 0.008) at 2 years as compared with patients with a CGD score of < 6. At 3 years, 10/35 patients with a CGD score of ≥ 6 and 2/54 patients with a CGD score of < 6 had lost their grafts (p = 0.002). The relative risk for graft loss associated with a CGD score of ≥ 6 was 7.65. In patients with a functioning graft there was still a significant positive correlation between graft function at 3 years and the CGD score at 6 months (p = 0.003). In conclusion, the histopathological picture at 6 months is a prognostic indicator of graft function and graft loss at 2 and 3 years after transplantation. Patients with a high CGD score, here defined as ≥ 6, may benefit from pharmacological or other interventional measures directed towards the progression of CVR.
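As an illustration of how a score of this kind could be applied, here is a minimal Python sketch, assuming equally weighted integer sub-scores that are simply summed; the component names and the example grades are mine, since the abstract does not give the grading scale for each feature, while the ≥ 6 cutoff and the relative risk of 7.65 are taken from the abstract.

CGD_FEATURES = (
    "vascular_intimal_hyperplasia",
    "glomerular_mesangial_changes",
    "focal_lymphocytic_infiltration",
    "focal_interstitial_fibrosis",
    "diffuse_interstitial_fibrosis",
    "tubular_atrophy",
)

HIGH_RISK_CUTOFF = 6  # a 6-month CGD score >= 6 carried a relative risk of 7.65 for graft loss

def cgd_score(component_scores: dict) -> int:
    """Sum the histologic sub-scores of a 6-month biopsy into a chronic graft damage score."""
    return sum(int(component_scores.get(feature, 0)) for feature in CGD_FEATURES)

def is_high_risk(component_scores: dict) -> bool:
    """Flag biopsies whose CGD score falls in the high-risk stratum (>= 6)."""
    return cgd_score(component_scores) >= HIGH_RISK_CUTOFF

# Hypothetical biopsy with moderate fibrosis, tubular atrophy and intimal hyperplasia
biopsy = {"diffuse_interstitial_fibrosis": 3, "tubular_atrophy": 2, "vascular_intimal_hyperplasia": 2}
print(cgd_score(biopsy), is_high_risk(biopsy))  # -> 7 True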

90 citations


Journal Article
TL;DR: Multivariate analysis showed that 5-year graft survival was lower in patients with delayed graft function, in those who had an acute rejection episode in the first 6 months post-transplant, in recipients greater than 55 years of age at the time of transplant, and in patients who were highly sensitized at thetime of transplant.
Abstract: To determine factors predictive of long-term graft function in patients treated prophylactically with an antilymphocyte antibody, 670 first cadaveric adult renal transplant procedures performed between 1985 and 1991 were reviewed. The actuarial 1- and 5-year patient survival in this group was 95% and 87% respectively, and graft survival was 84% and 70% respectively. The final analysis was based on a study group of 635 patients which excluded 28 patients who lost grafts to early technical failures and 8 patients who were not induced with an antilymphocyte preparation. Multivariate analysis showed that 5-year graft survival was lower in patients with delayed graft function (p=0.0001), in those who had an acute rejection episode in the first 6 months post-transplant (p=0.0001), in recipients greater than 55 years of age at the time of transplant (p=0.0001), in patients who were highly sensitized at the time of transplant (p=0.0331) and, finally, in those who received a graft from an older donor (p=0.044). The 209 patients with delayed graft function had a 16% lower long-term graft survival than 425 patients with early graft function (62% vs. 78% respectively at 5 years). One or more rejection episodes in the first 6 months post-transplant (329 patients) reduced long-term graft survival by 13% compared to those who did not have a rejection episode (67% vs. 80% respectively at 5 years). 219 patients who were free of early rejection episodes and had early graft function had a 5-year graft survival of 85% compared to 60% in the 122 patients who had early rejection and delayed graft function (p<0.0005). Recipient age of over 55 years reduced graft survival at 5 years because of death with a functioning graft. The other two independent variables, degree of sensitization and donor age, had a weaker effect on graft survival by increasing the risk ratio to 1.013 for sensitization and to 1.104 for donor age. There was only a minor interaction between time to graft function and rejection activity. 52% of the 219 patients were free of rejection episodes whereas only 42% of the 209 patients with delayed function had no rejection (p<0.05). HLA matching did not have an independent effect on long-term graft outcome, but did interact with rejection in that 57% of patients with 2 or fewer BDR mismatches were free of rejection, as compared to 44% of patients who had more than 2 mismatches (p<0.0001). Serum creatinine at 6 months was a dependent variable that influenced long-term survival. Patients with a serum creatinine at 6 months of ≤200 μmol/l (2.27 mg%) had a 27% increase in long-term survival compared to those with more impaired function (86% vs. 59% actuarial survival at 5 years post-transplant) (p<0.0005).

83 citations


Journal Article
TL;DR: White donors accounted for the majority of all transplanted organs and matching donor and recipient race ("race matching") led to better long-term allograft survival for White recipients only, while there was no donor-recipient "race matching" effect for minority groups.
Abstract: Asian recipients of cadaveric renal allografts had the best long-term survival rates. Five-year graft survival rates were 66% for 1,713 Asians, 61% for 4,722 Hispanics and 33,190 Whites, and 47% for 12,948 Blacks. This trend had already been established at one-year posttransplant. Transplant half-lives calculated after 6 months were 12 years for Asians, 10 years for Whites, 9 years for Hispanics and 5 years for Blacks. These have all improved over the last 4 years. Part of the explanation for the outstanding half-life for Asian recipients is the 15 year half-life of the 672 Asian females reported. The superior graft survival for Asian recipients may be due in part to the low incidence of sensitization, the low incidence of acute rejection and chronic rejection leading to graft loss, and the high prevalence of primary disease entities that have been associated with excellent long-term prognoses, especially IgA nephropathy and chronic glomerulonephritis. Hispanic recipients also had excellent short- and long-term graft survival rates. This may be due to having the lowest incidence of early acute rejection episodes compared with all other racial groups, and the limited deleterious effect of ATN on long-term graft survival among Hispanics. The poor overall graft survival for Black recipients may be due to poor HLA matching, a high rate of sensitization and a grim effect of sensitization on graft survival, the high incidences of acute rejection and ATN, and the high incidence of HTN both pre- and posttransplant. The only subgroups of Black recipients who had graft survival rates that were comparable to other racial groups were the zero-HLA-mismatched Black recipients and those Black recipients over age 65. Long-term patient survival rates were the best for Asians and Hispanics (89% and 90% at 5 years, respectively). The 5-year patient survival rates were lower for Blacks and Whites (86% each). There was no difference in patient survival at one-year posttransplant (95-96% for each group). A higher proportion of White diabetic recipients received simultaneous SPK transplants (31%) than Black (10%), Hispanic (11%) or Asian (7%) diabetics. The reasons for this disparity are unclear. However, SPK transplants improved 5-year kidney graft survival for Whites (67% vs 55% in patients receiving kidneys alone), but were not associated with improved 5-year kidney survival among non-Whites. White donors accounted for the majority of all transplanted organs (79%). Matching donor and recipient race ("race matching") led to better long-term allograft survival for White recipients only. There was no donor-recipient "race matching" effect for minority groups.

81 citations


Journal Article
TL;DR: The growth in liver transplantations recorded by the Pitt-UNOS Liver Transplant Registry since October 1987 continues as does the net growth of new centers and no significant differences were found for gender, race or age of pediatric recipients in 1994.
Abstract: The growth in liver transplantations recorded by the Pitt-UNOS Liver Transplant Registry since October 1987 continues, as does the net growth of new centers. Characteristics of pediatric recipients in 1994 were compared to those of previous years and no significant differences were found for gender, race or age. The majority of pediatric recipients in 1994 awaited transplantation at home. The most common indication for liver transplantation in children was biliary atresia, though the proportion of recipients with this primary liver disease decreased significantly. Significant increases were noted in the proportions of pediatric recipients with autoimmune disease (though this remains a relatively uncommon indication) and fulminant liver failure. There was a significant decrease in the proportion of children who received ABO-incompatible livers. Many of the characteristics examined for adult recipients changed over time. The proportion of male recipients continued to increase. The mean age of adult recipients continued to increase, likely contributing to the increased prevalence of positive CMV-serology. The proportion of adult recipients awaiting transplantation outside the hospital increased over time. The increase in the proportion of multiorgan transplantations was in large part due to the increased reporting of bone marrow/liver transplants in 1994. Hepatitis non-A, non-B, or C and alcoholic liver disease were the most common reasons for LTX. The proportions of recipients with hepatitis B, fulminant liver failure and malignancies, indications with the poorest survival, all declined significantly. The cumulative probability of surviving (without retransplantation) for 7 years after initial transplantation was 0.70 (0.57) for pediatric recipients. Despite changes in recipient characteristics, the one-year survival for pediatric recipients did not change significantly over time. Significant differences in survival, unadjusted for other factors, were found by age (the youngest recipients had the worst survival), location awaiting transplantation (greater medical intervention just prior to transplantation led to poorer survival), multiorgan transplantation, primary liver disease (survival was worst for recipients transplanted due to malignancies, and best for patients with metabolic diseases), and donor/recipient ABO matching (survival was best for recipients of livers from donors with the same blood type). These results are similar to those previously reported for 4- and 5-year survivals. The cumulative probability of adults surviving (without retransplantation) for 7 years following LTX was 0.59 (0.52). Significant differences in survival, unadjusted for other factors, were found for year of transplantation (recipients in 1994 had better one-year survival than those transplanted in previous years), sex (males had worse survival than females), race (Blacks and Asians had the poorest survivals), age (recipients 50 years of age and older had the poorest survival), location awaiting transplantation (greater medical intervention just prior to transplantation led to poorer survival), multiorgan transplantation (recipients of organs in addition to the liver had worse patient survival than recipients of liver only), and primary liver disease (the best survival was for cryptogenic or cholestatic cirrhosis, the poorest survival was for malignancies and hepatitis B). Similar results were also reported previously for 4- and 5-year survivals.

64 citations


Journal Article
TL;DR: Results demonstrate significant loss of forearm bone mineral with long-term follow-up after renal transplantation, but suggest that patients treated with cyclosporin monotherapy may be at lower risk of this complication.
Abstract: Serial measurements of serum and urine markers of bone metabolism and of forearm bone density (BMD) by dual photon absorptiometry were performed in 22 patients undergoing renal transplantation in 1986. Patients were randomised to immunosuppression with (1) cyclosporin alone (CsA group, n=10), (2) cyclosporin for 3 months followed by azathioprine-prednisone (CsA/AzP group, n=3) or (3) long-term azathioprine-prednisone (LT AzP group, n=9). As no reduction in bone mineral density (BMD) was noted in the first 6 months, groups 2 and 3 were considered together (AzP group, n=12). Mean+/-SEM BMD fell by 19+/-2% at 36 months (n=19, p<0.01), with similar reductions seen in the CsA and AzP groups. At 60 months, BMD of the AzP group was 25+/-3% below baseline (p<0.01), while the CsA group were only 5+/-4% below baseline (p=NS vs baseline, p<0.05 vs AzP group). The degree of reduction in BMD over 5 years correlated with total glucocorticoid dose (r=0.63, p<0.05), but not with biochemical markers of bone turnover. Serum alkaline phosphatase fell post-transplant in patients treated with AzP, but not in the CsA group. These results demonstrate significant loss of forearm bone mineral with long-term follow-up after renal transplantation, but suggest that patients treated with cyclosporin monotherapy may be at lower risk of this complication.

58 citations


Journal Article
TL;DR: Although the graft survival rates were similar regardless of the donor's relationship to the recipient, the causes of graft failure differed, and pretransplant blood transfusions did not result in improved graft survival, nor in a reduced incidence of early rejection episodes.
Abstract: The number of living donor transplants reported to the UNOS Scientific Renal Transplant Registry has increased from 1,810 in 1988 to 2,861 in 1994. Nearly all 250 United States transplant centers have reported living donor transplants. Graft survival rates were uniformly high for recipients of living donor transplants. One- and 5-year survival ranged from 95% and 84% for 3,653 HLA-identical sibling transplants to 89% and 69% for 1,981 offspring-to-parent transplants. Transplant half-lives ranged from 10 years for 360 transplants from nonspouse unrelated donors to 22 years for 3,653 transplants between HLA-identical siblings. These half-lives were significantly better than the 9 year half-lives projected for cadaveric grafts (p<0.01). The 282 second transplants from HLA-identical sibling donors had one- and 5-year survival rates of 94% and 80%, respectively, which were not significantly different than those of first grafts. The 1,005 retransplants from HLA-mismatched living donors had one- and 5-year survival rates of 88% and 69%, respectively, 2% below the survival rate for first transplants (p=0.026). The one- and 5-year graft survival rates for 1,932 zero-HLA haplotype matched living donor transplants were 89% and 73%, respectively, and the half life was 13.6 years. These results were comparable to those for parent donor transplants. Although the graft survival rates were similar regardless of the donor's relationship to the recipient, the causes of graft failure differed. About 48% of graft failures among parents who received a kidney from their offspring were deaths with a functioning graft. Only 12% of failures among parent-to-offspring transplants were deaths. One- and 5-year graft survival rates were 85% and 57%, respectively, for Black recipients of parent donor kidneys, and were 95% and 68%, respectively, for Blacks with an HLA-identical sibling donor transplant. The results were higher for comparable Whites: 91% and 73% for parental, and 95% and 86% for HLA-identical sibling grafts, respectively. HLA-mismatched transplants to 656 broadly sensitized recipients resulted in 83% and 62% one- and 5-year graft survival rates, respectively. The comparable results for those with less than 50% PRA were 91% and 73% (p<0.001). Pretransplant blood transfusions did not result in improved graft survival, nor in a reduced incidence of early rejection episodes. Information was not available to distinguish between donor-specific and random donor transfusions. The oldest living donor was aged 76, and 404 donors were over age 60. The one- and 5-year survival rates were 86% and 61%, respectively, for donors over age 60. The comparable results for younger donors were 90% and 72% (p=0.005).

54 citations


Journal Article
TL;DR: An information campaign carried out in three geographical areas of Sweden in the winter of 1992-93 was intended to increase public awareness of organ donation and to increase the signing of donor cards.
Abstract: One of the aims of this study was to evaluate an information campaign carried out in three geographical areas of Sweden in the winter of 1992-93. The campaign was intended to increase public awareness of organ donation and to increase the signing of donor cards. Another objective was to test the effects of different kinds of information. These were: A) an extensive package of information including training of key groups, lecturing at meetings and exhibitions, and advertisements of donor cards; B) a brochure to households including two donor cards; and C) a combination of A and B. Yet another aim was to reassess public opinion on transplantation issues, which had been surveyed before in 1987, 1988, and 1990. Random samples of the population in three campaign areas and a control sample were surveyed before and after the campaign, altogether 5600 persons. The average response rate was 69% (1992) and 68% (1993). In the two areas where the brochure had been distributed to the households, the rate of donor card holders had more than doubled (from 3% and 5% to 13% and 12%). In the two areas where the brochure had not been distributed, the rate was unchanged (5%). In the brochure areas, a somewhat larger number of people had also informed their relatives about their decisions, compared with people in the other areas. In all campaign areas considerably more people were aware of the cards than in the control area. No attitude changes could be shown in any area. Thus, the mailed brochure was the most effective in increasing people's signing of the donor cards and informing their families. The more elaborate campaign could possibly have long-term effects, but this remains to be studied. The attitudes toward organ donation have been rather constant in Sweden from 1987 to 1993, with only slight variations. A frankness gradient was confirmed in this study as well as in an earlier study, where those who were negative toward donation of their own organs had discussed this issue

49 citations


Journal Article
TL;DR: The current studies clearly indicate that asymptomatic hyperparathyroidism is common even after 2 years post-renal transplant and monitoring for PTH and 1:25VD will help prevent bone disease post-transplant now that rocaltrol is available.
Abstract: Since endogenous 1:25 vitamin D (1:25VD) is principally involved with involution of secondary hyperparathyroidism post-renal transplant we correlated 1:25VD levels with intact PTH in 82 random patients with a serum creatinine of 200 pg/ml. Of concern, 20% of 73 patients had 1:25VD deficiency (< 15 pg/ml). This may not have been previously appreciated because of the number of patients studied. Like previous investigators, we failed to understand why 1:25VD levels were relatively low. There was no correlation between 1:25VD and serum creatinine. Of 25 patients with a serum creatinine of 1.4 or less, there were 10 patients (40%) with 1:25VD of less than 20 pg/ml. Since persistently high PTH can contribute to bone demineralization, which is not uncommon post-transplant, we treated 8 patients with small doses of oral 1:25VD (rocaltrol). In less than 6 months PTH levels returned to normal in 7 of the 8 patients. The current studies clearly indicate that asymptomatic hyperparathyroidism is common even after 2 years post-renal transplant. Monitoring for PTH and 1:25VD will help prevent bone disease post-transplant now that rocaltrol is available.

Journal Article
TL;DR: The case of a patient who developed fatal acute graft-versus-host disease (GvHD) after liver transplantation (LT) further supports the qualification that LT may be complicated by GvHD and strongly suggests that minor rather than major histocompatibility antigens are the main target of allogeneic interactions of GvHD.
Abstract: We report herein the case of a patient who developed fatal acute graft-versus-host disease (GvHD) after liver transplantation (LT). GvHD occurred 18 days after LT and was characterized by skin epidermolysis, diarrhea and leucopenia. Skin biopsy showed epidermal dyskeratosis with epithelial necrosis, a lesion consistent with GvHD. Despite immunosuppressive therapy, the patient died within 24 days. In our observation, GvHD occurred although five HLA compatibilities were identified between the donor and the recipient, an apparently favorable and uncommon situation. This case further supports the qualification that LT may be complicated by GvHD and strongly suggests that minor rather than major histocompatibility antigens are the main target of allogeneic interactions of GvHD. The involvement of chimerism in GvHD is controversial and requires further investigation.

Journal Article
TL;DR: The presence of increased days of vomiting and occult blood in stools suggests that rotavirus causes a more invasive process in the intestinal mucosa of transplant recipients compared to immunocompetent children; however, the process remains self-limited despite the use of potent immunosuppressives.
Abstract: A retrospective survey of nosocomial rotavirus infection in pediatric liver transplant recipients was performed. Immunocompetent children with nosocomial infections served as controls. Co-pathogens were not identified. A total of 12 transplant cases and 12 controls could be evaluated. New onset vomiting occurred in 7/8 cases and 6/11 controls, lasting an average of 2.8 days per case and 0.8 days per control. Fever (> 38 degrees C) was noted in 8/12 cases and 9/12 controls. New onset occult blood was noted in 7/11 cases and 1/12 controls (p < .01). A concomitant rise and fall in transaminases was noted in 5/12 transplant recipients. Eleven of the 12 were maintained on constant or increased immunosuppression doses without the development of fulminant disease. The presence of increased days of vomiting and occult blood in stools suggests that rotavirus causes a more invasive process in the intestinal mucosa of transplant recipients compared to immunocompetent children. However, the process remains self-limited despite the use of potent immunosuppressives.

Journal Article
TL;DR: The measurement of malondialdehyde suggests that the products of oxygen free radical damage can be measured during renal transplantation, and that they may have an adverse effect on early graft function.
Abstract: We prospectively measured malondialdehyde (MDA), a marker of free radical oxygen damage, during 44 renal transplant operations. When corrected for intra-operative changes in plasma volume, there was a significant increase in the ratio of MDA to total cholesterol (x10^3), from a median of 0.32 (interquartile range 0.24-0.44) to 0.39 (0.31-0.50) at 30 minutes following reperfusion. The increase was greater in patients with poor early graft function (serum creatinine > 250 umol/l at the end of the 1st post-operative week), mean 0.32 (sem 0.08) at 30 min and 0.32 (0.09) at 60 min, compared to those with good function (serum creatinine < 250 umol/l), 0.12 (0.05) and 0.10 (0.04) respectively, p < 0.05. This suggests that the products of oxygen free radical damage can be measured during renal transplantation, and that they may have an adverse effect on early graft function.

Journal Article
TL;DR: The presence of anti-HCV does not appear to alter long-term patient or graft survival, and histologic evidence of severe chronic liver disease was rare in anti-HCV positive patients with chemical hepatitis.
Abstract: To assess the prevalence and long-term impact of HCV on kidney transplant recipients, we assayed 716 pre-transplant sera using a first-generation ELISA. The anti-HCV positive sera were confirmed by a 6-antigen radioimmunoassay (RIA). Patients were followed up for 5 years. Graft survival, function, evidence of chemical hepatitis (AST > 2x normal), patient mortality and cause of death were evaluated. The prevalence of anti-HCV antibody was 10.3%. In the 638 patients who were followed up for 5 years, there were no differences in graft function, graft survival, overall mortality, or death from sepsis or liver disease. Peak AST levels were significantly higher in anti-HCV positive patients compared to anti-HCV negative patients. At 5 years, the AST levels remained significantly higher in the anti-HCV positive group; however, this was only 6 U/l above normal. Liver biopsies performed 3 to 7 years post-transplant in 80% of anti-HCV positive patients with chemical hepatitis showed 12% CAH, 50% mild hepatitis and 38% normal histology. Six (9.7%) patients seroconverted from anti-HCV positive to anti-HCV negative 2 to 5 years post-transplant. The presence of anti-HCV does not appear to alter long-term patient or graft survival, and histologic evidence of severe chronic liver disease was rare in anti-HCV positive patients with chemical hepatitis. From these results, the presence of anti-HCV antibody should not preclude kidney transplantation.

Journal Article
TL;DR: Predictive factors for long-term GFR were evaluated in consecutive SPK recipients using a Tc99m DTPA GFR reference method between 90 days and 6 years after transplantation, and the derived formula was more accurate in estimating GFR in SPK recipients than six published predictive methods.
Abstract: Impairment of glomerular filtration rate (GFR) after simultaneous pancreas and kidney (SPK) transplantation is an important marker of chronic renal rejection and recurrence of diabetic glomerulopathy. The use of unmodified serum creatinine to estimate GFR, however, is limited by variations in muscle mass. In this study, predictive factors for long-term GFR were evaluated in consecutive SPK recipients (n = 33) using a Tc99m DTPA GFR reference method between 90 days and 6 years after transplantation (n = 136 measurements). Substantial variability between serum creatinine and isotopic GFR after SPK (R2 = 0.30) highlighted the inaccuracy of an unmodified serum creatinine in the evaluation of GFR. Factors which predicted GFR apart from serum creatinine included age, sex, height and body weight. A detailed formula was derived for accurate estimation of GFR (ml/min) = [71.4 (male) or 50.4 (female)] + 5520/creatinine (µmol/l) + 0.27 x body weight (kg) - 0.50 x age (yr) - 0.29 x height (cm). This formula was more accurate in estimation of GFR in SPK recipients than six published predictive methods which were derived from chronic renal failure patients using creatinine clearance. All of these methods overestimated GFR at lower levels of renal function. Most correlated poorly with Tc99m DTPA GFR and contained a generalized systematic overestimation of GFR which ranged from 4.7 to 8.4 ml/min (p < 0.05). A simplified version for rapid calculation was also derived as GFR (ml/min) = [25 (male) or 5 (female)] + 5000/creatinine (µmol/l). (ABSTRACT TRUNCATED AT 250 WORDS)
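For readers who want to apply the two regression equations quoted above, a short Python transcription follows; the function names and the worked example are mine, and since the formulas were derived in 33 SPK recipients they should not be treated as a general-purpose GFR estimator.

def gfr_full(creatinine_umol_l: float, weight_kg: float, age_yr: float,
             height_cm: float, male: bool) -> float:
    """Detailed estimate of GFR (ml/min) using the study's regression formula."""
    intercept = 71.4 if male else 50.4
    return (intercept
            + 5520.0 / creatinine_umol_l
            + 0.27 * weight_kg
            - 0.50 * age_yr
            - 0.29 * height_cm)

def gfr_simplified(creatinine_umol_l: float, male: bool) -> float:
    """Simplified version intended for rapid calculation."""
    return (25.0 if male else 5.0) + 5000.0 / creatinine_umol_l

# Hypothetical example: a 40-year-old, 70 kg, 170 cm male with creatinine 120 umol/l
print(round(gfr_full(120, 70, 40, 170, male=True), 1))   # -> 67.0 ml/min
print(round(gfr_simplified(120, male=True), 1))          # -> 66.7 ml/min

In this example the two versions agree to within about 0.3 ml/min, consistent with the simplified form being intended only for rapid calculation.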

Journal Article
TL;DR: The introduction of cyclosporine into widespread clinical use has resulted in improved patient survival following cardiac transplantation and the incidence of renal allograft rejection in the heart transplant patients was 10-fold less than that of the renal transplant controls.
Abstract: The introduction of cyclosporine into widespread clinical use has resulted in improved patient survival following cardiac transplantation. As a result of increased numbers of cardiac transplants, the inherent nephrotoxicity of cyclosporine, and prolonged patient survival, cardiac transplant recipients commonly present with renal dysfunction. In the subgroup who ultimately develop end-stage renal disease (ESRD), therapeutic options include renal transplantation. However, the clinical course associated with this treatment modality is unknown. From 1980 to 1993, 430 cardiac transplants were performed with cyclosporine-based immunosuppression at the Stanford University Medical Center. Fourteen (3.3%) patients developed ESRD, requiring chronic dialysis or renal transplantation. The cause of ESRD was cyclosporine nephropathy (13/14; 93%) and glomerulonephritis (1/14; 7%). The average time interval to the development of ESRD was 82 ± 42 months. Nine patients underwent renal transplantation. During the period of follow-up (38 ± 27 months; range 6-89 months) after renal transplantation, cardiac function remained stable. There were no episodes of primary nonfunction of the renal allograft. Patient and renal allograft survival was 89% at both 1 and 3 years after renal transplant. Average serum creatinine was 1.3 ± 0.6 mg/dl at 1 year and 1.6 ± 0.8 mg/dl at 3 years post-transplant. The incidence of infectious complications was not statistically different when compared to that of the heart transplant controls and that of a group of cadaveric renal transplant controls (n = 20). Surprisingly, the incidence of renal allograft rejection in the heart transplant patients was 10-fold less than that of the renal transplant controls (0.006 ± 0.02/patient-year vs. 0.062 ± 0.05/patient-year; p<0.01). Longitudinal comparison of cyclosporine trough levels in the two groups showed no statistical difference. These data indicate that interval renal transplantation in cardiac transplant recipients can be performed with minimal morbidity and mortality. The renal allografts function well over an extended period of time while cardiac allograft function remains stable.

Journal Article
TL;DR: A management strategy is hypothesized based on the donor graft weight-to-recipient body weight ratio (GRBWR) and the Redox Tolerance Index (RTI), in which favorable recipients are candidates for left liver orthotopic transplantation while compromised recipients might be suitable for a left liver auxiliary-heterotopic transplantation.
Abstract: Living-related liver transplantation (LRLT) in pediatric recipients has been successful with 90% 1-year survival (1, 2). LRLT in adults is yet to succeed. We hypothesize a management strategy based on the donor graft weight-to-recipient body weight ratio (GRBWR) and the Redox Tolerance Index (RTI) (3). Favorable recipients (GRBWR above 0.7%) are candidates for left liver orthotopic transplantation. Unfavorable recipients (GRBWR 0.5-0.7% and RTI above 0.5%) are suitable for left liver auxiliary-orthotopic transplantation. Compromised recipients (GRBWR below 0.5% or RTI below 0.5%) might be suitable for a left liver auxiliary-heterotopic transplantation. The above hypothesis remains to be validated.
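The proposed triage rule lends itself to a literal encoding; the Python sketch below is illustrative only: the thresholds are taken from the abstract (GRBWR and RTI expressed as the percentage values quoted there), the function and argument names are mine, and, as the authors stress, the hypothesis is unvalidated.

def lrlt_strategy(grbwr_pct: float, rti_pct: float) -> str:
    """Suggested left-liver graft placement for an adult LRLT recipient (hypothesis only)."""
    if grbwr_pct > 0.7:
        return "favorable: left liver orthotopic transplantation"
    if 0.5 <= grbwr_pct <= 0.7 and rti_pct > 0.5:
        return "unfavorable: left liver auxiliary-orthotopic transplantation"
    return "compromised: left liver auxiliary-heterotopic transplantation"

# Hypothetical example: a 60 kg recipient offered a 360 g left-lobe graft (GRBWR = 0.6%), RTI = 0.8
print(lrlt_strategy(grbwr_pct=0.6, rti_pct=0.8))  # -> unfavorable branch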

Journal Article
TL;DR: In general, the indication for simultaneous organ transplantation should be considered earlier than for transplantation involving only one organ, because patients with liver cirrhosis have a very poor prognosis due to their poor overall clinical state at the time of terminal renal failure.
Abstract: Kidney or liver transplantation (KTx, LTx) today is a standard therapeutic procedure if one of these organs fails. However, the need for transplantation of both organs may arise with deterioration of organ function. To evaluate the success of combined LTx/KTx we analyzed 20 patients (aged 14-64) who received a total of 21 LTx and 31 KTx. Simultaneous LTx/KTx was performed in 14 patients, of whom 5 required further replacement of one or the other of the grafted organs. Six patients had sequential transplantation: 3 had LTx prior to the KTx, and 3 KTx prior to LTx. In 12 patients the indication for LTx was end-stage liver cirrhosis, and of these 8 died after LTx, mostly of infections. In a group of 8 transplant recipients without liver cirrhosis (e.g. polycystic liver), only 1 patient died. Eleven of the 20 grafted patients are still alive now (follow-up after LTx 14-120 months). Episodes of liver and kidney rejection occurred in only 30% and 15% of transplanted patients respectively. Only 1 patient is back on hemodialysis, the others have normal liver and kidney function. Combined LTx/KTx may be successful in appropriate circumstances. However, patients with liver cirrhosis have a very poor prognosis due to their poor overall clinical state at the time of terminal renal failure. In contrast, patients without liver cirrhosis are better candidates, even for simultaneous LTx/KTx. In general, the indication for simultaneous organ transplantation should be considered earlier than for transplantation involving only one organ.

Journal Article
TL;DR: The IBMTR and ABMTR offer a unique resource for examining the role of allogeneic and autologous transplants in cancer treatment; consideration of the relative efficacy of conventional therapy in specific settings, while also important, is not addressed in this article.
Abstract: Bone marrow transplants are an effective treatment for many life-threatening diseases. Considerations for use include potential for cure, availability of a suitable donor, feasibility of using autologous stem cells, and risks of transplant-related mortality. Not addressed in this article, but also important, is consideration of the relative efficacy of conventional therapy in specific settings. The IBMTR and ABMTR offer a unique resource for examining the role of allogeneic and autologous transplants in cancer treatment.

Journal Article
TL;DR: Mayo risk score, Child's score, nutritional status, Karnofsky score, INR, serum levels of bilirubin and creatinine, presence of renal failure, and gastrointestinal bleeding were significant in the univariate analysis, and 3 preoperative variables (Karnofsky score of 40 or less, poor nutritional status, and renal failure) were significant independent predictors of resource utilization.
Abstract: Orthotopic liver transplantation (OLT) has been shown to be effective in prolonging life and improving its quality in patients with end-stage liver disease. However, it remains one of the most expensive surgical procedures performed today. In an era when economic efficiency and financial accountability are being emphasized, it is imperative to consider resource utilization in evaluating candidates for OLT. We prospectively followed 106 patients who underwent OLT at the Mayo Clinic for primary biliary cirrhosis and primary sclerosing cholangitis between 1990 and 1994. Hospital and professional charges for the initial hospitalization were obtained on all patients. Univariate and multivariate models were constructed using preoperative clinical variables that had been previously found to be important in predicting clinical outcomes. The preoperative variables considered were age, gender, diagnosis of liver disease, Mayo risk score, Child's score, nutritional status, Karnofsky score, INR, serum levels of albumin, bilirubin, and creatinine, and the presence/absence of ascites, edema, encephalopathy, renal failure (serum creatinine >2.0) and gastrointestinal bleeding. Of the 106 patients, 3 were excluded from the analysis because they received multiple transplants during the initial hospitalization. Of the hospital charges we analyzed, the surgical fee for transplantation and donor acquisition expense were fixed in advance and, therefore, excluded. The following preoperative variables were found to be significant in the univariate analysis: Mayo risk score, Child's score, nutritional status, Karnofsky score, INR, serum levels of bilirubin and creatinine, presence of renal failure, and gastrointestinal bleeding. In the multivariate analyses, Karnofsky score of 40 or less was associated with a 48% increase in total charges. Poor nutritional status and renal failure were associated with a 34% and 31% increase, respectively. We identified 3 preoperative variables as significant independent predictors of resource utilization in liver transplantation. In an effort to maximize the economic efficiency with which liver transplantation is performed, we believe these factors should be taken into consideration in determining both the timing of transplantation and the suitability of potential transplant recipients.
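To make the reported effect sizes concrete, the Python sketch below combines the three independent predictors' percentage increases in initial-hospitalization charges; the multiplicative combination is my own assumption for illustration, since the abstract reports each increase separately and does not say how they interact, and the dictionary keys are hypothetical names.

CHARGE_MULTIPLIERS = {
    "karnofsky_40_or_less": 1.48,     # Karnofsky score of 40 or less: +48% total charges
    "poor_nutritional_status": 1.34,  # poor nutritional status: +34%
    "renal_failure": 1.31,            # renal failure (serum creatinine > 2.0): +31%
}

def estimated_relative_charges(**risk_flags: bool) -> float:
    """Relative charges versus a recipient with none of the three risk factors (assumed multiplicative)."""
    factor = 1.0
    for name, present in risk_flags.items():
        if present:
            factor *= CHARGE_MULTIPLIERS[name]
    return factor

print(round(estimated_relative_charges(karnofsky_40_or_less=True, renal_failure=True), 2))  # -> 1.94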

Journal Article
TL;DR: Acute rejection and cyclosporine dose are the major factors influencing long-term renal function after steroid withdrawal, and there is an inverse relationship between cyclosporine dose and serum creatinine concentration for up to 5 years.
Abstract: We retrospectively measured changes in serum creatinine concentration as estimates of changes in renal function in 96 renal transplant recipients who were withdrawn from steroid therapy, maintained on cyclosporine and azathioprine, and followed for 1 to 5 years. Multivariate analyses were used to assess the influence of cyclosporine dose and blood levels, azathioprine dose, age, sex, race, diabetes, HLA match and mismatch, PRA, and history of rejection following steroid withdrawal on long-term allograft function. Results indicate that acute rejection and cyclosporine dose are the major factors influencing long-term renal function after steroid withdrawal. In this setting, there is an inverse relationship between cyclosporine dose and serum creatinine concentration for up to 5 years. Optimal renal function is achieved in patients receiving more than 5.5 mg/kg of cyclosporine per day at the time of steroid withdrawal.

Journal Article
TL;DR: Early diagnosis, institution of intravenous acyclovir 10 mg/kg tds, zoster immunoglobulin, cessation of azathioprine treatment and aggressive supportive care may improve an otherwise bleak prognosis.
Abstract: Disseminated primary varicella zoster infection in renal transplant patients can result in severe and often fatal illness. The disease tends to be much more severe in adults with an 80% mortality in the only reported series (1). We report 3 cases of severe disseminated varicella zoster in adult renal transplant patients who all survived. Early diagnosis, institution of intravenous acyclovir 10 mg/kg tds, zoster immunoglobulin, cessation of azathioprine treatment and aggressive supportive care may improve an otherwise bleak prognosis.

Journal Article
TL;DR: This work has used liver transplantation in combination with cardiac transplantation to care for selected patients with end stage disease of both organs, and has been remarkably successful for the 3 patients transplanted at the University of Chicago.
Abstract: Over the past 5 years, we have employed several strategies to increase the donor pool for both the pediatric and adult populations. The innovative expansion of the donor pool with the use of living-related donors for children and cadaveric, high-risk donors for adults has increased our ability to serve our recipients and transplant them at an earlier stage in the disease process, thereby improving survival. As Hepatitis C is now the leading indication for liver transplantation in the adult population, the investigation of the natural history of Hepatitis C prior to and after transplantation provides a major challenge and is currently a focus of both laboratory and clinical efforts. For those recipients of Hepatitis C-positive-donor livers, determining the role of recipient and donor genotypes in the progression of recurrent hepatitis should help define the proper utilization of these organs. For patients on CsA-based immunosuppression regimens who experience steroid-resistant rejection, tacrolimus has proved to be extremely effective in reversing the rejection episodes and maintaining normal graft function. The long-term results of this therapy appear to be superior to OKT3 therapy. The recipients of living-related liver transplantation continue to have a survival advantage in comparison to recipients of cadaveric grafts. The donor operation can be routinely performed with minimal risk. Because of the superior results achieved and minimal donor risks, we feel that providing the option of living-donor transplantation is ethically justified and medically necessary. Despite the encouraging results from living-donor transplantation, unexpected complications including portal vein complications and hepatic artery thrombosis have forced technical modifications of the original technique which may have implications for pediatric liver transplantation in general. As the volume of pediatric liver transplants and the number of immunosuppressive regimens have increased over the years, posttransplant lymphoproliferative disease has been identified as a problem which requires more inspection. We have determined that the severity of rejection and the subsequent treatment, and primary Epstein-Barr virus infection are the primary risk factors for the development of PTLD. Identification of the risk factors and early detection may provide some hope for treatment of the disease while allowing long-term graft function. Results of our preliminary data show that, following transplantation, 3/4 of the children or parents report minimal impairment with regard to developmental or physical milestones. Patients and their families, however, continue to report significant levels of stress in their lives and occasional pain. Further research on outcome needs to be performed on our pediatric recipients to ensure the long-term benefit of our efforts.

Journal Article
TL;DR: Irrespective of the immunosuppressive regimen, the incidence of early postoperative neurotoxicity was significantly lower in patients transplanted owing to HBV disease, alcoholic cirrhosis and various other liver diseases considered together than in patients transplanted due to HCV disease receiving FK506 therapy.
Abstract: Since we may soon be able to choose between primarily CsA- or FK506-based immunosuppression, it is important to establish the superior immunosuppressive agent for the individual patient. In the present study, 121 patients, 61 randomly assigned to FK506- and 60 assigned to CsA-based immunosuppression, were analyzed according to the primary diagnosis for liver transplantation. One-year patient survival was similar in all groups. However, the incidence and severity of acute rejection within the 1st year after transplantation was significantly higher in patients transplanted due to HCV disease who were receiving FK506 (58.8%) compared with those patients receiving CsA (27.8%; p≤0.05). Furthermore, the incidence of moderate and severe neurotoxicity was significantly higher during the 1st month after LTX in patients transplanted owing to HCV disease treated with FK506 (35.3%) compared with those patients receiving CsA (16.7%; p≤0.05). Irrespective of the immunosuppressive regimen, the incidence of early postoperative neurotoxicity was significantly lower in patients transplanted owing to HBV disease, alcoholic cirrhosis and various other liver diseases considered together than in patients transplanted due to HCV disease receiving FK506 therapy. During the 1st year, the incidence and severity of rejection in patients transplanted due to alcoholic cirrhosis and PBC was significantly lower in patients treated with FK506 (11.1% for both groups) compared with those patients receiving CsA (54.5% and 60.0%, respectively; p≤0.05). Furthermore, this was accompanied by a lower incidence of toxicity. Therefore, our results indicate that morbidity, according to the incidence and severity of rejection and neurotoxicity, was significantly higher in patients transplanted owing to HCV disease treated with FK506 compared with those patients receiving CsA. FK506 demonstrated superior immunosuppressive potency in patients transplanted because of alcoholic cirrhosis and PBC regarding incidence and severity of rejection, as well as incidence of neuro- and nephrotoxicity.

Journal Article
TL;DR: After a decade of little change, many changes in the immunologic management of organ transplant recipients are now imminent: basic science is revealing mechanisms and developing new reagents, and clinicians are identifying opportunities for application of this new knowledge.
Abstract: After a decade of little change, many changes in the immunologic management of organ transplant recipients are now imminent. Basic science is revealing mechanisms and developing new reagents, and clinicians are identifying opportunities for application of this new knowledge. The interaction between the laboratory and the clinic, and the renewed interest in clinical trials, guarantee that transplantation practice will evolve rapidly in the next few years. The examples mentioned here highlight the scope of the possibilities for progress.

Journal Article
TL;DR: It is concluded that APACHE II scoring may be useful in predicting outcome in post-operative liver transplant recipients, but is not useful in stratifying risk in renal transplant recipients due to the inherently low mortality involved.
Abstract: Over a 26-month period we assessed the ability of APACHE II, scored on admission to the surgical intensive care unit (SICU), to predict the in-hospital mortality of liver and kidney transplant recipients either post-operatively or after subsequent complications, and compared these results to non-transplant SICU admissions. There were 866 SICU admissions, of which 128 were liver transplant recipients, 112 were renal transplant recipients, 211 were trauma admissions and 415 were non-transplant/non-trauma admissions. In-hospital mortalities among all liver transplant admissions were 0%, 10%, 38%, and 82% for APACHE II ranges of 0-10, 11-20, 21-30 and > 30, respectively, with differences between the second and third, and third and fourth ranges significant (p ≤ 0.05 by chi-square analysis). These differences were also seen when examining scores following the primary transplantation alone. Mortalities in corresponding APACHE II ranges for trauma and nontransplant/nontrauma admissions were similar. APACHE II scoring was not useful for renal transplant recipients, as it consistently overpredicted mortality. We conclude that APACHE II scoring may be useful in predicting outcome in post-operative liver transplant recipients, but is not useful in stratifying risk in renal transplant recipients due to the inherently low mortality involved.
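As a quick reference, the Python sketch below encodes the observed in-hospital mortality of liver transplant SICU admissions by admission APACHE II band reported above; it is a lookup of this cohort's figures, not a validated prediction model, and the names are mine.

LIVER_TX_MORTALITY_BY_APACHE_BAND = [
    (range(0, 11), 0.00),   # APACHE II 0-10
    (range(11, 21), 0.10),  # APACHE II 11-20
    (range(21, 31), 0.38),  # APACHE II 21-30
]
MORTALITY_ABOVE_30 = 0.82   # APACHE II > 30

def observed_liver_tx_mortality(apache_ii: int) -> float:
    """Return the cohort's observed in-hospital mortality for a given admission APACHE II score."""
    for band, rate in LIVER_TX_MORTALITY_BY_APACHE_BAND:
        if apache_ii in band:
            return rate
    return MORTALITY_ABOVE_30

print(observed_liver_tx_mortality(24))  # -> 0.38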

Journal Article
TL;DR: The goal of transplantation is to provide organs to all with long-term survival of the graft and efforts that can be made to increase and sustain transplant activity in Pakistan require a concerted effort on the part of the government, society and the medical profession.
Abstract: The economic indicators of Pakistan show that the GNP is $70 billion, foreign exchange reserves stand at $8.0 billion and foreign debt at more than $36 billion. Against this backdrop, the government is unlikely to provide state-of-the-art facilities for management of end-stage organ failure. The unequal distribution of wealth leaves more than 40% below the poverty line. Economic solutions are based on temporary fixes where foreign aid and loans keep the government machinery operational. Many of the basic health measures such as immunization are also foreign funded. Under such a scenario, local philanthropy has come to play a vital role. SIUT developed a model based on self-help: a community-government partnership, where the doctor plays the pivotal role and the beneficiary is the patient. SIUT acquired funds by developing a community-government partnership. The government fulfills about 40% of the total budget and the rest comes from the community as donations. The scheme has been extremely successful in providing free medical care and renal support to thousands of patients. It has been sustained over the past 15 years through complete transparency, public audit and accountability. These confidence-building measures stimulate the community to come forward and donate money, equipment and medicines. The goal of transplantation is to provide organs to all with long-term survival of the graft. The emerging challenges to achieve this goal and efforts that can be made to increase and sustain transplant activity in Pakistan require a concerted effort on the part of the government, society and the medical profession.

Journal Article
TL;DR: Overall, recipient category was the most significant factor (relative risk for graft loss being significantly lower for SPK than for PAK and PTA cases) and other variables also had an impact on results depending on the recipient category.
Abstract: As of 1995, more than 7,500 pancreas transplants had been reported to the International Pancreas Transplant Registry. More than 5,300 were performed in the United States, including more than 4,000 since the inception of the UNOS Registry in October 1987. The bladder drainage (BD) technique has been the most widely used duct management technique since 1987, with 93% of all cases. In the overall analysis of US BD cadaveric pancreas transplants reported to the registry, patient survival and pancreas functional graft survival rates were 91% and 75% respectively at one year, 88% and 72% at 2 years, and 85% and 67% at 3 years. When the 1987-95 US data for primary BD cases were analyzed according to the three major recipient categories [simultaneous pancreas/kidney transplants (SPK) (n=3,539); pancreas after kidney transplants (PAK) (n=238); and pancreas transplants alone (PTA) (n=175)], patient survival rates were no different (91%, 92% and 91% at one year, respectively), but pancreas graft survival rates were significantly higher in the SPK than in the PAK and PTA categories (78%, 56%, and 55% at one year, respectively). In the SPK group, kidney graft survival rates at one year were 86%. An improvement in graft survival could be shown over the analyzed time period for all categories. Outcomes were also compared according to whether induction immunotherapy in recipients included ALG/ATG/ATS, OKT3 or neither. In the primary SPK category, patients who received OKT3 (n=1,416) showed the best outcome with one-year graft survival rates of 83%, followed by ALG/ATG/ATS (n=1,559) with 78%. Patients who received neither (n=410) had a significantly lower graft survival rate. In the primary PAK category, the use of OKT3 (n=49) was associated with lower graft survival rates than when ALG/ATG/ATS (n=143) or neither (n=40) were given, 51%, 66%, and 55% at one year, respectively. In the PTA category, the use of ALG/ATG/ATS (n=93) was associated with significantly higher graft survival rates than when OKT3 (n=62) or neither (n=9) were used, being 63%, 58%, and none, respectively, at one year. No negative impact of longer preservation time could be found in the univariate analysis. The effect of HLA-A, B and DR mismatching on outcome for primary US cases was also determined. Again the results differed according to recipient category. For SPK cases, there was only a beneficial effect of a perfect 6-antigen match (n=21) compared to 1 and 2-6 mismatches, being 85%, 73% and 78% at one year, respectively. In the primary PAK category, graft survival rates were significantly higher in those mismatched for 0 (n=6) and one (n=25) than for 2-6 (n=195) HLA antigens, being 100%, 76% and 58% at one year. In the primary PTA category there were no zero-mismatch, technically successful cases. One-year graft survival rates were 70% (n=10) for the category with one mismatch and 55% (n=157) in the 2-6 antigen mismatch group. Cox multivariate analyses of the US data base showed that, overall, recipient category was the most significant factor (relative risk for graft loss being significantly lower for SPK than for PAK and PTA cases). Other variables also had an impact on results depending on the recipient category. Recipient age had an impact on patient survival as well as graft survival. It was most influential in the SPK and PAK categories, but an effect was not seen in the PTA category. In both the PAK and PTA categories, minimizing HLA mismatches was associated with a significantly lower risk for graft loss. In the SPK and PTA categories, anti-T-cell therapy significantly lowered the risk of graft loss. In the PAK and SPK categories the transplant outcome improved significantly over the analyzed time period. Patient survival also improved over time.

Journal Article
TL;DR: The increase of hematopoietic stem cell transplants as a therapeutic modality over the last 20 years in Europe is illustrated by the increasing number of teams performing transplants.
Abstract: This report details the evolution of bone marrow transplantation in Europe over a 20-year period. In 1973, 8 teams undertook a total of 16 allogeneic bone marrow transplants; in 1983, 97 teams performed 1353 transplants. In 1993, the numbers had risen to 260 teams and 7737 transplants. Donor source in 3092 cases was an allogeneic donor (2464 HLA-identical sibling transplants, 147 non-identical family donor transplants, 25 twin donor transplants and 456 unrelated donor transplants). For 4645 patients the transplant was autologous (2450 autologous bone marrow transplants, 1830 autologous peripheral blood stem cell transplants and 365 combined autologous peripheral blood and bone marrow transplants). Indications for transplants in 1993 were leukemias in 3419 patients (44%; 2332 allogeneic, 1087 autologous), lymphoproliferative disorders in 2666 patients (34%; 197 allogeneic, 2469 autologous), solid tumors in 1077 patients (14%; 9 allogeneic, 1068 autologous), aplastic anemia in 251 patients (3%; 250 allogeneic, 1 autologous), inborn errors in 244 patients (3%; 242 allogeneic, 2 autologous) and miscellaneous disorders in 80 patients (1%; 62 allogeneic, 18 autologous). These data illustrate the increase of hematopoietic stem cell transplants as a therapeutic modality over the last 20 years in Europe.