pubmed-article:8116051 | pubmed:abstractText | To determine whether cardiac allograft outcome is improved among patients with fewer HLA-DR mismatches with their donors, we studied 132 recipients of a primary cardiac allograft transplanted between December 1985 and December 1991. These recipients and their donors all had high-confidence-level serological HLA-DR typing, previously shown to correlate highly with DNA DR typing. Patients were divided into two groups based on the number of HLA-DR mismatches with their donors: group I consisted of 78 patients with zero or one DR mismatch, and group II of 54 patients with two DR mismatches. Allograft outcome measures included incidence of moderate rejection, incidence of allograft vasculopathy at 12 months, cardiac function measured as left ventricular ejection fraction (LVEF) and cardiac index (CI), and actuarial graft survival up to 7 years. Groups I and II did not differ with regard to recipient age, donor age, ischemia time, pulmonary vascular resistance, sex, or PRA greater than 0%. Group II had a higher incidence of moderate rejection on the first-week biopsy (47% vs. 25%, P = 0.019) and during the first month (84% vs. 58%, P = 0.006), but no difference was found in the frequency of rejection from months 2 to 12. LVEF did not differ between the groups at any time point. CI was better in group I at 12 months (2.76 vs. 2.5, P = 0.03). No statistically significant difference was found in the incidence of allograft vasculopathy (17% vs. 26%, P = 0.204). Actual graft survival at 1 year was better for group I (91% vs. 74%, P = 0.008), and actuarial graft survival at 6 years also favored group I (76% vs. 56%, P = 0.04). Using high-confidence-level serological HLA-DR typing assignments, we demonstrated that HLA-DR mismatch correlates highly with cardiac allograft outcome. These findings imply that heart transplant survival could be improved if prospective HLA-DR matching were feasible and prioritized, or if immunosuppression were tailored to the degree of HLA-DR match. | lld:pubmed |