The Urology Portal in Spanish for professionals


This month in... Clinical Transplantation

  • Subsequent Malignancies After Allogeneic Hematopoietic Stem Cell Transplantation
    We evaluated 979 patients for the development of post-transplant lymphoproliferative disease (PTLD) and solid malignancies as late complications of allogeneic hematopoietic stem cell transplantation (allo-HSCT). We found 15 (1.5%) subsequent malignancies; three of these were PTLD and twelve were solid tumors. The median time from allo-HSCT to the development of PTLD was 9 (range 3-20) months, and from allo-HSCT to the development of solid tumors 93 (range 6-316) months. The cumulative incidence of subsequent malignancy was 1.3% (±0.5 SE) at 5 years and 3.9% (±1.2 SE) at 10 years; in patients with benign hematological diseases as the transplant indication it was 7.4% (±4.2 SE) at 5 years. Subsequent malignancies developed more often in patients with ≥1 year of chronic GVHD (3.7% vs 0.7% in the <1 year group, p=0.002), and the risk of subsequent epithelial tumors was likewise higher in the ≥1 year chronic GVHD group (3.7% vs 0.1%, p<0.001). In multivariate analysis, benign hematological disease as the transplant indication (RR 5.6, 95% CI 1.4-22.3, p=0.015) and ≥1 year of chronic GVHD (RR 7.1, 95% CI 2.3-22.5, p=0.001) were associated with the development of subsequent malignancy. This article is protected by copyright. All rights reserved.
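The 5- and 10-year figures above are product-limit (Kaplan-Meier-style) estimates. As a minimal sketch of that calculation (the toy cohort below is invented, not the study's data), cumulative incidence at a horizon is one minus the Kaplan-Meier survival estimate:

```python
def kaplan_meier_cumulative_incidence(times, events, horizon):
    """1 - Kaplan-Meier S(horizon). times: follow-up in months;
    events: 1 = subsequent malignancy observed, 0 = censored."""
    at_risk = len(times)
    survival = 1.0
    for t, event in sorted(zip(times, events)):
        if t > horizon:
            break
        if event:
            survival *= (at_risk - 1) / at_risk
        at_risk -= 1
    return 1.0 - survival

# Invented toy cohort of 8 patients; malignancies at 9 and 93 months:
times = [9, 24, 36, 60, 93, 120, 150, 200]
events = [1, 0, 0, 0, 1, 0, 0, 0]
print(kaplan_meier_cumulative_incidence(times, events, 120))  # 0.34375
```

Note that in real HSCT cohorts, death without malignancy is usually handled as a competing risk, which the plain Kaplan-Meier complement shown here would overestimate.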
  • Early Post-Operative Management After Lung Transplantation: Results of an International Survey
    Introduction Few data exist regarding optimal post-operative therapeutic strategies after lung transplant (LTx). Current practice patterns rely on expert opinion and institutional experience, resulting in non-uniform post-operative care. To better define current practice patterns, an international survey of LTx clinicians was conducted. Methods A 30-question survey was sent to transplant clinicians via email to the ISHLT open forum mailing list and directly to the chief transplant surgeon and pulmonologist of all LTx centers in the United States. Results Fifty-two clinicians representing 10 countries responded to the survey. Sedative use patterns included: opiates + propofol (57.2%), opiates + dexmedetomidine (18.4%), opiates + intermittent benzodiazepines (14.3%), opiates + continuous benzodiazepines (8.2%), and opiates alone (2%). 40.4% of respondents reported that no formal sedation scale was followed, and 13.5% of programs had no formal policy on sedation and analgesia. A lung-protective strategy was commonly employed, with 13.8%, 51.3%, and 35.9% of respondents using tidal volumes of <6 mL/kg ideal body weight (IBW), 6 mL/kg IBW, and 8 mL/kg IBW, respectively. Conclusion Practice patterns in the early post-operative care of lung transplant recipients differ considerably among centers. Many of the reported practices do not conform to consensus guidelines on the management of critically ill patients.
  • Diabetic Kidney Transplant Recipients: Impaired Infection Control and Increased Alloreactivity
    Background Posttransplantation diabetes mellitus (PTDM) has been associated with inferior patient and allograft outcomes, but previous studies did not identify differences in infection control and alloreactivity. Methods We studied 449 kidney transplant recipients (KTRs) between 2005 and 2013. Fifty (11.1%) KTRs were diagnosed with PTDM and 60 (13.4%) had pre-existing diabetes. Samples were collected pretransplantation and at +1, +2, and +3 months posttransplantation. CMV-specific and alloreactive T-cells were quantified by interferon-γ Elispot assay, and lymphocyte subpopulations by flow cytometry. Results Upon multivariate analysis, age, obesity, and the use of tacrolimus increased the risk of PTDM (p<0.05). KTRs with pre-existing diabetes/PTDM showed higher rates of sepsis (p<0.01). Total CD3+ and CD4+ T-cell counts were significantly lower in KTRs with PTDM/pre-existing diabetes posttransplantation (p<0.05). No differences were observed for CMV-specific T-cells between any groups (p>0.05). KTRs developing PTDM showed increased frequencies of alloreactive T-cells posttransplantation (p<0.05). Conclusions Our results suggest higher rates of infection in KTRs with pre-existing diabetes/PTDM, which may be attributed to impaired overall immunity. Higher frequencies of alloreactive T-cells contribute to inferior long-term outcomes. Since acute rejection, but not pre-existing diabetes/PTDM, was associated with inferior allograft survival and function, maintaining adequate immunosuppression to prevent rejection seems important.
  • Post-transplant lymphoproliferative disease in lung transplantation: a nested case-control study
    Post-transplant lymphoproliferative disorder (PTLD) may compromise the long-term outcome of lung transplant (LTx) recipients. A case-control study was performed comparing LTx recipients with PTLD (n=31) to matched recipients without PTLD (Controls, n=62). Risk factors for PTLD and post-transplant outcomes were assessed. PTLD prevalence was 3.9%, with a time to PTLD of 323 (166-1132) days; 54.8% had early-onset PTLD versus 45.2% late-onset PTLD. At LTx, more Epstein-Barr virus (EBV)-seronegative patients were present in the PTLD group (42%) compared to Controls (5%) (p<0.0001), most of whom had undergone EBV seroconversion by the time of PTLD diagnosis. EBV viral load was higher in PTLD versus Controls (p<0.0001). Overall, lower hemoglobin and higher C-reactive protein levels were present in PTLD versus Controls (p<0.0001). EBV status at LTx (p=0.0073) and EBV viral load at PTLD (p=0.0002) were the most important risk determinants for later PTLD. PTLD patients demonstrated a shorter time to onset of chronic lung allograft dysfunction (CLAD) (p=0.0006) and poorer 5-year survival post-LTx (66.6% versus 91.5%), resulting in worse CLAD-free survival (HR 2.127, 95% CI 1.006-4.500; p=0.0483) and overall survival (HR 3.297, 95% CI 1.473-7.382; p=0.0037) compared to Controls. Late-onset PTLD had worse survival than early-onset PTLD (p=0.021). Primary EBV infection is a risk factor for PTLD, which is associated with worse long-term outcome post-LTx.
  • Medication adherence and rejection rates in older versus younger adult liver transplant recipients
    A growing number of older adults are undergoing liver transplantation (LT) in the US. In some settings, it is thought that adherence declines with age. This retrospective study examined adherence and clinical outcomes in older versus younger adult LT recipients. Medical records of adult LT recipients from 2009-2012 at a single urban center were reviewed. The medication level variability index (MLVI) was the pre-defined primary outcome, with nonadherence defined as MLVI >2.5. The secondary outcome was incidence of rejection. Outcomes were evaluated starting one year post-LT until 2015. 42/248 patients were ≥65 years old at transplant. Older adults had significantly better adherence than younger ones (65% of those ≥65 were adherent vs. 42% of younger adults; chi-square two-tailed p=0.02). Survival analyses of rejection between age groups, censored by time since transplant, showed no difference among the four age groups (χ2 = 0.84, p=0.84). Older age was not found to be a risk factor for reduced adherence or graft rejection in patients surviving at least one year post-LT.
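The MLVI is conventionally computed as the standard deviation of a series of consecutive immunosuppressant (e.g., tacrolimus) trough levels; a minimal sketch under that assumed definition, using the abstract's >2.5 nonadherence cutoff (the patient values below are hypothetical):

```python
from statistics import stdev

NONADHERENCE_CUTOFF = 2.5  # MLVI threshold used in the study above

def mlvi(trough_levels):
    """Medication level variability index: standard deviation of at least
    three consecutive trough levels (assumed definition)."""
    if len(trough_levels) < 3:
        raise ValueError("MLVI needs at least 3 trough levels")
    return stdev(trough_levels)

# Hypothetical patients (tacrolimus troughs in ng/mL):
steady = [6.1, 6.8, 5.9, 6.4]    # low variability
erratic = [3.0, 9.5, 4.2, 11.0]  # high variability
print(mlvi(steady) > NONADHERENCE_CUTOFF)   # False -> adherent
print(mlvi(erratic) > NONADHERENCE_CUTOFF)  # True -> nonadherent
```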
  • Development of a Human Cadaver Model for Training in Laparoscopic Donor Nephrectomy
    Background The Organ Procurement and Transplantation Network recommends that a surgeon record 15 cases as surgeon or assistant for laparoscopic donor nephrectomy (LDN) prior to independent practice. The literature suggests that the learning curve for improved perioperative and patient outcomes is closer to 35 cases. In this article, we describe our development of a model utilizing fresh tissue and objective, quantifiable endpoints to document surgical progress and efficiency in each of the major steps involved in LDN. Materials and Methods Phase I of model development focused on the modifications necessary to maintain visualization for laparoscopic surgery in a human cadaver. Phase II tested proposed learner-based metrics of procedural competency for multiport LDN by timing the procedural steps of LDN in a novice learner. Results Phases I and II required 12 and 9 cadavers, respectively, with a total of 35 kidneys utilized. The following metrics improved with trial number for multiport LDN: time taken for dissection of the gonadal vein, ureter, renal hilum, and adrenal and lumbar veins; simulated warm ischemic time (WIT); and operative time. Conclusion Human cadavers can be used for training in LDN, as evidenced by improvements in timed learner-based metrics. This simulation-based model fills a gap in available training options for surgeons.
  • Pharmacokinetic and Pharmacogenetic Analysis of Immunosuppressive Agents after Laparoscopic Sleeve Gastrectomy
    Background Severe obesity has been shown to limit access to renal transplantation in patients with end-stage renal disease (ESRD). Laparoscopic sleeve gastrectomy (LSG) has been performed in the ESRD population to assist in achieving waitlist and transplant eligibility. Little is known about how LSG affects the bioequivalence of tacrolimus products and immunosuppression pharmacokinetics. Methods This was a prospective, open-label, single-dose, crossover, two-period pharmacokinetic (PK) study. The purpose of this study was to assess the single-dose PK of immediate-release tacrolimus (IR-TAC), extended-release tacrolimus (ER-TAC), and mycophenolic acid (MPA) in adult ESRD patients post-LSG. Results Twenty-three subjects were included in the 24-hour PK assessments. The ratio of geometric means between ER-TAC and IR-TAC was 103.5% (90% CI 89.6-119.6%) for AUC0-24 and 92.5% (90% CI 80.4-106.4%) for Cmax. PK parameters were similar between ER-TAC and IR-TAC, except for Cmin (p=0.004) and Cmax (p=0.04). MPA AUC0-24 was similar when given with either ER-TAC or IR-TAC (p=0.32). Patients expressing CYP3A5*1 genotypes had lower tacrolimus AUC0-24 values than those with CYP3A5*3/*3 (IR-TAC p<0.001; ER-TAC p=0.008). Genotype did not impact MPA PK. Conclusion Dose modification of immunosuppressants post-LSG may not be necessary aside from standard therapeutic drug monitoring.
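The "ratio of geometric means" reported above is the standard crossover bioequivalence statistic: exponentiate the mean within-subject difference of log-transformed exposures. A sketch with hypothetical paired AUC values (a real analysis would use a t-quantile and a mixed model; the normal quantile here is a simplifying approximation):

```python
from math import exp, log
from statistics import NormalDist, mean, stdev

def geometric_mean_ratio(test_values, ref_values):
    """Point estimate and approximate 90% CI for the ratio of geometric
    means (test/reference), as percentages, from paired observations."""
    diffs = [log(t) - log(r) for t, r in zip(test_values, ref_values)]
    m, s, n = mean(diffs), stdev(diffs), len(diffs)
    z = NormalDist().inv_cdf(0.95)      # ~1.645 for a two-sided 90% CI
    half = z * s / n ** 0.5
    return exp(m) * 100, exp(m - half) * 100, exp(m + half) * 100

# Hypothetical paired AUC0-24 values (ng*h/mL), ER-TAC vs IR-TAC:
er = [210, 185, 250, 195, 230, 205]
ir = [200, 190, 240, 200, 215, 210]
gmr, lo, hi = geometric_mean_ratio(er, ir)
print(f"GMR {gmr:.1f}% (90% CI {lo:.1f}-{hi:.1f}%)")
```

Regulatory bioequivalence conventionally requires the 90% CI of the geometric mean ratio to fall within 80-125%.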
  • Pharmacokinetics of prolonged-release tacrolimus versus immediate-release tacrolimus in de novo liver transplantation: a randomized phase III sub-study
    Background With the same dose of tacrolimus, lower systemic exposure on the first day of dosing has been reported for prolonged-release tacrolimus compared with immediate-release tacrolimus, prompting investigation of differing initial doses. Methods This sub-study of a double-blind, randomized, phase III trial in de novo liver transplant recipients compared the pharmacokinetics of once-daily prolonged-release tacrolimus (initial dose: 0.2 mg/kg/day) versus twice-daily immediate-release tacrolimus (initial dose: 0.1 mg/kg/day) during the first 2 weeks post-transplant. Results Pharmacokinetic data were analysed from patients receiving prolonged-release tacrolimus (n=13) and immediate-release tacrolimus (n=12). Mean systemic exposure (AUC0-24) was higher with prolonged-release versus immediate-release tacrolimus. Dose-normalized AUC0-24 (normalized to 0.1 mg/kg/day) showed generally lower exposure with prolonged-release tacrolimus versus immediate-release tacrolimus. There was good correlation between AUC0-24 and the concentration 24 hours after the morning dose (r=0.96 and r=0.86 for the two formulations, respectively), and the slope of the line of best fit was similar for both formulations. Conclusions Doubling the initial starting dose of prolonged-release tacrolimus compared with immediate-release tacrolimus overcompensated for the lower exposure on Day 1. A 50% higher starting dose of prolonged-release tacrolimus than of immediate-release tacrolimus may be required for similar systemic exposure. However, doses of both formulations can be optimized using the same trough-level monitoring system. (ClinicalTrials.gov number: NCT00189826)
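Dose normalization as described above simply rescales observed exposure to the common 0.1 mg/kg/day reference dose, assuming dose-proportional kinetics; a sketch with hypothetical numbers:

```python
def dose_normalized_auc(auc, actual_dose, reference_dose=0.1):
    """Rescale an observed AUC0-24 to a reference dose (mg/kg/day),
    assuming dose-proportional pharmacokinetics."""
    return auc * reference_dose / actual_dose

# Hypothetical: prolonged-release tacrolimus given at 0.2 mg/kg/day,
# so its AUC is halved when normalized to 0.1 mg/kg/day:
print(round(dose_normalized_auc(300.0, 0.2), 1))  # 150.0
```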
  • Characteristics of compatible pair participants in kidney paired donation at a single center
    Compatible pairs of living kidney donors and their intended recipients can enter into kidney paired donation (KPD) and facilitate additional living donor kidney transplants (LDKTs). We examined 11 compatible pairs (the intended recipients and their intended, compatible donors) who participated in KPD, along with the recipients' 11 matched, exchange donors. The 11 pairs participated in 10 separate exchanges (three were multicenter exchanges) that included 33 total LDKTs (22 additional LDKTs). All the intended donors were blood group O and female, with a mean living kidney donor profile index (LKDPI) of 27.6 (SD 16.8). The matched donors had a mean LKDPI of 9.4 (SD 31.7). Compatible pairs entered KPD for altruistic reasons (N=2) or due to a mismatch of age (N=7) or body/kidney size (N=2) between the recipient and the intended donor. In four cases, retrospective calculation of the LKDPI revealed that the matched donor had a higher LKDPI than the intended donor. Of the 22 recipients of LDKTs enabled by the compatible pairs, three were highly sensitized, with PRA >80%. In conclusion, most compatible pairs entered into KPD so that the recipient could receive an LDKT from a donor whose age or body/kidney size was more favorable to post-transplant outcomes.
  • A clinical tool to calculate post-transplant survival using pre-transplant clinical characteristics in adults with cystic fibrosis
    Background We previously identified factors associated with a greater risk of death post-transplant. The purpose of this study was to develop a clinical tool to estimate the risk of death after transplant based on pre-transplant variables. Methods We utilized the Canadian CF registry to develop a nomogram that incorporates pre-transplant clinical measures to assess post-lung transplant survival. The 1-, 3-, and 5-year survival estimates were calculated using Cox proportional hazards models. Results Between 1988 and 2012, 539 adult Canadians with CF received a lung transplant with 208 deaths in the study period. Four pre-transplant factors most predictive of poor post-transplant survival were older age at transplantation, infection with B. cepacia complex, low FEV1 percent predicted, and pancreatic sufficiency. A nonlinear relationship was found between risk of death and FEV1 percent predicted, age at transplant, and BMI. We constructed a risk calculator based on our model to estimate the 1-, 3-, and 5-year probability of survival after transplant which is available online. Conclusions Our risk calculator quantifies the risk of death associated with lung transplant using pre-transplant factors. This tool could aid clinicians and patients in the decision-making process and provide information regarding the timing of lung transplantation.
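A nomogram or online risk calculator of this kind typically encodes a Cox model, where a patient's survival probability is the baseline survival raised to the exponential of the linear predictor. The coefficients and baseline survival values below are hypothetical placeholders for illustration, not the values fitted to the Canadian CF registry:

```python
from math import exp

# Hypothetical coefficients and baseline survival; the real model's
# values live in the authors' online calculator, not reproduced here.
BASELINE_SURVIVAL = {1: 0.90, 3: 0.80, 5: 0.70}  # S0(t) at 1, 3, 5 years
BETA = {"age": 0.03, "b_cepacia": 1.2, "fev1_pct": -0.02, "panc_sufficient": 0.5}

def post_transplant_survival(years, covariates):
    """Cox-model survival estimate: S(t | x) = S0(t) ** exp(beta . x)."""
    lp = sum(BETA[name] * value for name, value in covariates.items())
    return BASELINE_SURVIVAL[years] ** exp(lp)

# Hypothetical 30-year-old, B. cepacia-negative, FEV1 25% predicted,
# pancreatic insufficient:
patient = {"age": 30, "b_cepacia": 0, "fev1_pct": 25, "panc_sufficient": 0}
print(round(post_transplant_survival(5, patient), 3))
```

Note that real Cox models center covariates before forming the linear predictor; the sketch omits that step for brevity.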
  • Graft-derived macrophage migration inhibitory factor correlates with hepatocellular injury in patients undergoing liver transplantation
    Experimental studies suggest that macrophage migration inhibitory factor (MIF) mediates ischemia/reperfusion injury during liver transplantation. This study assessed whether human liver grafts release MIF during preservation, and whether the release of MIF is proportional to the extent of hepatocellular injury. Additionally, the association between MIF and early allograft dysfunction (EAD) after liver transplantation was evaluated. Concentrations of MIF, aspartate aminotransferase (AST), alanine aminotransferase (ALT), lactate dehydrogenase (LDH), and creatine kinase (CK) were measured in effluents of 38 liver grafts, and in serum of recipients. Concentrations of MIF in the effluent were greater than those in the recipients’ serum before and after reperfusion (58 [interquartile range, IQR:23-79] μg/mL vs 0.06 [IQR:0.03-0.07] μg/mL and 1.3 [IQR:0.7-1.8] μg/mL, respectively; both P<.001). Effluent MIF concentrations correlated with effluent concentrations of the cell injury markers ALT (R=.51, P<.01), AST (R=.51, P<.01), CK (R=.45, P=.01), and LDH (R=.56, P<.01). Patients who developed EAD had greater MIF concentrations in effluent and serum 10 minutes after reperfusion than patients without EAD (Effluent: 80 [IQR:63-118] μg/mL vs 36 [IQR:20-70] μg/mL, P=.02; Serum: 1.7 [IQR:1.2-2.5] μg/mL vs 1.1 [IQR:0.6-1.7] μg/mL, P<.001). Conclusion Human liver grafts release MIF in proportion to hepatocellular injury. Greater MIF concentrations in effluent and recipient's serum are associated with EAD after liver transplantation.
  • Association of myocardial injury with increased mortality after liver transplantation
  • Early conversion to belatacept after renal transplantation
    Belatacept is a non-nephrotoxic immunosuppressive agent, which may make it the ideal agent for patients with delayed or slow graft function on calcineurin inhibitors. There are limited data on conversion of patients to belatacept within 6 months of transplantation. Between January 2012 and December 2015, 16 patients were converted to belatacept for delayed or poor graft function (eGFR <30 mL/min/1.73 m², MDRD); three were HIV positive. Conversion protocols were analyzed in patients ≤4 months and 4-6 months post-transplantation. Mean serum creatinine levels after belatacept conversion were compared with preconversion levels. Patient survival was 100%, and graft survival was 88%. The mean creatinine fell from 3.9±1.82 mg/dL before belatacept conversion to 2.1±1.1 mg/dL at 6 months and 1.9±0.47 mg/dL (median 1.8 mg/dL) at 12 months postconversion. There was no significantly increased risk of rejection, infection, or malignancy, and HIV parameters remained largely stable. In this single-center nonrandomized retrospective analysis, early conversion to belatacept in patients with DGF or slow graft function was safe and efficacious.
  • Biliary reconstruction in liver transplant patients with primary sclerosing cholangitis, duct-to-duct or Roux-en-Y?
    Introduction Roux-en-Y choledochojejunostomy and duct-to-duct (D-D) anastomosis are biliary reconstruction methods for liver transplantation, but there is controversy over which method produces better results. We compared the outcome of D-D anastomosis vs. Roux-en-Y hepaticojejunostomy in patients with primary sclerosing cholangitis who had undergone liver transplant at the Shiraz Organ Transplant Center. Materials The medical records of 405 patients with primary sclerosing cholangitis (PSC) who had undergone liver transplant from 1996 to 2015 were reviewed. Patients were divided into two groups: a Roux-en-Y group and a D-D group. Morbidity, disease recurrence, and graft and patient survival rates were compared between the two groups. Results A total of 143 patients underwent D-D biliary reconstruction, and 260 patients had a Roux-en-Y loop. Biliary complications involved 4.2% of patients in the D-D group and 3.9% in the Roux-en-Y group (P=.863). Actuarial 1-, 3-, and 5-year patient survival for the D-D and Roux-en-Y groups was 92%, 85%, and 74%; and 87%, 83%, and 79%, respectively (P=.384). The corresponding 1-, 3-, and 5-year probability of freedom from biliary complication was 97%, 95%, and 92%; and 98%, 97%, and 94%, respectively (P=.61). Conclusion Duct-to-duct biliary reconstruction in liver transplantation for selected patients with PSC is a good alternative to Roux-en-Y biliary reconstruction.
  • Factors contributing to employment patterns after liver transplantation
    Background Many liver transplant recipients return to work, but their patterns of employment are unclear. We examine patterns of employment 5 years after liver transplantation. Methods First-time liver transplant recipients ages 18-60 years transplanted from 2002 to 2009 and surviving at least 5 years were identified in the United Network for Organ Sharing registry. Recipients' post-transplant employment status was classified as follows: (i) never employed; (ii) returned to work within 2 years and remained employed (continuous employment); (iii) returned to work within 2 years, but was subsequently unemployed (intermittent employment); or (iv) returned to work ≥3 years post-transplant (delayed employment). Results Of 28 306 liver recipients identified during the study period, 12 998 survived at least 5 years and contributed at least 1 follow-up of employment status. A minority of patients (4654; 36%) were never employed, while 3780 (29%) were continuously employed, 3027 (23%) were intermittently employed, and 1537 (12%) had delayed employment. In multivariable logistic regression analysis, predictors of intermittent and delayed employment included lower socioeconomic status, higher local unemployment rates, and post-transplant comorbidities or complications. Conclusion Never, intermittent, and delayed employment are common after liver transplantation. Socioeconomic and labor market characteristics may add to clinical factors that limit liver transplant recipients’ continuous employment.
  • Simultaneous pancreas and kidney transplantation: Incidence and risk factors for amputation after 10-year follow-up
    Introduction The incidence of amputation after simultaneous pancreas and kidney (SPK) transplantation ranges from 9.5% to 23% after 5 years of follow-up. The objective of this study was to investigate the incidence and risk factors for amputation in SPK transplant patients compared to kidney transplantation alone (KTA) after a minimum follow-up of 10 years. Methods An analysis was performed on a prospectively maintained database of 81 SPK transplants and 43 KTAs consecutively performed in one center for insulin-dependent diabetes mellitus between December 1992 and January 2006. Primary outcome variables were the incidence of amputation per patient, the total number of amputations, and the type of amputation performed. Data are presented as mean ± standard deviation. Results Seven patients (9%) in the SPK cohort and one patient (2%) in the KTA cohort underwent amputation (P<.001). One amputee had pancreas allograft failure prior to amputation. Fifteen amputations were performed in total, and four patients required ≥2 amputations. The latency period between transplantation and amputation was 133.57±49.43 months in the SPK cohort and 168 months in the KTA group. Conclusions The incidence of amputation after SPK transplantation is approximately 9% after 10-year follow-up. Patients are at a significantly greater risk of amputation after SPK transplantation compared to KTA for type 1 diabetes despite insulin independence.
  • The appropriate dose of thymoglobulin induction therapy in kidney transplantation
    Background Thymoglobulin is used effectively as an induction agent in kidney transplantation, but there is no consensus on the optimal dose. To delineate the safest effective dose, an open-label randomized clinical trial was designed. Methods In this study, 90 adult kidney transplant recipients (KTRs) were randomized before transplantation into three groups to receive thymoglobulin: Arm A (4.5 mg/kg over 3 days), Arm B (4.5 mg/kg single bolus dose), and Arm C (6 mg/kg over 3 days). Renal function, infections, and rate of readmissions were evaluated during the first post-transplantation year. Results Ninety adult kidney recipients were enrolled (51% deceased donor). No statistically significant difference was found in acute rejection episodes or type of rejection between the groups, although patients in Arm A showed more severe histopathologic changes in renal biopsies according to Banff 2013 criteria (P=.03). At the first month after transplantation, serum Cr was lower (P=.001) and GFR higher (P=.04) in Arm A, but there was no significant difference among the three groups at 3, 6, and 12 months post-transplant. Conclusion Although all regimens showed the same efficacy regarding the rate of rejection episodes, the 3-day 4.5 mg/kg thymoglobulin regimen had significantly fewer complications.
  • Airway inflammation and symptoms in children following liver and heart transplantation
    Objectives To describe the upper airway endoscopic findings of children with upper airway symptoms after liver transplantation (LT) or heart transplantation (HT). Methods Review of children undergoing airway endoscopy after LT or HT from 2011 to 2015 at a tertiary care pediatric hospital. Airway findings, biopsy results, immunosuppression, and Epstein-Barr virus (EBV) levels were recorded. Results Twenty-three of 158 transplant recipients (111 LT, 47 HT) underwent endoscopy. Median time from transplant to endoscopy was 9 months (range 4-25) for LT and 31 months (range 1-108) for HT. Thirteen of 23 patients presented with upper airway symptoms, and 10/23 presented with respiratory failure or for surveillance. The 13 patients with upper airway symptoms had abnormal findings (7 LT; 6 HT), most commonly arytenoid edema (13 patients). There were five EBV-positive biopsies (four with post-transplant lymphoproliferative disorder) and six EBV-negative biopsies with lymphocytic inflammation. One biopsy demonstrated fungal infection. Immunosuppression was decreased in seven patients, and three received steroids. There were no episodes of allograft rejection. No patients had airway symptoms at last follow-up. Conclusions In pediatric solid organ transplant recipients, symptoms of airway obstruction are not uncommon and should be evaluated with endoscopy. Endoscopy in the absence of symptoms is low-yield. Treatment with decreased immunosuppression improved airway symptoms.
  • Long-term survival following kidney transplantation in previous lung transplant recipients: An analysis of the UNOS registry
    Background Kidney transplantation has been advocated as a therapeutic option in lung recipients who develop end-stage renal disease (ESRD). This analysis outlines patterns of allograft survival following kidney transplantation in previous lung recipients (KAL). Methods Data from the UNOS lung and kidney transplantation registries (1987–2013) were cross-linked to identify lung recipients who were subsequently listed for and/or underwent kidney transplantation. Time-dependent Cox models compared the survival rates in KAL patients with those waitlisted for renal transplantation who never received kidneys. Survival analyses compared outcomes between KAL patients and risk-matched recipients of primary, kidney-only transplantation with no history of lung transplantation (KTx). Results A total of 270 lung recipients subsequently underwent kidney transplantation (KAL). Regression models demonstrated a lower risk of post-listing mortality for KAL patients compared with 346 lung recipients on the kidney waitlist who never received kidneys (P<.05). Comparisons between matched KAL and KTx patients demonstrated significantly increased risk of death and graft loss (P<.05), but not death-censored graft loss, for KAL patients (P = .86). Conclusions KAL patients enjoy a significant survival benefit compared with waitlisted lung recipients who do not receive kidneys. However, KAL patients do poorly compared with KTx patients. Decisions about KAL transplantation must be made on a case-by-case basis considering patient and donor factors.
  • Pilot cohort study on the potential role of TCF7L2 rs7903146 on ischemic heart disease among non-diabetic kidney transplant recipients
    Background TCF7L2 rs7903146 C>T polymorphism is associated with diabetes in the general population but its independent impact on cardiovascular disease is debated. On this basis, we investigated its association with major adverse cardiac events (MACE) in a single-center cohort of non-diabetic kidney transplant recipients (KTRs). Methods Patients with pretransplant diabetes were excluded and patients who developed post-transplant diabetes were censored at time of diagnosis. Results rs7903146 C>T polymorphism appeared to modulate the risk of MACE: 5-year prevalence was 0.8% in CC patients, 7.2% in CT patients and 9.7% in TT patients (P<.001). TCF7L2 rs7903146 was an independent predictor of MACE in a multivariate Cox regression model (for each T allele, HR: 2.99, 95%CI: 1.62-5.52, P<.001), together with history of cardiac ischemic events (HR: 8.69, 95%CI: 3.57-21.16, P<.001), DGF (HR: 2.42, 95%CI: 0.98-5.95, P=.056) and HLA-mismatches (for each mismatch: HR: 1.55, 95%CI: 1.00-2.43, P=.053). Introduction of rs7903146 C>T polymorphism into a model based on these clinical variables significantly increased predictive power for MACE (P=.003). Conclusions TCF7L2 rs7903146 T allele may be strongly and independently associated with MACE in non-diabetic KTRs. These findings suggest the possibility of employing this SNP to more accurately stratify cardiological risk in KTRs.
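A per-allele (additive, log-linear) hazard ratio multiplies on the hazard scale, so the reported HR of 2.99 per T allele implies roughly a nine-fold hazard for TT versus CC carriers:

```python
# Each T allele multiplies the hazard by the reported per-allele HR.
HR_PER_T_ALLELE = 2.99

for genotype, n_t_alleles in [("CC", 0), ("CT", 1), ("TT", 2)]:
    hr_vs_cc = HR_PER_T_ALLELE ** n_t_alleles
    print(genotype, round(hr_vs_cc, 2))  # CC 1.0, CT 2.99, TT 8.94
```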
  • Relationship between pre-transplant physical function and outcomes after kidney transplant
    Background Performance-based measures of physical function predict morbidity following non-transplant surgery. The study objectives were to determine whether physical function predicts outcomes after kidney transplant and to assess how physical function changes post-transplant. Methods We conducted a prospective study of living donor kidney transplant recipients at our center from May 2012 to February 2014. Physical function was measured using the Short Physical Performance Battery (SPPB [balance, chair stands, gait speed]) and grip strength testing. Initial length of stay (LOS), 30-day rehospitalizations, allograft function, and quality of life (QOL) were assessed. Results The majority of the 140 patients in our cohort had excellent pre-transplant physical function. In general, balance scores were more predictive of post-transplant outcomes than the overall SPPB. Decreased pre-transplant balance was independently associated with longer LOS and increased rehospitalizations, but not with post-transplant QOL; 35% of patients experienced a clinically meaningful (≥1.0 m/s) improvement in gait speed 4 months post-transplant. Conclusions Decreased physical function may be associated with longer LOS and rehospitalizations following kidney transplant. Further studies are needed to confirm this association. The lack of relationship between pre-transplant gait speed and outcomes in our cohort may represent a ceiling effect. More comprehensive measures, including balance testing, may be required for risk stratification.
  • Epidemiology, risk factors, and outcome of Clostridium difficile infection in heart and heart-lung transplant recipients
    Background Clostridium difficile is a major cause of diarrhea in thoracic organ transplant recipients. We investigated the epidemiology, risk factors, and outcome of Clostridium difficile infection (CDI) in heart and heart-lung transplant (HT) recipients. Methods This is a retrospective study from 2004 to 2013. CDI was defined by diarrhea and a positive stool test for toxigenic C. difficile by toxin enzyme immunoassay (2004-2006) or polymerase chain reaction (2007-2013). Cox proportional hazards regression was used to model the association of risk factors with time to CDI and survival with CDI following transplantation. Results There were 254 HT recipients, with a median age of 53 years (IQR, 45-60); 34% were female. During the median follow-up of 3.1 years (IQR, 1.3-6.1), 22 (8.7%) patients developed CDI. In multivariable analysis, risk factors for CDI were combined heart-lung transplant (HR 4.70; 95% CI, 1.30-17.01 [P=.02]) and retransplantation (HR 7.19; 95% CI, 1.61-32.12 [P=.01]). Acute cellular rejection was associated with a lower risk of CDI (HR 0.34; 95% CI, 0.11-0.94 [P=.04]). CDI was found to be an independent risk factor for mortality (HR 7.66; 95% CI, 3.41-17.21 [P<.0001]). Conclusions Clostridium difficile infection after HT is more common among patients with combined heart-lung transplantation and those undergoing retransplantation. CDI was associated with a higher risk of mortality in HT recipients.
  • Making inroads to the cure: Barriers to clinical trial enrollment in hematopoietic cell transplantation
    A significant barrier to advancing the standard of care for patients with hematologic malignancies undergoing stem cell transplantation is access and willingness to participate in clinical trials. The importance of clinical trial enrollment is magnified in an era of targeted therapies, accelerated drug discovery, and investment by the pharmaceutical industry. As disease targets are identified, novel therapies are being evaluated in efforts to reduce treatment-related toxicity and improve progression-free and overall survival. The enrollment of hematopoietic cell transplantation (HCT) patients on clinical studies is essential to promote the development of such therapies. Increasing clinical trial participation requires understanding of potential barriers to enrollment, including patient concerns, institutional and provider hurdles, and disease-specific characteristics.
  • Immunosuppression with mTOR inhibitors prevents the development of donor-specific antibodies after liver transplant
    Background Donor-specific antibodies (DSAs) are an important cause of complications after solid organ transplant. Risk factors and, thus, strategies for preventing DSA development are not well defined. Methods The DSA status of 400 patients who underwent liver transplant (LT) at the outpatient clinic of the University Hospital Essen was determined. Human leukocyte antigen (HLA) antibodies were detected by single-antigen bead technology. The strength of DSAs was reported as mean fluorescence intensity. Results Detectable DSAs were found in 74 (18.5%) patients, significantly more often in patients who underwent LT for autoimmune liver disease than for all other indications (29.3%; P=.022) and significantly less often in patients who underwent LT for hepatocellular carcinoma (7.6%, P=.005). The incidence of DSAs increased with time after LT, and the risk was generally higher for female patients. The frequency of DSA detection was significantly lower (10.6%) for patients receiving immunosuppressive treatment with mammalian target of rapamycin (mTOR) inhibitors than for those receiving other regimens (20.5%; P=.025). Conclusion Autoimmune liver disease, female sex, and more than 8 years since LT predispose patients to the development of DSAs. Immunosuppression with the mTOR inhibitor everolimus protects against DSA development after liver transplant.
  • Low serum testosterone is associated with impaired graft function early after heart transplantation
    Background We sought to investigate the correlation between serum testosterone levels and graft function early after heart transplantation. Methods In a cross-sectional study, we measured serum testosterone levels 4 weeks after heart transplantation in 49 consecutive male recipients. Echocardiography was carried out to evaluate graft function. Low serum testosterone was defined as <11 nmol/L. Results Low serum testosterone was present in 21 (43%) recipients (Group A), and 28 (57%) had normal testosterone levels (Group B). The two groups did not differ in age or in the presence of renal dysfunction, arterial hypertension, diabetes, or hyperlipidemia. Donor age and allograft ischemic time were not different between the two groups. Both groups had comparable tacrolimus trough levels and doses of mycophenolate mofetil and methylprednisolone. Patients in Group A had significantly lower LVEF (58±5% vs 65±6% in Group B, P=.001) and TAPSE (1.3±0.3 cm vs 1.6±0.3 cm in Group B, P=.01). In comparison with Group B, more patients in Group A were found to have low-grade (1R) rejection (25% vs 3%; P=.02). Conclusion Low serum testosterone levels appear to be associated with impaired graft function and an increased incidence of low-grade rejection episodes early after heart transplantation.
  • Brain natriuretic peptide and right heart dysfunction after heart transplantation
    Heart transplantation (HT) should normalize cardiac endocrine function, but brain natriuretic peptide (BNP) levels remain elevated after HT, even in the absence of left ventricular hemodynamic disturbance or allograft rejection. Right ventricle (RV) abnormalities are common in HT recipients (HTx) as a result of the engraftment process, tricuspid insufficiency, and/or repeated inflammation due to iterative endomyocardial biopsies. RV function follow-up is vital for patient management, as RV dysfunction is a recognized cause of in-hospital death and is responsible for a worse prognosis. Interestingly, few and conflicting data are available concerning the relationship between plasma BNP levels and RV functional impairment in HTx. This suggests that infra-clinical modifications, such as subtle immune system disorders or hypoxic conditions, might influence BNP expression. Moreover, owing to other altered circulating molecular forms of BNP, a lack of specificity of BNP assays has been described in heart failure patients. This phenomenon could exist in the HT population and could explain elevated plasma BNP levels despite normal RV function. In clinical practice, intra-individual change in BNP over time, rather than absolute BNP values, might be more helpful in detecting right cardiac dysfunction in HTx.
  • Incidence of acute cellular rejection following granulocyte colony-stimulating factor administration in lung transplantation: A retrospective case-cohort analysis
    Granulocyte colony-stimulating factor (GCSF) is an option to treat leukopenia in lung transplant recipients. Conflicting evidence exists regarding its effects on acute cellular rejection (ACR). A retrospective, case-cohort study was conducted to assess whether the use of GCSF in lung transplant recipients is associated with an increased incidence of ACR. Patients had to have received at least one dose of GCSF but were excluded if they received GCSF within 30 days prior to transplant or received a lymphocyte-depleting agent within 14 days of GCSF administration. Thirty-five patients who received GCSF within 3 months of transplant met inclusion criteria, and 105 patients were identified as controls based on a 1:3 allocation scheme. The incidence of ACR was 57.1% in the GCSF group versus 50.5% in the control group (relative risk (RR)=1.13; 95% CI, 0.80 to 1.59; P=.48). At 3 months post-transplant, 74.3% of the GCSF group had a dose reduction or discontinuation of their antiproliferative agent versus 17.1% of the control group (RR=4.33; 95% CI, 2.73 to 6.89; P<.0001). Rejection severity and incidence of infections were similar between groups. These findings show that GCSF administration within 3 months following lung transplantation was not associated with a higher incidence or severity of ACR.
  • Histologic surveillance after liver transplantation due to autoimmune hepatitis
    Background Autoimmune hepatitis (AIH) often recurs after liver transplantation (LT). Our aim was to evaluate the recurrence rate of AIH after LT, the impact of AIH recurrence on survival and fibrosis progression, and risk factors for AIH recurrence. Methods Forty-two patients with AIH prior to LT and ≥1 protocol biopsy ≥1 year post-LT were included, with a median follow-up of 5.0 years (1.0-17.0). Follow-up liver biopsies were re-evaluated for AIH recurrence, fibrosis progression, and cirrhosis development. Results Histological recurrence of AIH was diagnosed in 15 (36%) patients at a median of 5 years of follow-up. Recurrent AIH led to progressive fibrosis (METAVIR stage 3-4) in two patients but did not cause any patient death or graft loss. Transaminases were normal in three patients with recurrent AIH (20%). AIH recurrence was more common in patients without overlapping cholangitis (OR 1.44, P=.021). Immunosuppression without an antimetabolite increased the risk of AIH recurrence (OR 1.47, P=.018). Patient and graft survival rates at 1, 5, and 10 years were 94%, 86%, and 86%, and 91%, 77%, and 74%, respectively. AIH recurrence did not affect survival. Conclusions AIH recurrence occurs in 36% of patients at 5 years but does not affect patient or graft outcome.
  • Quantitative computed tomography assessment of bronchiolitis obliterans syndrome after lung transplantation
    Background Bronchiolitis obliterans syndrome (BOS) is a clinical manifestation of chronic allograft rejection following lung transplantation. We examined quantitative measurements of the proximal airways and vessels and their pathologic correlates in subjects with BOS. Methods Patients who received a lung transplant at Brigham and Women's Hospital between December 1, 2002 and December 31, 2010 were included in this study. We characterized quantitative CT measures of the proximal airways and vessels and the associated pathological changes. Results Ninety-four (46.1%) of the 204 subjects were included in the study. There was a significant increase in the airway:vessel ratio in subjects who developed progressive BOS compared to controls and non-progressors. There was a significant increase in airway lumen area and a decrease in vessel cross-sectional area in patients with BOS compared to controls. Patients with BOS had a significant increase in proximal airway fibrosis compared to controls. Conclusions BOS is characterized by central airway dilation and vascular remodeling, the degree of which correlates with decrements in lung function. Our data suggest that progressive BOS is a pathologic process that affects both the central and distal airways.
  • The epidemiology of Clostridium difficile infection in a national kidney transplant center
    Background We aimed to describe the epidemiology and outcomes of Clostridium difficile infection (CDI) in a national kidney transplant center from 2008 to 2015. Methods Adult kidney transplant (KTR) and kidney-pancreas transplant (KPTR) recipients were included for analysis if they met the surveillance CDI case definition. Rates of new healthcare-associated CDI (HA-CDI) were expressed per 10 000 KTR/KPTR bed days used (BDU) to facilitate comparisons. Results Fifty-two cases of CDI were identified in 42 KTRs and KPTRs. This corresponded to an average annual rate of 9.6 per 10 000 BDU, higher than that seen among general hospital inpatients locally, nationally, and internationally. Of the 45 cases (87%) considered HA-CDI, nine (20%) had symptom onset in the community. Recent proton-pump inhibitor (PPI) and broad-spectrum antimicrobial exposure preceded the majority of cases. KTRs and KPTRs with CDI had a longer mean length of hospital stay (35 days) than those admitted during the same period who did not have CDI (8 days). Conclusions Education regarding CDI must be extended to transplant recipients and their general practitioners. Other targets for future CDI rate reduction must include stringent antimicrobial stewardship (both in hospital and in the community) and judicious PPI prescribing.
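    The surveillance rate quoted above follows directly from its definition: cases divided by bed days used, scaled to 10 000 BDU. As an illustrative sketch only (the abstract does not report the center's actual BDU denominator; the figure below is a hypothetical example chosen to reproduce the reported 9.6):

    ```python
    def cdi_rate_per_10000_bdu(cases: int, bed_days_used: int) -> float:
        """Healthcare-associated CDI rate expressed per 10 000 bed days used (BDU)."""
        return cases / bed_days_used * 10_000

    # Hypothetical denominator, NOT the center's published figure: with 52 cases
    # over roughly 54 167 KTR/KPTR bed days, the rate works out to ~9.6/10 000 BDU.
    example_rate = cdi_rate_per_10000_bdu(cases=52, bed_days_used=54_167)
    print(round(example_rate, 1))  # → 9.6
    ```

    Expressing rates per bed days used (rather than per admission) is what makes the comparison with general hospital inpatients in the conclusion meaningful.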
  • Delirium after lung transplantation: Association with recipient characteristics, hospital resource utilization, and mortality
    Background Delirium is associated with increased morbidity and mortality. The factors associated with post-lung transplant delirium and its impact on outcomes remain poorly characterized. Methods The medical records of 163 consecutive adult lung transplant recipients were reviewed for delirium within 5 days (early-onset) and within 30 hospital days (ever-onset) post-transplantation. A multivariable logistic regression model assessed factors associated with delirium. Multivariable negative binomial regression and Cox proportional hazards models assessed the association of delirium with ventilator duration, intensive care unit (ICU) length of stay (LOS), hospital LOS, and one-year mortality. Results Thirty-six percent of patients developed early-onset delirium, and 44% developed ever-onset delirium. Obesity (OR 6.35, 95% CI 1.61-24.98) and bolused benzodiazepines within the first postoperative day (OR 2.28, 95% CI 1.07-4.89) were associated with early-onset delirium. Early-onset delirium was associated with longer adjusted mechanical ventilation duration (P=.001), ICU LOS (P<.001), and hospital LOS (P=.005). Ever-onset delirium was associated with longer ICU (P<.001) and hospital LOS (P<.001). After adjusting for clinical variables, delirium was not significantly associated with one-year mortality (early-onset HR 1.65, 95% CI 0.67-4.03; ever-onset HR 1.70, 95% CI 0.63-4.55). Conclusions Delirium is common after lung transplant surgery and is associated with increased use of hospital resources.
  • Cultural competency of a mobile, customized patient education tool for improving potential kidney transplant recipients’ knowledge and decision-making
    Patients considering renal transplantation face an increasingly complex array of choices as a result of the revised kidney transplant allocation system. Decision aids have been shown to improve patient decision-making through the provision of detailed, relevant, individualized clinical data. A mobile iOS-based application (app) including animated patient education and individualized risk-adjusted outcomes following kidney transplants with varying donor characteristics and DSA waiting times was piloted in two large US transplant programs with a diverse group of renal transplant candidates (N = 81). The majority (86%) of patients felt that the app improved their knowledge and was culturally appropriate for their race/ethnicity (67%-85%). Patients scored significantly higher on transplant knowledge testing (9.1/20 to 13.8/20, P < .001) after viewing the app, including patients with low health literacy (8.0 to 13.0, P < .001). Overall knowledge of and interest in living and deceased donor kidney transplantation increased. This pilot project confirmed the benefit and cultural acceptability of this educational tool, and further refinement will explore how to better communicate the risks and benefits of nonstandard donors.
  • Screening for asymptomatic bacteriuria at one month after adult kidney transplantation: Clinical factors and implications
    Objective Urinary tract infections (UTIs) account for significant morbidity after kidney transplantation (KT). Screening for asymptomatic bacteriuria (AB) has proven beneficial in certain populations, including pregnant women; however, it is not well studied in the KT population. We reviewed the incidence, clinical features, and implications of asymptomatic bacteriuria one month after KT. Methods A total of 171 adult KT patients (86 [50.3%] living donor transplants, 87 [50.9%] males, mean age 47.3 ± 13.7 years), transplanted between 2005 and 2012, were analyzed. Immunosuppression induction and maintenance were as per protocol. Protocol urine cultures were taken at 1 month post-transplantation. Patients were stratified by presence of AB and analyzed for demographics and clinical parameters. Outcomes of hospitalization for symptomatic UTIs, graft survival, and patient survival were ascertained. Results Forty-one (24%) KT recipients had AB at 30 days post-transplant. Multiresistant organisms accounted for 43.9% of these infections. Logistic regression confirmed female sex and deceased donor transplantation as independent predictors of 30-day bacteriuria, which in turn predicted subsequent hospitalization for symptomatic UTI. One-year patient and graft survival were similar in recipients with or without AB. Conclusion Asymptomatic bacteriuria 30 days post-transplant is more likely in female recipients and recipients of deceased donor kidneys, probably due to anatomical and functional differences, respectively. It carries increased morbidity from subsequent hospitalization for symptomatic UTI, and more research into UTI prevention, particularly non-antibiotic prophylaxis, is needed.
  • Sinus tachycardia is associated with impaired exercise tolerance following heart transplantation
    Background Sinus tachycardia often presents in heart transplantation (HTx) recipients, but data on its effect on exercise performance are limited. Methods Based on mean heart rate (HR) 3 months after HTx, 181 patients transplanted from 2006 to 2015 at the University of Nebraska Medical Center were divided into two groups: (i) HR<95 beats/min (bpm, n=93); and (ii) HR≥95 bpm (n=88). Cardiopulmonary exercise testing (CPET) was performed 1 year after HTx. Results Mean HR at 3 months post-HTx was 94±11 bpm and did not change significantly at 1 year post-HTx (96±11 bpm, P=.13). HR≥95 bpm at 3 months was associated with younger donor age (OR 1.1; CI 1.0-1.1, P=.02), female donors (OR 2.4; CI 1.16-5.24, P=.02), and absence of donor heavy alcohol use (OR 0.43; CI 0.17-0.61; P=.04). HR≥95 bpm at 3 months post-HTx was independently associated with decreased exercise capacity in metabolic equivalents (P=.008), reduced peak VO2 (P=.006), and percent of predicted peak VO2 (P=.002) during CPET. Conclusions HR≥95 bpm at 3 months following HTx is associated with reduced exercise tolerance in stable HTx recipients. Medical HR reduction after HTx could improve exercise performance and merits further investigation.
  • Assessment of cardiac allograft systolic function by global longitudinal strain: From donor to recipient
    Background Cardiac allografts are routinely evaluated by left ventricular ejection fraction (LVEF) before and after transplantation. However, myocardial deformation analysis with LV global longitudinal strain (GLS) is more sensitive than LVEF for detecting impaired LV myocardial systolic performance. Methods We analyzed echocardiograms in 34 heart donor-recipient pairs transplanted at Duke University from 2000 to 2013. Assessments of allograft LV systolic function by LVEF and/or LV GLS were performed on echocardiograms obtained pre-explantation in donors and serially in the corresponding recipients. Results Donors had a median LVEF of 55% (25th, 75th percentile, 54% to 60%). Median donor LV GLS was −14.6% (−13.7% to −17.3%); LV GLS was abnormal (ie, >−16%) in 68% of donors. Post-transplantation, LV GLS was further impaired at 6 weeks (median −11.8%; −11.0% to −13.4%) and 3 months (median −11.4%; −10.3% to −13.9%) before recovering to pretransplant levels in follow-up. Median LVEF remained ≥50% throughout follow-up. We found no association between donor LV GLS and post-transplant outcomes, including all-cause hospitalization and mortality. Conclusions GLS demonstrates allograft LV systolic dysfunction in donors and recipients that is not detected by LVEF. The clinical implications of subclinical allograft dysfunction detected by LV GLS require further study.
  • Idiopathic hyperammonemia after solid organ transplantation: Primarily a lung problem? A single-center experience and systematic review
    Background Idiopathic hyperammonemia syndrome (IHS) is an uncommon, often deadly complication of solid organ transplantation. IHS cases in solid organ transplantation seem to occur predominantly in lung transplant (LTx) recipients. However, to the best of our knowledge, the occurrence of IHS has not been systematically evaluated. We set out to identify all reported cases of IHS following nonliver solid organ transplantation. Methods Retrospective review of our institutional experience and systematic review of the literature. Results At our institution, six cases (of 844 nonliver solid organ transplants) of IHS were identified; five occurred following LTx (incidence 3.9% [lung] vs 0.1% [nonlung], P=.004). In the systematic review, 16 studies met inclusion criteria, reporting on 32 cases of IHS. The majority of IHS cases in the literature (81%) were LTx recipients. The average peak reported ammonia level was 1039 μmol/L, occurring on average 14.7 days post-transplant. Mortality in previously reported IHS cases was 69%. A single-center experience suggested that, in addition to standard treatment for hyperammonemia, early initiation of high-intensity hemodialysis to remove ammonia was associated with increased survival. In the systematic review, mortality was 40% (four of 10) with intermittent hemodialysis, 75% (nine of 12) with continuous veno-venous hemodialysis, and 100% in the six subjects who did not receive renal replacement therapy to remove ammonia. Three reports identified infection with urease-producing organisms as a possible etiology of IHS. Conclusion IHS is a rare but often fatal complication that primarily affects lung transplant recipients within the first 30 days.
  • Increased mid-abdominal circumference is a predictor for surgical wound complications in kidney transplant recipients: A prospective cohort study
    Kidney transplant recipients are at an increased risk of developing surgical site wound complications due to their immunosuppressed status. We aimed to determine whether increased mid-abdominal circumference (MAC) is predictive of wound complications in transplant recipients. A prospective study was performed on all kidney transplant recipients from October 2014 to October 2015. “Controls” consisted of kidney transplant recipients without a surgical site wound complication, and “cases” consisted of recipients who developed a wound complication. In total, 144 patients underwent kidney transplantation and 107 patients met inclusion criteria. Postoperative wound complications were documented in 28 (26%) patients. Patients who developed a wound complication had a significantly greater MAC, body mass index (BMI), and body weight at renal transplantation (P<.001, P=.011, and P=.011, respectively). On simple and multiple logistic regression analyses, MAC was a significant predictor of developing a surgical wound complication (P=.02). Delayed graft function and a history of preformed anti-HLA antibodies were also predictive of surgical wound complications (P=.003 and P=.014, respectively). Increased MAC is a significant predictor of surgical wound complications in kidney transplant recipients. Integrating clinical methods for measuring visceral adiposity may be useful for stratifying kidney transplant recipients at increased risk of a surgical wound complication.
  • Cardiac transplantation in a neonate—First case in Switzerland and European overview
    Twenty-four percent of pediatric heart transplantations (pHTx) are carried out in infants. Neonatal heart transplantation is both rarely performed and challenging. We report on a newborn girl suffering from cardiac failure due to a huge tumor (24×52 mm) within the free wall of the left ventricle (LV) and subtotal obstruction of the left main bronchus. Following surgical tumor resection, a Berlin Heart EXCOR left ventricular assist device was implanted as a bridge to transplantation. In spite of a donor/recipient mismatch of >200%, both the heart transplantation and the postoperative course were successful. In addition to this case report, the authors present data from a survey of infant and neonatal transplantations performed in Western Europe. As neonatal heart transplantation is a rare event in Europe, the authors think it is of crucial importance to share this limited experience. We discuss an alternative strategy, namely palliative surgical correction using the Fontan pathway. The challenges of donor/recipient weight mismatch, the possibilities for overcoming the infant donor organ shortage, and the postoperative immunosuppressive regimen are discussed as well.
  • (D+10) MELD as a novel predictor of patient and graft survival after adult to adult living donor liver transplantation
    We modified the previously described D-MELD score (donor age × recipient MELD) used in deceased donor liver transplantation to (D+10)MELD, to account for living donors being about 10 years younger than deceased donors, and tested it in living donor liver transplantation (LDLT) recipients. Five hundred consecutive LDLTs, performed between July 2010 and December 2012, were retrospectively analyzed for the effect of (D+10)MELD on patient and graft survival. Donor age alone did not influence survival. Recipients were divided into six classes based on the (D+10)MELD score: Class 1 (0-399), Class 2 (400-799), Class 3 (800-1199), Class 4 (1200-1599), Class 5 (1600-1999), and Class 6 (≥2000). One-year patient survival (97.1%, 88.8%, 87.6%, 76.9%, and 75% across Classes 1-5, P=.03) and graft survival (97.1%, 87.9%, 82.3%, 76.9%, and 75%; P=.04) differed significantly among the classes. The study population was divided into two groups at a (D+10)MELD cutoff of 860. Group 1 had significantly better 1-year patient (90.4% vs 83.4%; P=.02) and graft survival (88.6% vs 80.2%; P=.01). While donor age alone does not predict recipient outcome, the (D+10)MELD score is a strong predictor of recipient and graft survival and may help in better recipient/donor selection and matching in LDLT.
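    The score described above is a simple product: D-MELD, as originally described, multiplies donor age by the recipient's laboratory MELD, and (D+10)MELD adds 10 years to the living donor's age before multiplying. A minimal sketch of the scoring and the six-class assignment (the function names and the class-mapping helper are ours, not the study's code; the class boundaries are taken from the abstract):

    ```python
    def d_plus_10_meld(donor_age: int, recipient_meld: int) -> int:
        """(D+10)MELD: (living donor age + 10) multiplied by recipient lab MELD."""
        return (donor_age + 10) * recipient_meld

    def score_class(score: int) -> int:
        """Map a (D+10)MELD score to the study's six classes, each 400 points wide
        (Class 1: 0-399, ..., Class 5: 1600-1999, Class 6: >=2000)."""
        return min(score // 400 + 1, 6)

    # Hypothetical pair: a 33-year-old living donor and a recipient with MELD 20
    # lands exactly on the study's high/low-risk cutoff of 860.
    score = d_plus_10_meld(donor_age=33, recipient_meld=20)
    print(score, score_class(score))  # → 860 3
    ```

    Because the classes are uniform 400-point bins, integer division recovers the class directly; only Class 6 needs the open-ended cap.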
  • Low vitamin D exposure is associated with higher risk of infection in renal transplant recipients
    Background Vitamin D is a steroid hormone with multiple vital roles within the immune system. Various studies have evaluated the influence of vitamin D on infections after renal transplantation, with contrasting results. This study aimed to assess the relationship between vitamin D status and the incidence of infection in renal transplant recipients. Methods This is a retrospective cohort study of adult renal transplant recipients at the University of Pittsburgh Medical Center between 2005 and 2012. Patients were grouped as vitamin D sufficient (≥30 ng/mL) or deficient (<30 ng/mL) based on total serum 25-hydroxyvitamin D concentrations. The association between vitamin D levels collected at any point post-transplantation and the incidence of infection within ±90 days of the vitamin D level was assessed using logistic and Poisson regression models. Results Vitamin D sufficiency at any point post-transplantation was significantly associated with 66% lower odds (OR: 0.34; 95% CI: 0.22-0.52; P<.001) and a 43% lower rate of infection (incidence rate ratio (IRR): 0.57; 95% CI: 0.46-0.71; P<.001) within ±90 days of the vitamin D level. Baseline vitamin D level was also associated with lower incidence of and risk for infection within the first year post-transplantation. Conclusion Adequate levels of vitamin D in kidney transplant recipients are associated with lower infection risk in the first year and at any time post-transplantation.
  • Association of pretransplant kidney function with outcomes after lung transplantation
    Purpose There is a lack of data regarding the independent association of pretransplant kidney function with early and late outcomes among lung transplant (LT) recipients. Methods We queried the United Network for Organ Sharing database for adult patients (≥18 years of age) undergoing LT between 1987 and 2013. Glomerular filtration rate (GFR) was estimated using the Modification of Diet in Renal Disease (MDRD) and Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations. The study population was split into four groups (>90, 60-90, 45-59.9, and <45 mL/min/1.73 m2) based on the estimated GFR at the time of listing. Results Overall, there was a good correlation between the GFRs estimated by the two equations (n=17884, Pearson r=.816, P<.001). There was a consistent and independent association of worse early and late outcomes with declining GFR throughout the spectrum, including values above 60 mL/min/1.73 m2 (P<.001 for overall comparisons). Although GFR<45 mL/min/1.73 m2 was associated with worse early and late survival, patients with GFR 45-59.9 mL/min/1.73 m2 did not appear to have a survival advantage beyond 3 years post-transplant. Conclusion There is good correlation between GFR estimated using the MDRD and CKD-EPI equations among patients being considered for LT. Early and late outcomes after LT worsen in a linear fashion with progressively lower pretransplant GFR.
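    As a hedged illustration of the estimation the study relies on, the CKD-EPI 2009 creatinine equation can be sketched in a few lines. The coefficients below are the published 2009 values as we recall them, and the stratification helper mirrors the four listing-GFR groups above; the function names and the worked example are ours, not the study's code:

    ```python
    def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
        """Estimated GFR (mL/min/1.73 m2) by the CKD-EPI 2009 creatinine equation."""
        kappa = 0.7 if female else 0.9        # sex-specific creatinine knot
        alpha = -0.329 if female else -0.411  # sex-specific low-creatinine exponent
        ratio = scr_mg_dl / kappa
        gfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
        if female:
            gfr *= 1.018
        if black:
            gfr *= 1.159
        return gfr

    def listing_gfr_group(gfr: float) -> str:
        """Map an estimated GFR to the study's four listing strata (mL/min/1.73 m2)."""
        if gfr > 90:
            return ">90"
        if gfr >= 60:
            return "60-90"
        if gfr >= 45:
            return "45-59.9"
        return "<45"

    # Hypothetical candidate: a 55-year-old man with serum creatinine 1.0 mg/dL.
    gfr = ckd_epi_2009(scr_mg_dl=1.0, age=55, female=False)
    print(round(gfr, 1), listing_gfr_group(gfr))
    ```

    The piecewise min/max form is what makes CKD-EPI behave better than MDRD at near-normal creatinine, which matters here because the study's outcome gradient extends even above 60 mL/min/1.73 m2.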
  • Pharmacogenetics of steroid-responsive acute graft-versus-host disease
    Glucocorticoids are central to effective therapy of acute graft-versus-host disease (GVHD). However, only about half of patients respond to steroids as initial therapy. Based on postulated mechanisms of anti-inflammatory effectiveness, we explored genetic variations in the glucocorticoid receptor, co-chaperone proteins, membrane transporters, inflammatory mediators, and the T-cell receptor complex in hematopoietic cell transplant recipients with acute GVHD requiring treatment with steroids, and in their donors, in relation to response at day 28 after initiation of therapy. A total of 300 recipient and donor samples were analyzed. Twenty-three SNPs in 17 genes affecting glucocorticoid pathways were included in the analysis. In multiple regression analysis, donor SNP rs3192177 in the ZAP70 gene (OR 2.8, 95% CI: 1.3-6.0, P=.008) and donor SNP rs34471628 in the DUSP1 gene (OR 0.3, 95% CI: 0.1-1.0, P=.048) were significantly associated with complete or partial response. However, after adjustment for multiple testing, these SNPs did not remain statistically significant. Our results from this small, exploratory, hypothesis-generating analysis suggest that common genetic variation in glucocorticoid pathways may help identify subjects with differential response to glucocorticoids. This needs further assessment in larger datasets and, if validated, could help identify subjects for alternative treatments and inform the design of targeted treatments to overcome steroid resistance.
  • Adverse symptoms of immunosuppressants: A survey of Canadian transplant clinicians
    Adverse symptoms of immunosuppressants (ASI) impact quality of life (QOL) in solid organ transplant recipients; however, standardized approaches for active ASI surveillance and intervention are lacking. While management is highly clinician dependent, clinician views remain largely unexplored. We surveyed Canadian Society of Transplantation members on their perceptions of ASI including frequency, perceived QOL impact, causal attribution, management strategies, and success. Sixty-one clinicians participated in the survey of 12 ASI (tremor, diarrhea, nausea, constipation, dyspepsia, insomnia, edema, dyspnea, arthralgia, acne, mouth sores, paresthesias), for a 22% response rate. Forty-nine completed the survey (80% completion rate). Diarrhea, dyspepsia, and insomnia were most frequent, requiring management in ≥ 2% of patients by 96%, 90%, and 82% of respondents, respectively. Diarrhea, insomnia, and dyspnea were deemed to have an important QOL impact by 92%, 82%, and 69%. Immunosuppressants were universally implicated as causative of tremor, diarrhea, acne, and mouth sores. Over 80% reported success in managing mouth sores, dyspepsia, and constipation. Management strategies included adjustment of immunosuppressant or other medications, drug therapy, and nonpharmacologic approaches and varied according to perceived causal attribution. More study is needed to compare clinician and patient views. These results will be used to establish priorities for further investigation of ASI.
  • Severe acute cellular rejection after intestinal transplantation is associated with poor patient and graft survival
    Background Severe acute cellular rejection (ACR) occurs frequently after intestinal transplantation (ITx). Aim To evaluate the outcomes and the risk factors for graft failure and mortality in patients with severe ACR after ITx. Methods Retrospective study evaluating all ITx recipients who developed severe ACR between 01/2000 and 07/2014. Demographic and histologic data were reviewed. Results 20/126 (15.9%) ITx recipients developed severe ACR. Of these 20 episodes, 13 were in adults (median age: 47.1). The median (IQR) time from ITx to severe ACR was 206.5 (849) days. All patients received intravenous methylprednisolone and increased doses of tacrolimus. Sixteen (80%) patients did not respond to initial treatment and required thymoglobulin administration. Moreover, 11 (55%) patients required additional immunosuppressive medications. Six (30%) patients required graft enterectomy. Complications related to ACR treatment were the following: 10 (50%) patients developed bacterial infections, four (20%) patients developed cytomegalovirus infection and four (20%) patients developed post-transplant lymphoproliferative disease. At the end of follow-up, only 3/20 (15%) were alive with a functional allograft. The median patient survival time after diagnosis of severe ACR was 400 days (95% CI: 234.0-2613.0). Conclusions Severe ACR episodes are associated with high rates of graft loss and complications related to treatment.
  • Lactobacillus rhamnosus GG probiotic enteric regimen does not appreciably alter the gut microbiome or provide protection against GVHD after allogeneic hematopoietic stem cell transplantation
    Graft-versus-host disease (GVHD) is a major adverse effect associated with allogeneic stem cell transplant. Previous studies in mice indicated that administration of the probiotic Lactobacillus rhamnosus GG can reduce the incidence of GVHD after hematopoietic stem cell transplant. Here we report results from the first randomized probiotic enteric regimen trial in which allogeneic hematopoietic stem cell transplant patients were supplemented with Lactobacillus rhamnosus GG. Gut microbiome analysis confirmed a previously reported gut microbiome association with GVHD. However, the clinical trial was terminated when interim analysis did not detect an appreciable probiotic-related change in the gut microbiome or incidence of GVHD. Additional studies are necessary to determine whether probiotics can alter the incidence of GVHD after allogeneic stem cell transplant.
  • The effects of Share 35 on the cost of liver transplantation
    On June 18, 2013, the United Network for Organ Sharing (UNOS) instituted a change in the liver transplant allocation policy known as “Share 35.” The goal was to decrease waitlist mortality by increasing regional sharing of livers for patients with a model for end-stage liver disease (MELD) score of 35 or above. Several studies have shown Share 35 to be successful in reducing waitlist mortality, particularly in patients with high MELD scores. However, the MELD score at transplant has increased, resulting in sicker patients, more complications, and longer hospital stays. Our study aimed to explore factors, along with Share 35, that may affect the cost of liver transplantation. Our results show that Share 35 has come with significantly increased cost to transplant centers across the nation, particularly in regions 2, 5, 10, and 11. Region 5 was the only region with a median MELD above 35 at transplant, and its cost was significantly higher than that of other regions. Several other recipient factors changed with Share 35 in ways that may significantly affect the cost of liver transplant. While access to transplantation for the sickest patients has improved, it has come at a cost, and regional disparities remain. The financial implications of proposed allocation system changes must be considered.
  • Living donor kidney allograft survival ≥ 50 years
    The first successful kidney transplant occurred in 1954. Since then, long-term graft survival has been an elusive goal of transplantation. Yet 62 years later, we know of only 6 kidney transplant recipients who have achieved ≥50-year graft survival while being on no immunosuppression or a substantially reduced regimen. Herein, we report graft survival ≥50 years in 2 living donor recipients who have been maintained on standard-of-care immunosuppression the entire time. For our 2 recipients, their living donor's altruism altered the course, length, and quality of their lives, which by all accounts can be deemed normal: they attended college, held jobs, had successful pregnancies, raised families, and were productive members of society. Both donors are still alive and well, more than 50 years post-donation; both have an acceptable GFR and normal blood pressure, with hyperlipidemia as their only medical problem. These 2 intertwined stories illustrate the tremendous potential of a successful kidney transplant: long-term survival with a normal lifestyle and excellent quality of life, even after more than 5 decades on full-dose immunosuppression.
  • Early post-transplant conversion from tacrolimus to belatacept for prolonged delayed graft function improves renal function in kidney transplant recipients
    Prolonged delayed graft function (DGF) in kidney transplant recipients imparts a risk of poor allograft function, and tacrolimus may be detrimental in this setting. We conducted a retrospective single-center analysis of the first 20 patients converted from tacrolimus to belatacept for prolonged DGF as part of a clinical protocol. Prior to conversion, patients underwent an allograft biopsy to rule out rejection and confirm tubular injury. The primary outcome was the estimated glomerular filtration rate (eGFR) at 12 months post-transplant; the secondary outcome was the change in eGFR 30 days post-belatacept conversion. At 1 year post-transplant, the mean eGFR was 54.2 (SD 19.2) mL/min/1.73 m2. The mean eGFR on the day of belatacept conversion was 16 (SD 12.7) mL/min/1.73 m2 and rose to 43.1 (SD 15.8) mL/min/1.73 m2 30 days post-conversion (P<.0001). The acute rejection rate was 20%, with 100% patient survival at 12 months post-transplant. There was one graft loss, in the setting of an invasive Aspergillus infection that resulted in withdrawal of immunosuppression and transplant nephrectomy. Belatacept conversion for prolonged DGF is a novel treatment strategy that resulted in an improvement in eGFR. Additional follow-up is warranted to confirm the long-term benefits of this strategy.
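As a side note for readers, eGFR values like those above come from a creatinine-based estimating equation. Below is a minimal sketch assuming the CKD-EPI 2009 creatinine equation; the abstract does not say which equation the center used, and the race coefficient is omitted here for brevity:

```python
def ckd_epi_2009(scr_mg_dl, age, female):
    """CKD-EPI 2009 creatinine equation, in mL/min/1.73 m2.
    Illustrative only; the race coefficient is omitted for brevity."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    return egfr * 1.018 if female else egfr

# A 50-year-old man with serum creatinine 0.9 mg/dL: eGFR around 99
print(round(ckd_epi_2009(0.9, 50, female=False), 1))
```

Plugging in a high creatinine (e.g., 3.0 mg/dL) drops the estimate well below the 30 mL/min/1.73 m2 severe-dysfunction threshold, which is the range the converted patients started from.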
  • Graft quality matters: Survival after simultaneous liver-kidney transplant according to KDPI
    Background Poor renal function is associated with higher mortality after liver transplantation. Our aim was to understand the impact of kidney graft quality, according to the kidney donor profile index (KDPI) score, on survival after simultaneous liver-kidney (SLK) transplantation. Methods Using United Network for Organ Sharing data from 2002 to 2013 for adult deceased donor SLK recipients, we compared survival and renal graft outcomes according to KDPI. Results Of 4207 SLK transplants, 6% were from KDPI >85% donors. KDPI >85% recipients had significantly increased mortality (HR=1.83, 95%CI=1.44-2.31) after adjusting for recipient factors. Additionally, dialysis in the first week (HR=1.4, 95%CI=1.2-1.7) and death-censored kidney graft failure at 1 year (HR=5.7, 95%CI=4.6-7.0) were associated with increased mortality after adjusting for recipient factors and liver donor risk index score. Conclusions KDPI >85% recipients had worse patient and graft survival after SLK. Poor renal allograft outcomes, including dialysis in the first week and death-censored kidney graft failure at 1 year, which occurred more frequently with KDPI >85% grafts, were associated with significantly reduced patient survival. Questions remain about the survival impact of liver vs kidney graft quality, given the close relationship between the donor factors contributing to both, but KDPI can still be valuable as a metric readily available at the time of organ offers for SLK candidates.
  • Ledipasvir/sofosbuvir is effective and well tolerated in postkidney transplant patients with chronic hepatitis C virus
    Patients with end-stage renal disease on hemodialysis have a high prevalence of hepatitis C virus (HCV) infection. In most patients, treatment for HCV is delayed until after renal transplant. We assessed the effectiveness and tolerability of ledipasvir/sofosbuvir (LDV/SOF) in 32 postkidney transplant patients infected with HCV. The group was composed predominantly of treatment-naïve (75%) African American (68.75%) males (75%) infected with genotype 1a (62.5%). Most patients received a deceased donor kidney graft (78.1%). A 96% sustained viral response (SVR) was reported (27/28 patients). One patient relapsed. One patient with baseline graft dysfunction developed borderline rejection. No graft loss was reported. Six HIV-coinfected patients were included in our analysis; five of these patients achieved SVR12. There were four deaths, one of which was in the HIV group; none of the deaths were attributed to therapy. Coinfected patients tolerated therapy well, with no serious adverse events. Serum creatinine remained stable at baseline, end of therapy, and last follow-up (1.351±0.50 mg/dL, 1.406±0.63 mg/dL, and 1.290±0.39 mg/dL, respectively). In postkidney transplant patients with HCV infection, with or without HIV coinfection, the combination of LDV/SOF was well tolerated and effective.
  • The high incidence of severe chronic kidney disease after intestinal transplantation and its impact on patient and graft survival
    Introduction Using data from the Scientific Registry of Transplant Recipients (SRTR), the cumulative incidence of, risk factors for, and survival impact of severe chronic kidney disease (CKD) in intestinal transplantation (ITx) recipients were assessed. Methods First-time adult ITx recipients transplanted in the United States between January 1, 1990 and December 31, 2012 were included. Severe CKD after ITx was defined as: glomerular filtration rate (GFR) <30 mL/min/1.73 m2, chronic hemodialysis initiation, or kidney transplantation (KTx). Survival analyses and extended Cox models were conducted. Results The cumulative incidence of severe CKD 1, 5, and 10 years after ITx was 3.2%, 25.1%, and 54.1%, respectively. The following characteristics were significantly associated with severe CKD: female gender (HR 1.34), older age (HR 1.38 per 10-year increment), catheter-related sepsis (HR 1.58), steroid maintenance immunosuppression (HR 1.50), graft failure (HR 1.76), ACR (HR 1.64), prolonged requirement for IV fluids (HR 2.12) or TPN (HR 1.94), and diabetes (HR 1.54). Individuals with higher GFR at the time of ITx (HR 0.92 per 10 mL/min/1.73 m2 increment), and those receiving induction therapies (HR 0.47) or tacrolimus (HR 0.52), showed lower hazards of severe CKD. In adjusted analysis, severe CKD was associated with a significantly higher hazard of death (HR 6.20). Conclusions The incidence of severe CKD after ITx is extremely high, and its development drastically limits post-transplant survival.
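Cumulative incidence curves like those above come from time-to-event analysis. A minimal sketch of the complement of a Kaplan-Meier estimate, 1 − S(t), in plain Python; this is a simplification, since the registry analysis may have treated death as a competing risk:

```python
def km_cumulative_incidence(times, events):
    """1 - S(t) from a Kaplan-Meier product-limit estimate.
    `events`: 1 for an observed event, 0 for a censored observation.
    Simplified sketch (ties are processed one record at a time)."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, 1.0 - surv))
        at_risk -= 1  # both events and censorings leave the risk set
    return curve

# Three recipients: events at years 2 and 4, one censored at year 6
print(km_cumulative_incidence([2, 4, 6], [1, 1, 0]))
```

Censored follow-up (the subject at year 6) shrinks the risk set without moving the curve, which is why incidence keeps climbing with longer follow-up, as in the 3.2% → 25.1% → 54.1% figures above.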
  • Kidney allograft surveillance biopsy practices across US transplant centers: A UNOS survey
    Background The approach to the diagnosis and management of subclinical rejection (SCR) in kidney transplant recipients remains controversial. Methods We conducted a survey through UNOS across US transplant centers regarding their approach to surveillance biopsies and reasons for the nonperformance of surveillance biopsies. Results Responses were obtained from 106/238 centers (45%); only 18 (17%) of the centers performed surveillance biopsies on all patients, and 22 (21%) performed biopsies in select cases. The most common time points for surveillance biopsies were 3 and 12 months post-transplant. The most common reasons for not performing biopsies were low yield (n = 44, 65%) and the belief that they would not change outcome (n = 24, 36%). The incidence of subclinical T cell-mediated rejection (SC-TCMR) was ≥ 10% among 39% of centers. The mean serum creatinine was slightly worse, by 0.06 mg/dL at 1 year and 0.07 mg/dL at 3 years, among centers performing biopsies (P < .0001). The 1- and 3-year observed-expected (O-E) graft survival was similar among centers performing biopsies vs those not performing them (P = .07 and .88, respectively). Conclusion Only 17% of the US centers that responded to the survey perform surveillance biopsies on all patients, with another 21% performing surveillance biopsies in select cases. Greater uniformity in the approach to and management of this condition is of paramount importance.
  • Issue Information
  • In Memoriam—The Legacy of Thomas E. Starzl, MD, PhD 3/11/1926-3/4/2017
  • Pulmonary thromboembolism as a complication of lung transplantation
    Post-transplantation mortality after lung transplantation (LTX) is higher than for other solid organ transplantations. Thoracic surgery is associated with increased risk of thromboembolic complications, and as LTX recipients lack the collateral bronchial circulation, pulmonary thromboembolism (PTE) may represent a pertinent yet largely underdiagnosed cause of post-transplantation respiratory failure. In this systematic review, we sought to elucidate the occurrence and predilection site of PTE after LTX, and its potential impact on LTX-associated mortality. Based on twelve original articles identified by a systematic search strategy in PubMed, we found that PTE was reported in 4% of LTX recipients, and 38% of these events occurred within the first 30 days after the LTX procedure. In single-lung transplantation (SLTX) recipients, 12% were diagnosed with PTE, with 92% of these affecting the allograft. Of LTX patients diagnosed with PTE, 11% died within 1 year after LTX and 75% of these deaths occurred within the first 30 days. Our findings suggest that PTE is a potentially underdiagnosed cause of early post-LTX respiratory failure. This should be confirmed in larger studies with systematic follow-up diagnostic imaging.
  • Adverse outcomes associated with postoperative atrial arrhythmias after lung transplantation: A meta-analysis and systematic review of the literature
    Background Postoperative atrial arrhythmias (AAs) are common after lung transplantation, but studies are mixed regarding their impact on outcomes. We therefore performed this systematic review and meta-analysis to determine whether AAs after lung transplantation impede postoperative recovery. Methods MEDLINE, EMBASE, CINAHL, and the Cochrane Register were searched to identify studies comparing outcomes in adult patients undergoing lung transplantation who experienced AAs in the immediate postoperative period vs those without postoperative AAs. Our primary outcome was perioperative mortality, and secondary outcomes were length of stay (LOS), postoperative complications, and mid-term (1-6 years) mortality. Results Nine studies comprising 2653 patients were included in this analysis. Of this group, 791 (29.8%) had postoperative AAs. Patients with postoperative AAs had significantly higher perioperative mortality (OR 2.70 [95% CI: 1.73-4.19], P<.0001), longer hospital LOS (MD 8.29 [95% CI: 4.37-12.21] days, P<.0001), more frequent requirement for tracheostomy (OR 4.67 [95% CI: 2.59-8.44], P<.0001), and higher mid-term mortality (OR 1.71 [95% CI: 1.28-2.30], P=.0003). Conclusions AAs after lung transplantation are frequent and associated with significantly higher mortality, longer hospital LOS, and requirement for tracheostomy. Given their impact on recovery, prophylactic strategies against AAs need to be developed.
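Pooled odds ratios such as these are conventionally combined on the log scale with inverse-variance weights. A fixed-effect sketch follows; the review itself may have used a random-effects model, and study standard errors are recovered here from the reported 95% CIs:

```python
from math import exp, log, sqrt

def pool_odds_ratios(ors, ci_lows, ci_highs):
    """Fixed-effect inverse-variance pooling of odds ratios on the log scale.
    Each study's SE is recovered from its 95% CI as (ln hi - ln lo) / 3.92.
    Returns (pooled OR, lower 95% limit, upper 95% limit)."""
    log_ors = [log(o) for o in ors]
    ses = [(log(hi) - log(lo)) / 3.92 for lo, hi in zip(ci_lows, ci_highs)]
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
    se_pooled = sqrt(1.0 / sum(weights))
    return exp(pooled), exp(pooled - 1.96 * se_pooled), exp(pooled + 1.96 * se_pooled)

# Two hypothetical studies: OR 2.0 (1.0-4.0) and OR 3.0 (1.5-6.0)
print(pool_odds_ratios([2.0, 3.0], [1.0, 1.5], [4.0, 6.0]))
```

Pooling on the log scale keeps the OR's multiplicative symmetry, and the combined interval is narrower than either study's alone, which is the point of the exercise.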
  • Prediction of nonalcoholic fatty liver in prospective liver donors
    Background Metabolic risk factors, in addition to imaging, may be important for predicting steatosis in prospective liver donors. Materials and methods The study group included all prospective liver donors who had a liver biopsy during workup. Risk factors of metabolic syndrome were analyzed, and body mass index (BMI) ≥25 kg/m2 was used in place of waist circumference. Three BMI cutoffs (25, 28, and 30 kg/m2) and two CT-measured liver attenuation index (LAI) cutoffs (<5 and ≤10) were used to assess steatosis of ≥5%, ≥10%, and ≥20%. Results Of the 573 prospective donors (307 females), 282 (49.2%) had nonalcoholic fatty liver (NAFL). When donors with NAFL were compared with donors having normal histology, multivariate analysis showed BMI, ALT, triglycerides, and LAI to be significant predictors of NAFL. BMI ≥25 kg/m2 and LAI <10 performed best as cutoffs. The presence of ≥2 metabolic risk factors had better sensitivity than CT-LAI for the presence of NAFL and for ≥20% steatosis (58% and 54% vs 47% and 22%, respectively, for CT-LAI ≤10). The presence of LAI >10 and <2 metabolic risk factors predicted <10% steatosis with 96% specificity and a 92% positive predictive value. Conclusion The presence of ≥2 metabolic risk factors improves the sensitivity of CT-LAI for prediction of donor steatosis.
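The sensitivity, specificity, and positive-predictive-value figures quoted follow from the standard 2×2-table definitions. A minimal sketch with hypothetical counts, since the abstract does not report the raw table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value (PPV)
    from 2x2 counts: true/false positives and true/false negatives."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Hypothetical counts, not the study's data: 9 TP, 1 FP, 5 FN, 24 TN
sens, spec, ppv = diagnostic_metrics(9, 1, 5, 24)
print(spec, ppv)  # specificity 0.96, PPV 0.9
```

Note that PPV, unlike sensitivity and specificity, shifts with disease prevalence, which matters when translating these donor-workup numbers to a population with a different NAFL rate.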
  • Rescue alemtuzumab for refractory acute cellular rejection and bronchiolitis obliterans syndrome after lung transplantation
    Refractory acute cellular rejection (rACR) is associated with death and bronchiolitis obliterans syndrome (BOS) post-lung transplantation. We report the largest cohort of lung transplant recipients (LTRs) treated with rescue alemtuzumab for rACR or BOS. rACR outcomes included the burden of ACR 30 days before and 180 days after rescue, assessed by a novel composite rejection standardized score (CRSS, range 0-6), and freedom from ≥A2 ACR. BOS outcomes included freedom from BOS progression and from FEV1 decline >10%. Univariate parametric and nonparametric statistical approaches were used to assess treatment response. The Kaplan-Meier method with log-rank comparison was used to assess freedom from events. Fifty-seven alemtuzumab doses (ACR 40 and BOS 17) given to 51 patients were included. Median time to rescue was 722 (IQR 42-1403) days. CRSS declined significantly (3 vs 0.67, P<.001) after rescue. Freedom from ≥A2 ACR was 62.5% in rACR. Freedom from BOS progression was 52.9% at 180 days in the BOS cohort. Freedom from FEV1 decline >10% was 70% in BOS grade 1 and 14.3% in advanced BOS grades 2-3. Infections developed in 72.5% and 76.5% of the rACR and BOS groups, respectively. Rescue alemtuzumab appears useful for rACR. Patients with BOS 1 may have transient benefit, and patients with advanced BOS seem not to respond to alemtuzumab.
  • Ureteric complications in recipients of kidneys from donation after circulatory death donors
    A large increase in the use of kidneys from donation after circulatory death (DCD) donors prompted us to examine the impact of donor type on the incidence of ureteric complications (UCs; ureteric stenosis, urinary leak) after kidney transplantation. We studied 1072 consecutive kidney transplants (DCD n=494, live donor [LD] n=273, donation after brain death [DBD] n=305) performed during 2008-2014. Overall, there was a low incidence of UCs after kidney transplantation (3.5%). Despite a trend toward higher incidence of UCs in DCD (n=22, 4.5%) compared to LD (n=10, 3.7%) and DBD (n=5, 1.6%) kidney transplants, donor type was not a significant risk factor for UCs in multivariate analysis (DCD vs DBD HR: 2.33, 95% CI: 0.77-7.03, P=.13). There was no association between the incidence of UCs and donor, recipient, or transplant-related characteristics. Management involved surgical reconstruction in the majority of cases, with restenosis in 2.7% requiring re-operation. No grafts were lost secondary to UCs. Despite a significant increase in the number of kidney transplants from DCD donors, the incidence of UCs remains low. When ureteric complications do occur, they can be treated successfully with surgical reconstruction with no adverse effect on graft or patient survival.
  • Evaluating living donor kidney transplant rates: Are you reaching your potential?
    Background Traditionally, the living donor kidney transplant (LDKT) rate has been calculated as a percentage of total kidney transplant volume. We believe this calculation to be inherently flawed, because the number of deceased donor kidney transplants has no bearing on the number of LDKT performed. We propose an alternative calculation of the LDKT rate as a percentage of the number of new waitlist registrants. Methods We evaluated 192 adult transplant centers in the United States with respect to their LDKT rate according to both the traditional and proposed calculations, using data from the Scientific Registry of Transplant Recipients between July 2014 and June 2015. Results The median LDKT rate was 12.3 for every 100 new waitlist registrants, compared to 27.9 for every 100 total kidney transplants. Based on our proposed calculation of the LDKT rate, 16.7% of transplant centers were misevaluated when compared to the national mean using the traditional method. Conclusions A new calculation of the LDKT rate based on new waitlist registrants, and not total kidney transplants, is necessary to eliminate the bias associated with the traditional method, allowing for the identification of centers for improvement as well as each individual center's true potential based on their patient demographics.
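The two competing rate definitions are simple to compare side by side. A sketch with hypothetical center volumes:

```python
def ldkt_rates(ldkt, ddkt, new_registrants):
    """Traditional LDKT rate (per 100 total kidney transplants) vs the
    proposed rate (per 100 new waitlist registrants)."""
    traditional = 100 * ldkt / (ldkt + ddkt)
    proposed = 100 * ldkt / new_registrants
    return traditional, proposed

# Hypothetical center: 30 LDKT, 70 DDKT, 250 new registrants in a year
print(ldkt_rates(30, 70, 250))  # (30.0, 12.0)
```

The denominators explain the paper's point: a center that does few deceased donor transplants can look strong on the traditional rate while serving only a small fraction of its incoming waitlist.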
  • Morphologic patterns and treatment of transplant glomerulopathy: A retrospective analysis
    Transplant glomerulopathy is mainly due to chronic antibody-mediated rejection and currently represents a major cause of long-term allograft failure. The lack of effective treatment remains a serious problem in transplantation. A retrospective, single-center study was performed in 48 kidney allograft recipients with transplant glomerulopathy between January 2010 and December 2015. Median time to diagnosis was 7.1 (3.6-11.8) years post-transplant. Light microscopy showed severe transplant glomerulopathy in the majority of patients (cg1=10.4%; cg2=20.8%; cg3=68.8%). Moderate microvascular inflammation was present in 56.3% (g+ptc≥2), and about half of recipients (51.1%) were C4d-positive on immunofluorescence. Female gender (P=.001), age (P=.043), renal dysfunction (P=.002), acute rejection episodes (P=.026), and anti-HLA class II antibodies (P=.004) were associated with kidney allograft failure. Treatment of transplant glomerulopathy was performed in 67.6% of patients. The histologic and laboratory features that led to a therapeutic intervention were ptc score (P=.021), C4d (P=.03), and the presence of anti-HLA antibodies (P=.029), whereas ah score (P=.005) was associated with conservative management. The overall cumulative kidney allograft survival at 10 years was 75%. Treatment of transplant glomerulopathy did not improve long-term kidney allograft survival.
  • Corticosteroid wean after heart transplantation—Is there a risk for antibody formation?
    Background Corticosteroid withdrawal after heart transplantation is limited to select immune-privileged patients, but it is not known whether this predisposes patients to a higher risk of sensitization. Methods A total of 178 heart transplant recipients had panel-reactive antibody (PRA) measurements at transplant and every 6 months and were monitored for rejection with protocol endomyocardial biopsies. Corticosteroid withdrawal was initiated at 6 months post-transplant in select patients. Results Patients successfully weaned off prednisone (SPW; n=103) had lower PRA compared to those maintained on prednisone (MP; n=51) pretransplant (34% vs 63%) and at 6 months (18% vs 49%), 12 months (19% vs 51%), and 18 months (15% vs 47%) after transplant (P<.05). Among 68 patients in the SPW group who were nonsensitized at transplant, seven (10%) developed de novo PRA at 12 months, compared to four of 19 (21%) MP patients. Freedom from any treated rejection (97% vs 69% vs 67%), acute cellular rejection (100% vs 86% vs 71%), and antibody-mediated rejection (100% vs 88% vs 88%; all P≤.001) at 2 years was higher in SPW patients compared to MP patients and those who failed prednisone wean, respectively. Conclusion Few patients successfully weaned off prednisone after heart transplantation develop de novo circulating antibodies, and they are not at increased risk of developing rejection.
  • Short- and long-term outcomes with renin–angiotensin–aldosterone inhibitors in renal transplant recipients: A meta-analysis of randomized controlled trials
    Background Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor type 1 blockers (ARBs) are often prescribed for renal transplant recipients (RTRs), but the outcomes of these medications in RTRs remain controversial. Methods The PubMed, Embase, and Cochrane Library databases were systematically searched. Randomized controlled trials investigating the outcomes of ACEI/ARBs in RTRs were included for meta-analysis. Results Twenty-two trials with 2242 patients were identified. After treatment for at least 12 months, ACEI/ARBs were associated with a decline in glomerular filtration rate (GFR) (weighted mean difference [WMD] −5.76 mL/min; 95% confidence interval [CI]: −9.31 to −2.20) and a decrease in hemoglobin (WMD −9.81 g/L; 95% CI: −14.98 to −4.64). There were no significant differences in mortality between ACEI/ARB and non-ACEI/ARB groups (risk ratio [RR] 0.98, 95% CI: 0.58 to 1.76), nor in graft failure (RR 0.68, 95% CI: 0.38 to 1.32). After short-term treatment (less than 1 year), significant differences were found in changes in 24-hour proteinuria (WMD −0.57 g/d; 95% CI: −0.72 to −0.42) and serum potassium (WMD 0.25 mEq/L; 95% CI: 0.14 to 0.37) in ACEI/ARB groups compared to the control arm, while these differences were not confirmed in the long term. Conclusion This meta-analysis indicates that ACEI/ARBs may be prescribed to RTRs, provided that GFR and hemoglobin are carefully monitored.
  • Risk of tumor transmission after thoracic allograft transplantation from adult donors with central nervous system neoplasm—A UNOS database study
    Background We analyzed the UNOS database to better define the risk of transmission of central nervous system (CNS) tumors from donors to adult recipients of thoracic organs. Methods Data were procured from the Standard Transplant Analysis and Research dataset files. Donors with CNS tumors were identified, and recipients from these donors comprised the study group (Group I). The remaining recipients, whose organs came from donors without CNS tumors, formed the control group (Group II). The incidence of recipient CNS tumors, donor-related malignancies, and overall survival were calculated and compared, in addition to multivariable logistic regression. Results A cohort of 58 314 adult thoracic organ recipients was included, of which 337 received organs from donors who had documented CNS tumors (Group I). None of these recipients developed CNS tumors at a median follow-up of 72 months (IQR: 30-130 months). Although overall mortality was higher in Group I than in Group II (163/320 [51%] vs 22 123/52 691 [42%]), Kaplan-Meier curves indicate no significant difference in time to death between the two groups (P=.92). Conclusions There is little risk of transmission of the common nonaggressive CNS tumors to recipients of thoracic organs.
  • Long-term renal outcome after allogeneic hemopoietic stem cell transplant: A comprehensive analysis of risk factors in an Asian patient population
    Allogeneic hemopoietic stem cell transplantation (allo-HSCT) poses a significant challenge to renal function due to multiple drug- and complication-related renal toxicities. In this single-center series of 216 adult Asian patients with long and complete follow-up, 41 developed chronic kidney disease (CKD), giving a cumulative incidence of 19.0% at 25 years (median follow-up duration 7.84 years, range 2.0-27.7 years); only two of the 41 patients reached stage 4 CKD, and another two required dialysis. In contrast, acute kidney injury occurred in most patients, with glomerular filtration rate (GFR) falling by a mean of 50 mL/min/1.73 m2 at 6 months post-transplant compared with baseline. Suppression of renal function may last beyond 6 months but is potentially reversible, although not to baseline level in most patients. Analysis of a comprehensive range of 18 risk factors showed that older age, lower GFR at transplant, unrelated donor, diagnosis of AML, presence of diabetes mellitus at transplant, and duration of foscarnet use were significantly associated with CKD development, with the first three remaining independent risks for CKD in multivariate analysis. Long-term survival was not affected by renal function: 78.6% vs 85.5% for patients with low vs normal GFR at 2 years, respectively.
  • Treatment of cutaneous and/or soft tissue manifestations of corticosteroids refractory chronic graft versus host disease (cGVHD) by a total nodal irradiation (TNI)
    The management of corticosteroid-refractory chronic graft versus host disease (cGVHD) remains controversial. A retrospective analysis of patients treated at the Integrated Center of Oncology with total nodal irradiation (TNI) was performed to evaluate its therapeutic potential. TNI delivers a dose of 1 Gy in a single session. Field borders were defined clinically (upper limit: external auditory meatus; lower limit: mid-femur), and no pretreatment dosimetry CT was required. Efficacy was evaluated clinically 6 months after treatment. Twelve patients were treated with TNI between January 2010 and December 2013, as second-line treatment or beyond. The median time between allograft and TNI was 31.2 months, and the median time between the first manifestations of cGVHD and TNI was 24.2 months. Of the 12 patients, nine had a clinical response at 6 months (75%), including five complete clinical responses (41.6%). Five patients were able to reduce their corticosteroid doses. Three patients had hematologic toxicity. TNI could be considered an option for the treatment of corticosteroid-refractory cutaneous and/or soft tissue cGVHD. However, prospective, randomized, double-blind trials remain essential to establish the safety and effectiveness of TNI.
  • De novo DQ donor-specific antibodies are associated with worse outcomes compared to non-DQ de novo donor-specific antibodies following heart transplantation
    Background Antibody-mediated rejection (AMR) resulting from de novo donor-specific antibodies (dnDSA) leads to adverse outcomes following heart transplantation (HTx). It remains unclear what role dnDSA to specific HLA antigens play in adverse outcomes. This study compares outcomes in patients developing dnDSA to DQ antigens with those developing non-DQ dnDSA and those free from dnDSA. Methods The present study was a single-center, retrospective analysis of 122 consecutive HTx recipients. The primary outcome was a composite of death or graft dysfunction. Results After 3.3 years of follow-up, 31 (28%) patients developed dnDSA. Mean time to dnDSA was 539 days. Of the 31 patients, 19 developed DQ antibodies and 12 developed non-DQ antibodies. Compared to non-DQ dnDSA, DQ antibodies presented with higher MFI values (P=.001), were more likely to be persistent (P=.001), and appeared later post-HTx (654 vs 359 days, P=.035). In a multivariable analysis, DQ dnDSA was associated with increased risk of the primary endpoint (HR 6.15, 95% CI 2.57-14.75, P=.001), whereas no increased risk was seen with non-DQ dnDSA (P=.749). Conclusions dnDSA to DQ antigens following HTx are associated with increased risk of death and graft dysfunction.
  • Prediction model for cardiac allograft vasculopathy: Comparison of three multivariable methods
    Background Cardiac allograft vasculopathy (CAV) remains an important cause of graft failure after heart transplantation (HT). Although many risk factors for CAV have been identified, there are no clinical prediction models that enable clinicians to determine each recipient's risk of CAV. Methods We studied a cohort of 14 328 heart transplant recipients whose data were reported to the International Society for Heart and Lung Transplantation Registry between 2000 and 2010. The cohort was divided into training (75%) and test (25%) sets. Multivariable modeling was performed in the training set, with variables available at the time of heart transplant, using three methods: (i) stepwise Cox proportional hazards, (ii) regularized Cox proportional hazards, and (iii) Bayesian network. Results Cardiac allograft vasculopathy developed in 4259 recipients (29.7%) at a median time of 3.0 years after HT. The regularized Cox proportional hazards model yielded the optimal performance and was also the most parsimonious. We deployed this model as an Internet-based risk calculator application. Conclusions We have developed a clinical prediction model for assessing a recipient's risk of CAV using variables available at the time of HT. Application of this model may allow clinicians to determine which recipients will benefit from interventions to reduce the risk of development and progression of CAV.
  • Cognitive function after heart transplantation: Comparing everolimus-based and calcineurin inhibitor-based regimens
    Background Studies have shown conflicting results concerning the occurrence of cognitive impairment after successful heart transplantation (HTx). Another unresolved issue is the possible differential impact of immunosuppressants on cognitive function. In this study, we describe cognitive function in a cohort of HTx recipients and subsequently compare cognitive function between subjects on either everolimus- or calcineurin inhibitor (CNI)-based immunosuppression. Methods Cognitive function, covering attention, processing speed, executive functions, memory, and language functions, was assessed with a neuropsychological test battery. Thirty-seven subjects were included (everolimus group: n=20; CNI group: n=17). The extent of cerebrovascular pathology was assessed with magnetic resonance imaging. Results About 40% of subjects had cognitive impairment, defined as performance at least 1.5 standard deviations below the normative mean in one or several cognitive domains. Cerebrovascular pathology was present in 33.3%. There were no statistically significant differences between treatment groups across cognitive domains. Conclusions Given the high prevalence of cognitive impairment in this sample and its known negative impact on clinical outcome, our results indicate that cognitive assessment should be an integrated part of routine clinical follow-up after HTx. However, everolimus- and CNI-based immunosuppressive regimens did not show differential impacts on cognitive function.
  • Professional interpersonal dynamics and burnout in European transplant surgeons
    Background Burnout within the health professions has become an increasingly important topic. Evidence suggests there are differences in burnout across different countries. Research has yet to examine burnout in transplant surgeons throughout Europe. Methods A cross-sectional survey of transplant surgeons across Europe. The survey included sociodemographics, professional characteristics, frequency of and discomfort with difficult patient interactions (PI), decisional autonomy, psychological job demands (PJD), support (coworker, supervisor, and hospital administration), and burnout, including emotional exhaustion (EE), depersonalization (DP), and personal accomplishment (PA). Results One hundred and eight transplant surgeons provided data; 33 (30.6%) reported high EE, 19 (17.6%) reported high DP, and 29 (26.9%) reported low PA. Three hierarchical multiple linear regressions examined the burnout subscales as outcomes (EE, DP, and PA), with predictors selected based on theoretical relationships with the outcomes. Greater PJD, greater discomfort in managing difficult PI, and lower levels of perceived supervisor support (SS) predicted greater EE. Only decisional autonomy significantly predicted DP, accounting for a small proportion of the variance. None of the steps for PA were significant. Conclusions Given prior research on burnout, there were several surprising findings from this study; for example, levels of EE were relatively low compared to those reported by U.S. physicians and surgeons. At this time, we can only hypothesize why this finding occurred, but there are multiple possible explanations, including cultural effects, response bias, or other factors unknown at this time. Research is needed to clarify these findings.
  • Physical activity in solid organ transplant recipients: Participation, predictors, barriers, and facilitators
    Background Our objectives were to describe the physical activity (PA) levels and the predictors, barriers, and facilitators of PA in solid organ transplant (SOT) recipients. Methods A web-based questionnaire was sent to members of the Canadian Transplant Association, including the Physical Activity Scale for the Elderly (PASE) and questions regarding barriers and facilitators of PA. Results One hundred and thirteen SOT recipients completed the survey. The median PASE score was 164.5 (24.6-482.7). Re-transplantation was the only statistically significant predictor of PA levels. The most common facilitators of PA included a feeling of health from activity (94%), motivation (88%), social support (76%), knowledge and confidence about exercise (74%), and physician recommendation (59%). Influential barriers were the cost of fitness centers (42%), side effects post-transplant or from medications (41%), insufficient exercise guidelines (37%), and feelings of less strength post-transplant (37%). Conclusion There is a large variation in PA levels among SOT recipients, and multiple factors may explain this variance. Identification of facilitators and barriers to PA can inform the development of health and educational promotion strategies to improve participation among SOT recipients with low activity levels.
  • Effect of transversus abdominis plane block in combination with general anesthesia on perioperative opioid consumption, hemodynamics, and recovery in living liver donors: A prospective, double-blinded, randomized study
    Background Transversus abdominis plane (TAP) block provides effective postoperative analgesia after abdominal surgeries. It can also be a useful strategy to reduce perioperative opioid consumption, support intraoperative hemodynamic stability, and promote early recovery from anesthesia. The aim of this prospective, randomized, double-blind study was to assess the effect of subcostal TAP blocks on perioperative opioid consumption, hemodynamics, and recovery time in living liver donors. Methods This prospective, double-blinded, randomized controlled study was conducted with 49 living liver donors, aged 18-65 years, who were scheduled to undergo right hepatectomy. Patients who received a subcostal TAP block in combination with general anesthesia were allocated to Group 1, and patients who received general anesthesia alone were allocated to Group 2. The TAP blocks were performed bilaterally under real-time ultrasound guidance using 0.5% bupivacaine diluted with saline to a total volume of 40 mL. The primary outcome measure in our study was perioperative remifentanil consumption. Secondary outcomes were mean blood pressure (MBP), heart rate (HR), mean desflurane requirement, anesthesia recovery time, frequency of emergency vasopressor use, total morphine use, and length of hospital stay. Results Total remifentanil consumption and anesthesia recovery time were significantly lower in Group 1 than in Group 2. Postoperative total morphine use and length of hospital stay were also reduced. There were no significant differences in HR or MBP between groups at any time. Conclusions Combining subcostal TAP blocks with general anesthesia significantly reduced perioperative and postoperative opioid consumption and shortened anesthesia recovery time and length of hospital stay in living liver donors.
  • Erythrocytosis after allogeneic hematopoietic stem cell transplantation
  • Candida is an emerging pathogen beyond the neutropenic period of allogeneic hematopoietic cell transplantation
  • Higher Anti-A/B isoagglutinin titers of IgG class, but not of IgM, are associated with increased red blood cell transfusion requirements in bone marrow transplantation with major ABO-mismatch
    Background Major ABO mismatch between donor and recipient in bone marrow transplantation (BMT) may cause hemolysis, delayed red blood cell (RBC) engraftment, and pure red cell aplasia (PRCA), which result in increased transfusion needs. High pretransplant anti-A/B antibody titers have been associated with an increased risk of PRCA. Herein, we studied the impact of anti-A/B titers on transfusion needs after BMT with major ABO mismatch. Methods We reviewed the medical charts of 27 patients who underwent BMT with major ABO mismatch and categorized them into two groups according to anti-A/B titers of IgG class (≤16 and ≥32). We recorded the number of RBC and platelet units transfused in the first 180 days after transplantation. We also evaluated the impact of anti-A/B titers on overall survival. Results Patients with an IgG anti-A/B titer ≥32 required more RBC transfusions than patients with a titer ≤16 (21.29±14.68 vs 6.60±4.55; P=.03). Anti-A/B titers of IgM class had no impact on either RBC or platelet transfusion needs. Anti-A/B titers had no impact on overall survival. Conclusion Higher titers of anti-A/B antibodies of IgG class, but not of IgM, are associated with a higher demand for RBC transfusion.



Notice for patients:
This page contains urological information intended for healthcare professionals.
If you have a problem related to this condition,
consult your urologist or family doctor.
If you would like information designed for patients and the general public, you may visit:

Portal de Información Urológica para Pacientes



Carlos Tello Royloa


Last updated: 08-Apr-2013