uroportal

uroportal.es

The Urology Portal in Spanish for professionals

This month in... Clinical Transplantation:

  • Dual Kidney Transplant Techniques: A Systematic Review
    Background Dual kidney transplantation (DKT) was developed to improve outcomes of transplantation from extended criteria donors (ECD). This paper examined which surgical techniques have been reported for DKT, and whether any technique had superior patient and graft survival. Method Electronic databases were searched for published studies mapping to the MeSH terms "kidney or renal" AND "transplan*" AND "dual or double". Single case reports; studies of patients less than 18 years old; studies which did not describe the surgical technique; and studies that did not report patient or graft survival were excluded. Results Fifteen reports of 434 DKT recipients were identified. Three techniques were described: bilateral placement; unilateral placement with separate anastomoses; and unilateral placement with patch anastomoses. Patient survival across all three techniques was over 95% at one year, and graft survival was similarly over 90%. Rates of delayed graft function were between 20% and 30% across all techniques. Conclusion The three techniques have equivalent rates of delayed graft function, patient survival, and graft survival. This is encouraging, as it means the surgeon can choose the technique most appropriate for their own skills and for the patient.
  • Dual En-bloc Technique for Adult Renal Transplantation
    Introduction We describe and provide follow-up for a novel simplified technique permitting dual en-bloc (DEB) transplantation of adult organs using single in situ arterial and venous anastomoses. Methods Twenty-two adult DEB transplants were performed at our centre between 2001 and 2012, utilizing 44 kidneys en bloc. Results were compared with those of 20 solitary transplants from expanded criteria donors (ECD), whose donors had lower terminal serum creatinines and Remuzzi biopsy scores than the DEB group. Adult DEB implants had the donor inferior vena cava connected to the recipient external iliac vein and a “Y” arterial interposition graft anastomosed to the recipient iliac artery. Ureters were conjoined prior to implantation as a single patch into the recipient bladder. Results Mean operative time was 206 ± 57 minutes in DEB vs 180 ± 30 minutes in single transplants (p < 0.05). The delayed graft function rate was 23% and 25%, respectively. At 12-month follow-up, mean serum creatinine was 152 ± 66 μmol/L in DEB vs 154 ± 52 μmol/L in single kidney transplant recipients (p = NS). Three-year overall and graft-specific survival were 86% and 84% in the DEB group, respectively (p = NS). Complication rates were similar between groups. Conclusions This DEB renal transplantation technique is safe and effective in adults. By employing techniques used to conjoin organ vasculature ex vivo, the number of in situ anastomoses is reduced, thereby minimizing operative ischemic time and the potential for complications associated with extensive vascular dissection.
  • The Impact of Calcineurin Inhibitors on Neutrophil Gelatinase-Associated Lipocalin and Fibroblast Growth Factor 23 in Long-term Kidney Transplant Patients
    Background Neutrophil gelatinase-associated lipocalin (NGAL), a protein with bacteriostatic functions rapidly excreted from stimulated or damaged epithelial cells, is elevated in acute and chronic kidney disease. A calcineurin-dependent signaling pathway for fibroblast growth factor 23 (FGF23) has been revealed, but the effect of calcineurin inhibitors (CNIs) on the levels of NGAL and markers of mineral metabolism in long-term kidney transplant patients has not been explored. Methods In a cross-sectional study, 39 patients who received a first kidney transplant more than 10 years ago were split into two groups based on whether (n=28) or not (n=11) they used CNIs. Only patients with well-functioning grafts, defined as an eGFR ≥45 ml/min per 1.73 m2, were included. Results The median levels of NGAL, intact parathyroid hormone (iPTH), and iFGF23 were significantly higher in CNI users vs CNI non-users: 167.0 (134.0-235.0) ng/ml vs 105.0 (91.3-117.0) ng/ml, p<0.001; 13.8 (10.0-17.3) pmol/l vs 8.4 (6.4-9.9) pmol/l, p=0.003; and 81.6 (56.4-116.5) pg/ml vs 61.8 (43.3-72.1) pg/ml, p=0.04, respectively. Conclusions The median levels of iFGF23 were higher in CNI users compared to CNI non-users, supporting the notion of a CNI-induced FGF23 resistance in the parathyroid. The net result of CNI side effects needs to be further explored.
  • Race/Ethnicity is Associated with ABO Non-Identical Liver Transplantation in the United States
    United Network for Organ Sharing (UNOS) policies allow ABO non-identical liver transplantation (LT) in candidates with Model for End-Stage Liver Disease (MELD) scores greater than 30. Previous studies showed ABO non-identical LT resulted in an 18% and 55% net gain in livers for B and AB candidates, respectively. These results suggested that the current liver ABO allocation policies may need refinement. There are, however, strong associations between ABO blood groups and race/ethnicity. We hypothesized that race/ethnicity is associated with ABO non-identical LT and that this is primarily influenced by recipient ABO status. We examined non-status 1 adult candidates registered between 7/1/2013 and 12/31/2015. There were 27,835 candidates (70% Non-Hispanic White, 15% Hispanic, 9% Black, 4% Asian, 1% Other/Multiracial). 11,369 underwent deceased donor LT: 93% ABO-identical, 6% ABO-compatible, and 1% ABO-incompatible. Black and Asian race/ethnicity were associated with increased likelihoods of ABO non-identical LT. Adjustment for disease etiology, listing MELD, transplant center volume, and UNOS region did not alter this association. Stepwise inclusion of recipient ABO status, however, eliminated this significant association of race/ethnicity with ABO non-identical LT. Blacks and Asians may be advantaged by ABO non-identical LT, and we suspect that changes to the existing policies may disproportionately impact these groups.
  • Patterns and Correlates of Adherence to Self-Monitoring in Lung Transplant Recipients during the First 12 Months after Discharge from Transplant
    Self-monitoring of lung function, vital signs, and symptoms is crucial for lung transplant recipients (LTRs) to ensure early detection of complications and prompt intervention. This study sought to identify patterns and correlates of adherence to self-monitoring among LTRs over the first 12 months post-discharge from transplant. This study analyzed existing data from the usual care arm participants of a randomized clinical trial who tracked self-monitoring activities using paper-and-pencil logs. Adherence was calculated as the percent of days LTRs recorded any self-monitoring data per interval: hospital discharge-2 months, 3-6 months, and 7-12 months. The sample (N=91) was mostly white (87.9%) and male (61.5%), with a mean age of 57.2±13.8 years. Group-based trajectory analyses revealed 2 groups: 1) moderately adherent with slow decline (n=29, 31.9%) and 2) persistently nonadherent (n=62, 68.1%). Multivariate binary logistic regression revealed that the following baseline factors increased the likelihood of membership in the persistently nonadherent group: female sex (p=.035), higher anxiety (p=.008), and a weaker sense of personal control over health (p=.005). Poorer physical health over the 12 months was also associated with persistent nonadherence (p=.004). This study highlighted several modifiable factors for future interventions to target, including reducing post-transplant anxiety and strengthening the sense of personal control over health in LTRs.
  • Accuracy of Computed Tomography for Detecting Hepatic Steatosis in Donors for Liver Transplantation: A Meta-analysis
    Background The accuracy of computed tomography (CT) for detecting donor hepatic steatosis (HS) before liver transplantation is not well established. Methods A meta-analysis was performed to determine the accuracy of CT for HS detection in liver donor candidates. Pooled sensitivity, specificity, positive and negative likelihood ratios, hierarchical summary receiver operating characteristic (HSROC) curves, and the area under the curve (AUC) were estimated using HSROC and bivariate random-effects models. Results Twelve studies involving 1782 subjects were eligible for this meta-analysis. For detecting significant HS (>10% to 30% steatosis on liver pathology) with CT in liver donors, the pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio were 0.81 (95% confidence interval [CI]: 0.70-0.89), 0.94 (95% CI: 0.90-0.96), 13.7 (95% CI: 8.1-23.1), and 0.20 (95% CI: 0.12-0.33), respectively. The AUC was 0.95 (95% CI: 0.92-0.96). For detecting the presence of any HS, the corresponding diagnostic estimates were 0.50 (95% CI: 0.36-0.64), 0.90 (95% CI: 0.83-0.95), 5.2 (95% CI: 3.1-8.9), 0.55 (95% CI: 0.42-0.72), and 0.80 (95% CI: 0.76-0.83). Moderate to high heterogeneity was detected. Conclusion CT shows high accuracy for detecting significant HS but poor accuracy for detecting the presence of any HS in liver donors. Donors estimated to have significant HS by CT may avoid unnecessary liver biopsy.
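    For orientation, the likelihood ratios above follow from the pooled sensitivity and specificity by their standard definitions (the pooled values come from a bivariate random-effects model, so this reader's check is only approximate):

        LR+ = sensitivity / (1 - specificity) = 0.81 / (1 - 0.94) ≈ 13.5   (reported: 13.7)
        LR- = (1 - sensitivity) / specificity = (1 - 0.81) / 0.94 ≈ 0.20   (reported: 0.20)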
  • No effect of HLA-C mismatch after allogeneic hematopoietic stem cell transplantation with unrelated donors and T-cell depletion in patients with hematological malignancies
    HLA-C mismatch in unrelated donor hematopoietic stem cell transplantation (HSCT) has been associated with poor patient outcome. However, the impact of HLA-C mismatch in the context of HSCT combined with in vivo T-cell depletion remains unclear. We therefore performed a single-center, retrospective analysis of the clinical outcome of patients with hematological malignancies treated with allo-HSCT who underwent T-cell depletion. The majority of the patients (n=276) received an HLA-A, -B, -DRB1 matched graft that was either also HLA-C matched (n=260) or carried the permissive HLA-C*03:03/03:04 mismatch (n=16), while the remaining patients (n=95) received an HLA-C mismatched graft (excluding HLA-C*03:03/03:04 mismatches). We did not observe any significant differences between the HLA-C matched patients (including the permissive HLA-C*03:03/03:04 mismatch) and the HLA-C mismatched patients regarding cumulative proportion surviving, graft failure, relapse-free survival, relapse, or acute graft-versus-host disease. Our data suggest that in the context of high-dose T lymphocyte depleting agents, HLA-C matching is not essential for patients with hematological malignancies.
  • BK Viremia Surveillance and Outcomes in Simultaneous Pancreas-Kidney Transplant Recipients
    Background While screening for asymptomatic BK viremia (BKV) has been well studied in isolated kidney transplant recipients, there is a paucity of published outcomes in simultaneous pancreas-kidney (SPK) transplant recipients who underwent BKV screening followed by pre-emptive reduction of immunosuppression. Methods This is a single-center, retrospective review of 31 consecutive SPK recipients transplanted over a five-year period following the initiation of a serum BKV screening protocol. Results BK viremia developed in 11 (35.5%) patients, and all achieved complete viral clearance following reduction of immunosuppression. Two patients (6.5%) developed BK virus nephropathy, but both had preserved allograft function. One patient developed mild rejection of the kidney allograft following clearance of BKV, and two patients developed mild rejection of the pancreas allograft after reduction of immunosuppression, but there were no kidney or pancreas allograft losses due to rejection. The development of BK viremia did not impact overall patient survival or kidney and pancreas allograft survival. Conclusion Screening asymptomatic SPK recipients for BKV followed by reduction of maintenance immunosuppression appears to be an effective strategy to prevent kidney allograft dysfunction and graft loss due to BK virus nephropathy, without compromising pancreas allograft outcomes.
  • Kidney Outcomes in Patients with Liver Cirrhosis and Chronic Kidney Disease Receiving an Orthotopic Liver Transplant Alone
    Kidney transplant in patients with liver cirrhosis and non-dialysis chronic kidney disease (CKD) is controversial. We report 14 liver cirrhotic patients who had a persistently low MDRD-6 estimated glomerular filtration rate (e-GFR) < 40 ml/min/1.73 m2 for ≥ 3 months and underwent either liver transplant alone (LTA) (n=9) or simultaneous liver-kidney transplant (SLKT) (n=5). Pre-transplant, patients with LTA compared with SLKT had lower serum creatinine (2.5±0.73 vs. 4.6±0.52 mg/dl, P=0.001), higher MDRD-6 e-GFR (21.0±7.2 vs. 10.3±2.0 ml/min/1.73 m2, P=0.002), higher 24-hour urine creatinine clearance (34.2±8.8 vs. 18.0±2.2 ml/min, P=0.002), lower proteinuria (133.2±117.7 vs. 663±268.2 mg/24 hours, P=0.0002), and relatively normal kidney biopsy and ultrasound findings. Post-LTA, the e-GFR (ml/min/1.73 m2) increased in all 9 patients, with mean e-GFR at 1 month (49.8 ± 8.4), 3 months (49.6 ± 8.7), 6 months (49.8 ± 8.1), 12 months (47.6 ± 9.2), 24 months (47.9 ± 9.1), and 36 months (45.1 ± 7.3) significantly higher than pre-LTA e-GFR (P ≤ 0.005 at all time points). One patient developed end-stage renal disease nine years post-LTA, and another patient died seven years post-LTA. Low e-GFR alone, in the absence of other markers or risk factors of CKD, should not be an absolute criterion for SLKT in patients with liver cirrhosis.
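    For reference, the six-variable MDRD equation behind the e-GFR values above is usually written as follows (serum creatinine Scr and blood urea nitrogen BUN in mg/dL, serum albumin Alb in g/dL; result in ml/min/1.73 m2):

        e-GFR = 170 × Scr^-0.999 × age^-0.176 × BUN^-0.170 × Alb^0.318 × 0.762 (if female) × 1.180 (if black)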
  • An objective definition for clinical suspicion of T-cell mediated rejection after liver transplantation
    A uniform definition of clinical suspicion of T-cell mediated rejection (TCMR) in liver transplantation (LT) is needed to homogenize clinical decisions, especially within randomized trials. The present multicentre study included a total of 470 primary LT recipients. The derivation cohort consisted of 142 patients who had clinically driven liver biopsies at any time after LT. The external validation cohort included 328 patients who underwent protocol biopsies at day 7-10 after LT. The rates of moderate-severe histological TCMR were 33.8% in the derivation cohort and 43.6% in the validation cohort. Independent predictors (i.e., risk factors) of moderate-severe TCMR in the derivation cohort were: serum bilirubin >4 mg/dL (OR=5.83; p<0.001), rising bilirubin within the 4 days prior to liver biopsy (OR=4.57; p=0.003), and blood eosinophil count >0.1×10^9/L (OR=3.81; p=0.004). In the validation cohort, the number of risk factors was an independent predictor of moderate-severe TCMR (OR=1.74; p=0.001) after controlling for hepatitis C status. The number of risk factors paralleled the rates of moderate-severe TCMR in both the derivation and validation cohorts (p<0.001 in both comparisons). In conclusion, increased serum bilirubin, rising bilirubin, and eosinophilia are validated risk factors for moderate-severe histological TCMR and could be used as objective criteria to select candidates for liver biopsy.
  • High MELD score does not adversely affect outcome of Living Donor Liver Transplantation: Experience in 1000 recipients
    In countries where deceased organ donation is scarce, there is a big gap between demand and supply of organs, and living donor liver transplantation (LDLT) plays an important role in meeting this need. This study was conducted to analyse the effect of pre-transplant MELD score on outcomes following LDLT. The outcome of 1000 patients who underwent LDLT from July 2010 to March 2015 was analysed retrospectively. Patients were grouped into low (<25) and high (≥25) MELD groups to compare short-term outcomes. Cumulative overall survival rates were calculated using Kaplan-Meier methods. 849 recipients were in the low MELD group (mean MELD = 16.90±9.2) and 151 in the high MELD group (mean MELD = 28.77±7.2). No significant difference in etiology of chronic liver disease (CLD) was observed between groups except for a higher prevalence of hepatitis C virus (29.6% vs 19.9%, p=0.01) in low MELD patients. No significant difference was observed in one-year survival (88.5% vs 84.1%, p=0.12) between the groups. Multivariate analysis showed that pre-transplant MELD score does not predict survival of recipients. A high pre-transplant MELD score does not adversely affect outcomes after LDLT, and in view of the shortage of deceased organs, LDLT can be a good option in high MELD recipients.
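    The Kaplan-Meier figures quoted here are product-limit estimates: with d_i deaths among n_i patients still at risk at each observed event time t_i, cumulative survival is

        S(t) = ∏_{t_i ≤ t} (1 - d_i / n_i)

    so each death multiplies the running survival estimate by the fraction of at-risk patients who survived that time point.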
  • Long-term outcomes of Early Compared to Late Onset Choledocho-choledochal Anastomotic Strictures after Orthotopic Liver Transplantation
    Background Endoscopic treatment of anastomotic biliary stricture (ABS) after liver transplantation (LT) has been proven to be effective and safe, but long-term outcomes of early compared to late onset ABS have not been studied. The aim of this study was to compare the long-term outcome of early ABS to late ABS. Methods Of the 806 adult LT recipients (04/2006-12/2012), 93 patients met the criteria for inclusion and were grouped into non-ABS (no stenosis on ERCP, n=41), early ABS (stenosis < 90 days after LT, 18 (19.3%)), and late ABS (stenosis ≥ 90 days after LT, 34 (36.5%)). A propensity-matched control group (n=42) for the ABS group was obtained, matched for age, gender, and calculated MELD score at listing. Results The mean number of ERCPs (2.33±1.3 vs. 2.56±1.5, P=0.69) was comparable between the groups; however, significantly better long-term resolution of the stricture was noted in the early ABS group (94.44% vs. 67.65%, P=0.04). Kaplan-Meier analysis revealed the worst survival in the early ABS group compared to the non-ABS, late ABS, and control groups (P=0.0001). Conclusion LT recipients with early ABS have inferior graft survival despite a better response to endoscopic intervention.
  • Use of Intra-Aortic Counterpulsation in Cardiogenic Shock Post Liver Transplantation
    Left ventricular dysfunction resulting in cardiogenic shock occurs infrequently following organ reperfusion in liver transplantation. The etiology of the cardiogenic shock is often multifactorial and difficult to manage due to the complex nature of the procedure and the patient's baseline physiology. Traditionally, this hemodynamic instability is managed medically using inotropic agents and vasopressor support. If medical treatment is insufficient, an intra-aortic balloon pump for counterpulsation may be employed to improve hemodynamics and stabilize the patient. Here, we analyze three cases and review the literature.
  • Clinical and Virologic Outcomes in High-Risk Adult EBV Mismatched Organ Transplant Recipients
    Background EBV D+/R- organ transplant recipients are a high-risk group for developing PTLD. Little data are available on prevention in the adult EBV mismatched population. We conducted a retrospective study of EBV D+/R- organ transplants performed during 2002-2014. Of the 153 patients identified, 82.4% received antiviral prophylaxis with valganciclovir for a median of 4.5 months (range 0.8-22 months) and 36.6% underwent viral load monitoring in the first post-transplant year. EBV viremia developed in 67.2% of monitored patients. In viremic patients, immunosuppression was reduced in 20/37 (54.1%) in response to viremia, and 17/37 (45.9%) received therapeutic-dose valganciclovir. Among patients with EBV viremia who received valganciclovir and/or had a reduction in immunosuppression and had sufficient viral load time points (n=31), 28 (90.3%) had a significant decline in viral load at day 14 (median log decline 0.49 (0.24-0.64), p<0.001) and at day 30 (0.87 (0.52-1.21), p<0.001). PTLD developed in 27 (15%) patients (biopsy-proven=25, possible=2) at a median of 8 months (range 2.4-130) post-transplant, with the majority (81.5%) within the first year. In multivariate analysis, antiviral prophylaxis was not associated with a lower risk of PTLD, but viral load monitoring and use of mycophenolate (MMF) were protective.
  • Increased risk of portal vein thrombosis in patients with autoimmune hepatitis on the liver transplantation waiting list
    Liver transplantation (LT) is indicated in autoimmune hepatitis (AIH) both for acute presentation with liver failure and for end-stage chronic liver disease. A few studies have suggested an association between AIH and coagulation disorders, and a higher incidence of portal vein thrombosis (PVT) in patients with AIH listed for LT. The aim of this study was to determine the incidence of thrombotic complications, particularly PVT, in a cohort of 37 patients undergoing LT because of AIH. PVT was present before transplantation in 30% (n=11) of these patients, compared to 11% in the whole population transplanted in our centre (p=0.002). Comparing only patients with cirrhosis, PVT was present in 55% of the AIH group versus 12% in the whole cohort (p<0.001). Among patients with PVT before LT, no patient receiving anticoagulation therapy early after LT developed recurrence of PVT, whereas two patients (33%) without anticoagulation therapy did. The increased incidence of PVT in the pretransplant period and the possibility of thrombosis recurrence after LT suggest that patients with AIH and PVT could benefit from anticoagulation therapy after transplantation. However, further studies are needed before anticoagulation can be recommended in these patients in clinical practice.
  • Clinical Yield of Endoscopic Ultrasound and Endoscopic Ultrasound-Guided Fine-Needle Aspiration for Incidental Pancreatic Cysts in Kidney Transplant Evaluation
    Introduction For several reasons, including an elevated risk for malignancy after transplant, kidney transplant candidates undergo a thorough evaluation prior to transplantation. Further assessment of incidentally discovered pancreatic cysts on routine abdominal imaging has been assumed to be prudent, and the preferred method has been endoscopic ultrasound (EUS) and endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA). The clinical utility of EUS/EUS-FNA with respect to transplant decision-making has not been evaluated. Materials and Methods Kidney transplant candidates undergoing EUS/EUS-FNA for further evaluation of one or more pancreatic cysts were identified. The clinical yield of EUS/EUS-FNA was determined via retrospective chart review. Results After exclusion criteria were applied, a total of 15 cases were identified at a high-volume transplant center over a 71-month period. EUS/EUS-FNA was deemed to have had a clinically relevant impact in 73.3% of cases. Conclusion Kidney transplant candidates are a unique group with respect to the need to clarify the etiology of pancreatic cysts. EUS/EUS-FNA frequently provides information that is clinically relevant to the determination of transplant status.
  • Acute cellular rejection later than one year after heart transplantation: A single-center retrospective study at Skåne University Hospital in Lund 1988-2010
    Routine endomyocardial biopsy (EMB) to detect acute cellular rejection (ACR) late (>1 year) after heart transplantation (HT) remains debated. To gain knowledge on late ACR and thereby approach this issue, we studied the incidence, predictors, and outcome of late ACR. 815 late EMBs from 183 patients transplanted 1988-2010 were retrospectively reviewed until June 30, 2012. Only 4.4% of the routine and 17.6% of the additional, clinically indicated late EMBs showed ACR ≥ grade 2. With time post-HT, there was a clear trend towards fewer ACRs, a lower incidence of ACR per patient per year, and a slowing decline in the proportion of patients free from ACR. Sex mismatching and first-year ACR were associated with an increased risk of late ACR, which in turn was associated with worse outcome. Although late ACR is rare, comparison with our previous study of first-year EMBs suggests that late ACR more often remains undetected than early ACR, and that late ACR, not only early ACR, influences outcome. Extended EMB surveillance >1 year post-HT therefore still seems reasonable in “high-risk” patients, as also suggested in the ISHLT guidelines; such patients should include, but not be limited to, the two risk groups above.
  • Effect of Diltiazem on Exercise Capacity after Heart Transplantation
    Background Sinus tachycardia (ST) is common after heart transplantation (HTx). The aim of the study was to evaluate the effect of diltiazem treatment during the first year after HTx on heart rate (HR), cardiac allograft function, and exercise capacity. Methods From the total cohort, 25 HTx recipients started diltiazem treatment 4±2 weeks after HTx and continued it for at least 1 year (diltiazem group). Each study case was matched to a control. All patients underwent hemodynamic assessment and a cardiopulmonary exercise test (CPET) one year after HTx. Results HR decreased in the diltiazem group from 99±11 bpm to 94±7 bpm (p=0.03) and did not change in the controls (98±11 bpm vs. 100±13 bpm, p=0.14); the difference between the groups at 1 year after HTx was significant (p=0.04). In the diltiazem group, left ventricular (LV) stroke volume and ejection fraction increased (48 ± 16 vs. 55 ± 17 ml, p=0.02, and 60 ± 10% vs. 62 ± 12%, p=0.03, respectively) but did not differ from controls. E/E′ decreased (10.7 ± 2.7 vs. 7.3 ± 1.9, p=0.003) while cardiac index was higher (3.5 ± 0.8 vs. 3.1 ± 0.5; p=0.05) in the diltiazem group at 1-year follow-up. Absolute peak VO2 (21 ± 4 vs. 18 ± 6 ml/kg/min; p=0.05) and normalized peak VO2 (73 ± 17% vs. 58 ± 14%; p=0.004) were significantly higher in the diltiazem group. Conclusions This study showed that diltiazem treatment reduces ST and may improve cardiac allograft function and exercise tolerance during the first year after HTx.
  • Prolonged Corrected QT Interval in the Donor Heart: Is There a Risk?
    Background Assessing the QT interval in donors is important to exclude long QT syndrome as the cause of death. A donor heart with a corrected QT (QTc) > 500 msec is often concerning. We sought to evaluate first-year outcomes for recipients of donor hearts with a QTc interval > 500 msec. Methods Between 2010 and 2014, we assessed 257 donor hearts for a QTc interval > 500 msec. Post-transplant outcomes included 1-year survival, 1-year freedom from any-treated rejection, 1-year freedom from CAV (defined as stenosis ≥ 30% by angiography), and 1-year freedom from non-fatal major adverse cardiac events. Results Recipients of hearts with a donor QTc interval > 500 msec had significantly lower 1-year freedom from CAV development. There were no significant differences for the other outcomes. A significantly higher percentage of donors with QTc > 500 msec had a stroke or subarachnoid hemorrhage. Multivariate analysis found that donor QTc > 500 msec was associated with a 6.7-fold increased risk of developing CAV (p=0.029, 95% CI 1.21-36.6) after adjusting for other known risk factors. Conclusion QTc > 500 msec in the donor heart appears to be an independent risk factor for the development of early CAV after heart transplantation, possibly due to a higher immunological risk.
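    The abstract does not state which rate correction was applied; the most common, Bazett's formula, divides the measured QT interval by the square root of the RR interval in seconds:

        QTc = QT / √RR    e.g., QT = 450 ms at 75 bpm (RR = 0.8 s) gives QTc ≈ 450 / 0.894 ≈ 503 ms

    so a borderline QT can cross the 500 msec threshold at even modest tachycardia.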
  • Elevated Donor Hemoglobin A1c Does Not Impair Early Survival in Cardiac Transplant Recipients
    Background Orthotopic heart transplantation (OHT) is the gold-standard therapy for end-stage heart failure. An increasing deficit between suitable allograft availability and clinical demand for OHT exists. The role of donor diabetes mellitus (DM) on post-transplant recipient outcomes in OHT is controversial. The purpose of this study was to examine donor hemoglobin A1c (HbA1c) levels to identify the impact of donor glycemic control on recipient survival. Methods Adult OHT recipients with donor HbA1c data were identified in the UNOS database from 2010-2015. Recipients were stratified on the basis of donor glycemic status: hyperglycemic-donor and euglycemic-donor cohorts, defined as HbA1c levels ≥6.5% and <6.5%, respectively. Outcomes were compared between unadjusted and propensity-matched hyperglycemic versus euglycemic donors. The primary endpoint was 3-year survival. Results Of 5342 OHT recipients, 208 (3.9%) received an allograft from a hyperglycemic donor and 5134 (96.1%) from a euglycemic donor. There was no significant difference in survival in the hyperglycemic group before (p = 0.87) or after (p = 0.78) propensity matching. Conclusions No difference in survival was noted in recipients who received allografts from hyperglycemic donors. These results suggest that recent cumulative donor glycemic status alone may not be an important predictor of recipient outcomes.
  • Follow-up of patients with refractory or relapsed multiple myeloma after allogeneic hematopoietic cell transplantation
    Background The role of allogeneic hematopoietic cell transplantation (HCT) for the treatment of multiple myeloma is controversial. However, the introduction of proteasome inhibitors and immunomodulatory drugs might influence outcomes in cases of relapsed or refractory disease after allogeneic HCT. Methods We report 41 consecutive patients who underwent allogeneic HCT for the treatment of relapsed or refractory multiple myeloma. Results Three-year event-free survival (EFS) and overall survival (OS) of the whole cohort were 15% and 51%, respectively. In a subgroup analysis, allogeneic HCT after a second high-dose chemotherapy with autologous stem cell support was associated with decreased 3-year EFS (6% vs. 24%, p=0.04) and OS (35% vs. 64%, p=0.09). In cases of relapsed or refractory disease after allogeneic HCT, treatment with proteasome inhibitors or immunomodulatory drugs significantly improved survival (1-year OS 79% vs. 29%, p=0.001). Conclusion The incorporation of proteasome inhibitors and immunomodulatory drugs into transplant protocols has the potential to improve outcomes and to refine the role of allogeneic HCT as a platform for long-term disease control in multiple myeloma.
  • The Association of Donor Age and Survival is Independent of Ischemic Time Following Deceased Donor Lung Transplantation
    Purpose Early research suggests that prolonged ischemic time in older donor lungs is associated with decreased survival following lung transplantation. The purpose of this study was to determine whether this association holds in the post-LAS era. Methods We analyzed the UNOS database 2005-2013 for adult recipients of cadaveric lung transplants. Cox proportional hazards modeling was used to determine the association of donor age, ischemic time, and the interaction of donor age and ischemic time with transplant-free survival. Results 11,835 patients met criteria. Median donor age was 32 years and median ischemic time was 4.9 hours. Cox modeling demonstrated that donor age 50-60 (adjusted HR: 1.11) and ≥60 (adjusted HR: 1.42) were associated with reduced overall survival. Neither ischemic time nor the interaction of ischemic time and donor age was significantly associated with overall survival. Sub-analysis demonstrated that this finding held for patients undergoing either single or bilateral lung transplantation. Conclusions Prolonged ischemic time is not associated with decreased overall survival in patients undergoing lung transplantation, regardless of the donor's age. However, donor age >50 is independently associated with decreased survival. The lack of an association between ischemic time and survival should encourage broader geographic allocation of pulmonary allografts.
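    A minimal sketch of the model implied here (with donor age written as a single covariate for simplicity, though the study used age categories): the interaction test adds a product term to the Cox hazard,

        h(t | age, IT) = h0(t) · exp(β1·age + β2·IT + β3·age·IT)

    where IT is ischemic time, and asks whether β3 differs from zero. The reported finding is that neither the ischemic time term nor the interaction term reached significance, while the donor age terms did.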
  • Decisional Conflict between Treatment Options among End-Stage Renal Disease Patients Evaluated for Kidney Transplantation
    Although kidney transplantation provides a significant benefit over dialysis, many end-stage renal disease (ESRD) patients are conflicted about their decision to undergo kidney transplant. We aimed to identify the prevalence of, and characteristics associated with, decisional conflict between treatment options in ESRD patients presenting for transplant evaluation. Among a cross-sectional sample of ESRD patients (n=464) surveyed in 2014 and 2015, we assessed decisional conflict through a validated 10-item questionnaire. Decisional conflict was dichotomized into no decisional conflict (score = 0) and any decisional conflict (score > 0). We investigated potential characteristics of patients with decisional conflict using bivariate and multivariable logistic regression. The overall mean age was 50.6 years, with 62% male and 48% African American patients. Nearly half (48.5%) of patients had decisional conflict regarding treatment options. Characteristics significantly associated with decisional conflict in multivariable analysis included male sex, lower educational attainment, and less transplant knowledge. Understanding the characteristics associated with decisional conflict in ESRD patients could help identify those who may benefit from targeted interventions to support informed, value-based, and supported decisions about how best to treat their kidney disease.
  • Patterns of primary care utilization before and after living kidney donation
    Background Annual visits with a primary care provider (PCP) are recommended for living kidney donors to monitor long-term health post-donation, yet adherence to this recommendation is unknown. Methods We surveyed 1170 living donors from our center who donated between 1970 and 2012 to ascertain the frequency of PCP visits pre- and post-donation. Interviews occurred a median (IQR) of 6.6 (3.8-11.0) years post-transplant. We used multivariate logistic regression to examine associations between donor characteristics and PCP visit frequency. Results Overall, only 18.6% had less-than-annual PCP follow-up post-donation. The strongest predictor of post-donation PCP visit frequency was pre-donation PCP visit frequency: donors who had less-than-annual PCP visits before donation were substantially more likely to report less-than-annual PCP visits post-donation (OR = 14.4, 95% CI 9.8-21.0, p<0.001). Men were more likely to report less-than-annual PCP visits post-donation (adjusted OR = 1.6, 95% CI 1.2-2.3, p<0.01); this association was amplified in unmarried/non-cohabiting men (aOR = 3.9, 95% CI 2.4-6.3, p<0.001). Donors without college education were also more likely to report less-than-annual PCP visits post-donation (aOR = 1.8, 95% CI 1.3-2.5, p=0.001). Conclusions The importance of annual PCP visits should be emphasized to all living donors, especially those with less education, men (particularly single men), and donors who did not see their PCP annually before donation.
  • Predictors of outcome among patients on extracorporeal membrane oxygenation as a bridge to lung transplantation
    Background There is a lack of data regarding clinical variables associated with successful bridge to lung transplantation (LT) using extracorporeal membrane oxygenation (ECMO) support. Methods We reviewed the institutional database for patients supported with veno-venous (VV) or veno-arterial ECMO as a bridge to LT (n=25; mean age: 50.6±14.2 years). We recorded clinical and laboratory variables, echocardiographic findings, and the development of organ dysfunction, along with hospital and one-year survival. Variables were compared between patients successfully bridged to LT and those who were not. Results The most common diagnostic group was interstitial lung disease (18/25, 72%). VV-ECMO was used in the majority (84%). Fifteen patients (60%) were successfully bridged to LT, and most were alive at one year (14/15, 93.3%). The presence of right ventricular systolic dysfunction on pre-ECMO echocardiogram was associated with an increased risk of unsuccessful bridging (OR 2.67, 95% CI 1.01-6.99, p=0.041). While on ECMO, trough albumin levels <2.5 g/dL, peak blood urea nitrogen levels >35 mg/dL, and a positive fluid balance were also associated with failure to bridge to LT. Conclusions Among patients awaiting LT, the presence of RV systolic dysfunction before ECMO initiation, along with worsening renal function, low albumin levels, and volume overload, is associated with poor outcomes.
  • Early Post-Operative Management After Lung Transplantation: Results of an International Survey
    Introduction Few data exist regarding optimal therapeutic strategies in the early post-operative period after lung transplant (LTx). Current practice patterns rely on expert opinion and institutional experience, resulting in non-uniform post-operative care. To better define current practice patterns, an international survey of LTx clinicians was conducted. Methods A 30-question survey was sent to transplant clinicians via email to the ISHLT open forum mailing list and directly to the chief transplant surgeon and pulmonologist of all LTx centers in the United States. Results 52 clinicians representing 10 countries responded to the survey. Sedative use patterns included: opiates + propofol (57.2%), opiates + dexmedetomidine (18.4%), opiates + intermittent benzodiazepines (14.3%), opiates + continuous benzodiazepines (8.2%), and opiates alone (2%). 40.4% reported that no formal sedation scale was followed, and 13.5% of programs had no formal policy on sedation and analgesia. A lung-protective strategy was commonly employed, with 13.8%, 51.3%, and 35.9% of respondents using tidal volumes of < 6 mL/kg ideal body weight (IBW), 6 mL/kg IBW, and 8 mL/kg IBW, respectively. Conclusion Practice patterns in the early post-operative care of lung transplant recipients differ considerably among centers. Many of the reported practices do not conform to consensus guidelines on the management of critically ill patients.
  • Subsequent malignancies after allogeneic hematopoietic stem cell transplantation
    We evaluated 979 patients for the development of post-transplant lymphoproliferative disease (PTLD) and solid malignancies as late complications of allogeneic hematopoietic stem cell transplantation (allo-HSCT). We found 15 (1.5%) subsequent malignancies; three were PTLD and twelve were solid tumors. The median time from allo-HSCT to the development of PTLD was 9 (3-20) months, and that to the development of solid tumors was 93 (6-316) months. The cumulative incidence of developing a subsequent malignancy was 1.3% (±0.5 SE) at 5 years and 3.9% (±1.2 SE) at 10 years. The cumulative incidence of developing a subsequent malignancy in patients with benign hematological diseases as the transplant indication was 7.4%±4.2 SE at 5 years. Subsequent malignancies developed more often in patients with chronic graft-vs-host disease (GVHD) lasting ≥1 year (3.7% vs 0.7% in the <1 year group, P=.002). Subsequent epithelial tumor risk was likewise higher in patients with ≥1 year of chronic GVHD than in those with <1 year (3.7% vs 0.1%, P<.001). In multivariate analysis, benign hematological disease as the transplant indication (RR: 5.6, 95% CI: 1.4-22.3, P=.015) and ≥1 year of chronic GVHD (RR: 7.1, 95% CI: 2.3-22.5, P=.001) were associated with the development of subsequent malignancy.
  • Endoscopic findings following retroperitoneal pancreas transplantation
    Aim To evaluate the efficacy of endoscopic methods for the diagnosis and correction of surgical and immunological complications after retroperitoneal pancreas transplantation. Materials and Methods From October 2011 to March 2015, 27 patients underwent simultaneous retroperitoneal pancreas-kidney transplantation (SPKT). Diagnostic oesophagogastroduodenoscopy (EGD) with protocol biopsy of the donor and recipient duodenal mucosa and endoscopic retrograde pancreatography (ERP) were performed to detect possible complications. Endoscopic stenting of the main pancreatic duct with plastic stents and three-stage endoscopic hemostasis were used to correct the identified complications. Results Endoscopic methods showed high efficiency in the timely diagnosis and adequate correction of complications after retroperitoneal pancreas transplantation.
  • Diabetic kidney transplant recipients: Impaired infection control and increased alloreactivity
    Background Post-transplantation diabetes mellitus (PTDM) has been associated with inferior patient and allograft outcomes. However, previous studies did not identify differences in infection control and alloreactivity. Methods We studied 449 kidney transplant recipients (KTRs) between 2005 and 2013. Fifty (11.1%) KTRs were diagnosed with PTDM and 60 (13.4%) KTRs had pre-existing diabetes. Samples were collected pre-transplantation and at +1, +2, and +3 months post-transplantation. CMV-specific and alloreactive T cells were quantified by interferon-γ ELISpot assay. Lymphocyte subpopulations were quantified by flow cytometry. Results Upon multivariate analysis, age, obesity, and the use of tacrolimus increased the risk of PTDM (P<.05). KTRs with pre-existing diabetes/PTDM showed higher rates of sepsis (P<.01). Total CD3+ and CD4+ T cell counts were significantly lower in KTRs with PTDM/pre-existing diabetes post-transplantation (P<.05). No differences were observed for CMV-specific T cells between any of the groups (P>.05). KTRs developing PTDM showed increased frequencies of alloreactive T cells post-transplantation (P<.05). Conclusions Our results suggest higher rates of infection in KTRs with pre-existing diabetes/PTDM, which may be attributed to impaired overall immunity. Higher frequencies of alloreactive T cells contribute to inferior long-term outcomes. As acute rejection, but not pre-existing diabetes/PTDM, was associated with inferior allograft survival and function, maintaining adequate immunosuppression to prevent rejection seems important.
  • The National Living Donor Assistance Center perspective on barriers to the use of federal travel grants for living donors
    Recent research has identified important barriers that potential living organ donors face in using travel reimbursement funds from the National Living Donor Assistance Center (NLDAC). In this article, we provide clarification and comment on these potential barriers from the NLDAC program perspective. The goal of financial neutrality for living donors requires further action. We discuss recent developments and further steps that may help achieve this goal and ultimately help reduce the shortage of donor organs.
  • Post-transplant lymphoproliferative disease in lung transplantation: A nested case-control study
    Post-transplant lymphoproliferative disorder (PTLD) may compromise the long-term outcome of lung transplant (LTx) recipients. A case-control study was performed comparing LTx recipients with PTLD (n=31) to matched recipients without PTLD (controls, n=62). Risk factors for PTLD and post-transplant outcomes were assessed. PTLD prevalence was 3.9%, time to PTLD was 323 (166-1132) days, and 54.8% had early-onset versus 45.2% late-onset PTLD. At LTx, more Epstein-Barr virus (EBV)-seronegative patients were present in the PTLD group (42%) than among controls (5%) (P<.0001), most of whom had undergone EBV seroconversion by PTLD diagnosis. EBV viral load was higher in PTLD versus controls (P<.0001). Overall, lower hemoglobin and higher C-reactive protein levels were present in PTLD versus controls (P<.0001). EBV status at LTx (P=.0073) and EBV viral load at PTLD diagnosis (P=.0002) were the most important risk determinants for later PTLD. Patients with PTLD demonstrated a shorter time to onset of chronic lung allograft dysfunction (CLAD) (P=.0006) and poorer 5-year survival post-LTx (66.6% versus 91.5%), resulting in worse CLAD-free survival (HR 2.127, 95% CI 1.006-4.500; P=.0483) and overall survival (HR 3.297, 95% CI 1.473-7.382; P=.0037) compared to controls. Late-onset PTLD had worse survival compared to early-onset PTLD (P=.021). Primary EBV infection is a risk factor for PTLD, which in turn is associated with worse long-term outcome post-LTx.
  • Vitamin D—FGF-23 axis in kidney transplant recipients
  • Medication adherence and rejection rates in older vs younger adult liver transplant recipients
    A growing number of older adults are undergoing liver transplantation (LT) in the United States. In some settings, it is thought that adherence declines with age. This retrospective study examined adherence and clinical outcomes in older vs younger adult LT recipients. Medical records of adult LT recipients from 2009 to 2012 from a single urban center were reviewed. The medication level variability index (MLVI) was the predefined primary outcome, with nonadherence defined as MLVI >2.5. The secondary outcome was incidence of rejection. Outcomes were evaluated starting 1 year post-LT until 2015. A total of 42 of 248 patients were ≥65 years old at transplant. Older adults had significantly better adherence than younger ones (65% of those ≥65 were adherent vs 42% of younger adults; chi-square two-tailed P=.02). Survival analyses of rejection, censored by time since transplant, showed no difference among the four age groups (χ2=0.84, P=.84). Older age was not found to be a risk factor for reduced adherence or graft rejection in patients surviving at least 1 year post-LT.
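    As usually defined in the adherence literature, the MLVI is the standard deviation of a patient's consecutive immunosuppressant (typically tacrolimus) trough levels, so higher variability is read as a proxy for erratic medication-taking:

        MLVI = sqrt( Σ_{i=1..n} (x_i - x̄)² / (n - 1) ),   nonadherence flagged when MLVI > 2.5

    where x_1, ..., x_n are the measured trough levels and x̄ is their mean.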
  • Pharmacokinetic and pharmacogenetic analysis of immunosuppressive agents after laparoscopic sleeve gastrectomy
    Background Severe obesity has been shown to limit access to renal transplantation in patients with end-stage renal disease (ESRD). Laparoscopic sleeve gastrectomy (LSG) has been performed in the ESRD population to assist in achieving waitlist and transplant eligibility. Little is known about how LSG impacts the bioequivalence of tacrolimus products and immunosuppression pharmacokinetics. Methods This was a prospective, open-label, single-dose, crossover, two-period pharmacokinetic (PK) study. The purpose of this study was to assess single-dose PK of immediate-release tacrolimus (IR-TAC), extended-release tacrolimus (ER-TAC), and mycophenolic acid (MPA) in adult ESRD patients post-LSG. Results Twenty-three subjects were included in the 24-hour PK assessments. The ratio of geometric means between ER-TAC and IR-TAC was 103.5% (90% CI: 89.6%-119.6%) for AUC0-24 and 92.5% (90% CI: 80.4%-106.4%) for Cmax. PK parameters were similar between ER-TAC and IR-TAC, except for Cmin (P=.004) and Cmax (P=.04). MPA AUC0-24 was similar when given with either ER-TAC or IR-TAC (P=.32). Patients expressing CYP3A5*1 genotypes had lower tacrolimus AUC0-24 values vs those with CYP3A5*3/*3 (IR-TAC P<.001; ER-TAC P=.008). Genotype did not impact MPA PK. Conclusion Dose modification of immunosuppressants post-LSG may not be necessary aside from standard therapeutic drug monitoring.
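    For context, the conventional regulatory criterion for bioequivalence is that the 90% CI of the ratio of geometric means lie entirely within 80-125%. The ratio itself is computed on the log scale:

        GMR = exp( mean(ln x_ER-TAC) - mean(ln x_IR-TAC) ) × 100%

    Both intervals above (89.6-119.6% for AUC0-24 and 80.4-106.4% for Cmax) fall within that window, consistent with the authors' conclusion that standard trough monitoring, rather than routine dose modification, suffices post-LSG.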
  • Pharmacokinetics of prolonged-release tacrolimus versus immediate-release tacrolimus in de novo liver transplantation: A randomized phase III substudy
    Background With the same dose of tacrolimus, lower systemic exposure on the first day of dosing has been reported for prolonged-release tacrolimus compared with immediate-release tacrolimus, prompting investigation of differing initial doses. Methods This substudy of a double-blind, randomized, phase III trial in de novo liver transplant recipients compared the pharmacokinetics of once-daily prolonged-release tacrolimus (initial dose: 0.2 mg/kg/day) versus twice-daily immediate-release tacrolimus (initial dose: 0.1 mg/kg/day) during the first 2 weeks post-transplant. Results Pharmacokinetic data were analyzed from patients receiving prolonged-release tacrolimus (n=13) and immediate-release tacrolimus (n=12). Mean systemic exposure (AUC0-24) was higher with prolonged-release versus immediate-release tacrolimus. Dose-normalized AUC0-24 (normalized to 0.1 mg/kg/day) showed generally lower exposure with prolonged-release versus immediate-release tacrolimus. There was good correlation between AUC0-24 and the concentration at 24 hours after the morning dose (r=.96 and r=.86 for the prolonged-release and immediate-release formulations, respectively), and the slope of the line of best fit was similar for both formulations. Conclusions Doubling the initial starting dose of prolonged-release tacrolimus compared with immediate-release tacrolimus overcompensated for the lower exposure on Day 1. A 50% higher starting dose of prolonged-release tacrolimus than immediate-release tacrolimus may be required for similar systemic exposure. However, doses of both formulations can be optimized using the same trough-level monitoring system. (ClinicalTrials.gov number: NCT00189826).
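    Dose normalization here is linear scaling to the common 0.1 mg/kg/day reference (assuming dose-proportional kinetics); since the prolonged-release arm started at 0.2 mg/kg/day, its observed exposure is halved before comparison:

        AUC0-24,norm = AUC0-24,obs × (0.1 / actual dose in mg/kg/day)

    which is why raw AUC0-24 was higher but dose-normalized AUC0-24 lower with prolonged-release tacrolimus, and why a 50% rather than 100% higher starting dose is suggested.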
  • Development of a human cadaver model for training in laparoscopic donor nephrectomy
    Background The organ procurement network recommends that a surgeon record 15 cases as surgeon or assistant in laparoscopic donor nephrectomy (LDN) prior to independent practice. The literature suggests that the learning curve for improved perioperative and patient outcomes is closer to 35 cases. In this article, we describe our development of a model utilizing fresh tissue and objective, quantifiable endpoints to document surgical progress and efficiency in each of the major steps involved in LDN. Materials and Methods Phase I of model development focused on the modifications necessary to maintain visualization for laparoscopic surgery in a human cadaver. Phase II tested proposed learner-based metrics of procedural competency for multiport LDN by timing the procedural steps of LDN in a novice learner. Results Phases I and II required 12 and 9 cadavers, respectively, with a total of 35 kidneys utilized. The following metrics improved with trial number for multiport LDN: time taken for dissection of the gonadal vein, ureter, renal hilum, and adrenal and lumbar veins; simulated warm ischemic time (WIT); and operative time. Conclusion Human cadavers can be used for training in LDN, as evidenced by improvements in timed learner-based metrics. This simulation-based model fills a gap in available training options for surgeons.
  • Characteristics of compatible pair participants in kidney paired donation at a single center
    Compatible pairs of living kidney donors and their intended recipients can enter into kidney paired donation (KPD) and facilitate additional living donor kidney transplants (LDKTs). We examined 11 compatible pairs (the intended recipients and their intended, compatible donors) who participated in KPD, along with the recipients’ 11 matched exchange donors. The 11 pairs participated in 10 separate exchanges (three were multicenter exchanges) that included 33 total LDKTs (22 additional LDKTs). All the intended donors were blood group O and female, with a mean living kidney donor profile index (LKDPI) of 27.6 (SD 16.8). The matched donors had a mean LKDPI of 9.4 (SD 31.7). Compatible pairs entered KPD for altruistic reasons (N=2) or due to a mismatch of age (N=7) or body/kidney size (N=2) between the recipient and the intended donor. In four cases, retrospective calculation of the LKDPI revealed that the matched donor had a higher LKDPI than the intended donor. Of the 22 recipients of LDKTs enabled by the compatible pairs, three were highly sensitized, with PRA >80%. In conclusion, most compatible pairs entered into KPD so that the recipient could receive an LDKT from a donor whose age or body/kidney size was more favorable to post-transplant outcomes.
  • A clinical tool to calculate post-transplant survival using pre-transplant clinical characteristics in adults with cystic fibrosis
    Background We previously identified factors associated with a greater risk of death post-transplant. The purpose of this study was to develop a clinical tool to estimate the risk of death after transplant based on pre-transplant variables. Methods We utilized the Canadian CF registry to develop a nomogram that incorporates pre-transplant clinical measures to assess post-lung transplant survival. The 1-, 3-, and 5-year survival estimates were calculated using Cox proportional hazards models. Results Between 1988 and 2012, 539 adult Canadians with CF received a lung transplant with 208 deaths in the study period. Four pre-transplant factors most predictive of poor post-transplant survival were older age at transplantation, infection with B. cepacia complex, low FEV1 percent predicted, and pancreatic sufficiency. A nonlinear relationship was found between risk of death and FEV1 percent predicted, age at transplant, and BMI. We constructed a risk calculator based on our model to estimate the 1-, 3-, and 5-year probability of survival after transplant which is available online. Conclusions Our risk calculator quantifies the risk of death associated with lung transplant using pre-transplant factors. This tool could aid clinicians and patients in the decision-making process and provide information regarding the timing of lung transplantation.
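    A nomogram of this kind is a graphical read-out of the fitted Cox model: each predictor contributes a score proportional to its coefficient, and the calculator converts the summed score into survival probabilities via the baseline survival function, roughly

        S(t | x) = S0(t)^exp(β1·x1 + ... + βp·xp),   evaluated at t = 1, 3, and 5 years

    with the nonlinear effects noted above (for FEV1, age, and BMI) commonly entering through spline transformations of the covariates.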
  • Graft-derived macrophage migration inhibitory factor correlates with hepatocellular injury in patients undergoing liver transplantation
    Experimental studies suggest that macrophage migration inhibitory factor (MIF) mediates ischemia/reperfusion injury during liver transplantation. This study assessed whether human liver grafts release MIF during preservation, and whether the release of MIF is proportional to the extent of hepatocellular injury. Additionally, the association between MIF and early allograft dysfunction (EAD) after liver transplantation was evaluated. Concentrations of MIF, aspartate aminotransferase (AST), alanine aminotransferase (ALT), lactate dehydrogenase (LDH), and creatine kinase (CK) were measured in effluents of 38 liver grafts, and in serum of recipients. Concentrations of MIF in the effluent were greater than those in the recipients’ serum before and after reperfusion (58 [interquartile range, IQR: 23-79] μg/mL vs 0.06 [IQR: 0.03-0.07] μg/mL and 1.3 [IQR: 0.7-1.8] μg/mL, respectively; both P<.001). Effluent MIF concentrations correlated with effluent concentrations of the cell injury markers ALT (R=.51, P<.01), AST (R=.51, P<.01), CK (R=.45, P=.01), and LDH (R=.56, P<.01). Patients who developed EAD had greater MIF concentrations in effluent and serum 10 minutes after reperfusion than patients without EAD (effluent: 80 [IQR: 63-118] μg/mL vs 36 [IQR: 20-70] μg/mL, P=.02; serum: 1.7 [IQR: 1.2-2.5] μg/mL vs 1.1 [IQR: 0.6-1.7] μg/mL, P<.001). Conclusion Human liver grafts release MIF in proportion to hepatocellular injury. Greater MIF concentrations in effluent and recipient serum are associated with EAD after liver transplantation.
  • Association of myocardial injury with increased mortality after liver transplantation
  • Biliary reconstruction in liver transplant patients with primary sclerosing cholangitis, duct-to-duct or Roux-en-Y?
    Introduction Roux-en-Y choledochojejunostomy and duct-to-duct (D-D) anastomosis are biliary reconstruction methods for liver transplantation. However, there is controversy over which method produces better results. We compared the outcome of D-D anastomosis vs. Roux-en-Y hepaticojejunostomy in patients with primary sclerosing cholangitis who had undergone liver transplant at the Shiraz Organ Transplant Center. Materials The medical records of 405 patients with primary sclerosing cholangitis (PSC) who had undergone liver transplant from 1996 to 2015 were reviewed. Patients were divided into two groups: a Roux-en-Y group and a D-D group. Morbidity, disease recurrence, and graft and patient survival rates were compared between the two groups. Results A total of 143 patients underwent D-D biliary reconstruction, and 260 patients had a Roux-en-Y loop. Biliary complications involved 4.2% of patients in the D-D group and 3.9% in the Roux-en-Y group (P=.863). Actuarial 1-, 3-, and 5-year patient survival for the D-D and Roux-en-Y groups was 92%, 85%, and 74%; and 87%, 83%, and 79%, respectively (P=.384). The corresponding 1-, 3-, and 5-year probability of freedom from biliary complication was 97%, 95%, and 92%; and 98%, 97%, and 94%, respectively (P=.61). Conclusion Duct-to-duct biliary reconstruction in liver transplantation is a good alternative to Roux-en-Y reconstruction for selected patients with PSC.
  • Factors contributing to employment patterns after liver transplantation
    Background Many liver transplant recipients return to work, but their patterns of employment are unclear. We examine patterns of employment 5 years after liver transplantation. Methods First-time liver transplant recipients ages 18-60 years transplanted from 2002 to 2009 and surviving at least 5 years were identified in the United Network for Organ Sharing registry. Recipients' post-transplant employment status was classified as follows: (i) never employed; (ii) returned to work within 2 years and remained employed (continuous employment); (iii) returned to work within 2 years, but was subsequently unemployed (intermittent employment); or (iv) returned to work ≥3 years post-transplant (delayed employment). Results Of 28 306 liver recipients identified during the study period, 12 998 survived at least 5 years and contributed at least 1 follow-up of employment status. A minority of patients (4654; 36%) were never employed, while 3780 (29%) were continuously employed, 3027 (23%) were intermittently employed, and 1537 (12%) had delayed employment. In multivariable logistic regression analysis, predictors of intermittent and delayed employment included lower socioeconomic status, higher local unemployment rates, and post-transplant comorbidities or complications. Conclusion Never, intermittent, and delayed employment are common after liver transplantation. Socioeconomic and labor market characteristics may add to clinical factors that limit liver transplant recipients’ continuous employment.
  • Simultaneous pancreas and kidney transplantation: Incidence and risk factors for amputation after 10-year follow-up
    Introduction The incidence of amputation after simultaneous pancreas and kidney (SPK) transplantation ranges from 9.5% to 23% after 5 years of follow-up. The objective of this study was to investigate the incidence and risk factors for amputation in SPK transplant patients compared to kidney transplantation alone (KTA) after a minimum follow-up of 10 years. Methods An analysis was performed on a prospectively maintained database of 81 SPK transplants and 43 KTA consecutively performed in one center for insulin-dependent diabetes mellitus between December 1992 and January 2006. Primary outcome variables were incidence of amputation per patient, total number of amputations, and type of amputation performed. Data are presented as a mean ± standard deviation. Results Seven patients (9%) in the SPK cohort and one patient (2%) in the KTA cohort underwent amputation (P<.001). One amputee had pancreas allograft failure prior to amputation. Fifteen amputations were performed in total and four patients required ≥2 amputations. The latency period between transplantation and amputation was 133.57±49.43 months in the SPK cohort and 168 months in the KTA group. Conclusions The incidence of amputation after SPK transplantation is approximately 9% after 10-year follow-up. Patients are at a significantly greater risk of amputation after SPK transplantation compared to KTA for type 1 diabetes despite insulin independence.
  • The appropriate dose of thymoglobulin induction therapy in kidney transplantation
    Background Thymoglobulin is used effectively as an induction agent in kidney transplantation, but there is no consensus on the optimal dose. In order to delineate the safest effective dose, an open-label randomized clinical trial was designed. Methods In this study, 90 adult kidney transplant recipients (KTR) were randomized before transplantation into three groups to receive thymoglobulin: Arm A (4.5 mg/kg over 3 days), Arm B (4.5 mg/kg single bolus dose), and Arm C (6 mg/kg over 3 days). Renal function, infections, and rate of readmissions were evaluated during the first post-transplantation year. Results Ninety adult kidney recipients were enrolled (51% deceased donor). No statistically significant difference was found in acute rejection episodes or type of rejection between these groups, although patients in Arm A showed more severe histopathologic changes in renal biopsies according to Banff 2013 criteria (P=.03). At the first month after transplantation, serum Cr was lower (P=.001) and GFR was higher (P=.04) in Arm A, but there was no significant difference among the three groups at 3, 6, and 12 months post-transplant. Conclusion Although all regimens showed the same efficacy regarding the rate of rejection episodes, the 3-day 4.5 mg/kg thymoglobulin regimen had significantly fewer complications.
  • Airway inflammation and symptoms in children following liver and heart transplantation
    Objectives To describe the upper airway endoscopic findings of children with upper airway symptoms after liver transplantation (LT) or heart transplantation (HT). Methods Review of children undergoing airway endoscopy after LT or HT from 2011 to 2015 at a tertiary care pediatric hospital. Airway findings, biopsy results, immunosuppression, and Epstein-Barr virus (EBV) levels were recorded. Results Twenty-three of 158 recipients (111 LT, 47 HT) underwent endoscopy. Median time from transplant to endoscopy was 9 months (range 4-25) for LT and 31 months (range 1-108) for HT. Thirteen of 23 patients presented with upper airway symptoms, and 10/23 presented with respiratory failure or for surveillance. All 13 patients with upper airway symptoms had abnormal findings (7 LT, 6 HT), most commonly arytenoid edema (13 patients). There were five EBV-positive biopsies (four with post-transplant lymphoproliferative disorder), and six EBV-negative biopsies with lymphocytic inflammation. One biopsy demonstrated fungal infection. Immunosuppression was decreased in seven patients, and three received steroids. There were no episodes of allograft rejection. No patients had airway symptoms at last follow-up. Conclusions In pediatric solid organ transplant recipients, symptoms of airway obstruction are not uncommon and should be evaluated with endoscopy. Endoscopy in the absence of symptoms is low-yield. Treatment with decreased immunosuppression improved airway symptoms.
  • Pilot cohort study on the potential role of TCF7L2 rs7903146 on ischemic heart disease among non-diabetic kidney transplant recipients
    Background The TCF7L2 rs7903146 C>T polymorphism is associated with diabetes in the general population, but its independent impact on cardiovascular disease is debated. On this basis, we investigated its association with major adverse cardiac events (MACE) in a single-center cohort of non-diabetic kidney transplant recipients (KTRs). Methods Patients with pretransplant diabetes were excluded, and patients who developed post-transplant diabetes were censored at the time of diagnosis. Results The rs7903146 C>T polymorphism appeared to modulate the risk of MACE: 5-year prevalence was 0.8% in CC patients, 7.2% in CT patients, and 9.7% in TT patients (P<.001). TCF7L2 rs7903146 was an independent predictor of MACE in a multivariate Cox regression model (for each T allele, HR: 2.99, 95%CI: 1.62-5.52, P<.001), together with history of cardiac ischemic events (HR: 8.69, 95%CI: 3.57-21.16, P<.001), delayed graft function (HR: 2.42, 95%CI: 0.98-5.95, P=.056), and HLA mismatches (for each mismatch: HR: 1.55, 95%CI: 1.00-2.43, P=.053). Introduction of the rs7903146 C>T polymorphism into a model based on these clinical variables significantly increased predictive power for MACE (P=.003). Conclusions The TCF7L2 rs7903146 T allele may be strongly and independently associated with MACE in non-diabetic KTRs. These findings suggest the possibility of employing this SNP to more accurately stratify cardiological risk in KTRs.
  • Epidemiology, risk factors, and outcome of Clostridium difficile infection in heart and heart-lung transplant recipients
    Background Clostridium difficile is a major cause of diarrhea in thoracic organ transplant recipients. We investigated the epidemiology, risk factors, and outcome of Clostridium difficile infection (CDI) in heart and heart-lung transplant (HT) recipients. Methods This is a retrospective study from 2004 to 2013. CDI was defined by diarrhea and a positive toxigenic C. difficile in stool measured by toxin enzyme immunoassay (2004-2006) or polymerase chain reaction (2007-2013). Cox proportional hazards regression was used to model the association of risk factors with time to CDI and survival with CDI following transplantation. Results There were 254 HT recipients, with a median age of 53 years (IQR, 45-60); 34% were female. During the median follow-up of 3.1 years (IQR, 1.3-6.1), 22 (8.7%) patients developed CDI. In multivariable analysis, risk factors for CDI were combined heart-lung transplant (HR 4.70; 95% CI, 1.30-17.01 [P=.02]) and retransplantation (HR 7.19; 95% CI, 1.61-32.12 [P=.01]). Acute cellular rejection was associated with a lower risk of CDI (HR 0.34; 95% CI, 0.11-0.94 [P=.04]). CDI was found to be an independent risk factor for mortality (HR 7.66; 95% CI, 3.41-17.21 [P<.0001]). Conclusions Clostridium difficile infection after HT is more common among patients with combined heart-lung and those undergoing retransplantation. CDI was associated with a higher risk of mortality in HT recipients.
  • Immunosuppression with mTOR inhibitors prevents the development of donor-specific antibodies after liver transplant
    Background Donor-specific antibodies (DSAs) are an important cause of complications after solid organ transplant. Risk factors and, thus, strategies for preventing DSA development are not well defined. Methods The DSA status of 400 patients who underwent liver transplant (LT) at the outpatient clinic of the University Hospital Essen was determined. Human leukocyte antigen (HLA) antibodies were detected by single-antigen bead technology. The strength of DSAs was reported as mean fluorescence intensity. Results Detectable DSAs were found in 74 (18.5%) patients, significantly more often in patients who underwent LT for autoimmune liver disease than for all other indications (29.3%; P=.022), and significantly less often in patients who underwent LT for hepatocellular carcinoma (7.6%; P=.005). The incidence of DSAs increased with time after LT, and the risk was generally higher for female patients. The frequency of DSA detection was significantly lower (10.6%) for patients receiving immunosuppressive treatment with mammalian target of rapamycin (mTOR) inhibitors than for those receiving other regimens (20.5%; P=.025). Conclusion Autoimmune liver diseases, female sex, and more than 8 years since LT predispose patients to the development of DSAs. Immunosuppression with the mTOR inhibitor everolimus protects against DSA development after liver transplant.
  • Low serum testosterone is associated with impaired graft function early after heart transplantation
    Background We sought to investigate a correlation between serum testosterone levels and graft function early after heart transplantation. Methods In a cross-sectional study, we measured serum testosterone levels 4 weeks after heart transplantation in 49 consecutive male recipients. Echocardiography was carried out to evaluate graft function. Low serum testosterone was defined as <11 nmol/L. Results Low serum testosterone was present in 21 (43%) recipients (Group A), and 28 (57%) had normal testosterone levels (Group B). The two groups did not differ in age or presence of renal dysfunction, arterial hypertension, diabetes, or hyperlipidemia. Donor age and allograft ischemic time were not different between the two groups. Both groups had comparable tacrolimus trough levels and doses of mycophenolate mofetil and methylprednisolone. Patients in Group A had significantly lower LVEF (58±5% vs 65±6% in Group B, P=.001) and TAPSE (1.3±0.3 cm vs 1.6±0.3 cm in Group B, P=.01). In comparison with Group B, more patients in Group A were found to have low-grade (1R) rejection (25% vs 3%; P=.02). Conclusion Low serum testosterone levels appear to be associated with impaired graft function and an increased incidence of low-grade rejection episodes early after heart transplantation.
  • Brain natriuretic peptide and right heart dysfunction after heart transplantation
    Heart transplantation (HT) should normalize cardiac endocrine function, but brain natriuretic peptide (BNP) levels remain elevated after HT, even in the absence of left ventricular hemodynamic disturbance or allograft rejection. Right ventricle (RV) abnormalities are common in HT recipients (HTx) as a result of the engraftment process, tricuspid insufficiency, and/or repeated inflammation from serial endomyocardial biopsies. RV function follow-up is vital for patient management, as RV dysfunction is a recognized cause of in-hospital death and is responsible for a worse prognosis. Interestingly, few and controversial data are available concerning the relationship between plasma BNP levels and RV functional impairment in HTx. This suggests that infra-clinical modifications, such as subtle immune system disorders or hypoxic conditions, might influence BNP expression. Nevertheless, owing to other altered circulating molecular forms of BNP, a lack of specificity of BNP assays has been described in heart failure patients. This phenomenon could exist in the HT population and could explain elevated plasma BNP levels despite normal RV function. In clinical practice, intra-individual change in BNP over time, rather than absolute BNP values, might be more helpful in detecting right cardiac dysfunction in HTx.
  • The epidemiology of Clostridium difficile infection in a national kidney transplant center
    Background We aimed to describe the epidemiology and outcomes of Clostridium difficile infection (CDI) in a national kidney transplant center from 2008 to 2015. Methods Adult kidney transplant recipients (KTRs) and kidney-pancreas transplant recipients (KPTRs) were included for analysis if they met the surveillance CDI case definition. Rates of new healthcare-associated CDI (HA-CDI) were expressed per 10 000 KTR/KPTR bed days used (BDU) to facilitate comparisons. Results Fifty-two cases of CDI were identified in 42 KTRs and KPTRs. This corresponded to an average annual rate of 9.6 per 10 000 BDU, higher than that seen among general hospital inpatients locally, nationally, and internationally. Of the 45 cases (87%) that were considered HA-CDI, nine (20%) had symptom onset in the community. Recent proton-pump inhibitor (PPI) and broad-spectrum antimicrobial exposure preceded the majority of cases. KTRs and KPTRs with CDI had a longer mean length of hospital stay (35 days) than those admitted during the same period who did not have CDI (8 days). Conclusions Education regarding CDI must be extended to transplant recipients and their general practitioners. Other targets for future CDI rate reduction must include stringent antimicrobial stewardship (both in hospital and in the community) and judicious PPI prescribing.
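    (The BDU normalization used above is a simple rate calculation; a minimal sketch with hypothetical figures, since the abstract does not report the total bed days:)

        def rate_per_10000_bdu(cases: int, bed_days_used: float) -> float:
            """Normalize an infection count per 10 000 bed days used (BDU)."""
            return cases / bed_days_used * 10_000

        # Hypothetical example: 12 HA-CDI cases over 12 500 bed days in a year
        # works out to the reported average annual rate of 9.6 per 10 000 BDU.
        print(rate_per_10000_bdu(12, 12_500))  # 9.6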
  • Making inroads to the cure: Barriers to clinical trial enrollment in hematopoietic cell transplantation
    A significant barrier to advancing the standard of care for patients with hematologic malignancies undergoing stem cell transplantation is access and willingness to participate in clinical trials. The importance of clinical trial enrollment is magnified in an era of targeted therapies, accelerated drug discovery, and investment by the pharmaceutical industry. As disease targets are identified, novel therapies are being evaluated in efforts to reduce treatment-related toxicity and improve progression-free and overall survival. The enrollment of hematopoietic cell transplantation (HCT) patients on clinical studies is essential to promote the development of such therapies. Increasing clinical trial participation requires understanding of potential barriers to enrollment, including patient concerns, institutional and provider hurdles, and disease-specific characteristics.
  • Early post-transplant conversion from tacrolimus to belatacept for prolonged delayed graft function improves renal function in kidney transplant recipients
    Prolonged delayed graft function (DGF) in kidney transplant recipients imparts a risk of poor allograft function, and tacrolimus may be detrimental in this setting. We conducted a retrospective single-center analysis of the first 20 patients converted to belatacept for prolonged DGF under a clinical protocol, a novel strategy for treating prolonged DGF. Prior to conversion, patients underwent an allograft biopsy to rule out rejection and confirm tubular injury. The primary outcome was the estimated glomerular filtration rate (eGFR) at 12 months post-transplant; the secondary outcome was the change in eGFR 30 days post-belatacept conversion. At 1 year post-transplant, the mean eGFR was 54.2 (SD 19.2) mL/min/1.73 m2. The mean eGFR on the day of belatacept conversion was 16 (SD 12.7) mL/min/1.73 m2 and rose to 43.1 (SD 15.8) mL/min/1.73 m2 30 days post-conversion (P<.0001). The acute rejection rate was 20%, with 100% patient survival at 12 months post-transplant. There was one graft loss, in the setting of an invasive Aspergillus infection that resulted in withdrawal of immunosuppression and transplant nephrectomy. Belatacept conversion for prolonged DGF is a novel treatment strategy that resulted in an improvement in eGFR. Additional follow-up is warranted to confirm the long-term benefits of this strategy.
  • Early conversion to belatacept after renal transplantation
    Belatacept is a non-nephrotoxic immunosuppressive agent, which may make it the ideal agent for patients with delayed graft function (DGF) or slow graft function on calcineurin inhibitors. There are limited data on conversion of patients to belatacept within 6 months of transplantation. Between January 2012 and December 2015, 16 patients were converted to belatacept for delayed or poor graft function (eGFR<30 mL/min/1.73 m2, MDRD); three were HIV positive. Conversion protocols were analyzed in patients ≤4 months and 4-6 months post-transplantation. Mean serum creatinine levels after belatacept conversion were compared with preconversion levels. Patient survival was 100%, and graft survival was 88%. The mean creatinine fell from 3.9±1.82 mg/dL before belatacept conversion to 2.1±1.1 mg/dL at 6 months and 1.9±0.47 mg/dL (median 1.8 mg/dL) at 12 months postconversion. There was no significantly increased risk of rejection, infection, or malignancy. HIV parameters remained largely stable. In this single-center, nonrandomized retrospective analysis, early conversion to belatacept in patients with DGF or slow graft function was safe and efficacious.
  • Association of pretransplant kidney function with outcomes after lung transplantation
    Purpose There is a lack of data regarding the independent association of pretransplant kidney function with early and late outcomes among lung transplant (LT) recipients. Methods We queried the United Network for Organ Sharing database for adult patients (≥18 years of age) undergoing LT between 1987 and 2013. Glomerular filtration rate (GFR) was estimated using the Modification of Diet in Renal Disease (MDRD) and the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations. The study population was split into four groups (>90, 60-90, 45-59.9, and <45 mL/min/1.73 m2) based on the estimated GFR at the time of listing. Results Overall, there was a good correlation between the GFR estimated from the two equations (n=17884, Pearson r=.816, P<.001). There was a consistent and independent association of worse early and late outcomes with declining GFR throughout the spectrum, including those above 60 mL/min/1.73 m2 (P<.001 for overall comparisons). Although GFR<45 mL/min/1.73 m2 was associated with worse early and late survival, patients with GFR 45-59.9 mL/min/1.73 m2 do not appear to have a survival advantage beyond 3 years post-transplant. Conclusion There is a good correlation between GFR estimated using the MDRD and CKD-EPI equations among patients being considered for LT. Early and late outcomes after LT worsen in a linear fashion with progressively lower pretransplant GFR.
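    (Both estimating equations are published formulas. The sketch below implements the commonly used IDMS-traceable 4-variable MDRD and 2009 CKD-EPI creatinine equations, plus the study's four listing-GFR strata; it illustrates the equations and is not the study's code.)

        def egfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
            """IDMS-traceable 4-variable MDRD study equation (mL/min/1.73 m2)."""
            egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
            if female:
                egfr *= 0.742
            if black:
                egfr *= 1.212
            return egfr

        def egfr_ckd_epi(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
            """CKD-EPI 2009 creatinine equation (mL/min/1.73 m2)."""
            kappa = 0.7 if female else 0.9
            alpha = -0.329 if female else -0.411
            r = scr_mg_dl / kappa
            egfr = 141.0 * min(r, 1.0) ** alpha * max(r, 1.0) ** -1.209 * 0.993 ** age
            if female:
                egfr *= 1.018
            if black:
                egfr *= 1.159
            return egfr

        def listing_group(egfr: float) -> str:
            """The study's four listing-GFR strata (mL/min/1.73 m2)."""
            if egfr > 90:
                return ">90"
            if egfr >= 60:
                return "60-90"
            if egfr >= 45:
                return "45-59.9"
            return "<45"

        # Example: a 50-year-old non-black woman with creatinine 1.4 mg/dL
        print(listing_group(egfr_mdrd(1.4, 50, True, False)),
              listing_group(egfr_ckd_epi(1.4, 50, True, False)))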
  • Graft quality matters: Survival after simultaneous liver-kidney transplant according to KDPI
    Background Poor renal function is associated with higher mortality after liver transplantation. Our aim was to understand the impact of kidney graft quality, according to the kidney donor profile index (KDPI) score, on survival after simultaneous liver-kidney (SLK) transplantation. Methods Using United Network for Organ Sharing data from 2002 to 2013 for adult deceased donor SLK recipients, we compared survival and renal graft outcomes according to KDPI. Results Of 4207 SLK transplants, 6% were from KDPI >85% donors. KDPI >85% recipients had significantly increased mortality (HR=1.83, 95%CI=1.44-2.31) after adjusting for recipient factors. Additionally, dialysis in the first week (HR=1.4, 95%CI=1.2-1.7) and death-censored kidney graft failure at 1 year (HR=5.7, 95%CI=4.6-7.0) were associated with increased mortality after adjusting for recipient factors and liver donor risk index score. Conclusions KDPI >85% recipients had worse patient and graft survival after SLK. Poor renal allograft outcomes, including dialysis in the first week and death-censored kidney graft failure at 1 year, which occurred more frequently with KDPI >85% grafts, were associated with significantly reduced patient survival. Questions remain about the survival impact of liver vs kidney graft quality given the close relationship between donor factors contributing to both, but KDPI can still be valuable as a metric readily available at the time of organ offers for SLK candidates.
  • Histologic surveillance after liver transplantation due to autoimmune hepatitis
    Background Autoimmune hepatitis (AIH) often recurs after liver transplantation (LT). Our aim was to evaluate the recurrence rate of AIH after LT, the impact of AIH recurrence on survival and fibrosis progression, and risk factors for AIH recurrence. Methods Forty-two patients with AIH prior to LT with ≥1 protocol biopsy ≥1 year post-LT were included, with a median follow-up of 5.0 years (1.0-17.0). Follow-up liver biopsies were re-evaluated for AIH recurrence, fibrosis progression, and cirrhosis development. Results A histological recurrence of AIH was diagnosed in 15 (36%) patients at a median of 5 years of follow-up. Recurrent AIH led to progressive fibrosis (METAVIR stage 3-4) in two patients but caused no patient or graft losses. Transaminases were normal in three patients with recurrent AIH (20%). AIH recurrence was more common in patients with no overlapping cholangitis (OR 1.44, P=.021). Immunosuppression without an antimetabolite increased the risk of AIH recurrence (OR 1.47, P=.018). Patient and graft survival rates at 1, 5, and 10 years were 94%, 86%, and 86%, and 91%, 77%, and 74%, respectively. AIH recurrence did not affect survival. Conclusions AIH recurrence occurs in 36% of patients within 5 years but does not affect patient or graft outcome.
  • The effects of Share 35 on the cost of liver transplantation
    On June 18, 2013, the United Network for Organ Sharing (UNOS) instituted a change in the liver transplant allocation policy known as “Share 35.” The goal was to decrease waitlist mortality by increasing regional sharing of livers for patients with a model for end-stage liver disease (MELD) score of 35 or above. Several studies have shown Share 35 to be successful in reducing waitlist mortality, particularly in patients with high MELD scores. However, the MELD score at transplant has increased, resulting in sicker patients, more complications, and longer hospital stays. Our study aimed to explore factors, along with Share 35, that may affect the cost of liver transplantation. Our results show that Share 35 has come with significantly increased cost to transplant centers across the nation, particularly in regions 2, 5, 10, and 11. Region 5 was the only region with a median MELD above 35 at transplant, and its cost was significantly higher than in other regions. Several other recipient factors changed with Share 35 in ways that may significantly affect the cost of liver transplant. While access to transplantation for the sickest patients has improved, it has come at a cost, and regional disparities remain. The financial implications of proposed allocation system changes must be considered.
  • (D+10) MELD as a novel predictor of patient and graft survival after adult to adult living donor liver transplantation
    We modified the previously described D-MELD score, used in deceased donor liver transplantation, to (D+10)MELD to account for living donors being about 10 years younger than deceased donors, and tested it on living donor liver transplantation (LDLT) recipients. Five hundred consecutive LDLT, performed between July 2010 and December 2012, were retrospectively analyzed to assess the effect of (D+10)MELD on patient and graft survival. Donor age alone did not influence survival. Recipients were divided into six classes based on the (D+10)MELD score: Class 1 (0-399), Class 2 (400-799), Class 3 (800-1199), Class 4 (1200-1599), Class 5 (1600-1999), and Class 6 (>2000). The 1-year patient survival (97.1%, 88.8%, 87.6%, 76.9%, and 75% across Classes 1-5; P=.03) and graft survival (97.1%, 87.9%, 82.3%, 76.9%, and 75%; P=.04) differed significantly among the classes. The study population was divided into two groups at a (D+10)MELD cutoff of 860. Group 1 had significantly better 1-year patient (90.4% vs 83.4%; P=.02) and graft survival (88.6% vs 80.2%; P=.01). While donor age alone does not predict recipient outcome, the (D+10)MELD score is a strong predictor of recipient and graft survival, and may help in better recipient/donor selection and matching in LDLT.
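    (Assuming the usual D-MELD definition, donor age multiplied by the biochemical MELD score, the modified score and the class assignment described above reduce to a few lines; a minimal sketch:)

        def d_plus_10_meld(donor_age: int, recipient_meld: int) -> int:
            """(D+10)MELD: (donor age + 10) x recipient biochemical MELD.

            Assumes the usual D-MELD definition (donor age x MELD), with 10
            years added to donor age to reflect living donors being roughly
            10 years younger on average than deceased donors.
            """
            return (donor_age + 10) * recipient_meld

        def d_plus_10_meld_class(score: int) -> int:
            """Map a (D+10)MELD score to the study's six 400-point classes.

            The abstract lists Class 6 as >2000; a score of exactly 2000
            falls in a gap and is treated here as Class 6.
            """
            if score >= 2000:
                return 6
            return score // 400 + 1

        # Example: a 35-year-old living donor and a recipient MELD of 20 give
        # (35 + 10) * 20 = 900, i.e., Class 3 (800-1199), above the 860 cutoff.
        print(d_plus_10_meld(35, 20), d_plus_10_meld_class(900))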
  • Adverse symptoms of immunosuppressants: A survey of Canadian transplant clinicians
    Adverse symptoms of immunosuppressants (ASI) impact quality of life (QOL) in solid organ transplant recipients; however, standardized approaches for active ASI surveillance and intervention are lacking. While management is highly clinician dependent, clinician views remain largely unexplored. We surveyed Canadian Society of Transplantation members on their perceptions of ASI including frequency, perceived QOL impact, causal attribution, management strategies, and success. Sixty-one clinicians participated in the survey of 12 ASI (tremor, diarrhea, nausea, constipation, dyspepsia, insomnia, edema, dyspnea, arthralgia, acne, mouth sores, paresthesias), for a 22% response rate. Forty-nine completed the survey (80% completion rate). Diarrhea, dyspepsia, and insomnia were most frequent, requiring management in ≥ 2% of patients by 96%, 90%, and 82% of respondents, respectively. Diarrhea, insomnia, and dyspnea were deemed to have an important QOL impact by 92%, 82%, and 69%. Immunosuppressants were universally implicated as causative of tremor, diarrhea, acne, and mouth sores. Over 80% reported success in managing mouth sores, dyspepsia, and constipation. Management strategies included adjustment of immunosuppressant or other medications, drug therapy, and nonpharmacologic approaches and varied according to perceived causal attribution. More study is needed to compare clinician and patient views. These results will be used to establish priorities for further investigation of ASI.
  • Ledipasvir/sofosbuvir is effective and well tolerated in postkidney transplant patients with chronic hepatitis C virus
    Patients with end-stage renal disease on hemodialysis have a high prevalence of hepatitis C virus (HCV) infection. In most patients, treatment for HCV is delayed until after kidney transplantation. We assessed the effectiveness and tolerability of ledipasvir/sofosbuvir (LDV/SOF) in 32 postkidney transplant patients infected with HCV. The group was composed predominantly of treatment-naïve (75%) African American (68.75%) males (75%) infected with genotype 1a (62.5%). Most patients received a deceased donor kidney graft (78.1%). A 96% sustained viral response (SVR) was reported (27/28 patients). One patient relapsed. One patient with baseline graft dysfunction developed borderline rejection. No graft loss was reported. Six HIV-coinfected patients were included in our analysis; five of these achieved SVR12. There were four deaths, one of which was in the HIV group; none were attributed to therapy. Coinfected patients tolerated therapy well with no serious adverse events. Serum creatinine remained stable at baseline, end of therapy, and last follow-up (1.351±0.50 mg/dL, 1.406±0.63 mg/dL, and 1.290±0.39 mg/dL, respectively). In postkidney transplant patients with HCV infection, with or without HIV coinfection, the combination of LDV/SOF was well tolerated and effective.
  • The high incidence of severe chronic kidney disease after intestinal transplantation and its impact on patient and graft survival
    Introduction Using data from the Scientific Registry of Transplant Recipients (SRTR), the cumulative incidence of, risk factors for, and impact on survival of severe chronic kidney disease (CKD) in intestinal transplantation (ITx) recipients were assessed. Methods First-time adult ITx recipients transplanted in the United States between January 1, 1990 and December 31, 2012 were included. Severe CKD after ITx was defined as: glomerular filtration rate (GFR) <30 mL/min/1.73 m2, chronic hemodialysis initiation, or kidney transplantation (KTx). Survival analyses and extended Cox models were conducted. Results The cumulative incidence of severe CKD 1, 5, and 10 years after ITx was 3.2%, 25.1%, and 54.1%, respectively. The following characteristics were significantly associated with severe CKD: female gender (HR 1.34), older age (HR 1.38 per 10-year increment), catheter-related sepsis (HR 1.58), steroid maintenance immunosuppression (HR 1.50), graft failure (HR 1.76), acute cellular rejection (HR 1.64), prolonged requirement for IV fluids (HR 2.12) or TPN (HR 1.94), and diabetes (HR 1.54). Individuals with higher GFR at the time of ITx (HR 0.92 for each 10 mL/min/1.73 m2 increment), and those receiving induction therapies (HR 0.47) or tacrolimus (HR 0.52), showed lower hazards of severe CKD. In adjusted analysis, severe CKD was associated with a significantly higher hazard of death (HR 6.20). Conclusions The incidence of CKD after ITx is extremely high and its development drastically limits post-transplant survival.
  • Quantitative computed tomography assessment of bronchiolitis obliterans syndrome after lung transplantation
    Background Bronchiolitis obliterans syndrome (BOS) is a clinical manifestation of chronic allograft rejection following lung transplantation. We examined quantitative measurements of the proximal airways and vessels and their pathologic correlates in subjects with BOS. Methods Patients who received a lung transplant at the Brigham and Women's Hospital between December 1, 2002 and December 31, 2010 were included in this study. We characterized the quantitative CT measures of proximal airways and vessels and the pathological changes. Results Ninety-four (46.1%) of the 204 subjects were included in the study. There was a significant increase in the airway-to-vessel ratio in subjects who developed progressive BOS compared to controls and non-progressors. There was a significant increase in airway lumen area and a decrease in vessel cross-sectional area in patients with BOS compared to controls. Patients with BOS had a significant increase in proximal airway fibrosis compared to controls. Conclusions BOS is characterized by central airway dilation and vascular remodeling, the degree of which correlates with decrements in lung function. Our data suggest that progressive BOS is a pathologic process that affects both the central and distal airways.
  • Cultural competency of a mobile, customized patient education tool for improving potential kidney transplant recipients’ knowledge and decision-making
    Patients considering renal transplantation face an increasingly complex array of choices as a result of the revised kidney transplant allocation system. Decision aids have been shown to improve patient decision-making through the provision of detailed, relevant, individualized clinical data. A mobile iOS-based application (app) including animated patient education and individualized risk-adjusted outcomes following kidney transplants with varying donor characteristics and donation service area (DSA) waiting times was piloted in two large US transplant programs with a diverse group of renal transplant candidates (N = 81). The majority (86%) of patients felt that the app improved their knowledge and was culturally appropriate for their race/ethnicity (67%-85%). Patients scored significantly higher on transplant knowledge testing (9.1/20 to 13.8/20, P < .001) after viewing the app, including patients with low health literacy (8.0 to 13.0, P < .001). Overall knowledge of and interest in living and deceased donor kidney transplantation increased. This pilot project confirmed the benefit and cultural acceptability of this educational tool, and further refinement will explore how to better communicate the risks and benefits of nonstandard donors.
  • Kidney allograft surveillance biopsy practices across US transplant centers: A UNOS survey
    Background The approach to the diagnosis and management of subclinical rejection (SCR) in kidney transplant recipients remains controversial. Methods We conducted a survey through UNOS across US transplant centers regarding their approach to surveillance biopsies and reasons for the nonperformance of surveillance biopsies. Results Responses were obtained from 106/238 centers (45%); only 18 (17%) of responding centers performed surveillance biopsies on all patients, and 22 (21%) performed them in select cases. The most common time points for surveillance biopsies were 3 and 12 months post-transplant. The common reasons for not performing biopsies were low yield (n = 44, 65%) and the belief that it would not change outcome (n = 24, 36%). The incidence of subclinical T cell-mediated rejection (SC-TCMR) was ≥10% among 39% of centers. The mean serum creatinine was slightly worse, by 0.06 mg/dL at 1 year and 0.07 mg/dL at 3 years, among centers performing biopsy (P < .0001). The 1- and 3-year observed-to-expected (O-E) graft survival was similar between centers performing biopsies and those not performing them (P = .07 and .88, respectively). Conclusion Only 17% of US centers perform surveillance biopsies, with another 21% performing surveillance biopsies in select cases (among centers that responded to the survey). Greater uniformity in the approach to and management of this condition is of paramount importance.
  • Sinus tachycardia is associated with impaired exercise tolerance following heart transplantation
    Background Sinus tachycardia often presents in heart transplantation (HTx) recipients, but data on its effect on exercise performance are limited. Methods Based on mean heart rate (HR) value 3 months after HTx, 181 patients transplanted from 2006 to 2015 at the University of Nebraska Medical Center were divided into two groups: (i) HR<95 beats/min (bpm, n=93); and (ii) HR≥95 bpm (n=88). Cardiopulmonary exercise testing (CPET) was performed 1 year after HTx. Results Mean HR at 3 months post-HTx was 94±11 bpm and did not change significantly at 1 year post-HTx (96±11 bpm, P=.13). HR≥95 bpm at 3 months was associated with younger donor age (OR 1.1; CI 1.0-1.1; P=.02), female donor (OR 2.4; CI 1.16-5.24; P=.02), and absence of donor heavy alcohol use (OR 0.43; CI 0.17-0.61; P=.04). HR≥95 bpm at 3 months post-HTx was independently associated with decreased exercise capacity in metabolic equivalents (P=.008), reduced peak VO2 (P=.006), and percent of predicted peak VO2 (P=.002) during CPET. Conclusions HR≥95 bpm at 3 months following HTx is associated with reduced exercise tolerance in stable HTx recipients. Medical HR reduction after HTx could improve exercise performance and merits further investigation.
  • Relationship between pre-transplant physical function and outcomes after kidney transplant
    Background Performance-based measures of physical function predict morbidity following non-transplant surgery. Study objectives were to determine whether physical function predicts outcomes after kidney transplant and to assess how physical function changes post-transplant. Methods We conducted a prospective study of living donor kidney transplant recipients at our center from May 2012 to February 2014. Physical function was measured using the Short Physical Performance Battery (SPPB [balance, chair stands, gait speed]) and grip strength testing. Initial length of stay (LOS), 30-day rehospitalizations, allograft function, and quality of life (QOL) were assessed. Results The majority of the 140 patients in our cohort had excellent pre-transplant physical function. In general, balance scores were more predictive of post-transplant outcomes than the composite SPPB. Decreased pre-transplant balance was independently associated with longer LOS and increased rehospitalizations but not with post-transplant QOL; 35% of patients experienced a clinically meaningful (≥0.1 m/s) improvement in gait speed 4 months post-transplant. Conclusions Decreased physical function may be associated with longer LOS and rehospitalizations following kidney transplant. Further studies are needed to confirm this association. The lack of relationship between pre-transplant gait speed and outcomes in our cohort may represent a ceiling effect. More comprehensive measures, including balance testing, may be required for risk stratification.
  • Long-term survival following kidney transplantation in previous lung transplant recipients—An analysis of the UNOS registry
    Background Kidney transplantation has been advocated as a therapeutic option in lung recipients who develop end-stage renal disease (ESRD). This analysis outlines patterns of allograft survival following kidney transplantation in previous lung recipients (KAL). Methods Data from the UNOS lung and kidney transplantation registries (1987–2013) were cross-linked to identify lung recipients who were subsequently listed for and/or underwent kidney transplantation. Time-dependent Cox models compared the survival rates in KAL patients with those waitlisted for renal transplantation who never received kidneys. Survival analyses compared outcomes between KAL patients and risk-matched recipients of primary, kidney-only transplantation with no history of lung transplantation (KTx). Results A total of 270 lung recipients subsequently underwent kidney transplantation (KAL). Regression models demonstrated a lower risk of post-listing mortality for KAL patients compared with 346 lung recipients on the kidney waitlist who never received kidneys (P<.05). Comparisons between matched KAL and KTx patients demonstrated significantly increased risk of death and graft loss (P<.05), but not death-censored graft loss, for KAL patients (P = .86). Conclusions KAL patients enjoy a significant survival benefit compared with waitlisted lung recipients who do not receive kidneys. However, KAL patients do poorly compared with KTx patients. Decisions about KAL transplantation must be made on a case-by-case basis considering patient and donor factors.
  • Screening for asymptomatic bacteriuria at one month after adult kidney transplantation: Clinical factors and implications
    Objective Urinary tract infections (UTIs) account for significant morbidity after kidney transplantation (KT). Screening for asymptomatic bacteriuria (AB) has proven beneficial in certain populations, including pregnant women; however, it is not well studied in the KT population. We reviewed the incidence, clinical features, and implications of asymptomatic bacteriuria one month after KT. Methods A total of 171 adult KT patients (86 [50.3%] living donor transplants, 87 [50.9%] males, mean age 47.3 ± 13.7 years) transplanted between 2005 and 2012 were analyzed. Immunosuppression induction and maintenance were as per protocol. Protocol urine cultures were taken at 1 month post-transplantation. Patients were stratified by presence of AB and analyzed for demographics and clinical parameters. Outcomes of hospitalization for symptomatic UTIs, graft survival, and patient survival were ascertained. Results Forty-one (24%) KT recipients had AB at 30 days post-transplant. Multiresistant organisms accounted for 43.9% of these infections. Logistic regression confirmed female sex and deceased donor transplantation as independent predictors of 30-day bacteriuria, which in turn predicted subsequent hospitalization for symptomatic UTI. One-year patient and graft survival were similar in recipients with or without AB. Conclusion Asymptomatic bacteriuria 30 days post-transplant is more likely in female recipients and recipients of deceased donor kidneys, probably due to anatomical and functional differences, respectively. It carries increased morbidity from subsequent hospitalization for symptomatic UTI, and more research into UTI prevention, particularly non-antibiotic prophylaxis, is needed.
  • Low vitamin D exposure is associated with higher risk of infection in renal transplant recipients
    Background Vitamin D is a steroid hormone with multiple vital roles within the immune system. Various studies have evaluated the influence of vitamin D on infections after renal transplantation, with contrasting results. This study aimed to assess the relationship between vitamin D status and the incidence of infection in renal transplant recipients. Methods This is a retrospective cohort study of adult renal transplant recipients at the University of Pittsburgh Medical Center between 2005 and 2012. Patients were grouped as vitamin D sufficient (≥30 ng/mL) or deficient (<30 ng/mL) based on total serum 25-hydroxyvitamin D concentrations. The association between vitamin D levels collected at any point post-transplantation and the incidence of infection within ±90 days of the vitamin D measurement was assessed using logistic and Poisson regression models. Results Vitamin D sufficiency at any point post-transplantation was significantly associated with 66% lower odds (OR: 0.34; 95% CI: 0.22-0.52; P<.001) and a 43% lower rate of infections (incidence rate ratio [IRR]: 0.57; 95% CI: 0.46-0.71; P<.001) within ±90 days of the vitamin D level. Baseline vitamin D level was also associated with lower incidence and risk of infections within the first year post-transplantation. Conclusion Adequate levels of vitamin D in kidney transplant recipients are associated with lower infection risk in the first year and at any time post-transplantation.
  • Severe acute cellular rejection after intestinal transplantation is associated with poor patient and graft survival
    Background Severe acute cellular rejection (ACR) occurs frequently after intestinal transplantation (ITx). Aim To evaluate the outcomes and the risk factors for graft failure and mortality in patients with severe ACR after ITx. Methods Retrospective study evaluating all ITx recipients who developed severe ACR between 01/2000 and 07/2014. Demographic and histologic data were reviewed. Results 20/126 (15.9%) ITx recipients developed severe ACR. Of these 20 patients, 13 were adults (median age: 47.1 years). The median (IQR) time from ITx to severe ACR was 206.5 (849) days. All patients received intravenous methylprednisolone and increased doses of tacrolimus. Sixteen (80%) patients did not respond to initial treatment and required thymoglobulin administration. Moreover, 11 (55%) patients required additional immunosuppressive medications. Six (30%) patients required graft enterectomy. Complications related to ACR treatment were as follows: 10 (50%) patients developed bacterial infections, four (20%) developed cytomegalovirus infection, and four (20%) developed post-transplant lymphoproliferative disease. At the end of follow-up, only 3/20 (15%) patients were alive with a functional allograft. The median patient survival time after diagnosis of severe ACR was 400 days (95% CI: 234.0-2613.0). Conclusions Severe ACR episodes are associated with high rates of graft loss and complications related to treatment.
  • Idiopathic hyperammonemia after solid organ transplantation: Primarily a lung problem? A single-center experience and systematic review
    Background Idiopathic hyperammonemia syndrome (IHS) is an uncommon, often deadly complication of solid organ transplantation. IHS cases in solid organ transplantation seem to occur predominantly in lung transplant (LTx) recipients. However, to the best of our knowledge, the occurrence of IHS has not been systematically evaluated. We set out to identify all reported cases of IHS following nonliver solid organ transplantation. Methods Retrospective review of our institutional experience and systematic review of the literature. Results At our institution, six cases of IHS were identified among 844 nonliver solid organ transplants; five occurred following LTx (incidence 3.9% [lung] vs 0.1% [nonlung], P=.004). In the systematic review, 16 studies met inclusion criteria, reporting on 32 cases of IHS. The majority of IHS cases in the literature (81%) were LTx recipients. The average peak reported ammonia level was 1039 μmol/L, occurring on average 14.7 days post-transplant. Mortality in previously reported IHS cases was 69%. A single-center experience suggested that, in addition to standard treatment for hyperammonemia, early initiation of high-intensity hemodialysis to remove ammonia was associated with increased survival. In the systematic review, mortality was 40% (four of 10) with intermittent hemodialysis, 75% (nine of 12) with continuous veno-venous hemodialysis, and 100% in the six subjects who did not receive renal replacement therapy to remove ammonia. Three reports identified infection with urease-producing organisms as a possible etiology of IHS. Conclusion IHS is a rare but often fatal complication that primarily affects lung transplant recipients within the first 30 days.
  • Increased mid-abdominal circumference is a predictor for surgical wound complications in kidney transplant recipients: A prospective cohort study
    Kidney transplant recipients are at an increased risk of developing surgical site wound complications due to their immunosuppressed status. We aimed to determine whether increased mid-abdominal circumference (MAC) is predictive of wound complications in transplant recipients. A prospective study was performed on all kidney transplant recipients from October 2014 to October 2015. “Controls” consisted of kidney transplant recipients without a surgical site wound complication, and “cases” consisted of recipients who developed a wound complication. In total, 144 patients underwent kidney transplantation and 107 patients met inclusion criteria. Postoperative wound complications were documented in 28 (26%) patients. Patients who developed a wound complication had a significantly greater MAC, body mass index (BMI), and body weight upon renal transplantation (P<.001, P=.011, and P=.011, respectively). On single and multiple logistic regression analyses, MAC was a significant predictor of developing a surgical wound complication (P=.02). Delayed graft function and a history of preformed anti-HLA antibodies were also predictive of surgical wound complications (P=.003 and P=.014, respectively). Increased MAC is a significant predictor of surgical wound complications in kidney transplant recipients. Integrating clinical methods for measuring visceral adiposity may be useful for stratifying kidney transplant recipients with an increased risk of a surgical wound complication.
  • Assessment of cardiac allograft systolic function by global longitudinal strain: From donor to recipient
    Background Cardiac allografts are routinely evaluated by left ventricular ejection fraction (LVEF) before and after transplantation. However, myocardial deformation analyses with LV global longitudinal strain (GLS) are more sensitive for detecting impaired LV myocardial systolic performance than LVEF. Methods We analyzed echocardiograms in 34 heart donor-recipient pairs transplanted at Duke University from 2000 to 2013. Assessments of allograft LV systolic function by LVEF and/or LV GLS were performed on echocardiograms obtained pre-explantation in donors and serially in corresponding recipients. Results Donors had a median LVEF of 55% (25th, 75th percentile, 54% to 60%). Median donor LV GLS was −14.6% (−13.7 to −17.3%); LV GLS was abnormal (ie, >−16%) in 68% of donors. Post-transplantation, LV GLS was further impaired at 6 weeks (median −11.8%; −11.0 to −13.4%) and 3 months (median −11.4%; −10.3 to −13.9%) before recovering to pretransplant levels in follow-up. Median LVEF remained ≥50% throughout follow-up. We found no association between donor LV GLS and post-transplant outcomes, including all-cause hospitalization and mortality. Conclusions GLS demonstrates allograft LV systolic dysfunction in donors and recipients that is not detected by LVEF. The clinical implications of subclinical allograft dysfunction detected by LV GLS require further study.
  • Incidence of acute cellular rejection following granulocyte colony-stimulating factor administration in lung transplantation: A retrospective case-cohort analysis
    Granulocyte colony-stimulating factor (GCSF) is an option to treat leukopenia in lung transplant recipients. Conflicting evidence exists regarding its effects on acute cellular rejection (ACR). A retrospective, case-cohort study was conducted to assess whether the use of GCSF in lung transplant recipients is associated with an increased incidence of ACR. Patients had to have received at least one dose of GCSF, but were excluded if they received GCSF within 30 days prior to transplant or received a lymphocyte-depleting agent within 14 days of GCSF administration. Thirty-five patients who received GCSF within 3 months of transplant met inclusion criteria, and 105 patients were identified as controls based on a 1:3 allocation scheme. Incidence of ACR was 57.1% in the GCSF group versus 50.5% in the control group (relative risk (RR)=1.13; 95% CI, 0.80 to 1.59; P=.48). At 3 months post-transplant, 74.3% of the GCSF group had a dose reduction or discontinuation of their antiproliferative agent versus 17.1% of the control group (RR=4.33; 95% CI, 2.73 to 6.89; P<.0001). Rejection severity and incidence of infections were similar between groups. These findings show that GCSF administration within 3 months following lung transplantation was not associated with a higher incidence or severity of ACR.
  • Delirium after lung transplantation: Association with recipient characteristics, hospital resource utilization, and mortality
    Background Delirium is associated with increased morbidity and mortality. The factors associated with post-lung transplant delirium and its impact on outcomes are poorly characterized. Methods The medical records of 163 consecutive adult lung transplant recipients were reviewed for delirium within 5 days (early-onset) and 30 hospital days (ever-onset) post-transplantation. A multivariable logistic regression model assessed factors associated with delirium. Multivariable negative binomial regression and Cox proportional hazards models assessed the association of delirium with ventilator duration, intensive care unit (ICU) length of stay (LOS), hospital LOS, and one-year mortality. Results Thirty-six percent of patients developed early-onset delirium, and 44% developed ever-onset delirium. Obesity (OR 6.35, 95% CI 1.61-24.98) and bolused benzodiazepines within the first postoperative day (OR 2.28, 95% CI 1.07-4.89) were associated with early-onset delirium. Early-onset delirium was associated with longer adjusted mechanical ventilation duration (P=.001), ICU LOS (P<.001), and hospital LOS (P=.005). Ever-onset delirium was associated with longer ICU (P<.001) and hospital LOS (P<.001). After adjusting for clinical variables, delirium was not significantly associated with one-year mortality (early-onset HR 1.65, 95% CI 0.67-4.03; ever-onset HR 1.70, 95% CI 0.63-4.55). Conclusions Delirium is common after lung transplant surgery and is associated with increased hospital resource utilization.
  • Cardiac transplantation in a neonate—First case in Switzerland and European overview
    Twenty-four percent of pediatric heart transplantations (pHTx) are carried out in infants. Neonatal heart transplantation is both rarely performed and challenging. We report on a newborn girl suffering from cardiac failure due to a huge tumor (24×52 mm) within the free wall of the left ventricle (LV), with subtotal obstruction of the left main bronchus. Following surgical tumor resection, a Berlin Heart EXCOR left ventricular assist device was implanted as a bridge to transplantation. In spite of a donor/recipient mismatch of >200%, both the heart transplantation and the postoperative course were successful. In addition to this case report, the authors also present data from a survey of infant and neonatal heart transplantations performed in Western Europe. As neonatal heart transplantation is a rare event in Europe, the authors think it is of crucial importance to share this limited experience. We discuss an alternative strategy—namely, palliative surgical correction using the Fontan pathway. The challenges of donor/recipient weight mismatch, the possibilities of overcoming the infant donor organ shortage, and the postoperative immunosuppressive regimen are discussed as well.
  • Living donor kidney allograft survival ≥ 50 years
    The first successful kidney transplant occurred in 1954. Since then, long-term graft survival has been an elusive idealistic goal of transplantation. Yet 62 years later, we know of only 6 kidney transplant recipients who have achieved ≥50-year graft survival while being on no immunosuppression or a substantially reduced regimen. Herein, we report graft survival ≥50 years in 2 living donor recipients who have been maintained on standard-of-care immunosuppression the entire time. For our 2 recipients, their living donor's altruism altered the course, length, and quality of their life, which by all accounts can be deemed normal: They attended college, held jobs, had successful pregnancies, raised families, and were productive members of society. Both donors are still alive and well, more than 50 years post-donation; both have an acceptable GFR and normal blood pressure, with hyperlipidemia as their only medical problem. These 2 intertwined stories illustrate the tremendous potential of a successful kidney transplant: long-term survival with a normal lifestyle and excellent quality of life, even after more than 5 decades on full-dose immunosuppression.
  • Lactobacillus rhamnosus GG probiotic enteric regimen does not appreciably alter the gut microbiome or provide protection against GVHD after allogeneic hematopoietic stem cell transplantation
    Graft-versus-host disease (GVHD) is a major adverse effect associated with allogeneic stem cell transplant. Previous studies in mice indicated that administration of the probiotic Lactobacillus rhamnosus GG can reduce the incidence of GVHD after hematopoietic stem cell transplant. Here we report results from the first randomized probiotic enteric regimen trial in which allogeneic hematopoietic stem cell transplant patients were supplemented with Lactobacillus rhamnosus GG. Gut microbiome analysis confirmed a previously reported gut microbiome association with GVHD. However, the clinical trial was terminated when interim analysis did not detect an appreciable probiotic-related change in the gut microbiome or incidence of GVHD. Additional studies are necessary to determine whether probiotics can alter the incidence of GVHD after allogeneic stem cell transplant.
  • Pharmacogenetics of steroid-responsive acute graft-versus-host disease
    Glucocorticoids are central to effective therapy of acute graft-versus-host disease (GVHD). However, only about half of patients respond to steroids in initial therapy. Based on postulated mechanisms of anti-inflammatory effectiveness, we explored genetic variations in the glucocorticoid receptor, co-chaperone proteins, membrane transporters, inflammatory mediators, and variants in the T-cell receptor complex in hematopoietic cell transplant recipients with acute GVHD requiring treatment with steroids, and in their donors, in relation to response at day 28 after initiation of therapy. A total of 300 recipient and donor samples were analyzed. Twenty-three SNPs in 17 genes affecting glucocorticoid pathways were included in the analysis. In multiple regression analysis, donor SNP rs3192177 in the ZAP70 gene (OR 2.8, 95% CI: 1.3-6.0, P=.008) and donor SNP rs34471628 in the DUSP1 gene (OR 0.3, 95% CI: 0.1-1.0, P=.048) were significantly associated with complete or partial response. However, after adjustment for multiple testing, these SNPs did not remain statistically significant. Our results from this small, exploratory, hypothesis-generating analysis suggest that common genetic variation in glucocorticoid pathways may help identify subjects with differential response to glucocorticoids. This needs further assessment in larger datasets and, if validated, could help identify subjects for alternative treatments and design targeted treatments to overcome steroid resistance.
Notice for patients:
This page contains urological information intended for healthcare professionals.
If you have a problem related to this condition,
consult your urologist or family doctor.
If you would like information designed for patients and the general public, you can visit:

Portal de Información Urológica para Pacientes

Carlos Tello Royloa

Last updated: 08-Apr-2013

uroportal@gmail.com