The Spanish-language Urology Portal for professionals


This month in... Clinical Transplantation


  • Immunosuppression with mTOR inhibitors prevents the development of donor-specific antibodies after liver transplant
    Background Donor-specific antibodies (DSAs) are an important cause of complications after solid-organ transplant. Risk factors and, thus, strategies for preventing DSA development are not well defined. Methods The DSA status of 400 patients who underwent LT at the outpatient clinic of the University Hospital Essen was determined. Human leukocyte antigen (HLA) antibodies were detected by single-antigen bead technology. The strength of DSAs was reported as mean fluorescence intensity (MFI). Results Detectable DSAs were found in 74 (18.5%) patients, significantly more often in patients who underwent LT for autoimmune liver disease than for all other indications (29.3%; p = 0.022) and significantly less often in patients who underwent LT for hepatocellular carcinoma (HCC) (7.6%; p = 0.005). The incidence of DSAs increased with time after LT, and the risk was generally higher for female patients. The frequency of DSA detection was significantly lower (10.6%) for patients receiving immunosuppressive treatment with mTOR inhibitors than for those receiving other regimens (20.5%; p = 0.025). Conclusion Autoimmune liver disease, female sex, and a time of more than 8 years since LT predispose patients to the development of DSAs. Immunosuppression with the mTOR inhibitor everolimus protects against DSA development after liver transplant. This article is protected by copyright. All rights reserved.
  • Development of a Human Cadaver Model for Training in Laparoscopic Donor Nephrectomy
    Background The Organ Procurement Network recommends that a surgeon record 15 cases as surgeon or assistant for laparoscopic donor nephrectomy (LDN) prior to independent practice. The literature suggests that the learning curve for improved perioperative and patient outcomes is closer to 35 cases. In this article, we describe our development of a model utilizing fresh tissue and objective, quantifiable endpoints to document surgical progress and efficiency in each of the major steps involved in LDN. Materials and Methods Phase I of model development focused on the modifications necessary to maintain visualization for laparoscopic surgery in a human cadaver. Phase II tested proposed learner-based metrics of procedural competency for multiport LDN by timing the procedural steps of LDN in a novice learner. Results Phases I and II required 12 and 9 cadavers, respectively, with a total of 35 kidneys utilized. The following metrics improved with trial number for multiport LDN: time taken for dissection of the gonadal vein, ureter, renal hilum, and adrenal and lumbar veins; simulated warm ischemic time (WIT); and operative time. Conclusion Human cadavers can be used for training in LDN, as evidenced by improvements in timed learner-based metrics. This simulation-based model fills a gap in available training options for surgeons. This article is protected by copyright. All rights reserved.
  • Characteristics of compatible pair participants in kidney paired donation at a single center
    Compatible pairs of living kidney donors and their intended recipients can enter into kidney paired donation (KPD) and facilitate additional living donor kidney transplants (LDKTs). We examined 11 compatible pairs (the intended recipients and their intended, compatible donors) who participated in KPD, along with the recipients' 11 matched exchange donors. The 11 pairs participated in 10 separate exchanges (3 were multi-center exchanges) that included 33 total LDKTs (22 additional LDKTs). All the intended donors were blood group O and female, with a mean living kidney donor profile index (LKDPI) of 27.6 (SD 16.8). The matched donors had a mean LKDPI of 9.4 (SD 31.7). Compatible pairs entered KPD for altruistic reasons (N=2) or because of a mismatch in age (N=7) or body/kidney size (N=2) between the recipient and intended donor. In four cases, retrospective calculation of the LKDPI revealed that the matched donor had a higher LKDPI than the intended donor. Of the 22 recipients of LDKTs enabled by the compatible pairs, three were highly sensitized, with PRA >80%. In conclusion, most compatible pairs entered into KPD so that the recipient could receive an LDKT from a donor whose age or body/kidney size was more favorable to post-transplant outcomes. This article is protected by copyright. All rights reserved.
  • Simultaneous pancreas and kidney transplantation: Incidence and risk factors for amputation after 10 years follow-up
    Introduction The incidence of amputation after simultaneous pancreas and kidney (SPK) transplantation ranges from 9.5% to 23% after 5 years of follow-up. The objective of this study was to investigate the incidence and risk factors for amputation in SPK transplant patients after a minimum follow-up of 10 years. Methods An analysis was performed on a prospectively maintained database of 81 SPK transplants and 43 kidney transplants alone (KTA) consecutively performed in one centre for insulin-dependent diabetes mellitus between December 1992 and January 2006. Primary outcome variables were the incidence of amputation per patient, total number of amputations and type of amputation performed. Data are presented as mean ± standard deviation. Results Seven patients (9%) in the SPK cohort and one patient (2%) in the KTA cohort underwent amputation (p<0.001). One amputee had pancreas allograft failure prior to amputation. Fifteen amputations were performed in total, and 4 patients required ≥2 amputations. The latency period between transplantation and amputation was 133.57 ± 49.43 months in the SPK cohort and 168 months in the KTA group. Conclusions The incidence of amputation after SPK transplantation is approximately 9% after 10 years of follow-up. Patients are at significantly greater risk of amputation after SPK transplantation compared with KTA for type 1 diabetes, despite insulin independence. This article is protected by copyright. All rights reserved.
  • Pharmacokinetic and Pharmacogenetic Analysis of Immunosuppressive Agents after Laparoscopic Sleeve Gastrectomy
    Background Severe obesity has been shown to limit access to renal transplantation in patients with end-stage renal disease (ESRD). Laparoscopic sleeve gastrectomy (LSG) has been performed in the ESRD population to assist in achieving waitlist and transplant eligibility. Little is known about how LSG affects the bioequivalence of tacrolimus products and immunosuppression pharmacokinetics. Methods This was a prospective, open-label, single-dose, crossover, two-period pharmacokinetic (PK) study. The purpose of this study was to assess the single-dose PK of immediate-release tacrolimus (IR-TAC), extended-release tacrolimus (ER-TAC), and mycophenolic acid (MPA) in adult ESRD patients post-LSG. Results Twenty-three subjects were included in the 24-hour PK assessments. The ratio of geometric means between ER-TAC and IR-TAC was 103.5% (90% CI 89.6-119.6%) for AUC0-24 and 92.5% (90% CI 80.4-106.4%) for Cmax. PK parameters were similar between ER-TAC and IR-TAC, except for Cmin (p=0.004) and Cmax (p=0.04). MPA AUC0-24 was similar when given with either ER-TAC or IR-TAC (p=0.32). Patients expressing CYP3A5*1 genotypes had lower tacrolimus AUC0-24 values than those with CYP3A5*3/*3 (IR-TAC p<0.001; ER-TAC p=0.008). Genotype did not impact MPA PK. Conclusion Dose modification of immunosuppressants post-LSG may not be necessary beyond standard therapeutic drug monitoring. This article is protected by copyright. All rights reserved.
  • Airway inflammation and symptoms in children following liver and heart transplantation
    Objectives To describe the upper airway endoscopic findings of children with upper airway symptoms after liver transplantation (LT) or heart transplantation (HT). Methods Review of children undergoing airway endoscopy after LT or HT from 2011 to 2015 at a tertiary care pediatric hospital. Airway findings, biopsy results, immunosuppression and Epstein-Barr virus (EBV) levels were recorded. Results Twenty-three of 158 LT (n=111) and HT (n=47) recipients underwent endoscopy. Median time to endoscopy was 9 months (range 4-25) after LT and 31 months (range 1-108) after HT. Thirteen of 23 patients presented with upper airway symptoms, and 10 of 23 presented with respiratory failure or for surveillance. The 13 patients with upper airway symptoms had abnormal findings (7 LT; 6 HT), most commonly arytenoid edema (13 patients). There were 5 EBV-positive biopsies (4 with PTLD) and 6 EBV-negative biopsies with lymphocytic inflammation. One biopsy demonstrated fungal infection. Immunosuppression was decreased in 7 patients, and 3 received steroids. There were no episodes of allograft rejection. No patients had airway symptoms at last follow-up. Conclusions In pediatric solid organ transplant recipients, symptoms of airway obstruction are not uncommon and should be evaluated with endoscopy. Endoscopy without symptoms is low-yield. Treatment with decreased immunosuppression improved airway symptoms. This article is protected by copyright. All rights reserved.
  • Low Serum Testosterone is Associated With Graft Function Early After Heart Transplantation
    Background We sought to investigate the correlation between serum testosterone levels and graft function early after heart transplantation. Methods In a cross-sectional study, we measured serum testosterone levels 4 weeks after heart transplantation in 49 consecutive male recipients. Echocardiography was performed to evaluate graft function. Low serum testosterone was defined as <11 nmol/L. Results Low serum testosterone was present in 21 (43%) recipients (Group A), and 28 (57%) had normal testosterone levels (Group B). The two groups did not differ in age or in the presence of renal dysfunction, arterial hypertension, diabetes or hyperlipidemia. Donor age and allograft ischemic time were not different between the two groups. Both groups had comparable tacrolimus trough levels and doses of mycophenolate mofetil and methylprednisolone. Patients in Group A had significantly lower LVEF (58±5% vs. 65±6% in Group B, P=0.001) and TAPSE (1.3±0.3 cm vs. 1.6±0.3 cm in Group B, P=0.01). More patients in Group A than in Group B were found to have low-grade (1R) rejection (25% vs. 3%; P=0.02). Conclusion Low serum testosterone levels appear to be associated with impaired graft function and an increased incidence of low-grade rejection episodes early after heart transplantation. This article is protected by copyright. All rights reserved.
  • Epidemiology, Risk Factors and Outcome of Clostridium difficile Infection in Heart and Heart-Lung Transplant Recipients
    Background Clostridium difficile is a major cause of diarrhea in thoracic organ transplant recipients. We investigated the epidemiology, risk factors and outcome of Clostridium difficile infection (CDI) in heart and heart-lung transplant (HT) recipients. Methods This is a retrospective study from 2004 to 2013. CDI was defined by diarrhea and a positive toxigenic Clostridium difficile in stool, measured by toxin enzyme immunoassay (2004-2006) or polymerase chain reaction (2007-2013). Cox proportional hazards regression was used to model the association of risk factors with time to CDI, and survival with CDI following transplantation. Results There were 254 HT recipients, with a median age of 53 years (IQR, 45-60); 34% were female. During the median follow-up of 3.1 years (IQR, 1.3-6.1), 22 (8.7%) patients developed CDI. In multivariate analysis, risk factors for CDI were combined heart-lung transplant (HR 4.70; 95% CI, 1.30-17.01; p=0.02) and retransplantation (HR 7.19; 95% CI, 1.61-32.12; p=0.01). Acute cellular rejection was associated with a lower risk of CDI (HR 0.34; 95% CI, 0.11-0.94; p=0.04). CDI was found to be an independent risk factor for mortality (HR 7.66; 95% CI, 3.41-17.21; p<0.0001). Conclusions CDI after HT is more common among patients with combined heart-lung transplants and those undergoing retransplantation. CDI was associated with a higher risk of mortality in HT recipients. This article is protected by copyright. All rights reserved.
  • Brain natriuretic peptide and right heart dysfunction after heart transplantation
    Heart transplantation (HT) should normalize cardiac endocrine function, but brain natriuretic peptide (BNP) levels remain elevated after HT, even in the absence of left ventricular haemodynamic disturbance or allograft rejection. Right ventricular (RV) abnormalities are common in HT recipients (HTx) as a result of the engraftment process, tricuspid insufficiency and/or repeated inflammation due to iterative endomyocardial biopsies. RV function follow-up is vital for patient management, since RV dysfunction is a recognized cause of in-hospital death and is responsible for a worse prognosis. Interestingly, few and conflicting data are available concerning the relationship between plasma BNP levels and RV functional impairment in HTx. This suggests that infra-clinical modifications, such as subtle immune system disorders or hypoxic conditions, might influence BNP expression. Moreover, owing to other altered circulating molecular forms of BNP, a lack of specificity of BNP assays has been described in heart failure patients. This phenomenon could exist in the HT population and could explain elevated plasma BNP levels despite normal RV function. In clinical practice, intra-individual change in BNP over time, rather than absolute BNP values, might be more helpful in detecting right cardiac dysfunction in HTx. This article is protected by copyright. All rights reserved.
  • Delirium after Lung Transplantation: Association with Recipient Characteristics, Hospital Resource Utilization, and Mortality
    Background Delirium is associated with increased morbidity and mortality. The factors associated with post-lung transplant delirium and its impact on outcomes are incompletely characterized. Methods The medical records of 163 consecutive adult lung transplant recipients were reviewed for delirium within 5 days (early-onset) and within 30 hospital days (ever-onset) post-transplantation. A multivariable logistic regression model assessed factors associated with delirium. Multivariable negative binomial regression and Cox proportional hazards models assessed the association of delirium with ventilator duration, intensive care unit (ICU) length of stay (LOS), hospital LOS and one-year mortality. Results Thirty-six percent of patients developed early-onset delirium, and 44% developed ever-onset delirium. Obesity (OR 6.35, 95% CI 1.61-24.98) and bolused benzodiazepines within the first post-operative day (OR 2.28, 95% CI 1.07-4.89) were associated with early-onset delirium. Early-onset delirium was associated with longer adjusted mechanical ventilation duration (p=0.001), ICU LOS (p<0.001), and hospital LOS (p=0.005). Ever-onset delirium was associated with longer ICU (p<0.001) and hospital LOS (p<0.001). After adjusting for clinical variables, delirium was not significantly associated with one-year mortality (early-onset HR 1.65, 95% CI 0.67-4.03; ever-onset HR 1.70, 95% CI 0.63-4.55). Conclusions Delirium is common after lung transplant surgery and is associated with increased use of hospital resources. This article is protected by copyright. All rights reserved.
  • Factors contributing to employment patterns after liver transplantation
    Background Many liver transplant recipients return to work, but their patterns of employment are unclear. We examine patterns of employment in the 5 years after liver transplantation. Methods First-time liver transplant recipients aged 18-60 years, transplanted from 2002 to 2009 and surviving at least 5 years, were identified in the United Network for Organ Sharing registry. Recipients' post-transplant employment status was classified as (1) never employed; (2) returned to work within 2 years and remained employed (continuous employment); (3) returned to work within 2 years but was subsequently unemployed (intermittent employment); or (4) returned to work ≥3 years post-transplant (delayed employment). Results Of 28,306 liver recipients identified during the study period, 12,998 survived at least 5 years and contributed at least 1 follow-up report of employment status. A minority of patients (4,654; 36%) were never employed, while 3,780 (29%) were continuously employed, 3,027 (23%) were intermittently employed, and 1,537 (12%) had delayed employment. In multivariable logistic regression analysis, predictors of intermittent and delayed employment included lower socioeconomic status, higher local unemployment rates, and post-transplant comorbidities or complications. Conclusion Never, intermittent and delayed employment are common after liver transplantation. Socioeconomic and labor market characteristics may add to clinical factors that limit liver transplant recipients' continuous employment. This article is protected by copyright. All rights reserved.
  • Incidence of acute cellular rejection following granulocyte colony-stimulating factor administration in lung transplantation: A retrospective case-cohort analysis
    Granulocyte colony-stimulating factor (GCSF) is an option to treat leukopenia in lung transplant recipients. Conflicting evidence exists regarding its effects on acute cellular rejection (ACR). A retrospective, case-cohort study was conducted to assess whether the use of GCSF in lung transplant recipients is associated with an increased incidence of ACR. Patients had to have received at least one dose of GCSF but were excluded if they received GCSF within 30 days prior to transplant or received a lymphocyte-depleting agent within 14 days of GCSF administration. Thirty-five patients who received GCSF within 3 months of transplant met inclusion criteria, and 105 patients were identified as controls based on a 1:3 allocation scheme. The incidence of ACR was 57.1% in the GCSF group versus 50.5% in the control group (relative risk (RR)=1.13; 95% CI, 0.80 to 1.59; p=0.48). At 3 months post-transplant, 74.3% of the GCSF group had a dose reduction or discontinuation of their antiproliferative agent versus 17.1% of the control group (RR=4.33; 95% CI, 2.73 to 6.89; p<0.0001). Rejection severity and the incidence of infections were similar between groups. These findings show that GCSF administration within 3 months following lung transplantation was not associated with a higher incidence or severity of ACR. This article is protected by copyright. All rights reserved.
  • Biliary Reconstruction in Liver Transplant Patients with Primary Sclerosing Cholangitis: Duct-to-Duct or Roux-en-Y?
    Introduction Roux-en-Y choledochojejunostomy and duct-to-duct anastomosis are biliary reconstruction methods for liver transplantation. However, there is controversy over which method produces better results. We compared the outcomes of duct-to-duct anastomosis vs. Roux-en-Y hepaticojejunostomy in patients with primary sclerosing cholangitis (PSC) who had undergone liver transplant at the Shiraz Organ Transplant Center. Materials and Methods The medical records of 405 PSC patients who had undergone liver transplant from 1996 to 2015 were reviewed. Patients were divided into two groups: a Roux-en-Y group and a D-D group. Morbidity, disease recurrence and graft and patient survival rates were compared between the two groups. Results A total of 143 patients underwent D-D biliary reconstruction, and 260 patients had a Roux-en-Y loop. Biliary complications involved 4.2% of patients in the D-D group and 3.9% in the Roux-en-Y group (P = 0.863). Actuarial 1-, 3- and 5-year patient survival for the D-D and Roux-en-Y groups was 92%, 85% and 74% vs. 87%, 83% and 79%, respectively (P = 0.384). The corresponding 1-, 3- and 5-year graft survival was 97%, 95% and 92% vs. 98%, 97% and 94%, respectively (P = 0.61). Conclusion D-D biliary reconstruction is a good alternative to Roux-en-Y reconstruction in liver transplantation for selected PSC patients. This article is protected by copyright. All rights reserved.
  • Pilot cohort study on the potential role of TCF7L2 rs7903146 on ischemic heart disease among non-diabetic kidney transplant recipients
    Background The TCF7L2 rs7903146 C>T polymorphism is associated with diabetes in the general population, but its independent impact on cardiovascular disease is debated. On this basis, we investigated its association with major adverse cardiac events (MACE) in a single-center cohort of non-diabetic kidney transplant recipients (KTRs). Methods Patients with pre-transplant diabetes were excluded, and patients who developed post-transplant diabetes were censored at the time of diagnosis. Results The rs7903146 C>T polymorphism appeared to modulate the risk of MACE: 5-year prevalence was 0.8% in CC patients, 7.2% in CT patients and 9.7% in TT patients (p<0.001). TCF7L2 rs7903146 was an independent predictor of MACE in a multivariate Cox regression model (for each T allele, HR: 2.99, 95% CI: 1.62-5.52, p<0.001), together with history of cardiac ischemic events (HR: 8.69, 95% CI: 3.57-21.16, p<0.001), DGF (HR: 2.42, 95% CI: 0.98-5.95, p=0.056) and HLA mismatches (for each mismatch: HR: 1.55, 95% CI: 1.00-2.43, p=0.053). Introduction of the rs7903146 C>T polymorphism into a model based on these clinical variables significantly increased predictive power for MACE (p=0.003). Conclusions The TCF7L2 rs7903146 T allele may be strongly and independently associated with MACE in non-diabetic KTRs. This article is protected by copyright. All rights reserved.
  • Increased mid-abdominal circumference is a predictor for surgical wound complications in kidney transplant recipients: A prospective cohort study
    Kidney transplant recipients are at increased risk of developing surgical site wound complications due to their immunosuppressed status. We aimed to determine whether increased mid-abdominal circumference (MAC) is predictive of wound complications in transplant recipients. A prospective study was performed on all kidney transplant recipients from October 2014 to October 2015. ‘Controls’ consisted of kidney transplant recipients without a surgical site wound complication, and ‘cases’ consisted of recipients who developed a wound complication. In total, 144 patients underwent kidney transplantation and 107 patients met inclusion criteria. Post-operative wound complications were documented in 28 (26%) patients. Patients who developed a wound complication had a significantly greater MAC, body mass index (BMI) and body weight at renal transplantation (p<0.001, p=0.011 and p=0.011, respectively). On single and multiple logistic regression analyses, MAC was a significant predictor of developing a surgical wound complication (p=0.02). Delayed graft function and a history of pre-formed anti-HLA antibodies were also predictive of surgical wound complications (p=0.003 and p=0.014, respectively). Increased MAC is a significant predictor of surgical wound complications in kidney transplant recipients. Integrating clinical methods for measuring visceral adiposity may be useful for stratifying kidney transplant recipients by risk of a surgical wound complication. This article is protected by copyright. All rights reserved.
  • The Epidemiology of Clostridium difficile Infection in a National Kidney Transplant Centre
    Background We aimed to describe the epidemiology and outcomes of Clostridium difficile infection (CDI) in a national kidney transplant centre from 2008 to 2015. Methods Adult kidney and kidney-pancreas transplant recipients (KTRs and KPTRs) were included for analysis if they met the surveillance CDI case definition. Rates of new healthcare-associated CDI (HA-CDI) were expressed per 10,000 KTR/KPTR bed days used (BDU) to facilitate comparisons. Results Fifty-two cases of CDI were identified in 42 KTRs and KPTRs. This corresponded to an average annual rate of 9.6 per 10,000 BDU, higher than that seen amongst general hospital inpatients locally, nationally and internationally. Of the 45 cases (87%) that were considered HA-CDI, nine (20%) had symptom onset in the community. Recent proton-pump inhibitor (PPI) and broad-spectrum antimicrobial exposure preceded the majority of cases. KTRs and KPTRs with CDI had a longer mean length of hospital stay (35 days) than those admitted during the same period who did not have CDI (8 days). Conclusions Education regarding CDI must be extended to transplant recipients and their general practitioners. Other targets for future reduction of CDI rates must include stringent antimicrobial stewardship (both in hospital and in the community) and judicious PPI prescribing. This article is protected by copyright. All rights reserved.
  • Assessment of Cardiac Allograft Systolic Function by Global Longitudinal Strain: From Donor to Recipient
    Background Cardiac allografts are routinely evaluated by left ventricular ejection fraction (LVEF) before and after transplantation. However, myocardial deformation analyses with LV global longitudinal strain (GLS) are more sensitive than LVEF for detecting impaired LV myocardial systolic performance. Methods We analyzed echocardiograms from 34 heart donor-recipient pairs transplanted at Duke University from 2000 to 2013. Assessments of allograft LV systolic function by LVEF and/or LV GLS were performed on echocardiograms obtained pre-explantation in donors and serially in the corresponding recipients. Results Donors had a median LVEF of 55% (25th-75th percentile, 54-60%). Median donor LV GLS was -14.6% (-13.7 to -17.3%); LV GLS was abnormal (i.e., >-16%) in 68% of donors. Post-transplantation, LV GLS was further impaired at 6 weeks (median -11.8%; -11.0 to -13.4%) and 3 months (median -11.4%; -10.3 to -13.9%) before recovering to pre-transplant levels during follow-up. Median LVEF remained >50% throughout follow-up. We found no association between donor LV GLS and post-transplant outcomes, including all-cause hospitalization and mortality. Conclusions GLS demonstrates allograft LV systolic dysfunction in donors and recipients that is not detected by LVEF. The clinical implications of subclinical allograft dysfunction detected by LV GLS require further study. This article is protected by copyright. All rights reserved.
  • Pharmacokinetics of prolonged-release tacrolimus versus immediate-release tacrolimus in de novo liver transplantation: a randomized phase III sub-study
    Background With the same dose of tacrolimus, lower systemic exposure on the first day of dosing has been reported for prolonged-release tacrolimus compared with immediate-release tacrolimus, prompting investigation of differing initial doses. Methods This sub-study of a double-blind, randomized, phase III trial in de novo liver transplant recipients compared the pharmacokinetics of once-daily prolonged-release tacrolimus (initial dose: 0.2 mg/kg/day) versus twice-daily immediate-release tacrolimus (initial dose: 0.1 mg/kg/day) during the first 2 weeks post-transplant. Results Pharmacokinetic data were analysed from patients receiving prolonged-release tacrolimus (n=13) and immediate-release tacrolimus (n=12). Mean systemic exposure (AUC0-24) was higher with prolonged-release versus immediate-release tacrolimus. Dose-normalized AUC0-24 (normalized to 0.1 mg/kg/day) showed generally lower exposure with prolonged-release versus immediate-release tacrolimus. There was good correlation between AUC0-24 and the concentration at 24 hours after the morning dose (r=0.96 and r=0.86, respectively), and the slope of the line of best fit was similar for both formulations. Conclusions Doubling the initial starting dose of prolonged-release tacrolimus compared with immediate-release tacrolimus overcompensated for the lower exposure on Day 1. A 50% higher starting dose of prolonged-release tacrolimus than immediate-release tacrolimus may be required for similar systemic exposure. However, doses of both formulations can be optimized using the same trough-level monitoring system. (ClinicalTrials.gov number: NCT00189826) This article is protected by copyright. All rights reserved.
  • Relationship between pre-transplant physical function and outcomes after kidney transplant
    Background Performance-based measures of physical function predict morbidity following non-transplant surgery. The study objectives were to determine whether physical function predicts outcomes after kidney transplant and to assess how physical function changes post-transplant. Methods We conducted a prospective study of living donor kidney transplant recipients at our center from May 2012 to February 2014. Physical function was measured using the Short Physical Performance Battery (SPPB) (balance, chair stands, gait speed) and grip strength testing. Initial length of stay (LOS), 30-day rehospitalizations, allograft function and quality of life (QOL) were assessed. Results The majority of the 140 patients in our cohort had excellent pre-transplant physical function. In general, balance scores were more predictive of post-transplant outcomes than the overall SPPB. Decreased pre-transplant balance was independently associated with longer LOS and increased rehospitalizations, but not with post-transplant QOL. Thirty-five percent of patients experienced a clinically meaningful (≥1.0 m/s) improvement in gait speed four months post-transplant. Conclusions Decreased physical function may be associated with longer LOS and rehospitalizations following kidney transplant. Further studies are needed to confirm this association. The lack of a relationship between pre-transplant gait speed and outcomes in our cohort may represent a ceiling effect. More comprehensive measures, including balance testing, may be required for risk stratification. This article is protected by copyright. All rights reserved.
  • Idiopathic Hyperammonemia after Solid Organ Transplantation: Primarily a Lung Problem? A Single-Center Experience and Systematic Review
    Background Idiopathic hyperammonemia syndrome (IHS) is an uncommon, often deadly complication of solid organ transplantation. IHS cases in solid organ transplantation seem to occur predominantly in lung transplant (LTx) recipients. However, to the best of our knowledge, the occurrence of IHS has not been systematically evaluated. We set out to identify all reported cases of IHS following non-liver solid organ transplantation. Methods Retrospective review of our institutional experience and systematic review of the literature. Results At our institution, six cases of IHS (out of 844 non-liver solid organ transplants) were identified; five occurred following LTx (incidence 3.9% [lung] vs. 0.1% [non-lung], p=0.004). In the systematic review, sixteen studies met inclusion criteria, reporting on 32 cases of IHS. The majority of IHS cases in the literature (81%) were LTx recipients. The average peak reported ammonia level was 1039 µmol/L, occurring on average 14.7 days post-transplant. Mortality in previously reported IHS cases was 69%. A single-center experience suggested that, in addition to standard treatment for hyperammonemia, early initiation of high-intensity hemodialysis to remove ammonia was associated with increased survival. In the systematic review, mortality was 40% (4 out of 10) with intermittent hemodialysis, 75% (9 out of 12) with continuous veno-venous hemodialysis, and 100% in the 6 subjects who did not receive renal replacement therapy to remove ammonia. Three reports identified infection with urease-producing organisms as a possible etiology of IHS. Conclusion IHS is a rare but often fatal complication that primarily affects lung transplant recipients within the first 30 days. This article is protected by copyright. All rights reserved.
  • Long term survival following kidney transplantation in previous lung transplant recipients - an analysis of the UNOS registry
    Background Kidney transplantation has been advocated as a therapeutic option in lung recipients who develop end-stage renal disease. This analysis outlines patterns of allograft survival following kidney transplantation in previous lung recipients (KAL). Methods Data from the UNOS lung and kidney transplantation registries (1987-2013) were cross-linked to identify lung recipients who were subsequently listed for and/or underwent kidney transplantation. Time-dependent Cox models compared survival in KAL patients vs. those waitlisted for renal transplantation who never received kidneys. Survival analyses compared outcomes between KAL patients and risk-matched recipients of primary, kidney-only transplantation with no history of lung transplantation (KTx). Results 270 lung recipients subsequently underwent kidney transplantation (KAL). Regression models demonstrated a lower risk of post-listing mortality for KAL patients compared with 346 lung recipients on the kidney waitlist who never received kidneys (p<0.05). Comparisons between matched KAL and KTx patients demonstrated significantly increased risk of death and graft loss (p<0.05), but not death-censored graft loss, for KAL patients (p=0.86). Conclusions KAL patients enjoy a significant survival benefit compared with waitlisted lung recipients who do not receive kidneys. However, KAL patients do poorly compared with KTx patients. Decisions about KAL transplantation must be made on a case-by-case basis considering patient and donor factors. This article is protected by copyright. All rights reserved.
  • Low Vitamin D Exposure is Associated with Higher Risk of Infection in Renal Transplant Recipients
    Background Vitamin D is a steroid hormone with multiple vital roles within the immune system. Various studies have evaluated the influence of vitamin D on infections after renal transplantation and found contrasting results. This study aims to assess the relationship between vitamin D status and the incidence of infection in renal transplant recipients. Methods This is a retrospective cohort study of adult renal transplant recipients at the University of Pittsburgh Medical Center between 2005 and 2012. Patients were grouped as vitamin D sufficient (≥30 ng/mL) or deficient (<30 ng/mL) based on total serum 25-hydroxyvitamin D concentrations. The association between vitamin D levels collected at any point post-transplantation and the incidence of infection within ±90 days of the vitamin D levels was assessed using logistic and Poisson regression models. Results Vitamin D sufficiency at any point post-transplantation was significantly associated with 66% lower odds (OR: 0.34; 95% CI: 0.22-0.52; p<0.001) and a 43% lower rate of infections (IRR: 0.57; 95% CI: 0.46-0.71; p<0.001) within ±90 days of the vitamin D level. Baseline vitamin D level was also associated with lower incidence of and risk for infections within the first year post-transplantation. Conclusion Adequate levels of vitamin D in kidney transplant recipients are associated with lower infection risk in the first year and at any time post-transplantation. This article is protected by copyright. All rights reserved.
  • Severe Acute Cellular Rejection after Intestinal Transplantation is Associated with Poor Patient and Graft Survival
    Background Severe acute cellular rejection (ACR) occurs frequently after intestinal transplantation (ITx). Aim To evaluate the outcomes and the risk factors for graft failure and mortality in patients with severe ACR after ITx. Methods Retrospective study evaluating all ITx recipients who developed severe ACR between 01/2000 and 07/2014. Demographic and histologic data were reviewed. Results 20/126 (15.9%) ITx recipients developed severe ACR. Of these 20 episodes, 13 were in adults (median age: 47.1). The median (IQR) time from ITx to severe ACR was 206.5 (849) days. All patients received intravenous methylprednisolone and increased doses of tacrolimus. Sixteen (80%) patients did not respond to initial treatment and required thymoglobulin administration. Moreover, 11 (55%) patients required additional immunosuppressive medications. Six (30%) patients required graft enterectomy. Complications related to ACR treatment were as follows: 10 (50%) patients developed bacterial infections, 4 (20%) developed CMV infection, and 4 (20%) developed PTLD. At the end of follow-up, only 3/20 (15%) patients were alive with a functional allograft. The median patient survival time after diagnosis of severe ACR was 400 days (95% CI: 234.0-2613.0). Conclusions Severe ACR episodes are associated with high rates of graft loss and complications related to treatment. This article is protected by copyright. All rights reserved.
  • Screening for Asymptomatic Bacteriuria at One Month after Adult Kidney Transplantation: Clinical Factors and Implications
    Objective Urinary tract infections (UTI) account for significant morbidity after kidney transplantation (KT). Screening for asymptomatic bacteriuria (AB) has proven beneficial in certain populations, including pregnant women; however, it is not well studied in the KT population. We reviewed the incidence, clinical features, and implications of asymptomatic bacteriuria one month after KT. Methods 171 adult KT patients (86 (50.3%) living donor transplants, 87 (50.9%) males, mean age 47.3±13.7 years) transplanted between 2005 and 2012 were analyzed. Immunosuppression induction and maintenance were as per protocol. Protocol urine cultures were taken at 1 month post-transplantation. Patients were stratified by presence of AB and analyzed for demographics and clinical parameters. Outcomes of hospitalization for symptomatic UTIs and graft and patient survival were ascertained. Results Forty-one (24%) KT recipients had AB at 30 days post-transplant. Multi-resistant organisms accounted for 43.9% of these infections. Logistic regression confirmed female sex and deceased donor transplantation as independent predictors of 30-day bacteriuria, which in turn predicted subsequent hospitalization for symptomatic UTI. One-year patient and graft survival were similar in recipients with or without AB. Conclusion Asymptomatic bacteriuria 30 days post-transplant is more likely in female recipients and recipients of deceased donor kidneys, probably due to anatomical and functional differences, respectively. Given the increased morbidity of subsequent hospitalization for symptomatic UTI, more research into prevention of UTI is needed, particularly non-antibiotic prophylaxis. This article is protected by copyright. All rights reserved.
  • Early conversion to belatacept after renal transplantation
    Belatacept is a non-nephrotoxic immunosuppressive agent, which may make it the ideal agent for patients with delayed or slow graft function on calcineurin inhibitors. There are limited data on conversion of patients to belatacept within 6 months of transplantation. Between 1/2012 and 12/2015, sixteen patients were converted to belatacept for delayed or poor graft function (eGFR <30 ml/min/1.73 m², MDRD); three were HIV+. Conversion protocols were analyzed in patients <4 months and 4-6 months post-transplantation. Mean serum creatinine levels after belatacept conversion were compared with pre-conversion levels. Patient survival was 100%, and graft survival was 88%. The mean creatinine fell from 3.9±1.82 mg/dl pre-conversion to 2.1±1.1 mg/dl at 6 months and 1.9±0.47 mg/dl (median 1.8 mg/dl) at 12 months post-conversion. There was no significantly increased risk of rejection, infection, or malignancy. HIV parameters remained largely stable. In this single-center, non-randomized retrospective analysis, early conversion to belatacept in patients with DGF or slow graft function was safe and efficacious. This article is protected by copyright. All rights reserved.
  • A Clinical Tool to Calculate Post-Transplant Survival Using Pre-Transplant Clinical Characteristics in Adults with Cystic Fibrosis
    Background We previously identified factors associated with a greater risk of death post-transplant. The purpose of this study was to develop a clinical tool to estimate the risk of death after transplant based on pre-transplant variables. Methods We utilized the Canadian CF registry to develop a nomogram that incorporates pre-transplant clinical measures to assess post-lung transplant survival. The 1-, 3-, and 5-year survival estimates were calculated using Cox proportional hazards models. Results Between 1988 and 2012, 539 adult Canadians with CF received a lung transplant, with 208 deaths in the study period. The four pre-transplant factors most predictive of poor post-transplant survival were older age at transplantation, infection with B. cepacia complex, low FEV1 percent predicted, and pancreatic sufficiency. A non-linear relationship was found between risk of death and FEV1 percent predicted, age at transplant, and BMI. We constructed a risk calculator based on our model to estimate the 1-, 3-, and 5-year probability of survival after transplant, which is available online at www.cf-post-tx-survival-nomogram.appspot.com. Conclusions Our risk calculator quantifies the risk of death associated with lung transplant using pre-transplant factors. This tool could aid clinicians and patients in the decision-making process and provide information regarding the timing of lung transplantation. This article is protected by copyright. All rights reserved.
  • Pharmacogenetics of Steroid Responsive Acute Graft-Versus Host Disease
    Glucocorticoids are central to effective therapy of acute graft-versus-host disease (GVHD). However, only about half of patients respond to steroids in initial therapy. Based on postulated mechanisms of anti-inflammatory effectiveness, we examined genetic variations in the glucocorticoid receptor, co-chaperone proteins, membrane transporters, inflammatory mediators, and the T-cell receptor complex, in hematopoietic cell transplant recipients with acute GVHD requiring treatment with steroids and in their donors, for association with response at day 28 after initiation of therapy. 300 recipient and donor samples were analyzed. Twenty-three SNPs in 17 genes affecting glucocorticoid pathways were included in the analysis. In multiple regression analysis, donor SNP rs3192177 in the ZAP70 gene (OR 2.8, 95% CI: 1.3-6.0, p=0.008) and donor SNP rs34471628 in the DUSP1 gene (OR 0.3, 95% CI: 0.1-1.0, p=0.048) were significantly associated with complete or partial response. However, after adjustment for multiple testing, these SNPs did not remain statistically significant. Our results from this small, exploratory, hypothesis-generating analysis suggest that common genetic variation in glucocorticoid pathways may help identify subjects with differential response to glucocorticoids. This needs further assessment in larger datasets and, if validated, could help identify subjects for alternative treatments and guide the design of targeted treatments to overcome steroid resistance. This article is protected by copyright. All rights reserved.
  • Cultural Competency of a Mobile, Customized Patient Education Tool for Improving Potential Kidney Transplant Recipients’ Knowledge and Decision-Making
    Patients considering renal transplantation face an increasingly complex array of choices as a result of the revised kidney transplant allocation system. Decision aids have been shown to improve patient decision making through the provision of detailed, relevant, individualized clinical data. A mobile iOS-based application (app), including animated patient education and individualized risk-adjusted outcomes following kidney transplants with varying donor characteristics and DSA waiting times, was piloted in 2 large US transplant programs with a diverse group of renal transplant candidates (N=81). The majority (86%) of patients felt that the app improved their knowledge and was culturally appropriate for their race/ethnicity (67%-85%). Patients scored significantly higher on transplant knowledge testing (9.1/20 to 13.8/20, p<0.001) after viewing the app, including patients with low health literacy (8.0 to 13.0, p<0.001). Overall knowledge of and interest in living and deceased donor kidney transplantation increased. This pilot project confirmed the benefit and cultural acceptability of this educational tool, and further refinement will explore how to better communicate the risks and benefits of non-standard donors. This article is protected by copyright. All rights reserved.
  • Making inroads to the cure: Barriers to clinical trial enrollment in hematopoietic cell transplantation
    A significant barrier to advancing the standard of care for patients with hematologic malignancies undergoing stem cell transplantation is access and willingness to participate in clinical trials. The importance of clinical trial enrollment is magnified in an era of targeted therapies, accelerated drug discovery, and investment by the pharmaceutical industry. As disease targets are identified, novel therapies are being evaluated in efforts to reduce treatment-related toxicity and improve progression-free and overall survival. The enrollment of hematopoietic cell transplantation (HCT) patients on clinical studies is essential to promote the development of such therapies. Increasing clinical trial participation requires understanding of potential barriers to enrollment, including patient concerns, institutional and provider hurdles, and disease-specific characteristics. This article is protected by copyright. All rights reserved.
  • Lactobacillus rhamnosus GG probiotic enteric regimen does not appreciably alter the gut microbiome or provide protection against GVHD after allogeneic hematopoietic stem cell transplantation
    Graft-versus-host disease (GVHD) is a major adverse effect associated with allogeneic stem cell transplant. Previous studies in mice indicated that administration of the probiotic Lactobacillus rhamnosus GG can reduce the incidence of GVHD after hematopoietic stem cell transplant. Here we report results from the first randomized probiotic enteric regimen trial, in which allogeneic hematopoietic stem cell transplant patients were supplemented with Lactobacillus rhamnosus GG. Gut microbiome analysis confirmed a previously reported gut microbiome association with GVHD. However, the clinical trial was terminated when interim analysis did not detect an appreciable probiotic-related change in the gut microbiome or incidence of GVHD. Additional studies are necessary to determine whether probiotics can alter the incidence of GVHD after allogeneic stem cell transplant. This article is protected by copyright. All rights reserved.
  • Sinus Tachycardia Is Associated with Impaired Exercise Tolerance following Heart Transplantation
    Background Sinus tachycardia often presents in heart transplantation (HTx) recipients, but data on its effect on exercise performance are limited. Methods Based on mean heart rate (HR) 3 months after HTx, 181 patients transplanted from 2005 to 2016 at Nebraska Medical Center were divided into 2 groups: 1) HR <95 beats/min (bpm; n=93) and 2) HR ≥95 bpm (n=88). Cardiopulmonary exercise testing (CPET) was performed 1 year after HTx. Results Mean HR at 3 months post-HTx was 94±11 bpm and did not change significantly by 1 year post-HTx (96±11 bpm, p=0.13). HR ≥95 bpm at 3 months was associated with younger donor age (OR 1.1; CI 1.0-1.1, p=0.02), female donors (OR 2.4; CI 1.16-5.24, p=0.02), and lack of donor heavy alcohol use (OR 0.43; CI 0.17-0.61; p=0.04). HR ≥95 bpm at 3 months post-HTx was independently associated with decreased exercise capacity in metabolic equivalents (p=0.008), reduced peak VO2 (p=0.006), and reduced percent of predicted peak VO2 (p=0.002) during CPET. Conclusions HR ≥95 bpm at 3 months following HTx is associated with reduced exercise tolerance in stable HTx recipients. Medical HR reduction after HTx could improve exercise performance and merits further investigation. This article is protected by copyright. All rights reserved.
  • Quantitative Computed Tomography Assessment of Bronchiolitis Obliterans Syndrome after Lung Transplantation
    Background Bronchiolitis obliterans syndrome (BOS) is a clinical manifestation of chronic allograft rejection following lung transplantation. We examined quantitative measurements of the proximal airways and vessels and their pathologic correlates in subjects with BOS. Methods Patients who received a lung transplant at Brigham and Women's Hospital between December 1, 2002 and December 31, 2010 were included in this study. We characterized the quantitative CT measures of proximal airways and vessels and the pathological changes. Results 94 (46.1%) of the 204 subjects were included in the study. There was a significant increase in the airway-vessel ratio in subjects who developed progressive BOS compared with controls and non-progressors. There was a significant increase in airway lumen area and a decrease in vessel cross-sectional area in patients with BOS compared with controls. Patients with BOS also had a significant increase in proximal airway fibrosis compared with controls. Conclusions BOS is characterized by central airway dilation and vascular remodeling, the degree of which correlates with decrements in lung function. Our data suggest that progressive BOS is a pathologic process that affects both the central and distal airways. This article is protected by copyright. All rights reserved.
  • Adverse Symptoms of Immunosuppressants: A Survey of Canadian Transplant Clinicians
    Adverse symptoms of immunosuppressants (ASI) impact quality of life (QOL) in solid organ transplant recipients; however, standardized approaches for active ASI surveillance and intervention are lacking. While management is highly clinician-dependent, clinician views remain largely unexplored. We surveyed Canadian Society of Transplantation members on their perceptions of ASI, including frequency, perceived QOL impact, causal attribution, management strategies, and success. Sixty-one clinicians participated in the survey of 12 ASI (tremor, diarrhea, nausea, constipation, dyspepsia, insomnia, edema, dyspnea, arthralgia, acne, mouth sores, paresthesias), for a 22% response rate. Forty-nine completed the survey (80% completion rate). Diarrhea, dyspepsia, and insomnia were most frequent, requiring management in ≥2% of patients by 96%, 90%, and 82% of respondents, respectively. Diarrhea, insomnia, and dyspnea were deemed to have an important QOL impact by 92%, 82%, and 69%. Immunosuppressants were universally implicated as causative of tremor, diarrhea, acne, and mouth sores. Over 80% reported success in managing mouth sores, dyspepsia, and constipation. Management strategies included adjustment of immunosuppressant or other medications, drug therapy, and non-pharmacologic approaches, and varied according to perceived causal attribution. More study is needed to compare clinician and patient views. These results will be used to establish priorities for further investigation of ASI. This article is protected by copyright. All rights reserved.
  • (D+10) MELD as a Novel Predictor of Patient and Graft survival after Adult to Adult Living Donor Liver Transplantation
    We modified the previously described DMELD score, developed for deceased donor liver transplantation, to (D+10)MELD to account for living donors being about 10 years younger than deceased donors, and tested it in living donor liver transplantation (LDLT) recipients. 500 consecutive LDLTs between July 2010 and December 2012 were retrospectively analyzed to assess the effect of (D+10)MELD on patient and graft survival. Donor age alone did not influence survival. Recipients were divided into 6 classes based on the (D+10)MELD score: Class 1 (0-399), Class 2 (400-799), Class 3 (800-1199), Class 4 (1200-1599), Class 5 (1600-1999), and Class 6 (>2000). One-year patient survival (97.1, 88.8, 87.6, 76.9, and 75% across classes 1-5, p = 0.03) and graft survival (97.1, 87.9, 82.3, 76.9, and 75%; p = 0.04) differed significantly among the classes. The study population was then divided into 2 groups at a (D+10)MELD cutoff of 860. Group 1 had significantly better 1-year patient (90.4% vs 83.4%; p = 0.02) and graft survival (88.6% vs 80.2%; p = 0.01). While donor age alone does not predict recipient outcome, the (D+10)MELD score is a strong predictor of recipient and graft survival, and may help in better recipient/donor selection and matching in LDLT. This article is protected by copyright. All rights reserved.
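    The scoring described above can be sketched in a few lines. Note the assumptions: the abstract does not restate the formula, so the product form (donor age + 10) × recipient MELD is inferred from the original DMELD definition (donor age × recipient MELD); the function names and the handling of scores falling exactly on a class boundary are illustrative, not from the paper.

    ```python
    def d_plus_10_meld(donor_age, recipient_meld):
        """(D+10)MELD: the DMELD product with 10 years added to the
        living donor's age (assumed form; see lead-in)."""
        return (donor_age + 10) * recipient_meld

    def d_plus_10_meld_class(score):
        """Map a (D+10)MELD score to the 6 classes reported in the
        abstract: 0-399, 400-799, 800-1199, 1200-1599, 1600-1999, higher."""
        upper_bounds = (400, 800, 1200, 1600, 2000)
        for cls, upper in enumerate(upper_bounds, start=1):
            if score < upper:
                return cls
        return 6

    def risk_group(score, cutoff=860):
        """Two-group split at the reported cutoff of 860."""
        return 1 if score < cutoff else 2
    ```

    For example, a 35-year-old donor and a recipient MELD of 20 give a score of (35+10)×20 = 900, which falls in Class 3 and in the higher-risk group under the 860 cutoff.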
  • Histologic surveillance after liver transplantation due to autoimmune hepatitis
    Background Autoimmune hepatitis (AIH) often recurs after liver transplantation (LT). Our aim was to evaluate the recurrence rate of AIH after LT and the impact of AIH recurrence on survival and fibrosis progression, and to identify risk factors for AIH recurrence. Methods 42 patients with AIH prior to LT and ≥1 protocol biopsy ≥1 year post-LT were included, with a median follow-up of 5.0 years (1.0-17.0). Follow-up liver biopsies were re-evaluated for AIH recurrence, fibrosis progression, and cirrhosis development. Results Histological recurrence of AIH was diagnosed in 15 (36%) patients at a median of five years of follow-up. Recurrent AIH led to progressive fibrosis (METAVIR stage 3-4) in two patients but did not cause any patient death or graft loss. Transaminases were normal in three patients with recurrent AIH (20%). AIH recurrence was more common in patients without overlapping cholangitis (OR 1.44, p=.021). Immunosuppression without an antimetabolite increased the risk of AIH recurrence (OR 1.47, p=.018). Patient and graft survival rates at one, five, and ten years were 94%, 86%, and 86% and 91%, 77%, and 74%, respectively. AIH recurrence did not affect survival. Conclusions AIH recurs in 36% of patients within five years but does not affect patient or graft outcome. This article is protected by copyright. All rights reserved.
  • The effects of share 35 on the cost of liver transplantation
    On June 18, 2013, the United Network for Organ Sharing (UNOS) instituted a change in the liver transplant allocation policy known as ‘Share 35.’ The goal was to decrease waitlist mortality by increasing regional sharing of livers for patients with a Model for End Stage Liver Disease (MELD) score of 35 or above. Several studies have shown Share 35 to be successful in reducing waitlist mortality, particularly in high-MELD patients. However, the MELD score at transplant has increased, resulting in sicker patients, more complications, and longer hospital stays. Our study aimed to explore factors, along with Share 35, that may affect the cost of liver transplantation. Our results show that Share 35 has come with significantly increased cost to transplant centers across the nation, particularly in regions 2, 5, 10, and 11. Region 5 was the only region with a median MELD above 35 at transplant, and its cost was significantly higher than that of other regions. Several other recipient factors changed with Share 35 in ways that may significantly affect the cost of liver transplant. While access to transplantation for the sickest patients has improved, it has come at a cost, and regional disparities remain. The financial implications of proposed allocation system changes must be considered. This article is protected by copyright. All rights reserved.
  • Living Donor Kidney Allograft Survival ≥ 50 Years
    The first successful kidney transplant occurred in 1954. Since then, long-term graft survival has been an elusive, idealistic goal of transplantation. Yet 62 years later, we know of only 6 kidney transplant recipients who have achieved >50-year graft survival while on no immunosuppression or a substantially reduced regimen. Herein, we report graft survival ≥50 years in 2 living donor recipients who have been maintained on standard-of-care immunosuppression the entire time. For our 2 recipients, their living donor's altruism altered the course, length, and quality of their life, which by all accounts can be deemed normal: they attended college, held jobs, had successful pregnancies, raised families, and were productive members of society. Both donors are still alive and well more than 50 years post-donation; both have an acceptable GFR and normal blood pressure, with hyperlipidemia as their only medical problem. These 2 intertwined stories illustrate the tremendous potential of a successful kidney transplant: long-term survival with a normal lifestyle and excellent quality of life, even after more than 5 decades on full-dose immunosuppression. This article is protected by copyright. All rights reserved.
  • Cardiac transplantation in a neonate – First case in Switzerland and European Overview
    Twenty-four percent of pediatric heart transplantations (pHTx) are carried out in infants. Neonatal heart transplantation is both rarely performed and challenging. We report on a newborn girl suffering from cardiac failure due to a huge tumor (24x52 mm) within the free wall of the left ventricle (LV) and subtotal obstruction of the main left bronchus. Following surgical tumor resection, a Berlin Heart EXCOR left ventricular assist device was implanted as a bridge to transplantation. In spite of a donor/recipient organ mismatch of >200%, both the heart transplantation and the post-operative course were successful. In addition to this case report, the authors present data from a survey of infant and neonatal transplantations performed in Western Europe. As neonatal heart transplantation is a rare event in Europe, the authors consider it of crucial importance to share this limited experience. We discuss an alternative strategy, namely palliative surgical correction using the Fontan pathway. The challenges of donor/recipient weight mismatch, the possibilities for overcoming the infant donor organ shortage, and the post-operative immunosuppressive regimen are discussed as well. This article is protected by copyright. All rights reserved.
  • Association of pre-transplant kidney function with outcomes after lung transplantation
    Purpose There is a lack of data regarding the independent association of pre-transplant kidney function with early and late outcomes among lung transplant (LT) recipients. Methods We queried the UNOS database for adult patients (≥18 years of age) undergoing LT between 1987 and 2013. Glomerular filtration rate (GFR) was estimated using the Modification of Diet in Renal Disease (MDRD) and the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations. The study population was split into four groups (>90, 60-90, 45-59.9, and <45 ml/min/1.73 m2) based upon the estimated GFR at the time of listing. Results Overall, there was good correlation between the GFRs estimated from the two equations (n=17884, Pearson r=0.816, p<0.001). There was a consistent and independent association of worse early and late outcomes with declining GFR throughout the spectrum, including among those above 60 ml/min/1.73 m2 (p<0.001 for overall comparisons). Although GFR <45 ml/min/1.73 m2 was associated with worse early and late survival, patients with GFR 45-59.9 ml/min/1.73 m2 do not appear to have a survival advantage beyond 3 years post-transplant. Conclusion There is good correlation between GFR estimated using the MDRD and CKD-EPI equations among patients being considered for LT. Early and late outcomes after LT worsen in a linear fashion with progressively lower pre-transplant GFR. This article is protected by copyright. All rights reserved.
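    As a rough illustration of the stratification above, the sketch below estimates GFR with the IDMS-traceable four-variable MDRD study equation and assigns the four listing-GFR strata. The MDRD coefficients are the standard published values, not restated in the abstract, and the function names are illustrative; the CKD-EPI equation used alongside MDRD in the study is omitted here.

    ```python
    def mdrd_egfr(scr_mg_dl, age_years, female=False, black=False):
        # IDMS-traceable four-variable MDRD study equation
        # (standard published coefficients; assumed, not from the abstract).
        egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
        if female:
            egfr *= 0.742
        if black:
            egfr *= 1.212
        return egfr  # mL/min/1.73 m^2

    def listing_gfr_group(egfr):
        # The four listing-GFR strata used in the study.
        if egfr > 90:
            return ">90"
        if egfr >= 60:
            return "60-90"
        if egfr >= 45:
            return "45-59.9"
        return "<45"
    ```

    For example, a 50-year-old man with a serum creatinine of 1.0 mg/dL has an MDRD eGFR of roughly 79 mL/min/1.73 m², placing him in the 60-90 stratum.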
  • Early post-transplant conversion from tacrolimus to belatacept for prolonged delayed graft function improves renal function in kidney transplant recipients
    Prolonged delayed graft function (DGF) in kidney transplant recipients imparts a risk of poor allograft function, and tacrolimus may be detrimental in this setting. We conducted a retrospective single-center analysis of the first 20 patients converted to belatacept for prolonged DGF as part of a clinical protocol, a novel treatment strategy for this condition. Prior to conversion, patients underwent an allograft biopsy to rule out rejection and confirm tubular injury. The primary outcome was the estimated glomerular filtration rate (eGFR) at 12 months post-transplant; the secondary outcome was the change in eGFR 30 days post-belatacept conversion. At 1 year post-transplant, the mean eGFR was 54.2 (SD 19.2) mL/min/1.73 m2. The mean eGFR on the day of belatacept conversion was 16 (SD 12.7) mL/min/1.73 m2 and rose to 43.1 (SD 15.8) mL/min/1.73 m2 at 30 days post-conversion (P<.0001). The acute rejection rate was 20%, with 100% patient survival at 12 months post-transplant. There was one graft loss, in the setting of an invasive Aspergillus infection that resulted in withdrawal of immunosuppression and transplant nephrectomy. Belatacept conversion for prolonged DGF is a novel treatment strategy that resulted in an improvement in eGFR. Additional follow-up is warranted to confirm the long-term benefits of this strategy.
  • Graft quality matters: Survival after simultaneous liver-kidney transplant according to KDPI
    Background Poor renal function is associated with higher mortality after liver transplantation. Our aim was to understand the impact of kidney graft quality, according to the kidney donor profile index (KDPI) score, on survival after simultaneous liver-kidney (SLK) transplantation. Methods Using United Network for Organ Sharing data from 2002 to 2013 for adult deceased donor SLK recipients, we compared survival and renal graft outcomes according to KDPI. Results Of 4207 SLK transplants, 6% were from KDPI >85% donors. KDPI >85% recipients had significantly increased mortality (HR=1.83, 95% CI=1.44-2.31) after adjusting for recipient factors. Additionally, dialysis in the first week (HR=1.4, 95% CI=1.2-1.7) and death-censored kidney graft failure at 1 year (HR=5.7, 95% CI=4.6-7.0) were associated with increased mortality after adjusting for recipient factors and liver donor risk index score. Conclusions KDPI >85% recipients had worse patient and graft survival after SLK. Poor renal allograft outcomes, including dialysis in the first week and death-censored kidney graft failure at 1 year, which occurred more frequently with KDPI >85% grafts, were associated with significantly reduced patient survival. Questions remain about the survival impact of liver vs kidney graft quality given the close relationship between donor factors contributing to both, but KDPI can still be valuable as a metric readily available at the time of organ offers for SLK candidates.
  • Ledipasvir/sofosbuvir is effective and well tolerated in postkidney transplant patients with chronic hepatitis C virus
    Patients with end-stage renal disease on hemodialysis have a high prevalence of hepatitis C virus (HCV) infection. In most patients, treatment for HCV is delayed until after renal transplantation. We assessed the effectiveness and tolerability of ledipasvir/sofosbuvir (LDV/SOF) in 32 postkidney transplant patients infected with HCV. The group was composed predominantly of treatment-naïve (75%) African American (68.75%) males (75%) infected with genotype 1a (62.5%). Most patients received a deceased donor kidney graft (78.1%). A 96% sustained viral response (SVR) was reported (27/28 patients). One patient relapsed. One patient with baseline graft dysfunction developed borderline rejection. No graft loss was reported. Six HIV-coinfected patients were included in our analysis. Five of these patients achieved SVR12. There were four deaths, one of which was in the HIV group; none were attributed to therapy. Coinfected patients tolerated therapy well with no serious adverse events. Serum creatinine remained stable at baseline, end of therapy, and last follow-up (1.351±.50 mg/dL, 1.406±.63 mg/dL, and 1.290±.39 mg/dL, respectively). In postkidney transplant patients with HCV infection, with or without HIV coinfection, a combination of LDV/SOF was well tolerated and effective.
  • The high incidence of severe chronic kidney disease after intestinal transplantation and its impact on patient and graft survival
    Introduction Using data from the Scientific Registry of Transplant Recipients (SRTR), the cumulative incidence of, risk factors for, and impact on survival of severe chronic kidney disease (CKD) in intestinal transplantation (ITx) recipients were assessed. Methods First-time adult ITx recipients transplanted in the United States between January 1, 1990 and December 31, 2012 were included. Severe CKD after ITx was defined as: glomerular filtration rate (GFR) <30 mL/min/1.73 m2, chronic hemodialysis initiation, or kidney transplantation (KTx). Survival analyses using an extended Cox model were conducted. Results The cumulative incidence of severe CKD 1, 5, and 10 years after ITx was 3.2%, 25.1%, and 54.1%, respectively. The following characteristics were significantly associated with severe CKD: female gender (HR 1.34), older age (HR 1.38 per 10-year increment), catheter-related sepsis (HR 1.58), steroid maintenance immunosuppression (HR 1.50), graft failure (HR 1.76), acute cellular rejection (ACR) (HR 1.64), prolonged requirement for IV fluids (HR 2.12) or TPN (HR 1.94), and diabetes (HR 1.54). Individuals with higher GFR at the time of ITx (HR 0.92 for each 10 mL/min/1.73 m2 increment), and those receiving induction therapies (HR 0.47) or tacrolimus (HR 0.52), showed lower hazards of severe CKD. In adjusted analysis, severe CKD was associated with a significantly higher hazard of death (HR 6.20). Conclusions The incidence of CKD after ITx is extremely high and its development drastically limits post-transplant survival.
  • Kidney allograft surveillance biopsy practices across US transplant centers: A UNOS survey
    Background The approach to the diagnosis and management of subclinical rejection (SCR) in kidney transplant recipients remains controversial. Methods We conducted a survey through UNOS across US transplant centers regarding their approach to surveillance biopsies and reasons for the nonperformance of surveillance biopsies. Results Responses were obtained from 106/238 centers (45%); only 18 (17%) of the centers performed surveillance biopsies on all patients, and 22 (21%) performed biopsies in select cases. The most common time points for surveillance biopsies were 3 and 12 months post-transplant. The most common reasons for not performing biopsies were low yield (n = 44, 65%) and the belief that they would not change outcome (n = 24, 36%). The incidence of subclinical T-cell-mediated rejection (SC-TCMR) was ≥10% among 39% of centers. Mean serum creatinine was slightly worse, by 0.06 mg/dL at 1 year and 0.07 mg/dL at 3 years, among centers performing biopsies (P < .0001). The 1- and 3-year observed-expected (O-E) graft survival was similar between centers performing biopsies and those not performing them (P = .07 and .88, respectively). Conclusion Only 17% of US centers that responded to the survey perform surveillance biopsies on all patients, with another 21% performing surveillance biopsies in select cases. Greater uniformity in the approach to and management of this condition is of paramount importance.
  • Professional interpersonal dynamics and burnout in European transplant surgeons
    Background Burnout within the health professions has become an increasingly important topic. Evidence suggests there are differences in burnout across different countries, but research has yet to examine burnout in transplant surgeons throughout Europe. Methods A cross-sectional survey of transplant surgeons across Europe was conducted. The survey included sociodemographics, professional characteristics, frequency of and discomfort with difficult patient interactions (PI), decisional autonomy, psychological job demands (PJD), support (coworker, supervisor, and hospital administration), and burnout, including emotional exhaustion (EE), depersonalization (DP), and personal accomplishment (PA). Results One hundred and eight transplant surgeons provided data; 33 (30.6%) reported high EE, 19 (17.6%) reported high DP, and 29 (26.9%) reported low PA. Three hierarchical multiple linear regressions examined the burnout subscales as outcomes (EE, DP, and PA), with predictors selected based on theoretical relationships with the outcomes. Greater PJD, greater discomfort in managing difficult PI, and lower levels of perceived supervisor support (SS) predicted greater EE. Only decisional autonomy significantly predicted DP, accounting for a small proportion of the variance. None of the steps for PA were significant. Conclusions Given prior research on burnout, there were several surprising findings from this study, for example, the relatively low levels of EE compared to US physicians and surgeons. At this time, we can only hypothesize why this finding occurred; possible explanations include cultural effects, response bias, or other factors unknown at this time. Research is needed to clarify these findings.
  • Prediction model for cardiac allograft vasculopathy: Comparison of three multivariable methods
    Background Cardiac allograft vasculopathy (CAV) remains an important cause of graft failure after heart transplantation (HT). Although many risk factors for CAV have been identified, there are no clinical prediction models that enable clinicians to determine each recipient's risk of CAV. Methods We studied a cohort of 14 328 heart transplant recipients whose data were reported to the International Society for Heart and Lung Transplantation Registry between 2000 and 2010. The cohort was divided into training (75%) and test (25%) sets. Multivariable modeling was performed in the training set using variables available at the time of heart transplant, with three methods: (i) stepwise Cox proportional hazards, (ii) regularized Cox proportional hazards, and (iii) Bayesian network. Results Cardiac allograft vasculopathy developed in 4259 recipients (29.7%) at a median time of 3.0 years after HT. The regularized Cox proportional hazards model yielded the optimal performance and was also the most parsimonious. We deployed this model as an Internet-based risk calculator application. Conclusions We have developed a clinical prediction model for assessing a recipient's risk of CAV using variables available at the time of HT. Application of this model may allow clinicians to determine which recipients will benefit from interventions to reduce the risk of development and progression of CAV.
  • Adverse outcomes associated with postoperative atrial arrhythmias after lung transplantation: A meta-analysis and systematic review of the literature
    Background Postoperative atrial arrhythmias (AAs) are common after lung transplantation, but studies are mixed regarding their impact on outcomes. We therefore performed this systematic review and meta-analysis to determine whether AAs after lung transplantation impede postoperative recovery. Methods MEDLINE, EMBASE, CINAHL, and the Cochrane Register were searched to identify studies comparing outcomes in adult patients undergoing lung transplantation who experienced AAs in the immediate postoperative period vs those without postoperative AAs. Our primary outcome was perioperative mortality; secondary outcomes were length of stay (LOS), postoperative complications, and mid-term (1-6 years) mortality. Results Nine studies comprising 2653 patients were included in this analysis. Of this group, 791 (29.8%) had postoperative AAs. Patients with postoperative AAs had significantly higher perioperative mortality (OR 2.70 [95% CI: 1.73-4.19], P<.0001), longer hospital LOS (MD 8.29 [95% CI: 4.37-12.21] days, P<.0001), more frequent requirement for tracheostomy (OR 4.67 [95% CI: 2.59-8.44], P<.0001), and higher mid-term mortality (OR 1.71 [95% CI: 1.28-2.30], P=.0003). Conclusions AAs after lung transplantation are frequent and associated with significantly higher mortality, longer hospital LOS, and requirement for tracheostomy. Given their impact on recovery, prophylactic strategies against AAs need to be developed.
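Pooled odds ratios like those above are conventionally obtained by fixed-effect inverse-variance meta-analysis on the log scale. A minimal sketch in Python; the 2×2 counts below are purely illustrative, not data from the studies cited:

```python
import math

def log_or_and_var(a, b, c, d):
    """Log odds ratio and its variance from a 2x2 table
    (a, b = events/non-events in exposed; c, d = in controls)."""
    return math.log((a * d) / (b * c)), 1/a + 1/b + 1/c + 1/d

def pooled_or(tables):
    """Fixed-effect inverse-variance pooled odds ratio with 95% CI."""
    num = den = 0.0
    for t in tables:
        lor, var = log_or_and_var(*t)
        w = 1.0 / var          # inverse-variance weight
        num += w * lor
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
    return math.exp(pooled), (lo, hi)

# Hypothetical studies: deaths/survivors with AAs vs without AAs
studies = [(12, 88, 5, 95), (20, 180, 9, 191), (8, 42, 4, 46)]
or_hat, (lo, hi) = pooled_or(studies)
```

Each study contributes in proportion to the precision of its estimate, so large studies with narrow confidence intervals dominate the pooled result.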
  • Treatment of cutaneous and/or soft tissue manifestations of corticosteroid-refractory chronic graft-versus-host disease (cGVHD) by total nodal irradiation (TNI)
    The management of corticosteroid-refractory chronic graft-versus-host disease (cGVHD) remains controversial. A retrospective analysis of patients treated at the Integrated Center of Oncology with total nodal irradiation (TNI) was performed to evaluate its therapeutic potential. TNI delivers a dose of 1 Gy in a single session. Field delimitation is clinical (upper limit: external auditory meatus; lower limit: mid-femur); no pre-therapeutic dosimetry scan was necessary. Efficacy was evaluated by clinical measures 6 months after treatment. Twelve patients were treated with TNI between January 2010 and December 2013, as second-line treatment or beyond. The median time between allograft and TNI was 31.2 months, and the median time between the first manifestations of cGVHD and TNI was 24.2 months. Of the 12 patients, nine had a clinical response at 6 months (75%), including five complete clinical responses (41.6%). Five patients were able to reduce their corticosteroid doses. Three patients had hematologic toxicity. TNI could be considered an option for the treatment of cutaneous and/or soft tissue corticosteroid-refractory cGVHD. However, prospective randomized double-blind trials remain essential to answer questions about TNI safety and effectiveness.
  • Physical activity in solid organ transplant recipients: Participation, predictors, barriers, and facilitators
    Background Our objectives were to describe the physical activity (PA) levels, predictors, barriers, and facilitators to PA in solid organ transplant (SOT) recipients. Methods A web-based questionnaire was sent to members of the Canadian Transplant Association including the Physical Activity Scale for the Elderly (PASE), and questions regarding barriers and facilitators of PA. Results One hundred and thirteen SOT recipients completed the survey. The median PASE score was 164.5 (24.6-482.7). Re-transplantation was the only statistically significant predictor of levels of PA. The most common facilitators of PA included a feeling of health from activity (94%), motivation (88%), social support (76%), knowledge and confidence about exercise (74%) and physician recommendation (59%). Influential barriers were cost of fitness centers (42%), side effects post-transplant or from medications (41%), insufficient exercise guidelines (37%), and feelings of less strength post-transplant (37%). Conclusion There is a large variation in PA levels among SOT recipients. Multiple factors may explain the variance in PA levels in SOT recipients. Identification of facilitators and barriers to PA can inform the development of health and educational promotion strategies to improve participation among SOT recipients with low activity levels.
  • Effect of transversus abdominis plane block in combination with general anesthesia on perioperative opioid consumption, hemodynamics, and recovery in living liver donors: A prospective, double-blind, randomized study
    Background Transversus abdominis plane (TAP) block provides effective postoperative analgesia after abdominal surgeries. It can also be a useful strategy to reduce perioperative opioid consumption, support intraoperative hemodynamic stability, and promote early recovery from anesthesia. The aim of this prospective, randomized, double-blind study was to assess the effect of subcostal TAP blocks on perioperative opioid consumption, hemodynamics, and recovery time in living liver donors. Methods The study was conducted with 49 living liver donors, aged 18-65 years, who were scheduled to undergo right hepatectomy. Patients who received a subcostal TAP block in combination with general anesthesia were allocated to Group 1, and patients who received general anesthesia alone were allocated to Group 2. The TAP blocks were performed bilaterally under real-time ultrasound guidance using 0.5% bupivacaine diluted with saline to a total volume of 40 mL. The primary outcome measure was perioperative remifentanil consumption. Secondary outcomes were mean blood pressure (MBP), heart rate (HR), mean desflurane requirement, anesthesia recovery time, frequency of emergency vasopressor use, total morphine use, and length of hospital stay. Results Total remifentanil consumption and anesthesia recovery time were significantly lower in Group 1 than in Group 2. Postoperative total morphine use and length of hospital stay were also reduced. There were no significant differences in MBP or HR between groups at any time. Conclusions Combining subcostal TAP blocks with general anesthesia significantly reduced perioperative and postoperative opioid consumption and shortened anesthesia recovery time and length of hospital stay in living liver donors.
  • Pulmonary thromboembolism as a complication of lung transplantation
    Post-transplantation mortality after lung transplantation (LTX) is higher than for other solid organ transplantations. Thoracic surgery is associated with increased risk of thromboembolic complications, and as LTX recipients lack the collateral bronchial circulation, pulmonary thromboembolism (PTE) may represent a pertinent yet largely underdiagnosed cause of post-transplantation respiratory failure. In this systematic review, we sought to elucidate the occurrence and predilection site of PTE after LTX, and its potential impact on LTX-associated mortality. Based on twelve original articles identified by a systematic search strategy in PubMed, we found that PTE was reported in 4% of LTX recipients, and 38% of these events occurred within the first 30 days after the LTX procedure. In single-lung transplantation (SLTX) recipients, 12% were diagnosed with PTE, with 92% of these affecting the allograft. Of LTX patients diagnosed with PTE, 11% died within 1 year after LTX and 75% of these deaths occurred within the first 30 days. Our findings suggest that PTE is a potentially underdiagnosed cause of early post-LTX respiratory failure. This should be confirmed in larger studies with systematic follow-up diagnostic imaging.
  • De novo DQ donor-specific antibodies are associated with worse outcomes compared to non-DQ de novo donor-specific antibodies following heart transplantation
    Background Antibody-mediated rejection (AMR) resulting from de novo donor-specific antibodies (dnDSA) leads to adverse outcomes following heart transplantation (HTx). It remains unclear what role dnDSA to specific HLA antigens play in adverse outcomes. This study compares outcomes in patients developing dnDSA to DQ antigens with those developing non-DQ dnDSA and those free from dnDSA. Methods The present study was a single-center, retrospective analysis of 122 consecutive HTx recipients. The primary outcome was a composite of death or graft dysfunction. Results After 3.3 years of follow-up, 31 (28%) patients developed dnDSA. Mean time to dnDSA was 539 days. Of 31 patients, 19 developed DQ antibodies and 12 developed non-DQ antibodies. Compared to non-DQ dnDSA, DQ antibodies presented with higher MFI values (P=.001), were more likely to be persistent (P=.001), and appeared later post-HTx (654 vs 359 days, P=.035). In a multivariable analysis, DQ dnDSA was associated with increased risk of the primary endpoint (HR 6.15, 95% CI 2.57-14.75, P=.001), whereas no increased risk was seen with non-DQ dnDSA (P=.749). Conclusions dnDSA to DQ antigens following HTx are associated with increased risk of death and graft dysfunction.
  • Cognitive function after heart transplantation: Comparing everolimus-based and calcineurin inhibitor-based regimens
    Background Studies have shown conflicting results concerning the occurrence of cognitive impairment after successful heart transplantation (HTx). Another unresolved issue is the possible differential impact of immunosuppressants on cognitive function. In this study, we describe cognitive function in a cohort of HTx recipients and subsequently compare cognitive function between subjects on either everolimus- or calcineurin inhibitor (CNI)-based immunosuppression. Methods Cognitive function, covering attention, processing speed, executive functions, memory, and language functions, was assessed with a neuropsychological test battery. Thirty-seven subjects were included (everolimus group: n=20; CNI group: n=17). The extent of cerebrovascular pathology was assessed with magnetic resonance imaging. Results About 40% of subjects had cognitive impairment, defined as performance at least 1.5 standard deviations below normative mean in one or several cognitive domains. Cerebrovascular pathology was present in 33.3%. There were no statistically significant differences between treatment groups across cognitive domains. Conclusions Given the high prevalence of cognitive impairment in the sample, plus the known negative impact of cognitive impairment on clinical outcome, our results indicate that cognitive assessment should be an integrated part of routine clinical follow-up after HTx. However, everolimus- and CNI-based immunosuppressive regimens did not show differential impacts on cognitive function.
  • Risk of tumor transmission after thoracic allograft transplantation from adult donors with central nervous system neoplasm—A UNOS database study
    Background We analyzed the UNOS database to better define the risk of transmission of central nervous system (CNS) tumors from donors to adult recipients of thoracic organs. Methods Data were procured from the Standard Transplant Analysis and Research dataset files. Donors with CNS tumors were identified, and recipients from these donors comprised the study group (Group I). The remaining recipients of organs from donors who did not have CNS tumors formed the control group (Group II). Incidence of recipient CNS tumors, donor-related malignancies, and overall survival were calculated and compared, in addition to multivariable logistic regression. Results A cohort of 58 314 adult thoracic organ recipients was included, of whom 337 received organs from donors with documented CNS tumors (Group I). None of these recipients developed CNS tumors at a median follow-up of 72 months (IQR: 30-130 months). Although overall mortality was higher in Group I than Group II (163/320=51% vs 22 123/52 691=42%), Kaplan-Meier curves indicate no significant difference in time to death between the two groups (P=.92). Conclusions There is little risk of transmission of the common nonaggressive CNS tumors to recipients of thoracic organs.
  • Long-term renal outcome after allogeneic hemopoietic stem cell transplant: A comprehensive analysis of risk factors in an Asian patient population
    Allogeneic hemopoietic stem cell transplantation (allo-HSCT) poses a significant challenge to renal function due to multiple drug- and complication-related renal toxicities. In this single-center series of 216 adult Asian patients with long and complete follow-up, 41 developed chronic kidney disease (CKD), giving a cumulative incidence of 19.0% at 25 years (median follow-up duration 7.84 years, range 2.0-27.7 years); only two of the 41 patients reached stage 4 CKD and another two required dialysis. In contrast, acute kidney injury occurred in most patients, with a mean fall in glomerular filtration rate (GFR) of 50 mL/min/1.73 m2 at 6 months post-transplant compared with baseline. Suppression of renal function may last beyond 6 months but is potentially reversible, although not to baseline level in most patients. Analysis of a comprehensive range of 18 risk factors showed that older age, lower GFR at transplant, unrelated donor, diagnosis of AML, presence of diabetes mellitus at transplant, and duration of foscarnet use were significantly associated with CKD development, with the first three remaining independent risk factors for CKD in multivariate analysis. Long-term survival was not affected by renal function (78.6% vs 85.5% for patients with low vs normal GFR at 2 years, respectively).
  • Evaluating living donor kidney transplant rates: Are you reaching your potential?
    Background Traditionally, the living donor kidney transplant (LDKT) rate has been calculated as a percentage of total kidney transplant volume. We believe this calculation to be inherently flawed because the number of deceased donor kidney transplants has no bearing on the number of LDKT performed. We propose an alternative calculation of LDKT rate as a percentage of the number of new waitlist registrants. Methods We evaluated 192 adult transplant centers in the United States with respect to their LDKT rate according to both the traditional and proposed calculations, using data from the Scientific Registry of Transplant Recipients between July 2014 and June 2015. Results The median LDKT rate for every 100 new waitlist registrants was 12.3, compared to 27.9 for every 100 total kidney transplants. Based on our proposed calculation of LDKT rate, 16.7% of transplant centers were misevaluated when compared to the national mean using the traditional method. Conclusions A new calculation of LDKT rate based on new waitlist registrants, rather than total kidney transplants, is necessary to eliminate the bias associated with the traditional method, allowing for the identification of centers for improvement as well as each individual center's true potential based on their patient demographics.
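The two competing rate definitions differ only in the denominator. A short sketch with hypothetical center-level counts (the function names and all numbers are illustrative, not SRTR data):

```python
def ldkt_rate_traditional(ldkt, total_kt):
    """LDKT per 100 total kidney transplants (traditional definition)."""
    return 100.0 * ldkt / total_kt

def ldkt_rate_proposed(ldkt, new_waitlist):
    """LDKT per 100 new waitlist registrants (proposed definition)."""
    return 100.0 * ldkt / new_waitlist

# Hypothetical center: 30 LDKT, 110 total kidney transplants,
# and 250 new waitlist registrants in the same period.
trad = ldkt_rate_traditional(30, 110)   # per 100 total transplants
prop = ldkt_rate_proposed(30, 250)      # per 100 new registrants
```

Because the proposed denominator counts candidates rather than transplants performed, a center with a large deceased-donor program no longer looks artificially weak on living donation.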
  • Erythrocytosis after allogeneic hematopoietic stem cell transplantation
  • Short- and long-term outcomes with renin–angiotensin–aldosterone inhibitors in renal transplant recipients: A meta-analysis of randomized controlled trials
    Background Angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor type 1 blockers (ARBs) are often prescribed for renal transplant recipients (RTRs), but the outcomes of these medications in RTRs remain controversial. Methods The PubMed, Embase, and Cochrane Library databases were systematically searched. Randomized controlled trials investigating the outcomes of ACEI/ARBs in RTRs were included for meta-analysis. Results Twenty-two trials with 2242 patients were identified. After treatment for at least 12 months, ACEI/ARBs were associated with a decline in glomerular filtration rate (GFR) (weighted mean difference [WMD] −5.76 mL/min; 95% confidence interval [CI]: −9.31 to −2.20) and a decrease in hemoglobin (WMD −9.81 g/L; 95% CI: −14.98 to −4.64). There were no significant differences in mortality between ACEI/ARB and non-ACEI/ARB groups (risk ratio [RR] 0.98, 95% CI: 0.58 to 1.76), nor in graft failure (RR 0.68, 95% CI: 0.38 to 1.32). After short-term treatment (less than 1 year), significant differences were found in changes in 24-hour proteinuria (WMD −0.57 g/d; 95% CI: −0.72 to −0.42) and serum potassium (WMD 0.25 mEq/L; 95% CI: 0.14 to 0.37) in ACEI/ARB groups compared to the control arm, although these differences were not confirmed with longer treatment. Conclusion This meta-analysis indicates that ACEI/ARBs may be prescribed to RTRs provided GFR and hemoglobin are carefully monitored.
  • Candida is an emerging pathogen beyond the neutropenic period of allogeneic hematopoietic cell transplantation
  • Rescue alemtuzumab for refractory acute cellular rejection and bronchiolitis obliterans syndrome after lung transplantation
    Refractory acute cellular rejection (rACR) is associated with death and bronchiolitis obliterans syndrome (BOS) after lung transplantation. We report the largest cohort of lung transplant recipients (LTRs) treated with rescue alemtuzumab for rACR or BOS. rACR outcomes included the burden of ACR 30 days before and 180 days after rescue, assessed by a novel composite rejection standardized score (CRSS, range 0-6), and freedom from ≥A2 ACR. BOS outcomes included freedom from BOS progression and from FEV1 decline >10%. Univariate parametric and nonparametric statistical approaches were used to assess treatment response. The Kaplan-Meier method with log-rank testing was used to assess freedom from events. Fifty-seven alemtuzumab doses (rACR 40 and BOS 17) given to 51 patients were included. Median time to rescue was 722 (IQR 42-1403) days. CRSS declined significantly (3 vs 0.67, P<.001) after rescue. Freedom from ≥A2 ACR was 62.5% in the rACR group. Freedom from BOS progression was 52.9% at 180 days in the BOS cohort. Freedom from FEV1 decline >10% was 70% in BOS grade 1 and 14.3% in advanced BOS grades 2-3. Infections developed in 72.5% and 76.5% of the rACR and BOS groups, respectively. Rescue alemtuzumab appears useful for rACR. Patients with BOS grade 1 may have transient benefit, whereas patients with advanced BOS seem not to respond to alemtuzumab.
  • Morphologic patterns and treatment of transplant glomerulopathy: A retrospective analysis
    Transplant glomerulopathy is mainly due to chronic antibody-mediated rejection and currently represents a major cause of long-term allograft failure. The lack of effective treatment remains a serious problem in transplantation. A retrospective, single-center study was performed in 48 kidney allograft recipients with transplant glomerulopathy between January 2010 and December 2015. Median time to diagnosis was 7.1 (3.6-11.8) years post-transplant. Light microscopy showed severe transplant glomerulopathy in the majority of patients (cg1=10.4%; cg2=20.8%; cg3=68.8%). Moderate microvascular inflammation was present in 56.3% (g+ptc≥2), and almost half of recipients (51.1%) were C4d positive on immunofluorescence. Female gender (P=.001), age (P=.043), renal dysfunction (P=.002), acute rejection episodes (P=.026), and anti-HLA class II antibodies (P=.004) were associated with kidney allograft failure. Treatment of transplant glomerulopathy was performed in 67.6% of patients. The histologic and laboratory features that led to a therapeutic intervention were ptc score (P=.021), C4d (P=.03), and the presence of anti-HLA antibodies (P=.029), whereas ah score (P=.005) was associated with conservative management. The overall cumulative kidney allograft survival at 10 years was 75%. Treatment of transplant glomerulopathy did not improve long-term kidney allograft survival.
  • Corticosteroid wean after heart transplantation—Is there a risk for antibody formation?
    Background Corticosteroid withdrawal after heart transplantation is limited to select immune-privileged patients, but it is not known whether this predisposes patients to a higher risk of sensitization. Methods A total of 178 heart transplant recipients had panel-reactive antibody (PRA) measurements at transplant and every 6 months thereafter and were monitored for rejection with protocol endomyocardial biopsies. Corticosteroid withdrawal was initiated at 6 months post-transplant in select patients. Results Patients successfully weaned off prednisone (SPW; n=103) had lower PRA compared to those maintained on prednisone (MP; n=51) at pretransplant (34% vs 63%), 6 months (18% vs 49%), 12 months (19% vs 51%), and 18 months (15% vs 47%) after transplant (P<.05). Among 68 patients in the SPW group who were nonsensitized at transplant, seven (10%) developed de novo PRA at 12 months, compared to four of 19 (21%) MP patients. Freedom from any treated rejection (97% vs 69% vs 67%), acute cellular rejection (100% vs 86% vs 71%), and antibody-mediated rejection (100% vs 88% vs 88%; all P≤.001) at 2 years was higher in SPW patients compared to MP patients and those who failed prednisone wean, respectively. Conclusion Few patients successfully weaned off prednisone after heart transplantation develop de novo circulating antibodies, and they are not at increased risk of rejection.
  • Higher Anti-A/B isoagglutinin titers of IgG class, but not of IgM, are associated with increased red blood cell transfusion requirements in bone marrow transplantation with major ABO-mismatch
    Background Major ABO mismatch between donor and recipient in bone marrow transplantation (BMT) may cause hemolysis, delayed red blood cell (RBC) engraftment, and pure red cell aplasia (PRCA), which result in increased transfusion needs. High pretransplant anti-A/B antibody titers have been associated with increased risk of PRCA. Herein, we studied the impact of anti-A/B titers on transfusion needs after BMT with major ABO mismatch. Methods We reviewed the medical charts of 27 patients who underwent BMT with major ABO mismatch and categorized them into two groups according to anti-A/B titers of IgG class (≤16 and ≥32). We recorded the number of RBC and platelet units transfused in the first 180 days after transplantation. We also evaluated the impact of anti-A/B titers on overall survival. Results Patients with anti-A/B IgG titer ≥32 required more RBC transfusions than patients with titer ≤16 (21.29±14.68 vs 6.60±4.55 units; P=.03). Anti-A/B of IgM class had no impact on either RBC or platelet transfusion needs. Anti-A/B titers had no impact on overall survival. Conclusion Higher titers of anti-A/B antibodies of IgG class, but not of IgM, are associated with a higher demand for RBC transfusion.
  • Prediction of nonalcoholic fatty liver in prospective liver donors
    Background Metabolic risk factors, in addition to imaging, may be important for predicting steatosis in prospective liver donors. Materials and methods The study group included all prospective liver donors who had a liver biopsy during workup. Risk factors for metabolic syndrome were analyzed, with body mass index (BMI) ≥25 kg/m2 used in place of waist circumference. Three BMI cutoffs (25, 28, and 30 kg/m2) and two CT-measured liver attenuation index (LAI) cutoffs (<5 and ≤10) were used to assess steatosis of ≥5%, ≥10%, and ≥20%. Results Of the 573 prospective donors (307 females), 282 (49.2%) had nonalcoholic fatty liver (NAFL). When donors with NAFL were compared with donors having normal histology, multivariate analysis showed BMI, ALT, triglycerides, and LAI to be significant predictors of NAFL. BMI ≥25 kg/m2 and LAI ≤10 were the better cutoffs. The presence of ≥2 metabolic risk factors had better sensitivity than CT-LAI for the presence of NAFL and for ≥20% steatosis (58% and 54% vs 47% and 22%, respectively, for CT-LAI ≤10). The presence of LAI >10 and <2 metabolic risk factors predicted <10% steatosis with 96% specificity and 92% positive predictive value. Conclusion The presence of ≥2 metabolic risk factors improves the sensitivity of CT-LAI for prediction of donor steatosis.
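The sensitivity, specificity, and positive predictive value quoted above follow directly from confusion-matrix counts. A minimal sketch; the counts here are hypothetical, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value (PPV)
    from true/false positive and negative counts."""
    sens = tp / (tp + fn)   # proportion of diseased correctly flagged
    spec = tn / (tn + fp)   # proportion of healthy correctly cleared
    ppv = tp / (tp + fp)    # proportion of positives that are true
    return sens, spec, ppv

# Hypothetical screening counts for predicting steatosis in donors
sens, spec, ppv = diagnostic_metrics(tp=58, fp=20, fn=42, tn=80)
```

Note that PPV, unlike sensitivity and specificity, depends on how common the condition is in the screened population, which matters when roughly half of prospective donors have NAFL.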
  • Ureteric complications in recipients of kidneys from donation after circulatory death donors
    A large increase in the use of kidneys from donation after circulatory death (DCD) donors prompted us to examine the impact of donor type on the incidence of ureteric complications (UCs; ureteric stenosis, urinary leak) after kidney transplantation. We studied 1072 consecutive kidney transplants (DCD n=494, live donor [LD] n=273, donation after brain death [DBD] n=305) performed during 2008-2014. Overall, there was a low incidence of UCs after kidney transplantation (3.5%). Despite a trend toward higher incidence of UCs in DCD (n=22, 4.5%) compared to LD (n=10, 3.7%) and DBD (n=5, 1.6%) kidney transplants, donor type was not a significant risk factor for UCs in multivariate analysis (DCD vs DBD HR: 2.33, 95% CI: 0.77-7.03, P=.13). There was no association between the incidence of UCs and donor, recipient, or transplant-related characteristics. Management involved surgical reconstruction in the majority of cases, with restenosis in 2.7% requiring re-operation. No grafts were lost secondary to UCs. Despite a significant increase in the number of kidney transplants from DCD donors, the incidence of UCs remains low. When ureteric complications do occur, they can be treated successfully with surgical reconstruction with no adverse effect on graft or patient survival.
  • Calcification score evaluation in patients listed for renal transplantation
    Based on native CT scans of the pelvic region and a standardized calcification score, iliac vascular calcification was evaluated between 2008 and 2012 prior to listing for renal transplantation in 205 patients with chronic kidney disease. Vascular calcification decreased from proximal to distal. The difference between the degree of calcification in the common iliac artery and in the external iliac artery was significant (P<.001). Risk factors for total iliac vascular calcification were age, smoking, sex, underlying renal disease, and diabetes. Multivariate analysis revealed age to be the most relevant risk factor (P<.001). The duration of hemodialysis correlated significantly with total iliac vascular calcification. Since the introduction of the standardized surgical evaluation protocol, no transplantation has had to be abandoned and no early graft loss due to calcification has occurred. Thus, careful scoring of vascular calcification prior to transplantation may be a valuable tool to support surgical decisions and to improve patient safety and outcomes in increasingly older transplant recipients.
  • Gender differences in long-term survival post-transplant: A single-institution analysis in the lung allocation score era
    To clarify the significance of recipient gender on lung transplant outcomes in a large single-institution experience spanning three decades, we analyzed data from all lung transplants performed at our institution since 1986. Kaplan-Meier curves and Cox proportional hazard models were used to evaluate the effect of recipient characteristics on survival and BOS score ≥1-free survival. Logistic regression analysis was used to explore the association of gender with short-term graft function. A total of 876 lung transplants were performed between 1986 and 2016. Kaplan-Meier survival estimates at 5 years post-transplant for females vs males in the LAS era were 71% vs 58%. In the LAS era, females showed greater unadjusted BOS≥1-free survival than males (35% vs 25%, P=.02) over 5 years. Female gender was the only factor in the LAS era significantly associated with improved adjusted 5-year survival [HR 0.56 (95% CI 0.33, 0.95) P=.03]. Conversely, in the pre-LAS era female gender was not associated with improved survival. Female recipients showed significantly improved survival over 5 years compared to males in the LAS era. A prospective analysis of biologic and immunologic differences is warranted.
  • Prognostic impact of postoperative low platelet count after liver transplantation
    Background A positive role for platelets in liver transplantation (LT) has recently been suggested. The aim of this study was to determine the risk factors for graft loss and mortality after LT, focusing on perioperative platelet counts. Methods We reviewed all deceased donor LT from 2000 to 2012 and enrolled 975 consecutive recipients. The risk factors for graft loss and mortality were analyzed by multivariate analysis, using a Cox regression model. Results Using cutoff values acquired by receiver operating characteristic curve analysis, multivariate analyses determined that viral hepatitis C (hazard ratio [HR]=1.32), donor age >40 (HR=1.33), higher peak serum alanine aminotransferase (HR=1.01), reoperation within 30 days (HR=1.51), and platelet count <72 500/μL on postoperative day (POD) 5 (HR=1.30) were independent risk factors for graft loss. Viral hepatitis C (HR=1.33), reoperation within 30 days (HR=1.35), and platelet count <72 500/μL on POD 5 (HR=1.38) were independent risk factors for mortality. Conclusion A low platelet count on POD 5 was associated with graft loss and mortality after LT. Platelet count <72 500/μL on POD 5 can be a predictor of poor graft and overall survival. Maintaining higher postoperative platelet counts could potentially improve graft and overall survival rates.
  • Single-center outcomes of combined heart and liver transplantation in the failing Fontan
    Long-term outcomes of the Fontan operation include Fontan failure and liver disease. Combined heart-liver transplantation (CHLT) is an option for select patients although limited data exist on this strategy. A retrospective review of Fontan patients 18 years or older referred for cardiac transplant evaluation between 2000 and 2013 at the Hospital of the University of Pennsylvania was performed. All patients were considered for potential CHLT. Clinical variables such as demographics, perioperative factors, and short-term outcomes were reviewed. Of 17 referrals for cardiac transplantation, seven Fontan patients underwent CHLT. All patients who underwent CHLT had either advanced fibrosis or cirrhosis on liver biopsy. There were no perioperative deaths. The most common postoperative morbidity was acute kidney injury. Short-term complications include one episode of acute liver rejection but no cardiac rejection greater than 1R. CHLT is an acceptable therapeutic option for patients with failing Fontan physiology who exhibit concomitant advanced liver fibrosis. However, optimal patient selection is currently undefined, and long-term outcomes are not known.
  • Pruritus and quality of life in renal transplant patients
    Background Pruritus has a negative impact on quality of life (QoL) in dialysis patients. The reversibility of this symptom after renal transplantation and its impact upon QoL have scarcely been studied in these patients. Methods Pruritus was evaluated by the Visual Analogue Scale (VAS), the Visual Rating Scale (VRS), and the Numerical Rating Scale (NRS) in 133 unselected renal transplant patients, 62 healthy subjects, and 29 hemodialysis patients. QoL was assessed by the KDQOL-SF™1.3. The reversibility of pruritus was studied by retrospectively applying the VRS. Results The prevalence of pruritus by the VRS was 62% in hemodialysis patients, 32% in renal transplant patients, and 11% in healthy subjects (P<.001). The prevalence of pruritus among transplant patients was 32% by VRS and 38% by VAS and NRS. The prevalence of pretransplantation pruritus (68%) by the VRS recall questionnaire was higher than the prevalence of pruritus in the same patients after renal transplantation (32%, P<.01). Pruritus in transplant patients was associated with important dimensions of QoL, including social, emotional, and working limitations (P<.05 for the three comparisons). Conclusions The prevalence of pruritus decreases markedly after renal transplantation but remains substantially higher than in the general population and affects quality of life in these patients.
  • Cervical human papillomavirus infection in the early postoperative period after liver transplantation: Prevalence, risk factors, and concordance with anal infections
    Solid organ transplant recipients are at increased risk of developing several human papillomavirus (HPV)-related malignancies, including cervical and anal cancers. The purpose of this prospective study was to assess the initial prevalence and risk factors for high-risk HPV (HR-HPV) cervical infections in liver transplant recipients, as well as their concordance with anal infections. A total of 50 female patients were enrolled in the Department of General, Transplant and Liver Surgery at the Medical University of Warsaw (center with >1600 liver transplantations). The initial prevalence of cervical HR-HPV infection was 10.0% (5/50). The only significant risk factor for cervical HR-HPV infection was ≥4 lifetime sexual partners (P=.037). Statistical tendencies toward higher prevalence of cervical HR-HPV infections were found for patients with hepatitis B virus (HBV, P=.082) and with model for end-stage liver disease (MELD) score ≤8 (P=.064). Cervical cytology was abnormal in 10 patients, including three with HR-HPV. Out of 12 patients with available data on anal HR-HPV, one had concordant HPV 16 infection. In conclusion, the initial prevalence of high-risk HPV infection is relatively low, except for patients with ≥4 previous sexual partners and potentially in those with HBV and/or low MELD score.
  • Evolution of body weight parameters up to 3 years after solid organ transplantation: The prospective Swiss Transplant Cohort Study
    Obesity and weight gain are serious concerns after solid organ transplantation (Tx); however, no unbiased comparison of body weight parameter evolution across organ groups has yet been performed. Using data from the prospective nationwide Swiss Transplant Cohort Study, we compared the evolution of weight parameters up to 3 years post-Tx in 1359 adult kidney (58.3%), liver (21.7%), lung (11.6%), and heart (8.4%) recipients transplanted between May 2008 and May 2012. Changes in mean weight and body mass index (BMI) category were compared to reference values from 6 months post-Tx. At 3 years post-Tx, compared to other organ groups, liver Tx recipients showed the greatest weight gain (mean 4.8±10.4 kg), 57.4% gained >5% body weight, and they had the highest incidence of obesity (38.1%). After 3 years, based on their BMI categories at 6 months, normal weight and obese liver Tx patients, as well as underweight kidney, lung, and heart Tx patients, had the highest weight gains. Judged against international Tx patient data, the majority of our Swiss Tx recipients experienced lower post-Tx weight gain. However, our findings show differences in weight gain patterns, both within and across organ Tx groups, that call for preventive measures.
  • Determinants of pre-transplantation pectoralis muscle area (PMA) and post-transplantation change in PMA in lung transplant recipients
    Background This study aimed to determine predictors of pectoralis muscle area (PMA) and assess change in PMA following lung transplantation and its relationship to outcomes. Methods A retrospective review of 88 lung transplant recipients at a single center was performed. PMA was determined on a single axial slice from chest computerized tomography. Pectoralis muscle index (PMI) was calculated from the PMA divided by the height squared. Results PMI decreased post-transplantation (8.1±2.8 cm2/m2 pre-transplantation, 7.5±2.9 cm2/m2 at 6 months, and 7.6±2.7 cm2/m2 at 12 months, P<.05). Chronic obstructive pulmonary disease (COPD) and interstitial lung disease (ILD) were predictors of pre-transplant PMI (β=−2.3, P=.001 for COPD; β=2.1, P<.001 for ILD) and percent change in PMI at 12 months post-transplantation relative to baseline (β=19.2, P=.04 for COPD; β=−20.1, P=.01 for ILD). Patients in the highest quartile for PMI change at 12 months had fewer ventilator days compared with patients in the other quartiles (P=.03). Conclusions Underlying diagnosis was a significant predictor of both pre-transplantation PMI and change in PMI post-transplantation. Further studies of PMI are needed to determine its clinical utility in predicting outcomes following lung transplantation.
  • BALF cytokines in different phenotypes of chronic lung allograft dysfunction in lung transplant patients
    The long-term success of lung transplantation (LT) is limited by chronic lung allograft dysfunction (CLAD). Different phenotypes of CLAD have been described, such as bronchiolitis obliterans syndrome (BOS) and restrictive allograft syndrome (RAS). The purpose of this study was to investigate the levels of cytokines and chemokines in bronchoalveolar lavage fluid (BALF) as markers of these CLAD phenotypes. BALF was collected from 51 recipients who underwent (bilateral and unilateral) LT. The study population was divided into three groups: stable (ST), BOS, and RAS. Levels of interleukin (IL)-4, IL-5, IL-6, IL-10, IL-13, tumor necrosis factor alpha (TNF-α), interferon-gamma (IFN-γ), and granulocyte-macrophage colony-stimulating factor (GM-CSF) were measured using multiplex technology. Median BALF neutrophilia was higher in BOS (38%) and RAS (30%) than in ST (8%) (P=.008; P=.012). Regarding BALF cytokines, BOS and RAS patients showed higher levels of IFN-γ than ST (P=.02; P=.008). Only IL-5 presented significant differences between BOS and RAS (P=.001). BALF neutrophilia is a marker for both CLAD phenotypes, BOS and RAS, and IL-5 seems to be a potential biomarker for the RAS phenotype.
  • The metabolic syndrome and its components in pediatric survivors of allogeneic hematopoietic stem cell transplantation
    Metabolic syndrome (MetS) is a known complication after hematopoietic stem cell transplantation (HSCT) that contributes to long-term morbidity. We assessed the prevalence of components of the MetS in pediatric survivors of allogeneic HSCT and identified associated risk factors. Thirty-eight patients (median age at HSCT, 8.5 years) were evaluated at a median of 3.9 years post-HSCT. Overweight or obesity was seen in 23.7% of the patients, 15.8% had hypertension, 15.8% had hypertriglyceridemia, and 13% had low high-density lipoprotein cholesterol levels according to age and gender. Four (10.5%) met the criteria for MetS; all had been transplanted for malignant disease. Twelve patients (31.6%) had at least one component of the MetS. The 5-year probability of developing components of the MetS revealed that patients with a BMI-Z score ≥0 at HSCT were at significantly higher risk than those with lower BMI-Z. Patients who developed components of the MetS had higher levels of insulin, homeostasis model assessment, uric acid, and leptin, and lower adiponectin levels. Multivariable regression analysis revealed that a BMI-Z score >1.036 at the time of evaluation was associated with a 4.3-fold increased risk (P=.050), and adiponectin levels ≤6 μg/mL with a 6.7-fold increased risk (P=.007), of developing components of the MetS. Overweight/obesity and adiponectin levels may be useful markers in HSCT survivors.
  • Reconditioning by end-ischemic hypothermic in-house machine perfusion: A promising strategy to improve outcome in expanded criteria donors kidney transplantation
    This clinical study evaluates end-ischemic hypothermic machine perfusion (eHMP) in expanded criteria donor (ECD) kidneys. eHMP was initiated upon arrival of the kidney in our center and continued until transplantation. Between 11/2011 and 8/2014, eHMP was performed in 66 ECD kidneys for 369 (98-912) minutes after 863 (364-1567) minutes of cold storage (CS). In 49 of 66 cases, the contralateral kidney from the same donor was preserved by static CS only and accepted by another Eurotransplant (ET) center. Five (10.2%) of these kidneys were ultimately judged as “not transplantable” by the accepting center and discarded. After exclusion of early unrelated graft losses, 43 kidney pairs from the same donor were eligible for direct comparison of eHMP vs CS only: primary non-function and delayed graft function (DGF) were 0% vs 9.3% (P=.04) and 11.6% vs 20.9% (P=.24). There was no statistically significant difference in 1-year graft survival (eHMP vs CS only: 97.7% vs 88.4%, P=.089). In a multivariate analysis, eHMP was an independent factor for prevention of DGF (OR: 0.28, P=.041). Development of DGF was the strongest risk factor for 1-year graft failure (OR: 38.2, P<.001). In summary, eHMP is a promising reconditioning technique to improve the quality and acceptance rate of suboptimal grafts.
  • Psychosocial aspects before and up to 2 years after heart or lung transplantation: Experience of patients and their next of kin
    Background Psychosocial factors are important for patients undergoing heart (HTx) or lung (LTx) transplantation and for their next of kin (NoK). Aim To describe health-related quality of life (HRQoL; patients only), anxiety, depression, stress, coping ability, and burden (NoK only) for patients and their NoK before and up to 2 years after HTx or LTx. Design Adult patients (28 hearts and 26 lungs) and their appointed NoK were surveyed with questionnaires about specific psychosocial topics when they were accepted for the transplantation waiting list and 6 months, 1 year, and 2 years after transplantation. Findings Patients’ coping ability and self-perceived health were low at baseline and improved over time after transplantation. However, lung patients took longer time to recover in terms of HRQoL, depression, and stress than heart patients. Similarly, NoK of lung patients experienced a higher burden and more stress 1 year after transplantation than NoK of heart patients. Conclusions Healthcare professionals should be aware of the psychosocial challenges patients and their NoK face in daily living and provide support both before and after heart or lung transplantation.
  • Why do patients die after a liver transplantation?
    Background As more patients achieve long-term survival, it has become important to understand mortality in liver transplantation (LT) recipients. Methods We conducted retrospective reviews of long-term outcome in two adult LT cohorts: 85 031 in the United Network for Organ Sharing (UNOS) database and 1458 transplanted at the University of Wisconsin (UW). Results During median follow-up of 3.2 years (UNOS) and 6.6 years (UW), 35.1% of UNOS patients and 44.2% of UW patients died; 43.1% of all UNOS deaths occurred in year 1 compared to 25.1% in the UW cohort. Deaths due to infection (other than viral hepatitis) or cardiovascular (CV) causes were most frequent in year 1 in both cohorts and then persisted at lower rates. In contrast, death from malignancy increased after year 1 to peak in years 1-5. Deaths due to rejection, hepatitis, or graft failure were infrequent. In the UW cohort, de novo malignancy was more common than recurrent tumor and correlated with smoking history. Conclusions A coordinated holistic approach that focuses on limiting immunosuppression, infection, risky behaviors, and CV risks, while screening for cancer, is needed to extend the healthy lives of LT recipients.
  • Good outcome of living donor liver transplantation in drug-induced acute liver failure: A single-center experience
    Introduction Drug-induced acute liver failure (ALF) is associated with high mortality. There is limited literature on the results of living donor liver transplantation (LDLT). Material and Methods The study was conducted at a tertiary care center in North India. All patients who received LDLT for drug-induced ALF were included. The data are shown as median (IQR). Results A total of 18 patients (15 females and three males), aged 34 (25-45) years, underwent LDLT for drug-induced liver injury (DILI)-related ALF. The etiology of ALF was antitubercular medications (n=14), orlistat (n=1), flutamide (n=1), and complementary alternative medications (n=2). The baseline parameters were as follows: bilirubin 17.7 (16.3-23.8) mg/dL, INR 3.3 (2.5-4.0), jaundice-encephalopathy interval 6 (3-17.5) days, arterial ammonia 109 (73-215) μmol/L, Model for End-Stage Liver Disease (MELD) score 24 (18-33), and grade of encephalopathy 2 (1-4), which progressed to grade 3 (3-4) before transplantation. All patients underwent right lobe LDLT; hospital stay was 17 (13-22) days, and ICU stay was 5 (5-7) days. Two patients died in the first month after liver transplantation due to sepsis and multi-organ failure; the rest of the patients are alive and doing well at a follow-up of 50 (4-82) months. Conclusion Good outcomes can be obtained with LDLT for drug-induced ALF.
  • Pre-liver transplant red cell distribution width predicts post-liver transplant mortality
    Purpose Prognostication following liver transplantation is limited. Red cell distribution width (RDW) has been associated with morbidity and mortality in a variety of diseases. We hypothesized that RDW is predictive of mortality after liver transplantation. Methods We performed a retrospective cohort study of all consecutive liver transplantation recipients at a tertiary care center from January 1, 2012 to December 31, 2012. The primary end point was the association of RDW with one-year mortality. Statistical analysis was performed using the Mann-Whitney test, independent samples t test, and regression analysis. Discrimination was assessed by calculating the area under receiver operating characteristic curves (AUC). A P-value <.05 was considered significant. Results RDW was positively associated with one-year mortality (P<.001). The mean difference for survivors compared to nonsurvivors was 3.9% (95% CI 1.9%-5.9%). The AUC for RDW was 0.831 (95% CI 0.727-0.935), compared to 0.723 (0.539-0.908) for total bilirubin and 0.704 (0.479-0.929) for the international normalized ratio. Conclusions To our knowledge, this is the first report of an association of RDW with post-LT mortality, and the results show the predictive value of pre-LT RDW for one-year mortality.
  • Nephrectomy-induced reduced renal function and the health-related quality of life of living kidney donors
    Objective To evaluate the health impact of nephrectomy on living kidney donors (LKDs) by comparing the health-related quality of life (HrQOL) scores measured by the Short Form-36 (SF-36) between those with and without postdonation renal function impairment (PRFI). Methods Eighty-two LKDs (47 females, mean age=50.2±11.2 years) were prospectively recruited to participate in an SF-36 HrQOL survey. Individual baseline and postoperative renal function (eGFR), obtained by chart review, was estimated using the Modification of Diet in Renal Disease formula. PRFI was defined as eGFR<60 mL/min/1.73 m2 or proteinuria. Mean SF-36 domain scores were compared between those with and without PRFI. Results After a median follow-up of 5.7 years, the prevalence of postdonation comorbidities was 29.3% (n=24) for PRFI, 25.6% (n=21) for hypertension, 6.1% (n=5) for diabetes, and 3.7% (n=3) for heart disease; no LKDs developed end-stage renal disease. Mean eGFR before and after donor nephrectomy was 95.5±23.4 and 71.0±17.3 mL/min/1.73 m2 (P<.01). Mean SF-36 scores of LKDs were not significantly different between those with and without PRFI in all domains (all P>.05). Similarly, the proportion of LKDs with PRFI did not differ significantly between patients with SF-36 domain scores above and below the published reference values. Conclusion Nephrectomy-induced PRFI may not have a significant impact on the HrQOL of an LKD population with a low proportion of other major comorbidities such as diabetes and ischemic heart disease.
  • Abdominal lean muscle is associated with lower mortality among kidney waitlist candidates
    Morphometric assessments, such as muscle density and body fat distribution, have emerged as strong predictors of cardiovascular risk and postoperative morbidity and mortality. To date, no study has examined morphometric mortality risk prediction among kidney transplant (KT) candidates. KT candidates, waitlisted 2008-2009, were identified (n=96) and followed to the earliest of transplant, death, or administrative end of study. Morphometric measures, including abdominal adipose tissue, paraspinous and psoas muscle composition, and aortic calcification, were measured from CTs. Risk of waitlist mortality was examined using Cox proportional hazard regression. On adjusted analyses, radiologic measures remained independently and significantly associated with lower waitlist mortality; the addition of radiologic measures significantly improved model predictive ability over models containing traditional risk factors alone (net reclassification index: 0.56, 95% CI: 0.31-0.75). Higher psoas muscle attenuation (indicative of leaner muscle) was associated with decreased risk of death (aHR: 0.93, 95% CI: 0.91-0.96, P<.001), and for each unit increase in lean paraspinous volume, there was an associated 2% decreased risk for death (aHR: 0.98, 95% CI: 0.96-0.99, P=.03). Radiologic measures of lean muscle mass, such as psoas muscle attenuation and paraspinous lean volume, may improve waitlist mortality risk prediction and candidate selection.
  • The AlloMap™ genomic biomarker story: 10 years after
    Over the last >20 years, we have co-developed the rationale for the first diagnostic and prognostic leukocyte gene expression profiling (GEP) biomarker test in transplantation medicine to gain US FDA regulatory clearance and international evidence-based medicine guideline acceptance for ruling out moderate/severe acute cellular cardiac allograft rejection (ACR) without invasive endomyocardial biopsies (EMB). Based on this test, a non-invasive clinical algorithm has been in use since 2005. After clinical implementation, this GEP-based monitoring, in direct comparison with an EMB-based strategy, was non-inferior with respect to detection of clinical rejection, defined as new-onset allograft dysfunction with/without histology of ACR, re-transplantation, or death, and at the same time improved patient satisfaction. Subsequently, we demonstrated the test's capacity, when used as a serial monitoring tool, to predict these clinical rejection events. In this Personal Viewpoint article, I discuss the various decision-making branching points in the AlloMap biomarker test development to inform future genomic biomarker test development projects.
  • Long-term health-related quality of life in living liver donors: A South Asian experience
    Aim The aim of this study was to evaluate long-term health-related quality of life (HRQOL), changes in lifestyle, and complications in living liver donors at a single transplant center from southern India. Methods A total of 64 consecutive living liver donors from 2008 to 2011 were evaluated; 46 of 64 donors completed the short form 36 (SF-36) via telephonic interviews or clinic consultations. Mean follow-up was 48 months (range: 37-84 months). Results There was no mortality in the donors evaluated. Overall morbidity was 23%, which included wound infections (4.3%), incisional hernia (2.1%), biliary leak (4.3%), and nonspecific complaints regarding the incision site (15.2%). All 46 donors who completed the SF-36 had no change in career path or predonation lifestyle. A total of 40 of 46 (87%) donors had no limitations, decrements, or disability in any domain, while six of 46 (13%) had these in some domains of which general health (GH) was most severely affected. Conclusions Living donor hepatectomy is safe with acceptable morbidity and excellent long-term HRQOL with no change in career path or significant alteration of lifestyle for donors.
  • Antibody depletion strategy for the treatment of suspected antibody-mediated rejection in lung transplant recipients: Does it work?
    Background Donor-specific antibodies (DSAs) after lung transplantation correlate with poor outcomes. The ideal treatment strategy for antibody-mediated rejection (AMR) is not defined. Our institution implemented an aggressive multimodality protocol for the treatment of suspected AMR. Methods Lung transplant recipients with suspected AMR were treated with a standardized protocol of plasma exchange, steroids, bortezomib, rituximab, and intravenous immune globulin. The primary outcome was DSA clearance at 6 months in those alive. Secondary endpoints included preserved allograft function at 6 months, survival at 6 and 12 months, and complications due to the protocol. Results Sixteen lung transplant recipients with documented DSA and allograft dysfunction were included in the analysis. Of the 16 patients, 11 survived to 6 months. Three of those 11 patients (27%) cleared all DSAs within 6 months of the protocol. Four of the 11 patients (36%) had preserved allograft function at 6 months. Overall 12-month patient survival was 56%. Complications included thrombocytopenia (50%) and abdominal pain or gastrointestinal discomfort (18.7%). Conclusions This multimodality protocol resulted in clearance of DSAs and preserved lung function in a minority of lung transplant recipients with suspected AMR surviving to 6 months after therapy. There were significant side effects of the protocol.
  • Outcomes in the highest panel reactive antibody recipients of deceased donor kidneys under the new kidney allocation system
    Since the institution of the new kidney allocation system in December 2014, kidney transplant candidates with the highest calculated panel reactive antibodies (cPRA) of 99-100 have been transplanted at much higher rates. However, concerns have been raised that outcomes in these patients might be impaired due to higher immunological risk and longer cold ischemia times resulting from long-distance sharing of kidneys. Here, we compare outcomes at the University of Wisconsin between study patients with cPRA 99-100 and all other recipients of deceased donor kidneys transplanted between 12/04/2014 and 12/31/2015. All patients had at least 6 months post-transplant follow-up. The mean follow-up was 13.9±3 months in cPRA ≥99% and 12.3±3.5 months in cPRA ≤98%. There was a total of 152 transplants, 25 study patients, and 127 controls. No statistically significant differences were found between the two groups in delayed graft function, rejection, kidney function, graft and patient survival, or infections. We conclude that transplanting the most highly sensitized patients with kidneys shared outside their local donation service areas is associated with excellent short-term outcomes that are comparable to controls.
  • Efficacy and safety of short-term treatment with isoniazid and rifampicin for latent tuberculosis infection in lung transplant candidates
    Background The current recommendation for the treatment of latent tuberculosis infection (LTBI) in solid organ transplant candidates is isoniazid for 9 months, but this regimen frequently extends into the posttransplant period. Methods This study assessed the efficacy and safety of a 3-month regimen of isoniazid and rifampicin (3HR) in lung transplant candidates at the Reina Sofía Hospital in Córdoba. Results Three hundred and ninety-eight lung transplant patients were evaluated. Ninety-two (24.9%) had LTBI, and only 22 received the 3HR treatment. One additional patient was treated because he had a history of previous incomplete treatment for active TB. None of the treated patients developed posttransplant tuberculosis, compared to three of the 62 patients with LTBI who were not treated (4.8%). Three patients could not complete the 3HR treatment (13%), but only two had adverse effects (8.7%). Conclusions Treatment of LTBI in lung transplant candidates using a short course of 3HR appears to be effective and safe in preventing posttransplant TB in lung transplant recipients.
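Several of the abstracts above quote sensitivity, specificity, and positive predictive value (for example, the donor steatosis and RDW studies). As a reminder of how such figures derive from a standard 2×2 confusion table, here is a minimal Python sketch; the counts used are purely hypothetical and are not data from any of the studies listed:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Compute diagnostic test metrics from a 2x2 confusion table.

    tp: true positives, fp: false positives,
    fn: false negatives, tn: true negatives.
    Returns (sensitivity, specificity, positive predictive value).
    """
    sensitivity = tp / (tp + fn)  # diseased patients correctly flagged
    specificity = tn / (tn + fp)  # healthy patients correctly cleared
    ppv = tp / (tp + fp)          # positive calls that are truly diseased
    return sensitivity, specificity, ppv


if __name__ == "__main__":
    # Hypothetical screening counts, for illustration only
    sens, spec, ppv = diagnostic_metrics(tp=29, fp=6, fn=21, tn=44)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")
```

Note that PPV, unlike sensitivity and specificity, depends on disease prevalence in the screened population, which is why the abstracts report it separately.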



Notice for patients:
This page contains urological information intended for healthcare professionals.
If you have a problem related to this condition,
consult your urologist or family doctor.
If you would like information designed for patients and the general public, you can visit:

Portal de Información Urológica para Pacientes



Carlos Tello Royloa


Last updated: 08-Apr-2013