The purpose of this study was to further evaluate this problem in a large, multi-institutional cohort with long follow-up. Methods: Using the UNOS STAR dataset, we reviewed 63,287 adult transplants from January 2002 through June 2013. Inclusion criteria were primary whole-organ deceased donor liver transplants in recipients 18 years or older from 2002 through June 2013 (MELD era). Exclusion criteria were recipient age less than 18 years; a concomitant diagnosis of hepatocellular carcinoma; multi-organ, living donor, and split liver recipients; and recipients of organs donated after cardiac death. Additional exclusions were made for missing data and implausible values for donor age, CIT, BMI, serum albumin, total bilirubin, and creatinine. Cardiovascular-related deaths included all deaths listed as "cardiac arrest", "myocardial infarction", "arrhythmia", "congestive failure", "arterial embolism", and "other cardiac". Using the primary, secondary, or tertiary diagnosis variable in the STAR dataset, recipients were grouped as NASH if they had a diagnosis of non-alcoholic steatohepatitis or cryptogenic cirrhosis, and as non-NASH if they had hepatitis C, alcoholic cirrhosis, primary biliary cirrhosis (PBC), primary sclerosing cholangitis (PSC), or autoimmune hepatitis (AIH). There were 5,289 NASH recipients identified and matched 1:2 with 10,080 non-NASH controls by gender, age at transplant (+/− 3 years), and MELD score (+/− 3). Conditional logistic regression was used to estimate odds ratios and 95% confidence intervals. All data were analyzed using SAS 9.3 (Cary, NC). Chi-square tests were used to compare categorical variables between the NASH and non-NASH groups, and t-tests were used for continuous variables. Results: The NASH group had a higher proportion of cardiovascular-related deaths than the non-NASH group (19% vs 14%, p = .0015); however, conditional logistic regression failed to confirm an increased odds of cardiovascular-related death for NASH recipients [OR 1.046, 95% CI: 0.628-1.742; p = .8623]. Additionally, when all causes of death were included, conditional logistic regression indicated that the odds of death were lower for NASH vs non-NASH recipients [OR 0.774, 95% CI: 0.708-0.864; p < .0001]. Conclusion: This analysis suggests that NASH patients do not have increased odds of cardiovascular-related death compared to their non-NASH counterparts. Disclosures: Satheesh Nair – Advisory Committees or Review Panels: Jansen; Grant/Research Support: Gilead; Speaking and Teaching: Bayer, Salix, Gilead. Sanjaya K. Satapathy – Advisory Committees or Review Panels: Gilead. The following people have nothing to disclose: Emily H. Wong, Jason Vanatta, Donna Hathaway, Elizabeth A. Tolley, James Eason.

Background/aims: Data on liver disease progression in HIV mono-infection without viral hepatitis are scarce.
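The crude proportion comparison in the NASH analysis above (19% vs 14% cardiovascular deaths) can be contrasted with its confidence-interval arithmetic using a standard 2×2-table calculation. The sketch below uses the unadjusted Woolf (logit) method with hypothetical counts chosen only to mimic that split; it is not the study's matched conditional logistic regression or its actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (logit) 95% CI for a
    2x2 table: a/b = cases/non-cases in the exposed group,
    c/d = cases/non-cases in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for illustration: 100/526 (19%) cardiovascular
# deaths in one group vs 141/1007 (14%) in the other.
or_, lower, upper = odds_ratio_ci(100, 426, 141, 866)
print(f"OR = {or_:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```

A crude OR like this can look significant even when a matched conditional estimate does not, which is the pattern the abstract reports.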

This new entity raises the question of a novel autonomic dysfunction in short-lasting unilateral neuralgiform headaches with cranial autonomic symptoms or an unexpected presentation of migraine. “
“Recent research has uncovered associations between migraine and experiencing traumatic events, the latter of which in some cases eventuates in the development of posttraumatic stress disorder (PTSD). However, existing studies have not attempted to explore the relative associations with migraine of experiencing trauma vs suffering from PTSD. The aim of this cross-sectional study was to assess the utility of trauma exposure vs PTSD in predicting migraine status and headache frequency, severity, and disability. One thousand fifty-one young adults (mean age = 18.9 years [SD = 1.4]; 63.1% female; 20.6% non-Caucasian) without secondary causes of headache provided data from measures of headache symptomatology and disability, trauma and PTSD symptomatology, and depression and anxiety. Three hundred met diagnostic criteria for migraine and were compared on trauma exposure and PTSD prevalence with 751 participants without migraine. Seven hundred twenty-eight participants (69.3%) reported experiencing at least 1 traumatic event consistent with Criterion A for PTSD, of whom 184 also met diagnostic criteria for PTSD. Migraineurs were almost twice as likely as controls to meet criteria for PTSD (25.7% vs 14.2%, P < .0001) and reported a higher number of traumatic event types that happened to them personally (3.0 vs 2.4, P < .0001). However, experiencing a Criterion A event only was not a significant predictor of migraine either alone (odds ratio [OR] = 1.17, P = nonsignificant) or after adjustment for covariates. By comparison, the OR of migraine for those with a PTSD diagnosis (vs no Criterion A event) was 2.30 (P < .0001), which remained significant after controlling for relevant covariates (OR = 1.75, P = .009). When using continuous variables of trauma and PTSD symptomatology, PTSD was again most strongly associated with migraine. Numerous sensitivity analyses confirmed these findings. PTSD symptomatology, but not the number of traumas, was modestly but significantly associated with headache frequency, severity, and disability in univariate analyses. Consistently across analyses, PTSD was a robust predictor of migraine, whereas trauma exposure alone was not. These data support the notion that it is not exposure to trauma itself that is principally associated with migraine, but rather the development and severity of PTSD symptoms resulting from such exposure. “
“(Headache 2010;50:231-241) Objectives.— A population-based cross-sectional study was conducted to estimate the prevalence of migraine, episodic tension-type headaches (ETTH), and chronic daily headaches (CDH), as well as the presence of symptoms of temporomandibular disorders (TMD) in the adult population. Background.

The greater increase in descent speed (57%) vs. ascent speed (31%) following disentanglement likely highlights the effects of both drag and buoyancy related to the entangling gear and buoys. In order to dive to depth, an individual must overcome resistive buoyant forces. More active swimming is thus required on descent, while ascents can be passive (Nowacek et al. 2001). Such buoyant effects are also evident in dive shape. The overall depth- and duration-normalized dive area (DAR) was significantly lower while entangled. Dive descents to, and ascents from, maximum depth were more gradual, and less time was spent in the bottom phase of the dive while the animal was entangled as compared with its behavior following disentanglement. Given that the added buoys were further from the whale than the water column was deep, the buoys should never have been submerged to provide an upward buoyant force that Eg 3911 could take advantage of to conserve energy in diving (Nowacek et al. 2001). Glides occurred in all phases of the dive cycle, indicating that passive swimming was not timed to take advantage of changes in buoyancy by gliding on ascent while entangled. The emaciated condition of Eg 3911 may have led to negative buoyancy, as has been found in emaciated bottlenose dolphins (Dunkin et al. 2010), and dive depths were much shallower than the predicted depth of lung collapse in cetaceans (30–235 m) (Fahlman 2008). It is thus likely that glides were employed to conserve energy (Videler and Weihs 1982, Williams 2001) rather than to optimize the benefits of buoyancy. ODBA has been shown to be a reliable estimator of activity and metabolic rate in free-swimming animals (Fahlman et al. 2008). It was thus expected that ODBA would be greater under the entangled condition; however, ODBA was often lower while entangled compared to after disentanglement. We suggest that restraint by the drag and buoyancy of the gear may have reduced Eg 3911's ability to make large dynamic movements. Accelerometer measurements determine only the movement of the animal (i.e., net movement) and the forces associated with it, but not the forces required to move against any materials that may be restraining movement (i.e., total exertion). Consider a running parachute: the runner expends considerably more energy with the parachute, though their motion is more limited and slower than without the apparatus. The application of ODBA to free-swimming and restrained cases likely requires separate metabolic calibrations for each condition, which are not available for entangled large whales at this time. Together, the effects of added buoyancy, added drag, and reduced swimming speed due to towing accessory gear pose many threats to entangled whales. If buoyancy overwhelms an animal's ability to descend to the depth of its preferred prey, its foraging ability may be significantly compromised, accelerating the transition to a negative energy balance.
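ODBA (overall dynamic body acceleration), discussed above as an activity proxy, is conventionally computed by smoothing each accelerometer axis to estimate the static (gravitational) component and summing the absolute residuals across axes. A minimal NumPy sketch follows; the 50 Hz sampling rate and 3 s running-mean window are illustrative assumptions, not values from this study:

```python
import numpy as np

def odba(acc, fs=50, window_s=3.0):
    """Overall Dynamic Body Acceleration for a tri-axial trace
    `acc` (N x 3, in g). The static component is estimated per
    axis with a running mean over `window_s` seconds; ODBA is
    the sum of absolute dynamic residuals across the three axes."""
    win = max(1, int(fs * window_s))
    kernel = np.ones(win) / win
    static = np.column_stack(
        [np.convolve(acc[:, i], kernel, mode="same") for i in range(3)]
    )
    dynamic = acc - static
    return np.abs(dynamic).sum(axis=1)

# Synthetic example: gravity on z plus a slow stroke-like oscillation on x.
fs = 50
t = np.arange(0, 10, 1 / fs)
acc = np.column_stack([0.1 * np.sin(2 * np.pi * 0.5 * t),
                       np.zeros_like(t),
                       np.ones_like(t)])
vals = odba(acc, fs)
```

Note this measures only net body movement, which is exactly why, as argued above, it can underestimate exertion when an animal strains against restraining gear.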

Under informed consent, 18 cases switched to Peg-IFN α-2a plus adefovir, 5 continued therapy with telbivudine, 11 refused to receive treatment, and 10 followed the suggestion to stop treatment. Treatment durations ranged from 48 to 96 weeks. Results: Among the 18 cases switched to Peg-IFN α-2a plus adefovir, 15 (83.3%) achieved complete virological response (HBV DNA <20 IU/ml), 9 (50%) achieved HBeAg clearance or seroconversion with HBV DNA <20 IU/ml, 5 (27.8%) achieved HBsAg clearance (HBsAg <0.05 IU/ml) or seroconversion with HBeAg loss and HBV DNA <20 IU/ml, 1 did not achieve virological response, and 2 were lost to follow-up. Three patients who achieved HBsAg seroconversion had stopped antiviral therapy, with HBsAb titers persistently greater than 300 IU/L. Conclusions: Based on postpartum ALT elevation, decrease of HBeAg titer, and reduction of HBV DNA, switching to an IFN-based regimen may achieve high rates of response after childbirth in pregnant HBV carriers who received nucleoside analogs for PTMCT. The mechanism may be associated with recovery of immune function after childbirth and enhancement of immune function by antiviral therapy in the third trimester of pregnancy. Disclosures: The following people have nothing to disclose: Junfeng Lu, Xinyue Chen, Yali Liu, Hongwei Zhang, Lina Ma, Hua Zhang

Switch in HBV genotypes is well known on interferon therapy. Whether prolonged tenofovir therapy introduces genotype switches in patients (pts) during treatment is not known. Eighty-eight chronic hepatitis B pts with raised ALT (>1.5 × ULN) and histologically proven chronic hepatitis, receiving tenofovir as per protocol, were followed up every 6 months. HBV genotypes were studied if pts had at least two follow-ups (n=67). Of these pts, 17 were treatment exposed (lamivudine 12, adefovir 3, lamivudine + adefovir 2). HBV polymerase/surface region sequences from the 67 pts were subjected to phylogenetic and recombination analysis, and inter-genotype switches (change of genotype), intra-genotype switches (change of subgenotype), and recombination events were determined in follow-up samples and compared to baseline. The majority of pts were male (n=58). Median HBV DNA was 3.1×10⁶ IU/ml. HBeAg was +ve in 38 (56.7%). At baseline, HBV genotypes were: A 14 (20.8%), D 52 (77.6%), and C 1 (1.5%), with predominance of sub-genotypes A1, C1, and D1. Twenty-two of 67 (32.8%) pts experienced inter-genotype switches: 6 pts at 6 months, 7 at 1 year, 3 at 1.5 years, 4 at 2 years, and 2 at 2.5 years. The genotype switch was more often detected from A to D [13 (92.8%), baseline HBeAg +ve: 9; HBeAg -ve: 4] as compared to D to A [9 (17.3%), baseline HBeAg +ve: 2; HBeAg -ve: 7] (P < .002). Of the 22 pts, in 3 a change in HBeAg coincided with the genotype switch, i.e., [HBeAg +ve, genotype A converted to HBeAg -ve, genotype D in one], [HBeAg -ve, genotype D reverted to HBeAg +ve, genotype A in 2 pts].

BCLC, Barcelona Clinic Liver Cancer; HCC, hepatocellular carcinoma; HR, hazard ratio; LT, liver transplantation; MC, Milan criteria; MELD, model for end-stage liver disease; NHB, net health benefit; NM, nonmalignant; QALDs, quality-adjusted life days; RCTs, randomized clinical trials; TACE, transarterial chemoembolization; WL, waiting list; WTP, willingness to pay.

The study focused on HCC candidates for LT meeting the MC (Fig. 1). As a reference case, our model considered a patient with compensated cirrhosis14 and a T2 tumor,15 i.e., one nodule 2-5 cm or 2-3 nodules ≤3 cm. The effect of a generic neoadjuvant therapy on time to progression was expressed in our model in terms of HR, as in recent RCTs.12, 13 In the particular context of the WL before LT, we considered this HR value as a linear factor for correcting the conventional dropout probability (DP) of HCC patients awaiting LT. Thus, for example, if the monthly conventional DP for HCC patients was 4% and the treatment HR was 0.50, then their treatment-modified dropout probability (SDP) became 4% * 0.50 = 2%, according to the formula SDP = HR * DP. Although there are no robust studies measuring the efficacy of locoregional therapies in terms of reducing the risk of dropout, because we know the exact HR of sorafenib in extending the time to progression of HCC, the aim of this study was to compare two strategies: one using sorafenib as a neoadjuvant therapy before LT (Strategy A), and one with no bridging therapies (Strategy B). In current clinical practice, however, patients likely to have to wait some time and not given priority are treated almost everywhere. For this reason our model also included a specific sensitivity analysis considering the potential introduction of locoregional therapies in Strategy B patients when their median time on the WL exceeded 6 months. Starting from these assumptions, we considered four endpoints to quantify the potential benefits of sorafenib neoadjuvant therapy: (1) Gain in transplant probability. The main assumption of this study is that, by delaying tumor progression, sorafenib could decrease dropout from the transplant WL and thus increase the number of patients able to be transplanted. We constructed a Markov model, which examines the decision whether or not to use sorafenib as neoadjuvant therapy before LT. We hypothesized that therapy with sorafenib started at the time of listing. Moreover, as in the SHARP trial and in Italian clinical practice, therapy with sorafenib was stopped once patients had tumor progression. Thus, Strategy A had the potential benefits of sorafenib therapy only during the WL (and not after dropout), whereas Strategy B benefited from sorafenib therapy only after dropout from the WL in patients with advanced HCC and compensated cirrhosis. Moreover, our model takes into account the risk of decompensation in patients with compensated cirrhosis.
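The SDP = HR * DP correction described above compounds month by month over a patient's time on the waiting list. The sketch below (with a hypothetical helper name) illustrates that compounding; it is a deliberate simplification of the full Markov model, which also tracks decompensation and transplantation:

```python
def waitlist_retention(dp_monthly, hr, months):
    """Probability of still being on the waiting list (no dropout)
    after `months` months, untreated vs. treated, where therapy
    scales the monthly dropout probability linearly: SDP = HR * DP."""
    sdp = hr * dp_monthly          # treatment-modified dropout probability
    p_untreated = (1 - dp_monthly) ** months
    p_treated = (1 - sdp) ** months
    return p_untreated, p_treated

# Worked example from the text: DP = 4%/month, HR = 0.50 -> SDP = 2%/month.
u, t = waitlist_retention(0.04, 0.50, 12)
```

Even a modest HR produces a sizeable gap in one-year retention, which is why dropout reduction translates directly into gain in transplant probability.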

PK assessments of turoctocog alfa and the patients’ previous FVIII product were performed in 28 patients. Mean exposure to turoctocog alfa was 60 exposure days per patient, corresponding to approximately 4.5 months in the trial. None of the patients developed inhibitors (≥0.6 BU) and no safety concerns were raised. A total of 120 bleeding episodes (95%) were controlled with 1–2 infusions of turoctocog alfa. Based on patient reports, the success rate (defined as ‘excellent’ or ‘good’ haemostatic response) for treatment of bleeding episodes was 92%. Overall, the median annualized bleeding rate was 3.0 (interquartile range: 8.5) bleeds per patient per year. PK parameters were comparable between the two age groups. In conclusion, the present large global clinical trial showed that turoctocog alfa was safe, effective in the treatment of bleeding episodes, and had a prophylactic effect in paediatric patients. “
“The aim of this study was to evaluate the effect of haemophilia disease severity and potential intermediaries on body mass index (BMI) in patients with haemophilia. A secondary analysis of a cross-sectional study of 88 adults with haemophilia was undertaken. On bivariate analysis, persons with severe haemophilia had 9.8% lower BMI (95% CI −17.1, −3.0) than persons with non-severe haemophilia. The effect of haemophilia severity on BMI varied significantly by human immunodeficiency virus (HIV) status. Among HIV-positive subjects, haemophilia severity was not associated with BMI (+5.0%, 95% CI −22.4, 41.9). Among HIV-negative subjects, severe haemophilia was associated with 15.1% lower BMI (95% CI −23.6, −5.7). Older (>41 years) HIV-negative subjects with severe haemophilia had a BMI that was 24.8% lower (95% CI −39.1, −7.0) than those with non-severe haemophilia. No statistically significant association was detected between BMI and severe vs. non-severe haemophilia for younger HIV-negative subjects. Although joint disease, as measured by the World Federation of Hemophilia (WFH) joint score, did not influence the association between haemophilia disease severity and BMI, adjustment for the atrophy component of the WFH score reduced the association between haemophilia severity and BMI by 39.1–69.9%. This suggested that muscle atrophy mediated at least part of the relationship between haemophilia severity and BMI. Haemophilia disease severity is associated with BMI and appears to be mediated by muscle atrophy of surrounding joints. This association appears to be modified by HIV status and possibly age. “
“Summary.  Several genes that modify risk of factor VIII (FVIII) inhibitors in haemophilia A patients have been identified.

The published experience of inhibitors in previously treated patients (PTPs) informs the number of new inhibitors per cohort that is acceptable in a clinical trial. However, a single acceptable limit of new inhibitors fails to recognize the heterogeneity of inhibitors and their variable impact on clinical care. This review will discuss the published literature on the epidemiology and clinical characteristics of inhibitors and possible risk factors for their formation in PTPs. As factor products containing novel expressions of the factor VIII (FVIII) gene are developed, a major concern is increased antigenicity leading to an anti-FVIII inhibitory antibody response. Accordingly, assessment of the risk of inhibitor formation is a major focus of the clinical development of novel FVIII products. In 1999, the International Society on Thrombosis and Haemostasis Scientific Subcommittee recommended the focused enrolment of previously treated patients (PTPs), defined as >150 lifetime exposure days, in initial clinical studies evaluating novel FVIII products [1]. This recommendation is based on the observed low rate of inhibitor formation after extensive exposure to FVIII, which facilitates detection of inhibitor induction by the new factor product, presumably resulting from exposure of neo-epitopes on the novel FVIII product under investigation. Although the rate of new inhibitor formation after >150 exposure days is small, it is not zero; thus, knowledge of the baseline rate of inhibitor formation in the PTP population is necessary to determine the upper acceptable limit of inhibitor development in clinical studies. Also important to this discussion is the clinical impact of new inhibitors in PTPs. Inhibitors that are limited in duration and do not require a change in the therapeutic approach to bleeding are the least clinically relevant, whereas those that are high responding, persistent, and increase the propensity to bleed are the most troublesome. This report reviews what is known about inhibitor formation in patients who have previously received FVIII. Despite the definition of PTP in 1999, the term has been used to represent patients with a variety of prior exposures to FVIII concentrates, ranging from a single exposure day to >250 days of exposure. A lack of standardization of the term PTP has led to many varied reports of the incidence of inhibitor formation in this population. Several reports have evaluated cohorts of patients switched from one product to another. Three such studies have identified markedly increased rates of inhibitor formation in PTPs.

3D). However, DC activation of antigen-restricted CD8+ T cells was unchanged in NASH. In particular, peptide-pulsed control and NASH DCs induced comparable antigen-restricted CD8+ T-cell proliferation (Supporting Fig. 3E) and cytokine production (Supporting Fig. 3F). Similarly, the antigen-specific lytic capacity of hepatic CD8+ T cells against Ova-expressing targets was equivalent after in vivo adoptive transfer immunization using Ova-pulsed control or NASH DCs (Supporting Fig. 3G). Taken together, these data suggest that, in NASH, hepatic DCs gain enhanced capacity to activate CD4+ T cells, but not CD8+ T cells. Because DCs expand, mature, and gain enhanced capacity to produce inflammatory mediators in NASH, we postulated that DCs may contribute to exacerbation of disease. To test this, we employed BM chimeric CD11c.DTR mice in which continuous DC depletion could be accomplished in vivo (Fig. 3A and Supporting Fig. 4). Control mice were made chimeric using BM from WT mice. Surprisingly, ablation of DC populations, rather than mitigating hepatic insult, worsened disease. In particular, NASH(-DC) (NASH with depletion of DCs) mice experienced more precipitous weight loss compared with NASH mice with intact DC populations (Supporting Fig. 5A). Furthermore, DC depletion in NASH resulted in a larger intrahepatic inflammatory cell infiltrate compared to controls (Fig. 3B). In addition, analysis of cytokines produced by liver NPCs revealed that DC depletion resulted in increased NPC production of numerous cytokines linked to hepatic injury in NASH, including TNF-α, IL-6, and IL-1β (Fig. 3C), as well as chemokines critical for hepatic leukocyte recruitment, including macrophage inflammatory protein 1 alpha (MIP-1α) and granulocyte colony-stimulating factor (G-CSF) (Fig. 3D). Conversely, IL-10, a regulatory cytokine, showed decreased expression in NASH liver in the context of DC depletion (Fig. 3E). ALT levels were similarly elevated in NASH and NASH(-DC) liver (Supporting Fig. 5B). DC depletion did not alter hepatic NPC composition (Supporting Fig. 6A-E) or production of inflammatory mediators (Supporting Fig. 6F) in mice on a control diet. DC depletion similarly had no effect on NPC composition in LPS-treated mice on a normal diet (Supporting Fig. 7). Intrahepatic inflammation has a reciprocal pathogenic relationship with cellular apoptosis in NASH liver.[16] Consistent with elevated intrahepatic inflammation, NASH(-DC) liver exhibited an increased presence of apoptotic bodies (Fig. 4A). Accordingly, expression of PAR4, a marker of apoptosis, was increased in NASH liver in the context of DC depletion (Fig. 4B). Cleaved caspase-3 was also more prevalent in NASH(-DC) liver compared to controls (Fig. 4C).