Categories
Nevin Manimala Statistics

Delgocitinib Cream Reduces Itch and Pain in Moderate to Severe Chronic Hand Eczema: Phase 3 DELTA 1 and 2 Pooled Analyses

Dermatol Ther (Heidelb). 2025 Dec 24. doi: 10.1007/s13555-025-01611-y. Online ahead of print.

ABSTRACT

INTRODUCTION: Itch and pain are two of the most common and burdensome symptoms of moderate to severe chronic hand eczema (CHE). Here, we assess changes in itch/pain in patients with moderate to severe CHE treated with delgocitinib cream 20 mg/g or cream vehicle for 16 weeks.

METHODS: In a pooled DELTA 1 (NCT04871711)/DELTA 2 (NCT04872101) analysis (delgocitinib [n = 639]; cream vehicle [n = 321]; twice-daily), the Hand Eczema Symptom Diary captured patient-reported itch/pain severity on a numeric rating scale. Changes in itch/pain from baseline were assessed daily during week 1 and weekly from week 1 to 16.

RESULTS: In delgocitinib-treated patients, a statistically significant least squares mean reduction from baseline was observed for itch from day 1 ([delgocitinib cream/cream vehicle] 0.75/0.32; P < 0.001) and pain from day 3 (0.98/0.58; P = 0.001) after the first application. Among patients with ≥ 4-point baseline itch/pain scores, a significantly greater percentage of delgocitinib-treated patients achieved a ≥ 4-point reduction from week 2 (14.2%/17.3%) versus cream vehicle (6.3%/6.9%; P < 0.001). Reductions were maintained up to week 16 with delgocitinib cream treatment. Delgocitinib cream was well tolerated.
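The week-2 responder comparison above can be sanity-checked with a two-proportion z-test. A minimal Python sketch, using the full randomized arm sizes (639/321) as stand-in denominators since the abstract does not report the sizes of the ≥ 4-point-baseline subgroups:

```python
import math

# Reported week-2 itch responder proportions (>= 4-point reduction).
# Denominators are the full trial arms -- an assumption; the true analysis
# population is the subgroup with >= 4-point baseline scores.
p1, n1 = 0.142, 639   # delgocitinib cream
p2, n2 = 0.063, 321   # cream vehicle

pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_two_sided = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal p-value
```

Even with these approximate denominators, z comes out near 3.6 with p well below 0.001, consistent with the reported significance.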

CONCLUSION: Early onset of itch/pain reduction was observed within week 1 for delgocitinib-treated patients, thereby providing further support of the use and efficacy of delgocitinib cream in adults with moderate to severe CHE.

PMID:41442013 | DOI:10.1007/s13555-025-01611-y


Transdermal Delivery of Poly-L-Lactic Acid via Fractional Microneedle Radiofrequency for Atrophic Acne Scars: A Split-Face Randomized Study in Fitzpatrick Skin Types III to V

Dermatol Ther (Heidelb). 2025 Dec 24. doi: 10.1007/s13555-025-01626-5. Online ahead of print.

ABSTRACT

INTRODUCTION: Fractional microneedle radiofrequency (FMRF) and poly-L-lactic acid (PLLA) each promote dermal remodeling through distinct mechanisms and have demonstrated efficacy as monotherapies for atrophic acne scars (AAS). The objective of this study is to evaluate the efficacy and safety of combining FMRF with transdermal PLLA delivery compared with sterile water in Asian patients with moderate-to-severe AAS.

METHODS: In this randomized, split-face, evaluator-blinded clinical trial, 24 participants underwent two monthly FMRF sessions. Immediately after each session, a reconstituted PLLA suspension was applied to one facial half for transdermal delivery through the FMRF-created microchannels, while sterile water was applied to the contralateral side. Outcomes were assessed using three-dimensional imaging (Antera® 3D), standardized photography, and patient self-assessments over a 6-month follow-up. Safety was monitored throughout the study.

RESULTS: PLLA-treated sides demonstrated statistically significant improvements in skin texture and scar volume at 6 months compared with baseline and with control sides (p < 0.05). Patient-reported outcomes paralleled objective findings, with a higher proportion of participants reporting > 75% improvement on the PLLA-treated side. Adverse events were infrequent, transient, and self-limited, and no serious complications occurred.

CONCLUSIONS: Combining FMRF with transdermal PLLA delivery is a safe and effective approach for moderate-to-severe AAS in Asian patients. The combination produced progressive, sustained, and clinically meaningful improvements compared with FMRF alone.

TRIAL REGISTRATION: Thai Clinical Trials Registry: TCTR20250803007.

PMID:41442012 | DOI:10.1007/s13555-025-01626-5


Effect of deep learning reconstruction on arm-induced artifacts compared with hybrid iterative reconstruction and filtered-backprojection in abdominal CT

Radiol Phys Technol. 2025 Dec 24. doi: 10.1007/s12194-025-00998-9. Online ahead of print.

ABSTRACT

Abdominal computed tomography (CT) is normally performed with the patient's arms raised above the abdominal region to prevent arm-induced artifacts that degrade image quality. This study aimed to evaluate the effects of deep learning-based image reconstruction (DLIR) on arm-induced artifacts and image quality in abdominal CT with arms-down positioning, compared with adaptive statistical iterative reconstruction-Veo (ASIR-V) and filtered-backprojection (FBP). A liver nodule phantom with arms from a PBU-60 phantom was scanned in three arms-down positions: alongside the torso, across the abdomen, and crossed over the pelvis. Abdominal CT images of 10 patients in the arms-alongside-torso position were also included. Images were reconstructed using DLIR (L-low, M-medium, and H-high), ASIR-V (50% and 100%), and FBP. Phantom images were assessed for artifact strength (location parameter of the Gumbel distribution and standard deviation [SD]), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Two radiologists qualitatively evaluated patient images for noise, artifacts, sharpness, and overall quality. DLIR-H significantly reduced streak artifacts by 37% in the location parameter and by 43% in SD, while improving SNR by 28% and CNR by 29% compared with ASIR-V50%. DLIR-M performed significantly better than ASIR-V50% on all quantitative metrics, except in the arms-alongside-torso position. FBP performed worst, although its sharpness was comparable. DLIR-H received the best qualitative scores (low noise and artifacts, minimal blurring, and excellent overall image quality), although ASIR-V100% had lower subjective noise. DLIR outperformed ASIR-V and FBP in arm-induced artifact reduction and image quality and is a preferable reconstruction method for arms-down abdominal CT.
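The artifact-strength metric above is the location parameter of a Gumbel distribution fitted to the phantom measurements. A hedged Python sketch of one simple fitting route, method of moments on synthetic samples (the study's actual fitting procedure and parameter values are not given in the abstract, so the numbers here are invented):

```python
import math
import random
import statistics

random.seed(0)
mu_true, beta_true = 40.0, 5.0   # hypothetical location/scale on a CT-number scale

# Draw Gumbel(max) samples via the inverse CDF: x = mu - beta * ln(-ln(u)).
samples = [mu_true - beta_true * math.log(-math.log(random.random()))
           for _ in range(20000)]

# Method-of-moments estimates: beta = s * sqrt(6) / pi, mu = mean - gamma * beta.
EULER_GAMMA = 0.5772156649015329
beta_hat = statistics.stdev(samples) * math.sqrt(6) / math.pi
mu_hat = statistics.fmean(samples) - EULER_GAMMA * beta_hat
```

A lower fitted location parameter corresponds to weaker extreme streak values, which is the sense in which the 37% reduction for DLIR-H is expressed.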

PMID:41442007 | DOI:10.1007/s12194-025-00998-9


Association Between Mixed Exposure to Endocrine-Disrupting Chemicals and Cardiovascular Health: Results from the 2003-2016 NHANES

Cardiovasc Toxicol. 2025 Dec 24;26(1):7. doi: 10.1007/s12012-025-10084-6.

ABSTRACT

Accumulating evidence supports an association between endocrine-disrupting chemical (EDC) exposure and cardiovascular disease (CVD). However, the link between EDCs and cardiovascular health (CVH) prior to CVD onset remains unclear. This study investigates the relationship between individual and combined EDC exposure and Life's Essential 8 (LE8). We included 9,940 participants from the National Health and Nutrition Examination Survey (NHANES) conducted between 2003 and 2016, excluding adults with known CVD. Twenty-two EDCs were measured in urine samples, including three phenols, two phenolic pesticides, eleven phthalates, and six polycyclic aromatic hydrocarbons (PAHs). Weighted generalized linear models (GLM) and weighted quantile sum (WQS) regression were used to explore the relationship between single and mixed EDC exposure and CVH. Overall, 9,940 individuals (weighted mean [SE] age, 42.53 [0.26] years; 5,313 women [weighted 53.7%]) without CVD were included, with a mean LE8 score of 68.70. The GLM models revealed that specific EDC exposures were inversely associated with LE8, acting as independent risk factors for poorer CVH. The WQS index of EDCs was independently associated with overall CVH, with an adjusted odds ratio (OR) of 3.00 (95% confidence interval [CI]: 2.30-3.90; P < 0.001). 2-Fluorenone (2-FLU) was the most heavily weighted component in the overall CVH model. This study shows that mixed EDC exposure is associated with higher odds of poor CVH among American adults, with 2-FLU as a prominent contributor, providing epidemiologic evidence for the detrimental effects of these chemicals on CVH.
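The reported WQS odds ratio and confidence interval can be checked for internal consistency: on the log scale the CI should be roughly symmetric around the OR, and the implied z statistic should agree with P < 0.001. A small Python sketch of that back-calculation:

```python
import math

or_est, ci_lo, ci_hi = 3.00, 2.30, 3.90   # reported adjusted OR and 95% CI

beta = math.log(or_est)
se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)  # back out the SE
z = beta / se
p = math.erfc(abs(z) / math.sqrt(2))                   # two-sided p-value

# On the log scale the interval should be centred on the estimate, i.e. the
# geometric mean of the CI limits should approximately equal the OR.
geo_mean = math.sqrt(ci_lo * ci_hi)
```

Here z is about 8.2, far beyond the P < 0.001 threshold, and the geometric mean of the limits matches the OR to two decimals, so the reported figures are self-consistent.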

PMID:41442004 | DOI:10.1007/s12012-025-10084-6


Integrated assessment of total airway count and pneumonia volume on chest computed tomography as a prognostic biomarker for coronavirus disease

Eur Radiol. 2025 Dec 24. doi: 10.1007/s00330-025-12078-y. Online ahead of print.

ABSTRACT

OBJECTIVES: The clinical relevance of computed tomography (CT)-based airway tree structure is unclear. Herein, we used artificial intelligence to segment the airway tree and pneumonia regions, measuring total airway count (TAC) and pneumonia volume to examine whether their combination is more closely associated with clinical outcomes in patients with coronavirus disease (COVID-19) than pneumonia volume alone.

MATERIALS AND METHODS: We examined clinical data and chest CT from 781 hospitalized COVID-19 patients in a multicenter retrospective cohort in Japan, focusing on the percentage of critical outcomes (high-flow oxygen, invasive mechanical ventilation, or death). Additionally, 197 patients were followed up for 3 months to monitor TAC and pneumonia volume.

RESULTS: Critical outcomes were observed in 63 (8.8%) patients, with higher TAC in those patients. Patients were divided into four groups based on cutoff values of 17.6% for pneumonia volume percent and 255 for TAC: Group A (low TAC, low pneumonia volume), Group B (high TAC, low pneumonia volume), Group C (low TAC, high pneumonia volume), and Group D (high TAC, high pneumonia volume). Group D had the worst outcomes, the highest levels of inflammation and fibrosis markers, and the most complications, as well as a significantly higher risk of critical outcomes after adjusting for age, body mass index, sex, total lung volume, and comorbidities. In the 3-month longitudinal analysis, pneumonia volume, but not TAC, improved in critical cases.
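The four-group stratification follows directly from the two reported cutoffs. A minimal Python sketch (the abstract does not state whether values exactly at a cutoff count as high or low, so the >= convention here is an assumption):

```python
PNEUMONIA_CUTOFF = 17.6   # pneumonia volume, percent of lung volume
TAC_CUTOFF = 255          # total airway count

def classify(tac, pneumonia_pct):
    """Assign the study's prognostic groups A-D from TAC and pneumonia volume."""
    high_tac = tac >= TAC_CUTOFF
    high_pna = pneumonia_pct >= PNEUMONIA_CUTOFF
    if high_tac and high_pna:
        return "D"   # worst outcomes in the study
    if high_tac:
        return "B"
    if high_pna:
        return "C"
    return "A"
```

For example, a patient with TAC 300 and 20% pneumonia volume falls in Group D, the stratum with the highest adjusted risk of critical outcomes.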

CONCLUSIONS: The integrated assessment of TAC and pneumonia volume effectively predicted critical outcomes in COVID-19 patients and may be useful for various respiratory diseases, including infectious or interstitial pneumonia.

KEY POINTS: Question: Total airway count (TAC) on computed tomography (CT) is associated with respiratory disease progression, but the clinical relevance of CT-based airway tree structure is unclear. Findings: The integrated assessment of TAC and pneumonia volume effectively predicted critical outcomes in COVID-19 patients. Clinical relevance: This metric can potentially be applied to various respiratory diseases, including infectious or interstitial pneumonia.

PMID:41442001 | DOI:10.1007/s00330-025-12078-y


The impact of DNMT3A mutation on survival of AML patients receiving allotransplant in first remission depends on the karyotype and co-occurring mutations

Bone Marrow Transplant. 2025 Dec 23. doi: 10.1038/s41409-025-02765-1. Online ahead of print.

ABSTRACT

Mutations in the DNMT3A gene are not yet classified as a distinct prognostic group in the latest European Leukemia Net (ELN) 2022 genetic risk classification of AML. We analyzed 1888 adult AML patients with ELN 2022 intermediate- or poor-risk cytogenetics who received their first allotransplant in first complete remission between 2015 and 2022. Among patients with cytogenetically normal AML, the triple-positive mutation group (DNMT3A, NPM1, and FLT3-ITD) was the most frequent (n = 340, 29%), while DNMT3A co-occurrence with either FLT3 or NPM1 mutations alone was less common (4% and 9%, respectively). Patients with DNMT3A mutations were less likely to have secondary AML (14% versus 24%, p < 0.001). DNMT3A mutations negatively affected post-transplant leukemia-free survival (LFS) in patients with normal karyotype and NPM1 mutation without FLT3-ITD (2-year LFS: 70% versus 90%, hazard ratio [HR]: 3.3, p = 0.006), and increased relapse incidence (RI) in the FLT3-ITD and wild-type NPM1 subgroup (2-year RI: 30% versus 18%, HR: 2.32, p = 0.03). Notably, patients with normal karyotype and triple-positive mutations exhibited favorable 2-year LFS and OS (61% and 70%, respectively), indicating that allotransplant overcomes the dismal outcome of this group. The impact of DNMT3A mutations on post-transplant outcomes in AML patients in first remission varies based on karyotype and co-mutations.

PMID:41437149 | DOI:10.1038/s41409-025-02765-1


Comparative effects of high intensity interval and functional training on performance outcomes in adolescent female volleyball players

BMC Sports Sci Med Rehabil. 2025 Dec 23. doi: 10.1186/s13102-025-01476-w. Online ahead of print.

ABSTRACT

BACKGROUND: Volleyball requires repeated explosive actions, agility, and technical precision, demanding contributions from both aerobic and anaerobic energy systems. Time-efficient training methods such as resistance-based high intensity interval training (HIIT) and high intensity functional training (HIFT) have been proposed to enhance multidimensional performance in young athletes. However, direct comparisons of their effects in adolescent female volleyball players are limited.

METHODS: Thirty-two licensed female volleyball players (aged 15-18 years) were randomly assigned to a resistance-based HIIT group (n = 10), a HIFT group (n = 11), or a control group (n = 11). The training interventions lasted 12 weeks with two supervised sessions per week, in addition to regular volleyball practice. The HIIT program consisted of progressive resistance-based high intensity intervals performed at 85-95% HRmax, while the HIFT program comprised multimodal circuit-style functional exercises performed at comparable intensities (~ 85-95% HRmax). Performance assessments were conducted pre- and post-intervention and included the countermovement jump (CMJ), standing long jump (SLJ), medicine ball throw (MBT), pro-agility test, 20 m sprint, repeated sprint fatigue index (RSI), volleyball skill test, Yo-Yo IR1 distance, VO₂max, blood lactate concentration, and maximal heart rate (MaxHR). Data were analyzed using a 3 × 2 mixed-model ANOVA with Tukey post hoc tests, and effect sizes were reported as Cohen’s d and partial eta squared (ηp²).

RESULTS: Significant Group × Time interactions were observed for CMJ (F(2,29) = 9.14, p < 0.001, ηp²=0.39), SLJ (F(2,29) = 12.08, p < 0.001, ηp²=0.45), VO₂max (F(2,29) = 11.53, p < 0.001, ηp²=0.44), MaxHR (F(2,29) = 3.56, p = 0.041, ηp²=0.20), and volleyball skill test (F(2,29) = 7.44, p = 0.002, ηp²=0.34). HIFT produced the greatest improvements in explosive power (CMJ Δ=+5.15 cm; SLJ Δ=+12.82 cm), RSI (Δ=-1.56), and volleyball skill performance (Δ=+13.18 points), with large effect sizes (d = 1.3-3.2). HIIT showed relatively greater improvements in aerobic capacity (VO₂max Δ=+0.90 ml·kg⁻¹·min⁻¹) and endurance (d = 0.7-2.0). No significant between-group differences were observed for lactate.
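The reported effect sizes can be reproduced from the F statistics alone, since for a reported F test ηp² = F·df1 / (F·df1 + df2). A quick Python check against the five Group × Time interactions:

```python
def partial_eta_squared(f_stat, df1, df2):
    # Standard conversion from a reported F statistic to partial eta squared.
    return f_stat * df1 / (f_stat * df1 + df2)

# F(2,29) values reported for the Group x Time interactions.
f_values = {"CMJ": 9.14, "SLJ": 12.08, "VO2max": 11.53,
            "MaxHR": 3.56, "skill": 7.44}
etas = {name: round(partial_eta_squared(f, 2, 29), 2)
        for name, f in f_values.items()}
```

This reproduces the published values (0.39, 0.45, 0.44, 0.20, and 0.34), confirming that the effect sizes are internally consistent with the F statistics and degrees of freedom.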

CONCLUSIONS: Both resistance-based HIIT and HIFT proved effective in improving several physical and volleyball-specific performance components in adolescent female players. HIIT produced slightly greater gains in aerobic capacity, while HIFT showed larger numerical improvements in explosive strength, agility, and technical skill performance; however, these between-intervention differences were not statistically significant. Overall, the findings suggest that each modality offers complementary benefits, and integrating both HIIT and HIFT may provide a balanced conditioning approach for the holistic development of youth volleyball athletes.

TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT07181031. Registered on September 04, 2025.

PMID:41437127 | DOI:10.1186/s13102-025-01476-w


Tinnitus constancy and self-reported severe headache or migraine symptoms in peri- and postmenopausal women: a statistical modeling and machine learning analysis

Eur J Med Res. 2025 Dec 23. doi: 10.1186/s40001-025-03712-y. Online ahead of print.

ABSTRACT

BACKGROUND: Severe headache or migraine is a highly prevalent and disabling neurological condition with distinct clinical and mechanistic features in peri- and postmenopausal women. As research increasingly explores its complex sensory phenotype, auditory comorbidities such as tinnitus have drawn growing attention. Among these, the temporal constancy of tinnitus may be linked to the occurrence of severe headache or migraine symptoms, yet it remains understudied in this population.

METHODS: A total of 2,485 peri- and postmenopausal women from a nationally representative U.S. sample were included in this observational analysis. The target variable was severe headache or migraine symptoms, defined using standardized symptom and medication information. The main explanatory variable was tinnitus constancy, derived from standardized questionnaire items. Weighted multivariable logistic regression models were used to estimate the association between tinnitus constancy and severe headache or migraine symptoms, adjusting for demographic, socioeconomic, and clinical factors. Variable importance was further assessed using five complementary approaches: LASSO regression, logistic regression, random forest classification, the Boruta algorithm, and SHAP-based LightGBM modeling.

RESULTS: A significant and graded association was observed between tinnitus constancy and severe headache or migraine symptoms. Compared to women without tinnitus, those with occasional (OR = 2.29, 95% CI 1.44-3.63), intermittent (OR = 2.62, 95% CI 1.57-4.35), and constant tinnitus (OR = 1.93, 95% CI 1.26-2.95) had higher odds of reporting severe headache or migraine symptoms. Intermittent tinnitus displayed the strongest association. Age was negatively associated with symptom reporting and consistently ranked as the most important predictor across all machine learning algorithms. SHAP dependence plots confirmed elevated symptom probability among younger individuals and those with more variable tinnitus patterns.
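The graded pattern above can be illustrated with unadjusted odds ratios from a contingency table. The counts below are invented for illustration only; the study's estimates came from survey-weighted, covariate-adjusted logistic regression, not raw cross-tabulation:

```python
# Hypothetical (cases, non-cases) counts per tinnitus-constancy level.
counts = {
    "none":         (120, 880),
    "occasional":   (70, 230),
    "intermittent": (60, 180),
    "constant":     (55, 210),
}

ref_cases, ref_noncases = counts["none"]
odds_ref = ref_cases / ref_noncases
# Unadjusted OR for each level versus the no-tinnitus reference.
ors = {level: (cases / noncases) / odds_ref
       for level, (cases, noncases) in counts.items() if level != "none"}
```

With these counts the ORs fall in the same rank order as the published adjusted estimates: intermittent highest, constant lowest, and all above 1.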

CONCLUSIONS: Tinnitus constancy, particularly in its intermittent form, is independently associated with higher odds of severe headache or migraine symptoms in peri- and postmenopausal women. Age was also a dominant protective factor. These findings highlight the importance of considering the temporal patterns of tinnitus symptoms and age-related vulnerability when evaluating severe headache or migraine symptoms in peri- and postmenopausal women.

PMID:41437109 | DOI:10.1186/s40001-025-03712-y


Explainable machine learning for orthopedic decision-making: predicting functional outcomes of total hip replacement from gait biomechanics

Arthritis Res Ther. 2025 Dec 23. doi: 10.1186/s13075-025-03709-2. Online ahead of print.

ABSTRACT

This study aimed to identify subpopulations of patients with hip osteoarthritis who exhibit distinct adaptations in gait biomechanics, and to evaluate subpopulation-specific effects of total hip replacement on gait biomechanics. Three datasets were analyzed: (1) a cohort of 109 unilateral hip osteoarthritis patients before total hip replacement, (2) a subset of 63 patients from the first cohort re-evaluated after total hip replacement, and (3) a control group of 56 healthy individuals. For all participants, three-dimensional joint angle and moment waveforms of the pelvis, ipsilateral hip, and knee, as well as sagittal-plane ankle motion and the foot progression angle, were obtained. The analytical framework integrated k-means clustering, support vector machine classifiers, Shapley Additive exPlanations, and statistical waveform analyses. Clustering of the pre-operative dataset revealed three distinct subpopulations characterized by unique patterns in gait kinematics and joint moments. These subpopulations also differed in age, Kellgren-Lawrence score, and walking speed. Prior to total hip replacement, between 51.4% and 85.2% of hip osteoarthritis patients were classified as pathologic; following surgery, this proportion decreased to 27.8%-51.8%. Hip flexion and rotation angles and moments were identified as the most important features for patient classification. The magnitude of gait improvement after total hip replacement varied across subpopulations, indicating subpopulation-specific responses to surgical intervention. In conclusion, patients with hip osteoarthritis demonstrate distinct subpopulation-specific gait adaptations both before and after total hip replacement. Preoperative classification of patients into the identified subpopulations using machine learning approaches may facilitate the prediction of postoperative gait recovery and support the development of personalized treatment and rehabilitation strategies.
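The clustering step of such a framework can be sketched in plain Python. The two synthetic features below are hypothetical stand-ins (the study clustered full three-dimensional joint angle and moment waveforms), and in practice features would be standardized and centroids seeded with k-means++:

```python
import math
import random

random.seed(1)
# Three planted subpopulations in two hypothetical gait features:
# (peak hip flexion angle in degrees, walking speed in m/s).
planted = [(20.0, 1.3), (35.0, 1.0), (50.0, 0.7)]
points = [(random.gauss(ang, 2.0), random.gauss(spd, 0.05))
          for ang, spd in planted for _ in range(30)]

def kmeans(points, k, iters=25):
    # Simple deterministic seeding for reproducibility; k-means++ is the
    # usual choice in practice.
    step = len(points) // k
    cents = [points[i * step] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, cents[i]))
            clusters[nearest].append(p)
        cents = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else cents[i]
                 for i, cl in enumerate(clusters)]
    labels = [min(range(k), key=lambda i: math.dist(p, cents[i])) for p in points]
    return cents, labels

centroids, labels = kmeans(points, 3)
```

On well-separated data like this, Lloyd's algorithm recovers the three planted subpopulations; real waveform data would be reduced (e.g. by principal components) before clustering.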

PMID:41437106 | DOI:10.1186/s13075-025-03709-2


Impact of point-of-care ultrasound (POCUS) in pediatric emergency departments: a meta-analysis of randomized controlled trials

Ital J Pediatr. 2025 Dec 23;51(1):323. doi: 10.1186/s13052-025-02159-5.

ABSTRACT

Point-of-care ultrasound (POCUS) is a bedside diagnostic tool clinicians use to provide immediate insights and guide therapeutic interventions. It has become increasingly significant in pediatric emergency departments (EDs) for diagnosing conditions, managing critical scenarios, and guiding procedures due to its portability, ease of use, and lack of radiation. This study aims to systematically review and analyze the efficacy of POCUS compared to conventional diagnostic methods in pediatric emergency settings. A literature search was conducted across PubMed, SCOPUS, Web of Science, Embase, and Cochrane Library up to February 2025. The inclusion criteria were pediatric patients aged 1 month to 18 years in EDs, with studies comparing POCUS to conventional methods. Primary outcomes included first-attempt procedural success and overall success rates. Secondary outcomes included time to procedure completion, mean number of attempts, hospitalization rates, and discharge rates. Data analysis was conducted in R employing a random-effects model, with dichotomous data analyzed as risk ratio (RR) and 95% confidence interval (CI), and continuous data as unbiased standardized mean difference (SMD). Statistical significance was defined at p < 0.05. Eighteen randomized controlled trials involving 2264 patients met the inclusion criteria. POCUS significantly improved first-attempt success (RR = 1.25; 95% CI: 1.09-1.43). The overall procedural success showed a significant benefit with POCUS (RR = 1.12; 95% CI: 1.03-1.22). However, no significant differences were noted in the time to procedure completion, number of attempts for a successful procedure, and rates of hospitalization and discharge to home. POCUS significantly improves first-attempt and overall procedural success rates in pediatric emergency settings, although it does not significantly reduce procedure times or the number of attempts. These findings underscore the importance of integrating POCUS into pediatric emergency care to enhance diagnostic accuracy and procedural success, though further research is needed to optimize its implementation across different age groups and procedures.
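The pooling step of a meta-analysis like this can be sketched with a DerSimonian-Laird random-effects model over per-study log risk ratios. The three studies below are fabricated for illustration only and are not the trials included in this review (the original analysis was run in R):

```python
import math

# Hypothetical (events_pocus, n_pocus, events_control, n_control) per study.
studies = [(45, 60, 35, 60), (50, 70, 44, 70), (30, 40, 26, 40)]

def log_rr_var(a, n1, c, n2):
    """Log risk ratio and its approximate variance for one study."""
    log_rr = math.log((a / n1) / (c / n2))
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2
    return log_rr, var

ys, vs = zip(*(log_rr_var(*s) for s in studies))
w_fixed = [1 / v for v in vs]
y_fixed = sum(w * y for w, y in zip(w_fixed, ys)) / sum(w_fixed)

# DerSimonian-Laird between-study variance (tau^2), floored at zero.
q = sum(w * (y - y_fixed) ** 2 for w, y in zip(w_fixed, ys))
c_factor = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - (len(studies) - 1)) / c_factor)

w_rand = [1 / (v + tau2) for v in vs]
theta = sum(w * y for w, y in zip(w_rand, ys)) / sum(w_rand)
se = math.sqrt(1 / sum(w_rand))
rr = math.exp(theta)
ci_lo, ci_hi = math.exp(theta - 1.96 * se), math.exp(theta + 1.96 * se)
```

When the studies are homogeneous, tau² collapses to zero and the random-effects pooled RR equals the fixed-effect estimate, which is what happens with these illustrative counts.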

PMID:41437104 | DOI:10.1186/s13052-025-02159-5