Eur Radiol. 2026 Mar 31. doi: 10.1007/s00330-026-12518-3. Online ahead of print.
NO ABSTRACT
PMID:41917219 | DOI:10.1007/s00330-026-12518-3
Br J Cancer. 2026 Mar 31. doi: 10.1038/s41416-026-03408-y. Online ahead of print.
ABSTRACT
BACKGROUND: This multicentre, modular, Phase 1 study evaluated escalating doses of ATR (ataxia telangiectasia and Rad3-related kinase) inhibitor ceralasertib plus PD-L1 inhibitor durvalumab in patients with previously treated advanced/metastatic non-small-cell lung cancer (NSCLC) or head and neck squamous cell carcinoma (HNSCC).
METHODS: Patients received ceralasertib 80/160/240 mg twice-daily (BID) or 320 mg once-daily (QD) for 7 (Days 22-28) or 14 (Days 15-28) days, plus durvalumab 1500 mg (Day 1), per 28-day cycle. The primary objective was to investigate the safety/tolerability of the combination.
RESULTS: Sixty patients were treated. Two patients had dose-limiting toxicities of: Grade 3 thrombocytopenia with Grade 3 anaemia (ceralasertib 320 mg QD for 14 days); and Grade 4 thrombocytopenia with Grade 3 neutropenia accompanied by systemic chest infection (ceralasertib 240 mg BID for 14 days). Overall, 59 (98.3%) patients had treatment-emergent adverse events; 31 (51.7%) had grade ≥3 events. The recommended Phase 2 dose was durvalumab 1500 mg (Day 1) plus ceralasertib 240 mg BID (Days 15-28). Five (8.3%) patients had objective responses; 31 (51.7%) had stable disease. Pharmacodynamic activity (pRAD50 increase) was observed in 10/14 paired biopsies.
CONCLUSION: Ceralasertib plus durvalumab was tolerated and associated with antitumour activity in advanced/metastatic NSCLC and HNSCC.
TRIAL REGISTRATION NUMBER: NCT02264678.
PMID:41917211 | DOI:10.1038/s41416-026-03408-y
Bone Marrow Transplant. 2026 Mar 31. doi: 10.1038/s41409-026-02845-w. Online ahead of print.
ABSTRACT
Allogeneic hematopoietic stem cell transplantation (ASCT) is the only curative option for patients with myelodysplastic syndromes (MDS), but whether cytoreductive pretreatment and molecular “downstaging” according to the IPSS-M improves outcomes remains unclear. We retrospectively analyzed 128 consecutive adults with MDS who underwent ASCT grouped as frontline transplantation (n = 87) or pretreated before transplant (n = 41). Median bone marrow blasts at diagnosis were 12% vs. 10%. IPSS-M was calculated at diagnosis and immediately before transplant using cytogenetic and next-generation sequencing data. IPSS-M improved in 26% of frontline and 34% of pretreated patients, was unchanged in 41% and 34%, and worsened in 30% and 32%, respectively. After a median follow-up of 17.3 months, overall survival (OS), relapse-free survival (RFS) and graft-versus-host disease-free, relapse-free survival (GRFS) were superior with frontline transplantation (median OS 112.6 vs 14.0 months, p = 0.03, median RFS 61.0 vs 8.9 months, p = 0.007 and median GRFS 13.3 vs 5.3 months, p = 0.004). However, in a landmark analysis starting at the time of transplantation, the difference in OS was no longer statistically significant. Non-relapse mortality was significantly higher after pretreatment (p = 0.018). Pretransplant cytoreduction did not improve post-transplant outcomes despite modest IPSS-M improvements, supporting molecular-risk-guided timing and early donor identification rather than treatment aimed at IPSS-M downstaging.
PMID:41917167 | DOI:10.1038/s41409-026-02845-w
Sci Rep. 2026 Mar 31. doi: 10.1038/s41598-026-46713-5. Online ahead of print.
NO ABSTRACT
PMID:41917162 | DOI:10.1038/s41598-026-46713-5
Sci Rep. 2026 Mar 31. doi: 10.1038/s41598-026-46580-0. Online ahead of print.
ABSTRACT
This study evaluated floor and ceiling (F/C) effects within the Tampa Scale for Kinesiophobia (TSK) and examined item-level correlations with total TSK scores and pain intensity in individuals with knee osteoarthritis (KOA). A cross-sectional study was conducted involving 134 participants diagnosed with KOA. Each TSK item was analyzed to identify F/C effects, with a threshold of 15% set as the criterion for significance. Spearman’s rank correlation coefficient was employed to assess the relationships between each TSK item and the total TSK score, as well as between each item and pain intensity. In addition, an exploratory factor analysis (EFA) with varimax rotation was performed on the 17 items. Significant floor effects were observed in items 4 and 12, while a ceiling effect was noted in item 13. Statistically significant correlations between individual items and the total TSK score were identified for all items except items 8, 12, and 16. Furthermore, a significant correlation was found between item 3 and pain intensity. The data were appropriate for EFA (KMO = 0.748; Bartlett’s test of sphericity, p < 0.001), and the analysis suggested four components explaining 53% of the total variance. Items 4, 8, 12, 13, and 16 of the TSK demonstrated either F/C effects or weak correlations with the total TSK score (r = -0.053 to 0.308). Given the four-component grouping, these items may require further evaluation in future adaptations or psychometric evaluations of the TSK for this population.
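The 15% floor/ceiling criterion described above is straightforward to operationalize. The following is a minimal, hypothetical sketch (not the study’s analysis code) of how one item could be flagged; the responses and the 1-4 TSK item scoring range used here are illustrative assumptions.

```python
# Hypothetical sketch: flag floor/ceiling effects for a Likert-type item
# using the 15% threshold described in the study. The responses below are
# invented for illustration, not study data.

def floor_ceiling(responses, min_score=1, max_score=4, threshold=0.15):
    """Return (floor_effect, ceiling_effect) booleans for one item.

    An effect is flagged when more than `threshold` of respondents give
    the minimum (floor) or maximum (ceiling) possible score.
    """
    n = len(responses)
    floor_prop = sum(r == min_score for r in responses) / n
    ceil_prop = sum(r == max_score for r in responses) / n
    return floor_prop > threshold, ceil_prop > threshold

# Illustrative item: 40% of answers at the minimum -> floor effect flagged
item = [1, 1, 1, 2, 3, 2, 2, 3, 1, 2] * 10  # hypothetical 100 responses
print(floor_ceiling(item))  # -> (True, False)
```

The same function applied per item would reproduce the kind of item-level screening reported above, with the flagged proportion compared directly against the 15% cut-off.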
PMID:41917156 | DOI:10.1038/s41598-026-46580-0
Sci Rep. 2026 Mar 31. doi: 10.1038/s41598-026-46144-2. Online ahead of print.
ABSTRACT
This study systematically reviewed and meta-analyzed the effects of speed-agility-quickness (SAQ) training on pre-planned change-of-direction speed (CODS) in adolescent and young adult team-sport athletes and explored potential moderating factors. Following the PRISMA 2020 guidelines, randomized controlled trials published from database inception to 15 November 2025 were searched in PubMed, Web of Science, Scopus, CNKI, EBSCOhost, and the Cochrane Library. Eligible studies involved basketball, soccer, or handball athletes aged 9-26 years, with the experimental group receiving SAQ-dominant interventions and the control group performing routine training, regular sport-specific practice, no additional training, or other non-SAQ comparison conditions. Standardized mean differences (SMDs) with 95% confidence intervals were calculated using a random-effects model. Subgroup analyses, restricted cubic spline meta-regression, sensitivity analyses, and publication bias assessments were conducted. Twenty-two studies contributing 26 effect sizes were included, comprising 17 effect sizes for pre-planned CODS and 9 for linear sprint performance. Compared with controls, SAQ training significantly improved pre-planned CODS (SMD = -0.71, 95% CI -0.92 to -0.51, P < 0.00001) and also improved linear sprint performance (SMD = -0.90, 95% CI -1.18 to -0.62). For CODS, subgroup analyses revealed no significant moderation by age (≤ 18 vs. > 18 years, P = 0.92), weekly training volume (≤ 120 vs. > 120 min/week, P = 0.19), competitive level (elite/club vs. school/university, P = 0.63), or sport discipline (basketball, soccer, handball). Meta-regression did not identify statistically significant non-linear associations for the examined moderators. Sensitivity analyses supported the stability of the pooled estimates, although potential publication bias should be considered when interpreting the magnitude of the effects.
SAQ training appears to be an effective strategy for improving pre-planned CODS and linear sprint performance in team-sport athletes aged 9-26 years. The available evidence suggests that these benefits may be observed across different age groups, training volumes, competitive levels, and sports, although variability in intervention design, outcome assessment, and study quality should be acknowledged. These findings support the inclusion of SAQ training within routine physical conditioning programs for this population.
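The pooled SMDs above come from a random-effects model. As a hedged illustration (not the authors’ code), the widely used DerSimonian-Laird estimator can be sketched as follows; the effect sizes and variances are invented for demonstration.

```python
import math

# Hypothetical sketch of a DerSimonian-Laird random-effects pooled estimate,
# the standard approach behind pooled SMDs with 95% CIs like those above.

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes under a random-effects model."""
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Five hypothetical trials favoring the intervention (negative SMDs)
smds = [-0.9, -0.6, -0.8, -0.5, -0.7]
variances = [0.04, 0.05, 0.03, 0.06, 0.04]
pooled, ci = dersimonian_laird(smds, variances)
print(round(pooled, 2), tuple(round(x, 2) for x in ci))
```

When between-study heterogeneity (tau-squared) is estimated as zero, the random-effects result collapses to the fixed-effect weighted mean, which is why homogeneous study sets yield identical estimates under either model.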
PMID:41917150 | DOI:10.1038/s41598-026-46144-2
Sci Rep. 2026 Mar 31. doi: 10.1038/s41598-026-45573-3. Online ahead of print.
ABSTRACT
Sudden cardiac arrest (SCA) remains a critical public health challenge with mortality rates close to 90%. Current prognostication methods commonly analyze data of individual modalities separately and delay assessment until 72 hours post-arrest, creating a critical gap in early decision-making. Here, we introduce contrastive language and image reasoning with masked autoencoders (CLAIR), a novel multimodal framework integrating head computed tomography (CT) imaging with non-imaging clinical patient information through a cross-attention mechanism and contrastive learning approach to predict cerebral performance category (CPC) score in patients after cardiac arrest. In a retrospective study of 208 patients, we evaluated CLAIR against CT-based imaging-only assessment, as well as clinical evaluation by two experienced ICU neurologists. Our method achieved an AUC-ROC of 0.94 (CI: 0.90-0.97) when trained on a combination of multiplanar CT reconstructions and non-imaging clinical data, significantly outperforming CT scan-based imaging-only methods (AUC-ROC: 0.80, CI: 0.74-0.86) with statistical significance (p = 0.03). In a structured evaluation, the clinicians suggested that CLAIR assisted assessments resulted in fewer prognostic errors than non-assisted evaluations. Further, we demonstrate the applicability of our approach for early neurologic outcome prediction using CT scans obtained within the first 24 hours post-arrest (median acquisition time: 3.1 hours). Our results suggest that CLAIR can contribute value as a clinical assistive tool aiming at reliable early prognostication for post-cardiac arrest patients, potentially enabling more timely clinical decision-making, family counseling, and resource allocation.
PMID:41917125 | DOI:10.1038/s41598-026-45573-3
J Clin Neurosci. 2026 Mar 30;149:111994. doi: 10.1016/j.jocn.2026.111994. Online ahead of print.
ABSTRACT
INTRODUCTION: Traumatic brain injury (TBI) represents a significant global health burden and often results in functional impairment. Blood pressure variability (BPV), a surrogate marker of autonomic dysfunction, has been shown to influence outcomes in patients with cerebrovascular disease. Increased BPV has been strongly linked to deviation from optimal cerebral perfusion pressure, which may elevate the risk of secondary brain injury and poor outcomes after TBI. This study aimed to investigate the association of early BPV with clinical and functional outcomes, as well as brain injury biomarkers, in patients with TBI.
METHOD: We conducted a retrospective cohort study using data from the Transforming Clinical Research and Knowledge in Traumatic Brain Injury Study (TRACK-TBI), which prospectively enrolled acute TBI patients across 18 United States Level 1 trauma centers between 2014 and 2018. The study population included adults with moderate-to-severe TBI who required intracranial pressure monitoring. The primary exposure was early BPV, calculated from hourly blood pressure measurements during the first 24 h after ICU admission; 72-hour BPV was examined in sensitivity analyses. Two BPV metrics were evaluated: systolic standard deviation (SSD) and average real variability (ARV). The primary outcome was the 6-month Glasgow Outcome Scale-Extended score specific to TBI (GOSE-TBI). Secondary outcomes included in-hospital mortality, GOSE-TBI at 3 and 12 months, Disability Rating Scale (DRS) at 3 months, 6 months, and 12 months, and blood-based brain injury biomarkers [glial fibrillary acidic protein (GFAP), ubiquitin carboxy-terminal hydrolase L1 (UCH-L1), neuron-specific enolase (NSE), S100 calcium-binding protein B (S100B), and the inflammatory biomarker high-sensitivity C-reactive protein (hs-CRP)]. Multivariable regression models were used to assess associations between BPV, clinical outcomes, and biomarker levels.
RESULTS: A total of 108 patients were included. The mean age (SD) was 41.3 years (17.3), 81% were male, and 81% identified as White. There were no statistically significant associations between 24-hour BPV and 6-month GOSE for either ARV (OR 0.84, 95% CI 0.68-1.05; p = 0.133) or SSD (OR 0.86, 95% CI 0.69-1.08; p = 0.194). Among secondary outcomes, higher 24-hour SSD was associated with increased odds of in-hospital mortality (OR 1.13, 95% CI 1.00-1.27; p = 0.048). Higher average 72-hour SSD was also associated with higher hs-CRP levels (Ratio 1.04, 95% CI 1.00-1.07; p = 0.036).
CONCLUSION: Early BPV was not associated with GOSE-TBI at 6 months or most blood-based brain injury biomarkers. However, higher 24-hour SSD may be associated with increased in-hospital mortality. The prognostic value of BPV warrants confirmation in future prospective studies.
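The two BPV metrics defined in the methods differ in what they capture: SSD measures the overall spread of readings, while ARV averages the absolute hour-to-hour changes and is therefore sensitive to reading order. A minimal sketch (not the TRACK-TBI analysis code; the readings are invented):

```python
import statistics

# Hypothetical sketch of the two BPV metrics described above, computed
# from hourly systolic blood pressure (SBP) readings.

def ssd(sbp):
    """Systolic standard deviation over the monitoring window."""
    return statistics.stdev(sbp)

def arv(sbp):
    """Average real variability: mean absolute successive difference.

    Unlike SSD, ARV depends on the order of readings, so it captures
    reading-to-reading swings rather than overall spread.
    """
    return sum(abs(b - a) for a, b in zip(sbp, sbp[1:])) / (len(sbp) - 1)

# Two invented 6-hour traces with the same values in different order:
smooth = [120, 125, 130, 135, 140, 145]   # steady drift
swings = [120, 145, 125, 140, 130, 135]   # hour-to-hour swings
print(ssd(smooth) == ssd(swings))  # -> True (same spread)
print(arv(smooth), arv(swings))    # ARV is higher for the swinging trace
```

This order sensitivity is why ARV is often preferred as a marker of short-term hemodynamic instability, whereas SSD treats a smooth drift and erratic swings identically.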
PMID:41915974 | DOI:10.1016/j.jocn.2026.111994
Sci Total Environ. 2026 Mar 30;1029:181672. doi: 10.1016/j.scitotenv.2026.181672. Online ahead of print.
ABSTRACT
As a keystone species in tropical freshwater ecosystems, Caiman crocodilus (Linnaeus, 1758) serves as a valuable bioindicator for assessing genetic damage in polluted environments. This study examined mercury (Hg) and arsenic (As) bioaccumulation in caudal scutes and blood across various age groups, alongside the evaluation of genotoxic effects using the micronucleus (MN) assay. Among adults, subadults and juveniles (n = 16), Hg concentrations in the scutes ranged from 41.8 to 535 μg/kg with a median of 145.0 μg/kg, while in blood they ranged from 32.5 to 472.9 μg/L with a median of 131.7 μg/L. The median As concentration in blood was 1.0 μg/L, whereas concentrations in scutes were below the limit of quantification (LOQ) of the analytical methods. Females exhibited slightly higher Hg levels in both scutes (162.0 μg/kg) and blood (131.7 μg/L) compared to males (scutes: 145.0 μg/kg; blood: 118.4 μg/L), although these differences were not statistically significant (p > 0.05). Subadult individuals had significantly higher blood Hg concentrations than juveniles (U = 9; p = 0.03; n = 9). Among neonates (n = 6), the median Hg concentrations were 303.1 μg/kg in scutes and 109.6 μg/L in blood. The MN assay revealed evidence of genotoxic damage. Although the mean MN frequency in large individuals (excluding neonates) was low (0.3), nuclear buds (NB, 9.8) and binucleated cells (BC, 1.4) were more prominent. A negative trend was observed between Hg concentrations and the frequency of MN, NB, and BC, whereas As showed a non-significant positive correlation with BC (r = 0.38, p = 0.28). Additionally, 37.5% of the individuals exhibited poor body condition (Elsey condition factor < 1). These findings support the potential of C. crocodilus as an effective sentinel species for assessing genotoxic effects linked to environmental pollution.
Moreover, this study contributes valuable data to pollution monitoring efforts in the Colombian Pacific region, which have largely focused on toxic metal(loid) analysis in fish to date.
PMID:41915960 | DOI:10.1016/j.scitotenv.2026.181672
Mar Pollut Bull. 2026 Mar 30;228:119654. doi: 10.1016/j.marpolbul.2026.119654. Online ahead of print.
ABSTRACT
Understanding how vegetation and sediment regimes jointly shape trace metal risks in deltaic tidal wetlands is pivotal for targeted monitoring and restoration. Here we propose a vegetation-sediment indicator framework to diagnose the distribution and ecological risk of six trace metals (Cu, Zn, Cr, As, Cd, Pb) in the tidal flat wetlands of the Yellow River Estuary (YRE). A total of 18 sampling sites were established, covering five vegetation types (mudflat, Suaeda salsa, Phragmites australis, Suaeda salsa & Tamarix chinensis, Suaeda salsa & Phragmites australis) and three soil depth layers (0-10 cm, 10-20 cm, 20-30 cm). Multi-dimensional evidence was obtained through analyses of spatial distribution, pollution assessment, and statistical modeling. Redundancy analysis (RDA), linear fitting, and partial least squares regression (PLS) confirmed that grain-size differentiation driven by local hydrodynamics was the first-order control on metal distribution, setting a “risk template” in which fine-sediment zones show higher metal accumulation and risk than coarse-grained areas. Vegetation further amplified the spatial heterogeneity of metal distribution through “grain size regulation” and “rhizosphere chemistry” effects: Phragmites australis zones emerged as metal enrichment hotspots, while Suaeda salsa zones exhibited a distinct “buffering” capacity. Vertical differentiation across the 0-30 cm soil profile was weak, favoring the formation of horizontally structured metal hotspots. This study advances a generalizable, management-oriented indicator set (vegetation type plus sediment grain-size characteristics) to support targeted surveillance, early warning, and restoration prioritization in deltaic tidal wetlands.
PMID:41915933 | DOI:10.1016/j.marpolbul.2026.119654