Categories
Nevin Manimala Statistics

Mortality among workers at the Rocky Flats Plant, 1951-2017

J Radiol Prot. 2026 Mar 4. doi: 10.1088/1361-6498/ae4d57. Online ahead of print.

ABSTRACT

BACKGROUND: The Rocky Flats (RF) Plant operated from 1951 to 1989 as part of the U.S. Department of Energy (DOE) nuclear complex. Its primary mission was weapons component fabrication, during which workers were potentially exposed to radioactive and non-radioactive hazards. RF worker mortality was compared with that of the general population, and dose-response relationships between mortality and radiation organ doses were examined.

METHODS: RF workers first employed between 1951 and 1979 for ≥30 days were identified (n=9,397). Vital status was determined using national and state death records through 2017. Organ doses from external photon and neutron irradiation and from internalized plutonium, americium, and uranium were modeled as cumulative lagged total dose per year. Beryllium exposure was evaluated as an effect modifier using data from the DOE Nationwide Beryllium Medical Program. Statistical analyses included standardized mortality ratios (SMRs), Cox proportional hazards models, and excess relative risk (ERR) models.

RESULTS: By the end of the study, 53.2% of the cohort was deceased. Nearly 90% were monitored for radiation exposure, with a mean weighted absorbed lung dose of 59.0 mGy. Nearly 45% of workers had intakes of alpha-particle-emitting radionuclides, and 46.7% were monitored for neutrons. Leading causes of death included ischemic heart disease (n=999) and lung cancer (n=361). The highest SMRs were observed for berylliosis (SMR: 176.9; 95% CI: 76.2, 348.7; n<10) and asbestosis (SMR: 4.65; 95% CI: 2.23, 8.55; n=10). Dose-response analyses showed no statistically significant increase in risk from low-dose radiation for causes of death including lung cancer (ERR per 100 mGy: -0.02; 95% CI: -0.11, 0.08; n=361) and Parkinson’s disease (ERR per 100 mGy: 0.13; 95% CI: -0.26, 0.31; n=57). Approximately 45% of workers were monitored for beryllium, with a weak, non-significant indication of effect modification of lung cancer risk.
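
A standardized mortality ratio is the count of observed deaths divided by the count expected from general-population rates, with an exact 95% confidence interval obtainable from chi-square quantiles. A minimal sketch is below; the expected count is back-calculated from the reported asbestosis SMR purely for illustration, not taken from the study data.

```python
from scipy.stats import chi2

def smr_exact_ci(observed, expected, alpha=0.05):
    """Standardized mortality ratio with an exact Poisson
    confidence interval (chi-square method)."""
    smr = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return smr, lower, upper

# Asbestosis: 10 observed deaths; the expected count is back-calculated
# from the reported SMR of 4.65 (an assumption for illustration).
smr, lo, hi = smr_exact_ci(observed=10, expected=10 / 4.65)
print(round(smr, 2), round(lo, 2), round(hi, 2))  # → 4.65 2.23 8.55
```

The recovered interval (2.23, 8.55) matches the one reported in the abstract, which is consistent with an exact Poisson method having been used.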

CONCLUSION: The RF cohort showed no evidence of a statistically significant increase in mortality from occupational radiation exposure. However, this study was constrained by low statistical power, which limits the ability to detect small effects. Future pooling of MPS cohorts will provide further insights, particularly regarding plutonium as a carcinogen.

PMID:41780076 | DOI:10.1088/1361-6498/ae4d57

Transparent Reporting of Statistics in Surgery (TRESS): A Framework for Clinical Interpretability

Plast Reconstr Surg. 2026 Mar 4. doi: 10.1097/PRS.0000000000012984. Online ahead of print.

ABSTRACT

In surgical research, statistical sophistication is too often mistaken for scientific rigor. Across a growing body of plastic surgery literature, adjusted odds ratios, hazard ratios, and regression coefficients are frequently presented without the crude event rates or absolute measures of effect that give findings clinical meaning. We describe this phenomenon as “runic statistics”: results that are statistically valid yet clinically opaque. Through examples drawn from contemporary plastic surgery studies, we highlight three recurrent interpretive flaws: reliance on statistical significance without consideration of clinical relevance, reporting of relative measures without baseline risks or absolute differences, and conflation of association with causation. We further demonstrate how case-mix imbalances can create apparent contradictions in results (Simpson’s paradox), and how identical odds ratios can translate into very different clinical implications depending on the baseline risk. To address these challenges, we propose a thirteen-step reporting framework designed to promote transparency, interpretability, and clinical applicability. Key elements include explicit definition of the estimand, presentation of both crude and adjusted data, translation of relative effects into absolute risks and patient-facing numbers, assessment of minimal clinically important differences, careful handling of confounding, and restraint in the use of causal language. By anchoring statistical reporting in clinical realities, surgical research can remain both methodologically rigorous and directly relevant to patient care. Our goal is not to simplify science, but to ensure that its communication is clear, transparent, and ultimately useful at the bedside.
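
The point that identical odds ratios carry very different clinical meanings at different baseline risks can be made concrete with the standard conversion from odds ratio and baseline risk to the implied risk in the exposed group; the numbers below are illustrative, not drawn from any cited study.

```python
def risk_from_or(odds_ratio, baseline_risk):
    """Convert an odds ratio plus a baseline risk into the implied
    exposed-group risk: multiply the odds, then convert back to a risk."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    exposed_odds = odds_ratio * baseline_odds
    return exposed_odds / (1 + exposed_odds)

# The same OR of 2.0 at two different baseline risks:
for p0 in (0.01, 0.30):
    p1 = risk_from_or(2.0, p0)
    print(f"baseline {p0:.0%} -> exposed {p1:.1%}, absolute difference {p1 - p0:.1%}")
```

At a 1% baseline risk an OR of 2.0 implies an absolute risk difference of about 1 percentage point; at a 30% baseline it implies roughly 16 points, which is exactly why the framework asks for absolute risks alongside relative measures.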

PMID:41780063 | DOI:10.1097/PRS.0000000000012984

Finerenone in Type 1 Diabetes and Chronic Kidney Disease

N Engl J Med. 2026 Mar 5;394(10):947-957. doi: 10.1056/NEJMoa2512854.

ABSTRACT

BACKGROUND: The nonsteroidal mineralocorticoid receptor antagonist finerenone has been reported to improve kidney and cardiovascular outcomes in persons with type 2 diabetes and chronic kidney disease (CKD). The efficacy and safety of finerenone in persons with type 1 diabetes and CKD are unknown.

METHODS: We conducted a phase 3 trial involving adults who had type 1 diabetes, CKD (estimated glomerular filtration rate [eGFR], 25 to <90 ml per minute per 1.73 m2 of body-surface area), and albuminuria (urinary albumin-to-creatinine ratio [with albumin measured in milligrams and creatinine measured in grams], 200 to <5000) and were receiving an angiotensin-converting-enzyme (ACE) inhibitor or an angiotensin-receptor blocker. Participants were randomly assigned to receive finerenone (10 or 20 mg per day, depending on the eGFR) or matching placebo. The primary outcome was the relative change in the urinary albumin-to-creatinine ratio over a period of 6 months.

RESULTS: A total of 242 participants underwent randomization. The median urinary albumin-to-creatinine ratio decreased from 574.6 at baseline to 373.5 at 6 months among all the participants assigned to receive finerenone and from 506.4 to 475.6 among those assigned to receive placebo. Over a period of 6 months, the urinary albumin-to-creatinine ratio decreased by 34% with finerenone (geometric mean ratio to baseline, 0.66; 95% confidence interval [CI], 0.60 to 0.73) and 12% with placebo (geometric mean ratio to baseline, 0.88; 95% CI, 0.79 to 0.98), which corresponded to a 25% greater reduction with finerenone than with placebo (geometric mean ratio for finerenone vs. placebo, 0.75; 95% CI, 0.65 to 0.87; P<0.001). The most common adverse event was hyperkalemia (in 12 participants [10.1%] with finerenone and in 4 [3.3%] with placebo); 2 participants (1.7%) discontinued finerenone because of hyperkalemia. At 6 months, the change in the eGFR was -5.6 ml per minute per 1.73 m2 with finerenone and -2.7 ml per minute per 1.73 m2 with placebo (difference, -2.9 ml per minute per 1.73 m2; 95% CI, -5.1 to -0.7); eGFR values approached baseline levels during the washout period.
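
The trial's headline effect is a ratio of ratios: each arm's geometric mean ratio to baseline, then finerenone relative to placebo. The arithmetic, using the figures reported in the abstract, is sketched below; the `geometric_mean` helper is included only to illustrate how such ratios arise from log-transformed values.

```python
import math

def geometric_mean(values):
    """Geometric mean computed as the exponential of the average log."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Reported geometric mean ratios to baseline (from the abstract):
finerenone_ratio = 0.66   # a 34% reduction in the albumin-to-creatinine ratio
placebo_ratio = 0.88      # a 12% reduction

# The between-group effect is the ratio of the two ratios:
between_group = finerenone_ratio / placebo_ratio
print(round(between_group, 2))  # → 0.75, i.e. a 25% greater reduction
```

This is why a 34% reduction versus a 12% reduction yields a 25% (not 22%) greater relative reduction: effects on a log scale combine multiplicatively, not by subtraction.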

CONCLUSIONS: In adults with type 1 diabetes and CKD, finerenone resulted in a significantly greater decrease in the urinary albumin-to-creatinine ratio than placebo. (Funded by Bayer; FINE-ONE ClinicalTrials.gov number, NCT05901831.).

PMID:41780000 | DOI:10.1056/NEJMoa2512854

Comparing deep learning and classical regression approaches for predicting healthcare expenditure and spending: a systematic review

J Med Econ. 2026 Dec;29(1):654-671. doi: 10.1080/13696998.2026.2630598. Epub 2026 Mar 4.

ABSTRACT

AIMS: This study compares deep learning architectures with traditional regression and tree-based models for individual-level healthcare cost prediction, with particular attention to performance differences across data contexts.

METHODS: We conducted a preregistered systematic review (PROSPERO CRD420251129440). Web of Science, PubMed, Embase, and Scopus were searched through August 2025. Eligible studies used real-world individual-level data (claims, electronic health records, or registries) to predict cost-related outcomes with at least one deep learning architecture and one classical regression comparator, and reported quantitative performance. Data were extracted on population, predictors, outcome horizon, model type, validation strategy, performance metrics, calibration, and interpretability.

RESULTS: Eight studies met inclusion criteria, spanning the United States, Europe, and Asia. In longitudinal designs, such as multi-year claims prediction and medication or hospitalization time-series forecasting, sequential deep learning models, particularly LSTM and CNN-LSTM hybrids, consistently outperformed regression and tree-based algorithms. Reported gains included approximately 10-20% reductions in RMSE/MAE, R2 improvements of 0.01-0.15, and AUROC values up to 0.78 for high-risk classification. Across studies, prior costs and utilization were consistently the strongest predictors, while social determinants and free-text features were rarely incorporated. In contrast, for low-dimensional, structured, cross-sectional medical data, generalized linear models and tree-based approaches remain robust baseline models due to their interpretability and calibration stability.
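
The finding that prior costs dominate as predictors can be illustrated with a naive carry-forward baseline. The sketch below uses entirely synthetic data (all distributions and coefficients are invented) to show the kind of RMSE comparison such studies report, contrasting a prior-year-cost predictor with a global-mean predictor.

```python
import math
import random

random.seed(0)

# Synthetic individual-level costs: next-year cost correlates with prior-year cost.
prior = [random.lognormvariate(7, 1) for _ in range(1000)]
actual = [0.7 * p + random.lognormvariate(6, 1) for p in prior]

def rmse(pred, truth):
    """Root-mean-square error between predictions and observed values."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth))

mean_cost = sum(actual) / len(actual)
rmse_mean = rmse([mean_cost] * len(actual), actual)   # global-mean baseline
rmse_prior = rmse(prior, actual)                      # carry-forward baseline
print(rmse_prior < rmse_mean)  # carry-forward should beat the global mean here
```

Any model, deep or shallow, must beat this sort of trivially available baseline before its added complexity is justified, which is one reading of the review's Complexity-Performance Hypothesis.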

LIMITATIONS: Evidence is based on a small and heterogeneous set of eight studies, with limited external or temporal validation, short prediction horizons, and sparse assessment of calibration, economic interpretability, and fairness, warranting cautious interpretation.

CONCLUSIONS: Deep learning offers clear gains for longitudinal, sequence-rich cost forecasting, whereas tree-based methods remain highly competitive for cross-sectional tabular prediction. Overall, these findings are consistent with the proposed Complexity-Performance Hypothesis, which posits that the predictive advantages of deep learning emerge primarily when model capacity is well matched to data complexity.

PMID:41779998 | DOI:10.1080/13696998.2026.2630598

Reusable Instrumentation for Arthroscopic Rotator Cuff Repair May Not Impact Clinical Outcomes

J Am Acad Orthop Surg Glob Res Rev. 2026 Mar 3;10(3). doi: 10.5435/JAAOSGlobal-D-25-00432. eCollection 2026 Mar 1.

ABSTRACT

INTRODUCTION: Hospitals contribute to substantial environmental waste and greenhouse gas emissions, with operating rooms accounting for 50% to 70% of hospital waste. Arthroscopic rotator cuff repair (RCR), a commonly performed procedure, typically uses disposable instruments to minimize infection risk. There is limited evidence regarding the clinical safety and effectiveness of disposable instruments compared with reusable instruments. We aimed to evaluate whether reusable instrumentation for arthroscopic RCR affects clinical outcomes.

METHODS: This was a retrospective cohort study involving 191 patients undergoing primary arthroscopic RCR. Patients were divided into reusable (N = 89) and disposable (N = 102) instrumentation cohorts. Primary outcomes included rates of postoperative soft-tissue infection and septic revision within 1 year postoperatively. Data were analyzed using frequentist and Bayesian statistical methods.

RESULTS: Infection rates and septic revisions were similar between reusable and disposable instrumentation groups, with one septic revision in each cohort (P = 1.0). Aseptic revision rates were also similar (P = 0.50). Surgical times did not significantly differ between groups (reusable: 1.50 ± 0.33 hours; disposable: 1.61 ± 0.41 hours; P = 0.076). Bayesian analysis supported these findings, demonstrating no meaningful difference in infection risks between groups, with median odds ratios close to 1.0 and credible intervals including 1.0.
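
A minimal sketch of the kind of Bayesian comparison described is below, assuming a beta-binomial model with uniform priors and Monte Carlo sampling of the odds ratio; the abstract does not state the exact model, so this is illustrative only. The event counts (one septic revision per cohort) are from the abstract.

```python
import random

random.seed(42)

def posterior_or_samples(events_a, n_a, events_b, n_b, draws=20000):
    """Sample the odds ratio between two groups under independent
    Beta(1 + events, 1 + non-events) posteriors (uniform priors)."""
    samples = []
    for _ in range(draws):
        p_a = random.betavariate(1 + events_a, 1 + n_a - events_a)
        p_b = random.betavariate(1 + events_b, 1 + n_b - events_b)
        samples.append((p_a / (1 - p_a)) / (p_b / (1 - p_b)))
    return sorted(samples)

# One septic revision in each cohort (reusable n=89, disposable n=102).
ors = posterior_or_samples(1, 89, 1, 102)
median_or = ors[len(ors) // 2]
ci_lo, ci_hi = ors[int(0.025 * len(ors))], ors[int(0.975 * len(ors))]
print(ci_lo < 1.0 < ci_hi)  # the 95% credible interval spans 1.0
```

With events this rare, the posterior odds ratio is extremely wide and centered near 1.0, mirroring the paper's conclusion that the data cannot distinguish the two instrumentation strategies.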

CONCLUSION: Observed proportions of revision and infection were similar in magnitude. These findings suggest that reusable instruments have the potential to be a safe and sustainable alternative to single-use instruments in arthroscopic RCR. However, owing to the rarity of infection and revision, future multisite studies are needed to assess whether the risk of these outcomes with reusable instruments is noninferior to that with disposable instruments.

PMID:41779932 | DOI:10.5435/JAAOSGlobal-D-25-00432

Exploring Role Discrepancies Among Jordanian Nurses: Implications for Continuing Education and Workforce Policy

J Contin Educ Nurs. 2026 Mar;57(3):130-136. doi: 10.3928/00220124-20251204-01. Epub 2026 Mar 1.

ABSTRACT

BACKGROUND: Discrepancies between nurses’ ideal and actual roles can undermine job satisfaction, role identity, and care quality. This study explored how registered nurses in Jordan perceive their ideal versus actual roles and how these perceptions differ by demographic and organizational factors.

METHOD: A descriptive cross-sectional design was used with 357 nurses from governmental, educational, and private hospitals. Data were collected with a sociodemographic questionnaire and the Pieta Nursing Role Conception tool, which evaluates service, professional, and bureaucratic roles. Analyses included descriptive statistics, t tests, and analysis of variance.

RESULTS: A significant overall discrepancy was found (mean difference = 0.52, p < .001), with the largest gaps in service (0.96) and professional roles (0.72). Bureaucratic roles were practiced more than desired (-0.08). Role discrepancies varied by age, hospital type, education, experience, and practice area.

CONCLUSION: The findings highlight the need for continuing education, leadership development, and policy reforms to align nursing roles with professional expectations and improve workforce outcomes.

PMID:41779906 | DOI:10.3928/00220124-20251204-01

Fine-Tuning of Label-Free Single-Cell Proteomics Workflows

J Proteome Res. 2026 Mar 4. doi: 10.1021/acs.jproteome.5c01075. Online ahead of print.

ABSTRACT

Mass spectrometry-based single-cell proteomics has emerged as the most promising method for studying cellular heterogeneity at the global proteome level with unprecedented depth and coverage. Its widespread application remains limited by robustness, reproducibility, and throughput requirements that are still difficult to meet, since large cohorts of single cells must be analyzed to ensure statistical confidence. In this context, we conducted method optimizations at several levels. First, we benchmarked three distinct workflows compatible with the nanoElute2 platform using different sample collection/preparation plate supports (EVO96 oil-free, LF48 oil-based, and LF48 oil-free) with a streamlined automated sample-resuspension and direct-injection protocol. Then, we compared the optimized EVO96 workflow on nanoElute2 with Evosep-based separations operating at two analytical throughputs (80 and 120 samples per day). Subsequently, we evaluated digestion efficiency using a range of enzyme/protein ratios (1:1, 10:1, 20:1, 50:1) to maximize peptide recovery. Finally, the chromatographic setup was refined to determine the best compromise between throughput and robustness. Altogether, these optimizations allowed us to establish a robust workflow quantifying up to 5000 proteins per single HeLa cell in a 10 min gradient at a throughput of 55 samples per day.

PMID:41779902 | DOI:10.1021/acs.jproteome.5c01075

Deep learning linking mechanistic models to single-cell transcriptomics data reveals transcriptional bursting in response to DNA damage

Elife. 2026 Mar 4;13:RP100623. doi: 10.7554/eLife.100623.

ABSTRACT

Cells must adopt flexible regulatory strategies to make decisions regarding their fate, including differentiation, apoptosis, or survival in the face of various external stimuli. One key cellular strategy that enables these functions is stochastic gene expression programs. However, understanding how transcriptional bursting, and consequently, cell fate, responds to DNA damage on a genome-wide scale poses a challenge. In this study, we propose an interpretable and scalable inference framework, DeepTX, which leverages deep learning methods to connect mechanistic models and single-cell RNA sequencing (scRNA-seq) data, thereby revealing genome-wide transcriptional burst kinetics. This framework enables rapid and accurate solutions to transcription models and the inference of transcriptional burst kinetics from scRNA-seq data. Applying this framework to several scRNA-seq datasets of DNA-damaging drug treatments, we observed that fluctuations in transcriptional bursting induced by different drugs were associated with distinct fate decisions: 5′-iodo-2′-deoxyuridine treatment was associated with differentiation in mouse embryonic stem cells by increasing the burst size of gene expression, while low- and high-dose 5-fluorouracil treatments in human colon cancer cells were associated with changes in burst frequency that corresponded to apoptosis- and survival-related fate, respectively. Together, these results show that DeepTX enables genome-wide inference of transcriptional bursting from single-cell transcriptomics data and can generate hypotheses about how bursting dynamics relate to cell fate decisions.
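
Burst size and burst frequency are usually defined through the two-state (telegraph) model of gene expression, which can be simulated directly with the Gillespie algorithm. The sketch below is a minimal illustration of that model, not the DeepTX method itself; all rate constants are chosen for illustration. In the bursty regime, mean burst size is roughly ksyn/koff and burst frequency roughly kon.

```python
import random

random.seed(1)

def telegraph_gillespie(kon, koff, ksyn, kdeg, t_end):
    """Gillespie simulation of the two-state telegraph model.
    Returns the time-averaged mRNA copy number."""
    t, gene_on, mrna = 0.0, False, 0
    weighted_sum = 0.0
    while t < t_end:
        rates = [
            koff if gene_on else kon,   # promoter switching
            ksyn if gene_on else 0.0,   # transcription (on state only)
            kdeg * mrna,                # mRNA degradation
        ]
        total = sum(rates)
        dt = random.expovariate(total)
        weighted_sum += mrna * min(dt, t_end - t)
        t += dt
        if t >= t_end:
            break
        r = random.uniform(0, total)
        if r < rates[0]:
            gene_on = not gene_on
        elif r < rates[0] + rates[1]:
            mrna += 1
        else:
            mrna -= 1
    return weighted_sum / t_end

# Bursty regime: mean burst size ksyn/koff = 10, burst frequency kon = 0.2.
mean_mrna = telegraph_gillespie(kon=0.2, koff=1.0, ksyn=10.0, kdeg=1.0, t_end=5000.0)
# Analytic steady-state mean: (ksyn/kdeg) * kon / (kon + koff) = 10 * 0.2/1.2 ≈ 1.67
print(round(mean_mrna, 1))
```

Drug effects of the kind the paper infers correspond to shifts in these parameters: raising ksyn/koff increases burst size, while changing kon alters burst frequency at a fixed burst size.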

PMID:41779826 | DOI:10.7554/eLife.100623

Preoperative hypocalcemia predicts postoperative complications in older orthopedic patients: A multicenter cohort study

PLoS One. 2026 Mar 4;21(3):e0340876. doi: 10.1371/journal.pone.0340876. eCollection 2026.

ABSTRACT

BACKGROUND: Serum calcium, a key biochemical marker in the body, plays a crucial role in maintaining bone health. Nevertheless, research exploring the link between preoperative serum calcium levels and the occurrence of postoperative complications in elderly orthopedic patients is currently lacking.

AIMS: This study sought to assess the ability of preoperative serum calcium levels to predict the occurrence of postoperative complications in geriatric orthopedic surgery.

METHODS: We used multivariate logistic regression to identify associations between serum calcium levels and complications. Generalized additive models were used to analyze the dose-response relationship, with curve fitting and threshold-effect evaluation. Subgroup analyses further evaluated the impact of other covariates.

RESULTS: This study included 690 elderly patients undergoing orthopedic surgery. Common postoperative complications included infection, hypoalbuminemia, and electrolyte imbalance. Preoperative serum calcium level was an independent protective factor against postoperative complications (OR: 0.24, CI: 0.07-0.76, P = 0.036). When comparing groups based on serum calcium tertiles, patients in the low-calcium group had a 79% higher risk of complications than those in the high-calcium group (OR = 1.79, 95% CI: 1.12-2.78). Further nonlinear analysis revealed a threshold effect between serum calcium and postoperative complication risk, with a turning point at 2.4 mmol/L: the association was statistically significant below this value but not above it. Subgroup analyses and interaction tests showed that age, gender, comorbidities, medications, cognitive function, cardiac function, and surgical complexity did not significantly modify this association (P > 0.05 for interaction).
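
A threshold (segmented) analysis of this kind typically enters the exposure as two piecewise-linear terms around the change point, so that the slope below and above the knot can differ. A minimal sketch of that basis construction follows, using the 2.4 mmol/L turning point from the abstract; the exact basis the authors used is an assumption.

```python
def threshold_terms(calcium, knot=2.4):
    """Piecewise-linear basis around a change point: the first term
    carries the slope below the knot, the second the extra exposure above it."""
    below = min(calcium, knot)
    above = max(calcium - knot, 0.0)
    return below, above

# Below the knot only the first term varies; above it, only the second grows.
print(threshold_terms(2.1))            # → (2.1, 0.0)
b, a = threshold_terms(2.7)
print(b, round(a, 2))                  # → 2.4 0.3
```

Fitting a logistic model on these two terms yields separate odds ratios per unit calcium below and above 2.4 mmol/L, which is how an association can be significant on one side of the turning point but not the other.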

CONCLUSION: Preoperative calcium screening and correction may represent a simple, low-cost strategy to reduce postoperative complications in elderly orthopedic patients. This study provides evidence for the importance of actively correcting calcium levels before surgery and establishes the value of serum calcium as an early warning indicator for poor prognosis.

PMID:41779825 | DOI:10.1371/journal.pone.0340876

Knowledge, attitudes and factors associated with the awareness of caregivers of under-five children regarding the malaria vaccine in the Tiko Health District, Cameroon: A community-based cross-sectional study

PLOS Glob Public Health. 2026 Mar 4;6(3):e0004659. doi: 10.1371/journal.pgph.0004659. eCollection 2026.

ABSTRACT

Despite progress in malaria control, malaria remains a major public health burden in sub-Saharan Africa, particularly among children under five. The introduction of malaria vaccines, including RTS,S/AS01 (Mosquirix) and the recently WHO-recommended R21/Matrix-M, offers renewed hope for reducing malaria morbidity and mortality. The effectiveness of these vaccines, however, depends largely on caregivers’ awareness, knowledge, and attitudes. This study assessed caregivers’ knowledge and attitudes, and the factors associated with awareness of the malaria vaccine in the Tiko Health District of Cameroon. A community-based cross-sectional study was conducted among 410 caregivers of children aged 0-5 years who were selected using a multistage sampling technique. Data were collected using a structured pre-tested questionnaire. Descriptive statistics summarized participants’ characteristics, and knowledge and attitude scores were generated using a structured scoring system with a 60% cut-off defining adequate knowledge and positive attitudes. Logistic regression analysis identified factors independently associated with malaria vaccine awareness, with statistical significance set at p < 0.05. The median age of participants was 32 years (IQR: 27-40), and most were female (83.2%). Although 60.7% of caregivers had heard of the malaria vaccine, only 26.6% demonstrated adequate knowledge and 25.1% had positive attitudes. Healthcare workers were the primary source of vaccine information (35.4%). Caregivers whose children had a previous malaria episode were less likely to be aware of the vaccine (AOR: 0.55; 95% CI: 0.28-0.97). Conversely, caregivers who trusted health workers (AOR: 3.02; 95% CI: 1.83-4.99) and those who routinely attended childhood immunization services (AOR: 3.57; 95% CI: 2.27-5.60) were more likely to be aware of the vaccine. Caregivers in the Tiko Health District exhibited limited knowledge and generally negative attitudes toward the malaria vaccine. Strengthening health-worker engagement, improving communication during routine immunization services, and addressing gaps in caregivers’ understanding may enhance malaria vaccine uptake in the district.

PMID:41779824 | DOI:10.1371/journal.pgph.0004659