Categories
Nevin Manimala Statistics

Longitudinal Analysis of Fluoride Levels in Irish Water Supplies: A 52-Year Review

Community Dent Oral Epidemiol. 2026 Jan 28. doi: 10.1111/cdoe.70055. Online ahead of print.

ABSTRACT

BACKGROUND/AIM: The Health (Fluoridation of Water Supplies) Act of 1960 in Ireland mandates monthly fluoride sampling in Public Water Supplies (PWS). In 2007, authorities adjusted the mandated fluoride concentration from 0.8-1.0 to 0.6-0.8 mg/L. Approximately 71% of the Irish population has access to fluoridated drinking water. This study aimed to analyse fluoride measurements in Irish water supplies over five decades (1964-2016) to assess compliance with, and the effectiveness of, the fluoridation programme.

METHODS: Data were sourced from government records and Environmental Protection Agency (EPA) reports. Analysis focused on fluoride concentration measurements, compliance rates, and data completeness across public, private, and group water supplies. Descriptive statistics were used to evaluate trends and patterns in fluoride levels over time.

RESULTS: By 2000, over 90% (n = 307) of PWS, each serving more than 1000 persons, were fluoridated. In the early monitoring period (1964-69), missing data were substantial at 66%, with satisfactory fluoride results (0.80-1.00 mg/L) at only 17% and marginal results (0.70-0.80 and > 1.00-1.10 mg/L) at 15%. Compliance improved steadily, reaching peak performance in 1994-99 with 57% satisfactory results. Following the 2007 adjustment in target concentrations, missing data decreased significantly to 18%, with satisfactory results (0.60-0.80 mg/L) increasing from 40% to 49% and marginal results (0.50-0.60 and > 0.80-0.90 mg/L) stabilising at 7%-13%. Analysis of private and group supplies revealed evolving trends: from 2000 to 2006, 21% of fluoride testing results were satisfactory and 75% marginal, while the 2007-2016 period showed 39% satisfactory and 48% unsatisfactory results, though only 1% exceeded 0.9 mg/L.

CONCLUSION: The fluoride control in PWS has been largely effective, with consistent improvements in monitoring practices and compliance with target levels over the study period.

PMID:41606420 | DOI:10.1111/cdoe.70055

Use of a Cystatin C-Based GFR Equation in a Population Pharmacokinetic Model of Methotrexate Clearance in Adult Patients with Lymphoma

Clin Pharmacokinet. 2026 Jan 29. doi: 10.1007/s40262-026-01618-4. Online ahead of print.

ABSTRACT

BACKGROUND: High-dose methotrexate (HDMTX) is a key treatment for lymphoma with central nervous system involvement. Whether incorporating cystatin C into glomerular filtration rate estimation improves methotrexate (MTX) clearance prediction remains unclear.

OBJECTIVES: We aimed to evaluate whether cystatin C-inclusive glomerular filtration rate equations improve MTX clearance prediction and to explore the relationship between MTX exposure and acute kidney injury (AKI) in adult patients with lymphoma receiving HDMTX.

METHODS: This was a prospective single-center study of 80 adult patients with lymphoma receiving HDMTX (1.5-8 g/m2) over a 4-h infusion. A population pharmacokinetic model was constructed using data from 80 administrations of HDMTX and 427 serum MTX concentrations. Model-estimated MTX concentrations were then included in a logistic regression to assess the relationship between MTX exposure and AKI.

RESULTS: A two-compartment model best described the pharmacokinetic data, with baseline albumin and CKD-EPI creatinine-cystatin C (eGFRCr-CysC) as significant covariates on clearance. Seventeen patients (21%) developed any-stage AKI. Among those receiving ≤ 3.5 g/m2, model-estimated 4-h MTX concentrations were associated with AKI (odds ratio: 1.02 per µmol/L; p = 0.0038), with an optimal threshold of 160 µmol/L (area under the receiver operating characteristic curve: 0.818). Patients above this threshold were 22 times more likely to experience AKI (p = 0.0005). This association was not observed in patients treated with 8 g/m2. Despite a lower dose and exposure, patients receiving ≤ 3.5 g/m2 demonstrated a stronger concentration-toxicity relationship.
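The per-unit odds ratio of 1.02 per µmol/L compounds multiplicatively over a concentration difference. A minimal sketch of that arithmetic (illustrative only; the 22-fold figure above comes from the dichotomized threshold analysis, not from this per-unit extrapolation):

```python
import math

# Reported per-unit odds ratio for AKI: 1.02 per umol/L of 4-h MTX concentration
OR_PER_UNIT = 1.02
beta = math.log(OR_PER_UNIT)  # implied log-odds slope per umol/L

def odds_ratio(delta_conc):
    """Odds ratio implied by a concentration difference of delta_conc umol/L."""
    return math.exp(beta * delta_conc)

# A 100 umol/L difference (e.g., 60 vs. 160 umol/L, the reported threshold)
# multiplies the odds of AKI by 1.02**100, roughly 7-fold under this model.
ratio_100 = odds_ratio(100)
```

Because the per-unit model is log-linear, seemingly small per-unit odds ratios translate into large cumulative effects across clinically relevant concentration ranges.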

CONCLUSIONS: Our results support the use of cystatin C-inclusive glomerular filtration rate estimates in MTX pharmacokinetic modeling and suggest early MTX concentration sampling may identify AKI risk, enabling proactive, AKI-mitigating clinical interventions during HDMTX therapy.

PMID:41606413 | DOI:10.1007/s40262-026-01618-4

Evaluating the effect of cryotherapy and low-level laser therapy on postoperative pain and quality of life in patients with symptomatic apical periodontitis: a randomized controlled clinical study

Lasers Med Sci. 2026 Jan 29;41(1):17. doi: 10.1007/s10103-026-04809-4.

ABSTRACT

The aim of this randomized controlled clinical study was to compare the effects of intracanal cryotherapy and intraoral Low-Level LASER Therapy (LLLT) on postoperative pain and quality of life in patients undergoing endodontic treatment for symptomatic apical periodontitis. After ethical clearance and registration of the trial with the clinical trial registry of India, a randomized, parallel-controlled clinical study with 2 test arms and 1 control arm was conducted. Ninety subjects diagnosed with acute apical periodontitis in molar teeth who met the inclusion and exclusion criteria were enrolled. After written informed consent was obtained, the first researcher randomly allocated the subjects into 3 arms: intracanal cryotherapy, intraoral LLLT, and control. Preoperative pain scores were recorded on the Heft-Parker visual analog scale (HP-VAS) by the second researcher, and preoperative quality of life was recorded by the subjects on the Oral Health Impact Profile-17 (OHIP-17) questionnaire. Root canal treatment was performed for all enrolled subjects, and the interventions were carried out accordingly by a single operator. Patients were instructed to record their postoperative pain levels and analgesic intake at 24 h, 48 h, and on the third, fifth, and seventh days; on the seventh day, the postoperative quality-of-life questionnaire was completed by the subjects. The data were tabulated in MS Excel and statistically analyzed using the EPI-INFO software. At 24 h, postoperative pain was significantly lower in the cryotherapy arm than in the LLLT arm; however, no statistically significant differences in postoperative pain were found between the two test arms at 48 h or on the third, fifth, and seventh days. Subjects in the control arm were relieved of postoperative pain after the fifth day. Subjects in the cryotherapy arm required fewer analgesics, with intake limited to the first 24 h postoperatively, whereas participants in the LLLT arm needed analgesics for the initial 48 h, and subjects in the control arm had the longest duration of analgesic use, up to the fifth day. Quality of life improved postoperatively in all subjects after seven days, irrespective of additional interventions. Cryotherapy and LLLT were found to be effective at 24 h and 48 h, respectively, in the management of postoperative endodontic pain. These findings highlight the potential benefits of cryotherapy and LLLT compared with standard postoperative care alone; as simple and innocuous modalities, both could be utilized to manage post-endodontic pain.

PMID:41606397 | DOI:10.1007/s10103-026-04809-4

Correlation between MASCC score and the evolution of febrile neutropenia in patients with solid tumors: a retrospective study

Support Care Cancer. 2026 Jan 29;34(2):141. doi: 10.1007/s00520-026-10375-w.

ABSTRACT

BACKGROUND: The MASCC (Multinational Association for Supportive Care in Cancer) score is widely used to identify low-risk febrile neutropenia (FN) patients eligible for outpatient management. However, its performance specifically in patients with solid tumors remains insufficiently validated.

METHODS: We conducted a retrospective cohort study at the Centre Hospitalier Universitaire de Sherbrooke (CHUS) between 2011 and 2022. Adult patients admitted for FN secondary to chemotherapy for solid tumors were included. Patients were classified as high-risk (MASCC < 21) or low-risk (MASCC ≥ 21). The primary outcome was the score’s ability to predict an uncomplicated clinical course with a specificity of 95%. Secondary outcomes included ICU admission, mortality, duration of hospitalization, intravenous antibiotics, neutropenia, and potential days saved with outpatient treatment.

RESULTS: Among 329 oncologic patients, 227 (69%) were classified as low risk. The MASCC score showed a sensitivity of 83.5% (95% CI 77.8-88.2%) and a specificity of 57.3% (95% CI 47.8-66.4%) for predicting the absence of complications. ICU admission rates were significantly lower among low-risk patients (0.4% vs. 32.7%, p < 0.001), as were mortality rates (0.9% vs. 16.8%, p < 0.001). Median hospitalization duration was 4 days [IQR (interquartile range) 3-6] for low-risk patients compared to 6 days [IQR 4-10] for high-risk patients (p < 0.001). Applying outpatient eligibility criteria could have prevented 486 hospitalization days across 161 patients, with 80.7% experiencing no complications.
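The sensitivity and specificity above follow from the standard 2x2 contingency-table definitions. A minimal sketch with hypothetical counts (the abstract does not report the full table, so the numbers below are for illustration only):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen for illustration, NOT the study's actual table:
# "positive" here means an uncomplicated course correctly flagged as low risk.
sens, spec = sens_spec(tp=167, fn=33, tn=63, fp=47)
# sens is 167/200 = 0.835; spec is 63/110, about 0.57
```

The low specificity means many patients who went on to have complications were still scored as low risk, which is why the abstract stresses that clinical judgment remains essential.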

CONCLUSION: The MASCC score does not accurately identify solid tumor FN patients who would evolve without complications, given its moderate specificity. However, it remains associated with a substantial reduction in hospitalization burden among low-risk patients. Clinical judgment remains essential in outpatient management decisions. Integrating additional clinical parameters may further improve risk stratification in this population.

PMID:41606351 | DOI:10.1007/s00520-026-10375-w

A cross-population compendium of gene-environment interactions

Nature. 2026 Jan 28. doi: 10.1038/s41586-025-10054-6. Online ahead of print.

ABSTRACT

Environmental differences in genetic effect sizes, namely, gene-environment interactions, may uncover the genetic encoding of phenotypic plasticity [1-3]. We provide a cross-population atlas of gene-environment interactions comprising 440,210 individuals from European and Japanese populations, with replication in 539,794 individuals from diverse populations. By decomposing the contributions from age, sex and lifestyles, we delineate the aetiology of these gene-environment interactions, including reverse causality arising from a disease-related dietary change. Genome-wide analyses uncovered missing heritability and trait-trait relationships connected by the synergistic effects of genome and environments, which systematically affected polygenic prediction accuracy and cross-population portability. Single-cell projection revealed an aging-related shift in the pathways and cell types responsible for genetic regulation. Omics-level gene-environment analyses identified multiple sex-discordant genetic effects in lipid metabolism, informing clinical trial failures for genetically supported drug development. Our comprehensive gene-environment study decodes the dynamics of genetic associations, offering insights into complex trait biology, personalized medicine and drug development.

PMID:41606330 | DOI:10.1038/s41586-025-10054-6

Dental Health and Survival Following Surgery for Esophageal Cancer

Ann Surg Oncol. 2026 Jan 28. doi: 10.1245/s10434-026-19101-6. Online ahead of print.

ABSTRACT

BACKGROUND: Most patients who undergo esophagectomy for esophageal cancer develop tumor recurrence and die within 5 years of surgery. An impact of preoperative dental status on survival in this patient group has been suggested but remains uncertain.

METHODS: This national Swedish cohort study included 871 patients who underwent esophagectomy for esophageal cancer (adenocarcinoma or squamous cell carcinoma) between 2011 and 2020 and were followed up until 2024. The exposure was the number of remaining teeth, based on dentist visits within 5 years before the esophagectomy, categorized into five (quintiles) and 10 (deciles) groups. The main outcome was all-cause 5-year mortality. Data were retrieved from medical records and national complete registries. Multivariable Cox regression provided hazard ratios (HRs) with 95% confidence intervals (CIs), adjusted for age, sex, comorbidity, tumor histology, neoadjuvant therapy, education level, and pathological tumor stage.

RESULTS: There were no statistically significant associations between the number of remaining teeth and the risk of all-cause 5-year mortality, independent of the categorization of the number of teeth. The HR was 1.13 (95% CI 0.85-1.51) comparing the lowest quintile of remaining teeth (n = 0-19) with the highest quintile (n = 29-32), and the HR was 0.98 (95% CI 0.64-1.52) comparing the lowest decile of remaining teeth (n = 0-10) with the highest decile (n = 31-32). There were no statistically significant associations in any of the subgroup analyses of age, comorbidity, or education.

CONCLUSION: This study did not identify any association between the number of remaining teeth and the risk of 5-year mortality in patients who underwent esophagectomy for esophageal cancer.

PMID:41606299 | DOI:10.1245/s10434-026-19101-6

Misperception of supine sleep in the sleep laboratory: a retrospective review of self-reported versus polysomnography-measured sleep position

Sleep Breath. 2026 Jan 28;30(1):25. doi: 10.1007/s11325-025-03560-4.

ABSTRACT

PURPOSE: Accuracy of patient-perceived sleeping position has implications for the acquisition and interpretation of sleep studies, implementation of positive airway pressure (PAP) therapies, and determining efficacy of positional therapies at home. This study aimed to compare self-reported supine sleep to observed supine sleep on polysomnography (PSG) and identify patient-related factors that contribute to supine sleep misperception.

METHODS: A retrospective review of clinical PSG records from a public sleep service was performed. Body position was measured with a Grael position sensor with manual editing based on digital video if discrepant. Each patient completed a sleep questionnaire including questions regarding sleep position on the PSG. Self-reported supine sleep was divided into categories of “None/Some/Half/Most/All”, with PSG-measured supine sleep as a percentage of total sleep time (%TST) classified into these categories based on the cut-points of 0, 2.5, 33.3, 66.6, 97.5 and 100%. Absence of supine sleep on PSG was defined by the cut-point of “None” (≤ 2.5%TST). Chance-adjusted agreement between objective and subjective measures was assessed using the Cohen’s kappa statistic (linear weighting).
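The mapping from PSG-measured supine sleep (%TST) to the questionnaire categories can be sketched as below. The abstract lists only the cut-points (0, 2.5, 33.3, 66.6, 97.5 and 100%); the handling of values exactly on a boundary is an assumption, except for "None", which the abstract defines as ≤ 2.5%TST:

```python
def supine_category(pct_tst):
    """Classify PSG-measured supine sleep (% of total sleep time) into the
    self-report categories using the stated cut-points. Boundary handling
    beyond the defined "None" (<= 2.5%TST) cut-point is assumed."""
    if pct_tst <= 2.5:
        return "None"
    elif pct_tst <= 33.3:
        return "Some"
    elif pct_tst <= 66.6:
        return "Half"
    elif pct_tst <= 97.5:
        return "Most"
    return "All"
```

This puts the objective percentage on the same ordinal scale as the questionnaire responses, which is what makes a weighted kappa comparison possible.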

RESULTS: A total of 956 patient records were available for analysis. Of patients who self-reported supine sleep on the PSG, 93% were correct, whereas only 56% of those who denied supine sleep accurately reported its absence on PSG. Patients who reported not sleeping on their back during the PSG had 11.0 (95% CI 7.0-17.3; p < .001) times the odds of sleep position misperception compared with those who did report sleeping on their back. Agreement with the self-reported categories of “None/Some/Half/Most/All” was only moderate (kappa 0.45 [95% CI 0.43, 0.48]). Younger patients (< 55 years) and females were more likely to under-report supine sleep.
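A linearly weighted Cohen's kappa of the kind reported here is computed from paired ordinal labels, with partial credit for near-misses. A minimal pure-Python sketch on toy data (not the study's data):

```python
from collections import Counter

def weighted_kappa(a, b, k):
    """Cohen's kappa with linear weights for ordinal categories 0..k-1.
    Weight w(i, j) = 1 - |i - j|/(k - 1): full credit for exact agreement,
    partial credit that decays linearly with category distance."""
    n = len(a)
    w = lambda i, j: 1 - abs(i - j) / (k - 1)
    # Observed weighted agreement across the rating pairs
    p_o = sum(w(x, y) for x, y in zip(a, b)) / n
    # Expected weighted agreement under independence, from the marginals
    pa, pb = Counter(a), Counter(b)
    p_e = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Toy example with 3 ordinal categories; yields kappa = 0.625
kappa = weighted_kappa([0, 0, 1, 1, 2, 2], [0, 1, 1, 2, 2, 2], k=3)
```

With five ordinal categories, as in the None/Some/Half/Most/All scale, linear weighting penalizes "None" vs. "All" discrepancies far more than adjacent-category slips, so the 0.45 value reflects chance-adjusted agreement that accounts for how far off the self-reports were.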

CONCLUSIONS: Patients denying supine sleep in the laboratory were more likely to misperceive sleep position on PSG, with younger females under-reporting supine sleep. Understanding contributors to misperception may influence decisions regarding supine sleep sampling in the laboratory, as well as clinical decision-making.

PMID:41606288 | DOI:10.1007/s11325-025-03560-4

Success rates of trial of labor after cesarean delivery: the impact of prior vaginal deliveries on outcomes

Arch Gynecol Obstet. 2026 Jan 28;313(1):63. doi: 10.1007/s00404-025-08248-4.

ABSTRACT

OBJECTIVES: To estimate the success rates and risks of vaginal birth after cesarean delivery (VBAC) based on the number of prior successful VBACs.

METHODS: A retrospective cohort study of women with one prior cesarean section who attempted vaginal delivery between 2013 and 2022, using data from our Medical Center registry. Outcomes were compared based on the number of prior successful VBACs.

RESULTS: Among 2912 deliveries meeting the eligibility criteria, the success rate of VBAC increased with the number of prior VBACs: 73.2% for those with no prior VBAC, rising to 92.3%, 94.7%, 94.0%, and 97.0% for individuals with 1, 2, 3, 4, and 5 or more prior VBACs, respectively. The history of at least one prior VBAC was associated with a 5.17-fold higher likelihood of achieving VBAC success. However, no significant differences in success rates were observed between groups with higher numbers of prior VBACs (≥ 2) compared to individuals with only one prior VBAC. In addition, the duration of hospitalization for both mother and neonate was longer in cases with no prior VBAC history. There was also a higher risk of requiring blood transfusion in the group without a prior history of VBAC.

CONCLUSIONS: Women with prior successful VBAC have a high likelihood of achieving another successful VBAC. After two prior VBACs, the success rate remains stable. In addition, women with one or more previous VBACs experience a reduced risk of blood transfusion and shorter hospitalization durations for both the mother and newborn.

PMID:41606285 | DOI:10.1007/s00404-025-08248-4

Regulatory mechanisms of the trade-off between Th17 cells and Treg cells

J Biol Phys. 2026 Jan 29;52(1):6. doi: 10.1007/s10867-026-09701-4.

ABSTRACT

Regulatory T cells (Treg) and T helper 17 cells (Th17), both derived from naïve T cells, play pivotal roles in modulating immune responses, and their dynamic balance is critical for maintaining immune homeostasis. Existing studies predominantly focus on the regulatory mechanisms of individual cell types and lack a systematic analysis of how multiparametric interactions and stochastic perturbations jointly influence cell-fate equilibrium. In this study, we investigate the gene regulatory network of Treg and Th17 cells in two major aspects: (i) elucidating the dynamical features of the network and (ii) examining the regulatory effects of Gaussian white noise on the balance between the two lineages. By integrating systems dynamics, non-equilibrium mechanics, and stochastic process theory, we propose a unified modeling framework that incorporates Gaussian white noise to simulate stochastic perturbations in gene expression, thereby establishing a mapping between parameter sets and cellular phenotypes and quantifying the regulatory weights of key factors. Our results demonstrate that parameters such as extracellular TGF-β input, foxp3 mRNA synthesis rate, and Stat3 protein degradation rate significantly modulate the differentiation balance between Treg and Th17 cells. Furthermore, within a certain range, stronger Gaussian white noise promotes the differentiation of naïve T cells toward the Th17 lineage, thereby enhancing immune responsiveness. This finding aligns with prior experimental evidence demonstrating that stochastic noise can amplify immune response efficacy. This framework uniquely couples static and dynamic perturbations, revealing stochasticity’s role in cell-fate decisions and offering both a quantitative tool for studying Th17-Treg balance and a generalizable approach for other differentiation systems.

PMID:41606283 | DOI:10.1007/s10867-026-09701-4

Detection of selection signatures in indigenous African cattle reveals genomic footprints of adaptation, production and temperament traits

Mamm Genome. 2026 Jan 28;37(1):27. doi: 10.1007/s00335-026-10193-9.

ABSTRACT

Indigenous cattle account for approximately 80% of Uganda’s cattle population. These animals are well adapted to the country’s ten agroecological zones and are mainly kept under pastoral and agropastoral systems. Unlike commercial breeds, they thrive on low-quality feeds while tolerating major tropical diseases and parasites, including tsetse flies, ticks, and vector-borne infections. Whole-genome sequence (WGS) analysis offers opportunities to uncover genomic regions underlying these adaptations and to trace the genetic footprints of long-term breeding decisions taken by cattle keepers. In this study, WGS data from 95 animals representing six indigenous cattle populations (Ankole, Karamojong, Nganda10, Nganda17, Nkedi, and Ntuku) were analyzed to identify genomic regions under putative selection. Two complementary approaches were applied: enumeration of the µ-statistic in RAiSD and runs of homozygosity (ROH) analysis. RAiSD identified population-level signals, while conserved ROH regions were defined using breed-specific SNP-incidence thresholds. The two methods identified 803 and 49 candidate genes, respectively. The top genes identified included SLC37A1 (BTA1), CHCHD3 (BTA4), and RAB3GAP1 (BTA2) detected by RAiSD, and IL26 (BTA5), FBXL7 (BTA20), and HSPA9 (BTA7) contained in ROH. Furthermore, the regions harbored 107 novel genes (92 detected by RAiSD and 15 by ROH), corresponding to 255 quantitative trait loci. The identified genes under putative selection are associated with economically important traits including adaptation to tropical environments, resistance to parasites and diseases, and other farmer-preferred characteristics. These findings provide insights into the genetic basis of adaptation, selection and production in Ugandan indigenous cattle, supporting conservation and breeding strategies to enhance resilience and productivity.

PMID:41606260 | DOI:10.1007/s00335-026-10193-9