Categories
Nevin Manimala Statistics

Multimorbidity in the Brazilian adult population: Protocol for a systematic review and meta-analysis of prevalence

PLoS One. 2026 Mar 4;21(3):e0343166. doi: 10.1371/journal.pone.0343166. eCollection 2026.

ABSTRACT

Multimorbidity (MM), defined as the co-occurrence of multiple chronic conditions in a single individual, poses a major challenge to health systems. Its consequences include higher morbidity and mortality rates, reduced quality of life, and increased healthcare costs. Despite its substantial public health burden, no systematic reviews have comprehensively assessed the pooled prevalence of MM in Brazil. This manuscript outlines a protocol for a systematic review and meta-analysis aimed at estimating the prevalence of MM among community-dwelling adults in Brazil. We will conduct a systematic review and meta-analysis of population-based studies reporting MM prevalence in community settings. A comprehensive search will be performed in the PubMed, Scopus, Web of Science, Embase, LILACS, and SciELO databases. Two independent reviewers will screen articles, assess study quality using the Joanna Briggs Institute (JBI) Checklist for prevalence studies, and extract data. For the meta-analysis, pooled estimates will be calculated using random-effects models with Restricted Maximum Likelihood (REML) estimators to account for between-study variability. Heterogeneity will be assessed using the I² statistic and Cochran’s Q test. Subgroup analyses (e.g., age group, sex, region, and study type) will be conducted where feasible. Findings will be reported following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The protocol is registered with the International Prospective Register of Systematic Reviews (CRD42024389106). This review will provide comprehensive evidence on MM prevalence in Brazil, quantifying the burden of this problem to guide future research and inform public health strategies.
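The random-effects pooling described in the protocol can be sketched in a few lines. The example below uses hypothetical prevalence estimates and the closed-form DerSimonian-Laird estimator of the between-study variance (the protocol specifies iterative REML, which is omitted here to keep the sketch self-contained); Cochran's Q and I² are computed as standardly defined.

```python
import numpy as np

# Hypothetical prevalence estimates (proportions) and sample sizes from k studies.
p = np.array([0.22, 0.30, 0.25, 0.41, 0.35])
n = np.array([1200, 800, 2500, 600, 1500])

# Within-study variance of a proportion.
v = p * (1 - p) / n

# Fixed-effect (inverse-variance) weights and pooled estimate.
w = 1 / v
theta_fe = np.sum(w * p) / np.sum(w)

# Cochran's Q and the I² heterogeneity statistic.
Q = np.sum(w * (p - theta_fe) ** 2)
df = len(p) - 1
I2 = max(0.0, (Q - df) / Q) * 100

# DerSimonian-Laird between-study variance (REML would solve this iteratively).
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled prevalence.
w_re = 1 / (v + tau2)
theta_re = np.sum(w_re * p) / np.sum(w_re)
print(round(theta_fe, 3), round(theta_re, 3), round(I2, 1))
```

Note how the random-effects estimate sits between the study extremes and weights studies more evenly than the fixed-effect estimate whenever tau² is large.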

PMID:41779770 | DOI:10.1371/journal.pone.0343166


Management of Nail Toxicities From Fibroblast Growth Factor Receptor Inhibitors

J Drugs Dermatol. 2026 Mar 1;25(3):263-267. doi: 10.36849/JDD.9496.

ABSTRACT

BACKGROUND: Alterations in fibroblast growth factor receptor (FGFR) signaling are present in many malignancies, including urothelial carcinoma, cholangiocarcinoma, and gastrointestinal cancers, and FGFR inhibitors (FGFRi) play an increasing role in the treatment of these malignancies. Nail toxicities, such as onycholysis, paronychia, and nail fragility, are an important yet underrecognized and poorly characterized part of the adverse effect profile of FGFRi.

METHODS: We conducted a systematic literature review using PubMed and Google through March 2025, including case reports, trials, and retrospective studies reporting FGFRi-related nail disorders. Search terms included individual FGFRi (e.g., erdafitinib, pemigatinib, futibatinib) and nail-related adverse events. Data on incidence, severity (CTCAE v5.0), onset, management, and treatment impact were extracted. Statistical analyses included the Wilcoxon and Chi-square tests.

RESULTS: Twenty-three studies with 1,561 patients were analyzed. Out of these, 540 patients experienced nail toxicity. Erdafitinib had the highest nail toxicity rate (43.3%) and derazantinib the lowest (5.3%). Grade 1–2 events were most common; Grade 3 events prompted dose reduction in three patients out of 540, though no treatment discontinuations were reported. Common management strategies included antiseptic soaks, topical steroids, oral antibiotics, and protective nail care practices.

DISCUSSION/CONCLUSION: The incidence of FGFRi-associated nail toxicities varies by agent and can affect quality of life and treatment adherence. The pathogenesis remains unclear, and no predictive biomarkers exist. Further research into optimized management and preventive strategies is needed. Early recognition and proactive multidisciplinary management are essential to minimizing complications and maintaining oncologic treatment continuity.

PMID:41779762 | DOI:10.36849/JDD.9496


Clinical outcomes with lower versus conventional dose polymyxin B regimens in dialysis dependent and non-dialysis patients with gram-negative sepsis: A real-world propensity-score matched cohort study

PLoS One. 2026 Mar 4;21(3):e0342835. doi: 10.1371/journal.pone.0342835. eCollection 2026.

ABSTRACT

BACKGROUND: Polymyxin B remains a key treatment option for infections caused by multidrug-resistant gram-negative bacilli, particularly in critically ill patients. However, its optimal dosing strategy remains uncertain, especially in patients undergoing renal replacement therapy. This study aimed to compare the clinical and microbiological outcomes of low-, usual-, and high-dose polymyxin B in a real-world ICU population.

METHODS: This 5-year retrospective cohort study included critically ill adult patients with gram-negative sepsis who received polymyxin B. Patients were categorized into low-, usual-, and high-dose groups based on loading and total daily maintenance doses. Pairwise propensity score matching was performed to adjust for baseline differences. The primary outcome was 28-day all-cause mortality. Secondary outcomes included microbiological clearance, ventilator-free days, ICU-free days, and vasopressor-free days. Subgroup and sensitivity analyses were conducted, including among patients requiring dialysis. All statistical analyses were performed using R software.
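The core of pairwise propensity score matching, as used in the methods above, can be sketched with hypothetical data: given propensity scores already estimated (e.g., from a logistic regression on baseline covariates), each treated patient is greedily matched 1:1 without replacement to the nearest control within a caliper. The scores and caliper below are illustrative, not from the study.

```python
import numpy as np

# Hypothetical propensity scores (probability of receiving the usual dose),
# assumed to be pre-estimated from baseline covariates.
ps_treated = np.array([0.62, 0.48, 0.71, 0.55])
ps_control = np.array([0.60, 0.50, 0.45, 0.70, 0.30, 0.56])

caliper = 0.05  # maximum allowed propensity-score distance for a match

matches = {}   # treated index -> matched control index
used = set()   # controls already matched (matching without replacement)
for i, p in enumerate(ps_treated):
    dist = np.abs(ps_control - p)
    dist[list(used)] = np.inf      # exclude controls already used
    j = int(np.argmin(dist))
    if dist[j] <= caliper:         # accept only matches inside the caliper
        matches[i] = j
        used.add(j)

print(matches)
```

Treated patients with no control inside the caliper are simply left unmatched, which is what produces the smaller matched cohorts typical of such analyses.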

RESULTS: A total of 674 patients were included. After matching, usual-dose polymyxin B was associated with significantly higher 28-day mortality than the low-dose group (61% vs. 48.04%; HR = 1.47; 95% CI 1.11-1.95; p = 0.007). Vasopressor-, ventilator-, and ICU-free days were also significantly higher in the low-dose group compared with the other groups. No significant survival advantage was observed with high-dose regimens. Among dialysis-dependent patients (n = 254), mortality did not differ significantly across dosing groups, though microbiological clearance was better with low dosing. Sensitivity and subgroup analyses supported the robustness of these results.

CONCLUSION: Low-dose polymyxin B regimens were associated with lower mortality and comparable clinical outcomes relative to higher doses and may be feasible in critically ill patients with renal impairment. However, these findings should be interpreted cautiously given the observational design and potential residual confounding, warranting confirmation in future randomized trials.

PMID:41779724 | DOI:10.1371/journal.pone.0342835


Association between intrinsic capacity trajectories and risk of stroke incidence in middle-aged and older Chinese adults: Evidence from a nationwide prospective cohort study based on CHARLS

PLoS One. 2026 Mar 4;21(3):e0342480. doi: 10.1371/journal.pone.0342480. eCollection 2026.

ABSTRACT

BACKGROUND: Stroke is a major public health concern and a leading cause of disability and death in aging populations. Intrinsic capacity (IC), a concept introduced by the World Health Organization, reflects an individual’s overall functional ability across multiple domains including cognition, psychological well-being, mobility, vitality, and sensory function. IC has emerged as a core metric within the healthy aging framework, but its prospective relationship with stroke risk remains unclear. A deeper understanding of this link may inform early, function-based prevention strategies.

METHODS: This study used data from 10,751 participants aged 45 years or older from the China Health and Retirement Longitudinal Study (CHARLS). Cox proportional hazards models were used to estimate the association between IC and incident stroke, with stepwise adjustment for demographic, behavioral, and health-related covariates. Modeling IC as a continuous variable enabled examination of linear trends, while quartile-based classification allowed evaluation of potential non-linear associations and improved interpretability. Kaplan-Meier curves and log-rank tests were used to compare stroke-free survival across IC quartiles. Restricted cubic spline analysis was performed to explore the presence of a non-linear association between IC and stroke risk. Robustness was tested through sensitivity analyses excluding participants with baseline cognitive impairment and those aged ≥80 years. Statistical analyses were conducted using Stata and R.
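The Kaplan-Meier estimator used to compare stroke-free survival across IC quartiles reduces to a simple product over event times: at each distinct event time t, survival is multiplied by (1 - d_t / n_t), where d_t is the number of events at t and n_t is the number still at risk. The follow-up times below are hypothetical, purely to make the recursion concrete.

```python
import numpy as np

# Hypothetical follow-up times (years) and event indicators (1 = incident stroke).
time = np.array([1.0, 2.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
event = np.array([1, 0, 1, 1, 0, 1, 0, 0])

# Kaplan-Meier: step the survival curve down at each distinct event time.
surv = 1.0
km = {}
for t in np.unique(time[event == 1]):
    n_at_risk = np.sum(time >= t)             # still under observation just before t
    d = np.sum((time == t) & (event == 1))    # events occurring exactly at t
    surv *= 1 - d / n_at_risk
    km[float(t)] = surv

print(km)
```

Censored subjects (event = 0) leave the risk set without stepping the curve down, which is why the estimator handles incomplete follow-up gracefully.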

RESULTS: Over a 7-year follow-up, 243 participants (2.26%) experienced incident stroke. Stroke incidence decreased progressively with increasing IC levels, from 4.84% in the lowest quartile to 0.46% in the highest. Kaplan-Meier analysis showed significantly lower cumulative stroke incidence among individuals with higher IC (log-rank p < 0.001). In fully adjusted Cox models, each one-point increase in IC was associated with a 35.1% reduction in stroke risk (HR = 0.649; 95% CI: 0.599-0.702). Compared to the lowest IC quartile, the highest quartile had an 89.6% lower stroke risk (HR = 0.104; 95% CI: 0.055-0.197). Restricted cubic spline models confirmed a predominantly linear inverse association, with a steeper risk gradient at lower IC levels. Subgroup analyses revealed stronger protective associations in women, older adults (≥60 years), urban residents, and non-smokers or non-drinkers. Results remained consistent across all sensitivity analyses.
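The percent risk reductions quoted above follow directly from the reported hazard ratios, since a hazard ratio HR corresponds to a 100 × (1 - HR) percent reduction in instantaneous risk:

```python
# Percent risk reduction implied by a hazard ratio: 100 * (1 - HR).
hr_per_point = 0.649   # per one-point increase in IC (from the abstract)
hr_q4_vs_q1 = 0.104    # highest vs. lowest IC quartile (from the abstract)

print(round(100 * (1 - hr_per_point), 1))  # per-point reduction, %
print(round(100 * (1 - hr_q4_vs_q1), 1))   # quartile reduction, %
```

These reproduce the 35.1% and 89.6% figures reported in the results.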

CONCLUSIONS: Higher IC was independently associated with a significantly reduced risk of incident stroke, underscoring IC’s potential as a holistic, function-based indicator of cerebrovascular vulnerability. These findings provide empirical support for the World Health Organization’s healthy aging framework, emphasizing IC as a modifiable reserve that reflects early, multidomain functional decline before clinical disease onset. Incorporating IC into routine screening and prevention strategies may enhance early identification of high-risk individuals and enable more targeted, function-oriented interventions, thereby promoting healthy aging and helping to reduce the future burden of stroke.

PMID:41779721 | DOI:10.1371/journal.pone.0342480


Pre-analytical errors in a high-volume Bangladeshi diagnostic centre: Prevalence, workload impact, and mitigation strategies

PLoS One. 2026 Mar 4;21(3):e0341908. doi: 10.1371/journal.pone.0341908. eCollection 2026.

ABSTRACT

BACKGROUND: Pre-analytical errors are the most frequent cause of laboratory mistakes, accounting for nearly half of all diagnostic inaccuracies worldwide. These errors can invalidate test results, delay clinical decisions, and waste valuable healthcare resources, particularly in resource-limited, high-volume diagnostic laboratories. This study aimed to assess the prevalence, contributing factors, and severity of pre-analytical errors in a large diagnostic centre in Bangladesh.

METHODS: An observational, cross-sectional study was conducted over two months in the Biochemistry and Immunology Laboratories of a high-volume diagnostic centre in Dhaka, Bangladesh. Data from 195 documented pre-analytical errors and a structured survey of 27 laboratory staff were analysed. Errors were classified into minor, moderate, or major using definitions adapted from ISO 15189:2022 and WHO guidelines. Descriptive statistics and Chi-square tests were performed to explore associations between workload level (≥ 931 samples/day) and error frequency, with p < 0.05 considered statistically significant.

RESULTS: The most frequent errors were sample misplacement (38.5%) and incorrect labelling (17.9%). The sample collection (42.6%) and pick-and-drop (38.5%) units contributed the majority of errors. Morning shifts (65.1%) and high-workload days (70.8%) showed higher error frequencies, with a statistically significant association between workload and error occurrence (χ² = 121.093, p < 0.001). Major errors accounted for 37.4% of incidents.
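The workload-error association reported above rests on a chi-square test of a 2×2 contingency table. The counts below are hypothetical (the abstract does not publish its raw table), but they show the closed-form statistic for a 2×2 table without continuity correction:

```python
# Hypothetical 2x2 table: samples with vs. without a pre-analytical error,
# on high-workload vs. normal-workload days (counts are illustrative only).
#                    error   no error
a, b = 138, 862    # high-workload days
c, d = 57, 1943    # normal-workload days

n = a + b + c + d
# Chi-square for a 2x2 table: chi2 = n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
print(round(chi2, 2))
```

Any value this far above the 1-df critical value of 10.83 corresponds to p < 0.001, the significance level the study reports.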

CONCLUSION: Pre-analytical errors remain a critical threat to diagnostic accuracy in resource-limited laboratories. Improving workflow organization, implementing barcoding and automation, and strengthening staff training and workload management can substantially reduce error rates and enhance patient safety in high-throughput clinical settings.

PMID:41779717 | DOI:10.1371/journal.pone.0341908


Genetic variations and clinical implications of β-thalassemia in the Iraqi population

PLoS One. 2026 Mar 4;21(3):e0344034. doi: 10.1371/journal.pone.0344034. eCollection 2026.

ABSTRACT

β-thalassemia is a prevalent genetic disorder in Iraq, leading to significant health issues due to reduced hemoglobin production. DNA sequencing was used to explore genetic variations and their clinical implications. Our findings have the potential to inform diagnosis, guide targeted therapeutic approaches, and enhance genetic counseling to reduce long-term morbidity in affected individuals. Peripheral blood samples were collected from 100 patients for analysis. Quantitative measurements included complete blood count (CBC), ferritin, parathyroid hormone (PTH), lactate dehydrogenase (LDH), 25-hydroxyvitamin D, phosphorus, calcium, and bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA). We detected 18 β-globin mutations in β-thalassemia patients by direct Sanger sequencing, of which two were novel (HBB:c.315+108A>G and HBB:c.316-151A>G). Additionally, four mutations (IVS-II-1G>A, IVS-II-5G>C, IVS-I-110G>A, and HBB:c.440A>C) have been previously reported as pathogenic. This research pinpointed four pathogenic β-globin gene mutations with significant associations with clinical parameters. Hematological parameters (HGB, WBC, RBC indices, RBC count) and biochemical/metabolic markers (phosphorus, PTH, LDH, ferritin, vitamin D3, calcium, ALP) exhibited strong statistical differences (p < 0.001-0.020) across mutation groups. Bone health markers (BMC, BMD) and red blood cell indices (MCH, MCHC, MCV, MPV) also showed significant variation (p < 0.001-0.002). In contrast, platelet count (PLT) did not differ significantly (p = 0.331). These findings highlight mutation-specific impacts on hematological, metabolic, and skeletal systems in the studied population.

PMID:41779715 | DOI:10.1371/journal.pone.0344034


Identifying high-risk combinations of metformin during COVID-19

PLoS One. 2026 Mar 4;21(3):e0343979. doi: 10.1371/journal.pone.0343979. eCollection 2026.

ABSTRACT

BACKGROUND: There is a lack of research addressing associations of antidiabetic drug combinations with COVID-19 deaths. We examined whether adding common second-line agents to metformin was associated with COVID-19 mortality risk to inform clinical decision-making when escalating diabetes treatment.

METHODS: This is a nationwide retrospective analysis covering the years 2020 and 2021. Data from the National Diabetes Registry (CroDiab) were linked to primary healthcare data, Causes of Death Registry data, and the SARS-CoV-2 vaccination database. Multivariable logistic regression models were developed to compare each combination with metformin monotherapy. To address confounding, inverse probability of treatment weighting (IPTW) analyses were performed, both unweighted-propensity and with stabilized weights.
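The stabilized weights mentioned in the methods can be sketched with hypothetical data: the unstabilized IPTW weight is the inverse probability of the treatment actually received, and stabilization multiplies it by the marginal probability of that treatment, shrinking extreme weights. The indicators and propensity scores below are illustrative, not from the registry.

```python
import numpy as np

# Hypothetical treatment indicators (1 = combination, 0 = metformin alone)
# and estimated propensity scores P(treated | covariates).
treated = np.array([1, 1, 0, 0, 1, 0])
ps = np.array([0.80, 0.55, 0.40, 0.20, 0.65, 0.35])

# Unstabilized IPTW weights: 1/ps for treated, 1/(1 - ps) for controls.
w = np.where(treated == 1, 1 / ps, 1 / (1 - ps))

# Stabilized weights: multiply by the marginal probability of the
# treatment actually received, which dampens extreme weights.
p_treat = treated.mean()
sw = np.where(treated == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

print(np.round(sw, 3))
```

The weighted outcome model (here, logistic regression for COVID-19 death) is then fit on the pseudo-population defined by these weights.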

RESULTS: Of 141,014 analyzed patients, 1,268 (0.90%) died of COVID-19 over the two years. In weighted analyses, the drug combinations showing statistically significant associations with COVID-19 death compared with metformin alone were metformin + DPP-4 inhibitor (OR 1.182, 95% CI 1.016-1.376), metformin + sulfonylurea (OR 1.195, 95% CI 1.015-1.406), and metformin + GLP-1 agonist (OR 2.992, 95% CI 2.117-4.229).

CONCLUSIONS: Some combinations of metformin with second-line antidiabetic drugs might require caution in the context of chronic type 2 diabetes mellitus therapy and COVID-19-related deaths. Findings should be interpreted as hypothesis-generating signals from real-world data rather than evidence of causal drug effects. Further research is needed, especially for the metformin + GLP-1 agonist combination, as well as head-to-head comparisons of combination therapies.

PMID:41779712 | DOI:10.1371/journal.pone.0343979


Clinical Outcomes in Double-Exposed Chronic Lymphocytic Leukemia Patients in Italy

Hematol Oncol. 2026 Mar;44(2):e70184. doi: 10.1002/hon.70184.

ABSTRACT

B-cell receptor inhibitors (BCRi) and the B-cell lymphoma-2 inhibitor (BCL2i) have improved outcomes for patients with chronic lymphocytic leukemia (CLL), but relapse after both classes of inhibitors still represents an unmet clinical need. This multicenter real-world study analyzes outcomes of a cohort treated in Italy between May 2017 and September 2023 following prior exposure to both BCRi and BCL2i. The median follow-up after venetoclax initiation was 47 months (IQR 28-56). Of 153 double-exposed patients, 104 (68%) discontinued venetoclax and 53 of them (51%) received a subsequent treatment. Venetoclax was discontinued due to progressive disease (PD) in 51/104 cases (49.0%), with nine deaths occurring rapidly after PD without the administration of any further treatment. Fifty-three patients received treatment after venetoclax: 29/53 (54.7%) received inhibitors (13 cBTKi, 11 idelalisib, 2 BCL2i, 3 non-covalent BTKi), 19/53 (35.8%) received chemoimmunotherapy (CT: 16 intensive, 3 palliative), and 5/53 (9.4%) received hematopoietic stem cell transplantation (HSCT). The overall response rate was 50%; median event-free survival (EFS) in the inhibitor, CT, and HSCT groups was 11, 2, and 10 months, respectively (p < 0.0001); median overall survival (OS) was 12, 5, and 10 months, respectively (p = 0.020). Disease progression during venetoclax treatment was associated with shorter subsequent EFS compared to discontinuation for other reasons, although the finding did not reach statistical significance (median EFS 4 vs. 10 months; p = 0.11). No decrease in EFS was associated with del17p and/or TP53 mutations, the use of venetoclax monotherapy, or previous treatment with one versus multiple BCRi. Despite its limitations, this real-world study provides additional insights into double-exposed patients, who still pose a clinical challenge, demonstrating the superior efficacy of inhibitors over alternative treatment options. Enrollment in clinical trials and treatment with novel molecules, if available, may help address this unmet clinical need.

PMID:41778381 | DOI:10.1002/hon.70184


Demodex parasite density in patients with melasma: a case-control study

Cutan Ocul Toxicol. 2026 Mar 4:1-6. doi: 10.1080/15569527.2026.2639710. Online ahead of print.

ABSTRACT

BACKGROUND: Although multiple factors contribute to the development of melasma, there are reports suggesting a potential role of Demodex parasites in hyperpigmentation. This study aimed to compare the density and prevalence of Demodex infestation between patients with melasma and healthy controls.

METHODS: This case-control study included 35 melasma patients and 35 healthy volunteers. Standard superficial skin biopsies using cyanoacrylate adhesive were taken from the malar regions. Samples were examined via light microscopy, with a density of ≥5 Demodex/cm² defined as positive.

RESULTS: No statistically significant difference was found between the melasma and control groups in terms of Demodex mite density or positivity rates. Correlation analysis revealed no significant relationship between mMASI scores and Demodex mite density. As a secondary finding, the mean mMASI score was significantly higher in male participants compared to female participants.

CONCLUSION: In this case-control study, we found no statistically significant association between Demodex parasite density and melasma in our study population. While our findings do not support an association in this sample, future large-scale and multicenter studies could further explore the potential role of Demodex in skin disorders that have been suggested by other reports. The single-center design and moderate sample size should be considered when interpreting these results.

PMID:41778367 | DOI:10.1080/15569527.2026.2639710


Factors influencing time to speech processor upgrades

Cochlear Implants Int. 2026 Mar 4:1-10. doi: 10.1080/14670100.2026.2635191. Online ahead of print.

ABSTRACT

OBJECTIVE: Cochlear implant (CI) speech processors have undergone continual technological advancement. Patients therefore upgrade their speech processors when new features become available or when the existing device breaks and is no longer serviceable. This study aimed to identify factors that affect the time to a speech processor upgrade and to evaluate patient experiences during and after upgrading.

METHODS: In this retrospective cohort study, 46 patients who underwent CI surgery at a single tertiary care center in 2017 and subsequently received a speech processor upgrade were included. Data on patient demographics, hearing loss history, CI manufacturer, insurance type and status, and device configuration were collected. Time to first upgrade, reasons for upgrade, patient-reported satisfaction, and speech perception scores were analyzed.

RESULTS: The mean time to a speech processor upgrade was 5.13 years after implantation. The most common reason for an upgrade was the device being over five years old, followed by the device being out of warranty. Overall, 45.7% of patients expressed satisfaction with the speech processor upgrade, while 8.7% were not satisfied. There were no statistically significant associations between the time to upgrade and demographic factors such as age, sex, insurance type, or CI manufacturer. Following the upgrade, there were no significant changes in speech perception scores.

CONCLUSION: Speech processor upgrades at this center align with when insurance companies typically deem upgrades medically necessary. Demographic factors, insurance, and device manufacturer did not significantly influence time to upgrade. While objective speech perception measures did not significantly improve, many patients reported subjective satisfaction with the upgrade.

PMID:41778354 | DOI:10.1080/14670100.2026.2635191