Categories
Nevin Manimala Statistics

Has a fast treatment transition from surgical to endovascular operations improved the survival of aneurysmal subarachnoid hemorrhage?

Acta Neurochir (Wien). 2025 Feb 4;167(1):34. doi: 10.1007/s00701-025-06447-1.

ABSTRACT

BACKGROUND: Several studies have attributed decreasing case fatality rates (CFRs) of aneurysmal subarachnoid hemorrhage (aSAH) to the gradually increasing use of endovascular treatment without considering improvements in other outcome-affecting factors. To assess the independent effect of a treatment modality on CFRs, we investigated CFR changes in a high-volume center rapidly transitioning from surgical to endovascular operations as the first-line treatment for all aSAH patients except those with middle cerebral artery (MCA) aneurysms.

METHODS: We identified all surgically/endovascularly treated aSAH patients in Helsinki University Hospital (HUH) during 2012-2017. As the treatment shift occurred in 2015, we defined two treatment eras: surgical (2012-2014) and endovascular (2015-2017). We compared time-dependent changes in 1-year CFRs between non-MCA and MCA patients using a Poisson regression model. To analyze consistency in operation rates, we also identified sudden-death and conservatively treated aSAHs in the HUH catchment area via two externally validated registers.

RESULTS: Of all 665 hospitalized aSAH cases in the HUH catchment area, 557 (84%) received operative treatment; 367 (66%) underwent surgical and 190 (34%) endovascular operations. Between the treatment eras, the proportion of non-MCA cases treated endovascularly increased from 21% to 79%, whereas 99% of MCA cases were treated surgically throughout the study period. Among the operatively treated patients, the 1-year CFRs decreased similarly in patients with non-MCA aneurysms (by 42%; from 14% to 8%; adjusted risk ratio (aRR) = 0.66, 95% CI 0.37-1.19) and MCA aneurysms (by 42%; from 15% to 9%; aRR = 0.66, 95% CI 0.16-1.60). The proportion of operatively treated patients, their clinical condition on admission, and the amount of bleeding on the first CT scan remained unchanged over time.

CONCLUSIONS: We found similar CFR decreases in aSAH groups that did and did not undergo a fast transition from surgical to endovascular operations, providing real-world evidence that endovascular treatment has only a small independent effect on the decreasing CFRs in high-volume centers.
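As an editorial illustration of the risk-ratio comparisons reported above, a crude case-fatality risk ratio with a 95% Wald confidence interval can be computed as below. The counts are hypothetical, not the study's data, and the study's aRRs come from an adjusted Poisson regression model rather than this crude calculation:

```python
import math

def risk_ratio(deaths_a, n_a, deaths_b, n_b):
    """Crude risk ratio (group B vs group A) with a 95% Wald CI on the log scale."""
    rr = (deaths_b / n_b) / (deaths_a / n_a)
    # Standard error of log(RR) for binomial proportions
    se = math.sqrt(1 / deaths_a - 1 / n_a + 1 / deaths_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 14 deaths/100 patients (era 1) vs 8 deaths/100 patients (era 2)
rr, lo, hi = risk_ratio(14, 100, 8, 100)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI spanning 1, as here, mirrors the abstract's non-significant aRRs.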

PMID:39904810 | DOI:10.1007/s00701-025-06447-1


Clinical implications of calcification severity adjacent to calcified nodule: Its association with first and recurrent risks of target lesion revascularization after percutaneous coronary intervention

Atherosclerosis. 2025 Jan 28;402:119116. doi: 10.1016/j.atherosclerosis.2025.119116. Online ahead of print.

ABSTRACT

BACKGROUND AND AIMS: Calcified nodule (CN) is a plaque phenotype characterized by protruding calcification and is associated with repeat revascularization after percutaneous coronary intervention (PCI). Greater calcification severity increases the risk of future target lesion revascularization (TLR). This study was conducted to determine whether calcification severity in the adjacent zone is associated with TLR.

METHODS: We analyzed 204 patients who received PCI for de-novo CN using intravascular ultrasound (IVUS). The calcium volume index (CVI) was calculated for each 1-mm cross-sectional frame in both the CN and adjacent zones.

RESULTS: TLR occurred in 63 patients (30.9%) during a median follow-up of 2.8 years (interquartile range, 2.4-3.2). CVIs in both the CN and adjacent zones, along with the minimum lumen area (MLA) after PCI, were significant predictors of TLR. ROC curve-derived cutoff values for the CVIs in the CN and adjacent zones (10.52 and 5.33, respectively) and for the MLA after PCI (6.65 mm²) were associated with a higher TLR incidence. Among patients requiring TLR, 27.0% experienced multiple TLRs, with higher CVIs associated with recurrence. In a multi-state model, CVIs in both the CN and adjacent zones were significantly associated with the first TLR (no TLR as reference) and the second TLR (first TLR as reference). The CVI in the adjacent zone showed a higher hazard ratio for the second TLR (1.31; 95% confidence interval [CI]: 1.16-1.48) than for the first TLR (1.12; 95% CI: 1.07-1.17).
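ROC-derived cutoffs like those above are commonly chosen by maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch of that selection follows; the scores and event labels are invented toy data, not the study's IVUS measurements:

```python
def youden_cutoff(scores, events):
    """Choose the threshold maximizing Youden's J = sensitivity + specificity - 1,
    classifying score >= threshold as high risk."""
    pos = sum(events)
    neg = len(events) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, e in zip(scores, events) if s >= t and e == 1)
        fp = sum(1 for s, e in zip(scores, events) if s >= t and e == 0)
        j = tp / pos - fp / neg  # sensitivity - (1 - specificity)
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Invented toy data: CVI-like scores with 0/1 TLR indicators
scores = [2.1, 3.0, 4.8, 5.5, 6.2, 7.9, 9.4, 11.0]
events = [0, 0, 0, 1, 0, 1, 1, 1]
cutoff, j = youden_cutoff(scores, events)
```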

CONCLUSIONS: Our findings highlight that calcification severity not only in the CN zone but also in the adjacent zones is important for predicting TLR.

PMID:39903951 | DOI:10.1016/j.atherosclerosis.2025.119116


Deciphering crucial salt-responsive genes in Brassica napus via statistical modeling and network analysis on dynamic transcriptomic data

Plant Physiol Biochem. 2025 Jan 29;220:109568. doi: 10.1016/j.plaphy.2025.109568. Online ahead of print.

ABSTRACT

Soil salinization severely impacts crop yields, threatening global food security. Understanding the salt stress response of Brassica napus (B. napus), a vital oilseed crop, is crucial for developing salt-tolerant varieties. This study aims to comprehensively characterize the dynamic transcriptomic response of B. napus seedlings to salt stress, identifying key genes and pathways involved in this process. RNA sequencing was performed on 43 B. napus seedling samples, including 24 controls and 19 salt-stressed plants, at time points of 0, 1, 3, 6, and 12 h. Differential expression analysis using 33 control experiments (CEs) identified 39,330 differentially expressed genes (DEGs). Principal component analysis (PCA) and a novel penalized logistic regression with k-Shape clustering (PLRKSC) method identified 346 crucial DEGs. GO enrichment, differential co-expression network analysis, and functional validation through B. napus transformation verified the functional roles of the identified DEGs. The analysis revealed highly dynamic and tissue-specific expression patterns of DEGs under salt stress. The 346 crucial DEGs include genes involved in leaf and root development, stress-responsive transcription factors, and genes associated with the salt overly sensitive (SOS) pathway. Specifically, overexpression of RD26 (BnaC07g40860D) in B. napus significantly enhanced salt tolerance, confirming its role in the salt stress response. This study provides a comprehensive understanding of the B. napus salt stress response at the transcriptomic level and identifies key candidate genes, such as RD26, for developing salt-tolerant varieties. The methodologies established can be applied to other omics studies of plant stress responses.
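As an illustrative aside (not the authors' PLRKSC implementation), the penalized-logistic-regression component of such a pipeline can be sketched in plain Python; the toy expression data and hyperparameters below are assumptions for demonstration only:

```python
import math

def fit_penalized_logistic(X, y, lam=0.1, lr=0.5, steps=2000):
    """L2-penalized logistic regression fitted by plain gradient descent.
    X: list of feature vectors, y: 0/1 labels; the intercept is unpenalized."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        gw, gb = [lam * wj for wj in w], 0.0  # ridge penalty gradient on weights
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for k in range(d):
                gw[k] += (p - yi) * xi[k] / n
            gb += (p - yi) / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

# Toy example: expression of one "gene" in control (0) vs salt-stressed (1) samples
X = [[0.2], [0.4], [0.5], [1.5], [1.8], [2.0]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_penalized_logistic(X, y)
```

The penalty shrinks coefficients toward zero, which is what lets such models rank many candidate genes while controlling overfitting.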

PMID:39903946 | DOI:10.1016/j.plaphy.2025.109568


Oleylphosphocholine versus Miltefosine for Canine Leishmaniasis

Am J Trop Med Hyg. 2025 Feb 4:tpmd240622. doi: 10.4269/ajtmh.24-0622. Online ahead of print.

ABSTRACT

Oleylphosphocholine (OlPC) is a miltefosine derivative that is more effective than miltefosine against Leishmania infections in rodent models. Because canines are a natural host for Leishmania, improved treatment of canine leishmaniasis is essential both for veterinary medicine and as a large animal model for clinical development. Oleylphosphocholine, at a dosage of 4 mg/kg/day for 28 days, was compared with the approved canine regimen of miltefosine at a dosage of 2 mg/kg/day for 28 days in 33 naturally infected Brazilian dogs (17 randomly assigned to receive OlPC versus 16 designated to receive miltefosine). The animals were followed for 5 months posttreatment. The primary endpoint was the clinical score, calculated as the sum of scores for each of 23 clinical parameters graded 0 (normal), 1 (somewhat abnormal), or 2 (markedly abnormal) by a blinded observer. A higher clinical score signified more severe disease. The mean (SD) clinical scores for the OlPC versus miltefosine groups were as follows: pretherapy, 10.1 (5.6) versus 7.7 (4.5; P = 0.19); 3 months posttherapy, 4.3 (4.1) versus 9.5 (4.9; P < 0.01); 5 months posttherapy, 3.9 (3.8) versus 8.9 (4.7; P < 0.01). Scores for lymph nodes, ear crusts, and splenic parasites were significantly lower for the OlPC group than for the miltefosine group, suggesting that both visceral and cutaneous parameters contributed to OlPC's statistically greater efficacy. One OlPC animal, with minimal splenic parasites pretreatment and zero parasites at the end of treatment, died of kidney failure due to immune-complex deposition, which was presumably already present pretreatment. The increase in blood creatinine values observed in OlPC animals warrants further study in future experiments. The superior clinical effect of OlPC in comparison to miltefosine in this canine study primes OlPC for development as an oral treatment for canine and human leishmaniasis.

PMID:39903935 | DOI:10.4269/ajtmh.24-0622


Is Parasitic Contamination of Soil in the Southern United States Related to Poverty and Does It Represent a Human Health Threat? A Perspective

Am J Trop Med Hyg. 2025 Feb 4:tpmd240596. doi: 10.4269/ajtmh.24-0596. Online ahead of print.

ABSTRACT

In recent years, multiple reports have emerged describing real-time quantitative polymerase chain reaction (qPCR) detection of DNA derived from human parasite species in environmental soil samples. In one such report, sampling was focused in impoverished areas of the southeastern United States, and a link between poverty and the presence of parasite DNA in soil was proposed. Whether transmission of certain parasitic diseases persists in the United States in association with poverty remains an important question. However, we emphasize caution when reviewing interpretations drawn solely from qPCR detection of parasite-derived environmental DNA without further verification. We discuss here the limitations of using qPCR to test environmental DNA samples, the need for sampling strategies that are unbiased and repeatable, and the importance of selecting appropriate control areas and statistical tests to draw meaningful conclusions.

PMID:39903932 | DOI:10.4269/ajtmh.24-0596


Sociodemographic and Psychosocial Factors Influencing Coronavirus Disease 2019 Testing Uptake: Insights from Urban and Rural Communities in South Africa

Am J Trop Med Hyg. 2025 Feb 4:tpmd230810. doi: 10.4269/ajtmh.23-0810. Online ahead of print.

ABSTRACT

Access, demand, and acceptance of coronavirus disease 2019 (COVID-19) testing have varied globally. This study explored the sociodemographic and psychosocial factors that contribute to the uptake of COVID-19 testing in community settings in South Africa. This paper presents a cross-sectional secondary analysis using data from a cluster randomized controlled trial and a nested perception survey of COVID-19 antigen testing in communities located in urban (eThekwini, KwaZulu-Natal) and rural (Worcester, Eastern Cape) settings in South Africa. Individuals who were reluctant to get tested participated in the perception survey. Data were analyzed using descriptive statistics and multivariable logistic regression to assess associations and estimate adjusted odds ratios (ORs). The analysis included 3,074 individuals, of whom 2,509 (81.6%) consented to COVID-19 testing. Among those, 2,505 (81.5%) tested negative and 4 (0.1%) tested positive for COVID-19. The mean age of participants was 38 years (SD = 14.61), and 57% were male. Females (OR: 1.27; 95% CI = 1.0-1.6), individuals older than 56 years (OR: 1.95; 95% CI = 1.24-3.07), and those who were vaccinated (OR: 1.99; 95% CI = 1.53-2.60) were more likely to consent. Individuals who had previously tested positive for severe acute respiratory syndrome coronavirus 2 were less likely to consent to testing (OR: 0.64; 95% CI = 0.11-0.46). No association was found between depression, anxiety, or substance use and willingness to undergo COVID-19 testing. A perception survey involving 704 participants, which explored factors influencing testing willingness, found that older adults and urban populations were less likely to undergo COVID-19 testing. Targeted health campaigns may improve testing rates. Larger-scale implementation research is required to explore best practices for improving testing rates and confidence in population-level detection within South Africa.
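As a brief editorial sketch of how adjusted ORs like those above come out of a fitted logistic model: each coefficient and its standard error convert to an odds ratio with a Wald confidence interval. The beta and SE below are hypothetical values chosen only to roughly reproduce the "vaccinated" OR; they are not the study's fitted estimates:

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient and SE for the 'vaccinated' covariate
or_est, ci_lo, ci_hi = odds_ratio(0.688, 0.137)
print(f"OR = {or_est:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```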

PMID:39903929 | DOI:10.4269/ajtmh.23-0810


Improving Implementation of NCD Care in Low- and Middle-Income Countries: The Case of Fixed Dose Combinations for Hypertension in Kenya

Health Syst Reform. 2025 Dec 31;11(1):2448862. doi: 10.1080/23288604.2024.2448862. Epub 2025 Feb 4.

ABSTRACT

Health systems in low- and middle-income countries face the challenge of addressing the growing burden of non-communicable diseases (NCDs) with scarce resources to do so. There are cost-effective interventions that can improve management of the most common NCDs, but many remain poorly implemented. One example is fixed dose combinations (FDCs) of medications for hypertension. Included in WHO's Essential Medicines List, FDCs combine two or more blood pressure lowering agents into one pill and can reduce the burden on patients and the health system. However, implementation of FDCs globally is poor. We aimed to identify health systems factors affecting implementation of evidence-based interventions for NCDs, and opportunities to address these, using the case study of FDCs in Kenya. We conducted semi-structured interviews with 39 policy-makers and healthcare workers involved in hypertension treatment policy, identified through snowball sampling. Interview data were analyzed thematically, using the Access Framework to categorize themes. Our interviews identified factors operating at the global, national, county, and provider levels. These include a lack of global implementation guidance, context-specific cost-effectiveness data, and prioritization by procurement agencies and clinical guidelines; perceived high cost; poor data for demand forecasting; insufficient budget for procurement of NCD medications; an absence of prescriber training and awareness of clinical guidelines; and habitual prescribing behavior and understaffing that limit capacity for change. We propose specific strategies to address these. The findings of this work can inform efforts to improve implementation of other evidence-based interventions for NCDs in low-income settings.

PMID:39903916 | DOI:10.1080/23288604.2024.2448862


Impact of Ambient Air Pollution on Reduced Visual Acuity Among Children and Adolescents

Ophthalmic Epidemiol. 2025 Feb 4:1-8. doi: 10.1080/09286586.2025.2457623. Online ahead of print.

ABSTRACT

PURPOSE: Previous studies have assessed the impact of air pollution on myopia at the individual level, but none have explored, at the area level, the role of air pollution in visual health disparities between regions. This ecological study aimed to investigate the impact of ambient air pollution on reduced visual acuity (VA).

METHODS: The data were derived from the Chinese National Survey on Students' Constitution and Health (CNSSCH) conducted in 2014 and 2019, which involved 261,833 and 267,106 students, respectively. Participants were 7-22 years old and randomly selected from 30 mainland provinces in China. Locally weighted scatterplot smoothing (LOESS) regression models and fixed-effects panel regression models were used to explore the associations of the provincial-level prevalence of reduced VA with the air quality index (AQI) and with fine particulate matter (PM2.5), PM10, sulfur dioxide (SO2), carbon monoxide (CO), nitrogen dioxide (NO2), and ozone (O3) concentrations.

RESULTS: There were nearly linear positive dose-response relationships between AQI, air pollutant concentrations and the prevalence of reduced VA. After adjusting for covariates, an interquartile range increase in PM2.5 exposure was significantly associated with a 5.0% (95% confidence interval, 0.7%-9.3%) increase in the prevalence of reduced VA, whereas no significant associations were observed between AQI, the other five pollutants and the prevalence of reduced VA.
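The per-IQR effect reported above comes from rescaling a per-unit regression coefficient by the exposure's interquartile range. A minimal sketch follows; the provincial PM2.5 values and the coefficient are invented for illustration, not the study's data:

```python
def quantile(sorted_vals, q):
    """Linear-interpolation quantile on a sorted list."""
    idx = q * (len(sorted_vals) - 1)
    lo = int(idx)
    frac = idx - lo
    if lo + 1 < len(sorted_vals):
        return sorted_vals[lo] * (1 - frac) + sorted_vals[lo + 1] * frac
    return sorted_vals[lo]

def effect_per_iqr(beta, exposures):
    """Scale a per-unit linear coefficient to a per-IQR effect."""
    vals = sorted(exposures)
    iqr = quantile(vals, 0.75) - quantile(vals, 0.25)
    return beta * iqr

# Hypothetical provincial mean PM2.5 values (ug/m3) and a hypothetical
# coefficient of 0.25 percentage points of prevalence per 1 ug/m3
pm = [28, 35, 41, 47, 52, 60, 66, 73]
effect = effect_per_iqr(0.25, pm)
```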

CONCLUSION: Regions with more polluted air tend to have a higher prevalence of reduced VA. Exposure to PM2.5 might be an important risk factor for myopia among children and adolescents.

PMID:39903915 | DOI:10.1080/09286586.2025.2457623


Electronic Health Record Use Patterns Among Well-Being Survey Responders and Nonresponders: Longitudinal Observational Study

JMIR Med Inform. 2025 Feb 4;13:e64722. doi: 10.2196/64722.

ABSTRACT

BACKGROUND: Physician surveys provide indispensable insights into physician experience, but the question of whether responders are representative can limit confidence in conclusions. Ubiquitously collected electronic health record (EHR) use data may improve understanding of the experiences of survey nonresponders in relation to responders, providing clues regarding their well-being.

OBJECTIVE: The aim of the study was to identify EHR use measures corresponding with physician survey responses and examine methods to estimate population-level survey results among physicians.

METHODS: This longitudinal observational study was conducted from 2019 through 2020 among academic and community primary care physicians. We quantified EHR use using vendor-derived and investigator-derived measures, quantified burnout symptoms using emotional exhaustion and interpersonal disengagement subscales of the Stanford Professional Fulfillment Index, and used an ensemble of response propensity-weighted penalized linear regressions to develop a burnout symptom prediction model.

RESULTS: Among 697 surveys from 477 physicians with a response rate of 80.5% (697/866), always responders were similar to nonresponders in gender (204/340, 60% vs 38/66, 58% women; P=.78) and age (median 50, IQR 40-60 years vs median 50, IQR 37.5-57.5 years; P=.88) but with higher clinical workload (median 121.5, IQR 58.5-184 vs median 34.5, IQR 0-115 appointments; P<.001), efficiency (median 5.2, IQR 4.0-6.2 vs median 4.3, IQR 0-5.6; P<.001), and proficiency (median 7.0, IQR 5.4-8.5 vs median 3.1, IQR 0-6.3; P<.001). Survey response status prediction showed an out-of-sample area under the receiver operating characteristics curve of 0.88 (95% CI 0.77-0.91). Burnout symptom prediction showed an out-of-sample area under the receiver operating characteristics curve of 0.63 (95% CI 0.57-0.70). The predicted burnout prevalence among nonresponders was 52%, higher than the observed prevalence of 28% among responders, resulting in an estimated population burnout prevalence of 31%.
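The population-level estimate above combines observed responder prevalence with model-predicted nonresponder prevalence. A crude size-weighted version of that combination, using the abstract's survey counts, looks like this (the study's individual-level model need not match this simple average exactly):

```python
def population_prevalence(p_resp, n_resp, p_nonresp, n_nonresp):
    """Weight observed responder prevalence and model-predicted nonresponder
    prevalence by group size to estimate population prevalence."""
    return (p_resp * n_resp + p_nonresp * n_nonresp) / (n_resp + n_nonresp)

# Abstract's figures: 28% among 697 responder surveys, 52% predicted
# among the 169 nonresponder surveys (866 - 697)
est = population_prevalence(0.28, 697, 0.52, 169)
```

This crude weighting yields roughly 33%, in the same range as the study's model-based 31%.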

CONCLUSIONS: EHR use measures showed limited utility for predicting burnout symptoms but discriminated well between responders and nonresponders. These measures may enable qualitative interpretation of the effect of nonresponse and may inform efforts to maximize survey response rates.

PMID:39903913 | DOI:10.2196/64722


Anatomical and Functional Outcomes of Heavy Silicone Oil (Oxane® HD and Densiron® 68) in Complex Primary Rhegmatogenous Retinal Detachment

Retina. 2025 Jan 29. doi: 10.1097/IAE.0000000000004419. Online ahead of print.

ABSTRACT

PURPOSE: To evaluate the efficacy of heavy silicone oils (HSO) as endotamponades in the repair of primary complex rhegmatogenous retinal detachment (RRD).

METHODS: This retrospective, single-centre, non-randomised study included 82 eyes of 82 patients with primary macular-off RRD associated with inferior proliferative vitreoretinopathy. Each eye was treated with one of two HSO tamponades: Oxane® HD or Densiron® 68. Study outcomes were primary and final success rates, final logMAR gain and postoperative complications. The final outcome was based on 12-month follow-up.

RESULTS: Of the 82 eyes, 45 were treated with Oxane HD and 37 with Densiron 68. There were no significant differences in demographic and clinical characteristics between the groups. The primary and final surgical success rates were 66.6% and 75.7% for Oxane HD and 75.6% and 81% for Densiron 68, respectively; these differences were not statistically significant. In addition, the final logMAR gain was 0.36 ± 0.51 (median 0.2) in the Oxane HD group and 0.57 ± 0.58 (median 0.5) in the Densiron 68 group (p = 0.027). Complication rates were similar between groups (p > 0.05).
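As an editorial aside, the non-significant difference in primary success rates can be illustrated with a two-proportion z-test; the counts below are approximate reconstructions from the reported percentages (30/45 Oxane HD vs 28/37 Densiron 68), not verified study data:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return z, p_value

# Approximate primary success counts for Oxane HD vs Densiron 68
z, p = two_proportion_z(30, 45, 28, 37)
```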

CONCLUSIONS: Our study suggests that HSOs may be an effective alternative for suitable patients in primary complex RRD cases, demonstrating high anatomical success and a low adverse event profile.

PMID:39903911 | DOI:10.1097/IAE.0000000000004419