Modelled Public Health Impact of Introducing Adjuvanted Recombinant Zoster Vaccine into the UK National Immunisation Programme

Infect Dis Ther. 2024 Nov 26. doi: 10.1007/s40121-024-01073-3. Online ahead of print.

ABSTRACT

INTRODUCTION: In 2023, the recombinant zoster vaccine (RZV) replaced the zoster vaccine live (ZVL) in the UK National Immunisation Programme (NIP) for prevention of herpes zoster (HZ). The vaccination age was reduced from 70 to 65 years, with a further planned reduction to 60 years. This modelling study aimed to evaluate the public health impact (PHI) of RZV vaccination in the 70 years of age (YOA) population and in younger individuals 65 and 60 YOA.

METHODS: PHI was evaluated from a National Health Service perspective using a multicohort Markov model, expressed as cases of HZ, post-herpetic neuralgia (PHN), non-PHN complications and deaths, hospitalisations, and general practitioner (GP) visits avoided. Three scenarios (RZV vs. no vaccination, ZVL vs. no vaccination, and RZV vs. ZVL) were explored for each age group using population estimates from the UK Office for National Statistics, i.e. 70 YOA (n = 649,822), 65 YOA (n = 760,578) and 60 YOA (n = 849,501).
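
As a rough, self-contained sketch of how a cohort Markov model of this kind accumulates avoided events, the Python fragment below runs a single annual-cycle cohort under hypothetical transition probabilities and vaccine efficacy. All incidence, efficacy, and waning values are placeholders for illustration, not the published model's inputs; only the cohort size is taken from the abstract.

```python
# Minimal single-cohort Markov sketch (annual cycles). All parameters are
# illustrative placeholders, not the values used in the published model.
cohort_size = 649_822          # 70-year-olds (ONS estimate cited in the abstract)
years = 20                     # time horizon of the sketch
hz_incidence = 0.009           # hypothetical annual HZ risk without vaccination
phn_given_hz = 0.18            # hypothetical share of HZ cases developing PHN
vaccine_efficacy = 0.90        # hypothetical initial efficacy against HZ
waning_per_year = 0.02         # hypothetical absolute waning of efficacy per year

def expected_cases(vaccinated: bool) -> tuple[float, float]:
    """Accumulate expected HZ and PHN cases over the time horizon."""
    at_risk = float(cohort_size)
    hz_total = phn_total = 0.0
    for year in range(years):
        efficacy = max(vaccine_efficacy - waning_per_year * year, 0.0) if vaccinated else 0.0
        hz_cases = at_risk * hz_incidence * (1.0 - efficacy)
        hz_total += hz_cases
        phn_total += hz_cases * phn_given_hz
        at_risk -= hz_cases        # first-episode model: cases leave the at-risk state
    return hz_total, phn_total

hz_novax, phn_novax = expected_cases(vaccinated=False)
hz_rzv, phn_rzv = expected_cases(vaccinated=True)
print(f"HZ cases avoided:  {hz_novax - hz_rzv:,.0f}")
print(f"PHN cases avoided: {phn_novax - phn_rzv:,.0f}")
```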

RESULTS: For individuals 70 YOA, the model estimated that RZV vaccination would avoid 32,894 cases of HZ and 5915 cases of PHN compared with no vaccination, and 26,954 HZ and 3218 PHN cases compared with ZVL. Compared with no vaccination, 2264 fewer hospitalisations and 158,549 fewer GP visits were predicted with RZV vaccination. Hospitalisations were predicted to be reduced by 1996 and GP visits by 130,821 for RZV versus ZVL vaccination. In individuals 65 YOA, it was estimated that RZV vaccination would avoid 50,128 HZ cases, 8623 PHN cases, 222,646 GP visits, and 2671 hospitalisations versus no vaccination. In the 60 YOA group, RZV vaccination was predicted to avoid 57,182 HZ cases, 9327 PHN cases, 234,330 GP visits, and 2547 hospitalisations versus no vaccination.

CONCLUSION: The recent introduction of RZV into the NIP could substantially reduce HZ disease burden and healthcare resource use in the UK. A graphical abstract is available with this article.

PMID:39589700 | DOI:10.1007/s40121-024-01073-3

On the Detection of Population Heterogeneity in Causation Between Two Variables: Finite Mixture Modeling of Data Collected from Twin Pairs

Behav Genet. 2024 Nov 26. doi: 10.1007/s10519-024-10207-9. Online ahead of print.

ABSTRACT

Causal inference is inherently complex and relies on key assumptions that can be difficult to validate. One strong assumption is population homogeneity, which assumes that the causal direction remains consistent across individuals. However, there may be variation in causal directions across subpopulations, leading to potential heterogeneity. In psychiatry, for example, the co-occurrence of disorders such as depression and substance use disorder can arise from multiple sources, including shared genetic or environmental factors (common causes) or direct causal pathways between the disorders. A patient diagnosed with two disorders might have one recognized as primary and the other as secondary, suggesting the existence of different types of comorbidity. For example, in some individuals, depression might lead to substance use, while in others, substance use could lead to depression. We account for potential heterogeneity in causal direction by integrating the Direction of Causation (DoC) model for twin data with finite mixture modeling, which allows for the calculation of individual-level likelihoods for alternate causal directions. Through simulations, we demonstrate the effectiveness of using the Direction of Causation Twin Mixture (mixDoC) model to detect and model heterogeneity due to varying causal directions.
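
A minimal sketch of the general idea behind individual-level mixture likelihoods (not the authors' mixDoC implementation): each twin pair's data receive a likelihood under each candidate causal direction, and an EM-style responsibility weights the two directions per pair. The per-pair component likelihoods below are random placeholders standing in for the DoC model likelihoods.

```python
import numpy as np

# Toy mixture over two causal directions for twin-pair data.
# l1[i] and l2[i] are the likelihoods of pair i under "A causes B" and
# "B causes A" models; here they are stand-ins for DoC model likelihoods.
rng = np.random.default_rng(0)
n_pairs = 500
l1 = rng.uniform(0.1, 1.0, n_pairs)   # placeholder per-pair likelihoods, direction 1
l2 = rng.uniform(0.1, 1.0, n_pairs)   # placeholder per-pair likelihoods, direction 2

pi = 0.5                               # initial mixing proportion for direction 1
for _ in range(100):                   # EM iterations on the mixing proportion only
    resp = pi * l1 / (pi * l1 + (1 - pi) * l2)   # E-step: per-pair responsibility
    pi = resp.mean()                              # M-step: update mixing proportion

log_lik = np.log(pi * l1 + (1 - pi) * l2).sum()
print(f"Estimated share of pairs with direction 1: {pi:.3f}")
print(f"Mixture log-likelihood: {log_lik:.2f}")
```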

PMID:39589697 | DOI:10.1007/s10519-024-10207-9

Selenium, a Notable Micronutrient: A Crucial Element in the Context of All-Cause Long-Term Mortality in Renal Failure

Biol Trace Elem Res. 2024 Nov 26. doi: 10.1007/s12011-024-04460-6. Online ahead of print.

ABSTRACT

Selenium is a trace element involved in crucial antioxidative and anti-inflammatory processes in the body. Low selenium status has been linked to increased mortality due to compromised immune function and a heightened risk of cardiovascular events. Patients with chronic kidney disease (CKD) face elevated mortality risks, prompting the need for strategies to mitigate these events. Selenium deficiency is prevalent among CKD patients, yet its long-term implications and association with mortality in this population remain unclear. This study assessed the serum selenium levels (SSL) of seventy-five CKD patients between January and February 2020. The objective was to investigate the correlation between SSL and 36-month all-cause mortality in CKD patients. Baseline laboratory values, dialysis adequacy, Charlson comorbidity index (CCI), serum selenium status, and all-cause mortality at 36 months were subjected to statistical analysis. The significance level was set at p < 0.05. Significant differences in CCI were observed between the surviving and deceased groups, with deceased patients being older and having more comorbidities. SSL also differed significantly between the groups, with levels in the mortality group significantly lower than those in other patients, suggesting a potential role of selenium in predicting patient outcomes. An SSL equal to or lower than 66.35 was associated with an approximately five-fold higher likelihood of mortality within three years of follow-up. Our study highlights the significant association between low serum selenium levels and survival in patients with chronic kidney disease, underscoring the potential importance of selenium monitoring in this population. These findings emphasize the need for further research to elucidate the underlying mechanisms and to explore potential interventions aimed at improving outcomes in CKD patients.
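
As an illustration of how a dichotomized selenium threshold translates into an odds ratio for 36-month mortality, the snippet below builds a 2x2 table with invented counts (the abstract reports only the approximately five-fold figure, not the cell counts) and computes the odds ratio with Fisher's exact test.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = SSL <= 66.35 vs > 66.35, columns = died vs survived.
# Cell counts are invented for illustration; only the threshold comes from the abstract.
table = [[12, 23],   # low selenium:    died, survived
         [ 4, 36]]   # higher selenium: died, survived

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio for death with low SSL: {odds_ratio:.2f} (p = {p_value:.3f})")
```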

PMID:39589683 | DOI:10.1007/s12011-024-04460-6

Genome Galaxy Identified by the Circular Code Theory

Bull Math Biol. 2024 Nov 26;87(1):5. doi: 10.1007/s11538-024-01366-1.

ABSTRACT

The genome galaxy identified in bacteria is studied by expressing the reading frame retrieval (RFR) function according to the YZ-content (GC-, AG- and GT-content) of bacterial codons. We have developed a simple probabilistic model for ambiguous sequences in order to show that the RFR function is a measure of gene reading frame retrieval. Indeed, the RFR function increases with the ratio of ambiguous sequences, and the ratio of ambiguous sequences decreases when the codon usage dispersion increases. The classical GC-content is the best parameter for characterizing the upper arm, which is related to bacterial genes with a low GC-content, and the lower arm, which is related to bacterial genes with a high GC-content. The galaxy center has a GC-content of around 0.5. These results are then confirmed by expressing the GC-content of bacterial codons as a function of the codon usage dispersion. Finally, the bacterial genome galaxy is better described by the GC3-content in the 3rd codon site than by the GC1-content and GC2-content in the 1st and 2nd codon sites, respectively. While codon usage is used extensively by biologists, its dispersion, an important parameter for revealing this genome galaxy, is surprisingly little known and little used. Therefore, we have developed a mathematical theory of codon usage dispersion by deriving several formulæ. It identifies three important parameters in codon usage: the minimum and maximum codon probabilities and the number of codons with high frequency, i.e. with a probability of at least 1/64. By applying this theory to the evolution of the genetic code, we see that bacteria have optimised the number of high-frequency codons to maximise codon dispersion, thus maximising the capacity to retrieve the reading frame in genes. The derived formulæ of dispersion can be easily extended to any weighted code over a finite alphabet.
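
The abstract's basic quantities — codon usage frequencies, the count of "high frequency" codons (probability at least 1/64), and GC3-content — can be computed directly from a coding sequence, as in this rough sketch. The sequence is a placeholder, and the paper's exact dispersion formulae are not reproduced here.

```python
from collections import Counter

# Placeholder coding sequence; in practice this would be a set of bacterial genes.
seq = "ATGGCTGCCGGCTTAGCGGGCGTACTGATGCCAGCTTAA"
codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]

counts = Counter(codons)
total = sum(counts.values())
probs = {c: n / total for c, n in counts.items()}

# Codons with "high frequency" in the sense of the abstract: probability >= 1/64.
high_freq = [c for c, p in probs.items() if p >= 1 / 64]

# GC3-content: fraction of codons whose third site is G or C.
gc3 = sum(1 for c in codons if c[2] in "GC") / len(codons)

print(f"{len(high_freq)} high-frequency codons out of {len(counts)} used")
print(f"GC3-content: {gc3:.2f}")
```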

PMID:39589676 | DOI:10.1007/s11538-024-01366-1

Machine learning for prognostic prediction in coronary artery disease with SPECT data: a systematic review and meta-analysis

EJNMMI Res. 2024 Nov 26;14(1):117. doi: 10.1186/s13550-024-01179-2.

ABSTRACT

BACKGROUND: Single-photon emission computed tomography (SPECT) analysis relies on qualitative visual assessment or semi-quantitative measures, such as total perfusion deficit, which play a critical role in the non-invasive diagnosis of coronary artery disease by assessing regional blood flow abnormalities. Recently, machine learning (ML)-based analysis of SPECT images for coronary artery disease diagnosis has shown promise, with its utility in predicting long-term patient outcomes (prognosis) remaining an active area of investigation. In this review, we comprehensively examine the current landscape of ML-based analysis of SPECT imaging, with an emphasis on prognostication of coronary artery disease.

MAIN BODY: Our systematic search yielded twelve retrospective studies investigating SPECT-based ML models for prognostic prediction in coronary artery disease patients, with a total sample size of 73,023 individuals. Several of these studies demonstrate the superior prognostic capabilities of ML models over traditional logistic regression (LR) models and total perfusion deficit, especially when incorporating demographic data alongside SPECT imaging. Meta-analysis of six studies revealed promising performance of the included ML models, with sensitivity and specificity exceeding 65% for major adverse cardiovascular events and all-cause mortality. Notably, the integration of demographic information with SPECT imaging in ML frameworks shows statistically significant improvements in prognostic performance.
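
A hedged sketch of the kind of pooling such a meta-analysis performs: fixed-effect inverse-variance pooling of logit-transformed sensitivities from per-study counts. The study values below are invented for illustration, not those of the twelve included studies.

```python
import numpy as np

# Hypothetical per-study (true positives, total positives) pairs for sensitivity.
studies = [(180, 250), (95, 140), (410, 600), (60, 90), (220, 310), (130, 200)]

logits, weights = [], []
for tp, n in studies:
    p = (tp + 0.5) / (n + 1.0)                       # continuity-corrected proportion
    var = 1.0 / (tp + 0.5) + 1.0 / (n - tp + 0.5)    # variance of the logit
    logits.append(np.log(p / (1 - p)))
    weights.append(1.0 / var)

logits, weights = np.array(logits), np.array(weights)
pooled_logit = np.sum(weights * logits) / np.sum(weights)   # fixed-effect pooling
pooled_sens = 1.0 / (1.0 + np.exp(-pooled_logit))
print(f"Pooled sensitivity: {pooled_sens:.3f}")
```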

CONCLUSION: Our review suggests that ML models either independently or in combination with demographic data enhance prognostic prediction in coronary artery disease.

PMID:39589669 | DOI:10.1186/s13550-024-01179-2

Spousal sleep behaviors and obstructive sleep apnea risk: effects on couples’ self-rated health

Sleep Breath. 2024 Nov 26;29(1):7. doi: 10.1007/s11325-024-03171-5.

ABSTRACT

PURPOSE: This study aims to examine the relationship between obstructive sleep apnea (OSA) risk, as assessed by the STOP-Bang questionnaire, and couples’ self-rated health. It also investigates how sleep behaviors (snoring, daytime tiredness, and observed apnea) reported in the STOP-Bang items affect couples’ self-rated health.

METHODS: Data from the Korea National Health and Nutrition Examination Survey (2019-2021) were analyzed, including 2,498 couples with complete STOP-Bang and self-rated health data. Logistic regression was used to explore these associations.

RESULTS: Overall, 59.2% of husbands and 11.0% of wives were at high risk for OSA. After adjusting for sociodemographic factors, comorbidities, and health behaviors, OSA risk and daytime tiredness were associated with poor self-rated health in both spouses (OR 1.52-3.38 in husbands, 2.23-2.63 in wives). After adjusting for these confounding factors and individual OSA risk, husbands whose wives reported snoring or daytime tiredness had higher odds of self-rated poor health (OR 2.69 [95% CI: 1.63-4.43] and 1.75 [95% CI: 1.25-2.45], respectively) compared to husbands whose wives did not report these behaviors. However, wives’ self-rated health was not significantly influenced by their husbands’ sleep behaviors. Additionally, the adjusted odds of self-rated poor health were 1.51 (95% CI: 1.06-2.16) in husbands if either partner had a high OSA risk, and 1.83 (95% CI: 1.15-2.90) in wives if both partners had a high OSA risk.
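
To illustrate how adjusted odds ratios of this kind are obtained, the sketch below fits a logistic regression of poor self-rated health on a partner's sleep behavior plus one confounder using statsmodels. The simulated data, variable names, and coefficients are placeholders, not the KNHANES survey data or the authors' model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: outcome = poor self-rated health (0/1),
# exposure = spouse snoring (0/1), confounder = age. Not KNHANES data.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "spouse_snoring": rng.integers(0, 2, n),
    "age": rng.integers(40, 80, n),
})
logit_p = -3.0 + 0.8 * df["spouse_snoring"] + 0.03 * df["age"]
df["poor_health"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("poor_health ~ spouse_snoring + age", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)          # adjusted odds ratios
conf_int = np.exp(model.conf_int())         # 95% CIs on the OR scale
print(odds_ratios)
print(conf_int)
```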

CONCLUSION: Husbands’ self-rated poor health is associated with wives’ snoring and daytime tiredness. The presence of OSA in one or both partners was also associated with poorer perceived health in the couple.

PMID:39589644 | DOI:10.1007/s11325-024-03171-5

Application of OSA-VET® and qualiquantitative tear tests in brachycephalic dogs with and without keratoconjunctivitis sicca

Vet Res Commun. 2024 Nov 26;49(1):40. doi: 10.1007/s11259-024-10610-x.

ABSTRACT

The aim was to compare outcomes acquired from the OSA-Vet® device with those of conventional quantitative and qualitative tear tests, and between groups within each test, in brachycephalic dogs that were either healthy or diagnosed with keratoconjunctivitis sicca (KCS). The dogs were divided into four groups: healthy dogs (HD) and dogs with mild KCS (MIKCS), moderate KCS (MOKCS), or severe KCS (SKCS). All patients underwent ocular surface diagnostic examination in the following order, with a 10-minute interval between tests: non-invasive tear film breakup time (TBUTNI – OSA-Vet®), tear meniscus height (TMH – OSA-Vet®), meniscometry (I-Tear® test), Schirmer Tear Test-1 (STT-1), and tear film breakup time (TBUT). Kruskal-Wallis H tests were performed to establish differences between the groups, and Spearman's correlation coefficient was used to assess the correlation between tests; an analysis of variance (ANOVA) followed by the Tukey-Kramer post-hoc test was performed for TMH. Results with p < .05 were considered statistically significant. The correlation of conventional tests with those obtained by the OSA-Vet® proved to be low, except between TBUTNI (OSA-Vet®) and TBUT in MOKCS, which showed a strong correlation (r = .925). The correlation between TBUTNI (OSA-Vet®) and TBUT in MIKCS was moderate (r = .547), as was that between STT-1 and the I-Tear® test in MOKCS (r = .416). In the comparison between groups, the main finding was a significant difference between all the KCS groups and HD in the TBUT and TBUTNI (OSA-Vet®) tests. The OSA-Vet® and conventional tests are useful for evaluating the ocular surface of brachycephalic dogs; however, the OSA-Vet® does not correlate well with conventional standardized tests.
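
A brief sketch of the statistical comparisons described above, using SciPy's Kruskal-Wallis and Spearman routines. The tear-test measurements below are invented placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import kruskal, spearmanr

rng = np.random.default_rng(2)
# Placeholder TBUT values (seconds) for the four groups: HD, MIKCS, MOKCS, SKCS.
hd    = rng.normal(18, 3, 20)
mikcs = rng.normal(12, 3, 20)
mokcs = rng.normal( 8, 2, 20)
skcs  = rng.normal( 4, 2, 20)

h_stat, p_kw = kruskal(hd, mikcs, mokcs, skcs)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")

# Correlation between two tests on the same eyes, e.g. TBUTNI (OSA-Vet) vs TBUT.
tbutni = mokcs + rng.normal(0, 1, 20)        # placeholder paired measurements
rho, p_rho = spearmanr(tbutni, mokcs)
print(f"Spearman rho = {rho:.3f}, p = {p_rho:.4f}")
```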

PMID:39589642 | DOI:10.1007/s11259-024-10610-x

Prevalence and morphometric evaluation of dilaceration in Indian Tamils: an analysis of 10,089 permanent teeth

Oral Radiol. 2024 Nov 26. doi: 10.1007/s11282-024-00789-9. Online ahead of print.

ABSTRACT

AIM: The present radiographic study was conducted to assess the subject and tooth prevalence of dilaceration in a cohort of the Tamil population, aided by morphometric analysis.

MATERIALS AND METHODS: After obtaining clearance from the institutional human ethics committee, 575 panoramic radiographs were retrieved. After exclusion of 233 radiographs based on the inclusion and exclusion criteria, a total of 342 panoramic radiographs were included. The angulation was estimated with the Angle Meter software and categorized into three classes: mild, moderate, and extreme. The data were recorded in a Microsoft Excel 2021 spreadsheet and analyzed descriptively using IBM SPSS software.

RESULTS: Of the 342 subjects, 172 were male and 170 were female (1.012M:1F). The overall mean age was 33.75 ± 13.86 years. In total, 100/342 subjects showed dilaceration in one or more teeth, yielding a subject prevalence of 29.24%. There was no statistically significant difference in age or gender between individuals with and without dilaceration. The tooth prevalence was 1.49% (150/10,089). The mandibular third molars were the most commonly affected teeth. Of the affected teeth, 97/150 were mildly dilacerated (64.67%), 34/150 showed moderate dilaceration (22.67%) and extreme dilaceration was noted in 19 teeth (12.67%).
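
The reported prevalences follow directly from the counts given in the abstract, as this small arithmetic check shows.

```python
# Counts taken from the abstract itself.
subjects_affected, subjects_total = 100, 342
teeth_affected, teeth_total = 150, 10_089

print(f"Subject prevalence: {subjects_affected / subjects_total:.2%}")   # ~29.24%
print(f"Tooth prevalence:   {teeth_affected / teeth_total:.2%}")         # ~1.49%

# Severity distribution among the 150 dilacerated teeth.
for label, n in [("mild", 97), ("moderate", 34), ("extreme", 19)]:
    print(f"{label}: {n / teeth_affected:.2%}")
```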

CONCLUSION: Within the limits of the present study, we report a morphometric analysis of dilacerated teeth from a South Indian population based on the examination of 10,089 permanent teeth from 342 panoramic radiographs. Mandibular third molars were the most commonly affected teeth, which led us to speculate that dilaceration is a true developmental anomaly, unrelated to trauma or other external stimuli.

PMID:39589640 | DOI:10.1007/s11282-024-00789-9

Pancreatic cancer and long survivors: a survey of Italian society of oncological surgery (SICO)

Updates Surg. 2024 Nov 26. doi: 10.1007/s13304-024-02039-3. Online ahead of print.

ABSTRACT

Long-term survivors after pancreatic resection for PDAC are rare and constitute a specific subset of patients that remains poorly understood. The aim of this survey was to describe the current landscape related to survival in the Italian context and to identify factors associated with long-term survival. An online survey, conducted by the Italian Society of Oncological Surgery (SICO) and endorsed by the Italian Association of the Study of the Pancreas (AISP) and the Italian Association of Hepatobiliary Pancreatic Surgery (AICEP), was distributed to surgeons in July 2023. The survey included 27 multiple-choice questions covering demographics, professional details, clinical practices, and long-term survival data. Responses were analyzed using descriptive statistics and multinomial logistic regression to identify factors related to long-term survival. The largest proportion of surgeons (46.9%) considered long-term survivors (LTS) to be patients “alive at 5 years, regardless of disease-free status”. The percentage of patients alive at 5 years was higher after 2013 than before 2013. Almost all centers (93.2%) held multidisciplinary discussions. Very high-volume centers (> 100 resections/year) showed better long-term survival rates than very low-volume centers (< 10 resections/year). No differences in survival were observed between centers with low, medium, high, and very high volumes. In addition, centers with a multidisciplinary approach showed better survival rates. Centers with higher rates of neoadjuvant chemotherapy and of low-grade, low-stage tumors were also associated with improved survival outcomes. This survey provides an overview of the Italian scenario regarding survival in patients undergoing surgery for PDAC.
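
A hedged sketch of the multinomial logistic regression step mentioned above, fit with statsmodels on invented survey-style data. The outcome bands, predictors, and coefficients are placeholders, not the SICO survey variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented survey-style data: outcome = reported long-term survival band
# (0 = low, 1 = medium, 2 = high); predictors = centre resection volume and
# use of multidisciplinary discussion. Not the SICO survey data.
rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "volume": rng.integers(5, 120, n),   # resections per year
    "mdt": rng.integers(0, 2, n),        # multidisciplinary discussion (0/1)
})
score = 0.02 * df["volume"] + 0.8 * df["mdt"] + rng.normal(0, 1, n)
df["survival_band"] = pd.cut(score, bins=3, labels=False)

model = smf.mnlogit("survival_band ~ volume + mdt", data=df).fit(disp=False)
print(model.summary())
```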

PMID:39589628 | DOI:10.1007/s13304-024-02039-3

Socioeconomic-related inequities in child immunization: horizontal and vertical dimensions for policy insights

Health Econ Rev. 2024 Nov 26;14(1):98. doi: 10.1186/s13561-024-00566-8.

ABSTRACT

BACKGROUND: Incomplete immunization has potentially exposed vulnerable children, especially those from socioeconomically disadvantaged groups, to vaccine-preventable diseases. Immunization schemes maximize social benefit only when immunization is effectively distributed on an equitable principle.

METHOD: The empirical study is based on unit-level data from the “Social Consumption: Health” survey of India’s National Sample Survey, NSS 75th Round (2017-18). The nationwide survey uses a stratified multi-stage sampling design intended to make the sample representative. The egalitarian equity principle requires that the distribution of vaccines be based on the health needs of children, irrespective of their socioeconomic and regional factors, and the principle broadly comprises two aspects: horizontal and vertical equity. Horizontal inequity (HI) is a direct form of injustice, arising when children with equal needs for routine immunization are treated differently because of their socioeconomic status, while vertical inequity (VI) is an indirect form of injustice, arising when children with differential health needs and risk exposures do not receive appropriately unequal but equitable immunization. Using the indirect standardization method and Erreygers’ corrected concentration index, we measure the degree of horizontal and vertical inequity and then linearly decompose the indices to identify the major factors contributing to each.
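
For reference, one common formulation of the concentration index and Erreygers' correction for a binary outcome (here, incomplete immunization) is sketched below with placeholder data: for an outcome bounded on [0, 1], Erreygers' index equals four times the outcome mean times the standard concentration index. This is a generic illustration, not the paper's estimation code.

```python
import numpy as np

def erreygers_index(outcome: np.ndarray, ses: np.ndarray) -> float:
    """Erreygers' corrected concentration index for a binary outcome.

    outcome: 0/1 indicator (e.g. incomplete immunization).
    ses: socioeconomic ranking variable (e.g. consumption expenditure).
    Uses E = 4 * mean(outcome) * C for an outcome bounded on [0, 1], where C
    is the usual concentration index based on fractional SES ranks.
    """
    n = len(outcome)
    order = np.argsort(ses)                      # rank children from poorest to richest
    y = outcome[order].astype(float)
    rank = (np.arange(1, n + 1) - 0.5) / n       # fractional rank
    mu = y.mean()
    c = 2.0 * np.cov(y, rank, bias=True)[0, 1] / mu   # concentration index
    return 4.0 * mu * c                               # Erreygers' correction (binary case)

rng = np.random.default_rng(4)
ses = rng.lognormal(mean=0, sigma=1, size=5000)            # placeholder SES measure
p_incomplete = 1 / (1 + np.exp(0.8 * np.log(ses)))         # poorer -> higher risk
incomplete = rng.binomial(1, p_incomplete)

# A negative value indicates concentration among poorer households (pro-poor inequity).
print(f"Erreygers' index: {erreygers_index(incomplete, ses):.3f}")
```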

CONCLUSION: Our findings show that incomplete immunization is significantly concentrated among children from poorer households. After controlling for the confounding effects of need factors, the inequity remains significantly pro-poor (i.e., horizontal inequity). The decomposition reveals that lower education, lower consumption, and rural habitation are the major factors driving this inequity. Further, a differential effect of needs between the full sample and the target groups (at least with respect to education) is observed; however, it is not statistically significant enough to constitute inequity (i.e., no vertical inequity). Overall, the inequity is induced via non-need factors. We further find that community health services (such as anganwadi) have contributed significantly to reducing inequity in child immunization. The paper highlights the policy recommendation that child immunization programs should target the factors driving HI and align service distribution with children’s risk exposure.

PMID:39589599 | DOI:10.1186/s13561-024-00566-8