Categories
Nevin Manimala Statistics

Human papillomavirus prevalence in first, second and third cervical cell samples from women HPV-vaccinated as girls, Denmark, 2017 to 2024: data from the Trial23 cohort study

Euro Surveill. 2025 Jul;30(27). doi: 10.2807/1560-7917.ES.2025.30.27.2400820.

ABSTRACT

BACKGROUND: Danish women vaccinated with the 4-valent human papillomavirus (HPV) vaccine (HPV types: 6/11/16/18) at age 14 in 2008 reached screening age in 2017, allowing assessment of long-term effects on prevalence, persistence and incidence of HPV infections.

AIM: To examine the HPV status of cervical samples over time among women vaccinated as girls.

METHODS: Between February 2017 and February 2024, residual material from cytology-analysed samples collected through the ‘Trial23’ study, part of the national screening programme, was tested for HPV16/18 and non-vaccine high-risk (HR) HPV types. Prevalence in first, second and third samples, and persistence and incidence between samples were calculated.

RESULTS: Over 7 years, 8,659 women provided at least one sample, 5,835 at least two and 2,461 at least three. In 7,800 vaccinated women, HPV16/18 prevalence was 0.4% (95% confidence interval (CI): 0.2-0.5), 0.3% (95% CI: 0.1-0.4) and 0.2% (95% CI: 0.0-0.4) in three consecutive samples. Prevalence of non-vaccine HR HPV was 32% (95% CI: 31-33), 28% (95% CI: 27-29) and 31% (95% CI: 29-33). Persistence of HPV16/18 and non-vaccine HPV among vaccinated women was 40% and 53%. In adjusted analyses comparing vaccinated vs unvaccinated women, incidence was significantly lower for HPV16/18 (adjusted relative risk (aRR) < 0.10) while incidence of non-vaccine HR HPV types was higher (aRR: 1.66; 95% CI: 1.12-2.45). No significant difference was observed for persistence.

CONCLUSION: Our study provides real-world evidence of stable protection against HPV16/18 infections in women vaccinated as girls. Less intensive screening seems reasonable until women vaccinated with the 9-valent vaccine reach screening age, when screening should be reconsidered.
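The prevalences above are reported with 95% confidence intervals. As an illustration of how such an interval for a proportion can be computed, the sketch below implements the Wilson score interval; the count of 30 positives out of 7,800 is a hypothetical reconstruction from the reported 0.4%, not a figure taken from the study, and the authors' exact CI method is not stated in the abstract.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Hypothetical: ~0.4% HPV16/18 prevalence among 7,800 vaccinated women
lo, hi = wilson_ci(30, 7800)
```

The Wilson interval behaves better than the simple Wald interval when the proportion is close to zero, as it is here.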

PMID:40642768 | DOI:10.2807/1560-7917.ES.2025.30.27.2400820


Prevalence of Charcot Foot Among Diabetes Mellitus Patients Under Follow-Up at the Integrated Diabetic Centers of Hospital Kulim and Hospital Raja Permaisuri Bainun: A Cross-Sectional Study

Cureus. 2025 Jun 10;17(6):e85708. doi: 10.7759/cureus.85708. eCollection 2025 Jun.

ABSTRACT

INTRODUCTION: Charcot foot is a debilitating complication of diabetes mellitus (DM), characterized by joint destruction, deformity, and instability due to neuropathy-induced microtrauma. Despite its severe impact on mobility and quality of life, Charcot foot remains underdiagnosed. This study aims to determine the prevalence of Charcot foot among diabetic patients at two major referral hospitals in Malaysia and identify associated risk factors.

METHODS: A cross-sectional study was conducted at the Integrated Diabetic Centers (IDCs) of Hospital Kulim and Hospital Raja Permaisuri Bainun (HRPB). Diabetic patients aged ≥18 years attending follow-ups at these centers were recruited via convenience sampling. Data collection included structured interviews, clinical examinations, laboratory investigations, and radiographic assessments. The diagnosis of Charcot foot was confirmed based on clinical findings, imaging, and laboratory parameters. Statistical analyses, including Fisher’s exact test, were conducted using IBM SPSS Statistics for Windows, Version 20.0 (IBM Corp., Armonk, New York, United States), with significance set at p<0.05.

RESULTS: A total of 675 diabetic patients were included, with a mean age of 56.1 years (SD±13.96). Men comprised 58.2% (n=393/675) of participants, while 41.8% (n=282/675) were women. The overall prevalence of Charcot foot among diabetic patients in this study was 1.8% (n=12/675). Gender was significantly associated with Charcot foot (p=0.005), with a higher prevalence among women (3.5%; n=10/282) than men (0.5%; n=2/393). The mean DM duration among Charcot foot patients was 19.67 years (SD±7.34), with 66.7% (n=8/12) having DM for 11-20 years. Poor glycemic control was prevalent, with a mean HbA1c of 9.21% (SD±1.87) and 75% (n=9/12) of Charcot foot patients having HbA1c ≥7%. Additionally, 50% (n=6/12) had a history of diabetic foot ulcers, and 16.7% (n=2/12) had undergone prior amputation.
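The sex association reported above (10/282 women vs 2/393 men with Charcot foot) can be sketched as a two-sided Fisher's exact test. The implementation below is an illustrative stand-in for the SPSS procedure the authors used; it sums the hypergeometric probabilities of all 2x2 tables with the same margins that are no more likely than the observed one.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def pmf(x):
        # Hypergeometric probability of x cases in row 1 given fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = pmf(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    # Sum all tables at least as extreme (probability <= observed)
    return sum(p for p in (pmf(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Charcot foot by sex, counts from the abstract:
# 10 of 282 women affected, 2 of 393 men affected
p_value = fisher_exact_two_sided(10, 272, 2, 391)
```

With these counts the test comes out well below the 0.05 threshold, consistent with the reported p=0.005.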

CONCLUSION: Charcot foot is a significant but often underdiagnosed complication in diabetic patients, particularly in women and those with long-standing, poorly controlled DM. Early detection and multidisciplinary management are crucial to reducing morbidity. Future research should focus on longitudinal studies to assess disease progression and intervention effectiveness.

PMID:40642735 | PMC:PMC12244756 | DOI:10.7759/cureus.85708


Validity and Reliability of the Filipino Version of the Kessler Psychological Distress Scale

Cureus. 2025 Jun 9;17(6):e85657. doi: 10.7759/cureus.85657. eCollection 2025 Jun.

ABSTRACT

INTRODUCTION: The Kessler Psychological Distress Scale (K6) is a short, self-administered screening instrument for non-specific psychological distress, used internationally in epidemiological studies because of its good psychometric properties. The K6 can feasibly screen for mental health problems in settings with few mental health professionals, such as low- and middle-income countries. This study aims to examine the psychometric validity, construct validity, and internal consistency of the Filipino version of the K6.

METHODS: Participants were recruited from the community in Muntinlupa City, Philippines, and from patients in a psychiatric unit run by the local government in Muntinlupa. A structured questionnaire assessing sociodemographic characteristics and including the K6 scale was used to collect data from both groups. For outpatients, the Patient Health Questionnaire-9 (PHQ-9), the World Health Organization-Five Well-Being Index (WHO-5), and the World Health Organization Quality of Life – BREF (WHOQOL-BREF) were additionally employed to examine construct validity. Depression severity (mild, moderate, or severe) was rated by a psychiatrist or a resident doctor in an unstructured clinical interview, and diagnosis was based on diagnostic criteria. Internal consistency was assessed using Cronbach’s α, and psychometric validity with an unpaired t-test comparing total K6 scores between patients assessed as having mild depression and those with moderate or severe depression. K6 scores of community participants and patients from the psychiatric unit were compared using a Mann-Whitney U test.

RESULTS: In total, 95 people from the psychiatric unit (27 male patients, 77 female patients, and one gender-diverse patient) and 405 people from the community (178 male participants, 226 female participants, and one gender-diverse participant) participated in the study. Cronbach’s α of the K6 was 0.88. Patients in the psychiatric unit had higher K6 total scores (median 14.00) than community participants (median 3.00). Correlation analysis showed that the K6 was strongly and positively correlated with the PHQ-9 (r = 0.74, p < 0.01) and moderately and negatively correlated with the WHO-5 (r = -0.51, p < 0.01) and the psychosocial health domain of the WHOQOL-BREF (r = -0.59, p < 0.01). The optimal cutoff points for moderate and serious psychological distress were 6/7 and 11/12, respectively, based on the distribution of scores. Patients assessed as having moderate or severe depression had significantly higher K6 scores (mean 15.60) than those assessed as having mild depression (mean 12.96).
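The Cronbach's α reported above (0.88) measures internal consistency across the six K6 items. A minimal sketch of the computation is shown below; the response matrix is synthetic, invented only to make the example runnable, and is not data from the study.

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: one list of respondent scores per scale item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))
    """
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    return k / (k - 1) * (1 - sum(variance(it) for it in item_scores)
                          / variance(totals))

# Synthetic responses to six K6-style items (rows = respondents,
# columns = items); illustrative only
responses = [
    [0, 1, 0, 1, 0, 1],
    [2, 2, 3, 2, 2, 3],
    [4, 3, 4, 4, 3, 4],
    [1, 1, 2, 1, 1, 1],
    [3, 4, 3, 3, 4, 3],
]
items = [list(col) for col in zip(*responses)]
alpha = cronbach_alpha(items)
```

Values above roughly 0.8, as reported for the Filipino K6, are conventionally read as good internal consistency.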

CONCLUSION: The Filipino version of K6 is appropriate for measuring psychological distress in clinical and community settings. The Filipino version of K6, including cutoff points, will be a useful tool in the local context in practice and can be used as a measurement tool in studies to promote mental health in various settings.

PMID:40642729 | PMC:PMC12240876 | DOI:10.7759/cureus.85657


Is Endoscopic Approach Alone Adequate for the Management of Ureterovaginal Fistulas?

Cureus. 2025 Jun 10;17(6):e85710. doi: 10.7759/cureus.85710. eCollection 2025 Jun.

ABSTRACT

BACKGROUND: Most ureterovaginal fistulas (UVFs) are caused by gynecologic, urologic, or colorectal surgeries. The resulting urine leakage, renal failure, and infections lower patients’ quality of life. Minimally invasive endoscopic double-J (DJ) stenting has become a popular treatment. There is insufficient research on how fistula size, diagnostic timeliness, and patient comorbidities affect the outcomes of DJ stenting.

OBJECTIVE AND METHODS: This study examines the efficacy of endoscopic DJ stent implantation in treating UVFs, addressing factors such as fistula size, timing of diagnosis, and comorbidities. This five-year retrospective study (2019 to 2024), conducted in Bhopal, India, comprised 31 patients with UVF who received endoscopic DJ stenting as the main treatment. Analyses included patient demographics, clinical presentation, fistula features, treatment outcomes, and complications. Statistical analysis included chi-square tests for categorical variables and logistic regression for risk factor assessment, with a p-value < 0.05 considered significant.

RESULTS: DJ stenting showed a success rate of 77.4% (24/31 cases), with higher rates for early diagnosis (<4 weeks) and small fistula size (<5 mm) (p=0.038 and p=0.032, respectively). Late diagnosis (>4 weeks), large fistula size (>5 mm), diabetes, and elevated creatinine (>1.2 mg/dL) were independent predictors of treatment failure in multivariate analysis. Minor complications included dysuria (16.1%, n=5) and hematuria (9.7%, n=3). One patient (3.2%) required surgery due to a forgotten DJ stent.
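The success-by-fistula-size comparison above can be sketched as a Pearson chi-square test on a 2x2 table. The abstract does not give the cell counts, so the split below (17/18 successes for small fistulas vs 7/13 for large) is hypothetical, chosen only to be consistent with the overall 24/31 success rate; it is not the authors' data.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test (df = 1) for a 2x2 table [[a, b], [c, d]].

    For one degree of freedom the p-value is erfc(sqrt(chi2 / 2)).
    """
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    chi2 = 0.0
    for obs, r, col in ((a, 0, 0), (b, 0, 1), (c, 1, 0), (d, 1, 1)):
        exp = rows[r] * cols[col] / n
        chi2 += (obs - exp) ** 2 / exp
    return chi2, math.erfc(math.sqrt(chi2 / 2))

# Hypothetical split of the 24/31 successes by fistula size:
# small (<5 mm): 17 success, 1 failure; large (>5 mm): 7 success, 6 failure
chi2, p = chi2_2x2(17, 1, 7, 6)
```

With expected cell counts this small, Fisher's exact test would often be preferred in practice; the chi-square version is shown because it is the test the methods section names.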

CONCLUSION: Endoscopic DJ stenting can be effective when the UVF is small and detected early. Renal failure, diabetes, larger fistulas, and delayed diagnosis reduce treatment success. Early identification of prognostic factors and careful patient selection are crucial to maximize results and minimize the need for surgery.

PMID:40642728 | PMC:PMC12244284 | DOI:10.7759/cureus.85710


Saroglitazar Versus Simvastatin for Metabolic and Alcohol-Associated Liver Disease (MetALD)

Cureus. 2025 Jun 9;17(6):e85652. doi: 10.7759/cureus.85652. eCollection 2025 Jun.

ABSTRACT

Individuals with steatotic liver disease who consume significant amounts of alcohol and meet at least one cardiometabolic criterion are classified as having metabolic and alcohol-associated liver disease (MetALD). The efficacy of saroglitazar and simvastatin in this population remains unclear. In this single-center retrospective cohort study, 102 patients with MetALD were included. The reduction in controlled attenuation parameter (CAP) score was greater in the saroglitazar group (-40 (-109 to 3) dB/m) compared to the simvastatin group (-33 (-100 to 36) dB/m), although this difference did not reach statistical significance (P = 0.08). However, a significant difference was observed in the change in liver stiffness measurement (LSM) scores, with the saroglitazar group showing a greater reduction (-1.9 (-19.5 to 26.3) kPa) than the simvastatin group (-0.8 (-12.2 to 9) kPa) (P = 0.01). Saroglitazar also demonstrated a more pronounced effect on glycosylated hemoglobin (HbA1c), with a decrease of -0.61 ± 0.96 compared to -0.1 ± 0.4 in the simvastatin group (P = 0.02). In this cohort, saroglitazar was more effective than simvastatin in reducing CAP, LSM, and HbA1c over six months. Further prospective, well-controlled randomized studies are warranted to validate these findings.

PMID:40642726 | PMC:PMC12240681 | DOI:10.7759/cureus.85652


Assessment of Time in Therapeutic Range (TTR) in a Primary Care Warfarin Clinic

Cureus. 2025 Jun 9;17(6):e85653. doi: 10.7759/cureus.85653. eCollection 2025 Jun.

ABSTRACT

Background: Anticoagulation is commonly used to prevent thromboembolic events. Warfarin is a cost-effective, widely used anticoagulant that requires close monitoring due to its narrow therapeutic index. Time in therapeutic range (TTR) is a measure of international normalized ratio (INR) control. Using a retrospective approach, we evaluated the TTR of patients using warfarin for anticoagulation in our state-owned urban primary care clinic.

Methods: We conducted a retrospective chart review of adult patients on warfarin therapy followed at an urban Level 1 trauma center’s ambulatory clinic from January to December 2024. Patients with at least two contiguous INR visits over a two-month period were included. Data on warfarin dosing, INR values, anticoagulation visit history, and demographics were extracted from the electronic medical record. The primary outcome was TTR, calculated using the traditional method. Statistical analysis was performed using RStudio.

Results: Overall, 103 patients in our clinic were analyzed. The average TTR for this population was 46.7%. Around 17.5% of patients had a TTR >70%, and 30.1% had a TTR >60%. Approximately 29.1% of patients presented with high bleeding risk (INR >4.5) at least once during the measured time period, with 5.8% of patients requiring ED visits for significant elevations in INR. Variables such as gender, age, and insurance status did not significantly contribute to measured TTR.

Conclusions: The average TTR of our patient population was suboptimal. These findings highlight the need for more targeted quality improvement efforts to enhance anticoagulation management in our primary care clinic. Further evaluation of our current protocol with the help of pharmacists and the enhancement of patient education may be beneficial in achieving this goal.
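The traditional method of calculating TTR named above is simply the fraction of INR measurements that fall inside the therapeutic window. A minimal sketch, assuming a 2.0-3.0 target range and an invented series of INR results:

```python
def ttr_traditional(inr_values, low=2.0, high=3.0):
    """Time in therapeutic range by the traditional method:
    the percentage of INR measurements inside [low, high]."""
    in_range = sum(low <= inr <= high for inr in inr_values)
    return 100.0 * in_range / len(inr_values)

# Hypothetical series of INR results for one patient
inrs = [1.8, 2.2, 2.6, 3.4, 2.9, 4.7, 2.4, 1.9]
ttr = ttr_traditional(inrs)  # 4 of 8 values in range -> 50.0
```

The commonly used alternative, the Rosendaal method, instead linearly interpolates INR between visits and counts person-days in range; the traditional method used here ignores the time between measurements.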

PMID:40642722 | PMC:PMC12244846 | DOI:10.7759/cureus.85653


Incidence and Predictors of Drug-Induced Liver Injury in Pediatric Tuberculosis Patients Under Anti-tubercular Therapy: A Prospective Observational Study

Cureus. 2025 Jun 9;17(6):e85661. doi: 10.7759/cureus.85661. eCollection 2025 Jun.

ABSTRACT

BACKGROUND: Tuberculosis (TB) remains a significant global health challenge, particularly in pediatric populations, where effective treatment with anti-tubercular therapy (ATT) is often complicated by adverse drug reactions. Drug-induced liver injury (DILI) is among the most serious complications of ATT, and identifying risk factors for DILI in children is essential for improving treatment safety and outcomes.

OBJECTIVE: This study aimed to determine the incidence of DILI in pediatric TB patients undergoing ATT and identify demographic and clinical factors associated with its development.

METHODS: A prospective observational study was conducted over 18 months at a tertiary care center in South India. Fifty children aged 1-14 years diagnosed with TB and initiated on ATT were enrolled. Liver function tests (LFTs) were performed at baseline, one month, and six months, and clinical parameters were monitored to identify DILI cases. Nutritional status was assessed using WHO growth standards, and statistical analyses were conducted to identify significant risk factors.

RESULTS: DILI was observed in 16 of 50 patients (32%). Malnutrition was present in 70% of DILI cases compared to 48% of non-DILI cases (p < 0.05). A higher proportion of DILI cases occurred in female patients (56%) than in male patients (44%). Baseline liver enzyme levels, specifically serum glutamic-oxaloacetic transaminase (SGOT) and serum glutamic-pyruvic transaminase (SGPT), were significantly higher in patients who developed DILI (p < 0.05). The most common clinical presentation of DILI was jaundice (50%), followed by anorexia and abdominal pain. Pulmonary TB accounted for 50% of DILI cases, while CNS TB represented 37.5%.
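The malnutrition comparison above (70% of DILI cases vs 48% of non-DILI cases) can be sketched as a two-proportion z-test. The counts below (11/16 and 16/34) are approximate reconstructions from the reported percentages, not the authors' exact data, and with only 50 patients the normal approximation is rough, so this sketch need not reproduce the reported p-value.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test comparing two independent proportions,
    using the pooled estimate for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, math.erfc(abs(z) / math.sqrt(2))

# Approximate counts: malnutrition in ~11/16 DILI cases (~70%)
# vs ~16/34 non-DILI cases (~48%)
z, p = two_proportion_z(11, 16, 16, 34)
```

For samples this small, Fisher's exact test would typically be the more defensible choice.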

CONCLUSIONS: DILI is a common complication of ATT in pediatric TB patients, with malnutrition, female gender, and elevated baseline liver enzymes identified as significant risk factors. Routine liver function monitoring and nutritional interventions should be integral to TB management in children to mitigate the risk of DILI and improve treatment outcomes.

PMID:40642720 | PMC:PMC12244840 | DOI:10.7759/cureus.85661


A Cross-Sectional Study to Assess the Healthcare-Seeking Behaviour of Tribal Communities in a District of Maharashtra

Cureus. 2025 Jun 10;17(6):e85687. doi: 10.7759/cureus.85687. eCollection 2025 Jun.

ABSTRACT

Introduction: Tribal populations in India face longstanding barriers to accessing formal healthcare due to economic, geographic, and cultural constraints. This study assessed the healthcare-seeking behaviour of tribal households in Palghar district, Maharashtra, and examined associated determinants.

Methods: A community-based cross-sectional study was conducted from August 2023 to March 2024 using multistage cluster random sampling in eight tribal villages located within a 25 km radius of the district hospital. A total of 80 households were selected, and 306 individuals were enumerated. Of these, 84 individuals (27.5%) who reported illness in the past three months were included in the analysis. Data were collected using a pretested structured questionnaire and analysed using R software (R Foundation for Statistical Computing, Vienna, Austria). Chi-square tests were applied to assess associations between healthcare-seeking behaviour and independent variables.

Results: Only 25 (29.8%) of the ill individuals sought formal healthcare, while 29 (34.5%) accessed informal providers, and 30 (35.7%) took no action. Among all variables analysed, only perceived severity of illness was significantly associated with formal healthcare utilization. Formal care was accessed by 13 of 14 (92.9%) individuals who perceived their illness as severe, compared to 11 of 40 (27.5%) with moderate and five of 30 (16.7%) with mild perception. No significant associations were found with age, gender, education, number of symptoms, or timing of illness.

Conclusion: The study highlights low formal healthcare utilization and a strong influence of perceived illness severity on care-seeking behaviour. Continued reliance on spiritual healers and non-action reflects persistent cultural and structural barriers. Interventions should include culturally sensitive health promotion, expansion of nearby healthcare services, and financial support mechanisms. Further qualitative research is needed to explore contextual factors influencing healthcare choices in tribal communities.

PMID:40642714 | PMC:PMC12243070 | DOI:10.7759/cureus.85687


Impact of Cervical and Lumbar Spine Surgeries on National Football League (NFL) Player Performance and Return-to-Play Outcomes

Cureus. 2025 Jun 10;17(6):e85706. doi: 10.7759/cureus.85706. eCollection 2025 Jun.

ABSTRACT

INTRODUCTION: American football players face a higher risk of spine injuries due to the sport’s high-impact nature, especially in the lumbar and cervical spine regions. These injuries may require surgical interventions aimed at allowing athletes to return to the sport. However, the effects of these surgeries on players’ performance and career longevity have yet to be comprehensively studied.

OBJECTIVE: This study aims to evaluate the impact of spine surgeries on National Football League (NFL) players’ return-to-play rates and performance. We hypothesize that players undergoing lumbar surgeries would demonstrate greater performance improvement and return-to-play rates compared to those undergoing cervical surgeries, with differences influenced by player position and injury location.

STUDY DESIGN AND METHODS: This is a retrospective cohort study (level of evidence III) for which NFL injury reports from 2005 to 2022 were reviewed to identify players who had undergone spine surgery. Data collected included player position, return-to-play status, and years played following the procedure. Performance metrics were gathered using Super Bowl wins and Pro Football Focus (PFF) player performance ratings. Statistical analysis was conducted using Python version 3.10.12 (Python Software Foundation, Wilmington, DE, USA) to evaluate differences in return-to-play rates, performance changes, and career duration post-surgery.

RESULTS: The study identified 144 spine surgeries (77 lumbar, 67 cervical) among 136 players. Players who had lumbar surgery had a 61% return-to-play rate, with an average performance rating increase of 6.3%. In contrast, those who had cervical surgery had a 47% return-to-play rate and an average performance rating decrease of 5.8%. Lumbar surgeries were more common among linemen with higher BMIs, while cervical surgeries were more frequent in skill positions. Players with a history of lumbar surgeries were more likely to return to play than those without previous surgeries.

CONCLUSION: Spinal surgeries significantly impact the careers of NFL players. Lumbar surgeries show better outcomes in terms of return-to-play rates and performance improvements compared to cervical surgeries. The differences in surgical outcomes based on injury location and player position highlight the need for tailored rehabilitation protocols. This study provides valuable insights for medical practitioners, team management, and athletes, contributing to a broader understanding of the implications of spine surgeries in professional football.

PMID:40642713 | PMC:PMC12244281 | DOI:10.7759/cureus.85706


Prognosis and Outcome of Carbapenem-Resistant Enterobacterales Bacteremia Managed With Ceftazidime-Avibactam and Aztreonam Combination Therapy in Tawam Hospital, UAE: A Retrospective Study

Cureus. 2025 Jun 10;17(6):e85689. doi: 10.7759/cureus.85689. eCollection 2025 Jun.

ABSTRACT

Introduction: In recent years, the medical community has grown increasingly alarmed by the escalating rates of carbapenem resistance – a global concern that is also affecting the United Arab Emirates (UAE). This rise in antibiotic resistance poses a significant challenge to healthcare systems and necessitates urgent and comprehensive research. The primary objective of this study is to investigate the factors that influence the prognosis and outcomes of bacteremia caused by carbapenem-resistant Enterobacterales (CRE) managed with a combination of ceftazidime-avibactam (CAZ-AVI) and aztreonam (ATM). Understanding the determinants of treatment success may provide valuable insights into improving patient care and outcomes.

Methods: This retrospective observational chart review was conducted at Tawam Hospital, Al Ain, from 2020 to 2023. Seventeen adult patients (aged >18 years) with confirmed CRE bacteremia who received combination therapy with CAZ-AVI and ATM were included. Data were extracted from the SEHA electronic medical records, including demographics, clinical features, laboratory findings, and outcomes such as ICU admission, in-hospital mortality, and length of stay. Statistical analyses were performed using Excel, Meta-Chart, and BlueSky Statistics. Given the small sample size, descriptive statistics were prioritized, and chi-square and unpaired t-tests were used to explore associations, recognizing limitations in statistical power.

Results: The incidence of CRE bacteremia treated with CAZ-AVI and ATM increased over the study period, with the highest number of cases recorded in 2023. Antimicrobial resistance remained consistently high across both beta-lactam and non-beta-lactam classes. The overall in-hospital mortality rate was 29.4%, with long-term four-year mortality reaching 53%. The median length of hospital stay was 19 days, and 17.6% of patients required intensive care. Poor outcomes were primarily associated with immunosuppression, prior hospitalizations, and multiple comorbidities.

Conclusion: This study highlights the increasing clinical burden of CRE bacteremia in the UAE. By identifying key prognostic factors and reporting high mortality and prolonged hospital stays despite combination therapy, it underscores the urgent need for timely intervention, improved antimicrobial stewardship, and enhanced diagnostic capacity. These findings contribute valuable regional data to the global effort to curb antimicrobial resistance.

PMID:40642711 | PMC:PMC12243072 | DOI:10.7759/cureus.85689