Categories
Nevin Manimala Statistics

Spectral Computed Tomography Diagnosis of Inflammatory Bowel Disease with Neodymium-Hyaluronic Acid Nanoparticles

Biomater Res. 2026 Mar 25;30:0301. doi: 10.34133/bmr.0301. eCollection 2026.

ABSTRACT

Early detection and noninvasive assessment of inflammatory bowel disease (IBD) remain unmet clinical challenges. Spectral computed tomography (CT) presents a potential modality for gastrointestinal (GI) imaging; however, clinical CT contrast agents are unable to achieve targeted detection of IBD in spectral CT imaging. In this study, we developed neodymium-hyaluronic acid nanoparticles (Nd-HA NPs) as novel contrast agents for spectral CT imaging of IBD. Nd-HA NPs were synthesized by conjugating HA units with the lanthanide complex neodymium-diethylenetriamine-pentaacetic acid (Nd-DTPA). The physical properties, biotoxicity, and CT imaging ability of Nd-HA NPs were systematically evaluated in vitro. Subsequently, the applicability of Nd-HA NPs for GI tract imaging was assessed in both healthy and colitis mouse models. Nd-HA NPs exhibited excellent stability, biocompatibility, and potent x-ray attenuation in vitro as novel spectral CT contrast agents. Owing to HA’s high affinity for the cluster of differentiation 44 (CD44) receptor, which is abundantly expressed at inflammatory sites, Nd-HA NPs achieved targeted spectral CT imaging of IBD and showed greater accumulation in the lesions of colitis mice than the clinical contrast agent iohexol. More importantly, after oral administration of Nd-HA NPs, the CT values of the GI tract in healthy mice, 2.5% dextran sulfate sodium (DSS)-induced mice (moderate colitis), and 5% DSS-induced mice (severe colitis) were 90.19, 140.99, and 264.07 HU, respectively, with statistically significant differences (P < 0.001). These results indicated that Nd-HA NPs have the potential to enable severity assessment of IBD in spectral CT imaging, which was further confirmed by inductively coupled plasma optical emission spectrometry analysis and histopathological evaluation. The study suggests that Nd-HA NPs could serve as effective spectral CT contrast agents, enabling noninvasive early detection and severity assessment of IBD.

PMID:41891115 | PMC:PMC13014110 | DOI:10.34133/bmr.0301


Shifting Drug Landscapes in China: A Multilevel Analysis of Traditional vs New Psychoactive Substance Use and Interregional Differences

Subst Abuse Rehabil. 2026 Feb 5;17:570414. doi: 10.2147/SAR.S570414. eCollection 2026.

ABSTRACT

BACKGROUND: China’s drug landscape is rapidly evolving, yet existing research remains fragmented, lacking a comprehensive national perspective. This study analyzes current drug use patterns, trends, and regional differences in China, providing critical insights to guide effective anti-drug policies.

METHODS: We conducted a comprehensive multilevel analysis of secondary drug use data from the China Drug Situation Report (2005-2023) and 34 academic articles (1990-2021). Our analysis included descriptive statistics, time-series analysis, and assessments of regional differences and population-specific trends.

RESULTS: This study identifies a declining trend in traditional drug use, while the use of new synthetic drugs and new psychoactive substances (NPS) is increasing. Strong negative correlations were found between law enforcement intensity and overall drug use (r: -0.89 to -0.92). Significant regional disparity in NPS use was identified, with prevalence substantially higher in southern China than in the north (p = 0.019). Traditional drugs are more prevalent in the northwest and central regions, while new drugs and NPS are more commonly found in the eastern coastal and central urban areas. The use of NPS is notably higher among adolescents in economically developed regions.

CONCLUSION: The analysis delineates a clear shift in China’s drug landscape from traditional drugs to NPS, with use concentrated in southern and coastal regions and among adolescents. These patterns suggest that effective policy responses should be regionally tailored and prioritize youth prevention in economically advanced areas. Future research is needed to verify these associations and explore underlying causal mechanisms.

PMID:41891108 | PMC:PMC13016123 | DOI:10.2147/SAR.S570414


Regression Patterns of Neovascularization in Proliferative Diabetic Retinopathy Following Panretinal Photocoagulation – A Prospective, Observational Study

Clin Ophthalmol. 2026 Mar 19;20:575268. doi: 10.2147/OPTH.S575268. eCollection 2026.

ABSTRACT

PURPOSE: To determine the short-term patterns of regression of neovascularization after panretinal photocoagulation (PRP) in eyes with proliferative diabetic retinopathy (PDR) without clinically significant macular edema.

METHODS: The study was a prospective, observational pilot study conducted at a tertiary care hospital in India from January 2023 to April 2024. A single eye (the worse eye) of each of 30 patients with PDR without diabetic macular edema was selected using convenience sampling; this approach was chosen to avoid inter-eye correlation bias. Patients meeting the inclusion criteria underwent PRP with a green laser (532 nm), and visual acuity (VA), central macular thickness (CMT), and fundus photographs were analyzed at baseline, 1 month, and 3 months post-PRP.

RESULTS: VA at baseline remained similar at 1 and 3 months post-PRP. CMT increased significantly at 1 and 3 months but stayed below 300 microns, remaining under the threshold for clinically significant macular edema. No regression was seen at 1 month in most eyes. At 3 months, however, complete regression was seen in 10% of cases and incomplete regression in 47%. Regression rates did not differ by the amount of neovascularization at baseline. Neovascularization of the disc (NVD) showed a higher odds ratio for non-regression, although this did not reach statistical significance, and a 25-micron increment in spot size demonstrated a non-significant trend toward reduced likelihood of non-regression. Every 1 gm% increment in HbA1c was associated with a 2.7 times higher likelihood of CME.
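The HbA1c finding illustrates how odds ratios from a logistic model compound multiplicatively across increments; a minimal sketch using only the reported OR of 2.7 (the multi-unit extrapolation is illustrative, not from the paper):

```python
import math

# Reported association (from the abstract): OR = 2.7 per 1-unit HbA1c increment.
or_per_unit = 2.7

def odds_ratio(k, or1=or_per_unit):
    # Under a logistic model, odds ratios compound multiplicatively,
    # so a k-unit increment corresponds to OR**k.
    return or1 ** k

print(odds_ratio(1))                 # 2.7
print(round(odds_ratio(2), 2))       # 7.29 -- a 2-unit rise more than septuples the odds
# Equivalently on the log-odds scale, the regression coefficient is beta = ln(OR)
beta = math.log(or_per_unit)
print(round(beta, 3))                # 0.993
```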

CONCLUSION: In this pilot study, short-term regression of neovascularization following PRP was variable. Exploratory trends suggested possible differential regression between NVD and neovascularization elsewhere (NVE).

PMID:41891101 | PMC:PMC13012863 | DOI:10.2147/OPTH.S575268


Glaucoma Risk with Metformin and Sulfonylurea Therapies in Type 2 Diabetes: A Retrospective Cohort Study

Clin Ophthalmol. 2026 Feb 4;20:545641. doi: 10.2147/OPTH.S545641. eCollection 2026.

ABSTRACT

BACKGROUND: Research on metformin, sulfonylureas, and open-angle glaucoma risk in type 2 diabetes mellitus (T2DM) has yielded inconsistent findings. This study examined associations between hypoglycemic treatments and glaucoma diagnosis rates.

METHODS: This retrospective cohort study analyzed electronic health records of newly diagnosed T2DM patients from the Merative™ Explorys® Therapeutic Dataset (2010-2022), comparing three groups: metformin monotherapy, sulfonylureas plus metformin combination, and untreated controls. Propensity score matching balanced demographics, glycemic control, body mass index, blood pressure, lipid levels, and comorbidities. Cox proportional hazards models calculated adjusted hazard ratios for incident glaucoma.

RESULTS: After 1:1 propensity score matching, metformin monotherapy (N=100,387) showed a non-significant trend toward higher glaucoma diagnosis rates compared to matched controls (HR 1.106, 95% CI 1.014-1.207, p=0.023; adjusted HR 1.076, 95% CI 0.995-1.163, p=0.067). Patients receiving combination therapy with sulfonylureas and metformin (N=38,692) demonstrated statistically significantly higher glaucoma diagnosis rates relative to matched controls (HR 1.235, 95% CI 1.077-1.417, p=0.002; adjusted HR 1.194, 95% CI 1.075-1.326, p=0.001). Direct comparison between combination therapy and metformin monotherapy did not reach statistical significance (adjusted HR 1.084, 95% CI 0.973-1.207, p=0.144).
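As a consistency check, the Wald z statistic and two-sided p-value can be recovered from a reported hazard ratio and its 95% CI, because the CI is symmetric on the log scale; a minimal sketch using a normal approximation:

```python
import math

def p_from_hr_ci(hr, lo, hi, z_crit=1.96):
    """Recover the Wald z statistic and two-sided p-value from a reported
    hazard ratio and its 95% CI (symmetric on the log-hazard scale)."""
    se = (math.log(hi) - math.log(lo)) / (2 * z_crit)   # back out the standard error
    z = math.log(hr) / se
    p = math.erfc(abs(z) / math.sqrt(2))                # two-sided normal tail: 2*(1 - Phi(|z|))
    return z, p

# Unadjusted metformin-monotherapy estimate from the abstract:
z, p = p_from_hr_ci(1.106, 1.014, 1.207)
print(round(p, 3))  # ~0.023, matching the reported p-value
```

The same recovery applied to the other reported HRs lands near their stated p-values as well, up to rounding of the published figures.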

CONCLUSION: This observational study found associations between diabetes medications and increased glaucoma diagnosis rates but cannot establish causality. Multiple competing explanations exist: reverse causation (clinicians preferentially prescribing metformin to diabetic patients with emerging glaucoma based on prior protective literature), confounding by indication (sicker patients requiring medication having inherently higher glaucoma risk), and detection bias (differential surveillance patterns). The non-significant metformin monotherapy finding (p=0.067) aligns with recent meta-analyses showing no association. While the statistically significant combination therapy association warrants investigation, it should not be interpreted as definitive causation. Prospective studies controlling for disease severity, surveillance patterns, and treatment indication are needed to disentangle these explanations and inform clinical practice.

PMID:41891096 | PMC:PMC13016381 | DOI:10.2147/OPTH.S545641


Selenium Supplementation in Graves’ Orbitopathy: Effects on Blood Concentrations and Clinical Outcomes

Clin Ophthalmol. 2026 Feb 9;20:569805. doi: 10.2147/OPTH.S569805. eCollection 2026.

ABSTRACT

OBJECTIVE: To evaluate changes in blood selenium concentrations and associated clinical outcomes following selenium supplementation in patients with Graves’ orbitopathy (GO) with varied severity.

STUDY DESIGN: A retrospective study.

PATIENTS AND METHODS: We retrospectively reviewed the medical records of patients with GO who received selenium supplementation with a total daily dose of 210 µg for six months at a single tertiary care center between January 2019 and January 2021. Clinical parameters, including visual acuity, exophthalmos, eyelid aperture, Clinical Activity Score (CAS), GO severity, Graves’ Ophthalmopathy Quality of Life (GO-QoL) scores, and selenium concentrations were assessed at baseline and post-treatment. Subgroup analysis was performed for patients who received no concurrent GO treatment.

RESULTS: Forty-eight patients (52.1% female, mean age 46.33±11.47 years) were included, with disease severity classified as mild (22.9%), moderate-to-severe (70.8%), and sight-threatening (6.3%). The mean selenium concentration increased from 90.46±16.40 µg/L to 113.67±19.64 µg/L (mean difference: 23.21±24.75 µg/L; 95% CI: 16.02-30.39, P<0.001). Of the 37 patients who did not receive other GO treatments, CAS improved significantly in 13 (35.13%; P=0.02), and the prevalence of eyelid edema decreased from 22.2% to 5.6% (P=0.03). GO-QoL appearance subscale scores significantly improved (mean change: 8.16±23.14; 95% CI: 0.33-15.99; P=0.04), while the changes in visual function scores were not statistically significant.
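The reported 95% CI for the selenium change is consistent with a paired t interval computed from the mean difference, its SD, and n; a minimal sketch (the df = 47 t quantile is an assumption on my part, not stated in the abstract):

```python
import math

# Values reported in the abstract: paired mean difference in blood selenium (ug/L).
n, mean_diff, sd_diff = 48, 23.21, 24.75

t_crit = 2.012  # approximate two-sided 95% t quantile for df = 47 (assumed)
se = sd_diff / math.sqrt(n)
lo, hi = mean_diff - t_crit * se, mean_diff + t_crit * se
print(round(lo, 2), round(hi, 2))  # ~16.02 to ~30.40, matching the reported 16.02-30.39
```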

CONCLUSION: Selenium supplementation was associated with a significant increase in selenium concentrations and clinically significant improvements in CAS, eyelid edema, and quality of life among patients with GO. These findings support the potential role of selenium therapy in GO management. However, larger randomized controlled trials are warranted to confirm these observations and guide dosage recommendations.

PMID:41891094 | PMC:PMC13016390 | DOI:10.2147/OPTH.S569805


Family planning desires and barriers to fertility preservation for transgender and gender-diverse military service members in the United States

Int J Transgend Health. 2025 Feb 28;27(2):1115-1129. doi: 10.1080/15532739.2025.2469279. eCollection 2026.

ABSTRACT

PURPOSE: Fertility preservation is recommended prior to initiation of gender-affirming hormone therapy; however, barriers to appropriate counseling and to pursuing fertility preservation have been described. We hypothesized that transgender and gender-diverse (TGD) people serving in the United States military face unique barriers to fertility preservation, given the historically fluctuating policies governing their ability to serve openly. We aimed to evaluate the barriers TGD individuals face when pursuing fertility preservation in the United States Military Health System (MHS).

METHODS: We developed a mixed-methods study using an explanatory sequential design. Data collection occurred between February and April 2024. We created and distributed a survey to all individuals presenting for an initial surgical consultation for gender-affirming genital reconstructive surgery at our institution and posted it on a social media page for TGD military service members. This survey assessed family-building desires, barriers to family-building, and satisfaction with prior fertility preservation counseling. In the survey, participants had the option to request a follow-up interview. We report descriptive statistics from both the survey and interviews.

RESULTS: We received 26 responses from self-identified transgender men (n = 9), transgender women (n = 14), and nonbinary (n = 3) individuals aged 21 to 49. Most respondents were married (61.5%) and desired at least one child (76.9%). The majority of respondents did not feel supported in building their families during their military service (61.5%). Twenty respondents agreed to participate in a follow-up interview, and ten interviews were conducted. Five barriers emerged from coding the interviews: heterogeneity in counseling, limited resources and support, systemic barriers, providers’ implicit bias, and insurance policy.

CONCLUSION: Numerous family planning and fertility preservation barriers exist for TGD people within the Military Health System. Although many individuals stated that they received fertility preservation counseling, many felt it was a formality and lacked individualization. This research highlights the need to standardize fertility preservation counseling, train US military providers in fertility preservation counseling for TGD service members, and pursue policy reform to decrease barriers to accessing this care.

PMID:41891058 | PMC:PMC13015071 | DOI:10.1080/15532739.2025.2469279


Causal Mediation Pathways in Continuous Postprandial Glucose Monitoring for Type 1 Diabetes Patients

medRxiv [Preprint]. 2026 Mar 17:2026.03.16.26348520. doi: 10.64898/2026.03.16.26348520.

ABSTRACT

Managing postprandial glucose in Type 1 Diabetes Mellitus (T1DM) requires understanding how carbohydrate intake affects glucose through both direct pathways and insulin-mediated compensation [1,2]. Standard analyses often treat insulin as a confounder rather than a mediator, obscuring the distinct roles of these two causal channels and hiding clinically important heterogeneity in how different patients respond to carbohydrate intake. Using meal-centered continuous glucose monitoring windows from twelve adults in the OhioT1DM 2018 and 2020 cohorts [3,4], we apply the causal mediation framework of Imai et al. [5] to decompose the total effect of carbohydrate intake on glucose change into the Average Causal Mediation Effect (ACME, the indirect effect operating through insulin), the Average Direct Effect (ADE, the effect not mediated by insulin), and the Average Total Effect (ATE) [6]. We estimate these quantities by meal type over a 3.5-hour post-meal horizon and across outcome quantiles to characterize heterogeneity in glucose control mechanisms that population-average methods fail to detect [7,8]. To adjust for confounding by longitudinal pre-meal physiological trajectories, we introduce a Causally-constrained Linear Autoencoder (CLAE) that learns low-dimensional pre-treatment representations satisfying the conditional independence assumptions required for valid mediation [9-11]. Results reveal clinically meaningful heterogeneity in response to carbohydrate and bolus insulin intake across meal types and across the conditional glucose response distribution. At dinner, the direct glycemic effect substantially exceeds the insulin-mediated response, producing persistent total effects of 10-14 mg/dL for a +30 g carbohydrate increase, indicating systematic under-compensation by evening boluses. Breakfast, in contrast, exhibits large but nearly canceling direct and mediated effects, while lunch and snack show negligible mediation structures. Quantile-specific analysis further identifies a subgroup for whom the total carbohydrate effect at dinner reaches 22.03 mg/dL (p = 0.04), statistically significant despite being undetectable in the mean-level analysis. This distributional heterogeneity points to patients whose glycemic risk is underestimated by population-average estimates and for whom current dosing recommendations are inadequate [12-14].
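In the linear special case of this framework, the decomposition reduces to the product-of-coefficients identity ATE = ACME + ADE; a minimal sketch with made-up coefficients chosen to mimic the dinner pattern described above (direct effect exceeding the insulin-mediated effect), not estimates from the paper:

```python
# Linear mediation sketch: treatment T = carbohydrate intake (g),
# mediator M = bolus insulin (U), outcome Y = glucose change (mg/dL).
# All three slopes below are invented for illustration.
a  = 0.02   # mediator model: extra insulin units per gram of carbohydrate
b  = -30.0  # outcome model: glucose change per insulin unit, holding carbs fixed
cp = 1.0    # direct carbohydrate -> glucose slope, holding insulin fixed

dT = 30.0   # a +30 g carbohydrate increase, as in the abstract

acme = a * b * dT   # indirect effect routed through insulin (ACME ~ -18 mg/dL)
ade  = cp * dT      # direct effect not mediated by insulin (ADE = 30 mg/dL)
ate  = acme + ade   # in the linear case, the effects decompose additively (ATE ~ 12 mg/dL)

print(round(acme, 2), round(ade, 2), round(ate, 2))
```

With these invented slopes the total effect lands in the abstract's 10-14 mg/dL dinner range while the direct effect dominates the (negative) insulin-mediated effect, which is the qualitative pattern being decomposed.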

PMID:41891047 | PMC:PMC13015663 | DOI:10.64898/2026.03.16.26348520


Childhood Mental Health and Body Mass Index as Mediators of Genetic Risk for Eating Disorders

medRxiv [Preprint]. 2026 Mar 16:2026.03.13.26347917. doi: 10.64898/2026.03.13.26347917.

ABSTRACT

IMPORTANCE: Eating disorders (EDs) are heritable, yet the developmental pathways through which genetic liability manifests in early life remain unclear.

OBJECTIVE: To investigate the associations between genetic liability for anorexia nervosa (AN) and binge eating (BE) and disordered eating behaviors (DEB) across childhood, and to identify the mediating roles of metabolic and psychosocial traits.

DESIGN, SETTING, AND PARTICIPANTS: This longitudinal observational study used genomic and behavioral data from the Adolescent Brain Cognitive Development (ABCD) Study, a multisite, population-based cohort of children recruited between 2016 and 2018 at ages 9 to 10 years from 21 research centers across the United States. A three-wave temporal design was employed, utilizing data from baseline (T0), Year 1 (T1), and Year 2 (T2) follow-ups. Primary analyses focused on 5,618 participants of genetically inferred European (EUR) ancestry, with exploratory analyses conducted in a diverse sample of 9,132 participants.

EXPOSURES: Polygenic scores (PGS) for AN and BE were calculated using summary statistics from the most recent genome-wide association studies. Mediators, assessed at the Year 1 follow-up (T1), included BMI and the ADHD, anxiety/depression, and social problems scales from the Child Behavior Checklist.

MAIN OUTCOMES AND MEASURES: Parent-reported DEB symptoms were assessed via the Kiddie Schedule for Affective Disorders and Schizophrenia (KSADS). For longitudinal association analyses, DEB were pooled across T0, T1, and T2 to assess the relationship between genetic liability and childhood symptom severity. For mediation analyses, DEB at the T2 follow-up were used to ensure a clear temporal sequence between mediators at T1 and the outcomes.

RESULTS: Among 5,618 EUR participants (mean [SD] age, 9.91 [0.62] years; 47% female), longitudinal association models revealed that higher AN-PGS was associated with increased AN symptoms, while BE-PGS was associated with increased BE and AN symptoms. These patterns were largely consistent in exploratory cross-ancestry analyses. Mediation analyses showed that BMI mediated genetic risks across sexes, while ADHD and anxiety/depression symptoms emerged as additional mediators in females.

CONCLUSIONS AND RELEVANCE: Genetic liabilities to AN and BE contribute to childhood DEB through sex-dependent pathways, highlighting the developmental continuity of ED risk from childhood. Integrating genetic profiles with behavioral markers may facilitate early identification and support multifaceted interventions.

KEY POINTS QUESTION: Do genetic risks for anorexia nervosa (AN) and binge eating (BE) contribute to childhood disordered eating behaviors, and what mechanisms mediate these effects?

FINDINGS: In this longitudinal study of 5,618 children of European ancestry, AN polygenic scores (AN-PGS) were associated with early AN symptoms, while BE-PGS showed transdiagnostic associations with both AN and BE symptoms. These links were mediated by BMI and psychosocial traits, including sex-specific pathways through ADHD and anxiety/depression symptoms in females.

MEANING: Our findings suggest that genetic liability to eating disorders manifests early in life through distinct metabolic and psychosocial pathways, highlighting a window for sex-specific targeted prevention.

PMID:41891041 | PMC:PMC13015685 | DOI:10.64898/2026.03.13.26347917


Predicting cognitive impairment using novel functional features of spatial proximity and circularity in the digital clock drawing test

medRxiv [Preprint]. 2026 Mar 16:2026.03.14.26348336. doi: 10.64898/2026.03.14.26348336.

ABSTRACT

The digital clock drawing test (dCDT) is a cognitive screening tool employing a digital pen. While many studies rely on summary statistics of dCDT features to predict cognitive outcomes, these approaches often involve subjective decisions such as feature selection and imputation. In this study, we introduce novel dCDT features, expressed as mathematical functions, and compare them to commonly used summary features. We included dCDTs from 3,415 participants from the Framingham Heart Study. Random forest models with five-fold cross-validation were trained to distinguish participants with mild cognitive impairment or dementia from cognitively intact participants. When combined with established time-based features, functional features related to spatial proximity and circularity demonstrated predictive power comparable to commonly used summary features. Our findings highlight the potential of integrating functional features to detect subtle motions and behaviors in digital cognitive assessments, offering new tools that may enhance diagnostic accuracy and support early detection strategies.
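The five-fold cross-validation underlying the random forest models can be sketched with a stdlib-only splitter (the splitter and seed are illustrative assumptions; the study's exact partitioning and modeling code are not described here):

```python
import random

def kfold_indices(n, k=5, seed=0):
    """Minimal sketch of a shuffled k-fold split: returns k (train, validation)
    index pairs in which every sample appears in exactly one validation fold."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)           # deterministic shuffle for reproducibility
    folds = [idx[i::k] for i in range(k)]      # k interleaved, near-equal folds
    return [(sorted(set(idx) - set(f)), sorted(f)) for f in folds]

splits = kfold_indices(3415, k=5)              # same participant count as the study
assert len(splits) == 5
# every participant appears in exactly one validation fold
assert sorted(i for _, val in splits for i in val) == list(range(3415))
```

Each of the five models would then be trained on one train set and scored on the matching held-out fold, with performance averaged across folds.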

PMID:41891039 | PMC:PMC13015632 | DOI:10.64898/2026.03.14.26348336


Freeze-Drying as a Novel Concentrating Method for Wastewater Detection of SARS-CoV-2

medRxiv [Preprint]. 2026 Mar 19:2025.01.04.25319877. doi: 10.1101/2025.01.04.25319877.

ABSTRACT

Detecting viral RNA from wastewater has emerged as a cost-effective approach for community-level surveillance during the recent SARS-CoV-2 pandemic. Although various concentrating methods have been developed, none are optimal for all key requirements for wastewater viral detection. Freeze-drying, a technique widely used for concentrating and preserving biological materials, remains underexplored for this purpose. This study compared the performance of freeze-drying and centrifugal ultrafiltration in terms of recovery efficiency, detection limit, and other key parameters. The early-pandemic samples in this study, with extremely low viral concentrations, offered an ideal benchmark for assessing each method's suitability for early-warning applications. Statistical analyses showed that freeze-drying achieved significantly higher recovery efficiency (0.338% ± 0.065% vs. 0.149% ± 0.046%), a superior detection ratio (81.6% vs. 36.8%), and a lower detection limit (0.06 vs. 0.36 copies/mL) than centrifugal ultrafiltration. To our knowledge, this is the first study to apply freeze-drying for wastewater-based viral detection. Despite its longer processing time, freeze-drying offers multiple advantages, including the elimination of pretreatment steps, a flexible workflow, reduced RNA degradation under cryogenic conditions, minimal pathogen exposure, lower labor demands, and less human interference during processing. These features position freeze-drying as a novel alternative for wastewater-based viral surveillance, particularly for decision-making when establishing such systems.

SYNOPSIS: Freeze-drying is a new wastewater virus concentrating method that outperforms centrifugal ultrafiltration, providing a simpler, safer, and more sensitive approach for community surveillance.

PMID:41891023 | PMC:PMC13015668 | DOI:10.1101/2025.01.04.25319877