Categories
Nevin Manimala Statistics

Emergency Medical Services-Led Outreach Following Opioid-Associated Overdose: Frequency, Modality, and Treatment Linkage

Prehosp Emerg Care. 2025 Feb 7:1-9. doi: 10.1080/10903127.2025.2462211. Online ahead of print.

ABSTRACT

OBJECTIVES: Emergency medical services (EMS) post-overdose outreach programs expand beyond traditional 9-1-1 response to offer overdose survivors linkage to substance use treatment and other related harm-reducing interventions. Although these programs are intuitive and increasingly popular, evidence defining their expected outcomes is exceedingly limited. We evaluated process and patient outcomes of one large Midwestern post-overdose outreach program to describe outreach characteristics and linkage to substance use treatment.

METHODS: This retrospective cohort study used clinical program records of individuals referred to a multidisciplinary post-overdose outreach program following a non-fatal presumed opioid overdose with emergency response. Measures included (i) number of outreach attempts, (ii) modalities of outreach attempts (in-person visit, text message, letter, phone call, or electronic mail), (iii) outcome of outreach (i.e., if the individual was contacted), (iv) interventions provided including linkage to substance use treatment with coordinated admission and transportation. We used descriptive statistics to report patient characteristics, outreach frequency, outreach modality, successful contact, and treatment linkage through the program.

RESULTS: From 2020-2022, the program attempted outreach to 3,437 individuals. The median age was 37 years (interquartile range, IQR, 30-47). Most individuals were white/non-Hispanic (n = 2,077, 63.1%) and male (n = 2,084, 61.2%). Few were unhoused at the time of outreach (n = 246, 7.2%). The program made a total of 7,935 outreach attempts with a median of 2 outreach attempts (IQR 1-3) per individual. The most common outreach modalities were in-person visit (n = 3,300, 41.6%) and text message (n = 2,776, 35.0%), though phone calls and in-person visits most often resulted in successful contact (52.6% and 23.7%, respectively). Outreach attempts resulted in 743 (21.6%) successful contacts and the program linked 304 individuals (40.9% of all contacted individuals, 8.8% of all attempted outreach) to treatment. Notably, 160 (52.6%) of the 304 individuals linked to treatment required 3 or more outreach attempts before treatment linkage occurred.
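
As a quick arithmetic check (a sketch using only the counts reported above, not the program's records), the reported percentages follow directly from the raw totals:

```python
# Reproduce the outreach percentages reported above from the raw counts.
attempted = 3437   # individuals with attempted outreach
contacted = 743    # individuals successfully contacted
linked = 304       # individuals linked to treatment

contact_rate = 100 * contacted / attempted        # 21.6% successful contact
pct_of_contacted = 100 * linked / contacted       # 40.9% of contacted individuals
pct_of_attempted = 100 * linked / attempted       # 8.8% of all attempted outreach

print(f"contacted: {contact_rate:.1f}%, linked: "
      f"{pct_of_contacted:.1f}% of contacted, {pct_of_attempted:.1f}% of attempted")
```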

CONCLUSIONS: Post-overdose outreach initiated by EMS can successfully find and link individuals to substance use treatment following a non-fatal opioid overdose. However, this intervention may be resource intensive, often requiring multiple attempts at outreach and several modalities of interaction to facilitate treatment linkage.

PMID:39919200 | DOI:10.1080/10903127.2025.2462211


Repurposing lapatinib as a triple antagonist of chemokine receptors 3, 4, and 5

Mol Pharmacol. 2025 Jan;107(1):100010. doi: 10.1016/j.molpha.2024.100010. Epub 2024 Dec 12.

ABSTRACT

Chemokine receptors CCR3, CCR4, and CCR5 are G protein-coupled receptors implicated in diseases like cancer, Alzheimer’s, asthma, human immunodeficiency virus (HIV), and macular degeneration. Recently, CCR3 and CCR4 have emerged as potential stroke targets. Although only the CCR5 antagonist maraviroc is US Food and Drug Administration-approved (for HIV), we curated data on CCR3, CCR4, and CCR5 antagonists from ChEMBL to develop and validate machine learning models. The top 5-fold cross-validation statistics for these models were high for both classification and regression models for CCR3 (receiver operating characteristic [ROC], 0.94; R2 = 0.8), CCR4 (ROC, 0.98; R2 = 0.57), and CCR5 (ROC, 0.96; R2 = 0.78). The models for CCR3/4 were used to screen a small library of US Food and Drug Administration-approved drugs, and 17 were initially tested in vitro against both CCR3/4 receptors. A promising compound, lapatinib, a dual tyrosine kinase inhibitor, was identified as an antagonist for CCR3 (IC50, 0.7 μM) and CCR4 (IC50, 1.8 μM). Additional testing also identified it as a CCR5 antagonist (IC50, 0.9 μM), and it showed moderate in vitro HIV-1 inhibition. We demonstrated how machine learning can be used to identify molecules for repurposing as antagonists for G protein-coupled receptors such as CCR3, CCR4, and CCR5. Lapatinib may represent a new orally available chemical probe for these 3 receptors, and it provides a starting point for further chemical optimization for multiple diseases impacting human health. SIGNIFICANCE STATEMENT: We describe the building of machine learning models for the chemokine receptors CCR3, CCR4, and CCR5 trained on data from the ChEMBL database. Using these models, we identified lapatinib as a potent inhibitor of CCR3, CCR4, and CCR5. Our study illustrates the potential of machine learning in identifying molecules for repurposing as antagonists for G protein-coupled receptors, including CCR3, CCR4, and CCR5, which have various therapeutic applications.
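
The 5-fold cross-validated ROC evaluation reported above can be sketched as follows. This is a minimal illustration on synthetic data with a generic classifier, not the authors' ChEMBL-trained models or descriptors:

```python
# Sketch of 5-fold cross-validation scored by ROC AUC, mirroring the
# classification statistic reported above. Synthetic features stand in
# for curated ChEMBL antagonist data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic binary activity labels over 64 placeholder molecular features
X, y = make_classification(n_samples=500, n_features=64, random_state=0)

clf = RandomForestClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"mean 5-fold ROC AUC: {scores.mean():.2f}")
```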

PMID:39919162 | DOI:10.1016/j.molpha.2024.100010


Experimental study on in-situ simulation of rainfall-induced soil erosion in forest lands converted to cash crop areas in Dabie Mountains

PLoS One. 2025 Feb 7;20(2):e0317889. doi: 10.1371/journal.pone.0317889. eCollection 2025.

ABSTRACT

Soil erosion is a pervasive global challenge and a significant ecological and environmental concern in China. Its occurrence frequently triggers ecological crises, including soil degradation and water contamination. Studying the factors that govern how soil erosion occurs is therefore of great scientific and practical significance. Economic development in the Dabie Mountains of China has necessitated the conversion of vast tracts of forest land to cash crops, notably tea gardens and orchards, thereby disrupting soil structure and precipitating large-scale soil erosion. Rainfall serves as the primary catalyst for soil erosion in this region. This study was therefore designed to reveal the evolution of rainfall-induced slope erosion and its key influencing factors in forest land converted to cash crop areas in the Dabie Mountains. It focused on a tea plantation slope in the Dabie Mountains, employing four rainfall scenarios (light rain, moderate rain, heavy rain, and heavy rain following drought) in in-situ simulation experiments that mirrored the prevalent rainfall patterns of the study region. Monitoring stations for soil moisture content, slope runoff, and soil erosion were strategically positioned at varying depths across experimental plots with vegetation cover percentages of 20%, 40%, and 60%. Descriptive statistics were used to analyze the monitored runoff, soil erosion, and soil moisture data and to characterize their changes and response relationships. The findings underscore that rainfall prompts a swift surge in surface soil moisture, destabilizing the soil surface and culminating in slope erosion; thus, the rate of change in surface soil moisture content emerges as a pivotal indicator for predicting slope soil erosion. Furthermore, within the bounds of rainfall infiltration, drought conditions followed by intense rainfall exacerbate cumulative soil erosion, highlighting initial soil moisture content as a critical factor. Lastly, for the cash crop cultivation zones of the Dabie Mountains, achieving a vegetation cover of 40% or more can significantly enhance soil water retention capacity and overall soil and water conservation efficacy.

PMID:39919158 | DOI:10.1371/journal.pone.0317889


Analyzing and forecasting under-5 mortality trends in Bangladesh using machine learning techniques

PLoS One. 2025 Feb 7;20(2):e0317715. doi: 10.1371/journal.pone.0317715. eCollection 2025.

ABSTRACT

BACKGROUND: Under-5 mortality remains a critical social indicator of a country’s development and economic sustainability, particularly in developing nations like Bangladesh. This study employs machine learning models, including Linear Regression, Ridge Regression, Lasso Regression, Bayesian Ridge, Decision Tree, Gradient Boosting, XGBoost, and CatBoost, to forecast future trends in under-5 mortality. By leveraging these models, the study aims to provide actionable insights for policymakers and health professionals to address persistent challenges.

METHODS: Data from the 1993-94 to 2017-18 Bangladesh Demographic and Health Survey (BDHS) was analyzed using advanced machine learning algorithms. Key metrics, including Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), R-squared, and Mean Absolute Percentage Error (MAPE), were employed to evaluate model performance. Additionally, k-fold cross-validation was conducted to ensure robust model evaluation.
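
The evaluation metrics named above can be computed with scikit-learn. The values below are illustrative mortality-rate series, not BDHS data; this is a sketch of the metric definitions rather than the study's pipeline:

```python
# Sketch of the four evaluation metrics above: MAE, RMSE, R-squared, and MAPE.
# y_true/y_pred are illustrative under-5 mortality rates, not BDHS survey data.
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             r2_score, mean_absolute_percentage_error)

y_true = np.array([120.0, 100.0, 80.0, 60.0, 45.0])  # observed rates (illustrative)
y_pred = np.array([118.0, 103.0, 77.0, 62.0, 44.0])  # model forecasts (illustrative)

mae = mean_absolute_error(y_true, y_pred)            # mean absolute error
rmse = np.sqrt(mean_squared_error(y_true, y_pred))   # root mean squared error
r2 = r2_score(y_true, y_pred)                        # coefficient of determination
mape = 100 * mean_absolute_percentage_error(y_true, y_pred)  # as a percentage

print(f"MAE={mae:.2f} RMSE={rmse:.2f} R2={r2:.3f} MAPE={mape:.2f}%")
```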

RESULTS: This study confirms a significant decline in under-5 mortality in Bangladesh over the study period, with machine learning models providing accurate predictions of future trends. Among the models, Linear Regression emerged as the most accurate, achieving the lowest MAE (4.05), RMSE (4.56), and MAPE (6.64%), along with the highest R-squared value (0.98). Projections indicate further reductions in under-5 mortality to 29.87 per 1,000 live births by 2030 and 26.21 by 2035.

CONCLUSIONS: From 1994 to 2018, under-5 mortality in Bangladesh decreased by 76.72%. While the Linear Regression model demonstrated exceptional accuracy in forecasting trends, long-term predictions should be interpreted cautiously due to inherent uncertainties in socio-economic conditions. The forecasted rates fall short of the Sustainable Development Goal (SDG) target of 25 deaths per 1,000 live births by 2030, underscoring the need for intensified interventions in healthcare access and maternal health to achieve this target.

PMID:39919148 | DOI:10.1371/journal.pone.0317715


Financial Incentives for COVID-19 Vaccination: A Cluster Randomized Clinical Trial

JAMA Netw Open. 2025 Feb 3;8(2):e2458542. doi: 10.1001/jamanetworkopen.2024.58542.

ABSTRACT

IMPORTANCE: Prior studies found that financial incentives have small, positive direct effects in increasing COVID-19 vaccination rates, but unmeasured social spillovers (ie, changes in outcomes among untreated individuals who are socially exposed to policy beneficiaries) may diminish the overall effect of such policies.

OBJECTIVE: To assess the spillover effects of a COVID-19 vaccination financial incentive and assess whether incorporating estimates of spillover meaningfully affects broader evaluations of policy effectiveness.

DESIGN, SETTING, AND PARTICIPANTS: This population-level, address-cluster randomized clinical trial was conducted in November 2021. Participants were all adult (aged ≥18 years) residents of Ravensburg, Germany, who were randomly assigned to the treatment or control group. One resident in each address cluster was randomly selected to be an address-cluster representative. Address-cluster representatives in the treatment group received the treatment letter; all other cohabitants at that same address received the control letter. All individuals in addresses randomly assigned to the control group were mailed a control letter. Intention-to-treat data analysis was conducted from January 2022 to May 2024.

INTERVENTION: Control letters informed recipients about 7 upcoming free COVID-19 vaccination events. Treatment letters were identical to control letters, except they also offered €40 (US $41.46) for getting vaccinated at one of the events.

MAIN OUTCOMES AND MEASURES: Primary and booster COVID-19 vaccination uptake was observed and recorded on site during the public vaccination events. Primary vaccinations were defined as either the first dose of a 1-dose vaccine or the first or second dose of a 2-dose vaccine. Boosters were defined as any dose after primary vaccination. Three types of commonly used treatment effects were analyzed: direct, spillover, and overall.

RESULTS: Among 41 548 Ravensburg residents (mean [SD] age, 49.96 [19.04] years; 51.3% female), 796 (1.9%) were vaccinated at 1 of the 7 public vaccination events. The direct, spillover, and overall effects of receiving a financial incentive on primary vaccinations were all nonsignificant. For booster vaccinations, the direct effect was negative but not statistically significant (-0.32 percentage points [95% CI, -0.77 to 0.14 percentage points]; P = .17), whereas the overall effect (-0.30 percentage points [95% CI, -0.51 to -0.09 percentage points]; P = .006) was significantly negative. The spillover effect was significantly negative (-0.29 percentage points [95% CI, -0.53 to -0.06 percentage points]; P = .01), but only for the first vaccination events.

CONCLUSIONS AND RELEVANCE: This trial found null direct effects on COVID-19 vaccination uptake and negative effects on booster uptake among individuals who did not receive but were indirectly exposed to the financial incentives. The timing of this spillover suggests that cohabitants of financial incentive recipients postponed booster vaccination, thereby undermining the potential effectiveness of this policy.

TRIAL REGISTRATION: ISRCTN identifier: ISRCTN59503725.

PMID:39918821 | DOI:10.1001/jamanetworkopen.2024.58542


Cumulative Excess Body Mass Index and MGUS Progression to Myeloma

JAMA Netw Open. 2025 Feb 3;8(2):e2458585. doi: 10.1001/jamanetworkopen.2024.58585.

ABSTRACT

IMPORTANCE: Obesity is a risk factor associated with multiple myeloma (MM) and its precursor, monoclonal gammopathy of undetermined significance (MGUS). However, it is unclear how cumulative exposure to obesity affects the risk of MGUS progression to MM.

OBJECTIVE: To determine the association of cumulative exposure to excess body mass index (EBMI), defined as BMI (calculated as weight in kilograms divided by height in meters squared) greater than 25, with risk of MGUS progression to MM.

DESIGN, SETTING, AND PARTICIPANTS: This cohort study included patients with MGUS, including immunoglobin G, immunoglobin A, or light chain MGUS, from the nationwide US Veterans Health Administration database from October 1, 1999, to December 31, 2021. A published natural language processing-assisted model was used to confirm diagnoses of MGUS and progression to MM. Data were analyzed from February 12 to November 4, 2024.

EXPOSURES: Cumulative EBMI was calculated by area under the curve of measured BMI subtracting the reference BMI at 25 during the first 3 years after MGUS diagnosis.
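
The cumulative EBMI exposure above amounts to a trapezoidal area-under-the-curve calculation. The sketch below uses illustrative visit times and BMI values; treating BMI below 25 as zero excess (rather than negative) is an assumption, since the abstract does not state how values under the reference are handled:

```python
# Sketch of cumulative EBMI: area under the excess-BMI curve (BMI minus the
# reference of 25) over the first 3 years after MGUS diagnosis.
# Visit times and BMI values are illustrative, not patient data.
import numpy as np

years = np.array([0.0, 1.0, 2.0, 3.0])      # time since MGUS diagnosis (years)
bmi = np.array([27.0, 28.0, 27.5, 28.5])    # measured BMI at each visit

# Excess over the reference BMI of 25; clipping at 0 is an assumption here
excess = np.clip(bmi - 25.0, 0.0, None)

# Trapezoidal area under the excess-BMI curve, in BMI-years
cumulative_ebmi = float(((excess[:-1] + excess[1:]) / 2 * np.diff(years)).sum())
print(f"cumulative EBMI: {cumulative_ebmi:.2f} BMI-years")
```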

MAIN OUTCOMES AND MEASURES: The main outcome was progression from MGUS to MM. Multivariable Fine-Gray time-to-competing-event analyses, with death as the competing event, were used to determine associations.

RESULTS: The cohort included 22 429 patients with MGUS (median [IQR] age, 70.5 [63.5-77.9] years; 21 613 [96.4%] male), with 8329 Black patients (37.1%) and 14 100 White patients (62.9%). There were 4862 patients (21.7%) with reference range BMI (18.5 to <25), 7619 patients (34.0%) with BMI 25 to less than 30, and 8513 patients (38.0%) with BMI 30 or greater at the time of MGUS diagnosis. Compared with reference range BMI at MGUS diagnosis, patients with BMI 25 to less than 30 (adjusted hazard ratio [aHR], 1.17; 95% CI, 1.03-1.34) or 30 or greater (aHR, 1.27; 95% CI, 1.09-1.47) at MGUS diagnosis had higher risk of progression to MM. In patients with reference range BMI at MGUS diagnosis, each 1-unit increase of EBMI per year was associated with a 21% increase in progression risk (aHR, 1.21; 95% CI, 1.04-1.40). However, for patients with BMI 25 or greater at MGUS diagnosis, the incremental risk associated with cumulative EBMI exposure was not statistically significant.

CONCLUSIONS AND RELEVANCE: This cohort study found that, for patients with BMI 18.5 to less than 25 at the time of MGUS diagnosis, cumulative exposure to BMI 25 or greater was associated with an increased risk of progression. These findings suggest that for these patients, maintaining a healthy and stable weight following MGUS diagnosis may prevent progression to MM.

PMID:39918819 | DOI:10.1001/jamanetworkopen.2024.58585


Cardiometabolic Trajectories Preceding Dementia in Community-Dwelling Older Individuals

JAMA Netw Open. 2025 Feb 3;8(2):e2458591. doi: 10.1001/jamanetworkopen.2024.58591.

ABSTRACT

IMPORTANCE: Poor cardiometabolic health is a risk factor associated with cognitive impairment in later life, but it remains unclear whether cardiometabolic trajectories can serve as early markers associated with dementia.

OBJECTIVE: To compare cardiometabolic trajectories that precede dementia diagnosis with those among individuals without dementia.

DESIGN, SETTING, AND PARTICIPANTS: This case-control study analyzed a sample drawn from community-dwelling participants in the Aspirin in Reducing Events in the Elderly (ASPREE) study. Recruitment through primary care physicians occurred between March 2010 and December 2014, with participants followed up for a maximum of 11 years. Dementia cases were matched on sociodemographic characteristics and time of diagnosis to dementia-free controls. Data analysis was performed between February and June 2024.

EXPOSURES: Body mass index (BMI), waist circumference, systolic and diastolic blood pressure, glucose levels, high- and low-density lipoprotein (HDL and LDL) and total cholesterol levels, and triglyceride levels were measured repeatedly between 2010 and 2022.

MAIN OUTCOMES AND MEASURES: Dementia (Diagnostic and Statistical Manual of Mental Disorders [Fourth Edition] criteria) was adjudicated by an international expert panel.

RESULTS: Among 5390 participants (mean [SD] age, 76.9 [4.8] years; 2915 women [54.1%]), there were 2655 individuals (49.3%) with less than 12 years of education. The study included 1078 dementia cases and 4312 controls. Up to a decade before diagnosis, dementia cases compared with controls had lower BMI for all years from -7 years (marginal estimate, 27.52 [95% CI, 27.24 to 27.79] vs 28.00 [95% CI, 27.86 to 28.14]; contrast P = .002) to 0 years (marginal estimate, 26.09 [95% CI, 25.81 to 26.36] vs 27.22 [95% CI, 27.09 to 27.36]; contrast P < .001) and lower waist circumference for all years from -10 years (marginal estimate, 95.45 cm [95% CI, 94.33 to 96.57 cm] vs 97.35 cm [95% CI, 96.79 to 97.92 cm]; contrast P = .003) to 0 years (marginal estimate, 93.90 cm [95% CI, 93.15 to 94.64 cm] vs 96.67 cm [95% CI, 96.30 to 97.05 cm]; contrast P < .001); cases also had a faster decline in BMI (linear change β, -0.13 [95% CI, -0.19 to -0.08]) and waist circumference (linear change β, -0.30 cm [95% CI, -0.51 to -0.08 cm]). Compared with controls, cases generally had higher HDL levels, in particular from 5 years (marginal estimate, 62.57 mg/dL [95% CI, 61.59 to 63.56 mg/dL] vs 60.84 mg/dL [95% CI, 60.35 to 61.34 mg/dL]; contrast P = .002) to 3 years (marginal estimate, 62.78 mg/dL [95% CI, 61.82 to 63.74 mg/dL] vs 61.08 mg/dL [95% CI, 60.60 to 61.56 mg/dL]; contrast P = .002) before dementia but with a decline in levels just before diagnosis (linear change β, -0.47 mg/dL [95% CI, -0.86 to -0.07 mg/dL]). Dementia cases had lower systolic blood pressure and triglyceride levels in the decade before diagnosis and higher LDL and total cholesterol levels, but these were not significantly different from controls.

CONCLUSIONS AND RELEVANCE: In this study of older individuals, decline in BMI, waist circumference, and HDL occurred up to a decade before dementia diagnosis. These findings provide insights into cardiometabolic changes preceding dementia and the potential for early monitoring and intervention.

PMID:39918818 | DOI:10.1001/jamanetworkopen.2024.58591


Analysis of the surprise question as a tool for predicting death in neonates

Eur J Pediatr. 2025 Feb 7;184(2):182. doi: 10.1007/s00431-024-05879-8.

ABSTRACT

The Surprise Question “Would you be surprised if the patient died in the next 12 months?” lacks pediatric research, particularly in neonatal patients. Our study aims to analyze the Surprise Question’s predictive ability in neonates and to explore pediatricians’ views on palliative care patient identification. A prospective cross-sectional study was conducted from February 2021 to June 2023, including all newborns admitted to the Neonatal Intensive Care Unit of a pediatric tertiary hospital and its pediatricians. Patients with less than a year of follow-up since admission were excluded from final analyses. Recorded variables included patient demographics and condition, pediatricians’ profiles and opinions regarding the Surprise Question, and palliative care patient identification. The Surprise Question was posed to one or more pediatricians per neonate at admission, 7 days of life, and 28 days of life, with patient status recorded after 12 months to construct a confusion matrix of prognostic test results. A total of 51 pediatricians participated. Most felt they had limited criteria for identifying palliative care patients (55%), believed the Surprise Question could be useful (77%), and believed it could predict death (75%). The Surprise Question was answered at at least one of the three time points for 262 neonates (61% male and at least 36% preterm), with a sufficient sample at each time point to study its predictive ability. Negative predictive values were consistently high, and the positive predictive value was highest at 7 days (26%).
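
The confusion-matrix summary above reduces to predictive-value arithmetic. The cell counts below are illustrative, not the study's data; a "positive test" here means a clinician answering that they would not be surprised by the patient's death:

```python
# Sketch of positive and negative predictive values from a confusion matrix
# of Surprise Question answers versus observed 12-month outcomes.
# Cell counts are illustrative, not the study's data.
def predictive_values(tp, fp, fn, tn):
    """Return (PPV, NPV) from confusion-matrix cell counts.

    tp: "not surprised" and died      fp: "not surprised" and survived
    fn: "surprised" and died          tn: "surprised" and survived
    """
    ppv = tp / (tp + fp)  # of "not surprised" answers, fraction who died
    npv = tn / (tn + fn)  # of "surprised" answers, fraction who survived
    return ppv, npv

# Hypothetical counts chosen only to illustrate the calculation
ppv, npv = predictive_values(tp=8, fp=22, fn=2, tn=180)
print(f"PPV={ppv:.0%} NPV={npv:.0%}")
```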

CONCLUSIONS: The Surprise Question is a promising tool for predicting neonatal outcome and could guide professionals in initiating palliative care discussions. The 7-day mark appears more suitable for this application.

WHAT IS KNOWN: • Previous research has established the Surprise Question as a valuable tool for predicting death in adults. However, limited research exists on its use in pediatric patients and its role remains unexplored in the neonatal period.

WHAT IS NEW: • The study evaluates the Surprise Question as a tool for predicting death within the first year of life when applied in the neonatal period. It offers insights into its predictive ability and most suitable time for its application. This study sheds light on its applicability in neonatal care, offering a valuable tool for early identification and referral to palliative care.

PMID:39918789 | DOI:10.1007/s00431-024-05879-8


The basics of PET molecular imaging in neurodegenerative disorders with dementia and/or parkinsonism

Eur Radiol. 2025 Feb 6. doi: 10.1007/s00330-025-11388-5. Online ahead of print.

ABSTRACT

Positron emission tomography (PET) imaging biomarkers have become crucial in understanding and diagnosing neurodegenerative disorders. PET imaging allows for the in vivo quantification of molecular targets with high sensitivity, aiding in the study of disease pathophysiology and progression from preclinical stages. By visualising specific molecular pathologies, PET biomarkers enable a shift from symptom-based to biology-based definitions of neurodegenerative diseases, allowing for earlier and more accurate detection and diagnosis. This has significant implications for developing and testing new therapies aimed at modifying disease course. In this review, we will go through the standards of PET imaging in the evaluation of neurodegenerative disorders. Specifically, we will review PET molecular imaging of amyloid-β plaques, tau pathology, as well as the effect of neurodegeneration on neuronal activity in different disorders. Moreover, we will review PET tracers targeting neurotransmitter systems such as the dopaminergic system, which can detect early functional changes in movement disorders. Issues related to methods, image interpretation, normal findings, and mimics will be an important part of this review. KEY POINTS: Question A review of PET molecular imaging tools for assisting the clinical diagnosis of patients presenting with cognitive impairment or parkinsonism and suspected neurodegenerative disease. Findings PET molecular imaging tools vary widely in their image acquisition protocols and image interpretation, allowing us to study different features of neurodegenerative diseases. Clinical relevance The majority of PET molecular imaging tools are currently in use in our clinical practice. Despite the differences between them, standardised visual reading methods and specific semi-quantitative parameters have been established, allowing for their use.

PMID:39918781 | DOI:10.1007/s00330-025-11388-5


Nanoparticles as an Alternative Strategy to Control Foot and Mouth Disease Virus in Bovines

Biol Trace Elem Res. 2025 Feb 7. doi: 10.1007/s12011-025-04533-0. Online ahead of print.

ABSTRACT

Livestock, mainly bovines, play an important role in rural livelihoods and the economies of countries worldwide, as they are a potent source of employment and income for workers and producers. Several viruses causing numerous deadly diseases have affected bovines over the years. Foot and mouth disease, caused by the foot and mouth disease virus, is a highly contagious disease that has been known for hundreds of years, causing massive destruction in bovines and huge economic losses. To control foot and mouth disease virus, various strategies including antivirals, vaccines, movement control, biosecurity measures, and culling of infected animals have been employed, but these methods alone are not sufficient to fully contain the disease. Nanotechnology has emerged as a revolutionary field, and nanoparticles have demonstrated superior performance compared with other strategies, highlighting their potential as an effective approach. They have proven beneficial against various viral diseases and can be synthesized by physical, chemical, and biological methods. Nanoparticles have been successful in inhibiting the replication of the foot and mouth disease virus. Silver, gold, zinc, iron, copper, calcium phosphate, magnesium oxide, and ferritin nanoparticles with pharmacological and therapeutic actions have been used efficiently. This article focuses on the use of various nanoparticles as a control strategy and on their use as nanocarriers to aid vaccine delivery and enhance the immune response against the foot and mouth disease virus.

PMID:39918774 | DOI:10.1007/s12011-025-04533-0