Prevalence and characteristics of persistent pain among head and neck cancer survivors: A systematic review and meta-analysis

Pain Med. 2025 Apr 28:pnaf051. doi: 10.1093/pm/pnaf051. Online ahead of print.

ABSTRACT

OBJECTIVES: No up-to-date systematic review has examined the prevalence of persistent pain among head and neck cancer survivors. This systematic review aims to identify the prevalence and characteristics of persistent pain across locations among head and neck cancer survivors.

METHODS: A systematic review was conducted according to PRISMA guidelines; searches were run on December 14, 2023 (PROSPERO reference CRD42024494926). MEDLINE (via PubMed), Scopus, Web of Science, CINAHL, Ovid, and the Cochrane Library were searched. Studies had to report prevalence data on persistent pain in head and neck cancer survivors who had completed cancer treatment at least 3 months earlier. Quality of the included studies was assessed using the critical appraisal tool developed by the Joanna Briggs Institute. Statistical heterogeneity was assessed before the meta-analysis using τ², I², and Cochran's Q. Univariate meta-regression analyses were used to examine sources of heterogeneity.

RESULTS: In total, 1713 records were retrieved. After duplicates were removed, 1385 articles were screened. Ultimately, 182 articles were assessed in full text, of which 17 were included in the review. The pooled prevalence of persistent pain was 31% (95% CI: 20-42%). The meta-regression explained approximately 40% of the observed heterogeneity (R² = 40.57).
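
The abstract does not include the study-level data or code. As an illustration only, the sketch below shows how a pooled prevalence with Cochran's Q, τ², and I² can be computed under a DerSimonian-Laird random-effects model on logit-transformed proportions; the five studies are hypothetical, and the authors' exact estimator may differ.

```python
import numpy as np

# Hypothetical per-study data: survivors with persistent pain / sample size.
events = np.array([30, 55, 12, 80, 20])
n = np.array([100, 150, 60, 220, 90])

# Logit-transform each prevalence; variance of the logit is 1/e + 1/(n-e).
p = events / n
y = np.log(p / (1 - p))
v = 1 / events + 1 / (n - events)

# Fixed-effect weights and Cochran's Q.
w = 1 / v
y_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fe) ** 2)
k = len(y)

# DerSimonian-Laird between-study variance tau^2 and I^2.
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100

# Random-effects pooled logit prevalence with 95% CI, back-transformed.
w_re = 1 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = y_re - 1.96 * se, y_re + 1.96 * se
expit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled prevalence {expit(y_re):.2f} "
      f"(95% CI {expit(lo):.2f}-{expit(hi):.2f}), "
      f"tau2={tau2:.3f}, I2={I2:.1f}%, Q={Q:.2f}")
```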

CONCLUSION: This systematic review highlights that almost a third of head and neck cancer survivors experience persistent pain after completing cancer treatment. No firm conclusions can be drawn as to the extent to which cancer location, cancer treatment, pain measurement method, and timing of pain assessment may modify this prevalence. Results should be interpreted with caution given the considerable methodological variability across studies.

PMID:40293769 | DOI:10.1093/pm/pnaf051

Psychological Distress Among US-Born and Non-US-Born Black or African American Adults in the US

JAMA Netw Open. 2025 Apr 1;8(4):e256558. doi: 10.1001/jamanetworkopen.2025.6558.

ABSTRACT

IMPORTANCE: Limited research explores within-group and between-group differences in the prevalence of and factors associated with psychological distress among Black or African American adults, especially by nativity.

OBJECTIVE: To estimate the prevalence of moderate-to-severe (hereafter, moderate-severe) psychological distress and to assess factors associated with increased risk among Black or African American adults according to nativity.

DESIGN, SETTING, AND PARTICIPANTS: This cross-sectional study drew data from the 2005 to 2018 National Health Interview Surveys. The study analyzed national household probability samples of the civilian noninstitutionalized Black or African American adult population aged 18 years or older, including US-born and non-US-born subgroups, surveyed between January 2005 and December 2018. Data analysis was performed from November 2023 to January 2025.

EXPOSURES: Birthplace (ie, US-born if born in the US; non-US-born if born outside the US, including US territories). Risk factors included sociodemographic, socioeconomic, and health behavior factors.

MAIN OUTCOMES AND MEASURES: The primary outcome was moderate-severe psychological distress status based on self-reported responses to the Kessler Psychological Distress Scale. Odds ratios (ORs) with 95% CIs from logistic regression models were used to estimate associations.

RESULTS: A total of 49 820 individuals (43 885 born in the US and 5935 born outside the US) were analyzed. Overall, 21.9% of the sample (11 079 individuals) experienced moderate-severe psychological distress, with a higher prevalence among US-born (10 037 individuals [22.6%]) than non-US-born (1042 individuals [17.4%]) individuals. Individuals aged 65 years or older (especially US-born; OR, 0.51; 95% CI, 0.44-0.58) and male individuals (especially non-US-born; OR, 0.68; 95% CI, 0.56-0.82) had lower odds of experiencing moderate-severe psychological distress. Unemployment (OR, 1.91; 95% CI, 1.80-2.03) and having less than a college education were associated with higher odds of moderate-severe psychological distress across the subgroups, especially among US-born individuals. Current and former smoking was associated with higher odds of moderate-severe psychological distress, with greater odds among non-US-born individuals than among US-born and overall Black or African American individuals. Current and former alcohol drinking was associated with higher odds only among the general population (current drinking, OR, 1.37 [95% CI, 1.29-1.47]; former drinking, OR, 1.26 [95% CI, 1.16-1.37]) and US-born individuals (current drinking, OR, 1.45 [95% CI, 1.36-1.56]; former drinking, OR, 1.29 [95% CI, 1.19-1.41]), with higher ORs among the US-born population.
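
As an illustration of the reported modeling, here is a minimal sketch of deriving odds ratios with 95% CIs from a logistic regression. The variable names and data below are hypothetical, and a faithful NHIS analysis would additionally use the survey's design weights and variance strata, which this sketch omits.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical analytic file; column names are illustrative, not the NHIS codebook.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "distress": rng.integers(0, 2, 1000),   # 1 = moderate-severe (Kessler scale)
    "age_65plus": rng.integers(0, 2, 1000),
    "male": rng.integers(0, 2, 1000),
    "unemployed": rng.integers(0, 2, 1000),
})

# Logistic regression; exponentiated coefficients are odds ratios.
X = sm.add_constant(df[["age_65plus", "male", "unemployed"]])
fit = sm.Logit(df["distress"], X).fit(disp=0)
ors = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": ors, "2.5%": ci[0], "97.5%": ci[1]}).round(2))
```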

CONCLUSIONS AND RELEVANCE: In this cross-sectional study of differences in moderate-severe psychological distress by nativity among Black or African American adults, more pronounced risks were observed among US-born individuals. Longitudinal studies and data disaggregation could further elucidate health differences to improve cultural competence and adaptability in mental health research and interventions.

PMID:40293749 | DOI:10.1001/jamanetworkopen.2025.6558

All-Cause Mortality and Life Expectancy by Birth Cohort Across US States

JAMA Netw Open. 2025 Apr 1;8(4):e257695. doi: 10.1001/jamanetworkopen.2025.7695.

ABSTRACT

IMPORTANCE: Although overall US mortality rates declined from 1969 to 2020, they vary considerably by state and generation, especially when evaluated by birth cohort. Trends in mortality and life expectancy by birth cohort for US states and Washington, DC, have yet to be characterized.

OBJECTIVE: To estimate cohort mortality trends for each state and Washington, DC, and to quantify life expectancy at birth and at 40 years of age, as well as the rate of increase in mortality after 35 years of age.

DESIGN, SETTING, AND PARTICIPANTS: In this cohort study, all-cause mortality rates by single year of age (0-119) and birth cohort (1900-2000) were estimated for each state in January 2025. Mortality data and population estimates were obtained from the National Center for Health Statistics, the Centers for Disease Control and Prevention Wide-Ranging Online Data for Epidemiologic Research website, and the Surveillance, Epidemiology, and End Results database for each state and Washington, DC, by single year of age (0 to 84) and calendar year (1969 to 2020). An age-period-cohort model with constrained cubic splines for temporal effect estimates was used to estimate mortality for the 1900 to 2000 birth cohorts.

MAIN OUTCOMES AND MEASURES: Life expectancy for each cohort from birth or 40 years of age was estimated by sex and state, along with doubling time for the death rate after 35 years of age.

RESULTS: Analyses included 179 million deaths (77 million female and 102 million male). In the West and Northeast, cohort life expectancy improved from 1900 to 2000, but in some Southern states it changed by less than 3 years since 1900 in females and by less than 2 years since 1950 in males. Washington, DC, had the lowest life expectancy in the 1900 birth cohort but a greater increase than any state (from 61.1 to 72.8 years). After 35 years of age, the longest rate-doubling times were 9.39 years for females in New York and 11.47 years for males in Florida; the shortest were 7.96 years for females in Oklahoma and 8.95 years for males in Iowa.
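
To make the life-expectancy and rate-doubling quantities concrete, the sketch below builds a single-year life table from a hypothetical Gompertz-like mortality schedule and estimates the post-35 doubling time from the log-linear slope. It does not reproduce the abstract's spline-based age-period-cohort estimation; all rates are invented.

```python
import numpy as np

# Hypothetical death rates m(x) by single year of age 0-119.
ages = np.arange(120)
m = 0.0001 * np.exp(0.085 * ages)  # Gompertz-like schedule for illustration
m[0] = 0.006                       # elevated infant mortality

# Convert rates to death probabilities q(x), assuming deaths occur midyear.
q = np.clip(m / (1 + 0.5 * m), 0, 1)
q[-1] = 1.0

# Survivorship l(x), deaths d(x), person-years L(x); e0 = life expectancy at birth.
l = np.concatenate([[1.0], np.cumprod(1 - q)])[:-1]
d = l * q
L = l - 0.5 * d
e0 = L.sum() / l[0]

# Mortality-rate doubling time after age 35 from the log-linear (Gompertz) slope.
mask = ages >= 35
slope = np.polyfit(ages[mask], np.log(m[mask]), 1)[0]
print(f"e0 = {e0:.1f} years, doubling time = {np.log(2) / slope:.2f} years")
```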

CONCLUSIONS AND RELEVANCE: Cohort-specific patterns across states reveal wide disparities in mortality. Some states have experienced little or no improvements in life expectancy from the 1900 to 2000 birth cohorts. Understanding how mortality patterns vary by birth cohort within each state can inform decision-making around resource allocation and public health interventions.

PMID:40293748 | DOI:10.1001/jamanetworkopen.2025.7695

Barriers and Opportunities for Cancer Clinical Trials in Low- and Middle-Income Countries

JAMA Netw Open. 2025 Apr 1;8(4):e257733. doi: 10.1001/jamanetworkopen.2025.7733.

ABSTRACT

IMPORTANCE: Clinical trials represent the gold standard to test the safety and efficacy of new or updated approaches to treatments that will inform quality cancer care. However, cancer trials enroll few patients in low- and middle-income countries (LMICs), are often led by investigators from high-income countries, and do not adequately reflect global disease burden or population diversity.

OBJECTIVE: To identify key challenges and strategies to advance contextually relevant, quality cancer trials in LMICs.

DESIGN, SETTING, AND PARTICIPANTS: This survey study was conducted by the US National Cancer Institute from October 18 to December 22, 2023; the survey was available in English, Arabic, French, Portuguese, and Spanish. Clinicians with experience in cancer therapeutic clinical trials in LMICs were eligible. The survey covered their professional background and views on challenges and strategies for improving clinical trial opportunities in LMICs. Analysis was performed from April 2 to August 26, 2024.

MAIN OUTCOMES AND MEASURES: Respondents were asked to rate 34 challenges by impact on their ability to conduct cancer trials using a 4-point Likert scale and 8 strategies by importance using a 5-point Likert scale. Descriptive statistics summarized participants’ backgrounds, challenges, and priorities.

RESULTS: Of 453 respondents who began the survey, a total of 223 (49%) were eligible for inclusion, and 131 of those (59%) completed the survey in full. Among the 133 respondents who provided gender data, 81 (61%) were male. In all, 107 of 130 respondents (82%) were affiliated with LMIC institutions, 65 of 223 (29%) were medical oncologists, and 52 of 133 (39%) were midcareer. Financial challenges were rated as the most impactful, with 133 of 170 respondents (78%) rating difficulty obtaining funding for investigator-initiated trials as having a large impact on ability to carry out a trial. Human capacity issues followed, with 105 of 192 respondents (55%) rating lack of dedicated research time as having a large impact. Increasing opportunities for funding and improving human capacity were reported as key strategies to advance capacity to conduct clinical trials in LMICs.
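
As a minimal illustration of the descriptive summary described above, the sketch below tabulates the share of respondents rating each challenge as having a large impact. The challenge names and ratings are invented.

```python
import pandas as pd

# Hypothetical ratings: rows = respondents, columns = challenges, values 1-4
# (4 = "large impact"), mirroring the survey's 4-point Likert scale.
ratings = pd.DataFrame({
    "funding_iit": [4, 4, 3, 4, 2, 4, 4, 3],
    "research_time": [3, 4, 4, 2, 3, 4, 3, 2],
    "regulatory_delays": [2, 3, 2, 3, 4, 2, 3, 2],
})

# Descriptive summary: share rating each challenge as "large impact" (= 4).
summary = pd.DataFrame({
    "n": ratings.count(),
    "pct_large_impact": (ratings == 4).mean().round(2) * 100,
}).sort_values("pct_large_impact", ascending=False)
print(summary)
```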

CONCLUSIONS AND RELEVANCE: This survey study of clinicians with clinical trial experience in LMICs suggests that adequate funding and a well-trained research workforce are 2 predominant challenges to advancing cancer therapeutic clinical trials in LMICs. Understanding these obstacles can inform efforts to support cancer clinical trials that better reflect worldwide needs and diversity by prioritizing and sustaining research led by LMIC investigators.

PMID:40293747 | DOI:10.1001/jamanetworkopen.2025.7733

Donor History of Drug Use and Graft Survival in Pediatric Heart Transplant Recipients

JAMA Netw Open. 2025 Apr 1;8(4):e257766. doi: 10.1001/jamanetworkopen.2025.7766.

ABSTRACT

IMPORTANCE: Older children awaiting a heart transplant (HT) sometimes receive a heart offer from a donor with a history of drug use (HDU). The effect of using such donor hearts on posttransplant survival in pediatric recipients is unclear.

OBJECTIVE: To assess the association between the use of hearts from donors with HDU and posttransplant graft survival in pediatric HT recipients.

DESIGN, SETTING, AND PARTICIPANTS: For this retrospective cohort study, all pediatric HT recipients (aged <18 years) in the Organ Procurement and Transplantation Network database from January 1, 2000, to December 31, 2020, were identified. Among the recipients who received a heart from a donor with HDU, nearly all donors were aged 11 years or older. A propensity score (PS) model, limited to donors aged 11 years or older, was therefore developed to estimate the probability of receiving a heart from a donor with HDU using baseline recipient and donor variables. Data were analyzed from October 2023 to November 2024.

EXPOSURE: HT using a heart from a donor with HDU (exposure group) vs from a donor without HDU (control group).

MAIN OUTCOMES AND MEASURES: The main outcome was graft loss (death or retransplant) assessed at 90 days after transplant and long term among 90-day survivors. Kaplan-Meier survival curves and a Cox proportional hazards regression model that accounted for matching of exposure and control groups were used to compare risk of graft loss.

RESULTS: This study included 2730 pediatric HT recipients. Their median age at HT was 14 years (IQR, 11-16 years), and most (1642 [60.1%]) were male. Overall, the exposure group comprised 822 children who received a heart from a donor with HDU; of these, 765 (93.1%) were PS matched to the control group. There was no difference in risk of graft loss within 90 days (hazard ratio [HR], 0.93 [95% CI, 0.55-1.57]; P = .78) or at long-term follow-up (HR, 1.04 [95% CI, 0.87-1.25]; P = .68) between PS-matched groups. Risk of graft loss within 90 days was not significantly different in children who received a heart from a donor with a history of cocaine use (157 pairs) vs children in the control group (HR, 0.55 [95% CI, 0.19-1.54]; P = .25); however, the risk of long-term graft loss among 90-day survivors was significantly higher (HR, 2.03 [95% CI, 1.35-3.06]; P = .001).
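
The following sketch illustrates the general propensity-score workflow the abstract describes: fit a PS model, perform 1:1 nearest-neighbor matching, and compare graft survival with a Cox model. All data and column names are hypothetical, and the matching and variance details of the actual study may differ.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

# Hypothetical cohort; the data-generating process is illustrative only.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "hdu": rng.binomial(1, 0.3, n),        # donor history of drug use (exposure)
    "recip_age": rng.uniform(11, 18, n),   # recipient age at transplant (years)
    "donor_age": rng.uniform(11, 40, n),   # donor age (years)
    "time": rng.exponential(2000, n),      # days to graft loss or censoring
    "event": rng.binomial(1, 0.4, n),      # 1 = graft loss (death or retransplant)
})

# 1) Propensity score: modeled probability of receiving an HDU heart.
covs = ["recip_age", "donor_age"]
df["ps"] = LogisticRegression().fit(df[covs], df["hdu"]).predict_proba(df[covs])[:, 1]

# 2) Greedy 1:1 nearest-neighbor matching on the propensity score.
controls = df[df["hdu"] == 0].copy()
idx = []
for i, row in df[df["hdu"] == 1].iterrows():
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    idx += [i, j]
    controls = controls.drop(j)
matched = df.loc[idx]

# 3) Cox proportional hazards model on the matched sample; a full analysis
#    would also account for the pairing (e.g., robust or stratified variance).
cph = CoxPHFitter().fit(matched[["time", "event", "hdu"]],
                        duration_col="time", event_col="event")
print(cph.summary.loc["hdu", ["exp(coef)", "exp(coef) lower 95%", "p"]])
```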

CONCLUSIONS AND RELEVANCE: In this cohort study of pediatric HT recipients, there was no association of 90-day graft survival with donor HDU; however, donor history of cocaine use was associated with a higher risk of long-term graft loss. These findings may be important when considering a donor with HDU for pediatric HT candidates.

PMID:40293746 | DOI:10.1001/jamanetworkopen.2025.7766

Glitter ingestion by bromeliad-dwelling macroinvertebrates: implications for freshwater microplastic contamination

Environ Toxicol Chem. 2025 Apr 28:vgaf111. doi: 10.1093/etojnl/vgaf111. Online ahead of print.

ABSTRACT

Microplastics (MPs) are pervasive pollutants due to their extensive dispersion across terrestrial, marine, and freshwater environments, and even the atmosphere. Beyond the common sources of MPs from the degradation of larger plastic items, an often-overlooked primary source is glitter. Widely incorporated into everyday products, glitter not only poses a significant environmental risk due to its ease of dispersion but also holds cultural importance in regions like Brazil, where it is extensively used in festivities. Understanding glitter as a type of microplastic can offer valuable insights into the effects of MPs on aquatic ecosystems, particularly concerning freshwater macroinvertebrates. Given the ecological significance of this issue, our study investigated the ingestion and potential bioaccumulation of MPs by macroinvertebrates in the phytotelmata of Aechmea blanchetiana bromeliads. Organisms were exposed to a microplastic treatment (0.1 g/L glitter) for seven days, followed by taxonomic identification and analysis of MP distribution across body segments. Statistical tests assessed variations in MP distribution among taxa and body regions. Results revealed significant MP ingestion, with the highest concentrations in Culicidae and Chironomidae, suggesting that their generalist feeding behaviors facilitate MP intake. Observations also pointed to preferential accumulation of MPs in certain body parts, indicating potential bioaccumulation. Additionally, the presence of fragmenting MP particles within these taxa highlights their potential role in enhancing MP bioavailability in aquatic environments. Chironomidae and Culicidae, through ingestion and fragmentation, may increase MP dispersal across trophic levels, which could exacerbate bioaccumulation risks within the food web. This evidence supports the use of Chironomidae and Culicidae as effective biomonitors for MPs and calls attention to the ecological implications of glitter pollution in tropical freshwater ecosystems.
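
The abstract does not name the statistical tests used. As one plausible choice for comparing count data across groups, the sketch below applies a Kruskal-Wallis test to hypothetical MP counts per taxon; the actual tests and data may differ.

```python
import numpy as np
from scipy.stats import kruskal

# Hypothetical MP counts per individual, grouped by taxon (values invented).
rng = np.random.default_rng(2)
culicidae = rng.poisson(12, 30)
chironomidae = rng.poisson(10, 30)
other_taxa = rng.poisson(3, 30)

# Kruskal-Wallis: do MP burdens differ among taxa? (Nonparametric, since
# ingestion counts are unlikely to be normally distributed.)
H, p = kruskal(culicidae, chironomidae, other_taxa)
print(f"H = {H:.2f}, p = {p:.4f}")
```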

PMID:40293741 | DOI:10.1093/etojnl/vgaf111

Perioperative intravenous lidocaine as an analgesic adjunct in adolescent idiopathic scoliosis surgery

J Pediatr Orthop B. 2025 Apr 29. doi: 10.1097/BPB.0000000000001253. Online ahead of print.

ABSTRACT

Opioids are the mainstay of pain management in scoliosis surgery. We hypothesized that in adolescent idiopathic scoliosis (AIS) patients undergoing posterior spinal fusion (PSF) surgery, perioperative intravenous (IV) lidocaine would reduce postoperative opioid requirement and pain scores. In this retrospective observational before-and-after study, we identified AIS patients who underwent single-stage PSF at a tertiary university hospital from 2020 to 2022. All patients received total intravenous anesthesia. The Lidocaine group received a bolus of 1.5 mg/kg IV lidocaine prior to induction, followed by infusion at 2 mg/kg/h. At wound closure, the rate was reduced to 1 mg/kg/h and continued for 30 min in recovery. All patients received patient-controlled analgesia (PCA) morphine postoperatively. The primary outcome was total morphine consumption in the first 24 h. The secondary outcome was mean pain scores over 48 h using a numerical rating scale. We included 115 patients: 59 in the Usual Care group and 56 in the Lidocaine group. Postoperative morphine use in the first 24 h showed no significant difference (Lidocaine: 13.5 ± 8.9 mg vs Usual Care: 13.9 ± 10.6 mg; P = 0.821). The cumulative morphine milligram equivalents per kilogram bodyweight at 48 h was 0.43 mg/kg. Mean pain scores were higher in the Lidocaine group in the first 48 h (4.25 ± 0.37 vs 3.67 ± 1.46; P = 0.03). Perioperative IV lidocaine administered as an analgesic adjunct for AIS surgery did not reduce postoperative morphine requirement. Although pain scores were statistically higher in patients receiving intravenous lidocaine, the difference was minimal and lacked clinical significance.
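
As an illustration of the primary-outcome comparison, the sketch below runs a Welch two-sample t-test on simulated morphine totals generated from the reported group means and SDs; the specific test used by the authors is not stated in the abstract.

```python
import numpy as np
from scipy.stats import ttest_ind

# Simulated per-patient 24-h morphine totals (mg); the reported summaries were
# 13.5 +/- 8.9 mg (lidocaine, n = 56) and 13.9 +/- 10.6 mg (usual care, n = 59).
rng = np.random.default_rng(3)
lidocaine = rng.normal(13.5, 8.9, 56).clip(min=0)
usual_care = rng.normal(13.9, 10.6, 59).clip(min=0)

# Welch's t-test (no equal-variance assumption) for the primary outcome.
t, p = ttest_ind(lidocaine, usual_care, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```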

PMID:40293731 | DOI:10.1097/BPB.0000000000001253

Factors Influencing Change of Direction Performance in Youth Soccer Players: Velocity-Time Profile Analysis of the Pro-Agility Test

J Strength Cond Res. 2025 Apr 29. doi: 10.1519/JSC.0000000000005116. Online ahead of print.

ABSTRACT

Nakamura, H, Yamashita, D, Nishiumi, D, Nakaichi, N, and Hirose, N. Factors influencing change of direction performance in youth soccer players: velocity-time profile analysis of the Pro-Agility test. J Strength Cond Res XX(X): 000-000, 2025-The purpose of this study was to assess factors influencing change of direction (COD) deficit (CODD) and total completion time (CODTT) in adolescent soccer players through velocity-time profile analysis of the Pro-Agility Test. We enrolled 71 junior high school male soccer players and measured the 20-m sprint time and CODTT of the Pro-Agility Test, calculating CODD by subtracting the 20-m sprint time from CODTT. In addition, 3-dimensional motion data were collected using a markerless motion capture system during the Pro-Agility Test. Each section (5 m in the first, 10 m in the second, and 5 m in the third) was divided into acceleration and deceleration phases based on center of mass (COM) velocity, which were further divided into early and late halves. The mean COM acceleration during the acceleration phase (Acc) and deceleration during the deceleration phase (Dec) were calculated. Stepwise multiple regression analysis was performed to identify phases affecting CODTT and CODD. Statistical significance was set at p < 0.05. CODTT was explained by the second early Acc (β = -0.500), second late Dec (β = 0.433), and 20-m sprint time (β = 0.226) (adjusted R² = 0.858), whereas CODD was explained by the second late Dec (β = 0.561) and second early Acc (β = -0.271) (adjusted R² = 0.459). Maturity offset significantly correlated with CODTT (r = -0.456) but not with CODD (r = -0.119). The results indicated that deceleration and reacceleration during the Pro-Agility Test can be evaluated in adolescents by combining CODTT and CODD.
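
The abstract reports stepwise multiple regression without specifying the variant. The sketch below implements simple forward selection with OLS on hypothetical phase metrics; the entry criteria and software used in the original study may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(y, X, p_enter=0.05):
    """Forward selection: repeatedly add the predictor with the smallest
    p-value, stopping when no candidate reaches p < p_enter."""
    selected = []
    while True:
        remaining = [c for c in X.columns if c not in selected]
        pvals = {c: sm.OLS(y, sm.add_constant(X[selected + [c]])).fit().pvalues[c]
                 for c in remaining}
        if not pvals:
            break
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
    return sm.OLS(y, sm.add_constant(X[selected])).fit()

# Hypothetical phase metrics (mean COM acceleration/deceleration) vs CODTT.
rng = np.random.default_rng(4)
X = pd.DataFrame(rng.normal(size=(71, 4)),
                 columns=["acc2_early", "dec2_late", "sprint20m", "acc1_early"])
y = (5.2 - 0.5 * X["acc2_early"] + 0.4 * X["dec2_late"]
     + 0.2 * X["sprint20m"] + rng.normal(0, 0.2, 71))
fit = forward_stepwise(y, X)
print(fit.params.round(3))
print(f"adjusted R² = {fit.rsquared_adj:.3f}")
```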

PMID:40293724 | DOI:10.1519/JSC.0000000000005116

Handgrip Strength Associated With Leg Strength, Power, and Muscle Mass in 18-64-Year-Old Males and Females

J Strength Cond Res. 2025 Apr 29. doi: 10.1519/JSC.0000000000005089. Online ahead of print.

ABSTRACT

McBride, JM, Bauer, EC, Kaufmann, NC, Triplett, NT, and Shanely, RA. Handgrip strength associated with leg strength, power, and muscle mass in 18-64-year-old males and females. J Strength Cond Res XX(X): 000-000, 2025-The purpose of this investigation was to determine the association between handgrip strength (HGS) and measures of leg strength, power, and muscle mass. Twenty-one men (age = 32.9 ± 11.4 years, height = 175.7 ± 8.3 cm, body mass = 83.6 ± 14.4 kg, body fat = 22.6 ± 6.2%) and 24 women (age = 35.5 ± 14.0 years, height = 164.6 ± 6.8 cm, body mass = 65.2 ± 8.6 kg, body fat = 30.0 ± 5.7%) performed an HGS test, a squat and leg press 1 repetition maximum (1RM), a countermovement jump (CMJ) on a force plate, and a dominant leg peripheral quantitative computed tomography thigh scan to calculate muscle cross-section area (CSA). Lean body mass was determined through dual x-ray absorptiometry. Jump height and impulse were calculated from force-time curves from the CMJ as a representation of leg muscular power. Strong statistically significant correlations were found between HGS and squat 1RM (r = 0.80, p ≤ 0.0001), leg press 1RM (r = 0.79, p ≤ 0.0001), CMJ height (r = 0.78, p ≤ 0.0001), CMJ impulse (r = 0.84, p ≤ 0.0001), thigh muscle CSA (r = 0.75, p ≤ 0.0001), and lean body mass (r = 0.79, p ≤ 0.0001). This study indicates that HGS could be used as a preliminary screening tool for determination of leg strength, power, and muscle mass. These variables have been determined to be components of overall fitness that increase quality of life and overall health. Thus, health care providers may be able to use this simple test as an early indication of possible risk factors for poor health and well-being.
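
As a minimal illustration of the reported associations, the sketch below computes a Pearson correlation with its p-value between hypothetical handgrip-strength and squat-1RM values; the data are invented.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements: handgrip strength (kg) vs squat 1RM (kg).
rng = np.random.default_rng(5)
hgs = rng.normal(40, 8, 45)
squat_1rm = 2.0 * hgs + rng.normal(0, 12, 45)

# Pearson's r with its p-value, as reported for each strength/power variable.
r, p = pearsonr(hgs, squat_1rm)
print(f"r = {r:.2f}, p = {p:.2g}")
```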

PMID:40293720 | DOI:10.1519/JSC.0000000000005089

Synergistic Enhancement of Apo2L/TRAIL and DR4-Induced Apoptosis by Arsenic Trioxide in Triple-Negative Breast Cancer Cells: A Comparison to Conventional Chemotherapy

Cell Biochem Biophys. 2025 Apr 28. doi: 10.1007/s12013-025-01764-9. Online ahead of print.

ABSTRACT

Triple-negative breast cancer (TNBC) is an aggressive subtype lacking hormonal and HER2 receptors, making it highly resistant to treatment. Apo2L/TRAIL, a tumor necrosis factor-related ligand, induces apoptosis in cancer cells via the death receptor DR4. However, TNBC often develops resistance to TRAIL-mediated apoptosis, limiting its therapeutic potential. This study investigates whether arsenic trioxide (ATO) can overcome TRAIL resistance by modulating the Apo2L/TRAIL pathway and enhancing the effects of carboplatin (CP) and cyclophosphamide (CY). TNBC cell lines BT-20 and MDA-MB-231 were treated with ATO, CP, CY, and their combinations. Cell viability was measured using the MTT assay, while real-time PCR and Western blot analysis assessed Apo2L/TRAIL and DR4 expression. Statistical analysis was performed using ANOVA with Dunnett’s post hoc test. ATO induced dose-dependent cytotoxicity in TNBC cells, which was significantly enhanced in combination treatments. The highest reductions in cell viability were observed with 3 µM ATO plus 5000 µM CP or 500 µM CY (p < 0.0001). ATO markedly upregulated Apo2L/TRAIL and DR4 at both mRNA and protein levels, with the most pronounced effects seen in ATO-CY combinations. These findings indicate that ATO sensitizes TNBC cells to TRAIL-mediated apoptosis by upregulating DR4 and Apo2L/TRAIL, while also exhibiting strong synergistic cytotoxicity with CP and CY. This highlights ATO’s potential as an adjuvant therapy to improve TNBC treatment efficacy and overcome chemoresistance, warranting further clinical exploration.
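
As an illustration of the stated analysis (ANOVA with Dunnett's post hoc test), the sketch below compares hypothetical MTT viability readings for each treatment arm against control. It requires SciPy 1.11 or later for scipy.stats.dunnett, and all values are invented.

```python
import numpy as np
from scipy.stats import f_oneway, dunnett

# Hypothetical MTT viability (% of untreated control) per treatment arm.
rng = np.random.default_rng(6)
control = rng.normal(100, 5, 6)
ato = rng.normal(70, 6, 6)      # 3 uM ATO alone
ato_cp = rng.normal(45, 6, 6)   # ATO + carboplatin
ato_cy = rng.normal(40, 6, 6)   # ATO + cyclophosphamide

# One-way ANOVA across arms, then Dunnett's test comparing each arm to control
# (scipy.stats.dunnett was added in SciPy 1.11).
F, p = f_oneway(control, ato, ato_cp, ato_cy)
res = dunnett(ato, ato_cp, ato_cy, control=control)
print(f"ANOVA: F = {F:.1f}, p = {p:.2g}")
print("Dunnett p-values vs control:", np.round(res.pvalue, 4))
```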

PMID:40293700 | DOI:10.1007/s12013-025-01764-9