Categories
Nevin Manimala Statistics

Cytotoxic Effects of Common Irrigation Solutions on Chondrosarcoma and Giant Cell Tumors of Bone

J Bone Joint Surg Am. 2022 Oct 27. doi: 10.2106/JBJS.22.00404. Online ahead of print.

ABSTRACT

BACKGROUND: Irrigation is commonly used as an adjuvant treatment during the intralesional curettage of bone tumors. The goal of the present study was to analyze the in vitro cytotoxicity of commonly used irrigation solutions on chondrosarcoma and giant cell tumor (GCT) cells as there is no consensus on which solution leads to the greatest amount of cell death.

METHODS: An in vitro evaluation was performed by exposing human GCT and human chondrosarcoma cell lines to 0.9% saline solution, sterile water, 70% ethanol, 3% hydrogen peroxide, 0.05% chlorhexidine gluconate (CHG), and 0.3% povidone iodine solutions independently for 2 and 5 minutes. A low-cytotoxicity control (LCC) and a high-cytotoxicity control (HCC) were established to determine the mean cytotoxicity of each solution and each solution’s superiority to LCC and non-inferiority to HCC.

RESULTS: The present study demonstrated that 0.05% CHG was non-inferior to the HCC when chondrosarcoma was exposed for 5 minutes and when GCT was exposed for 2 and 5 minutes (mean cytotoxicity, 99% to 102%) (p < 0.003 for all). Sterile water was superior to the LCC when chondrosarcoma was exposed for 5 minutes and when GCT was exposed for 2 minutes (mean, 28% to 37%) (p < 0.05). Sterile water (mean, 18% to 38%) (p < 0.012) and 3% hydrogen peroxide (mean, 7% to 16%) (p < 0.001) were both inferior to the HCC. The 3 other solutions were non-superior to the LCC (mean, -24% to -5%) (p < 0.023).
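The reported cytotoxicity values above 100% and below 0% make sense if each solution's effect is normalized against the two controls. The abstract does not give the formula, so the sketch below is an illustrative assumption: cell kill scaled so the low-cytotoxicity control (LCC) sits at 0% and the high-cytotoxicity control (HCC) at 100%.

```python
def cytotoxicity_pct(sample_viability, lcc_viability, hcc_viability):
    """Express a sample's cell kill relative to the controls.

    0% corresponds to the low-cytotoxicity control (LCC) and 100% to the
    high-cytotoxicity control (HCC). Values outside 0-100 occur when a
    solution kills fewer cells than the LCC or more than the HCC.
    NOTE: this normalization is a hypothetical illustration, not the
    paper's stated formula.
    """
    return 100.0 * (lcc_viability - sample_viability) / (lcc_viability - hcc_viability)

# Hypothetical viabilities: a solution leaving 2% viable cells when the
# LCC leaves 90% and the HCC leaves 5% scores just above 100%:
print(round(cytotoxicity_pct(2.0, 90.0, 5.0), 1))  # 103.5
print(cytotoxicity_pct(90.0, 90.0, 5.0))           # 0.0 (matches the LCC)
```

Under this convention, the 99% to 102% range reported for 0.05% CHG means it killed essentially as many cells as the high-cytotoxicity control.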

CONCLUSIONS: In vitro irrigation in 0.05% CHG provided high cytotoxicity, comparable with the HCC. Therefore, the use of a 0.05% CHG solution clinically could serve as a potential chemical adjuvant during intralesional curettage of chondrosarcoma and GCT.

CLINICAL RELEVANCE: In an effort to reduce the burden of residual tumor cells, irrigation solutions are often utilized as adjuvant local therapy. Use of a 0.05% CHG solution clinically could serve as a potential chemical adjuvant to intralesional curettage of chondrosarcoma and GCT. Further in vivo studies may be indicated to assess clinical outcomes and safety associated with the use of 0.05% CHG in the treatment of chondrosarcoma and GCT.

PMID:36367764 | DOI:10.2106/JBJS.22.00404

Food Insufficiency Following Discontinuation of Monthly Child Tax Credit Payments Among Lower-Income US Households

JAMA Health Forum. 2022 Nov 4;3(11):e224039. doi: 10.1001/jamahealthforum.2022.4039.

ABSTRACT

IMPORTANCE: The 2021 expanded Child Tax Credit provided advance monthly payments to many US families with children from July through December 2021 and was associated with a reduction in food insufficiency. Less is known about the effect of the discontinuation of monthly payments.

OBJECTIVE: To assess whether the discontinuation of monthly Child Tax Credit payments was associated with subsequent changes in food insufficiency among lower-income US households with children.

DESIGN, SETTING, AND PARTICIPANTS: This population-based cross-sectional study used data from the Household Pulse Survey, a recurring online survey of US households conducted by the US Census Bureau, from January 2021 to March 2022. This study estimated difference-in-differences regression models for households making less than $50 000, less than $35 000, and less than $25 000 annually, adjusting for demographic characteristics and state of residence. The estimation sample of households making less than $50 000/y included 114 705 responses, representing a weighted population size of 27 342 296 households.
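The core of a difference-in-differences estimate is the change in the exposed group minus the change in a comparison group over the same period. The study's regression models additionally adjust for demographics and state; the sketch below, with hypothetical rates, shows only the unadjusted estimator.

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: the change in the treated group's outcome
    minus the change in the comparison group's outcome over the same period,
    netting out shared time trends."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical food-insufficiency rates (fractions) before/after monthly
# payments ended, for households with children (treated) vs. without:
effect = did_estimate(0.210, 0.255, 0.180, 0.190)
print(round(effect, 3))  # 0.035 -> a 3.5-percentage-point increase
```

The full models would add covariates and survey weights on top of this contrast; the two-by-two difference is the identifying comparison.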

EXPOSURES: Receipt of monthly Child Tax Credit payments, as measured by living in a household with children during the period of monthly payments from July through December 2021.

MAIN OUTCOMES AND MEASURES: Household food insufficiency, as measured by a respondent indicating that there was sometimes or often not enough food to eat in the household in the previous 7 days.

RESULTS: Among 114 705 households making less than $50 000/y, respondents were predominantly female (57%); White (71%); not of Hispanic, Latino, or Spanish origin (79%); had high school or equivalent education (38%); and were unmarried (70%). Following the discontinuation of monthly Child Tax Credit payments, food insufficiency in US households with children increased by 3.5 percentage points (95% CI, 1.4-5.7 percentage points) among households making less than $50 000/y, 4.9 percentage points (95% CI, 2.6-7.3 percentage points) among households making less than $35 000/y, and 6.2 percentage points (95% CI, 3.3-9.3 percentage points) among households making less than $25 000/y. These estimates represent a relative increase in food insufficiency of approximately 16.7% among households making less than $50 000/y, 20.8% among households making less than $35 000/y, and 23.2% among households making less than $25 000/y.
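The relative increases quoted above are the absolute (percentage-point) increases divided by the implied baseline food-insufficiency rate, so the baseline can be back-calculated as a consistency check:

```python
def implied_baseline(pp_increase, relative_increase_pct):
    """Back out the baseline rate (in percentage points) implied by an
    absolute increase and its reported relative increase."""
    return pp_increase / (relative_increase_pct / 100.0)

# Paired absolute and relative increases from the abstract:
for pp, rel in [(3.5, 16.7), (4.9, 20.8), (6.2, 23.2)]:
    print(round(implied_baseline(pp, rel), 1))
# -> roughly 21, 24, and 27 percentage points: lower-income bands start
#    from higher baseline food insufficiency, so the same absolute rise
#    is a larger relative one only when the baseline grows more slowly.
```

This is arithmetic on the reported figures only; the implied baselines are approximate because the published percentages are rounded.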

CONCLUSIONS AND RELEVANCE: In this population-based cross-sectional study, discontinuation of monthly Child Tax Credit payments in December 2021 was associated with a statistically significant increase in household food insufficiency among lower-income households, with the greatest increase occurring in the lowest-income households.

PMID:36367738 | DOI:10.1001/jamahealthforum.2022.4039

Exploration of Residual Confounding in Analyses of Associations of Metformin Use and Outcomes in Adults With Type 2 Diabetes

JAMA Netw Open. 2022 Nov 1;5(11):e2241505. doi: 10.1001/jamanetworkopen.2022.41505.

ABSTRACT

IMPORTANCE: Metformin is often used as a first-line therapy for type 2 diabetes; however, frequent discontinuation with reduced kidney function and increased disease severity indicates that a comparison with any other group (eg, nonusers or insulin users) must address significant residual confounding concerns.

OBJECTIVES: To examine the potential for residual confounding in a commonly used observational study design applied to metformin and to propose a more robust study design for future observational studies of metformin.

DESIGN, SETTING, AND PARTICIPANTS: This retrospective cohort study with a prevalent user design was conducted using an administrative claims database for Medicare Advantage beneficiaries in the US. Participants were categorized into 2 distinct cohorts: 404 458 individuals with type 2 diabetes and 81 791 individuals with prediabetes. Clinical history was observed in 2018, and end points were observed in 2019. Statistical analyses were conducted between May and December 2021.

EXPOSURES: Prevalent use (recent prescription and history of use on at least 90 of the preceding 365 days) of metformin or insulin but not both at the start of the observation period.

MAIN OUTCOMES AND MEASURES: Total inpatient admission days in 2019 and total medical spending (excluding prescription drugs) in 2019. Each of these measures was treated as a binary outcome (0 vs >0 inpatient days and top 10% vs bottom 90% of medical spending).

RESULTS: The study included 404 458 adults with type 2 diabetes (mean [SD] age, 74.5 [7.5] years; 52.7% female). The prevalent user design yielded a strong apparent metformin benefit: reduced inpatient admissions (odds ratio, 0.60; 95% CI, 0.58-0.62) and reduced medical expenditures (odds ratio, 0.57; 95% CI, 0.55-0.60). However, implementation of additional robust design features (negative control outcomes and a complementary cohort) revealed that the estimated beneficial effect was attributable to residual confounding associated with individuals’ overall health, not metformin itself.

CONCLUSIONS AND RELEVANCE: These findings suggest that common observational study designs for studies of metformin in a type 2 diabetes population are at risk for consequential residual confounding. By performing 2 additional validation checks, the study design proposed here exposes residual confounding that nullifies the initially favorable claim derived from a common study design.

PMID:36367726 | DOI:10.1001/jamanetworkopen.2022.41505

Association of Adverse Childhood Experiences and Social Isolation With Later-Life Cognitive Function Among Adults in China

JAMA Netw Open. 2022 Nov 1;5(11):e2241714. doi: 10.1001/jamanetworkopen.2022.41714.

ABSTRACT

IMPORTANCE: Studies investigating the association of threat-related and deprivation-related adverse childhood experiences (ACEs) with later-life cognitive decline are lacking.

OBJECTIVES: To evaluate the independent association of threat-related and deprivation-related ACEs with cognitive decline over time among middle-aged and older Chinese adults and to examine the modifying role of social isolation in such associations.

DESIGN, SETTING, AND PARTICIPANTS: This prospective cohort study used cognitive data from the China Health and Retirement Longitudinal Study (CHARLS) baseline survey that was administered between June 1, 2011, and March 31, 2012, and the CHARLS follow-up survey administered between July 1 and September 30, 2015. The life history survey with information of ACEs was additionally administered between June 1 and December 31, 2014. Statistical analysis was performed from March 1 to July 31, 2022. The study population consisted of middle-aged and older adults (age range, 45-97 years) with complete data on ACEs and 2 cognitive assessments and without cognitive impairment at baseline.

EXPOSURES: Five threat-related ACEs (ie, physical abuse, household substance abuse, domestic violence, unsafe neighborhood, and bullying) and 5 deprivation-related ACEs (ie, emotional neglect, household mental illness, incarcerated household member, parental separation or divorce, and parental death) before 17 years of age were queried by questionnaires. The cumulative scores of the 2 ACE dimensions were calculated and grouped into 3 categories as 0, 1, and 2 or more in main analyses.

MAIN OUTCOMES AND MEASURES: Cognitive function was measured by episodic memory and executive function. Global cognition was further calculated as the total score of these 2 dimensions. The raw scores of each cognitive test were standardized to z scores using baseline means and SDs. Linear mixed-effects models were constructed to examine the association between 2 dimensions of ACEs and the rate of annual cognitive decline. The modifying role of baseline social isolation in such associations was assessed with 3-way interaction tests.
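Standardizing each cognitive test against the baseline mean and SD, as described above, puts all tests on a common scale so that annual declines can be reported in baseline-SD units. A minimal sketch with hypothetical scores:

```python
def to_z_scores(scores, baseline_mean, baseline_sd):
    """Standardize raw cognitive test scores against the baseline
    distribution, so changes over time are in baseline-SD units
    (e.g., a beta of -0.035 SD/y in the mixed-effects models)."""
    return [(s - baseline_mean) / baseline_sd for s in scores]

# Hypothetical raw episodic-memory scores, baseline mean 6.0 and SD 2.0:
z = to_z_scores([6.0, 5.0, 8.0], 6.0, 2.0)
print(z)  # [0.0, -0.5, 1.0]
```

The mixed-effects models then regress these z scores on time, ACE category, and their interaction, with random effects capturing person-level variation; that machinery is beyond this sketch.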

RESULTS: Of the 6466 participants included in main analyses, 3301 (51.1%) were men and the mean (SD) age was 57.2 (8.3) years. Compared with no exposures, experience of 1 deprivation-related ACE was associated with faster cognitive decline in global cognition (β = -0.012 [95% CI, -0.022 to -0.002] SD/y) and executive function (β = -0.010 [95% CI, -0.020 to -0.00002] SD/y), whereas individuals with at least 2 childhood deprivations had faster cognitive declines in all cognitive tests (β = -0.035 [95% CI, -0.050 to -0.019] SD/y for global cognition; β = -0.047 [95% CI, -0.068 to -0.025] SD/y for episodic memory; β = -0.019 [95% CI, -0.034 to -0.004] SD/y for executive function). However, such an association was not observed for threat-related ACEs. In addition, baseline social isolation was a significant modifier in the associations between deprivation-related ACEs and cognitive declines in global cognition (β = -0.033 [95% CI, -0.061 to -0.005] SD/y; P = .02 for 3-way interaction) and executive function (β = -0.032 [95% CI, -0.059 to -0.005] SD/y; P = .02 for 3-way interaction).

CONCLUSIONS AND RELEVANCE: Deprivation-related ACEs, but not threat-related ACEs, were associated with faster decline in later-life cognitive function, whereas social isolation could modify such detrimental impact. These findings highlight the potential benefits of promoting social integration in maintaining later-life cognitive function among individuals who have experienced childhood deprivation.

PMID:36367722 | DOI:10.1001/jamanetworkopen.2022.41714

Comparison of Acupuncture vs Sham Acupuncture or Waiting List Control in the Treatment of Aromatase Inhibitor-Related Joint Pain: A Randomized Clinical Trial

JAMA Netw Open. 2022 Nov 1;5(11):e2241720. doi: 10.1001/jamanetworkopen.2022.41720.

ABSTRACT

IMPORTANCE: Aromatase inhibitors (AIs) have proven efficacy for the treatment of hormone-sensitive breast cancer; however, arthralgias (pain and stiffness) contribute to nonadherence with therapy for more than 50% of patients.

OBJECTIVE: To examine the effect of acupuncture in reducing AI-related joint pain through 52 weeks.

DESIGN, SETTING, AND PARTICIPANTS: A randomized clinical trial was conducted at 11 sites in the US from May 1, 2012, to February 29, 2016, with a scheduled final date of follow-up of September 5, 2017, to compare true acupuncture (TA) with sham acupuncture (SA) or waiting list control (WC). Women with early-stage breast cancer were eligible if they were taking an AI and scored 3 or higher on the Brief Pain Inventory Worst Pain (BPI-WP) item (score range, 0-10; higher scores indicate greater pain). Analysis was conducted for data received through May 3, 2021.

INTERVENTIONS: Participants were randomized 2:1:1 to the TA (n = 110), SA (n = 59), or WC (n = 57) group. The TA and SA protocols were composed of 6 weeks of intervention at 2 sessions per week (12 sessions overall), followed by 6 additional weeks of intervention with 1 session per week. Participants randomized to WC received no intervention. All participants were offered 10 acupuncture sessions to be used between weeks 24 and 52.

MAIN OUTCOMES AND MEASURES: In this long-term evaluation, the primary end point was the 52-week BPI-WP score, compared by study group using linear regression, adjusted for baseline pain and stratification factors.

RESULTS: Among 226 randomized women (mean [SD] age, 60.7 [8.6] years; 87.7% White; mean [SD] baseline BPI-WP score, 6.7 [1.5]), 191 (84.5%) completed the trial. In a linear regression, 52-week mean BPI-WP scores were 1.08 (95% CI, 0.24-1.91) points lower in the TA compared with the SA group (P = .01) and were 0.99 (95% CI, 0.12-1.86) points lower in the TA compared with the WC group (P = .03). In addition, 52-week BPI pain interference scores were statistically significantly lower in the TA compared with the SA group (difference, 0.58; 95% CI, 0.00-1.16; P = .05). Between 24 and 52 weeks, 12 (13.2%) of TA, 6 (11.3%) of SA, and 5 (10.6%) of WC patients reported receipt of acupuncture.

CONCLUSIONS AND RELEVANCE: In this randomized clinical trial, women with AI-related joint pain receiving 12 weeks of TA had reduced pain at 52 weeks compared with controls, suggesting long-term benefits of this therapy.

TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT01535066.

PMID:36367721 | DOI:10.1001/jamanetworkopen.2022.41720

The use of electrodermal activity in pulpal diagnosis and dental pain assessment

Int Endod J. 2022 Nov 11. doi: 10.1111/iej.13868. Online ahead of print.

ABSTRACT

AIMS: To explore whether electrodermal activity (EDA) can serve as a complementary tool for pulpal diagnosis (Aim 1) and an objective metric to assess dental pain before and after local anaesthesia (Aim 2).

METHODOLOGY: A total of 53 subjects (189 teeth) and 14 subjects (14 teeth) were recruited for Aim 1 and Aim 2, respectively. We recorded EDA using commercially available devices, PowerLab and a Galvanic Skin Response (GSR) Amplifier, in conjunction with cold and electric pulp testing (EPT). Participants rated their level of sensation on a 0-10 visual analogue scale (VAS) after each test. We recorded EPT-stimulated EDA activity before and after the administration of local anaesthesia for participants who required root canal treatment (RCT) due to painful pulpitis. The raw data were converted to the time-varying index of sympathetic activity (TVSymp), a sensitive and specific parameter of EDA. Statistical analysis was performed using Python 3.6 and the scikit-posthocs library.

RESULTS: EDA activity was upregulated by cold and EPT stimuli in teeth with normal pulp. TVSymp signals were significantly higher in vital pulp than in necrotic pulp for both the cold test and EPT. Teeth that exhibited intense sensitivity to cold, with or without lingering pain, had more TVSymp peaks than teeth with mild sensation to cold. Pre- and post-anaesthesia EDA activity and VAS scores were recorded in patients with painful pulpitis. Post-anaesthesia EDA signals were significantly lower than pre-anaesthesia levels. Approximately 71% of patients (10 of 14) experienced no pain during treatment and reported a VAS score of 0 or 1. The majority of patients (10 of 14) showed a reduction in TVSymp after the administration of anaesthesia. Two of the three patients who experienced increased pain during root canal treatment (post-treatment VAS > pre-treatment VAS) exhibited increased post-anaesthesia TVSymp.

CONCLUSIONS: Our data show promising results for using EDA in pulpal diagnosis and for assessing dental pain. While our testing was limited to subjects who had adequate communication skills, our future goal is to be able to use this technology to aid in the endodontic diagnosis of patients who have limited communication ability.

PMID:36367715 | DOI:10.1111/iej.13868

Low-Energy Dense Potato- and Bean-Based Diets Reduce Body Weight and Insulin Resistance: A Randomized, Feeding, Equivalence Trial

J Med Food. 2022 Nov 11. doi: 10.1089/jmf.2022.0072. Online ahead of print.

ABSTRACT

We evaluated the effect of diets low in energy density (1 kcal/g) and high in either potatoes (Potato) or pulses (Bean) on blood glucose control in participants with insulin resistance. We hypothesized that the Potato and Bean diets would have equivalent effects. This was an 8-week randomized, parallel-design, controlled feeding study comparing Potato and Bean diets (50-55% carbohydrate, 30-35% fat, 15-20% protein). Equivalence was prespecified as a mean change in blood glucose concentration on the Potato diet within ±20% of that on the Bean diet. Thirty-six participants (age: 18-60 years, body mass index: 25-40 kg/m2) with insulin resistance (homeostatic model assessment of insulin resistance [HOMA-IR] >2) were enrolled. Body weight was measured, and subjects underwent a mixed meal tolerance test at baseline and after 8 weeks. Intent-to-treat (ITT) and completer analyses were conducted. Equivalence between the two diets in the area under the curve for serum glucose was attained within ±10%, but the reduction from baseline was not statistically significant. For the Bean diet, insulin (area under the response curve: -2136.3 ± 955.5 mg/[dL∙min], P = .03) and HOMA-IR (-1.4 ± 0.6, P = .02) were lower compared with baseline. ITT and completer analyses were similar, except that HOMA-IR was also reduced by the Potato diet (-1.3 ± 0.6, P < .05). Compliance with the diets was 87-88%, and body weight was reduced on both diets (Potato: -5.6% ± 0.6%; Bean: -4.1% ± 0.6%, P < .001) with no significant difference between them. Potato and Bean diets low in energy density were equally effective in reducing insulin resistance and promoting weight loss in individuals with impaired blood glucose control. Clinical Trial: The trial was registered with ClinicalTrials.gov Identifier: NCT04203238.
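The prespecified equivalence criterion (Potato's mean glucose change within ±20% of Bean's) reduces to a simple interval check. The sketch below uses hypothetical change values, since the abstract reports only that equivalence was attained within ±10%:

```python
def within_equivalence_margin(potato_change, bean_change, margin=0.20):
    """Prespecified criterion: the Potato diet's mean change in blood
    glucose falls within +/- margin (as a fraction) of the Bean diet's
    mean change. Uses abs(bean_change) so the margin works for
    reductions (negative changes) as well as increases."""
    half_width = abs(bean_change) * margin
    return bean_change - half_width <= potato_change <= bean_change + half_width

# Hypothetical mean changes in glucose AUC:
print(within_equivalence_margin(-95.0, -100.0))   # True: within [-120, -80]
print(within_equivalence_margin(-130.0, -100.0))  # False: outside the margin
```

A formal equivalence test would add confidence intervals around each mean change (e.g., two one-sided tests); this sketch shows only the margin logic.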

PMID:36367708 | DOI:10.1089/jmf.2022.0072

Study on small ruminant brucellosis and owners awareness in two selected districts of southern region, Ethiopia

Vet Med Sci. 2022 Nov 11. doi: 10.1002/vms3.992. Online ahead of print.

ABSTRACT

INTRODUCTION: Brucellosis is one of the infectious diseases with the greatest impact on the productivity of sheep and goats. A cross-sectional study with simple random sampling was used to investigate the seroprevalence of brucellosis (Rose Bengal plate test [RBPT] and complement fixation test [CFT]) in small ruminants and its associated risk factors from November 2019 to June 2020 in the Kolme and Abala Abaya districts. A questionnaire was also administered to owners to assess their existing knowledge of the disease.

RESULT: Using the RBPT and CFT, 28 (4.1%) and 23 (3.33%) of the 690 animals, respectively, were found to be seropositive for brucellosis. The seroprevalence detected in the Kolme district (5.3%) was greater than that in Abala Abaya (1.0%). The odds of Brucella infection were greater for goats than for sheep (odds ratio [OR] 6.0, 95% confidence interval [CI] 0.8-44.9). The odds of seropositivity also differed between adult and young animals (OR 0.05, 95% CI 0.03-0.07). In univariate logistic regression, a statistically significant difference in seropositivity was detected among districts, age groups, herd sizes, parity numbers, and reproductive health problems, but not for species or sex; in multivariate logistic regression, only reproductive health problems remained statistically significant. Of 138 families, 100% of respondents were unaware of brucellosis, 94.5% drank raw milk, and 74% handled animals with retained fetal membranes with their bare hands.

CONCLUSION: This study showed that brucellosis was a widely spread disease in the study areas and poses a substantial public health danger. To reduce the spread of the disease in small ruminants, public health risks, and economic losses, stringent vaccination application and awareness of personal hygiene are critical.

PMID:36367706 | DOI:10.1002/vms3.992

A Predictive Tool for Choledocholithiasis in Patients Undergoing Emergency Cholecystectomy

J Laparoendosc Adv Surg Tech A. 2022 Nov 11. doi: 10.1089/lap.2022.0384. Online ahead of print.

ABSTRACT

Background: Management of acute cholecystitis with emergency laparoscopic cholecystectomy is well established; however, the detection and management of concurrent choledocholithiasis remain debated. The aim of this study was to develop a more accurate predictive model for choledocholithiasis. Materials and Methods: A 9-year audit of emergency cholecystectomies with evaluation of preoperative factors in predictive models. Receiver operating characteristic (ROC) analysis with the Youden index was used to identify thresholds maximizing these associations for continuous variables. Results: 1601 of 1828 patients were analyzed. Patients diagnosed with choledocholithiasis were more likely to be febrile on admission and to have a higher C-reactive protein level and a higher median bilirubin (25.0 μmol/L versus 11.0 μmol/L, P < .001). When bilirubin was excluded, multivariate analysis detected several significant variables, including fever, biliary tree dilatation, and a common bile duct stone seen on ultrasound. When bilirubin was included in the model, bilirubin of 20-39 μmol/L (odds ratio [OR] 2.44, 95% confidence interval [CI]: 1.74-3.44) and ≥40 μmol/L (OR 4.84, 95% CI: 3.40-6.91) were associated with an increased likelihood of choledocholithiasis detection on intraoperative cholangiogram, with the ROC model having a significant C-statistic of 0.796 (P < .001). Discussion: A perfect predictive model for concurrent choledocholithiasis in acute cholecystitis does not exist; however, these results are encouraging in that high- and low-probability groups can be established.
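The Youden index used to dichotomize continuous predictors such as bilirubin simply picks the threshold maximizing J = sensitivity + specificity - 1. A minimal sketch with hypothetical cut-offs (the study's actual cut-points were 20 and 40 μmol/L):

```python
def youden_threshold(thresholds, sensitivities, specificities):
    """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1,
    the point on the ROC curve farthest above the chance diagonal."""
    best_i = max(range(len(thresholds)),
                 key=lambda i: sensitivities[i] + specificities[i] - 1)
    return thresholds[best_i]

# Hypothetical bilirubin cut-offs (umol/L) with made-up sens/spec pairs:
cutoffs = [10, 20, 40, 60]
sens = [0.95, 0.85, 0.60, 0.35]
spec = [0.30, 0.65, 0.85, 0.95]
print(youden_threshold(cutoffs, sens, spec))  # 20 (J = 0.50, the maximum)
```

In practice the sensitivity/specificity pairs come from sweeping every observed value as a candidate threshold against the intraoperative cholangiogram result.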

PMID:36367704 | DOI:10.1089/lap.2022.0384

A machine-learning approach for nonalcoholic steatohepatitis susceptibility estimation

Indian J Gastroenterol. 2022 Nov 11. doi: 10.1007/s12664-022-01263-2. Online ahead of print.

ABSTRACT

BACKGROUND: Nonalcoholic steatohepatitis (NASH), a severe form of nonalcoholic fatty liver disease, can lead to advanced liver damage and has become an increasingly prominent health problem worldwide. Predictive models for early identification of high-risk individuals could help identify preventive and interventional measures. Traditional epidemiological models with limited predictive power are based on statistical analysis. In the current study, a novel machine-learning approach was developed for individual NASH susceptibility prediction using candidate single nucleotide polymorphisms (SNPs).

METHODS: A total of 245 NASH patients and 120 healthy individuals were included in the study. Single nucleotide polymorphism genotypes of candidate genes, including two SNPs in the cytochrome P450 family 2 subfamily E member 1 (CYP2E1) gene (rs6413432, rs3813867), two SNPs in the glucokinase regulator (GCKR) gene (rs780094, rs1260326), and the rs738409 SNP in patatin-like phospholipase domain-containing 3 (PNPLA3), together with gender, were used to develop models for identifying at-risk individuals. To predict an individual's susceptibility to NASH, nine different machine-learning models were constructed, combining two feature-selection methods (chi-square and support vector machine recursive feature elimination [SVM-RFE]) with three classification algorithms (k-nearest neighbor [KNN], multi-layer perceptron [MLP], and random forest [RF]). All nine machine-learning models were trained using 80% of both the NASH patient and healthy control data and tested on the remaining 20% of both groups. Model performance was compared in terms of accuracy, precision, sensitivity, and F-measure.
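The best-performing classifier family here, k-nearest neighbor, can be sketched in a few lines. This is a toy pure-Python illustration with invented genotype-like features, not the study's pipeline (which would use a library implementation with tuned k and the described feature selection):

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Minimal k-nearest-neighbour classifier: rank training points by
    squared Euclidean distance to x, then take a majority vote among
    the labels of the k closest."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    votes = Counter(train_y[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy genotype-like features (0/1/2 risk-allele counts) with labels:
X = [[0, 0, 1], [0, 1, 0], [2, 2, 1], [2, 1, 2], [1, 2, 2]]
y = ["control", "control", "nash", "nash", "nash"]
print(knn_predict(X, y, [2, 2, 2]))  # nash (all 3 nearest neighbours are nash)
```

With only five SNPs plus gender, the feature space is small; the feature-selection step in the study asks which of even these few inputs carry predictive signal.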

RESULTS: Among all nine machine-learning models, the KNN classifier with all features as input showed the highest performance, with an F-measure of 86% and an accuracy of 79%.

CONCLUSIONS: Machine learning based on genomic variety may be applicable for estimating an individual’s susceptibility for developing NASH among high-risk groups with a high degree of accuracy, precision, and sensitivity.

PMID:36367682 | DOI:10.1007/s12664-022-01263-2