Nutritional risk assessment by STAMP according to type of congenital heart disease in pediatric patients admitted to a reference hospital

Nutr Hosp. 2024 Nov 12. doi: 10.20960/nh.05421. Online ahead of print.

ABSTRACT

INTRODUCTION: patients with congenital heart disease are considered to be at high nutritional risk due to metabolic alterations caused by the underlying pathology and to extracardiac factors. The STAMP (Screening Tool for the Assessment of Malnutrition in Paediatrics) is the only nutritional screening tool validated in a pediatric population in our country.

OBJECTIVE: to evaluate nutritional risk by STAMP screening in pediatric patients according to type of congenital heart disease.

MATERIAL AND METHODS: an analytical cross-sectional study conducted in 2023 at a pediatric reference hospital. Nutritional status was determined by Z-scores based on the WHO 2006/CDC 2000 child growth standards. The STAMP questionnaire was administered to establish nutritional risk. Inferential statistics used the chi-squared test and the Mann-Whitney U-test, and Spearman's correlation coefficient was calculated. Analyses were carried out with the SPSS V25 statistical package.
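
As an illustration of the tests named in the methods, the sketch below runs a chi-squared test (STAMP risk category by type of heart disease), a Mann-Whitney U-test (height-for-age Z-scores between groups), and a Spearman correlation (STAMP score against a biochemical marker) with SciPy. All values are hypothetical stand-ins chosen to mirror the reported group sizes; the study's raw data are not available from the abstract.

```python
# Illustrative only: hypothetical values standing in for the study's variables.
import numpy as np
from scipy.stats import chi2_contingency, mannwhitneyu, spearmanr

rng = np.random.default_rng(0)

# STAMP risk category (rows: intermediate, high) by heart disease type
# (columns: cyanotic, acyanotic); the counts are made up for the example.
contingency = np.array([[30, 44],
                        [15, 24]])
chi2, p_chi2, dof, _ = chi2_contingency(contingency)

# Height-for-age Z-scores in cyanotic vs. acyanotic patients (simulated).
z_cyanotic = rng.normal(-1.2, 1.0, 45)
z_acyanotic = rng.normal(-0.6, 1.0, 68)
u_stat, p_u = mannwhitneyu(z_cyanotic, z_acyanotic)

# Correlation of STAMP score with a biochemical marker, e.g. serum albumin.
stamp_score = rng.integers(2, 8, 113)
albumin = rng.normal(3.8, 0.5, 113)
rho, p_rho = spearmanr(stamp_score, albumin)

print(f"chi2 p = {p_chi2:.3f}; Mann-Whitney p = {p_u:.3f}; "
      f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```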

RESULTS: a total of 113 patients were included, 57 male (50.4 %) and 56 female (49.6 %). The most common age group was infants (n = 47, 45 %). Acute or chronic impairment of nutritional status affected 50 subjects (44.3 %). Cyanotic congenital heart diseases had a greater impact on weight, height, mid-arm circumference, and the height-for-age and weight-for-age indices. Two nutritional risk groups were established by STAMP: intermediate risk, n = 74 (65.5 %), and high risk, n = 39 (34.5 %). The greatest impairment of anthropometric parameters was associated with high risk by STAMP (p < 0.001). The type of congenital heart disease was not associated with higher nutritional risk by STAMP (p = 0.76). The STAMP score did not correlate with biochemical parameters.

CONCLUSION: most patients with congenital heart disease had an intermediate nutritional risk per STAMP. The type of congenital heart disease was not related to higher nutritional risk as assessed by STAMP.

PMID:39575608 | DOI:10.20960/nh.05421

Cannabis use and cognitive biases in people with first-episode psychosis and their siblings

Psychol Med. 2024 Nov 22:1-11. doi: 10.1017/S0033291724001715. Online ahead of print.

ABSTRACT

BACKGROUND: Cannabis use and familial vulnerability to psychosis have been associated with social cognition deficits. This study examined the potential relationship between cannabis use and the cognitive biases underlying social cognition and functioning in patients with first-episode psychosis (FEP), their siblings, and controls.

METHODS: We analyzed a sample of 543 participants with FEP, 203 siblings, and 1168 controls from the EU-GEI study using a correlational design. We used logistic regression analyses to examine the influence of clinical group, lifetime cannabis use frequency, and potency of cannabis use on cognitive biases, accounting for demographic and cognitive variables.
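
A minimal sketch of the type of covariate-adjusted logistic regression described, estimating odds ratios for a binary cognitive-bias outcome by clinical group and lifetime cannabis frequency. The data frame, variable names, and covariate set below are simulated stand-ins for illustration, not the EU-GEI dataset or the authors' exact model.

```python
# Sketch of a covariate-adjusted logistic regression; all data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "jtc_bias": rng.binomial(1, 0.3, n),                    # jumping-to-conclusions bias (0/1)
    "group": rng.choice(["control", "sibling", "FEP"], n),
    "cannabis_freq": rng.choice(["never", "occasional", "daily"], n),
    "age": rng.normal(30, 8, n),
    "sex": rng.choice(["male", "female"], n),
})

model = smf.logit(
    "jtc_bias ~ C(group, Treatment('control')) "
    "+ C(cannabis_freq, Treatment('never')) + age + C(sex)",
    data=df,
).fit(disp=False)

# Odds ratios with 95% confidence intervals, the form reported in the abstract.
or_table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
or_table.columns = ["OR", "CI low", "CI high"]
print(or_table.round(3))
```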

RESULTS: FEP patients showed increased odds of facial recognition processing (FRP) deficits (OR = 1.642, CI 1.123-2.402) relative to controls but not of speech illusions (SI) or jumping to conclusions (JTC) bias, with no statistically significant differences relative to siblings. Daily and occasional lifetime cannabis use were associated with decreased odds of SI (OR = 0.605, CI 0.368-0.997 and OR = 0.646, CI 0.457-0.913, respectively) and JTC bias (OR = 0.625, CI 0.422-0.925 and OR = 0.602, CI 0.460-0.787, respectively) compared with lifetime abstinence, but not with FRP deficits, in the whole sample. Within the cannabis user group, low-potency cannabis use was associated with increased odds of SI (OR = 1.829, CI 1.297-2.578), FRP deficits (OR = 1.393, CI 1.031-1.882), and JTC (OR = 1.661, CI 1.271-2.171) relative to high-potency cannabis use, with comparable effects in the three clinical groups.

CONCLUSIONS: Our findings suggest increased odds of cognitive biases in FEP patients who have never used cannabis and in low-potency users. Future studies should elucidate this association and its potential implications.

PMID:39575607 | DOI:10.1017/S0033291724001715

Clinical Study and Finite Element Analysis on the Effects of Pseudo-Patella Baja After TKA

Orthop Surg. 2024 Nov 22. doi: 10.1111/os.14289. Online ahead of print.

ABSTRACT

OBJECTIVE: Pseudo-patella baja (PPB) is one of the complications that can occur after total knee arthroplasty (TKA) and may be closely related to limited knee motion and pain after surgery. This study aimed to investigate whether PPB affects clinical outcomes after TKA and to examine the biomechanical effects of PPB after TKA.

METHODS: This study was a retrospective case series of 462 eligible patients (563 knees). Clinical evaluation was performed using the visual analogue scale (VAS), the Hospital for Special Surgery (HSS) score, the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), the 5-Level EuroQol Generic Health Index (EQ-5D-5L), the Forgotten Joint Score-12 (FJS-12), and patient satisfaction. CT and MRI scans of two healthy left knees and of TKA prostheses were obtained, and 3D finite element models representing PPB, true patella baja (TPB), normal patellar height, and patella alta (PA) were created and loaded along the direction of the quadriceps femoris. The t-test, Mann-Whitney U-test, chi-squared (χ2) test, and analysis of variance (ANOVA) were performed using GraphPad Prism (Version 8, GraphPad Software, USA). Statistical significance was set at p < 0.05 (two-sided).

RESULTS: The VAS, HSS, WOMAC, EQ-5D-5L, FJS-12, and patient satisfaction scores in the PPB and TPB groups were significantly worse than those in the normal patellar height (PN) group (p < 0.05). In the PPB group, a positive correlation was found between the Blackburne-Peel index (BPI) and the FJS-12 score. In the finite element models with patella baja (PB), PPB showed lower patellofemoral contact stress than TPB when knee flexion was less than 90° (p < 0.01), with no significant difference beyond 90° (p > 0.05). The patellofemoral contact area tended to increase with deeper knee flexion and then decreased after reaching a peak, and it tended to decrease with increasing patellar height; however, the differences in contact area among patellar heights and degrees of knee flexion were not statistically significant (p > 0.05).

CONCLUSION: PPB after TKA may increase patellofemoral joint stress and the risk of postoperative complications such as anterior knee pain.

PMID:39575599 | DOI:10.1111/os.14289

Risk Factors for Readmission After Pulmonary Lobectomy: A Quality Collaborative Study

Ann Thorac Surg. 2023 Feb;115(2):329-337. doi: 10.1016/j.athoracsur.2022.10.017. Epub 2022 Oct 29.

ABSTRACT

BACKGROUND: Previous studies have identified postoperative complications as being associated with readmission after lobectomy. However, these studies have not adequately accounted for the timing of complications or for institutional effects. Our objectives were to examine readmission rates after lobectomy and identify factors associated with readmission.

METHODS: Patients aged >18 years undergoing lobectomy for lung cancer between 2015 and 2019 were identified from a statewide database. Patients with in-hospital mortality or with missing data on discharge status, 30-day readmission status, or discharge location were excluded. Data regarding The Society of Thoracic Surgeons postoperative complications were abstracted by hospital data managers to determine the timing of occurrence (index admission vs readmission). Logistic mixed-model analysis was performed, with hospital as a random intercept to account for the clustered data structure and to assess hospital-specific effects on readmission.
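
A hedged sketch of a readmission model with a hospital-level random intercept, using statsmodels' variational Bayes binomial mixed GLM as a stand-in for the study's logistic mixed model. The predictors, column names, and data below are simulated assumptions; the registry's actual specification may differ.

```python
# Sketch: logistic mixed model with a random intercept per hospital (simulated data).
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "readmit_30d": rng.binomial(1, 0.07, n),               # 30-day readmission (0/1)
    "predischarge_complication": rng.binomial(1, 0.25, n),
    "zubrod_ge1": rng.binomial(1, 0.4, n),
    "length_of_stay": rng.poisson(5, n),
    "hospital_id": rng.integers(0, 15, n),                 # 15 hypothetical hospitals
})

model = BinomialBayesMixedGLM.from_formula(
    "readmit_30d ~ predischarge_complication + zubrod_ge1 + length_of_stay",
    vc_formulas={"hospital": "0 + C(hospital_id)"},        # random intercept per hospital
    data=df,
)
result = model.fit_vb()
print(result.summary())
```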

RESULTS: The overall readmission rate was 6.9% (184 of 2686). The most common complication was an air leak lasting ≥5 days, in 17.4% (467 of 2686). Variables significantly predictive of increased readmission were predischarge postoperative complications and a Zubrod score ≥1. Variables predictive of decreased readmission were increasing length of stay, surgery at an institution with higher cumulative volume, and a postdischarge follow-up visit protocol ≤7 days from discharge. The C statistic for the final model was 0.80.

CONCLUSIONS: Patients who experience postoperative complications are at increased risk for readmission, whereas follow-up ≤7 days from discharge was predictive of lower readmission risk. Efforts to reduce readmissions should focus on decreasing postoperative complication rates, optimizing the timing of discharge for patients who experience complications, and shortening the time between discharge and clinic follow-up.

PMID:39575522 | DOI:10.1016/j.athoracsur.2022.10.017

A comparison between community and treatment-seeking samples of hoarding disorder

CNS Spectr. 2024 Nov 22:1-5. doi: 10.1017/S1092852924000361. Online ahead of print.

ABSTRACT

OBJECTIVE: Hoarding disorder studies are primarily based on persons who seek treatment and demonstrate good insight. The aim of the present study is to evaluate whether there are differences between community and treatment-seeking samples of individuals with hoarding disorder (HD).

METHODS: Fourteen people with HD from the community and twenty treatment-seeking people with HD were assessed by a battery of instruments to evaluate HD features and other associated characteristics.

RESULTS: Compared to the treatment-seeking sample, the HD community sample was older, had poorer insight, and had a lower prevalence of comorbid obsessive-compulsive disorder (OCD). There were no differences in gender, education, presence of psychiatric comorbidities, quality of life, or hoarding behavior characteristics between the samples. The final logistic regression model, with the Dimensional Obsessive-Compulsive Scale (DOCS) as the single predictor of treatment-seeking status, was statistically significant, indicating that it was able to distinguish between the two samples. The model explained between 20.7% and 27.9% of the variance in treatment-seeking status and correctly classified 67.6% of cases.
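
The "between 20.7% and 27.9% of the variance" phrasing matches the way Cox & Snell and Nagelkerke pseudo-R² are typically reported for a logistic regression. The sketch below shows how both quantities, and the classification accuracy, can be computed from a fitted single-predictor model; the DOCS scores and treatment-seeking labels are simulated stand-ins, not the study data.

```python
# Sketch: Cox & Snell and Nagelkerke pseudo-R^2 for a single-predictor logistic model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 34                                     # 14 community + 20 treatment-seeking
docs = rng.normal(20, 8, n)                # hypothetical DOCS totals
logit_p = -4 + 0.2 * docs
seeking = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # treatment-seeking status (0/1)

model = sm.Logit(seeking, sm.add_constant(docs)).fit(disp=False)

ll_full, ll_null = model.llf, model.llnull
r2_cox_snell = 1 - np.exp(2 * (ll_null - ll_full) / n)
r2_nagelkerke = r2_cox_snell / (1 - np.exp(2 * ll_null / n))
accuracy = ((model.predict() >= 0.5) == seeking).mean()
print(f"Cox & Snell R2 = {r2_cox_snell:.3f}, Nagelkerke R2 = {r2_nagelkerke:.3f}, "
      f"correctly classified = {accuracy:.1%}")
```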

CONCLUSIONS: Our results indicate that there appear to be few differences between treatment-seeking and community samples of individuals with HD. Comorbid OCD, however, appears to be more frequent in treatment-seeking groups than in community samples.

PMID:39575521 | DOI:10.1017/S1092852924000361

Heart Transplant Outcomes in Older Adults in the Modern Era of Transplant

Clin Transplant. 2024 Nov;38(11):e70032. doi: 10.1111/ctr.70032.

ABSTRACT

BACKGROUND: Because of advances in medical treatment of heart failure, patients are living longer than in previous eras and may approach the need for advanced therapies, including heart transplantation, at older ages. This study assesses practices surrounding heart transplant in older adults (> 70 years) and examines short- and medium-term outcomes.

METHODS AND RESULTS: This study is a retrospective analysis using the United Network for Organ Sharing (UNOS) database from 2010 to 2021. The absolute number of older adults being transplanted is increasing. Older adults were more likely to have had a prior malignancy or ischemic cardiomyopathy and less likely to be on extracorporeal membrane oxygenation or to have a high UNOS status prior to transplant. Mortality at 1 year was higher for older adults (27.8% vs. 23.4%), but at 5 years there was no significant difference (22.3% vs. 19.4%). Older adults were more likely to die of malignancy or infection, whereas adults under 70 were more likely to die of cardiovascular causes or graft failure. There was less rejection in older adults. Mortality has not changed for older adults transplanted before versus after the 2018 UNOS allocation change.

CONCLUSIONS: Carefully selected older adults may be considered for heart transplantation, given similar intermediate-term mortality.

PMID:39575512 | DOI:10.1111/ctr.70032

Unilateral biportal endoscopic spine surgery: a meta-analysis unveiling the learning curve and clinical benefits

Front Surg. 2024 Nov 7;11:1405519. doi: 10.3389/fsurg.2024.1405519. eCollection 2024.

ABSTRACT

OBJECTIVE: To provide insights into the learning curve of unilateral biportal endoscopic (UBE) spine surgery by synthesizing available evidence on critical points and associated clinical outcomes.

METHODS: A comprehensive literature search was conducted across multiple databases, yielding a pool of relevant studies. Inclusion criteria encompassed studies reporting on UBE learning curves and quantitative data related to clinical outcomes (operative time, hospital stay, and complications).

RESULTS: A total of five studies were included in the analysis, providing six datasets to elucidate the UBE learning curve. Three of the five studies analyzed learning curves using the Cumulative Sum (CUSUM) method and identified cutoff points. One study plotted learning curves and determined cutoff points based on operative time analysis, while the remaining study (providing two datasets) plotted learning curves using a phased analysis method. The mean cutoff point, in terms of the number of cases required to reach proficiency in operative time, was 37.5 cases (range, 14 to 58 cases). There was a statistically significant difference in operative time between the late and early groups, with the late group demonstrating significantly shorter operative times (P < 0.0001). Additionally, the early and late groups defined by these cutoff points differed significantly in patient outcome parameters, including postoperative hospitalization, postoperative drainage, and surgical complications (P < 0.05).
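
A minimal sketch of the CUSUM approach used by three of the included studies to locate a learning-curve cutoff: operative times are centered on their overall mean and cumulatively summed, and the peak of the resulting curve is read as the transition from the early to the late phase. The operative-time series below is simulated, not drawn from any of the included studies.

```python
# CUSUM learning-curve sketch with simulated operative times (minutes).
import numpy as np

rng = np.random.default_rng(2)
early = rng.normal(120, 15, 40)   # hypothetical early-phase cases
late = rng.normal(90, 12, 60)     # hypothetical post-proficiency cases
op_time = np.concatenate([early, late])

# CUSUM of deviations from the overall mean: the curve rises while cases are
# slower than average and falls once cases become faster than average.
cusum = np.cumsum(op_time - op_time.mean())
cutoff_case = int(np.argmax(cusum)) + 1   # peak of the curve = estimated cutoff

print(f"Estimated learning-curve cutoff: case {cutoff_case}")
```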

CONCLUSION: While the analysis indicates that UBE surgery’s learning curve is associated with surgical time, the limited focus on this metric and potential discrepancies in cutoff point determination highlight the need for a more comprehensive understanding.

PMID:39575448 | PMC:PMC11578948 | DOI:10.3389/fsurg.2024.1405519

Spatial prediction of the probability of liver fluke infection in water resource within sub-basin using an optimized geographically-weighted regression model

Front Vet Sci. 2024 Nov 7;11:1487222. doi: 10.3389/fvets.2024.1487222. eCollection 2024.

ABSTRACT

INTRODUCTION: Infection with liver flukes (Opisthorchis viverrini) is partly attributable to their ability to thrive in sub-basin habitats, which allows the intermediate host to remain within the watershed system throughout the year. Spatial monitoring of fluke infection at the small-basin scale is therefore crucial, as it helps in studying the spatial factors influencing these infections. The number of infected individuals was obtained from local authorities, converted into a percentage, and represented as raster data through a heat map, which yields continuous data for the dependent variable.

METHODS: The independent variable set comprises nine variables, including both vector and raster data, that link the location of an infected person to their village. Spatial units optimized for geographically weighted modeling were designed using a clustering and overlay approach, thereby facilitating the optimal prediction of alternative infection models.

RESULTS AND DISCUSSION: Model-3 demonstrated the strongest correlation between the variables X5 (stream) and X7 (ndmi) and the percentage of infected individuals. The statistical analysis showed t-statistic values of -2.045 and 0.784, with corresponding p-values of 0.016 and 0.085. The RMSE was 2.571% and the AUC was 0.659, supporting these findings. Several alternative models were tested, and a generalized mathematical model was developed to incorporate the independent variables. This new model improved the accuracy of the GWR model by 5.75% and increased the R² value from 0.754 to 0.800. Additionally, spatial autocorrelation confirmed the difference in predictions between the modeled and actual infection values. This study demonstrates that, when GWR is used to create spatial models at the sub-basin level, it is possible to identify variables associated with liver fluke infection.
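
A hedged sketch of fitting a geographically weighted regression with the open-source mgwr package, regressing infection percentage on two covariates that mirror X5 (stream) and X7 (ndmi) from the abstract. The coordinates, covariate values, and bandwidth search below are illustrative assumptions, not the authors' data or pipeline.

```python
# Sketch of a geographically weighted regression (GWR) with mgwr; all data simulated.
import numpy as np
from mgwr.gwr import GWR
from mgwr.sel_bw import Sel_BW

rng = np.random.default_rng(2)
n = 200
coords = list(zip(rng.uniform(0, 50, n), rng.uniform(0, 50, n)))   # unit centroids
stream = rng.uniform(0, 1, (n, 1))                                 # stand-in for X5
ndmi = rng.uniform(-1, 1, (n, 1))                                  # stand-in for X7
X = np.hstack([stream, ndmi])
infection_pct = 5 - 3 * stream + 1.5 * ndmi + rng.normal(0, 1, (n, 1))

bw = Sel_BW(coords, infection_pct, X).search()     # adaptive bandwidth selection
results = GWR(coords, infection_pct, X, bw).fit()

results.summary()                 # global diagnostics (R2, AICc, ...)
print(results.params[:5])         # local coefficients for the first five units
```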

PMID:39575433 | PMC:PMC11578970 | DOI:10.3389/fvets.2024.1487222

Characterization of sarcoma topography in Li-Fraumeni syndrome

Front Oncol. 2024 Nov 7;14:1415636. doi: 10.3389/fonc.2024.1415636. eCollection 2024.

ABSTRACT

INTRODUCTION: Li-Fraumeni syndrome (LFS) is a hereditary cancer predisposition syndrome primarily caused by germline TP53 pathogenic/likely pathogenic (P/LP) variants. Soft tissue and bone sarcomas are among the most frequently occurring of the many LFS-associated cancer types. Cancer screening recommendations for LFS are centered around annual whole-body MRI (wbMRI), the interpretation of which can be challenging. This study aims to characterize sarcoma topography in LFS.

METHODS: Study subjects included individuals from clinically and genetically ascertained cohorts of germline TP53 variant-carriers, namely the National Cancer Institute’s LFS longitudinal cohort study (NCI-LFS), the NCI Genetic Epidemiology of Osteosarcoma (NCI-GEO) study, and the germline TP53 Database.

RESULTS: Data were aggregated for a total of 160 sarcomas with detailed topography available. Abdominal sarcomas and extremity osteosarcomas were among the most frequent sarcoma locations. Chi-squared analyses showed no statistically significant differences in sarcoma topography based on age (pediatric vs adult) or sex (male vs female). A case series of sarcomas from the NCI-LFS study highlights the diagnostic challenges posed by tumor topography on imaging.

DISCUSSION: While LFS-related sarcomas frequently occur in expected locations such as the extremities, they also occur in less typical sites, leading to difficulties in discerning between differential diagnoses on wbMRI and other imaging. Prospective collection of detailed cancer topography in individuals with LFS will further inform recommendations for radiologic interpretation and personalized screening.

PMID:39575416 | PMC:PMC11578819 | DOI:10.3389/fonc.2024.1415636

Promoting appropriate medication use by leveraging medical big data

Front Digit Health. 2024 Nov 7;6:1198904. doi: 10.3389/fdgth.2024.1198904. eCollection 2024.

ABSTRACT

According to World Health Organization statistics, inappropriate medication use has become an important factor affecting medication safety. In gray areas of medical insurance supervision, such as designated drugstores and medical institutions, inappropriate prescribing in the form of “big prescriptions for minor ailments” is common. Traditional clinical decision support systems are mostly based on fixed rules for regulating inappropriate prescriptions, which adapt poorly to clinical environments and still require intelligent review. In this study, we model the complex relationships between patients, diseases, and drugs based on medical big data to promote appropriate medication use. More specifically, we first construct a medication knowledge graph from the historical prescription big data of tertiary hospitals and from medical text data. Second, based on the medication knowledge graph, we employ a Gaussian mixture model to group patient population representations as physiological features. For diagnostic features, we employ pretrained Bidirectional Encoder Representations from Transformers (BERT) word vectors to enhance the semantic representation of diagnoses. In addition, to reduce adverse drug interactions caused by drug combinations, we employ a graph convolutional network to transform drug interaction information into drug interaction features. Finally, we employ a sequence generation model to learn the complex relationships between patients, diseases, and drugs and to provide an appropriate-medication evaluation of doctors' prescriptions in small hospitals from two aspects: the drug list and the course of treatment. We validate our model using the MIMIC III dataset alongside data from a tertiary hospital in Fujian Province. The results show that our method is more accurate than baseline methods in predicting rational medication regimens, and it achieves high accuracy in detecting appropriate medication in prescriptions from small hospitals.
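
As one concrete piece of the pipeline described above, the sketch below groups patient representations with a Gaussian mixture model (scikit-learn) and extracts the soft group memberships that could serve as the "physiological features" mentioned in the abstract. The patient embeddings here are random stand-ins, not features derived from prescription records or the MIMIC III dataset.

```python
# Sketch: grouping patient representations with a Gaussian mixture model (scikit-learn).
# The 64-dimensional embeddings are random stand-ins for features that would come
# from the medication knowledge graph, not real prescription data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
patient_embeddings = rng.normal(size=(500, 64))

gmm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
group_ids = gmm.fit_predict(patient_embeddings)          # hard group assignment
group_probs = gmm.predict_proba(patient_embeddings)      # soft mixture responsibilities

print(group_probs.shape)   # (500, 8): one membership vector per patient
```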

PMID:39575413 | PMC:PMC11578981 | DOI:10.3389/fdgth.2024.1198904