Risk factors for deep vein thrombosis even using low-molecular-weight heparin after total knee arthroplasty

Knee Surg Relat Res. 2021 Sep 7;33(1):29. doi: 10.1186/s43019-021-00109-z.

ABSTRACT

BACKGROUND: With an increase in deep vein thrombosis (DVT) following total knee arthroplasty (TKA) in the Asian population, most surgeons now administer some form of prophylactic anticoagulant to patients after TKA. Nevertheless, DVT occasionally develops even in patients receiving prophylaxis. The purpose of this study was to identify the risk factors for DVT after TKA in patients receiving postoperative low-molecular-weight heparin (LMWH).

METHODS: We designed a retrospective study of 103 patients who underwent primary TKA. From the second postoperative day, 60 mg of LMWH was injected subcutaneously daily. On the seventh postoperative day, patients underwent computed tomography angiography to screen for DVT. As candidate risk factors, we collected gender, age, surgical site (unilateral/bilateral), body mass index, anesthesia method, preoperative hypertension, diabetes, and hypercholesterolemia status, and prothrombin time/international normalized ratio from electronic medical records, and we analyzed the statistical significance of each factor.

RESULTS: Statistically significant factors in the univariable analysis were surgical site (unilateral/bilateral), body mass index, preoperative hypertension status, and anesthesia method. Multivariable logistic regression with these factors revealed that surgical site (unilateral/bilateral; p = 0.024) and anesthesia method (p = 0.039) were significant factors for the occurrence of postoperative DVT after TKA.
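
As a hedged illustration of the analysis described above, the sketch below fits a multivariable logistic regression of postoperative DVT on the four univariably significant factors; the file name and column names are hypothetical, not the study's actual data.

```python
# Illustrative sketch only (assumed column names), not the authors' code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tka_cohort.csv")  # hypothetical file: one row per patient

# dvt: 1 = DVT on CT angiography; bilateral: 1 = simultaneous bilateral TKA;
# general_anesthesia: 1 = general, 0 = spinal; bmi in kg/m^2; htn: 1/0
model = smf.logit("dvt ~ bilateral + bmi + htn + general_anesthesia",
                  data=df).fit()
print(model.summary())        # coefficients and p-values
print(np.exp(model.params))   # odds ratios
```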

CONCLUSIONS: Patients undergoing simultaneous bilateral TKA and patients undergoing TKA under general anesthesia warrant closer surveillance for DVT, even with LMWH chemoprophylaxis.

PMID:34493344 | DOI:10.1186/s43019-021-00109-z

The food and nutrient intake of 5 to 12 year old Australian children during school hours: A secondary analysis of the 2011-12 National Nutrition and Physical Activity Survey

Public Health Nutr. 2021 Sep 8:1-24. doi: 10.1017/S1368980021003888. Online ahead of print.

ABSTRACT

OBJECTIVE: The school-hours food intake of Australian children is not comprehensively described in the literature, and temporal, nationally representative data are limited. A greater understanding of intake at school can inform school-based nutrition promotion. This study aimed to describe the dietary intake of primary-school-aged children during school hours and its contribution to daily intake.

DESIGN: This secondary analysis used nationally representative, cross-sectional data from the 2011-12 National Nutrition and Physical Activity Survey. Dietary intake was assessed using validated 24-hour dietary recalls collected on school days. Descriptive statistics were used to characterise the energy, nutrients, food groups, and food products consumed during school hours, as well as their contributions to total daily intake. Associations between school food intake and socio-demographic characteristics were also explored.
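
A minimal sketch of this kind of descriptive computation, assuming itemised recall records with hypothetical file and column names and an approximate 9:00-15:00 school window:

```python
# Toy sketch: per-child share of daily energy consumed during school hours.
import pandas as pd

recall = pd.read_csv("recall_items.csv")   # hypothetical: one row per food item
t = pd.to_datetime(recall["time"])          # time each item was consumed
recall["hour"] = t.dt.hour + t.dt.minute / 60
recall["at_school"] = recall["hour"].between(9.0, 15.0)  # assumed school window

# Share of each child's daily energy intake that occurred at school
share = recall.groupby("child_id").apply(
    lambda g: g.loc[g["at_school"], "energy_kj"].sum() / g["energy_kj"].sum()
)
print(share.describe())  # cohort distribution of the school-hours share
```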

SETTING: Australia.

PARTICIPANTS: Seven hundred and ninety-five children aged 5-12 years.

RESULTS: Children consumed 37% of their daily energy and 31-43% of select nutrient intake during school hours, with discretionary choices contributing 44% of school energy intake. Most children consumed less than one serve of vegetables, meat and alternatives or milk and alternatives during school hours. Commonly consumed products were discretionary choices (34%, including biscuits, processed meat), bread (17%) and fruit (12%). There were limited associations with socio-demographic variables, apart from child age.

CONCLUSIONS: Children’s diets were not aligned with national recommendations, with school food characterised by high intake of discretionary choices. These findings are consistent with previous Australian evidence and support transformation of the Australian school food system to better align school food consumption with recommendations.

PMID:34493351 | DOI:10.1017/S1368980021003888

Prognostic factors of head and neck cutaneous squamous cell carcinoma: a systematic review

J Otolaryngol Head Neck Surg. 2021 Sep 7;50(1):54. doi: 10.1186/s40463-021-00529-7.

ABSTRACT

BACKGROUND: Head and neck cutaneous squamous cell carcinoma (HNCSCC) is a non-melanoma skin cancer caused mostly by solar ultraviolet radiation exposure. While it usually has an excellent prognosis, a subset of patients (5%) develops nodal metastasis and has poor outcomes. The aim of this study was to systematically review the literature and evaluate the prognostic factors of HNCSCC, to better understand which patients are most likely to develop metastatic disease.

METHODS: A comprehensive literature search of PubMed and EMBASE was performed to identify studies that evaluated prognostic factors of HNCSCC. Prognostic factors were deemed significant if they had a reported p-value < 0.05. For each prognostic factor, the proportion of studies reporting it as statistically significant was calculated.
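
A minimal sketch of that proportion calculation, assuming a simple table with one row per study-factor pair (all values made up for illustration):

```python
# Toy sketch: proportion of studies reporting each factor as significant.
import pandas as pd

rows = pd.DataFrame({
    "study":  ["A", "A", "B", "B", "C"],
    "factor": ["immunosuppression", "age", "immunosuppression", "age", "age"],
    "p":      [0.01, 0.20, 0.03, 0.04, 0.30],
})
rows["significant"] = rows["p"] < 0.05
proportion = rows.groupby("factor")["significant"].mean()
print(proportion)  # e.g. immunosuppression 1.00, age 0.33
```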

RESULTS: The search yielded a total of 958 citations. Forty studies, involving a total of 8535 patients, were included in the final analysis. The pre-operative/clinical prognostic factors with the highest proportion of significance were state of immunosuppression (73.3%) and age (53.3%), while the post-operative/pathological prognostic factors of importance were the number of lymph nodes involved with carcinoma (70.0%), margins involved with carcinoma (66.7%), and tumor depth (50.0%).

CONCLUSION: This systematic review aims to aid physicians in assessing the prognosis of HNCSCC and in identifying the subsets of patients most susceptible to metastasis. It also suggests that immunosuppressed patients with a high-risk feature on biopsy, such as invasion beyond subcutaneous fat, could benefit from a sentinel lymph node biopsy.

PMID:34493343 | DOI:10.1186/s40463-021-00529-7

Prognostic factors of total hip replacement during a 2-year period in participants enrolled in supervised education and exercise therapy: a prognostic study of 3657 participants with hip osteoarthritis

Arthritis Res Ther. 2021 Sep 7;23(1):235. doi: 10.1186/s13075-021-02608-6.

ABSTRACT

BACKGROUND: For most patient- and disease-specific characteristics, evidence on prognostic factors associated with progression to total hip replacement (THR) in hip osteoarthritis (OA) is either conflicting or inconclusive. Therefore, the objectives of this study of participants with hip OA enrolled in a structured program of supervised education and exercise therapy were to describe the rate of THR and to identify prognostic factors for receiving THR within the following 2 years.

METHODS: Participants aged ≥ 45 years with hip OA enrolled in Good Life with osteoArthritis in Denmark (GLA:D®) from July 2014 to March 2017 were included. Potential prognostic factors included demographic and disease-specific baseline characteristics and measures of physical activity and quality of life (QoL). Information on THR was retrieved from The Danish National Patient Registry. A multivariable Cox proportional hazards model was developed.
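
A hedged sketch of the kind of multivariable Cox model described, using the lifelines package; the variable names are hypothetical stand-ins for the GLA:D baseline measures, not the study's actual dataset:

```python
# Illustrative sketch only: time-to-THR Cox proportional hazards model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("glad_cohort.csv")  # hypothetical: one row per participant
# time_to_thr: years from enrolment to THR or censoring; thr: 1 = THR received
cph = CoxPHFitter()
cph.fit(
    df[["time_to_thr", "thr", "male", "radiographic_oa", "waitlisted",
        "pain_intensity", "walking_speed", "hip_qol", "comorbidities_3plus"]],
    duration_col="time_to_thr",
    event_col="thr",
)
cph.print_summary()  # hazard ratios with confidence intervals
```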

RESULTS: Of 3657 included participants, 30% received a THR within 2 years. Of the 100 participants already wait-listed for THR, 60% had the procedure. Of 22 candidate prognostic factors, 14 were statistically significantly associated with receiving THR. Factors associated with a faster rate of THR included male sex (HR 1.43), self-reported radiographic hip OA (HR 2.32), being wait-listed for THR (HR 2.17), and higher pain intensity (HR 1.01). In contrast, faster walking speed (HR 0.64), better hip-related QoL (HR 0.98), and having three or more comorbidities (HR 0.62) predicted a slower rate of THR.

CONCLUSION: During the 2-year follow-up period, 30% of the cohort received a THR. Notably, 40% of those wait-listed for THR when entering the program did not receive THR within 2 years. A number of baseline prognostic factors for receiving THR were identified.

PMID:34493331 | DOI:10.1186/s13075-021-02608-6

The epidemiology of bloodstream infections and antimicrobial susceptibility patterns in Thuringia, Germany: a five-year prospective, state-wide surveillance study (AlertsNet)

Antimicrob Resist Infect Control. 2021 Sep 8;10(1):132. doi: 10.1186/s13756-021-00997-6.

ABSTRACT

BACKGROUND: Monitoring the pathogens of bloodstream infections (BSI) and their antibiotic susceptibility is important to guide empiric antibiotic treatment strategies and prevention programs. This study assessed the epidemiology of BSI and antibiotic resistance patterns in the German Federal State of Thuringia longitudinally.

METHODS: A surveillance network of 26 hospitals was established to monitor BSIs from 01/2015 to 12/2019. All blood culture results from the participating hospitals, with no restriction on patient age, were reported by the respective microbiological laboratories. A single detection of an obligate pathogen, or a repeated detection of coagulase-negative staphylococci, Bacillus spp., Corynebacterium spp., Micrococcus spp., or Propionibacterium spp. within 96 h, was regarded as a relevant positive blood culture. If one of the aforementioned non-obligate pathogens was detected only once within 96 h, contamination was assumed. Logistic regression models were applied to analyse the relationship between resistance, year of BSI, and hospital size, and generalized estimating equations were used to address potential clustering.
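
A minimal sketch of such a GEE logistic regression in statsmodels, assuming one row per isolate and hypothetical column names:

```python
# Sketch under assumptions: resistance vs. year and hospital size, with
# exchangeable within-hospital correlation to address clustering.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("bsi_isolates.csv")      # hypothetical: one row per isolate
model = smf.gee(
    "resistant ~ year + hospital_size",   # resistant: 1/0 per isolate
    groups="hospital_id",                 # cluster on hospital
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```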

RESULTS: A total of 343,284 blood cultures (BC) from 82,527 patients were recorded. Overall, 2.8% (n = 9571) of all BCs were classified as contaminated. At least one relevant pathogen was identified in 13.2% (n = 45,346) of BCs. Escherichia coli (25.4%) was the most commonly detected pathogen, followed by Staphylococcus aureus (15.2%), Staphylococcus epidermidis (8.1%), and Klebsiella pneumoniae (4.6%). In S. aureus, we observed a decline in methicillin resistance (MRSA) from 10.4% in 2015 to 2.5% in 2019 (p < 0.001). The rate of vancomycin resistance in Enterococcus faecium (VRE) increased from 16.7% in 2015 to 26.9% in 2019 (p < 0.001), with a peak in 2018 (42.5%). In addition, we observed an increase in cefotaxime (third-generation cephalosporin, 3GC) resistance in E. coli from 10.7% in 2015 to 14.5% in 2019 (p = 0.007), whereas 3GC resistance in K. pneumoniae was stable (2015: 9.9%; 2019: 7.4%; p = 0.35). Carbapenem resistance was below 1% for both pathogens. These patterns were robust across sensitivity analyses.

CONCLUSIONS: We observed evidence for a decline in MRSA, an increase in VRE, and a very low rate of carbapenem resistance in gram-negative bacteria. 3GC resistance in E. coli increased steadily over time.

PMID:34493334 | DOI:10.1186/s13756-021-00997-6

The pupillary light reflex (PLR) as a marker for the ability to work or drive – a feasibility study

J Occup Med Toxicol. 2021 Sep 7;16(1):39. doi: 10.1186/s12995-021-00330-2.

ABSTRACT

BACKGROUND: The pupillary light reflex (PLR) can be a marker for pathological conditions, such as neurodegenerative and mental health disorders, as well as for physiological variation, such as age, sex, or iris color. PLR alterations have also been described in people after alcohol consumption. However, the effect of sleep deprivation on PLR parameters is still under debate.

METHODS: The aim of this study was to investigate the feasibility of PLR measurements in sleep-deprived and alcohol-exposed participants. In addition, we wanted to identify PLR parameters that were altered by sleep deprivation and alcohol exposure.

RESULTS: In total, n = 50 participants were included in this study. Differences in the PLR parameters initial diameter (dinit), latency (∆tlat), acceleration (∆ta), contraction velocity (ϑcon), quarter dilatation velocity (ϑ1/4dil), half dilatation time (∆t1/2), and line integral (L(0.3500)) were evaluated between baseline, sleep deprivation, and alcohol exposure. In a generalized linear mixed-model design, we observed statistically significant associations between the type of exposure and the PLR parameters half dilatation time and half dilatation time after the first light pulse (all p < 0.05). Latency after the second light pulse also showed a significant association with the type of exposure (p < 0.05).
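
As a rough illustration of this modelling approach, the sketch below fits a mixed model with a random intercept per participant for one PLR parameter; it is a linear mixed-model stand-in for the generalized linear mixed models the study used, and all file and column names are assumptions:

```python
# Sketch only: PLR parameter vs. exposure type, random intercept per subject.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("plr_measurements.csv")  # repeated measures per participant
model = smf.mixedlm(
    "half_dilatation_time ~ C(exposure)",  # exposure: baseline/sleep/alcohol
    data=df,
    groups=df["participant_id"],           # random intercept per participant
)
print(model.fit().summary())
```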

CONCLUSION: Our study delivers promising first results for the further development of devices that may identify conditions impairing the ability to work or drive.

PMID:34493308 | DOI:10.1186/s12995-021-00330-2

Recovering genotypes and phenotypes using allele-specific genes

Genome Biol. 2021 Sep 7;22(1):263. doi: 10.1186/s13059-021-02477-x.

ABSTRACT

With the recent increase in RNA sequencing efforts using large cohorts of individuals, surveying allele-specific gene expression is becoming increasingly frequent. Here, we report that, despite not containing explicit variant information, a list of genes known to be allele-specific in an individual is enough to recover key variants and link the individuals back to their genotypes and phenotypes. This creates a privacy conundrum.
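
A conceptual sketch of the linking idea (not the authors' actual method): score an individual's allele-specific gene list against genotype-derived candidate sets and re-identify the best match. All data here are invented:

```python
# Toy linking sketch: an ASE gene list alone can point back to an individual.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

observed_ase = {"GENE1", "GENE2", "GENE5"}   # ASE genes of an unknown sample
candidates = {                                # genotype-predicted ASE sets
    "individual_A": {"GENE1", "GENE2", "GENE5"},
    "individual_B": {"GENE3", "GENE4"},
}
best = max(candidates, key=lambda k: jaccard(observed_ase, candidates[k]))
print(best)  # -> individual_A: the gene list re-identifies the genotype
```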

PMID:34493313 | DOI:10.1186/s13059-021-02477-x

SUPERGNOVA: local genetic correlation analysis reveals heterogeneous etiologic sharing of complex traits

Genome Biol. 2021 Sep 7;22(1):262. doi: 10.1186/s13059-021-02478-w.

ABSTRACT

Local genetic correlation quantifies the genetic similarity of complex traits in specific genomic regions. However, accurate estimation of local genetic correlation remains challenging, due to linkage disequilibrium in local genomic regions and sample overlap across studies. We introduce SUPERGNOVA, a statistical framework to estimate local genetic correlations using summary statistics from genome-wide association studies. We demonstrate that SUPERGNOVA outperforms existing methods through simulations and analyses of 30 complex traits. In particular, we show that the positive yet paradoxical genetic correlation between autism spectrum disorder and cognitive performance could be explained by two etiologically distinct genetic signatures with bidirectional local genetic correlations.
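
For intuition only, the toy sketch below computes a naive local "genetic correlation" as the correlation of per-SNP z-scores within one region; SUPERGNOVA's actual estimator additionally corrects for local linkage disequilibrium and sample overlap, which this sketch deliberately omits:

```python
# Naive toy illustration, NOT SUPERGNOVA's estimator.
import numpy as np
import pandas as pd

sumstats = pd.read_csv("two_trait_sumstats.csv")  # hypothetical merged file
region = sumstats[(sumstats["chrom"] == 1) &
                  sumstats["pos"].between(1_000_000, 2_000_000)]
r = np.corrcoef(region["z_trait1"], region["z_trait2"])[0, 1]
print(f"naive local z-score correlation: {r:.3f}")
```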

PMID:34493297 | DOI:10.1186/s13059-021-02478-w

Point-of-care detection and differentiation of anticoagulant therapy – development of thromboelastometry-guided decision-making support algorithms

Thromb J. 2021 Sep 7;19(1):63. doi: 10.1186/s12959-021-00313-7.

ABSTRACT

BACKGROUND: DOAC detection is challenging in emergency situations. We recently demonstrated that modified thromboelastometric tests can reliably detect and differentiate dabigatran and rivaroxaban. However, whether all DOACs can be detected and differentiated from other coagulopathies is unclear. We therefore tested the hypothesis that a decision tree-based thromboelastometry algorithm enables detection and differentiation of all direct Xa inhibitors (DXaIs), the direct thrombin inhibitor (DTI) dabigatran, vitamin K antagonists (VKA), and dilutional coagulopathy (DIL) with high accuracy.

METHODS: Following ethics committee approval (No. 17-525-4) and registration in the German clinical trials database, we conducted a prospective observational trial including 50 anticoagulated patients (n = 10 per DOAC or VKA) and 20 healthy volunteers. Blood was drawn regardless of the time of last anticoagulant intake. Healthy volunteers served as controls, and their blood was diluted in vitro to simulate a 50% dilution. Standard tests (extrinsic coagulation assay, fibrinogen assay, etc.) and modified thromboelastometric tests (ecarin assay and extrinsic coagulation assay with low tissue factor) were performed. Statistical analyses included decision tree analysis, with accuracy, sensitivity, and specificity reported, as well as receiver operating characteristic (ROC) curve analysis including optimal cut-off values (Youden index).
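
A hedged sketch of this statistical pipeline with scikit-learn, fitting a shallow decision tree and deriving a ROC curve with a Youden-optimal cut-off for one one-vs-rest contrast; the feature and file names are assumptions:

```python
# Illustrative sketch only: decision tree + ROC/Youden on assumed features.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_curve, auc

df = pd.read_csv("rotem_parameters.csv")                     # hypothetical
X = df[["extem_ct", "ectest_ct", "lowtf_ct", "fibtem_mcf"]]  # assumed features
y = df["group"]  # e.g. DTI / DXaI / VKA / DIL / control

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# ROC and Youden index for one one-vs-rest contrast (DTI vs. everything else)
dti_score = tree.predict_proba(X)[:, list(tree.classes_).index("DTI")]
fpr, tpr, thresholds = roc_curve((y == "DTI").astype(int), dti_score)
best = (tpr - fpr).argmax()  # Youden J = sensitivity + specificity - 1
print(f"AUC={auc(fpr, tpr):.2f}, optimal cut-off={thresholds[best]:.3f}")
```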

RESULTS: First, standard thromboelastometric tests allowed good differentiation between DOACs and VKA, DIL, and controls; however, they failed to reliably differentiate DXaIs, DTI, and VKA, resulting in an overall accuracy of 78%. Second, with the addition of modified thromboelastometric tests, 9/10 DTI and 28/30 DXaI patients were detected, raising the overall accuracy to 94%. More complex decision trees increased the overall accuracy further, to 98%. ROC curve analyses confirmed the decision tree-based results, showing high sensitivity and specificity for the detection and differentiation of DTI, DXaIs, VKA, DIL, and controls.

CONCLUSIONS: Decision tree-based machine-learning algorithms using standard and modified thromboelastometric tests allow reliable detection of DTI and DXaIs and their differentiation from VKA, DIL, and controls.

TRIAL REGISTRATION: German clinical trials database ID: DRKS00015704.

PMID:34493301 | DOI:10.1186/s12959-021-00313-7

Similarities of developmental gene expression changes in the brain between human and experimental animals: rhesus monkey, mouse, Zebrafish, and Drosophila

Mol Brain. 2021 Sep 7;14(1):135. doi: 10.1186/s13041-021-00840-4.

ABSTRACT

AIM: Experimental animals such as non-human primates (NHPs), mice, Zebrafish, and Drosophila are frequently employed as models to gain insights into human physiology and pathology. In developmental neuroscience and related research fields, information about the similarity of developmental gene expression patterns between animal models and humans is vital for choosing which animal model to employ. Here, we aimed to statistically compare developmental changes in brain gene expression patterns in humans with those in animal models frequently used in the neuroscience field.

METHODS: The developmental gene expression datasets we analyzed consist of fold-changes and P values of gene expression in the brains of animals of various ages compared with the youngest postnatal animals available in each dataset. Employing the running Fisher algorithm in the bioinformatics platform BaseSpace, we assessed the similarity of developmental changes in gene expression patterns in the human (Homo sapiens) hippocampus to those in the dentate gyrus (DG) of the rhesus monkey (Macaca mulatta), the DG of the mouse (Mus musculus), the whole brain of the Zebrafish (Danio rerio), and the whole brain of Drosophila (Drosophila melanogaster).
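
In the spirit of such an overlap P-value (though not the proprietary running Fisher implementation itself), a minimal sketch using a one-sided Fisher's exact test on a 2×2 overlap table, with made-up numbers:

```python
# Toy overlap P-value for two differentially expressed gene sets drawn
# from a shared background of assayed genes. All numbers are invented.
from scipy.stats import fisher_exact

background = 20_000                # genes assayed in both datasets
set_human, set_monkey = 1_200, 900
overlap = 400                      # genes changed in both

table = [[overlap, set_human - overlap],
         [set_monkey - overlap, background - set_human - set_monkey + overlap]]
odds, p = fisher_exact(table, alternative="greater")
print(f"overlap P-value = {p:.3g}")
```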

RESULTS: Among all possible comparisons of ages and species within the datasets, the developmental changes in gene expression patterns of rhesus monkeys and mice were highly similar to those of humans, with significant overlap P-values as assessed by the running Fisher algorithm. The highest degree of gene expression similarity was between 40-59-year-old humans and 6-12-year-old rhesus monkeys (overlap P-value = 2.1 × 10⁻⁷²). The gene expression similarity between 20-39-year-old humans and 29-day-old mice was also significant (overlap P-value = 1.1 × 10⁻⁴⁴). Moreover, there was a similarity in developmental changes of gene expression patterns between 1-2-year-old Zebrafish and 40-59-year-old humans (overlap P-value = 1.4 × 10⁻⁶). The overlap P-value of developmental gene expression patterns between Drosophila and humans failed to reach significance (30-day-old Drosophila and 6-11-year-old humans; overlap P-value = 0.0614).

CONCLUSIONS: These results indicate that the developmental gene expression changes in the brains of the rhesus monkey, mouse, and Zebrafish recapitulate, to a certain degree, those in humans. Our findings support the idea that these animal models are a valid tool for investigating the development of the brain in neurophysiological and neuropsychiatric studies.

PMID:34493287 | DOI:10.1186/s13041-021-00840-4