Categories
Nevin Manimala Statistics

Current status of cardiac rehabilitation among representative hospitals treating acute myocardial infarction in South Korea

PLoS One. 2021 Dec 8;16(12):e0261072. doi: 10.1371/journal.pone.0261072. eCollection 2021.

ABSTRACT

Cardiac rehabilitation services are widely underutilized despite well-documented morbidity and mortality benefits after acute myocardial infarction. To assess the implementation rate of and barriers to cardiac rehabilitation in hospitals treating acute myocardial infarction in South Korea, questionnaires were emailed between May and July 2016 to the cardiology directors of 93 South Korean hospitals, all of which were certified institutes for coronary interventions. The questionnaires included 16 questions on hospital type, cardiology practice, and the implementation of cardiac rehabilitation. The obtained data were categorized into two groups by hospital type (secondary or tertiary) and statistically analysed. Of the 72 hospitals that responded (response rate 77%), 39 (54%) were tertiary medical centers and 33 (46%) were secondary medical centers. All hospitals treated acute myocardial infarction patients and performed emergency percutaneous coronary intervention; 79% (57/72) performed coronary artery bypass grafting. However, the rate of implementation of cardiac rehabilitation was low overall (28%, 20/72 hospitals) and lower still in secondary medical centers (12%, 4/33 hospitals) than in tertiary centers (41%, 16/39 hospitals, p = 0.002). The major barriers to cardiac rehabilitation were lack of staff (59%) and lack of space (33%). In contrast to the wide availability of acute-phase invasive treatment for acute myocardial infarction, the overall implementation of cardiac rehabilitation is extremely poor in South Korea. Given the established benefits of cardiac rehabilitation after acute myocardial infarction, greater administrative support, such as raising the fee for cardiac rehabilitation services to an appropriate level of health insurance coverage, is warranted.

PMID:34879117 | DOI:10.1371/journal.pone.0261072


Identification of the best housekeeping gene for RT-qPCR analysis of human pancreatic organoids

PLoS One. 2021 Dec 8;16(12):e0260902. doi: 10.1371/journal.pone.0260902. eCollection 2021.

ABSTRACT

In the last few years, there has been a considerable increase in the use of organoids, a new three-dimensional culture technology applied in scientific research. The main reasons for their extensive use are their plasticity and multiple applications, including in regenerative medicine and the screening of new drugs. The aim of this study was to better understand these structures by focusing on the choice of the best housekeeping gene (HKG) for accurate molecular analysis of such a heterogeneous system. This consideration should not be underestimated, because the inappropriate use of an HKG can lead to misleading data and incorrect results, especially when the subject of study is innovative and not yet fully explored, as is the case with organoids. We focused our attention on the newly described human pancreatic organoids (hPOs) and compared 12 well-known HKGs (ACTB, B2M, EF1α, GAPDH, GUSB, HPRT, PPIA, RNA18S, RPL13A, TBP, UBC and YWHAZ). Four statistical algorithms (NormFinder, geNorm, BestKeeper and ΔCt) were applied to estimate the expression stability of each HKG, and RefFinder was used to identify the genes most suitable for RT-qPCR data normalization. Our results showed that intragroup and intergroup comparisons can influence the choice of the best HKG, making clear that the identification of a stable reference gene for accurate and reproducible RT-qPCR data normalization remains a critical issue. In summary, this is the first report on HKGs in human organoids, and it provides a strong basis for further gene expression analysis in hPOs.
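As an illustration of one of the four algorithms named above, the comparative ΔCt method can be sketched in a few lines: for each gene, average the standard deviations of its pairwise Ct differences with every other candidate, and rank genes by that score (lower = more stable). The gene names below are real candidates from the panel, but the Ct values are invented for demonstration.

```python
import statistics
from itertools import combinations

# Hypothetical Ct values (cycles) for three candidate housekeeping
# genes across four hPO samples; real data would come from RT-qPCR.
ct = {
    "ACTB":  [18.2, 18.5, 18.1, 18.9],
    "GAPDH": [20.1, 20.3, 19.8, 21.5],
    "TBP":   [26.0, 26.2, 25.9, 26.4],
}

def delta_ct_stability(ct_values):
    """Rank genes by the comparative Delta-Ct method: for each gene,
    average the standard deviations of its pairwise Ct differences
    with every other gene. Lower scores mean more stable expression."""
    sd = {gene: [] for gene in ct_values}
    for a, b in combinations(ct_values, 2):
        diffs = [x - y for x, y in zip(ct_values[a], ct_values[b])]
        s = statistics.stdev(diffs)
        sd[a].append(s)
        sd[b].append(s)
    scores = {gene: sum(v) / len(v) for gene, v in sd.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

ranking = delta_ct_stability(ct)  # most stable gene first
```

In this toy data set ACTB co-varies most consistently with the other two genes, so it ranks first; tools like NormFinder and geNorm refine this idea with model-based intra- and intergroup variance estimates.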

PMID:34879096 | DOI:10.1371/journal.pone.0260902


Computational timeline reconstruction of the stories surrounding Trump: Story turbulence, narrative control, and collective chronopathy

PLoS One. 2021 Dec 8;16(12):e0260592. doi: 10.1371/journal.pone.0260592. eCollection 2021.

ABSTRACT

Measuring the specific kind, temporal ordering, diversity, and turnover rate of stories surrounding any given subject is essential to developing a complete reckoning of that subject’s historical impact. Here, we use Twitter as a distributed news and opinion aggregation source to identify and track the dynamics of the dominant day-scale stories around Donald Trump, the 45th President of the United States. Working with a data set comprising around 20 billion 1-grams, we first compare each day’s 1-gram and 2-gram usage frequencies to those of a year before to create day- and week-scale timelines of Trump stories for 2016-2021. We measure Trump’s narrative control, the extent to which stories have been about Trump or put forward by Trump. We then quantify story turbulence and collective chronopathy, the rate at which a population’s stories about a subject seem to change over time. We show that 2017 was the most turbulent year overall for Trump. In 2020, story generation slowed dramatically during the first two major waves of the COVID-19 pandemic, with rapid turnover returning first with the Black Lives Matter protests following George Floyd’s murder and then with events leading up to and following the 2020 US presidential election, including the storming of the US Capitol six days into 2021. Trump story turnover over 2 months during the COVID-19 pandemic was on par with that of 3 days in September 2017. Our methods may be applied to any well-discussed phenomenon and have the potential to enable computational approaches to journalism, history, and biography.
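The year-over-year frequency comparison described above can be sketched as follows. The words and counts are invented, and a simple log-ratio of relative frequencies stands in for the paper’s actual divergence measures; words absent from the earlier day are smoothed with a count floor.

```python
from collections import Counter
import math

# Hypothetical daily 1-gram counts for the same calendar day one year apart.
today = Counter({"trump": 900, "election": 400, "covid": 50, "senate": 120})
year_ago = Counter({"trump": 500, "election": 30, "covid": 0, "senate": 100})

def yoy_log_ratio(now, before, floor=1):
    """Score each 1-gram by the log-ratio of its relative usage frequency
    versus the same day a year earlier. A large positive score flags a
    word driving a new story; `floor` smooths words absent a year ago."""
    n_now = sum(now.values())
    n_before = sum(before.values())
    scores = {}
    for word, count in now.items():
        f_now = count / n_now
        f_before = max(before[word], floor) / n_before
        scores[word] = math.log10(f_now / f_before)
    return sorted(scores.items(), key=lambda kv: -kv[1])

timeline_words = yoy_log_ratio(today, year_ago)  # story-driving words first
```

In this toy example "covid" and "election" surface as the day’s story words while "trump" itself, being a year-round constant, scores near zero; ranking such words day by day yields the kind of timeline the paper constructs.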

PMID:34879105 | DOI:10.1371/journal.pone.0260592


Deciphering the microbial and molecular responses of geographically diverse Setaria accessions grown in a nutrient-poor soil

PLoS One. 2021 Dec 8;16(12):e0259937. doi: 10.1371/journal.pone.0259937. eCollection 2021.

ABSTRACT

The microbial and molecular characterization of the ectorhizosphere is an important step toward a more complete understanding of how biofuel crops can be cultivated in nutrient-poor environments. The ectorhizosphere of Setaria is of particular interest because the plant component of this plant-microbe system is an important agricultural grain crop and a model for biofuel grasses. Importantly, Setaria lends itself to high-throughput molecular studies. As such, we identified important intra- and interspecific microbial and molecular differences in the ectorhizospheres of three geographically distant Setaria italica accessions and their wild ancestor S. viridis. All were grown in a nutrient-poor soil with and without nutrient addition. To assess the contrasting impact of nutrient deficiency observed for two S. italica accessions, we quantitatively evaluated differences in soil organic matter, microbial community composition, and metabolite profiles. Together, these measurements suggest that rhizosphere priming differs by Setaria accession, driven by alterations in microbial community abundances, specifically in Actinobacteria and Proteobacteria populations. When globally comparing the metabolomic response of Setaria to nutrient addition, plants produced distinctly different metabolic profiles in the leaves and roots. With nutrient addition, nitrogen-containing metabolites increased significantly in leaves and roots, along with significant increases in tyrosine-derived alkaloids, serotonin, and synephrine. Glycerol also increased significantly in the leaves as well as the ectorhizosphere. These differences provide insight into how C4 grasses adapt to changing nutrient availability in soils or under contrasting fertilization schemes. This knowledge could then be applied in plant enhancement and bioengineering efforts to produce plants with superior traits when grown in nutrient-poor soils.

PMID:34879068 | DOI:10.1371/journal.pone.0259937


Diagnosis of Suspected Scaphoid Fractures

JBJS Rev. 2021 Dec 8;9(12). doi: 10.2106/JBJS.RVW.20.00247.

ABSTRACT

»: Suspected scaphoid fractures are a diagnostic and therapeutic challenge despite advances in knowledge regarding these injuries and imaging techniques. The risks of routine immobilization and of activity restriction in a young and active population must be weighed against the risk of nonunion associated with a missed fracture.

»: The prevalence of true fractures among suspected fractures is low. This greatly reduces the statistical probability that a positive diagnostic test will correspond with a true fracture, reducing the positive predictive value of an investigation.

»: There is no consensus reference standard for a true fracture; therefore, alternative statistical methods for calculating sensitivity, specificity, and positive and negative predictive values are required.

»: Clinical prediction rules that incorporate a set of demographic and clinical factors may allow stratification of secondary imaging, which, in turn, could increase the pretest probability of a scaphoid fracture and improve the diagnostic performance of the sophisticated radiographic investigations that are available.

»: Machine-learning-derived probability calculators may augment risk stratification and can improve through retraining, although these theoretical benefits need further prospective evaluation.

»: Convolutional neural networks (CNNs) are a form of artificial intelligence that have demonstrated great promise in the recognition of scaphoid fractures on radiographs. However, in the more challenging diagnostic scenario of a suspected or so-called “clinical” scaphoid fracture, CNNs have not yet proven superior to a diagnosis that has been made by an experienced surgeon.
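The prevalence effect noted in the second point can be made concrete with Bayes’ rule: at low pre-test probability, even a good test produces mostly false positives. The sensitivity, specificity, and prevalence figures below are illustrative, not taken from the review.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """PPV via Bayes' rule: the probability that a positive test
    reflects a true fracture, given the pre-test probability."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Illustrative numbers only: a test with 90% sensitivity and 90%
# specificity applied at two different fracture prevalences.
low = positive_predictive_value(0.10, 0.90, 0.90)   # unselected suspected fractures
high = positive_predictive_value(0.50, 0.90, 0.90)  # cohort enriched by triage
```

With 10% prevalence the PPV is only 0.5 (a positive result is a coin flip), while enriching the tested cohort to 50% prevalence, as clinical prediction rules aim to do, raises the PPV to 0.9 with the very same test.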

PMID:34879033 | DOI:10.2106/JBJS.RVW.20.00247


DeepCME: A deep learning framework for computing solution statistics of the chemical master equation

PLoS Comput Biol. 2021 Dec 8;17(12):e1009623. doi: 10.1371/journal.pcbi.1009623. Online ahead of print.

ABSTRACT

Stochastic models of biomolecular reaction networks are commonly employed in systems and synthetic biology to study the effects of stochastic fluctuations emanating from reactions involving species with low copy-numbers. For such models, Kolmogorov’s forward equation is called the chemical master equation (CME), and it is a fundamental system of linear ordinary differential equations (ODEs) that describes the evolution of the probability distribution of the random state-vector representing the copy-numbers of all the reacting species. The size of this system is given by the number of states accessible to the chemical system, and for most examples of interest this number is either very large or infinite. Moreover, approximations that reduce the size of the system by retaining only a finite number of important chemical states (e.g. those with non-negligible probability) still result in high-dimensional ODE systems, even when the number of reacting species is small. Consequently, accurate numerical solution of the CME is very challenging despite the linear nature of the underlying ODEs, and one often resorts to estimating the solutions via computationally intensive stochastic simulations. The goal of the present paper is to develop a novel deep-learning approach for computing solution statistics of high-dimensional CMEs by reformulating the stochastic dynamics using Kolmogorov’s backward equation. The proposed method leverages the superior approximation properties of deep neural networks (DNNs) to reliably estimate expectations under the CME solution for several user-defined functions of the state-vector. The method is algorithmically based on reinforcement learning and requires only a moderate number of stochastic simulations (compared with typical simulation-based approaches) to train the “policy function”. This allows the numerical approximation not only of various expectations for the CME solution but also of its sensitivities with respect to all the reaction network parameters (e.g. rate constants). We provide four examples to illustrate our methodology and several directions for future research.
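As a minimal sketch of the simulation-based baseline the paper contrasts itself with, Gillespie’s stochastic simulation algorithm (SSA) estimates a CME expectation by averaging over sampled trajectories. The birth-death network and rate constants below are a standard textbook example with a known closed-form mean, not one of the paper’s four case studies.

```python
import random
import math

def ssa_birth_death(k_prod, k_deg, t_final, rng):
    """One Gillespie SSA trajectory of a birth-death network:
    0 -> X at rate k_prod, and X -> 0 at rate k_deg * X.
    Returns the copy-number X at time t_final, starting from X = 0."""
    t, x = 0.0, 0
    while True:
        a1, a2 = k_prod, k_deg * x      # reaction propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)        # time to the next reaction
        if t > t_final:
            return x
        x += 1 if rng.random() < a1 / a0 else -1

rng = random.Random(0)
n_runs = 2000
samples = [ssa_birth_death(10.0, 1.0, 5.0, rng) for _ in range(n_runs)]
mc_mean = sum(samples) / n_runs          # Monte Carlo estimate of E[X(5)]
exact = 10.0 * (1 - math.exp(-5.0))      # closed-form mean for this model
```

Even for this one-species network, thousands of trajectories are needed for a tight estimate of a single expectation, which illustrates why the paper seeks a DNN surrogate that amortizes the simulation cost across many expectations and parameter sensitivities.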

PMID:34879062 | DOI:10.1371/journal.pcbi.1009623


Baseball-Related Craniofacial Injury Among the Youth: A National Electronic Injury Surveillance System Database Study

J Craniofac Surg. 2021 Dec 7. doi: 10.1097/SCS.0000000000008404. Online ahead of print.

ABSTRACT

BACKGROUND: Baseball is one of the most played sports among adolescents in the United States. Yet youth baseball players experience the greatest number of oral and facial injuries compared with athletes in other sports.

METHODS: The National Electronic Injury Surveillance System was analyzed for all hospital admissions for youth baseball athletes (5-19-year-old) experiencing a baseball-related craniofacial injury. These included concussions, head contusions, head lacerations, facial contusions, facial fractures, facial hematomas, face lacerations, eye contusions, mouth lacerations, dental injuries, and neck contusions. Descriptive statistics were performed, and injury incidence was described by sport, injury type, and age group.

RESULTS: Nearly half of the injuries (45.0%) occurred among 10- to 14-year-old patients, followed by 5- to 9-year-olds and 15- to 19-year-olds. Across all age groups, the most common type of injury was facial contusion, comprising one-fourth of the injuries. Other frequent injuries included facial lacerations (19.9%), facial fractures (19.7%), and concussions (13.4%).

CONCLUSIONS: Overall, this analysis underscores the need for increased implementation of protective equipment, such as faceguards and safety balls. Although facial fractures are less common amongst the pediatric population, physicians and coaches need to be better educated about the most frequent injury patterns and management. Further prospective studies are warranted to better characterize these findings and to prevent injuries.

PMID:34879017 | DOI:10.1097/SCS.0000000000008404


The Standardization of Hospital-Acquired Infection Rates Using Prediction Models in Iran: Observational Study of National Nosocomial Infection Registry Data

JMIR Public Health Surveill. 2021 Dec 7;7(12):e33296. doi: 10.2196/33296.

ABSTRACT

BACKGROUND: Many factors contribute to the spreading of hospital-acquired infections (HAIs).

OBJECTIVE: This study aimed to standardize the HAI rate using prediction models in Iran based on the National Healthcare Safety Network (NHSN) method.

METHODS: In this study, the Iranian nosocomial infections surveillance system (INIS) was used to gather data on patients with HAIs (126,314 infections). In addition, the hospital statistics and information system (AVAB) was used to collect data on hospital characteristics. First, well-performing hospitals, including 357 hospitals from all over the country, were selected. Data were randomly split into training (70%) and testing (30%) sets. Finally, the standardized infection ratio (SIR) and the corrected SIR were calculated for the HAIs.

RESULTS: The mean age of the 100,110 patients with an HAI was 40.02 (SD 23.56) years. The corrected SIRs based on the observed and predicted infections for respiratory tract infections (RTIs), urinary tract infections (UTIs), surgical site infections (SSIs), and bloodstream infections (BSIs) were 0.03 (95% CI 0-0.09), 1.02 (95% CI 0.95-1.09), 0.93 (95% CI 0.85-1.007), and 0.91 (95% CI 0.54-1.28), respectively. Moreover, the corrected SIRs for RTIs in the infectious disease, burn, obstetrics and gynecology, and internal medicine wards; UTIs in the burn, infectious disease, internal medicine, and intensive care unit wards; SSIs in the burn and infectious disease wards; and BSIs in most wards were >1, indicating that more HAIs were observed than expected.
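The standardized infection ratio reported above is simply observed over predicted infections; a minimal sketch follows, with hypothetical counts and a normal-approximation Poisson interval standing in for the study’s actual CI method.

```python
import math

def standardized_infection_ratio(observed, predicted, z=1.96):
    """SIR = observed / predicted HAIs, with an approximate 95% CI
    treating the observed count as Poisson (normal approximation).
    SIR > 1 means more infections were observed than expected."""
    sir = observed / predicted
    se = math.sqrt(observed) / predicted
    return sir, (sir - z * se, sir + z * se)

# Hypothetical ward: 110 observed infections vs 100 predicted by the model.
sir, (lo, hi) = standardized_infection_ratio(110, 100)
```

Here the point estimate is 1.1 but the interval spans 1, so this hypothetical ward would not be flagged as significantly worse than expected; the ward-level SIRs >1 reported above are interpreted the same way.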

CONCLUSIONS: The results of this study can help to promote preventive measures based on scientific evidence. They can also lead to the continuous improvement of the monitoring system by collecting and systematically analyzing data on HAIs and encourage the hospitals to better control their infection rates by establishing a benchmarking system.

PMID:34879002 | DOI:10.2196/33296


Nicotine metabolism ratio increases in HIV-positive smokers on effective antiretroviral therapy: a cohort study

J Acquir Immune Defic Syndr. 2021 Dec 7. doi: 10.1097/QAI.0000000000002880. Online ahead of print.

ABSTRACT

BACKGROUND: People with HIV (PWH) smoke tobacco at much higher rates than the general population. Prior research has shown that PWH have faster nicotine metabolism than HIV-uninfected individuals, which may underlie this disparity, but the cause is unknown. We investigated whether a higher nicotine metabolite ratio (NMR; 3-hydroxycotinine:cotinine), a validated biomarker of nicotine metabolism via CYP2A6, was associated with antiretroviral use among HIV-infected smokers.

METHODS: We conducted a retrospective cohort study of HIV-positive smokers in the University of Pennsylvania Center for AIDS Research cohort. We compared the NMR before viral suppression (>10,000 copies/ml) and after viral suppression on ART (<200 copies/ml). We used mixed effects linear regression to analyze the change in NMR after viral suppression and assessed for effect modification by efavirenz use.

RESULTS: Eighty-nine individuals were included in the study. We observed effect modification by efavirenz use (interaction term for efavirenz use, P<0.001). Among those on non-efavirenz regimens, the mean NMR increased by 0.14 (95% CI 0.05-0.23, P=0.002). Among those on efavirenz-containing regimens, the mean NMR increased by 0.53 (95% CI 0.39-0.66, P<0.001).
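The effect-modification comparison above can be sketched with simple within-person change scores; the NMR values below are invented, and the difference of group-mean changes stands in for the mixed-effects interaction term actually estimated.

```python
import statistics

# Hypothetical paired NMR values (pre-suppression, on-ART) per smoker,
# split by regimen; the real analysis used mixed-effects regression.
efv =     [(0.30, 0.85), (0.25, 0.80), (0.40, 0.90), (0.35, 0.88)]
non_efv = [(0.30, 0.45), (0.28, 0.40), (0.35, 0.50), (0.33, 0.46)]

def mean_change(pairs):
    """Average within-person change in NMR after viral suppression."""
    return statistics.mean(post - pre for pre, post in pairs)

efv_change = mean_change(efv)            # change on efavirenz regimens
other_change = mean_change(non_efv)      # change on other regimens
interaction = efv_change - other_change  # effect modification by efavirenz
```

A positive `interaction` mirrors the reported pattern: NMR rises in both groups after suppression, but substantially more on efavirenz-containing regimens (0.53 vs 0.14 in the study).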

CONCLUSION: We observed a clinically and statistically significant increase in NMR after viral suppression among smokers with HIV, which more than doubled among those on efavirenz-based regimens. Higher NMR among HIV-positive smokers on ART may help explain the higher rates of tobacco use and lower quit rates among PWH in care. These findings suggest that regimen choice and other modifiable factors may be targets for future attempts to increase success rates for tobacco cessation among PWH.

PMID:34879005 | DOI:10.1097/QAI.0000000000002880


The Effects of the ManageHF4Life Mobile App on Patients With Chronic Heart Failure: Randomized Controlled Trial

JMIR Mhealth Uhealth. 2021 Dec 7;9(12):e26185. doi: 10.2196/26185.

ABSTRACT

BACKGROUND: The successful management of heart failure (HF) involves guideline-based medical therapy as well as self-management behavior. As a result, the management of HF is moving toward a proactive real-time technological model of assisting patients with monitoring and self-management.

OBJECTIVE: The aim of this paper was to evaluate the efficacy of enhanced self-management via a mobile app intervention on health-related quality of life, self-management, and HF readmissions.

METHODS: A single-center randomized controlled trial was performed. Participants older than 45 years who were admitted for acute decompensated HF or had been discharged within the past 4 weeks were included. The intervention group (“app group”) used a mobile app that prompted daily self-monitoring and promoted self-management. The control group (“no-app group”) received usual care. The primary outcome was the change in Minnesota Living with Heart Failure Questionnaire (MLHFQ) score from baseline to 6 and 12 weeks. Secondary outcomes were the Self-Care Heart Failure Index (SCHFI) questionnaire score and recurrent HF admissions.

RESULTS: A total of 83 participants were enrolled and completed all baseline assessments. Baseline characteristics were similar between the groups except for the prevalence of ischemic HF. The app group had a lower MLHFQ score at 6 weeks (mean 37.5, SD 3.5 vs mean 48.2, SD 3.7; P=.04) but not at 12 weeks (mean 44.2, SD 4 vs mean 45.9, SD 4; P=.78), compared to the no-app group. There was no effect of the app on the SCHFI score at 6 or 12 weeks. The time to first HF readmission was not statistically different between the app group and the no-app group (app group 11/42, 26% vs no-app group 12/41, 29%; hazard ratio 0.89, 95% CI 0.39-2.02; P=.78) over 12 weeks.
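The time-to-first-readmission comparison rests on survival curves. A minimal hand-rolled Kaplan-Meier sketch with invented follow-up data is shown below; the study itself reports a hazard ratio, which would additionally require a Cox model.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from (time, event) data, where
    event=1 marks an HF readmission and event=0 a censored patient.
    Returns (time, survival probability) pairs at each event time."""
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)   # number still at risk
        if d:
            surv *= 1 - d / n
            curve.append((t, surv))
    return curve

# Hypothetical follow-up times (weeks) and readmission indicators.
times = [2, 5, 5, 8, 12, 12, 12]
events = [1, 1, 0, 1, 0, 0, 0]
curve = kaplan_meier(times, events)
```

Plotting one such curve per arm and comparing them (e.g. with a log-rank test) is the standard way to assess a readmission difference like the null result reported above.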

CONCLUSIONS: The adaptive mobile app intervention, which focused on promoting self-monitoring and self-management, improved the MLHFQ at 6 weeks but did not sustain its effects at 12 weeks. No effect was seen on HF self-management measured by self-report. Further research is needed to enhance engagement in the app for a longer period and to determine if the app can reduce HF readmissions in a larger study.

TRIAL REGISTRATION: ClinicalTrials.gov NCT03149510; https://clinicaltrials.gov/ct2/show/NCT03149510.

PMID:34878990 | DOI:10.2196/26185