Categories
Nevin Manimala Statistics

Systematic Assessment of 10 Biomarker Candidates Focusing on α-Synuclein-Related Disorders

Mov Disord. 2021 Aug 7. doi: 10.1002/mds.28738. Online ahead of print.

ABSTRACT

BACKGROUND: Objective diagnostic biomarkers are needed to support a clinical diagnosis.

OBJECTIVES: To analyze markers in various neurodegenerative disorders to identify diagnostic biomarker candidates for mainly α-synuclein (aSyn)-related disorders (ASRD) in serum and/or cerebrospinal fluid (CSF).

METHODS: Upon initial testing of commercially available kits or published protocols for the quantification of the candidate markers, assays for the following were selected: total and phosphorylated aSyn (pS129aSyn), neurofilament light chain (NfL), phosphorylated neurofilament heavy chain (pNfH), tau protein (tau), ubiquitin C-terminal hydrolase L1 (UCHL-1), glial fibrillary acidic protein (GFAP), calcium-binding protein B (S100B), soluble triggering receptor expressed on myeloid cells 2 (sTREM-2), and chitinase-3-like protein 1 (YKL-40). The cohort comprised participants with Parkinson’s disease (PD, n = 151), multiple system atrophy (MSA, n = 17), dementia with Lewy bodies (DLB, n = 45), and tau protein-related neurodegenerative disorders (n = 80, comprising patients with progressive supranuclear palsy (PSP, n = 38), corticobasal syndrome (CBS, n = 16), Alzheimer’s disease (AD, n = 11), and frontotemporal degeneration/amyotrophic lateral sclerosis (FTD/ALS, n = 15)), as well as healthy controls (HC, n = 20). Receiver operating characteristic (ROC) curves with areas under the curve (AUC) are given for each marker.

RESULTS: CSF total aSyn was decreased, whereas NfL, pNfH, UCHL-1, GFAP, S100B, and sTREM-2 were increased, in patients with neurodegenerative disease versus HC (P < 0.05). As expected, some of the markers were highest in AD (i.e., UCHL-1, GFAP, S100B, sTREM-2, YKL-40). Within ASRD, CSF NfL levels were higher in MSA than in PD and DLB (P < 0.05). Comparing PD to HC, the most promising serum markers were S100B (AUC: 0.86), sTREM2 (AUC: 0.87), and NfL (AUC: 0.78). CSF S100B and serum GFAP were highest in DLB.

CONCLUSIONS: Levels of most marker candidates tested in serum and CSF significantly differed between disease groups and HC. In the stratification of PD versus other tau- or aSyn-related conditions, CSF NfL levels best discriminated PD and MSA. CSF S100B and serum GFAP best discriminated PD and DLB. © 2021 The Authors. Movement Disorders published by Wiley Periodicals LLC on behalf of International Parkinson Movement Disorder Society.
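The AUC values reported above (e.g., serum S100B AUC 0.86 for PD vs. HC) summarize how well a marker separates two groups. A minimal sketch with hypothetical marker values (not study data), using the equivalence between the AUC and the Mann-Whitney U statistic:

```python
def roc_auc(cases, controls):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen case has a higher marker level than a randomly
    chosen control, with ties counting half."""
    n_pairs = len(cases) * len(controls)
    wins = sum((c > h) + 0.5 * (c == h) for c in cases for h in controls)
    return wins / n_pairs

# Hypothetical serum marker levels (arbitrary units), not study data
pd_patients = [5.1, 6.3, 7.0, 4.8, 6.9, 5.5]
healthy = [3.2, 4.1, 3.9, 5.0, 2.8, 4.4]
auc = roc_auc(pd_patients, healthy)
```

An AUC of 0.5 corresponds to chance-level discrimination; 1.0 to perfect separation.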

PMID:34363416 | DOI:10.1002/mds.28738


Effects of an arm-support exoskeleton on perceived work intensity and musculoskeletal discomfort: An 18-month field study in automotive assembly

Am J Ind Med. 2021 Aug 6. doi: 10.1002/ajim.23282. Online ahead of print.

ABSTRACT

BACKGROUND: Exoskeleton (EXO) technologies are a promising ergonomic intervention to reduce the risk of work-related musculoskeletal disorders, with efficacy supported by laboratory- and field-based studies. However, there is a lack of field-based evidence on long-term effects of EXO use on physical demands.

METHODS: A longitudinal, controlled research design was used to examine the effects of arm-support exoskeleton (ASE) use on perceived physical demands during overhead work at nine automotive manufacturing facilities. Data were collected at five milestones (baseline and at 1, 6, 12, and 18 months) using questionnaires. Linear mixed models were used to understand the effects of ASE use on perceived work intensity and musculoskeletal discomfort (MSD). Analyses were based on a total of 41 participants in the EXO group and 83 in a control group.

RESULTS: Across facilities, perceived work intensity and MSD scores did not differ significantly between the EXO and control groups. In some facilities, however, neck and shoulder MSD scores in the EXO group decreased over time. Wrist MSD scores in the EXO group remained unchanged in some facilities, while those scores increased in the control group over time. Upper arm and low back MSD scores were comparable between the experimental groups.

CONCLUSION: Longitudinal effects of ASE use on perceived physical demands were not found, though some suggestive results were evident. This lack of consistent findings is discussed, and it underscores the need for systematic, evidence-based approaches to ASE implementation in the field that can guide the optimal selection of jobs for ASE use.

PMID:34363229 | DOI:10.1002/ajim.23282


Neurosonographic assessments of corpus callosum related structures in growth-restricted fetuses

J Clin Ultrasound. 2021 Aug 6. doi: 10.1002/jcu.23052. Online ahead of print.

ABSTRACT

PURPOSE: The aim of this study was to evaluate whether corpus callosum length (CCL), corpus callosum-fastigium length (CCFL), and the angle between the CCL and CCFL (CCFA) were altered in growth-restricted fetuses.

METHODS: This prospective case-control study was conducted in a tertiary center. A total of 80 singleton fetuses were included in the study, classified as 36 late-onset growth-restricted fetuses and 44 adequate-for-gestational-age fetuses. All biometric measurements and Doppler assessments of the umbilical artery, middle cerebral artery, and ductus venosus were performed via the trans-abdominal route. CCL, CCFL, and CCFA were assessed via the trans-vaginal route.

RESULTS: Late-onset growth-restricted fetuses showed significantly reduced CCL and CCFL. There was no statistically significant difference in CCFA. Moderate-to-high correlations were detected between CCL and biparietal diameter, head circumference, abdominal circumference, femur length, and gestational age (r = 0.482, r = 0.537, r = 0.488, r = 0.519, and r = 0.472, respectively; all p < 0.001).

CONCLUSION: This study adds to the literature the finding that CCFA was unchanged despite the decreases in CCL and CCFL in late-onset fetal growth restriction, which might result from the redistribution of cerebral blood flow. Larger prospective studies are needed to clarify the prognostic implications of these results for neural and cognitive function in postnatal life.
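The correlations reported in the results (e.g., r = 0.472 between CCL and gestational age) are Pearson coefficients, which can be sketched as follows; the measurement values below are hypothetical, not study data.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical CCL measurements (mm) vs. gestational age (weeks)
ccl = [38.1, 40.2, 41.0, 43.5, 44.1]
ga = [32, 33, 34, 36, 37]
r = pearson_r(ccl, ga)
```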

PMID:34363232 | DOI:10.1002/jcu.23052


Familial confounding affected the associations between maternal smoking during pregnancy and offspring speech and language, scholastic and coordination disorders

Acta Paediatr. 2021 Aug 7. doi: 10.1111/apa.16062. Online ahead of print.

ABSTRACT

AIM: This study examined the associations between prenatal smoking and speech and language, scholastic, coordination and mixed developmental disorders in offspring, using sibling and population controls.

METHODS: National Finnish registers were used to identify all 690,654 singletons born between 1996 and 2007 and any cases diagnosed with speech and language, scholastic, coordination and mixed developmental disorders by the end of 2012. Cases were compared to population controls, biological full-siblings and maternal half-siblings born during the same period. Conditional logistic regression was used to assess any associations between smoking during pregnancy and the selected developmental disorders.

RESULTS: The prevalence of prenatal smoking was higher among the mothers of the 27,297 cases (21.7%) than among those of the 99,876 population controls (14.5%). The adjusted odds ratio for smoking throughout pregnancy, and any diagnosis of speech and language, scholastic, coordination or mixed developmental disorders, was 1.29 (95% confidence interval 1.24-1.34). However, when we compared a subsample of 15,406 cases and their 20,657 siblings, the association was no longer statistically significant (odds ratio 1.09, 95% confidence interval 0.98-1.21).

CONCLUSION: The sibling comparisons suggested that the associations between prenatal smoking and speech and language, scholastic, coordination and mixed developmental disorders were confounded by familial factors shared by differentially exposed siblings.
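For context, a crude (unadjusted) odds ratio can be derived from the reported prevalences alone; the counts below are back-calculated approximations from the abstract's percentages, and the result exceeding the adjusted estimate of 1.29 illustrates the confounding the study describes. A minimal sketch:

```python
import math

def crude_or_ci(a, b, c, d):
    """Crude odds ratio with a 95% Wald confidence interval from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = or_ * math.exp(-1.96 * se)
    hi = or_ * math.exp(1.96 * se)
    return or_, lo, hi

# Counts approximated from the reported prevalences
# (21.7% of 27,297 cases; 14.5% of 99,876 controls)
exposed_cases = round(0.217 * 27297)
unexposed_cases = 27297 - exposed_cases
exposed_controls = round(0.145 * 99876)
unexposed_controls = 99876 - exposed_controls
or_, lo, hi = crude_or_ci(exposed_cases, unexposed_cases,
                          exposed_controls, unexposed_controls)
```

The crude OR here is roughly 1.63, versus the reported adjusted 1.29 and the sibling-comparison 1.09, a pattern consistent with familial confounding.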

PMID:34363238 | DOI:10.1111/apa.16062


Willingness of Women with Endometriosis Planning to Undergo IVF to Participate in a Randomized Clinical Trial and the Effects of the COVID-19 Pandemic on Potential Participation

Reprod Sci. 2021 Aug 6. doi: 10.1007/s43032-021-00705-0. Online ahead of print.

ABSTRACT

The Pre-IVF Treatment with a GnRH Antagonist in Women with Endometriosis (PREGnant) Trial (clinicaltrials.gov no. NCT04173169) was designed to test the hypothesis that 60-day pre-treatment with an oral GnRH antagonist in women with documented endometriosis and planning an IVF cycle will result in a superior live birth rate to placebo. Eight hundred fourteen women are required from 4 national sites. To determine the feasibility of using an electronic medical record (EMR)-based strategy to recruit 204 participants at the Colorado site, we conducted a survey of women within the UCHealth system. Eligible women, identified using relevant ICD-10 codes, were invited to complete a 6-question survey to assess planned utilization of IVF, potential interest in participation, and whether delays in treatment due to COVID-19 would influence their decision to participate. Of 6354 age-eligible women with an endometriosis diagnosis, 421 had a concurrent infertility diagnosis. After eliminating duplicates, 212 were emailed a survey; 76 (36%) responded, 6 of whom reported no endometriosis diagnosis. Of the remaining 70, 29 (41%) were planning fertility treatment; only 19 planned IVF. All 19 expressed interest in participation. COVID-19 delays in treatment were not considered as a factor affecting participation by 8/19; the remaining 11 felt that it would “somewhat” affect their decision. None reported that they would not consider participation because of COVID-19. EMR-based recruitment for an endometriosis clinical trial is feasible although the overall yield of participants is low. Delays in treatment due to COVID-19 did not appear to overly influence potential recruitment.

PMID:34363198 | DOI:10.1007/s43032-021-00705-0


Validation of the Nottingham Hip Fracture Score (NHFS) for the prediction of 30-day mortality in a Swedish cohort of hip fractures

Acta Anaesthesiol Scand. 2021 Aug 7. doi: 10.1111/aas.13966. Online ahead of print.

ABSTRACT

BACKGROUND: Hip fracture is a common osteoporotic fracture with high morbidity and mortality. The utility of the ASA classification is limited, as most patients are ASA ≥ 3. A reliable predictor of mortality risk could support decision-making. We aimed to evaluate the Nottingham Hip Fracture Score (NHFS) for the prediction of 30-day mortality and then to recalibrate the formula converting NHFS to risk of 30-day mortality.

METHODS: All patients > 60 years with surgically treated hip fracture during 2015-2016 were assessed. Data were extracted manually from routinely collected clinical data in registries and medical records. The discriminative performance of the NHFS and the ASA classification was assessed with C-statistics. The conversion formula from NHFS to risk of 30-day mortality was recalibrated using binomial logistic regression. Observed-versus-expected ratios of 30-day mortality were compared with the 2012 NHFS formula, and recalibration was performed on a split dataset.

RESULTS: A total of 1864 patients were included, with 213 deaths within 30 days. C-statistics were 0.64 for the NHFS and 0.62 for ASA. Comparing the deaths expected under the 2012 revision with our observed deaths gave an observed-to-expected ratio of 1.37. Recalibrating on 70% of our Swedish cohort and comparing predicted with observed 30-day mortality in the remaining 30% test portion gave a ratio of 0.97.

DISCUSSION: The NHFS underestimated mortality in our cohort and showed poor discrimination. Revision of the formula based on a split dataset improved calibration. We suggest that the NHFS be routinely implemented to support clinical judgement, expand preoperative assessment, and escalate intraoperative monitoring.
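The observed-to-expected (O/E) calibration check described above can be sketched as follows: expected deaths are the sum of per-patient predicted risks under a logistic score-to-risk formula, and an O/E ratio near 1 indicates good calibration (above 1, as in the 1.37 reported here, indicates underestimated risk). The scores and coefficients below are hypothetical, not the published 2012 NHFS values.

```python
import math

def expected_deaths(scores, intercept, slope):
    """Expected number of deaths under a logistic score-to-risk formula:
    risk = 1 / (1 + exp(-(intercept + slope * score)))."""
    return sum(1 / (1 + math.exp(-(intercept + slope * s))) for s in scores)

def o_over_e(observed, scores, intercept, slope):
    """Observed-to-expected mortality ratio for a cohort."""
    return observed / expected_deaths(scores, intercept, slope)

# Hypothetical NHFS scores and coefficients (not the published formula)
scores = [4, 5, 5, 6, 7, 8]
ratio = o_over_e(2, scores, intercept=-5.0, slope=0.5)
```

Recalibration amounts to refitting the intercept and slope on local data so that this ratio moves toward 1 in a held-out test portion.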

PMID:34363201 | DOI:10.1111/aas.13966


Comparison of various basal insulin dose adjustments for inpatients while unable to eat

Int J Clin Pharm. 2021 Aug 6. doi: 10.1007/s11096-021-01314-2. Online ahead of print.

ABSTRACT

Background The American Diabetes Association recommends a basal insulin or basal plus correctional insulin regimen for non-critically ill patients with type 2 diabetes mellitus unable to eat. There is limited evidence available examining ideal basal insulin dose reductions in this patient population. Aim This study aimed to determine the percent reduction of maintenance basal insulin that would provide the lowest hypoglycemic incidence in patients with type 2 diabetes mellitus in the non-intensive care unit setting. Methods This retrospective cohort study evaluated adult patients with type 2 diabetes mellitus prescribed outpatient basal insulin with a minimum unable-to-eat status of two hours. Patients were divided into four groups based on the basal insulin dose administered relative to the home dose: <25%, 25%-50%, 51%-75%, and >75%. The primary endpoint was the incidence of hypoglycemia while unable to eat. Secondary endpoints included the incidence of hyperglycemia, severe hypoglycemia, median daily blood glucose, and hospital length of stay. Results A total of 173 patients were included. The primary outcome of hypoglycemia (5.9% vs. 8.8% vs. 14.3% vs. 12.3%; P = 0.578) was similar in all groups. There were no differences in hyperglycemia (P = 0.0701), severe hypoglycemia (P = 0.578), or median daily blood glucose (P = 0.428). Patients receiving 25%-50% of home basal insulin had the longest unable-to-eat duration (11.5 h; P = 0.026); however, this was not statistically significant when adjusted using the Bonferroni correction for multiple tests. Conclusions No differences were observed in hypoglycemic events for patients unable to eat receiving various basal insulin dose reductions.
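The Bonferroni correction mentioned above multiplies each raw p-value by the number of comparisons (capped at 1), which is why the raw P = 0.026 no longer reaches significance. A minimal sketch with hypothetical p-values; only the 0.026 is taken from the abstract, and four comparisons are assumed for illustration.

```python
def bonferroni(p_values):
    """Bonferroni-adjusted p-values: multiply each raw p-value by the
    number of tests performed, capping the result at 1.0."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]

# The raw P = 0.026 from the abstract plus three hypothetical p-values
adjusted = bonferroni([0.026, 0.2, 0.6, 0.9])
# 0.026 * 4 = 0.104, which exceeds the 0.05 threshold
```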

PMID:34363191 | DOI:10.1007/s11096-021-01314-2


Prevalence and determinants of intravenous admixture preparation errors: A prospective observational study in a university hospital

Int J Clin Pharm. 2021 Aug 7. doi: 10.1007/s11096-021-01310-6. Online ahead of print.

ABSTRACT

Background Intravenous admixture preparation errors (IAPEs) may lead to patient harm. Insight into the prevalence of these IAPEs, as well as the determinants associated with them, is needed to inform preventive measures. Aim The primary aim of this study was to assess the prevalence of IAPEs. Secondary aims were to identify the type, severity, and determinants of IAPEs. Method A prospective observational study was performed in a Dutch university hospital. IAPE data were collected by disguised observation. The primary outcome was the proportion of admixtures with one or more IAPEs. Descriptive statistics were used for the prevalence, type, and severity of IAPEs. Mixed-effects logistic regression analyses were used to estimate the determinants of IAPEs. Results A total of 533 IAPEs occurred in 367 of 614 admixtures (59.8%) prepared by nursing staff. The most prevalent errors were wrong preparation technique (n = 257) and wrong volume of infusion fluid (n = 107). Fifty-nine IAPEs (11.1%) were potentially harmful. The following variables were associated with IAPEs: multistep versus single-step preparations (adjusted odds ratio [ORadj] 4.08, 95% confidence interval [CI] 2.27-7.35); interruption versus no interruption (ORadj 2.32, CI 1.13-4.74); weekend versus weekdays (ORadj 2.12, CI 1.14-3.95); time window 2 p.m.-6 p.m. versus 7 a.m.-10 a.m. (ORadj 3.38, CI 1.60-7.15); and paediatric versus adult wards (ORadj 0.14, CI 0.06-0.37). Conclusion IAPEs, including harmful IAPEs, occurred frequently. The determinants associated with IAPEs point to factors associated with preparation complexity and working conditions. Strategies to reduce the occurrence of IAPEs and therefore patient harm should target the identified determinants.
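Adjusted odds ratios like those above are obtained by exponentiating a logistic-regression coefficient (log-odds) and its Wald confidence limits. A minimal sketch; the coefficient and standard error below are back-calculated from the reported multistep-preparation estimate (ORadj 4.08, CI 2.27-7.35), not taken from the paper's model output.

```python
import math

def or_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds scale) and its
    standard error into an odds ratio with a Wald 95% CI.
    Returns (OR, lower limit, upper limit)."""
    return tuple(math.exp(beta + k * se) for k in (0.0, -z, z))

# beta = ln(4.08) ~ 1.406; se ~ 0.30 back-calculated from the CI width
or_, lo, hi = or_with_ci(beta=1.406, se=0.30)
```

This recovers approximately OR 4.08 (CI 2.27-7.34), matching the reported multistep-versus-single-step estimate.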

PMID:34363192 | DOI:10.1007/s11096-021-01310-6


Effects of lithium and selenium in the tail muscle of American bullfrog tadpoles (Lithobates catesbeianus) during premetamorphosis

Environ Sci Pollut Res Int. 2021 Aug 6. doi: 10.1007/s11356-021-15686-5. Online ahead of print.

ABSTRACT

Amphibian populations have faced a drastic decline over the past decades. This decline has been associated with the presence of contaminants in the environment, among other environmental stressors. The present study tested the responses of the tail muscle of premetamorphic American bullfrog (Lithobates catesbeianus) tadpoles to lithium (2.5 mg L-1) and selenium (10 μg L-1), both in isolation and as a mixture, by assessing total protein content, mobilization of glucose and triglycerides, and the activity of lactate dehydrogenase (LDH). The exposure followed a 21-day assay with two sampling periods (on the 7th and 21st days after the onset of exposure) to evaluate effects over time. The group exposed to the mixture showed a statistically significant decrease in LDH activity (P < 0.05) in both sampling periods. The presence of selenium elicited a statistically significant increase (P < 0.05) in glucose mobilization after 7 days of exposure; after 21 days, the animals exposed to selenium presented glucose mobilization comparable to the control group. The mobilization of glucose and triglycerides remained similar to the control group for the animals exposed to lithium and to the mixture in both sampling periods (P > 0.05). The total protein content showed no statistical difference in the treated groups throughout the experiment (P > 0.05). These results highlight the importance of assessing mixtures that can occur in the environment, since combinations of contaminants may elicit toxicity distinct from the effects triggered by the chemicals in isolation.

PMID:34363154 | DOI:10.1007/s11356-021-15686-5


Evaluation of wetland ecosystem health using geospatial technology: evidence from the lower Gangetic flood plain in India

Environ Sci Pollut Res Int. 2021 Aug 6. doi: 10.1007/s11356-021-15674-9. Online ahead of print.

ABSTRACT

The floodplain wetland habitat in the lower Gangetic plains of West Bengal plays a significant role in protecting against environmental degradation, such as pollution, lowering of the groundwater table, and natural hazards, as well as supporting human wellbeing. There is therefore a need to investigate the health status of these wetlands and suggest restoration strategies to protect the livelihoods that depend on them. This paper assesses the health of the wetland ecosystem by computing a wetland ecosystem health index (WHI) for 2011 and 2018 at the block level of Malda district, part of the lower Gangetic flood plain, using the pressure-state-response (PSR) model and the analytic hierarchy process (AHP) method. Six Landsat satellite images and statistical census data were used to determine wetland health. Wetlands were classified as very healthy (2.81-3.33), healthy (2.41-2.80), sub-healthy (2.01-2.40), unhealthy (1.61-2.00), or sick (0-1.60) on the basis of the WHI score. The results of this study showed that the wetlands surrounding the English Bazar, Manikchak, Ratua-II, and Kaliachak-II blocks were in sub-healthy to very healthy condition in 2011 but had changed to the unhealthy to sick category by 2018 owing to rapid urbanization, population density, and development activities; these areas belonged to the sub-healthy to sick category in both 2011 and 2018 owing to high wetland pressure. Our observations reveal that the ecosystem service value provided by the wetlands decreased by 62.51% and 20.46% over the observed period. Management of wetland ecosystem health should emphasize large (>100 ha) and medium (51-100 ha) wetlands in the Diara region of West Bengal. Developing local-level institutions and setting restoration goals are useful strategies for managing wetland resources, and the protection of biodiversity should be guided by government organizations and NGOs.
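A PSR-based health index of this kind is typically a weighted sum of normalized indicator scores, with weights derived from AHP pairwise comparisons, mapped onto the class thresholds quoted above. A minimal sketch; the indicator values and weights are hypothetical, and only the class boundaries come from the abstract.

```python
def whi(indicators, weights):
    """Wetland ecosystem health index as a weighted sum of normalized
    pressure-state-response indicator scores (AHP-style weights summing to 1)."""
    return sum(v * w for v, w in zip(indicators, weights))

def classify(score):
    """Map a WHI score to the health classes quoted in the abstract."""
    if score > 2.80:
        return "very healthy"
    if score > 2.40:
        return "healthy"
    if score > 2.00:
        return "sub-healthy"
    if score > 1.60:
        return "unhealthy"
    return "sick"

# Hypothetical pressure/state/response scores and AHP-derived weights
score = whi([3.0, 2.0, 1.0], [0.5, 0.3, 0.2])
health_class = classify(score)
```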

PMID:34363159 | DOI:10.1007/s11356-021-15674-9