Categories
Nevin Manimala Statistics

Comparison of perioperative outcomes and clinical characteristics of Calcium, Matrix and Struvite stones from a single institution

Urology. 2021 Nov 27:S0090-4295(21)01092-X. doi: 10.1016/j.urology.2021.11.019. Online ahead of print.

ABSTRACT

OBJECTIVE: To define risk factors and perioperative outcomes for matrix stones and compare these outcomes with struvite and calcium stone cohorts.

METHODS: A retrospective cohort study comparing matrix stones (n=32), struvite stones (n=23) and a matched, calcium stone control group (n=32) was performed. Two-way ANOVA was used to compare the groups for continuous variables. Chi-square tests were used to compare categorical variables. Significance was set at p<0.05. All statistical tests were performed using R (v1.73).
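
The chi-square comparison of categorical variables used here can be reproduced by hand. Below is a minimal stdlib sketch on an invented 3x2 table (the counts are not the study's data; they are chosen only to roughly match the abstract's group sizes and staghorn percentages):

```python
# Hypothetical 3x2 contingency table (stone type x staghorn yes/no);
# counts are invented, roughly matching the abstract's group sizes.
observed = [
    [ 7, 25],  # matrix (n=32)
    [13, 10],  # struvite (n=23)
    [ 6, 26],  # calcium (n=32)
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected,
# where expected = row total * column total / grand total
chi_sq = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi_sq += (obs - expected) ** 2 / expected

dof = (len(observed) - 1) * (len(observed[0]) - 1)  # = 2

# Chi-square critical value for 2 df at alpha = 0.05
CRITICAL_05_DF2 = 5.991
print(f"chi-square = {chi_sq:.2f}, df = {dof}, "
      f"significant at 0.05: {chi_sq > CRITICAL_05_DF2}")
```

In practice this is a one-liner with `scipy.stats.chi2_contingency`; the expansion above only makes the expected-count arithmetic explicit.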

RESULTS: We identified no differences in age, gender, or BMI between the three groups. Patients with matrix and struvite stones were more likely to have a history of prior stone surgery and recurrent UTIs than those with calcium stones (p=0.027 and p<0.001, respectively). Struvite stones were more likely to present as staghorn calculi than matrix or calcium stones (56.5% vs 21.7% vs 18.8%, p=0.006). There were no significant differences in postoperative stone-free rates (p=0.378), and no significant differences in postoperative infectious complications were identified. Matrix stones were more likely to have Candida on stone culture than struvite or calcium stones (p<0.0001).

CONCLUSION: Matrix and struvite stones were more likely to have a history of stone surgery and preoperative recurrent UTIs. Struvite stones were more likely to present as staghorn calculi, and matrix stones were more likely to have Candida present in stone cultures. However, no difference in postoperative infectious outcomes or stone-free rates was identified. Further study with larger cohorts is necessary to distinguish matrix stone postoperative outcomes from those of struvite and calcium stones.

PMID:34848277 | DOI:10.1016/j.urology.2021.11.019


Occurrence of Plant-Parasitic Nematodes of Turfgrass in Korea

Plant Pathol J. 2021 Oct;37(5):446-454. doi: 10.5423/PPJ.OA.04.2021.0059. Epub 2021 Oct 1.

ABSTRACT

Plant-parasitic nematodes are not only an important constraint on agricultural crop production, but also cause both direct and indirect damage to turfgrass, which is a ground cover plant. However, studies on plant-parasitic nematodes of turfgrass in Korea are scarce. A survey for plant-parasitic nematodes was carried out on 13 golf courses in Korea. The results yielded 28 species/taxa belonging to 16 genera and 12 families of plant-parasitic nematodes. Among the isolated species, Helicotylenchus microlobus, Mesocriconema nebraskense, Tylenchorhynchus claytoni, Mesocriconema sp., and Meloidogyne graminicola were the most prevalent species in all management zones. Twelve species were new records of plant-parasitic nematodes in Korea. The highest maximum densities were recorded for T. claytoni, Paratylenchus nanus, M. nebraskense, M. graminicola, and H. microlobus. Diversity (H') was significantly higher in fairways than in tees and greens, though species evenness (J') and dominance (D) showed no statistically significant differences. This information is crucial for diagnosing nematode problems and for the subsequent formulation of management strategies.
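
The diversity (H'), evenness (J'), and dominance (D) indices reported here have standard closed forms. A minimal sketch on illustrative counts (not the survey data), assuming the usual Shannon, Pielou, and Simpson definitions:

```python
import math

# Illustrative nematode counts for one management zone (invented data)
counts = [120, 85, 60, 20, 10, 5]

total = sum(counts)
proportions = [c / total for c in counts]

# Shannon diversity H' = -sum(p * ln p)
shannon_h = -sum(p * math.log(p) for p in proportions)
# Pielou evenness J' = H' / ln(S), where S is the number of species
evenness_j = shannon_h / math.log(len(counts))
# Simpson dominance D = sum(p^2)
dominance_d = sum(p ** 2 for p in proportions)

print(f"H' = {shannon_h:.3f}, J' = {evenness_j:.3f}, D = {dominance_d:.3f}")
```

J' is bounded by 1 (perfectly even community), and D grows as a few species dominate, which is why the two move in opposite directions across management zones.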

PMID:34847631 | DOI:10.5423/PPJ.OA.04.2021.0059


Clinical utility of family history of depression for prognosis of adolescent depression severity and duration assessed with predictive modeling

J Child Psychol Psychiatry. 2021 Nov 30. doi: 10.1111/jcpp.13547. Online ahead of print.

ABSTRACT

BACKGROUND: Family history of depression (FHD) is a known risk factor for the new onset of depression. However, it is unclear if FHD is clinically useful for prognosis in adolescents with current, ongoing, or past depression. This preregistered study uses a longitudinal, multi-informant design to examine whether a child's FHD adds information about future depressive episodes and depression severity, applying state-of-the-art out-of-sample predictive methodology.

METHODS: We examined data in adolescents with current or past depression (age 11-17 years) from the National Institute of Mental Health Characterization and Treatment of Adolescent Depression (CAT-D) study. We asked whether a history of depression in a first-degree relative was predictive of depressive episode duration (72 participants) and future depressive symptom severity in probands (129 participants, 1,439 total assessments).

RESULTS: Family history of depression, while statistically associated with time spent depressed, did not improve predictions of time spent depressed, nor did it improve models of change in depression severity measured by self- or parent-report.

CONCLUSIONS: Family history of depression does not improve the prediction of the course of depression in adolescents already diagnosed with depression. The difference between statistical association and predictive models highlights the importance of assessing predictive performance when evaluating questions of clinical utility.
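
The distinction the authors draw between statistical association and predictive performance can be illustrated directly. The sketch below uses entirely synthetic data (all numbers invented): a binary family-history flag with a weak true effect shows a clear in-sample group difference, yet adding it to a prediction model barely moves holdout error:

```python
import random
import statistics

random.seed(42)

# Synthetic cohort: binary FHD flag, continuous severity with a weak FHD effect
n = 400
fhd = [random.random() < 0.4 for _ in range(n)]
severity = [10.0 + (0.8 if f else 0.0) + random.gauss(0.0, 5.0) for f in fhd]

# In-sample association: difference in group means
with_fhd = [s for s, f in zip(severity, fhd) if f]
without_fhd = [s for s, f in zip(severity, fhd) if not f]
assoc = statistics.mean(with_fhd) - statistics.mean(without_fhd)

# Out-of-sample prediction: does knowing FHD reduce squared error on a holdout?
train_idx, test_idx = range(0, 300), range(300, n)
mean_all = statistics.mean(severity[i] for i in train_idx)
mean_f = statistics.mean(severity[i] for i in train_idx if fhd[i])
mean_nf = statistics.mean(severity[i] for i in train_idx if not fhd[i])

def mse(preds, idx):
    return statistics.mean((severity[i] - p) ** 2 for i, p in zip(idx, preds))

mse_base = mse([mean_all] * len(test_idx), test_idx)
mse_fhd = mse([mean_f if fhd[i] else mean_nf for i in test_idx], test_idx)

print(f"group-mean difference (association): {assoc:.2f}")
print(f"holdout MSE, mean-only: {mse_base:.2f}; with FHD: {mse_fhd:.2f}")
```

When outcome variance is large relative to the effect, the association can be real while the predictive gain is negligible, which is the pattern the abstract reports.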

PMID:34847615 | DOI:10.1111/jcpp.13547


Novel cut-off values of time from diagnosis to systematic therapy predict the overall survival and the efficacy of targeted therapy in renal cell carcinoma: A long-term, follow-up, retrospective study

Int J Urol. 2021 Nov 30. doi: 10.1111/iju.14751. Online ahead of print.

ABSTRACT

OBJECTIVES: Metastatic renal cell carcinoma can occur synchronously or metachronously. We characterized the time from diagnosis to systematic therapy as a categorical variable to analyze its effect on the overall survival and first-line treatment efficacy of metastatic renal cell carcinoma patients.

METHODS: We initially enrolled 949 consecutive metastatic renal cell carcinoma patients treated with targeted therapies retrospectively from December 2005 to December 2019. X-tile analysis was used to determine cut-off values of time from diagnosis to systematic therapy referring to overall survival. Patients were divided into different groups based on the time from diagnosis to systematic therapy and then analyzed for survival.

RESULTS: Of 358 eligible patients with metastatic renal cell carcinoma, 125 (34.9%) had synchronous metastases followed by cytoreductive nephrectomy, and 233 (65.1%) had metachronous metastases. A total of 28 patients received complete metastasectomy. Three optimal cut-off values for the time from diagnosis to systematic therapy (months) – 1.1, 7.0 and 35.9 – were applied to divide the population into four groups: the synchro group (time from diagnosis to systematic therapy ≤1.0), early group (1.0 < time from diagnosis to systematic therapy ≤ 7.0), intermediate group (7.0 < time from diagnosis to systematic therapy < 36.0) and late group (time from diagnosis to systematic therapy ≥36.0). The targeted therapy-related overall survival (P < 0.001) and progression-free survival (P < 0.001) values were significantly different among the four groups. Patients with a longer time from diagnosis to systematic therapy had better prognoses and better targeted-therapy efficacy. With a longer time from diagnosis to systematic therapy, complete metastasectomy was more likely to be achieved and was associated with a better prognosis.
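
X-tile selects survival cut-offs by optimizing a statistic over candidate thresholds. The sketch below is a deliberately simplified stand-in on synthetic data: it scans candidate cut-offs on time-to-therapy and maximizes the difference in mean survival, whereas X-tile proper optimizes a chi-square/log-rank criterion on censored survival data:

```python
import statistics

# Synthetic (time_to_therapy_months, overall_survival_months) pairs
patients = [(0.5, 14), (0.8, 11), (2.0, 20), (5.0, 24), (6.5, 22),
            (9.0, 30), (18.0, 34), (30.0, 38), (40.0, 52), (60.0, 58)]

def separation(cutoff):
    """Difference in mean survival between the two groups a cut-off induces."""
    early = [os for ttt, os in patients if ttt <= cutoff]
    late = [os for ttt, os in patients if ttt > cutoff]
    if len(early) < 2 or len(late) < 2:  # require a minimum group size
        return float("-inf")
    return abs(statistics.mean(late) - statistics.mean(early))

# Scan every observed time-to-therapy value as a candidate cut-off
candidates = sorted({ttt for ttt, _ in patients})
best = max(candidates, key=separation)
print(f"best single cut-off: {best} months (separation {separation(best):.1f})")
```

The same scan can be run recursively within each resulting group to obtain multiple cut-offs, which is conceptually how a single threshold search extends to the three cut-offs reported here.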

CONCLUSIONS: The time from diagnosis to systematic therapy impacts the survival of metastatic renal cell carcinoma patients treated with targeted therapy. The cutoff points of 1, 7 and 36 months were statistically significant. The statistical boundaries might be valuable in future model establishment.

PMID:34847622 | DOI:10.1111/iju.14751


Fracture Strength of Monolithic Zirconia Crowns with Modified Vertical Preparation: A Comparative In Vitro Study

Eur J Dent. 2021 Nov 30. doi: 10.1055/s-0041-1735427. Online ahead of print.

ABSTRACT

OBJECTIVE: The aim of this study was to evaluate the influence of different marginal designs (deep chamfer, vertical, and modified vertical with reverse shoulder) on the fracture strength and failure modes of monolithic zirconia crowns.

MATERIALS AND METHODS: Thirty sound human maxillary first premolar teeth with comparable size were used in this study. The teeth were divided randomly into three groups according to the preparation design (n = 10): (1) group A: teeth prepared with a deep chamfer finish line; (2) group B: teeth prepared with vertical preparation; and (3) group C: teeth prepared with modified vertical preparation, where a reverse shoulder of 1 mm was placed on the buccal surface at the junction of middle and occlusal thirds. All samples were scanned by using an intraoral scanner (CEREC Omnicam, Sirona, Germany), and then the crowns were designed by using Sirona InLab 20.0 software and milled with a 5-axis machine. Each crown was then cemented on its respective tooth with self-adhesive resin cement by using a custom-made cementation device. A single load to failure test was used to assess the fracture load of each crown by using a computerized universal testing machine that automatically recorded the fracture load of each sample in Newton (N).

STATISTICAL ANALYSIS: The data were analyzed statistically by using one-way analysis of variance test and Bonferroni test at a level of significance of 0.05.
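
The one-way ANOVA and Bonferroni procedure named here can be sketched from first principles. The loads below are illustrative (three crowns per design for brevity; the study used n = 10 per group):

```python
import statistics

# Illustrative fracture loads in Newtons (invented, n = 3 per design)
groups = {
    "chamfer":           [2960.0, 2970.0, 2980.0],
    "modified_vertical": [2890.0, 2900.0, 2910.0],
    "vertical":          [2710.0, 2720.0, 2730.0],
}

all_values = [v for g in groups.values() for v in g]
grand_mean = statistics.mean(all_values)

# Between-group and within-group sums of squares
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                 for g in groups.values())
ss_within = sum((v - statistics.mean(g)) ** 2
                for g in groups.values() for v in g)

df_between = len(groups) - 1
df_within = len(all_values) - len(groups)
f_stat = (ss_between / df_between) / (ss_within / df_within)

# Bonferroni correction: the 0.05 alpha is split across 3 pairwise comparisons
alpha_adjusted = 0.05 / 3

print(f"F({df_between}, {df_within}) = {f_stat:.1f}; "
      f"pairwise tests compared at alpha = {alpha_adjusted:.4f}")
```

The F statistic is the ratio of between-group to within-group mean squares; each Bonferroni pairwise test is then judged against the adjusted alpha rather than 0.05.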

RESULTS: The highest mean fracture load was recorded for the chamfer group (2,969.8 N), followed by the modified vertical group (2,899.3 N); the lowest was recorded for the vertical group (2,717.9 N). One-way ANOVA revealed a significant difference among the three groups. The Bonferroni test showed a significant difference between group A and group B, while no significant difference was found between group C and either group A or group B.

CONCLUSION: Within the limitations of this in vitro study, the mean values of fracture strength of monolithic zirconia crowns of all groups were higher than the maximum occlusal forces in the premolar region. The modification of the vertical preparation with a reverse shoulder placed at the buccal surface improved the fracture strength up to the point that it was statistically nonsignificant with the chamfer group.

PMID:34847612 | DOI:10.1055/s-0041-1735427


Positive direct antiglobulin test: is it a risk factor for significant hyperbilirubinemia in neonates with ABO incompatibility?

Am J Perinatol. 2021 Nov 30. doi: 10.1055/a-1709-5036. Online ahead of print.

ABSTRACT

OBJECTIVE: ABO incompatibility is a common cause of neonatal indirect hyperbilirubinemia. The direct antiglobulin test (DAT) can identify infants developing hemolytic disease. This study aims to evaluate the significance of DAT positivity among neonates with ABO incompatibility.

STUDY DESIGN: This retrospective study included 820 neonates with blood group A or B who were born to blood group O mothers. The study group consisted of neonates (n = 79) who had positive DAT, and the control group consisted of infants (n = 741) who had negative DAT. Demographic and clinical data of the neonates regarding jaundice were collected and compared statistically.

RESULTS: The bilirubin level at 24 hours of life (study group 8 ± 2.6 mg/dl, control group 6 ± 2.2 mg/dl, p < 0.001) and the highest bilirubin level (study group 12.7 ± 3.6 mg/dl, control group 10.4 ± 4.2 mg/dl, p < 0.001) were higher in infants with positive DAT. In the study group, 37 (46.8%) infants received phototherapy (PT) in the nursery, compared with 83 (11.2%) infants in the control group (p < 0.001). In neonates with positive DAT, direct bilirubin level, duration of hospitalization, and PT in the nursery were higher (p = 0.002, p < 0.001, and p < 0.001, respectively), whereas hemoglobin level was lower (p < 0.001).
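
Given only the summary statistics in this paragraph, the 24-hour bilirubin comparison can be approximated from the abstract alone, assuming the ± values are standard deviations (the abstract does not say) and that a Welch-type two-sample test is appropriate:

```python
import math

# Summary statistics from the abstract: 24-hour bilirubin (mg/dl),
# assuming the quoted "mean ± x" values are mean ± SD
mean_pos, sd_pos, n_pos = 8.0, 2.6, 79    # DAT-positive study group
mean_neg, sd_neg, n_neg = 6.0, 2.2, 741   # DAT-negative control group

# Welch's t-statistic from summary data (does not assume equal variances)
se = math.sqrt(sd_pos ** 2 / n_pos + sd_neg ** 2 / n_neg)
t_stat = (mean_pos - mean_neg) / se
print(f"Welch t = {t_stat:.2f}")
```

A t statistic of this magnitude with hundreds of degrees of freedom is consistent with the reported p < 0.001, though the exact test the authors ran is not stated in the abstract.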

CONCLUSION: In neonates with ABO incompatibility, a positive DAT is a risk factor for developing significant hyperbilirubinemia. Close follow-up of newborn infants with ABO incompatibility is crucial for early detection and treatment of neonatal jaundice to avoid early and late complications.

PMID:34847590 | DOI:10.1055/a-1709-5036


Endocannabinoid Modulation Using Monoacylglycerol Lipase Inhibition in Tourette Syndrome: A Phase 1 Randomized, Placebo-Controlled Study

Pharmacopsychiatry. 2021 Nov 30. doi: 10.1055/a-1675-3494. Online ahead of print.

ABSTRACT

INTRODUCTION: Tourette syndrome (TS) is a complex neurodevelopmental disorder characterized by chronic motor and vocal tics. While consistently effective treatment is lacking, evidence indicates that the modulation of endocannabinoid system is potentially beneficial. Lu AG06466 (previously ABX-1431) is a highly selective inhibitor of monoacylglycerol lipase, the primary enzyme responsible for the degradation of the endocannabinoid ligand 2-arachidonoylglycerol. This exploratory study aimed to determine the effect of Lu AG06466 versus placebo on tics and other symptoms in patients with TS.

METHODS: In this phase 1b cross-over study, 20 adult patients with TS on standard-of-care medications were randomized to a single fasted dose of Lu AG06466 (40 mg) or placebo in period 1, followed by the other treatment in period 2. The effects on tics, premonitory urges, and psychiatric comorbidities were evaluated using a variety of scaled approaches at different time points before and after treatment.

RESULTS: All scales showed an overall trend of tic reduction, with two out of three tic scales (including the Total Tic Score of the Yale Global Tic Severity Score) showing a significant effect of a single dose of Lu AG06466 versus placebo at various timepoints. Treatment with Lu AG06466 resulted in a significant reduction in premonitory urges versus placebo. Single doses of Lu AG06466 were generally well-tolerated, and the most common adverse events were headache, somnolence, and fatigue.

CONCLUSION: In this exploratory trial, a single dose of Lu AG06466 showed statistically significant positive effects on key measures of TS symptoms.

PMID:34847610 | DOI:10.1055/a-1675-3494


Investigating the Role of Auditory Processing Abilities in Long-Term Self-Reported Hearing Aid Outcomes among Adults Age 60+ Years

J Am Acad Audiol. 2021 Jul;32(7):405-419. doi: 10.1055/s-0041-1728771. Epub 2021 Nov 30.

ABSTRACT

BACKGROUND: Self-reported hearing aid outcomes among older adults are variable and important to improve. The extent of the role of auditory processing in long-term hearing aid outcomes is not well understood.

PURPOSE: To determine how auditory processing abilities are related to self-reported hearing aid satisfaction and benefit along with either aided audibility alone or exploratory factors suggested by previous literature.

RESEARCH DESIGN: Descriptive analyses and multiple regression analyses of cross-sectional self-reported outcomes.

STUDY SAMPLE: Adult participants, >60 years (n = 78), fitted with bilateral hearing aids to treat symmetric, mild to moderate sensorineural hearing loss.

DATA COLLECTION AND ANALYSIS: Participants were recruited from a single audiology clinic to complete a series of questionnaires and behavioral assessments and to provide data from their hearing aids, including real-ear measures and data logging of hearing aid use. Multiple linear regressions were used to determine the amount of variance explained by predictive factors in self-reported hearing aid satisfaction and benefit. The primary predictive factors included gap detection threshold, spatial advantage score, dichotic difference score, and aided audibility. Exploratory factors included personality, self-efficacy, self-report of disability, and hearing aid use. All interpretations of statistical significance used p < 0.05. Effect sizes were determined using Cohen's f², with a medium effect suggesting clinical relevance.
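
Cohen's f² with the medium-effect threshold used here has a simple closed form. A sketch with illustrative R² values (not the study's):

```python
# Cohen's f^2 for an individual predictor in a multiple regression:
#   f2 = (R2_full - R2_reduced) / (1 - R2_full)
# Conventional thresholds (Cohen, 1988): 0.02 small, 0.15 medium, 0.35 large.

def cohens_f2(r2_full: float, r2_reduced: float) -> float:
    return (r2_full - r2_reduced) / (1 - r2_full)

def label(f2: float) -> str:
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"

# e.g. model R^2 rises from 0.30 to 0.40 when a predictor is added
f2 = cohens_f2(0.40, 0.30)
print(f"f2 = {f2:.3f} ({label(f2)})")
```

This is why a predictor can be statistically significant yet fail the clinical-relevance criterion: significance depends on sample size, while f² measures the variance uniquely explained.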

RESULTS: Gap detection threshold was a statistically significant predictor in both primary regression models with a medium effect size for satisfaction and a small effect size for benefit. When additional exploratory factors were included in the regression models with auditory processing abilities, gap detection and self-efficacy were both significant predictors of hearing aid satisfaction with medium effect sizes, explaining 10 and 17% of the variance, respectively. There were no medium effect sizes found for other predictor variables in either the primary or exploratory hearing aid benefit models. Additional factors were statistically significant in the models, explaining a small amount of variance, but did not meet the medium effect size criterion.

CONCLUSION: This study provides initial evidence supporting the incorporation of measures of gap detection ability and hearing aid self-efficacy into clinical practice for the interpretation of postfitting long-term hearing aid satisfaction.

PMID:34847582 | DOI:10.1055/s-0041-1728771


Evaluation of Postinfection Hearing with Audiological Tests in Patients with COVID-19: A Case-Control Study

J Am Acad Audiol. 2021 Jul;32(7):464-468. doi: 10.1055/s-0041-1730960. Epub 2021 Nov 30.

ABSTRACT

BACKGROUND: Some viral infections can cause congenital or acquired unilateral or bilateral hearing loss. It is predicted that the coronavirus disease 2019 (COVID-19) virus, which can affect many systems in the body, may also have a negative effect on hearing.

PURPOSE: This study evaluated the effects of COVID-19 infection on pure-tone average.

RESEARCH DESIGN: A case-control study.

MATERIALS AND METHODS: A total of 104 volunteers (48 control, 56 experimental group) who presented to the ENT clinic of Adıyaman University Training and Research Hospital were included in this study. After the detailed clinical examination and medical history, 13 volunteers from the experimental group and 5 volunteers from the control group were excluded from the study, so that each group consisted of 43 volunteers. The experimental group consisted of patients who had no prior hearing problems but had had COVID-19, while the control group consisted of healthy volunteers who had no hearing problems and had not been infected with COVID-19. An audiological test was applied to all volunteers to determine their pure-tone average. The resulting data were analyzed to determine whether COVID-19 affects the pure-tone average and how it varies with factors such as age and gender.
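
A pure-tone average is simply a mean of audiometric thresholds across a fixed set of frequencies. The abstract does not state which frequencies enter its average, so the four-frequency version below is an assumption for illustration:

```python
# Four-frequency pure-tone average (PTA): mean threshold at
# 500, 1000, 2000, and 4000 Hz. Which frequencies the study averaged
# is not stated in the abstract; this choice is an assumption.
def pure_tone_average(thresholds_db: dict[int, float]) -> float:
    freqs = (500, 1000, 2000, 4000)
    return sum(thresholds_db[f] for f in freqs) / len(freqs)

# Illustrative audiogram (dB HL), one ear
audiogram = {250: 10, 500: 15, 1000: 20, 2000: 25, 4000: 40, 8000: 45}
print(f"PTA = {pure_tone_average(audiogram):.1f} dB HL")
```

Because the average pools a few mid frequencies, isolated high-frequency losses (such as the 4000-8000 Hz differences this study reports) can shift the PTA only modestly, which is why per-frequency comparisons are reported alongside it.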

RESULTS: The evaluation of the 43 (50.0%) COVID-19-positive patients and 43 (50.0%) healthy controls showed no significant differences (p > 0.05) at 250 and 500 Hz, whereas at 4000, 6000, and 8000 Hz the two groups differed significantly. In addition, significant differences were found in the left and right ears at 1000 and 2000 Hz (p < 0.05). The differences between the two groups in the pure-tone average of the left and right ear were statistically significant (p < 0.05). However, there were no significant differences in the pure-tone average between males and females (p > 0.05).

CONCLUSION: The pure-tone averages of COVID-19-positive patients were significantly worse than those of the healthy control group. Thus, COVID-19 should also be considered in patients presenting with unexplained hearing loss. Further studies should investigate the effects of COVID-19 on hearing and the underlying pathophysiology.

PMID:34847586 | DOI:10.1055/s-0041-1730960


Patterns of opioid and benzodiazepine use with gabapentin among disabled Medicare beneficiaries – A retrospective cohort study

Drug Alcohol Depend. 2021 Nov 17;230:109180. doi: 10.1016/j.drugalcdep.2021.109180. Online ahead of print.

ABSTRACT

BACKGROUND: Our goal was to describe specific patterns associated with co-prescriptions of gabapentin, opioids, and benzodiazepines among disabled Medicare beneficiaries.

METHODS: Using 2013-2015 Medicare data, we conducted a retrospective cohort study among fee-for-service disabled beneficiaries continuously enrolled in Medicare Parts A, B, and D. The index date was defined as the earliest fill date for a gabapentin, opioid, or benzodiazepine prescription. Monotherapy, dual therapy, and tri-therapy were defined as utilization of one, two, and three medication classes, respectively. Augmentation was defined as a prescription for a different medication class in addition to prescription for initial medication; switching referred to a change in prescription for a different medication class with no subsequent fills of initial medication. We used descriptive statistics, Kaplan Meier analyses and Cox proportional hazards to examine the association between initial therapy and monotherapy, dual therapy, tri-therapy, switching and augmentation.
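
The Kaplan-Meier analyses named here rest on a product-limit estimator that is short enough to write out. A minimal stdlib sketch on synthetic follow-up data (in practice one would use a library such as lifelines, which also handles the Cox model):

```python
# Minimal Kaplan-Meier product-limit estimator (synthetic data)
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = event occurred, 0 = censored.
    Returns [(time, survival probability)] at each event time."""
    pairs = sorted(zip(times, events))
    survival, curve = 1.0, []
    at_risk = len(pairs)
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = n_at_t = 0
        # pool all subjects sharing this time point
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            n_at_t += 1
            i += 1
        if deaths:  # censoring alone does not change the curve
            survival *= 1 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= n_at_t
    return curve

# Months until augmentation; 0 marks subjects censored before the event
curve = kaplan_meier([2, 3, 3, 4, 6, 8], [1, 1, 0, 1, 0, 1])
print(curve)
```

Each event time multiplies the running survival by the fraction of the current risk set that did not have the event; censored subjects leave the risk set without a step in the curve.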

RESULTS: Among 151,552 disabled beneficiaries, gabapentin initiators were more likely to augment therapy (50.1%) than opioid (28.7%) and benzodiazepine (38.7%) initiators. Compared with opioid initiators, the risk of augmentation (HR [95% CI]: 1.85 [1.82-1.89]) and switching (1.62 [1.51-1.73]) was significantly higher among gabapentin initiators. Risk of augmentation was also significantly higher among beneficiaries with co-morbid pain and mental health conditions (p < 0.01). Overall, the majority of beneficiaries augmented within 2 months and switched within 4 months of initiating therapy.

CONCLUSIONS: Given safety concerns associated with gabapentin, opioids, and benzodiazepines, it is imperative that the benefits and risks of co-prescribing these medications be examined comprehensively, especially for those in vulnerable sub-groups.

PMID:34847506 | DOI:10.1016/j.drugalcdep.2021.109180