
Medication overuse headache: The effectiveness of IV lidocaine-magnesium

Ideggyogy Sz. 2021 Sep 30;74(9-10):323-328. doi: 10.18071/isz.74.0323.

ABSTRACT

BACKGROUND AND PURPOSE: Detoxification is the most difficult part of treatment for patients with medication overuse headache. We aimed to investigate the effectiveness of a combination of low-dose IV lidocaine and magnesium (100 mg lidocaine and 1.25 mg magnesium) in patients with medication overuse headache during the detoxification process.

METHODS: A total of 30 patients were included in the study: 15 received 24-hour IV hydration alone, and 15 received a 1-hour lidocaine-magnesium infusion at the onset of pain in addition to the 24-hour IV hydration. Headache severity (numeric rating scale, NRS), attack duration, time of headache onset, monthly analgesic/triptan intake, and number of monthly headache days were documented. We evaluated headache severity before and after daily treatment in the two groups for one week.

RESULTS: When the two groups were compared, there was no significant difference in pre-treatment NRS values, whereas the group receiving the IV lidocaine-magnesium combination showed a statistically significant decrease in post-treatment NRS values during the first five days (p < 0.05).
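
As an illustration of the kind of between-group comparison reported above, here is a minimal sketch in Python, assuming a nonparametric Mann-Whitney U test on day-1 post-treatment NRS scores; the data are simulated, not the study's.

```python
# Hypothetical sketch: comparing daily post-treatment NRS scores between the
# hydration-only group and the lidocaine-magnesium group (made-up data).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# 15 patients per group, NRS (0-10) after treatment on day 1 (illustrative values only)
nrs_hydration = rng.integers(4, 9, size=15)
nrs_lido_mg = rng.integers(1, 6, size=15)

stat, p = mannwhitneyu(nrs_lido_mg, nrs_hydration, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.4f}")
```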

CONCLUSION: A 1-hour combined infusion of lidocaine-magnesium may be considered an alternative option that gives the patient a better-quality detoxification during the hospital stay; in parallel with the reduced use of multiple rescue treatments (such as neuroleptics, benzodiazepines, antiemetics and opioids) and the shorter length of stay, economic costs can also be reduced. Administering the two agents in combination is expected to cause fewer side effects than administering them separately.

PMID:34657403 | DOI:10.18071/isz.74.0323


Is there any difference between pregnancy results after tubal reanastomosis performed laparotomically, laparoscopically, and robotically?

Asian J Endosc Surg. 2021 Oct 17. doi: 10.1111/ases.12991. Online ahead of print.

ABSTRACT

INTRODUCTION: Tubal reanastomosis offers women who have undergone tubal sterilization the hope of conceiving again, but many factors affect the success of the procedure. In this study we aimed to compare pregnancy rates across the surgical methods used for tubal reanastomosis in women requesting pregnancy after tubal sterilization.

METHODS: We retrospectively compared pregnancy rates after reanastomosis in women under the age of 40 who underwent the procedure between 2010 and 2019 by laparotomic, laparoscopic, or robotic methods. A similar surgical technique was used in all three approaches: a single layer of four-quadrant 6/0 polydioxanone absorbable sutures.

RESULTS: Operation times differed significantly among the three surgical methods (laparotomy, laparoscopy, and robotic surgery) used for tubal reanastomosis (p < 0.05). Pregnancy rates for laparotomy, laparoscopy, and robotic surgery were 52.6% (n = 41), 67.3% (n = 37), and 61.2% (n = 63), respectively, with no statistically significant difference between groups. Relative to laparotomy, the odds ratios (OR) for conception were 1.536 (95% confidence interval [CI], 0.813-2.898) in the laparoscopy group and 1.111 (95% CI, 0.656-1.879) in the robotic group.
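
For readers unfamiliar with the odds ratios quoted above, the sketch below shows how an unadjusted OR and its 95% confidence interval are computed from a 2×2 table. The counts are hypothetical and are not intended to reproduce the study's estimates.

```python
# Hypothetical sketch: unadjusted odds ratio for pregnancy, one method vs. another,
# from a 2x2 table of made-up counts (group denominators are not given in the abstract).
import numpy as np
from scipy.stats import norm

preg_a, total_a = 37, 55   # method A: pregnancies / patients (hypothetical)
preg_b, total_b = 41, 78   # method B: pregnancies / patients (hypothetical)

a, b = preg_a, total_a - preg_a
c, d = preg_b, total_b - preg_b

or_ = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)                      # Woolf standard error
lo, hi = np.exp(np.log(or_) + np.array([-1, 1]) * norm.ppf(0.975) * se_log_or)
print(f"OR = {or_:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```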

CONCLUSIONS: Although there was no statistically significant difference in pregnancy rates among the surgical methods used for tubal reanastomosis, we think the laparoscopic method may be preferable because of the shorter hospital stay. We also think that the technique of the prior bilateral tubal ligation (BTL), the site of reanastomosis, and the interval between BTL and reanastomosis influence pregnancy success.

PMID:34657383 | DOI:10.1111/ases.12991


Metabolite signatures of heart failure, sleep apnoea, their interaction, and outcomes in the community

ESC Heart Fail. 2021 Oct 17. doi: 10.1002/ehf2.13631. Online ahead of print.

ABSTRACT

AIMS: Sleep apnoea and congestive heart failure (CHF) commonly co-exist, but their interaction is unclear. Metabolomics may clarify their interaction and relationships to outcome.

METHODS AND RESULTS: We assayed 372 circulating metabolites and lipids in 1919 and 1524 participants of the Framingham Heart Study (FHS) (mean age 54 ± 10 years, 53% women) and Women’s Health Initiative (WHI) (mean age 67 ± 7 years), respectively. We used linear and Cox regression to relate plasma concentrations of metabolites and lipids to echocardiographic parameters; CHF and its subtypes heart failure with reduced ejection fraction (HFrEF) and heart failure with preserved ejection fraction (HFpEF); and sleep indices. Adenosine diphosphate (ADP) was associated with left ventricular (LV) fractional shortening; phosphocreatine with LV wall thickness; the lysosomal storage molecule sphingomyelin 18:2 with LV mass; and the nicotine metabolite cotinine with time spent with an oxygen saturation below 90% (β = 2.3 min, P = 2.3 × 10⁻⁵). The pro-hypertrophic metabolite hydroxyglutarate partly mediated the association between LV wall thickness and HFpEF. Central sleep apnoea was significantly associated with HFpEF (P = 0.03) but not HFrEF (P = 0.5). There were three significant metabolite canonical variates, one of which conferred protection from cardiovascular death [hazard ratio = 0.3 (0.11, 0.81), P = 0.02].
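
A minimal sketch of the kind of covariate-adjusted linear model described above (plasma metabolite vs. an echocardiographic trait), using simulated data; the actual FHS/WHI covariate set, transformations and variable names are assumptions.

```python
# Hypothetical sketch: age/sex-adjusted linear regression of an echocardiographic
# trait (LV fractional shortening) on a standardised plasma metabolite (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age": rng.normal(54, 10, n),
    "sex": rng.integers(0, 2, n),
    "metabolite": rng.normal(0, 1, n),   # e.g. a log-transformed, standardised metabolite
})
df["lv_fs"] = 35 + 0.8 * df["metabolite"] - 0.05 * df["age"] + rng.normal(0, 3, n)

fit = smf.ols("lv_fs ~ metabolite + age + sex", data=df).fit()
print(fit.params["metabolite"], fit.pvalues["metabolite"])
```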

CONCLUSIONS: Energetic metabolites were associated with cardiac function, and energy- and lipid-storage metabolites with LV wall thickness and mass. Plasma levels of the nicotine metabolite cotinine were associated with increased time spent with an oxygen saturation below 90% during sleep, a clinically significant marker of outcome, indicating a particular hazard for smokers who have sleep apnoea.

PMID:34657379 | DOI:10.1002/ehf2.13631


A systematic, concept-based method of developing the exposure measure for drug safety and effectiveness studies

Pharmacoepidemiol Drug Saf. 2021 Oct 17. doi: 10.1002/pds.5372. Online ahead of print.

ABSTRACT

PURPOSE: In drug safety and effectiveness studies based on secondary data, the choice of an appropriate exposure measure for a given outcome can be challenging. Different measures of exposure can yield different estimates of treatment effect and safety. There is a knowledge gap with respect to developing and refining measures of drug exposure, to ensure that the exposure measure addresses the study question and is suitable for statistical analysis.

METHODS: We present a transparent, step-by-step approach to the development of drug exposure measures involving secondary data. This approach would be of interest to students and investigators with initial training in pharmacoepidemiology. We illustrate the approach using a study about Parkinson’s disease.

RESULTS: We described the exposure specifications according to the study question. Next, we refined the exposure measure by linking it to knowledge about four major concepts in drug safety and effectiveness studies: drug use patterns, duration, timing, and dose. We then used this knowledge to guide the ultimate choice of exposure measure: time-varying, cumulative 6-month exposure to tamsulosin (a drug used to treat prostate hyperplasia).
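
The sketch below illustrates one way a time-varying, cumulative 6-month exposure measure could be assembled from dispensing records; the column names and the simple 180-day rolling sum are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: building a time-varying, cumulative 6-month (180-day)
# exposure measure from dispensing records (column names and data are assumptions).
import pandas as pd

dispensings = pd.DataFrame({
    "fill_date": pd.to_datetime(["2020-01-05", "2020-02-10", "2020-05-01"]),
    "days_supply": [30, 30, 60],
})

# Expand each dispensing into one row per exposed day
exposed_days = pd.concat(
    pd.DataFrame({"date": pd.date_range(row.fill_date, periods=row.days_supply)})
    for row in dispensings.itertuples()
).drop_duplicates()

# Daily exposure indicator over follow-up, then a rolling 180-day sum
follow_up = pd.date_range("2020-01-01", "2020-12-31")
daily = pd.Series(0, index=follow_up)
daily.loc[daily.index.isin(exposed_days["date"])] = 1
cumulative_180d = daily.rolling(window=180, min_periods=1).sum()

print(cumulative_180d.loc["2020-06-30"])  # days exposed in the prior 6 months
```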

CONCLUSIONS: The proposed approach links exposure specifications to four major concepts in drug safety and effectiveness studies. Formulating subject-matter knowledge about these major concepts provides an avenue to develop the rationale and specifications for the exposure measure.

PMID:34657356 | DOI:10.1002/pds.5372


Cost-effectiveness of exergaming compared to regular day-care activities in dementia: Results of a randomised controlled trial in The Netherlands

Health Soc Care Community. 2021 Oct 17. doi: 10.1111/hsc.13608. Online ahead of print.

ABSTRACT

The growing number of people living with dementia will result in increased costs of dementia worldwide. The e-Health intervention ‘Exergaming’ may improve the health and quality of life of people with dementia, but its cost-effectiveness is unknown. We assessed the cost-effectiveness of exergaming compared to regular activities from a societal perspective in day-care centres (DCCs) for people with dementia and their informal caregivers (ICs) alongside a cluster randomised controlled trial. We included 112 dyads (person with dementia and IC) from 20 psychogeriatric DCCs (11 exergaming, 9 control) across the Netherlands. Exergaming consisted of interactive cycling at least twice a week for 6 months. Measurements were conducted at baseline (T0) and after 3 (T1) and 6 (T2) months. Primary outcomes were minutes of physical activity, mobility of the participants with dementia (Short Physical Performance Battery, SPPB), and Quality-Adjusted Life-Years (QALYs) of participants with dementia and ICs. ICs filled out cost diaries to measure healthcare and informal care utilisation during the study. There were no statistically significant differences in outcomes or costs between the groups at the level of participants with dementia, the ICs or the dyad. With regard to QALYs and the SPPB, the probability that exergaming is cost-effective compared to control was low at all possible willingness-to-pay (WTP) thresholds. For physical activity, however, at WTP thresholds of 0, 50 and 250 Euros per additional minute of physical activity, the probability of cost-effectiveness was 0.46, 0.84 and 0.87, respectively. Exergaming in DCCs was not cost-effective compared to usual activities. However, considering the small sample size and the large number of missing observations, the findings should be interpreted with caution. Future studies with larger samples are recommended to obtain definitive answers on the cost-effectiveness of exergaming. This trial was registered in the Netherlands Trial Register (NTR5537/NL5420).
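
As a sketch of how probabilities of cost-effectiveness like those reported above are typically obtained, the code below bootstraps incremental costs and effects and evaluates the net monetary benefit over a grid of willingness-to-pay thresholds; all data are simulated and the sample sizes are illustrative.

```python
# Hypothetical sketch: a cost-effectiveness acceptability curve (CEAC) from
# bootstrapped incremental costs and effects (all data simulated).
import numpy as np

rng = np.random.default_rng(2)
n = 56  # per arm (illustrative)
cost_exergame = rng.normal(12000, 4000, n)
cost_control  = rng.normal(11500, 4000, n)
eff_exergame  = rng.normal(95, 60, n)    # e.g. minutes of physical activity
eff_control   = rng.normal(80, 60, n)

wtp_grid = np.array([0, 50, 250])        # Euros per extra minute of activity
n_boot = 5000
prob_ce = np.zeros_like(wtp_grid, dtype=float)

for b in range(n_boot):
    i = rng.integers(0, n, n)            # resample each arm with replacement
    j = rng.integers(0, n, n)
    d_cost = cost_exergame[i].mean() - cost_control[j].mean()
    d_eff = eff_exergame[i].mean() - eff_control[j].mean()
    prob_ce += (wtp_grid * d_eff - d_cost) > 0   # positive net monetary benefit?

print(prob_ce / n_boot)  # probability cost-effective at each WTP threshold
```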

PMID:34657346 | DOI:10.1111/hsc.13608


Rhinosporidiosis – factors predicting disease recurrence

Mycoses. 2021 Oct 16. doi: 10.1111/myc.13381. Online ahead of print.

ABSTRACT

BACKGROUND: Rhinosporidiosis is a chronic granulomatous disease of the nose caused by Rhinosporidium seeberi. The disease is largely not amenable to medical therapy and shows high recurrence rates, requiring patients to undergo multiple surgeries and often resulting in increased morbidity.

OBJECTIVE: To analyze the epidemiological, clinical and histopathological characteristics, treatment and outcomes of rhinosporidiosis, and to identify factors that predispose to recurrence of the disease.

PATIENTS/METHODS: Retrospective analysis of data of all patients with a diagnosis of rhinosporidiosis confirmed by histopathology at a tertiary care hospital from 2015 to 2019.

RESULTS: There were 42 patients, 40 males and two females, with a mean age of 37.37 years. The disease showed bilateral involvement in 17 (40.48%) patients, and 19 (45.24%) patients had more than two sites involved at initial presentation. Most patients had nasal cavity involvement, followed by the nasopharynx. Among the 28 patients who had follow-up, 12 showed recurrent disease; 21 patients were disease-free following a revision excision. Involvement of more than two sites was an independent significant factor for recurrence. On univariate analysis, other factors associated with statistically significant odds of recurrence were previous surgery (p = 0.054) and involvement of the nasal septum (p = 0.022), middle turbinate (p = 0.024), nasopharynx (p = 0.049) and posterior pharyngeal wall (p = 0.05). Factors associated with a significantly lower likelihood of recurrence included a duration of less than 12 months from first symptom to intervention (p = 0.016), involvement of fewer than two sites (p = 0.0003) and unilateral disease (p = 0.019).
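
A minimal sketch of the kind of univariate association test reported above, using Fisher's exact test on a hypothetical 2×2 table of recurrence by number of involved sites; the counts are made up for illustration.

```python
# Hypothetical sketch: Fisher's exact test for recurrence by extent of involvement
# (counts are invented, not the study's data).
from scipy.stats import fisher_exact

#                 recurrence  no recurrence
table = [[9, 3],   # more than two sites involved
         [3, 13]]  # two or fewer sites involved
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```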

CONCLUSION: Early intervention in rhinosporidiosis, especially when the disease is unilateral and involves fewer than two sites, improves the outcome.

PMID:34657340 | DOI:10.1111/myc.13381


Young carers and educational engagement: Quantitative analysis of bursary applications in Australia

Health Soc Care Community. 2021 Oct 17. doi: 10.1111/hsc.13589. Online ahead of print.

ABSTRACT

Young carers support family members affected by disability or a health condition. The Young Carer Bursary Program aims to support young carers’ education. This paper analysed data from consenting bursary applicants (2017-2019) to investigate relationships between wellbeing, educational attendance, home study and other factors. Descriptive statistics, correlation and regression analyses were used to identify significant issues, relationships and influential factors related to young carers’ (N = 1,443) wellbeing and education. Sixty-eight percent were aged between 13 and 18 years and attended secondary school. One third of the sample reported that they were the main carer in their family, and 29% reported receiving no support. Female applicants from single-parent households who were the main carer attended educational settings less often. Eighteen percent (n = 267) rated their wellbeing as poor/very poor. Better wellbeing was associated with increased educational attendance (rs = 0.33, p < 0.001) and home study (rs = 0.34, p < 0.001). Wellbeing was associated with main carer status, caring for a parent, having a disability, being older and having few supports. Educational attendance was associated with main carer status, higher care load and fewer supports. Home study was associated with having a disability, caring for a sibling, caring for more than 11 hours per week and having fewer supports. The age, life situation and challenges of young carers identified in this paper indicate that further research into preferred supports and into the effectiveness of the bursary in improving educational engagement is warranted.
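
A minimal sketch of the Spearman rank correlation reported above, with simulated ordinal wellbeing and attendance scores; the scales and data are assumptions.

```python
# Hypothetical sketch: Spearman rank correlation between wellbeing and
# educational attendance (simulated ordinal data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
n = 1443
wellbeing = rng.integers(1, 6, n)                           # 1 = very poor ... 5 = very good
attendance = np.clip(wellbeing + rng.integers(-2, 3, n), 1, 5)

rs, p = spearmanr(wellbeing, attendance)
print(f"rs = {rs:.2f}, p = {p:.3g}")
```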

PMID:34657333 | DOI:10.1111/hsc.13589


A Guide to Pre-Processing High-Throughput Animal Tracking Data

J Anim Ecol. 2021 Oct 17. doi: 10.1111/1365-2656.13610. Online ahead of print.

ABSTRACT

1. Modern, high-throughput animal tracking increasingly yields ‘big data’ at very fine temporal scales. At these scales, location error can exceed the animal’s step size, leading to mis-estimation of behaviours inferred from movement. ‘Cleaning’ the data to reduce location errors is one of the main ways to deal with position uncertainty. Though data cleaning is widely recommended, inclusive, uniform guidance on this crucial step, and on how to organise the cleaning of massive datasets, is relatively scarce.
2. A pipeline for cleaning massive high-throughput datasets must balance ease of use and computational efficiency, rejecting location errors while preserving valid animal movements. A useful pre-processing pipeline also segments and clusters location data efficiently for statistical methods, scales to large datasets, and is robust to imperfect sampling. Because manual methods are prohibitively time-consuming, and to boost reproducibility, pre-processing pipelines must be automated.
3. We provide guidance on building pipelines for pre-processing high-throughput animal tracking data to prepare them for subsequent analyses. We apply our proposed pipeline to simulated movement data with location errors, and also show how large volumes of cleaned data can be transformed into biologically meaningful ‘residence patches’ for exploratory inference on animal space use. We use tracking data from the Wadden Sea ATLAS system (WATLAS) to show how pre-processing improves its quality and to verify the usefulness of the residence patch method. Finally, with tracks from Egyptian fruit bats (Rousettus aegyptiacus), we demonstrate the pre-processing pipeline and residence patch method in a fully worked example.
4. To help with fast implementation of standardised methods, we developed the R package atlastools, which we also introduce here. Our pre-processing pipeline and atlastools can be used with any high-throughput animal movement data in which the high data volume, combined with knowledge of the tracked individuals’ movement capacity, can be used to reduce location errors. atlastools is easy to use for beginners, while providing a template for further development. The common use of simple yet robust pre-processing steps promotes standardised methods in the field of movement ecology and leads to better inferences from data.
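
As a rough illustration of the filtering step described above, here is a simple speed filter in Python; the threshold, column names and coordinate system are assumptions, and this is not the atlastools API.

```python
# Hypothetical sketch: a simple speed filter for high-throughput tracking data,
# dropping fixes whose implied movement speed exceeds what the animal can do.
import numpy as np
import pandas as pd

def speed_filter(tracks: pd.DataFrame, max_speed: float) -> pd.DataFrame:
    """Drop fixes whose incoming speed (m/s) exceeds max_speed."""
    tracks = tracks.sort_values("time").reset_index(drop=True)
    dx = tracks["x"].diff()
    dy = tracks["y"].diff()
    dt = tracks["time"].diff().dt.total_seconds()
    speed = np.sqrt(dx**2 + dy**2) / dt
    keep = speed.isna() | (speed <= max_speed)   # keep the first fix
    return tracks[keep]

# Example with simulated coordinates in metres and a 20 m/s plausibility threshold
times = pd.date_range("2021-06-01", periods=5, freq="s")
df = pd.DataFrame({"time": times, "x": [0, 2, 500, 4, 6], "y": [0, 1, 500, 2, 3]})
print(speed_filter(df, max_speed=20))
```

A naive single-pass filter like this also discards the otherwise valid fix immediately after a spike; iterative or paired-speed filters address that, which is one reason a dedicated, tested pipeline is preferable.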

PMID:34657296 | DOI:10.1111/1365-2656.13610


Using Machine Learning to Advance Disparities Research: Subgroup Analyses of Access to Opioid Treatment

Health Serv Res. 2021 Oct 17. doi: 10.1111/1475-6773.13896. Online ahead of print.

ABSTRACT

OBJECTIVE: To operationalize an intersectionality framework using a novel statistical approach and, with these efforts, to improve the estimation of disparities beyond race in access (i.e., wait time to treatment entry) to opioid use disorder (OUD) treatment.

DATA SOURCE: A sample of 941,286 treatment episodes collected in 2015-2017 in the United States from the Treatment Episode Data Set: Admissions (TEDS-A), and subsets from California (n = 188,637) and Maryland (n = 184,276), the states with the largest samples of episodes.

STUDY DESIGN: This retrospective subgroup analysis used a two-step approach called virtual twins. In step 1, we trained a classification model that estimates the probability of waiting (1 day or more). In step 2, we identified subgroups in which differences due to race were most pronounced. We tested three classification models for step 1 and selected the one with the best estimation performance.
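
A minimal sketch of the two-step virtual-twins idea described above: a random forest estimates the probability of waiting, the prediction is contrasted with race toggled, and a shallow tree fit to that contrast surfaces candidate subgroups. The data and feature names below are simulated assumptions, not TEDS-A variables.

```python
# Hypothetical sketch of the "virtual twins" two-step subgroup analysis (simulated data).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(5)
n = 5000
X = pd.DataFrame({
    "race_minority": rng.integers(0, 2, n),
    "homeless": rng.integers(0, 2, n),
    "outpatient": rng.integers(0, 2, n),
})
p_wait = 0.3 + 0.2 * X["race_minority"] * X["homeless"]   # built-in interaction
y = rng.binomial(1, p_wait)

# Step 1: classification model for the probability of waiting 1 day or more
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Step 2: predicted probability with race toggled, other features held fixed
X1, X0 = X.copy(), X.copy()
X1["race_minority"], X0["race_minority"] = 1, 0
delta = rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]

# A shallow tree on the contrast highlights subgroups with larger estimated disparity
tree = DecisionTreeRegressor(max_depth=2, random_state=0)
tree.fit(X.drop(columns="race_minority"), delta)
print(export_text(tree, feature_names=["homeless", "outpatient"]))
```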

DATA COLLECTION: Client data were collected by states during personal interviews at admission and discharge.

PRINCIPAL FINDINGS: Random forest was the most accurate model for the first step of subgroup analysis. We found large variation across states in racial disparities. Stratified analysis of two states with the largest samples showed critical factors that augmented disparities beyond race. In California, factors such as service setting, referral source, and homelessness defined the subgroup most vulnerable to racial disparities. In Maryland, service setting, prior episodes, receipt of medication-assisted opioid treatment, and primary drug use frequency augmented disparities beyond race. The identified subgroups had significantly larger racial disparities.

CONCLUSIONS: The methodology used in this study enabled a nuanced understanding of the complexities in disparities research. We found state and service factors that intersected with race and augmented disparities in wait time. Findings can help decision makers target modifiable factors that make subgroups vulnerable to waiting longer to enter treatment.

PMID:34657287 | DOI:10.1111/1475-6773.13896


A proposed approach for the determination of the bioequivalence acceptance range for narrow therapeutic index drugs in the European Union

Clin Pharmacol Ther. 2021 Oct 17. doi: 10.1002/cpt.2451. Online ahead of print.

ABSTRACT

The current regulatory criterion for bioequivalence of narrow therapeutic index (NTI) drugs in the European Union requires that the 90% confidence interval for the ratio of the population geometric means of the test product to the reference product for AUC, and in certain cases Cmax, be included within the tighter acceptance range of 90.00-111.11%. As a consequence, sponsors need to recruit a larger number of subjects to demonstrate bioequivalence, which may be seen as increasing the burden for the development of generics. This “one-size-fits-all” criterion is particularly questionable when the within-subject variability of the reference product is moderate to high. As an alternative, we propose a refined statistical approach in which the acceptance range is narrowed based on the within-subject variability of the reference product of the NTI drug, similar to the approach used for widening the standard 80.00-125.00% acceptance range for highly variable drugs. The 80.00-125.00% acceptance range is narrowed only when the within-subject variability is lower than 30%, down to the current NTI acceptance range of 90.00-111.11% when the within-subject variability is 13.93% or lower. Examples from the current EMA list of NTI drugs show a considerable reduction in the required sample size for drugs such as tacrolimus and colchicine, for which the predicted within-subject variability is 20-30%. In these cases, the approach demands smaller sample sizes without any expected increase in therapeutic risk, since patients treated with reference products with moderate-to-high within-subject variability are frequently exposed to bioavailability differences larger than 10%.
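
The sketch below reproduces the arithmetic implied by the figures quoted above: acceptance limits of exp(±k·s_wR) with the usual reference-scaling constant k ≈ 0.760, capped at 90.00-111.11% and 80.00-125.00%. The exact constant and the capping rule are inferred from the stated cut-offs (13.93% and 30%), not taken from the paper.

```python
# Hypothetical sketch: scaled narrowing of the bioequivalence acceptance range
# as a function of the reference product's within-subject CV (CVwR, as a fraction).
import numpy as np

# k chosen so that the limit reaches 125% exactly at CVwR = 30% (k ≈ 0.760)
K = np.log(1.25) / np.sqrt(np.log(0.30**2 + 1))

def acceptance_range(cv_wr: float) -> tuple:
    """Return (lower, upper) acceptance limits in % for a given CVwR."""
    s_wr = np.sqrt(np.log(cv_wr**2 + 1))
    upper = np.exp(K * s_wr)
    upper = min(max(upper, 10 / 9), 1.25)   # clamp between 111.11% and 125.00%
    return round(float(100 / upper), 2), round(float(100 * upper), 2)

for cv in (0.10, 0.1393, 0.20, 0.30):
    print(f"CVwR = {cv:.2%}: acceptance range = {acceptance_range(cv)}")
```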

PMID:34657284 | DOI:10.1002/cpt.2451