Categories
Nevin Manimala Statistics

Evaluation of the Validity of a Food Frequency Questionnaire and 24-Hour Dietary Recall to Assess Dietary Iron Intake in Children and Adolescents from the South American Youth/Child Cardiovascular and Environmental Study

J Acad Nutr Diet. 2021 Aug 24:S2212-2672(21)00831-5. doi: 10.1016/j.jand.2021.07.005. Online ahead of print.

ABSTRACT

BACKGROUND: A food frequency questionnaire (FFQ) for South American children and adolescents was developed, but its validity for assessing dietary iron intake has not been evaluated.

OBJECTIVE: To evaluate the validity of the FFQ and 24-hour dietary recalls (24h-DR) for assessing dietary iron intake in children and adolescents.

DESIGN: The South American Youth/Child Cardiovascular and Environmental study is a multicenter observational study conducted in five South American cities: Buenos Aires (Argentina), Lima (Peru), Medellín (Colombia), and São Paulo and Teresina (Brazil). The FFQ assessed dietary intake over the previous 3 months, and the 24h-DR was completed three times (2 weekdays and 1 weekend day) with a minimum 5-day interval between recalls. Blood samples were collected to assess serum iron, ferritin, and hemoglobin levels.

PARTICIPANTS AND SETTING: Data of 99 children (aged 3 to 10 years) and 50 adolescents (aged 11 to 17 years) from public and private schools were collected during 2015 to 2017.

MAIN OUTCOME MEASURES: Dietary iron intake calculated from the FFQ (using the sum of daily iron intake in all food/food groups) and 24h-DR (mean of 3 days using the multiple source method).

STATISTICAL ANALYSES PERFORMED: Dietary iron intake in relation to blood biomarkers were assessed using Spearman rank correlations adjusted for sex, age, and total energy intake, and the quadratic weighted κ coefficients for agreement.
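
The quadratic weighted κ used here penalizes disagreements by the squared distance between categories, so near-misses count less than gross misclassification. A minimal pure-Python sketch (coding the intake categories as 0..k-1 is an assumption; the study's category scheme is not given in the abstract):

```python
from collections import Counter

def quadratic_weighted_kappa(a, b, k):
    """Quadratic weighted kappa for two raters assigning categories 0..k-1."""
    n = len(a)
    obs = Counter(zip(a, b))                          # observed cell counts
    pa = Counter(a)                                   # marginal counts, rater A
    pb = Counter(b)                                   # marginal counts, rater B
    w = lambda i, j: (i - j) ** 2 / (k - 1) ** 2      # quadratic disagreement weight
    num = sum(w(i, j) * obs[(i, j)] for i in range(k) for j in range(k))
    den = sum(w(i, j) * pa[i] * pb[j] / n for i in range(k) for j in range(k))
    return 1 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # perfect agreement -> 1.0
```

Perfect agreement gives κ = 1 and systematic reversal gives κ = -1, which is what makes κ preferable to raw percent agreement for ranking methods.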

RESULTS: Spearman correlations showed very good coefficients (range = 0.78 to 0.85) for the FFQ in both age groups; for the 24h-DR, the coefficients were weak in children and adolescents (range = 0.23 to 0.28). The agreement ranged from 59.9% to 72.9% for the FFQ and from 63.9% to 81.9% for the 24h-DR.

CONCLUSION: The South American Youth/Child Cardiovascular and Environmental study FFQ exhibited good validity for ranking total dietary iron intake in children and adolescents and, like the 24h-DR, showed good strength of agreement when compared with serum iron and ferritin levels.

PMID:34463258 | DOI:10.1016/j.jand.2021.07.005

Lower limb lymphedema staging based on magnetic resonance lymphangiography

J Vasc Surg Venous Lymphat Disord. 2021 Jul 8:S2213-333X(21)00301-2. doi: 10.1016/j.jvsv.2021.06.006. Online ahead of print.

ABSTRACT

OBJECTIVE: Dermal backflow (DBF) and reduced lymphatic visualization are common findings of lymphedema on various imaging modalities. However, little is known about how these findings vary with the anatomic location and severity of lymphedema, and previous reports using indocyanine green lymphography or lymphoscintigraphy show variable results. Magnetic resonance lymphangiography (MRL) is expected to clarify this clinical question because of its superior ability to visualize lymphatics. This retrospective study aimed to investigate the following: (1) Are there characteristic patterns of DBF and lymphatic visualization depending on the anatomic location within the lower limb and the severity of lymphedema? (2) Is it possible to classify the severity of lymphedema based on MRL findings?

METHODS: Two radiologists performed consensus readings of MRL of 56 patients (112 limbs) with lower-limb lymphedema. The frequency of visualized DBF and lymphatics was analyzed in six regions in each lower limb. The results were compared with the International Society of Lymphology clinical stages and etiology of lymphedema. Characteristic findings were categorized and compared with the clinical stage and duration of lymphedema.

RESULTS: DBF and lymphatics were observed more frequently in the distal regions than the proximal regions of lower limbs. DBF appeared more frequently as the clinical stage increased, reaching statistical significance (P < 10-3) between stages 0 or I and II. DBF above the knee joint was rarely observed (0.48%) in early stages (0 and I) but appeared more frequently (13.5%, P < 10-5) in stage II. Lymphatics appeared less frequently as the stage progressed, with significant differences (P < .05) between stages I and II and between II and III. The frequency of lymphatics above the knee joint decreased significantly (P < .05) between stages I and II and between II and III as the stage progressed, reaching 0% in stage III. An MRL staging was proposed and showed significant positive correlations with the clinical stage (r = 0.79, P < .01) and the duration of lymphedema (r = 0.57, P < .01).
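
The reported correlation between the proposed MRL stage and the clinical stage (r = 0.79) is a Spearman rank correlation, which can be computed on any pair of ordinal stage assignments. A sketch with `scipy.stats.spearmanr` on illustrative values (not study data):

```python
from scipy.stats import spearmanr

# Hypothetical paired stage assignments (MRL stage vs. clinical stage);
# illustrative values only, not the study's data.
mrl_stage      = [0, 1, 1, 2, 2, 3, 3, 4]
clinical_stage = [0, 0, 1, 1, 2, 2, 3, 3]

rho, p = spearmanr(mrl_stage, clinical_stage)
print(round(rho, 2))  # 0.92
```

Spearman's ρ works on ranks, so it tolerates the tied, ordinal nature of staging data where Pearson's r would assume interval-scaled measurements.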

CONCLUSIONS: MRL-specific patterns of DBF and lymphatics that depended on the site within the lower limb and clinical stage were shown. The DBF pattern differed from those observed in previous studies with other imaging techniques. The proposed MRL staging based on these characteristic findings allows new stratification of patients with lymphedema. Combined with its excellent ability to visualize lymphatic anatomy, MRL could enable a more detailed understanding of individual patient’s pathology, useful for determining the most appropriate treatment.

PMID:34463259 | DOI:10.1016/j.jvsv.2021.06.006

Promoter sequence interaction and structure based multi-targeted (redox regulatory genes) molecular docking analysis of vitamin E and curcumin in T4 induced oxidative stress model using H9C2 cardiac cell line

J Biomol Struct Dyn. 2021 Aug 31:1-20. doi: 10.1080/07391102.2021.1970624. Online ahead of print.

ABSTRACT

A positive association between oxidative stress and hyperthyroid conditions is well established. Vitamin E (VIT-E) and curcumin (CRM) are considered potent antioxidant small molecules. Nuclear factor erythroid 2-related factor 2 (NRF-2) is known to bind the antioxidant response element and subsequently activate expression of antioxidant enzymes. However, the activation of NRF-2 depends on removal of its regulator, Kelch-like ECH-associated protein 1 (KEAP-1). In the current study, an attempt is made to demonstrate, by in silico analysis, whether the effects of VIT-E and CRM are due to direct interaction with the target proteins (i.e., NRF-2, KEAP-1, SOD, catalase, and LDH) or to possible interaction with the flanking region of their promoters. Further, these results were corroborated by pretreatment of H9C2 cells (1 × 10⁶ cells per mL of media) with VIT-E (50 μM) and/or CRM (20 μM) for 24 h, followed by induction of oxidative stress via T4 (100 nM) administration and assaying of active oxygen metabolism. Discriminant function analyses (DFA) indicated that T4 has a definite role in increasing oxidative stress, as evidenced by induction of ROS generation, an increase in mitochondrial membrane potential, and elevated lipid peroxidation (LPx). Pretreatment with the two antioxidants had ameliorative effects, more so when given in combination. The decline in biological activities of the principal antioxidant enzymes SOD and CAT with T4 treatment, and its restoration in the antioxidant-pretreated groups, further validated our in silico data. Communicated by Ramaswamy H. Sarma.

PMID:34463220 | DOI:10.1080/07391102.2021.1970624

Evaluation of the Validity and Feasibility of the GLIM Criteria Compared with PG-SGA to Diagnose Malnutrition in Relation to One-Year Mortality in Hospitalized Patients

J Acad Nutr Diet. 2021 Aug 24:S2212-2672(21)01037-6. doi: 10.1016/j.jand.2021.07.011. Online ahead of print.

ABSTRACT

BACKGROUND: The Global Leadership Initiative on Malnutrition (GLIM) approach to diagnose malnutrition was published in 2018. An important next step is to use the GLIM criteria in clinical investigations to assess their validity and feasibility.

OBJECTIVE: To compare the validity and feasibility of the GLIM criteria with Patient-Generated Subjective Global Assessment (PG-SGA) in hospitalized patients and to assess the association between malnutrition and 1-year mortality.

DESIGN: Post hoc analysis of a prospective cohort study.

PARTICIPANTS/SETTING: Hospitalized patients (n = 574) from the Departments of Gastroenterology, Gynecology, Urology, and Orthopedics at Radboudumc, an academic hospital in Nijmegen, The Netherlands, were enrolled from July 2015 through December 2016.

MAIN OUTCOME MEASURES: The GLIM criteria and PG-SGA were applied to identify malnourished patients. Mortality rates were collected from electronic patient records. Feasibility was assessed by evaluating the amount of and reasons for missing data.

STATISTICAL ANALYSES PERFORMED: Concurrent validity was evaluated by assessing the sensitivity, specificity, and Cohen’s kappa coefficient for the GLIM criteria compared with PG-SGA. Cox regression analysis was used for the association between the GLIM criteria and PG-SGA and mortality.
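
Sensitivity, specificity, and Cohen's κ all derive from the same 2x2 cross-classification of the index test (GLIM) against the reference (PG-SGA). A pure-Python sketch; the cell counts below are not reported in the abstract and are hypothetical values chosen to be consistent with the reported marginals (160 GLIM-positive, 172 PG-SGA-positive, n = 574):

```python
def concurrent_validity(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa of an index test
    (here GLIM) against a reference standard (here PG-SGA)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                              # true positive rate
    spec = tn / (tn + fp)                              # true negative rate
    po = (tp + tn) / n                                 # observed agreement
    pe = ((tp + fp) * (tp + fn)                        # chance agreement:
          + (fn + tn) * (fp + tn)) / n ** 2            #   both-pos + both-neg
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

# Hypothetical 2x2 counts consistent with the study's marginals.
sens, spec, kappa = concurrent_validity(tp=74, fp=86, fn=98, tn=316)
print(round(sens, 2), round(spec, 2), round(kappa, 2))  # 0.43 0.79 0.22
```

With these counts the sketch reproduces the reported sensitivity (43%), specificity (79%), and κ (0.22), illustrating how 68% raw agreement can still yield low chance-corrected agreement.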

RESULTS: Of 574 patients, 160 (28%) were classified as malnourished according to the GLIM criteria and 172 (30%) according to PG-SGA (κ = 0.22, low agreement). When compared with PG-SGA, the GLIM criteria had a sensitivity of 43% and a specificity of 79%. Mortality of malnourished patients was more than two times higher than that of non-malnourished patients according to the GLIM criteria (hazard ratio [HR], 2.68; confidence interval [CI], 1.33-5.41). Data on muscle mass were missing in 454 of 574 (79%) patients because of practical problems with the assessment using bioimpedance analysis (BIA).

CONCLUSIONS: Agreement between GLIM criteria and PG-SGA was low when diagnosing malnutrition, indicating that the two methods do not identify the same patients. This is supported by the GLIM criteria showing predictive power for 1-year mortality in hospitalized patients in contrast to PG-SGA. The assessment of muscle mass using BIA was difficult to perform in this clinical population.

PMID:34463257 | DOI:10.1016/j.jand.2021.07.011

Spatial Heterogeneity of Sympatric Tick Species and Tick-Borne Pathogens Emphasizes the Need for Surveillance for Effective Tick Control

Vector Borne Zoonotic Dis. 2021 Aug 31. doi: 10.1089/vbz.2021.0027. Online ahead of print.

ABSTRACT

Three tick species that can transmit pathogens causing disease are commonly found parasitizing people and animals in the mid-Atlantic United States: the blacklegged tick (Ixodes scapularis Say), the American dog tick (Dermacentor variabilis [Say]), and the lone star tick (Amblyomma americanum [L.]) (Acari: Ixodidae). The potential risk of pathogen transmission from tick bites acquired at schools in tick-endemic areas is a concern, as school-aged children are a high-risk group for tick-borne disease. Integrated pest management (IPM) is often required in school districts, and continued tick range expansion and population growth will likely necessitate IPM strategies to manage ticks on school grounds. However, an often-overlooked step of tick management is monitoring and assessment of local tick species assemblages to inform the selection of control methodologies. The purpose of this study was to evaluate tick species presence, abundance, and distribution and the prevalence of tick-borne pathogens in both questing ticks and those removed from rodent hosts on six school properties in Maryland. Overall, there was extensive heterogeneity in tick species dominance, abundance, and evenness across the field sites. A. americanum and I. scapularis were found on all sites in all years. Overall, A. americanum was the dominant tick species. D. variabilis was collected in limited numbers. Several pathogens were found in both questing ticks and those removed from rodent hosts, although prevalence of infection was not consistent between years. Borrelia burgdorferi, Ehrlichia chaffeensis, Ehrlichia ewingii, and the "Panola Mountain" Ehrlichia were identified in questing ticks, and B. burgdorferi and Borrelia miyamotoi were detected in trapped Peromyscus spp. mice. B. burgdorferi was the dominant pathogen detected. The impact of tick diversity on IPM of ticks is discussed.

PMID:34463140 | DOI:10.1089/vbz.2021.0027

Late-occurring Venous Thromboembolism in Allogeneic Blood or Marrow Transplant Survivors – a BMTSS-HiGHS2 Risk Model

Blood Adv. 2021 Aug 30:bloodadvances.2021004341. doi: 10.1182/bloodadvances.2021004341. Online ahead of print.

ABSTRACT

BACKGROUND: Allogeneic blood or marrow transplant (BMT) recipients are at risk for venous thromboembolism (VTE) because of high-intensity therapeutic exposures, comorbidities, and a pro-inflammatory state due to chronic graft-versus-host disease (GvHD). The long-term risk of VTE in allogeneic BMT survivors remains unstudied.

METHODS: Participants were drawn from the BMT Survivor Study (BMTSS), a retrospective cohort study that included patients who underwent transplantation between 1974 and 2014 and survived ≥2y after BMT. The BMTSS survey collected information on sociodemographics, health behaviors and chronic health conditions along with age at diagnosis. Details regarding primary cancer diagnosis, transplant preparative regimens, type of transplant and stem cell source were obtained from institutional databases and medical records. We analyzed the risk of VTE in 1,554 2y survivors of allogeneic BMT compared to 907 siblings. Using backward variable selection guided by minimizing Akaike’s information criterion, we created a prediction model for risk of late-occurring VTE.
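
Backward selection guided by Akaike's information criterion drops, one at a time, whichever variable most lowers the AIC of the refitted model, stopping when no removal helps. A generic sketch; the feature names and the toy scoring function below are hypothetical stand-ins (the study's actual models were survival models fit to the cohort):

```python
def backward_select(candidates, fit_aic):
    """Backward elimination minimizing AIC. `fit_aic(features)` must
    refit the model on the given feature set and return its AIC."""
    current = list(candidates)
    best_aic = fit_aic(current)
    improved = True
    while improved and len(current) > 1:
        improved = False
        for f in list(current):
            trial = [x for x in current if x != f]
            aic = fit_aic(trial)
            if aic < best_aic:          # dropping f lowers AIC: keep the drop
                best_aic, current, improved = aic, trial, True
                break
    return current, best_aic

# Toy AIC: 2 per retained feature, plus a large penalty for losing
# a truly predictive feature. Purely illustrative.
toy_aic = lambda feats: 2 * len(feats) + sum(
    100 for needed in ('stroke', 'gvhd') if needed not in feats)

kept, aic = backward_select(['stroke', 'gvhd', 'noise1', 'noise2'], toy_aic)
print(kept)  # ['stroke', 'gvhd']
```

The toy scoring function makes the two "real" predictors expensive to drop, so the procedure prunes only the noise variables, mirroring how AIC trades fit against model size.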

RESULTS: Allogeneic BMT survivors had a 7.3-fold higher risk of VTE compared to siblings (95%CI: 4.69-11.46, p<0.0001). After a median follow-up of 11y (interquartile range: 6-18y), and conditional on surviving the first 2y after BMT, the cumulative incidence of late-occurring VTE was 2.4% at 5y, 4.9% at 10y and 7.1% at 20y after BMT. Older age at BMT (hazard ratio [HR]=1.02/y, 95%CI=1.01-1.04, p=0.002), use of immunosuppressive medications (HR=2.28, 95%CI=1.41-3.38, p=0.0008), obesity (HR=1.06/unit increase in body mass index, 95%CI=1.02-1.10, p=0.002), history of stroke (HR=3.71, 95%CI=1.66-8.27, p=0.001), chronic GvHD (HR=1.62, 95%CI=1.00-2.60, p=0.049), and use of peripheral blood stem cells (PBSCs) as the source of stem cells compared to bone marrow (HR=2.73, 95%CI=1.65-4.50, p<0.0001) were associated with increased VTE risk. The final model for VTE risk applied at 2y post-BMT ("HiGHS2") included History of stroke, chronic GvHD, Hypertension, Sex (male vs. female) and Stem cell source (PBSCs vs. other) (corrected C-statistic: 0.73; 95%CI=0.67-0.79), and was able to classify patients at high and low VTE risk (10y cumulative incidence 9.3% vs. 2.4%, p<0.0001).
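
Cumulative incidence conditional on 2-year survival is, in the simplest case, the complement of a Kaplan-Meier survival estimate over the post-landmark follow-up. A pure-Python sketch with toy data (this ignores competing risks such as death, which a full analysis would handle with a cumulative incidence function):

```python
def km_cumulative_incidence(times, events, t):
    """1 - Kaplan-Meier survival at time t. `times`/`events` are per
    subject: event=1 for VTE, 0 for censoring. Competing risks ignored."""
    event_times = sorted(set(tt for tt, e in zip(times, events)
                             if e == 1 and tt <= t))
    s = 1.0
    for u in event_times:
        at_risk = sum(1 for tt in times if tt >= u)   # still under observation
        d = sum(1 for tt, e in zip(times, events) if tt == u and e == 1)
        s *= 1 - d / at_risk                          # KM product-limit step
    return 1 - s

# Toy follow-up: years since the 2y landmark; illustrative only.
times  = [2, 4, 6, 8]
events = [1, 0, 1, 0]
print(km_cumulative_incidence(times, events, 5))  # 0.25
```

The product-limit form is what lets censored subjects (events=0) contribute person-time without being counted as failures.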

CONCLUSIONS: The BMTSS HiGHS2 risk model when applied at 2y post-BMT can be used to inform targeted prevention strategies for patients at high risk for late-occurring VTE.

PMID:34461633 | DOI:10.1182/bloodadvances.2021004341

Factors Associated with Abortion Complications after the Implementation of a Surveillance Network (MUSA Network) in a University Hospital

Rev Bras Ginecol Obstet. 2021 Jul;43(7):507-512. doi: 10.1055/s-0041-1735129. Epub 2021 Aug 30.

ABSTRACT

OBJECTIVE: To evaluate the factors associated with abortion complications following the implementation of the good-practice surveillance network Mujeres en Situación de Aborto (Women Undergoing Abortion, MUSA, in Spanish).

METHODS: A cross-sectional study of women of any age who underwent abortion from any cause at the UNICAMP Women's Hospital (part of the MUSA network), Campinas, Brazil, between July 2017 and August 2019. The dependent variable was the presence of any abortion-related complication during hospitalization. The independent variables were clinical and sociodemographic data. The chi-square test, the Mann-Whitney test, and multiple logistic regression were used for the statistical analysis.
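
For a binary exposure such as contraceptive failure, the chi-square test and a crude odds ratio both come from the same 2x2 table. A sketch with `scipy.stats`; the counts are illustrative, not the study's, and the reported OR of 3.4 came from multiple logistic regression, which additionally adjusts for covariates:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = contraceptive failure yes/no,
# columns = complication yes/no. Illustrative counts only.
table = [[10, 81],
         [13, 201]]
chi2, p, dof, expected = chi2_contingency(table)

# Crude (unadjusted) odds ratio from the cross-product of the table.
odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
print(dof, round(odds_ratio, 2))  # 1 1.91
```

Comparing the crude OR with the adjusted one from the logistic model is a standard check for confounding by the other covariates.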

RESULTS: Overall, 305 women were enrolled (mean ± standard deviation [SD] for age: 29.79 ± 7.54 years). The mean gestational age was 11.17 (±3.63) weeks. Accidental pregnancy occurred in 196 (64.5%) cases, 91 (29.8%) due to contraception failure. At least 1 complication was observed in 23 (7.54%) women, and 8 (34.8%) of them had more than 1. The most frequent complications were excessive bleeding and infection. The factors independently associated with a higher prevalence of complications were higher gestational ages (odds ratio [OR]: 1.22; 95% confidence interval [95%CI]: 1.09 to 1.37) and contraceptive failure (OR: 3.4; 95%CI: 1.32 to 8.71).

CONCLUSION: Higher gestational age and contraceptive failure were associated with a higher prevalence of complications. This information obtained through the surveillance network can be used to improve care, particularly in women more susceptible to unfavorable outcomes.

PMID:34461660 | DOI:10.1055/s-0041-1735129

Association between Pancreatic Burnout and Liver Cirrhosis in Alcoholic Chronic Pancreatitis

Digestion. 2021 Aug 30:1-8. doi: 10.1159/000516482. Online ahead of print.

ABSTRACT

BACKGROUND/OBJECTIVES: In chronic pancreatitis (CP), progressive fibrosis of the pancreas leads to exocrine and endocrine insufficiency and, finally, to pancreatic burnout. Alcohol consumption is associated with fibrosis in the pancreas and the liver, and the activation of stellate cells plays a central role in the induction of fibrosis in both organs. However, the relationship between pancreatic burnout and liver cirrhosis (LC) is still poorly understood in patients with alcoholic CP (ACP).

METHODS: We performed a single-center, retrospective, cross-sectional study of 537 CP patients. We assessed the clinical presence of early and advanced pancreatic burnout and diagnosed LC when typical alterations were present on histology, liver stiffness measurement, cross-sectional imaging, or ultrasound. Further clinical parameters were also analyzed.

RESULTS: The frequency of advanced pancreatic burnout was 6.5% for ACP (20/306) and 4% for non-ACP (8/206; p = 0.20; χ2 test). Advanced pancreatic burnout was not associated with the amount of alcohol consumption (p = 0.34) but was associated with disease duration (p = 0.047) and rate of calcification (p = 0.0056). Furthermore, advanced pancreatic burnout was associated with LC (p < 0.0001), an association that could not be explained by the amount of alcohol consumption. In ACP with alcohol consumption >80 g/day, isolated LC (without pancreatic burnout) was detected significantly more frequently (14%) than isolated advanced pancreatic burnout (without LC; 1%). These results were confirmed by multivariable analyses.

CONCLUSIONS: We identified a close association between LC and pancreatic burnout. The disease duration positively correlates with the development of pancreatic burnout. The liver seems to be more vulnerable to alcohol than the pancreas.

PMID:34461618 | DOI:10.1159/000516482

Financial Analysis of Cardiac Rehabilitation and the Impact of COVID-19

J Cardiopulm Rehabil Prev. 2021 Sep 1;41(5):308-314. doi: 10.1097/HCR.0000000000000643.

ABSTRACT

PURPOSE: Provision of phase 2 cardiac rehabilitation (CR) has been directly impacted by coronavirus disease-19 (COVID-19). Economic analyses to date have not identified the financial implications of pandemic-related changes to CR. The aim of this study was to compare the costs and reimbursements of CR between two periods: (1) pre-COVID-19 and (2) during the COVID-19 pandemic.

METHODS: Health care costs of providing CR were calculated using a microcosting approach. Unit costs of CR were based on staff time, consumables, and overhead costs. Reimbursement rates were derived from commercial and public health insurance. The mean cost and reimbursement/participant were calculated. Staff and participant COVID-19 infections were also examined.
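
The microcosting approach described here builds the unit cost bottom-up from its components. A minimal sketch; treating overhead as a fixed fraction of direct costs is an assumption (programs allocate overhead in different ways), and all figures are illustrative, not the study's:

```python
def unit_cost(staff_hours, wage, consumables, overhead_rate, participants):
    """Bottom-up (microcosting) cost per CR participant for one period.
    Overhead is applied as a fraction of direct costs - an assumed
    allocation rule, not necessarily the study's."""
    direct = staff_hours * wage + consumables     # staff time + supplies
    total = direct * (1 + overhead_rate)          # add allocated overhead
    return total / participants

# Illustrative monthly figures only.
print(round(unit_cost(staff_hours=400, wage=45.0, consumables=2000.0,
                      overhead_rate=0.35, participants=9), 2))  # 3000.0
```

Because the denominator is enrollment, the same fixed costs spread over fewer participants raise the unit cost, which is exactly the pandemic-era dynamic the results describe.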

RESULTS: The mean number of CR participants enrolled/mo declined during the pandemic (-10%; 33.8 ± 2.0 vs 30.5 ± 3.2, P = .39), the mean cost/participant increased marginally (+13%; $2897 ± $131 vs $3265 ± $149, P = .09), and the mean reimbursement/participant decreased slightly (-4%; $2959 ± $224 vs $2844 ± $181, P = .70). However, these differences did not reach statistical significance. The pre-COVID mean operating surplus/participant ($62 ± $140) eroded into a deficit of -$421 ± $170/participant during the pandemic. No known COVID-19 infections occurred among the 183 participants and 14 on-site staff members during the pandemic period.

CONCLUSIONS: COVID-19-related safety protocols required CR programs to modify service delivery. Results demonstrate that it was possible to safely maintain this critically important service; however, CR program costs exceeded revenues. The challenge going forward is to optimize CR service delivery to increase participation and achieve financial solvency.

PMID:34461621 | DOI:10.1097/HCR.0000000000000643

Firearm injuries associated with law enforcement activity

J Forensic Leg Med. 2021 Aug 27;83:102249. doi: 10.1016/j.jflm.2021.102249. Online ahead of print.

ABSTRACT

BACKGROUND: Law enforcement activity can involve firearms, and either a civilian or the law enforcement officer can be injured. The purpose of this study was to characterize the injuries and demographics associated with law enforcement firearm activity across the entire US using a national database.

METHODS: Data from the Inter-University Consortium for Political and Social Research Firearm Injury Surveillance Study, 1993-2015, were used. Law enforcement involvement and other demographic variables were ascertained. Statistical analyses were performed accounting for the weighted, stratified nature of the data. P < 0.05 was considered statistically significant.
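
In a weighted surveillance sample, each sampled ED case carries a survey weight equal to the number of national cases it represents; national totals and proportions are then weighted sums. A minimal sketch of the estimator (weights and flags below are illustrative; a full analysis would also use the strata to compute correct variances):

```python
def weighted_proportion(flags, weights):
    """National estimate from a weighted sample: returns the weighted
    proportion of flagged cases and the estimated national total."""
    total = sum(weights)                               # estimated total cases
    hits = sum(w for f, w in zip(flags, weights) if f) # estimated flagged cases
    return hits / total, total

# Illustrative records: (law_enforcement_involved, survey_weight)
records = [(True, 50.0), (False, 120.0), (False, 80.0), (True, 30.0)]
flags, weights = zip(*records)
prop, n_hat = weighted_proportion(flags, weights)
print(round(prop, 3), n_hat)  # 0.286 280.0
```

This is how a few tens of thousands of sampled visits scale up to the estimated 2,667,896 national ED visits reported in the results.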

RESULTS: There were an estimated 2,667,896 ED visits for injuries due to firearms; 1.7% (45,497) were associated with law enforcement. Those involved with law enforcement were older (33.2 vs 29.8 years), more often male (90.7 vs 86.8%) and White (52.9 vs 37.2%), more often injured by a handgun (80.3 vs 71.5%), had more upper trunk injuries (25.2 vs 16.2%) and fewer lower extremity injuries (15.1% vs 25.9%), and had more fatalities (10.0 vs 6.2%). An argument, crime, fight, and drug involvement were all more common in the law enforcement group. Within the law enforcement group, when the injured patient was the civilian and not the officer, the patient was more commonly Black and male, sustained more trunk injuries and fewer extremity injuries, and was more frequently admitted to the hospital. The civilian group had fewer upper extremity (11.7% vs 29.7%) and lower extremity (12.2% vs 23.7%) injuries, more lower trunk (14.6% vs 8.0%) and upper trunk (31.3% vs 7.8%) injuries, and a similar proportion of head/neck injuries (31.5% vs 30.7%) compared with the officer group. More females were injured in the officer group (16.9% vs 7.5%). The fatality rate was 12.6% for the civilian group and 3.0% for the officer group. There were no differences by race in disposition from the ED (released, admitted, death) for those who sustained injuries by the officer.

CONCLUSIONS: Firearm injuries due to law enforcement activity occurred in 1.7% of all ED visits for injuries due to firearms. The law enforcement officer was the injured patient in 23% of these events. This study, spanning nearly a quarter century of ED visits for firearm injuries, provides baseline data for future studies, especially in the present setting of calls for police reform within the US. Such data will be important when analyzing the effect of new programs in law enforcement training and/or police reform.

PMID:34461598 | DOI:10.1016/j.jflm.2021.102249