Effects of intraoperative fluid balance during pancreatoduodenectomy on postoperative pancreatic fistula: an observational cohort study

BMC Surg. 2023 Apr 13;23(1):89. doi: 10.1186/s12893-023-01978-9.

ABSTRACT

BACKGROUND: Perioperative fluid management during major abdominal surgery has been controversial. Postoperative pancreatic fistula (POPF) is a critical complication of pancreaticoduodenectomy (PD). We conducted a retrospective cohort study to analyze the impact of intraoperative fluid balance on the development of POPF.

METHODS: This retrospective cohort study enrolled 567 patients who underwent open pancreaticoduodenectomy, and their demographic, laboratory, and medical data were recorded. Patients were categorized into four groups according to quartiles of intraoperative fluid balance. Multivariate logistic regression and restricted cubic splines (RCSs) were used to analyze the relationship between intraoperative fluid balance and POPF.
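
For readers unfamiliar with the spline step, a minimal sketch in Python (statsmodels/patsy) of fitting a logistic model for POPF with a restricted cubic spline on fluid balance might look like the following; the dataset and all column names are hypothetical, not the authors' actual variables.

```python
# Minimal sketch: logistic regression of POPF on intraoperative fluid
# balance modelled with a restricted (natural) cubic spline, adjusted
# for a few plausible confounders. All column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pd_cohort.csv")  # hypothetical dataset

# Quartile groups, as used for the descriptive comparison
df["fb_quartile"] = pd.qcut(df["fluid_balance_ml_kg_h"], 4,
                            labels=["Q1", "Q2", "Q3", "Q4"])

# cr() is patsy's natural cubic regression spline basis; df=4 sets
# the flexibility of the dose-response curve.
fit = smf.logit(
    "popf ~ cr(fluid_balance_ml_kg_h, df=4) + age + bmi + surgery_time",
    data=df,
).fit()
print(fit.summary())
```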

RESULTS: Intraoperative fluid balance ranged from -8.47 to 13.56 mL/kg/h. A total of 108 patients developed POPF, for an incidence of 19.0%. After adjustment for potential confounders, restricted cubic spline analysis showed no statistically significant dose-response relationship between intraoperative fluid balance and POPF. The incidences of bile leakage, postpancreatectomy hemorrhage, and delayed gastric emptying were 4.4%, 20.8%, and 14.8%, respectively; intraoperative fluid balance was not associated with these abdominal complications. BMI ≥ 25 kg/m2, preoperative blood glucose < 6 mmol/L, long operative time, and lesions not located in the pancreas were independent risk factors for POPF.

CONCLUSION: The study did not find a significant association between intraoperative fluid balance and POPF. Well-designed multicenter studies are necessary to explore the association between intraoperative fluid balance and POPF.

PMID:37055753 | DOI:10.1186/s12893-023-01978-9

Integrated systems immunology approach identifies impaired effector T cell memory responses as a feature of progression to severe dengue fever

J Biomed Sci. 2023 Apr 13;30(1):24. doi: 10.1186/s12929-023-00916-4.

ABSTRACT

BACKGROUND: Typical symptoms of uncomplicated dengue fever (DF) include headache, muscle pains, rash, cough, and vomiting. A proportion of cases progress to severe dengue hemorrhagic fever (DHF), associated with increased vascular permeability, thrombocytopenia, and hemorrhages. Progression to severe dengue is difficult to diagnose at the onset of fever, which complicates patient triage, posing a socio-economic burden on health systems.

METHODS: To identify parameters associated with protection and susceptibility to DHF, we pursued a systems immunology approach integrating plasma chemokine profiling, high-dimensional mass cytometry and peripheral blood mononuclear cell (PBMC) transcriptomic analysis at the onset of fever in a prospective study conducted in Indonesia.
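
The abstract does not spell out the statistical machinery, but one typical step in such an analysis, testing which immune cell populations differ in abundance between outcome groups, might look like the sketch below; the file layout and column names are hypothetical.

```python
# Minimal sketch: differential abundance testing of cell population
# frequencies (e.g. from mass cytometry clustering) between
# uncomplicated DF and severe DHF, with FDR correction.
import pandas as pd
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

freq = pd.read_csv("cell_population_frequencies.csv")  # hypothetical
populations = [c for c in freq.columns if c not in ("patient_id", "outcome")]

pvals = [
    mannwhitneyu(
        freq.loc[freq["outcome"] == "DF", pop],
        freq.loc[freq["outcome"] == "DHF", pop],
    ).pvalue
    for pop in populations
]
# Benjamini-Hochberg correction across all tested populations
rejected, qvals, _, _ = multipletests(pvals, method="fdr_bh")
for pop, q, hit in zip(populations, qvals, rejected):
    if hit:
        print(f"{pop}: FDR-adjusted p = {q:.3g}")
```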

RESULTS: After a secondary infection, progression to uncomplicated dengue featured transcriptional profiles associated with increased cell proliferation and metabolism and an expansion of ICOS+CD4+ and CD8+ effector memory T cells. These responses were virtually absent in cases progressing to severe DHF, which instead mounted an innate-like response characterised by inflammatory transcriptional profiles, high circulating levels of inflammatory chemokines, and high frequencies of CD4low non-classical monocytes that predicted increased odds of severe disease.

CONCLUSIONS: Our results suggest that effector memory T cell activation might play an important role in ameliorating severe disease symptoms during a secondary dengue infection, and that in the absence of this response, a strong innate inflammatory response is required to control viral replication. Our research also identified discrete cell populations that predict increased odds of severe disease, with potential diagnostic value.

PMID:37055751 | DOI:10.1186/s12929-023-00916-4

Clinical and microbiological profiles in post-chemotherapy neutropenic fever in hematological malignancy: exploration of clinical phenotype patterns by two-step cluster analysis

BMC Infect Dis. 2023 Apr 13;23(1):226. doi: 10.1186/s12879-023-08218-8.

ABSTRACT

BACKGROUND: The epidemiology of infectious diseases causing febrile illness varies geographically with human attributes. Periodic institutional surveillance of clinical and microbiological profiles, which adds data for updating trends, tailoring pharmacotherapy, and flagging possible overtreatment and drug-resistance risk in post-chemotherapy neutropenic fever (NF) in hematological malignancy (HM), remains limited. We aimed to review institutional clinical and microbiological data and to explore clinical phenotype pattern groups within the data.

METHODS: Available data from 372 NF episodes were included. Demographics, types of malignancies, laboratory data, antimicrobial treatments, and febrile-related outcome data, such as predominant pathogens and microbiologically diagnosed infections (MDIs), were collected. Descriptive statistics, two-step cluster analysis, and non-parametric tests were employed.
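
Two-step cluster analysis is an SPSS procedure that pre-clusters records and then selects the number of clusters by an information criterion. A rough Python analogue, using a Gaussian mixture with a BIC sweep over standardized numeric features, is sketched below; this is an analogue, not the authors' exact procedure, and the feature names are assumptions.

```python
# Minimal sketch: a BIC-selected Gaussian mixture as a stand-in for
# SPSS TwoStep clustering on standardized numeric features.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.mixture import GaussianMixture

nf = pd.read_csv("nf_episodes.csv")  # hypothetical dataset
X = StandardScaler().fit_transform(
    nf[["age", "anc_nadir", "fever_days", "crp_max"]]
)

# Choose the number of clusters by minimum BIC
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
        for k in range(2, 7)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
nf["cluster"] = fits[best_k].predict(X)
print(f"selected k = {best_k}")
print(nf.groupby("cluster").size())
```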

RESULTS: The occurrences of microbiologically diagnosed bacterial infections (MDBIs; 20.2%) and microbiologically diagnosed fungal infections (MDFIs; 19.9%) were almost equal. Gram-negative pathogens (11.8%) were comparable with gram-positive pathogens (9.9%), with gram-negative being slightly predominant. The death rate was 7.5%. Two-step cluster analysis yielded four distinct clinical phenotype pattern (cluster) groups: cluster 1, ‘lymphomas without MDIs’; cluster 2, ‘acute leukemias with MDBIs’; cluster 3, ‘acute leukemias with MDFIs’; and cluster 4, ‘acute leukemias without MDIs’. A considerable number of NF episodes that occurred despite antibiotic prophylaxis and were not identified as MDIs may represent low-risk cases with non-infectious causes of fever that might not have required prophylaxis.

CONCLUSIONS: Regular institutional surveillance with active parameter assessments to signify risk levels in the post-chemotherapy stage, even prior to the onset of fever, might be an evidence-based strategy in the management of NF in HM.

PMID:37055745 | DOI:10.1186/s12879-023-08218-8

Specific changes and clinical significance of plasma D-dimer during pregnancy and puerperium: a prospective study

BMC Pregnancy Childbirth. 2023 Apr 13;23(1):248. doi: 10.1186/s12884-023-05561-1.

ABSTRACT

BACKGROUND: Pregnant and puerperal women are at high risk of developing venous thromboembolism (VTE). Plasma D-dimer (D-D) is of good value for excluding VTE in the nonpregnant population, but because there is no consensus reference range applicable to pregnant and puerperal women, its application in this population is limited. This study aimed to investigate the changes in and reference ranges of plasma D-D levels during pregnancy and the puerperium, to explore the pregnancy- and childbirth-related factors affecting plasma D-D levels, and to assess the diagnostic efficacy of plasma D-D for excluding VTE during the early puerperium after caesarean section.

METHODS: A prospective cohort study was conducted with 514 pregnant and puerperal women (cohort 1) and 29 puerperal women who developed VTE 24-48 h after caesarean section (cohort 2). In cohort 1, the effects of pregnancy- and childbirth-related factors on plasma D-D levels were analyzed by comparing plasma D-D levels between groups and between subgroups, and 95th percentiles were calculated to establish unilateral upper limits for plasma D-D. Plasma D-D levels at 24-48 h postpartum were compared between the women who developed VTE (cohort 2) and the women in the caesarean section subgroup of cohort 1. Binary logistic regression was used to analyze the relationship between plasma D-D levels and the risk of VTE developing 24-48 h after caesarean section, and a receiver operating characteristic (ROC) curve was used to assess the diagnostic efficacy of plasma D-D for excluding VTE during the early puerperium after caesarean section.
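
Two of the steps above, the 95th-percentile upper reference limit and the ROC-based cut-off, are easy to illustrate. A minimal Python sketch follows; the file names are hypothetical, and the Youden-index rule for the cut-off is an assumption, since the abstract does not state how the cut-off was chosen.

```python
# Minimal sketch: unilateral upper reference limit and ROC analysis
# for D-dimer. File names and the Youden-index rule are assumptions.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# 95th percentile as the unilateral upper reference limit
d_dimer_trim3 = np.loadtxt("ddimer_third_trimester.txt")  # hypothetical
upper_limit = np.percentile(d_dimer_trim3, 95)

# ROC analysis: y = 1 for VTE, 0 otherwise; x = D-dimer at 24-48 h
y = np.loadtxt("vte_labels.txt")         # hypothetical
x = np.loadtxt("ddimer_postpartum.txt")  # hypothetical
fpr, tpr, thresholds = roc_curve(y, x)
auc = roc_auc_score(y, x)

# Youden's J picks the threshold maximizing sensitivity + specificity - 1
cutoff = thresholds[np.argmax(tpr - fpr)]
print(f"95th percentile = {upper_limit:.2f} mg/L, "
      f"AUC = {auc:.3f}, cut-off = {cutoff:.2f} mg/L")
```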

RESULTS: The 95% reference ranges of plasma D-D levels in the normal singleton pregnancy group were ≤ 1.01 mg/L in the first trimester, ≤ 3.17 mg/L in the second trimester, ≤ 5.35 mg/L in the third trimester, ≤ 5.47 mg/L at 24-48 h postpartum, and ≤ 0.66 mg/L at 42 days postpartum. Plasma D-D levels in the normal twin pregnancy group were significantly higher than those in the normal singleton pregnancy group during pregnancy (P < 0.05), and plasma D-D levels in the gestational diabetes mellitus (GDM) group were significantly higher than those in the normal singleton pregnancy group in the third trimester (P < 0.05). Plasma D-D levels at 24-48 h postpartum were significantly higher in the advanced maternal age subgroup than in the nonadvanced age subgroup (P < 0.05) and significantly higher in the caesarean section subgroup than in the vaginal delivery subgroup (P < 0.05). The plasma D-D level was significantly associated with the risk of VTE developing at 24-48 h after caesarean section (OR = 2.252, 95% CI: 1.611-3.149). The optimal cut-off value of plasma D-D for excluding VTE during the early puerperium after caesarean section was 3.24 mg/L, with a negative predictive value of 96.1% and an area under the curve (AUC) of 0.816 (P < 0.001).

CONCLUSIONS: The thresholds of plasma D-D levels in normal singleton pregnant and parturient women were higher than those in nonpregnant women. Plasma D-D had good value for excluding VTE during the early puerperium after caesarean section. Further studies are needed to validate these reference ranges and to assess the effects of pregnancy- and childbirth-related factors on plasma D-D levels and the diagnostic efficacy of plasma D-D for excluding VTE during pregnancy and the puerperium.

PMID:37055718 | DOI:10.1186/s12884-023-05561-1

The improved health utility of once-weekly subcutaneous semaglutide 2.4 mg compared with placebo in the STEP 1-4 obesity trials

Diabetes Obes Metab. 2023 Apr 13. doi: 10.1111/dom.15090. Online ahead of print.

ABSTRACT

AIMS: Clinicians and regulatory authorities are placing increasing emphasis on health-related quality of life (HRQoL) and health utilities when evaluating therapeutic efficacy of new agents. We assessed health utility values in the Semaglutide Treatment Effect in People with obesity (STEP) trials.

MATERIALS AND METHODS: The STEP 1-4 phase 3a, 68-week, double-blind randomized controlled trials assessed the efficacy and safety of semaglutide 2.4 mg versus placebo in individuals with BMI ≥30 kg/m2 or BMI ≥27 kg/m2 and ≥1 co-morbidity (STEP 1, 3 and 4), or BMI ≥27 kg/m2 and type 2 diabetes (STEP 2). Patients received lifestyle intervention plus intensive behavioural therapy in STEP 3. HRQoL was assessed using the Short Form 36-item Health Survey version 2 (SF-36v2) at baseline and week 68. Scores were converted into Short Form Six-Dimension version 2 (SF-6Dv2) utility scores or mapped onto the European Quality of Life Five-Dimension Three-Level (EQ-5D-3L) utility index using UK health utility weights.
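
The abstract does not state the exact statistical model, but a common way to estimate such a week-68 treatment difference is an ANCOVA of the follow-up utility score on treatment arm, adjusted for the baseline score. A minimal sketch with hypothetical column names:

```python
# Minimal sketch: baseline-adjusted ANCOVA for the week-68 treatment
# difference in utility scores. Not the authors' confirmed model;
# dataset and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

step = pd.read_csv("step1_utilities.csv")  # hypothetical dataset
fit = smf.ols(
    "sf6d_week68 ~ C(arm, Treatment('placebo')) + sf6d_baseline",
    data=step,
).fit()
print(fit.params)      # the arm coefficient is the adjusted difference
print(fit.conf_int())  # 95% confidence intervals
```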

RESULTS: At week 68, semaglutide 2.4 mg was associated with minor health utility score improvements from baseline (all trials), while scores for placebo typically decreased. SF-6Dv2 treatment differences by week 68 for semaglutide versus placebo were significant in STEP 1 and 4 (p≤0.001), but not STEP 2 or 3. EQ-5D-3L treatment differences by week 68 for semaglutide versus placebo were significant in STEP 1, 2 and 4 (p<0.001 for all), but not STEP 3.

CONCLUSIONS: Semaglutide 2.4 mg was associated with improvement in health utility scores compared with placebo, reaching statistical significance in STEP 1, 2 and 4.

PMID:37055712 | DOI:10.1111/dom.15090

Cigarette: an unsung anthropogenic evil in the environment

Environ Sci Pollut Res Int. 2023 Apr 13. doi: 10.1007/s11356-023-26867-9. Online ahead of print.

ABSTRACT

The world’s population is growing steadily, and this trend is mirrored by a sharp rise in the number of people who smoke cigarettes. Instead of properly disposing of their cigarette waste, most smokers simply toss their butts aside, leading to serious environmental consequences. According to previous statistics, in 2012 alone, 6.25 trillion cigarettes were consumed by 967 million chain smokers. Past studies have shown that up to 30% of global litter is made up of cigarette waste. These discarded cigarette butts are non-biodegradable and contain over 7000 toxicants, such as benzene, 1,3-butadiene, nitrosamine ketone, N-Nitrosonornicotine, nicotine, formaldehyde, acrolein, ammonia, aniline, polycyclic aromatic hydrocarbons, and various heavy metals. These toxicants damage wildlife habitats and can cause serious health problems such as cancer, respiratory disorders, cardiac issues, and sexual dysfunction. Although it is still unclear exactly how littered cigarette butts affect plant germination, growth, and development, it is clear that they have the potential to harm plant health. Like single-use plastic, discarded cigarette butts are a critical emerging form of pollution that requires scientific attention for effective recycling and disposal management. Proper disposal of cigarette waste is important to protect the environment and wildlife and to prevent harm to human health.

PMID:37055684 | DOI:10.1007/s11356-023-26867-9

Outcomes and access to angiography following non-ST-segment elevation acute coronary syndromes in patients who present to rural or urban hospitals: ANZACS-QI 72

N Z Med J. 2023 Apr 14;136(1573):27-54.

ABSTRACT

AIM: This study’s aim was to identify differences in rates of invasive angiography and in health outcomes for patients with non-ST-segment elevation acute coronary syndrome (NSTEACS) presenting to i) a rural hospital, ii) an urban hospital with routine access to percutaneous coronary intervention (PCI), or iii) an urban hospital without routine access to PCI in New Zealand.

METHODS: Patients presenting with NSTEACS between 1 January 2014 and 31 December 2017 were included. Logistic regression was used to model each of the outcome measures: angiography performed within 1 year; 30-day, 1-year, and 2-year all-cause mortality; and readmission within 1 year of presentation with heart failure, a major adverse cardiac event, or major bleeding.
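
A minimal sketch of fitting one logistic model per outcome, as the methods describe, reporting odds ratios for hospital category relative to urban PCI-capable hospitals; the dataset, column names, and covariates are all hypothetical.

```python
# Minimal sketch: separate logistic regressions per outcome, reporting
# odds ratios for hospital type. All names here are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

acs = pd.read_csv("anzacs_qi.csv")  # hypothetical extract
outcomes = ["angio_1y", "death_30d", "death_1y", "death_2y",
            "readmit_hf_1y", "readmit_mace_1y", "readmit_bleed_1y"]

for y in outcomes:
    fit = smf.logit(
        f"{y} ~ C(hospital_type, Treatment('urban_pci')) + age + sex",
        data=acs,
    ).fit(disp=False)
    # Exponentiated coefficients for hospital type = odds ratios
    ors = np.exp(fit.params.filter(like="hospital_type"))
    print(y, ors.round(2).to_dict())
```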

RESULTS: There were 42,923 patients included. Compared with urban hospitals with access to PCI, the odds of a patient receiving an angiogram were reduced at rural hospitals and at urban hospitals without routine access to PCI (odds ratio [OR] 0.82 and 0.75, respectively). There was a small increase in the odds of death at 2 years (OR 1.16), but not at 30 days or 1 year, for patients presenting to a rural hospital.

CONCLUSION: Patients who present to hospitals without PCI are less likely to receive angiography. Reassuringly, there was no difference in mortality, except at 2 years, for patients presenting to rural hospitals.

PMID:37054454

Epidemiology of Physeal Fractures and Clinically Significant Growth Disturbances Affecting the Distal Tibia, Proximal Tibia, and Distal Femur: A Retrospective Cohort Study

J Am Acad Orthop Surg. 2023 Apr 12. doi: 10.5435/JAAOS-D-22-00303. Online ahead of print.

ABSTRACT

INTRODUCTION: Childhood fractures involving the physis can result in premature physeal closure, which can lead to growth disturbances. Growth disturbances are challenging to treat and carry associated complications. The current literature on physeal injuries to the long bones of the lower extremity and on risk factors for the development of growth disturbance is limited. The purpose of this study was to review growth disturbances among proximal tibial, distal tibial, and distal femoral physeal fractures.

METHODS: Data were retrospectively collected from patients undergoing fracture treatment at a level I pediatric trauma center between 2008 and 2018. The study was limited to patients aged 0.5 to 18.9 years with a tibial or distal femoral physeal fracture, an injury radiograph, and follow-up sufficient to determine fracture healing. The cumulative incidence of clinically significant growth disturbance (CSGD), defined as a growth disturbance requiring subsequent physeal bar resection, osteotomy, and/or epiphysiodesis, was estimated, and descriptive statistics were used to summarize demographics and clinical characteristics among patients with and without a CSGD.
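
As a simple illustration of the incidence estimate, a binomial confidence interval on the crude proportion can be computed as below. The authors may well have used a time-to-event estimator, and the event count here is back-calculated from the reported 5.0%, so treat both as assumptions.

```python
# Minimal sketch: Wilson confidence interval for the crude CSGD
# proportion. The event count is assumed from the reported rate.
from statsmodels.stats.proportion import proportion_confint

n_patients = 1585
n_csgd = 79  # about 5.0% of 1,585 (assumed)

low, high = proportion_confint(n_csgd, n_patients, method="wilson")
print(f"incidence = {n_csgd / n_patients:.1%}, "
      f"95% CI {low:.1%} to {high:.1%}")
```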

RESULTS: A total of 1,585 patients met the inclusion criteria. The incidence of CSGD was 5.0% (95% confidence interval, 3.8% to 6.6%). All cases of growth disturbance occurred within 2 years of the initial injury. The risk of CSGD peaked at age 10.2 years for males and 9.1 years for females. Complex fractures requiring surgical treatment, distal femoral and proximal tibial fractures, age, and initial treatment at an outside hospital were significantly associated with an increased risk of CSGD.

DISCUSSION: All CSGDs occurred within 2 years of injury, indicating that these injuries should be followed for at least 2 years. Patients with distal femoral or proximal tibial physeal fractures who undergo surgical treatment are at the highest risk of developing a CSGD.

LEVEL OF EVIDENCE: Level III Retrospective Cohort Study.

PMID:37054395 | DOI:10.5435/JAAOS-D-22-00303

Comparing dynamic visual acuity between athletes who are deaf or hard-of-hearing and athletes who are hearing

J Am Coll Health. 2023 Apr 13:1-4. doi: 10.1080/07448481.2023.2198018. Online ahead of print.

ABSTRACT

This study examined differences in dynamic visual acuity test (DVAT) performance between collegiate athletes who are deaf or hard-of-hearing (D/HoH) (n = 38) and university club-level athletes who are hearing (n = 38). Dynamic visual acuity was assessed using the Bertec Vision Advantage (Bertec® Corporation, Columbus, Ohio, USA). No statistically significant differences between athletes who are D/HoH and athletes who are hearing were found in DVAT for leftward (χ2 = 0.71, p = 0.40) or rightward (χ2 = 0.04, p = 0.84) head yaw rotation around an earth-vertical axis. Dynamic visual acuity was similar for athletes regardless of hearing status. Baseline DVAT data may be of use for post-injury management of athletes who are D/HoH.
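
The χ2 comparisons reported above can be reproduced in form (not in the actual numbers, which would require the raw data) with a contingency-table test; the counts below are placeholders.

```python
# Minimal sketch: chi-square test comparing a categorized DVAT outcome
# between D/HoH and hearing athletes. The 2x2 counts are placeholders.
import numpy as np
from scipy.stats import chi2_contingency

# rows: D/HoH vs hearing; columns: hypothetical DVAT outcome categories
table = np.array([[20, 18],
                  [23, 15]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.2f}")
```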

PMID:37053591 | DOI:10.1080/07448481.2023.2198018

Imposter phenomenon and experiences of discrimination among students at a predominantly White institution

J Am Coll Health. 2023 Apr 13:1-5. doi: 10.1080/07448481.2023.2198021. Online ahead of print.

ABSTRACT

Objective: To compare the experiences of Imposter Phenomenon and discrimination among non-Hispanic White (NHW) and racial and ethnic minority (REM) students at a predominantly White institution (PWI). Participants: 125 undergraduate students (89.6% women, 68.8% NHW, and 31.2% REM). Methods: Participants completed an online questionnaire including the Clance Imposter Phenomenon Scale (CIPS), the Everyday Discrimination Scale (EDS), demographic variables (class year, gender, first-generation student status), and 5 items assessing students’ feelings of belonging and support. Descriptive statistics and bivariate analyses were performed. Results: Mean CIPS scores were similar for NHW (64.05 ± 14.68) and REM students (63.62 ± 15.90, P = .882), but EDS scores were significantly higher among REM students (13.00 ± 9.24 vs. 8.00 ± 5.21, P = .009). REM students more frequently reported feeling that they did not belong, that they were excluded, and that they lacked the resources to succeed. Conclusions: Racial and ethnic minority students at PWIs may need additional resources and social support.
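
For the bivariate score comparisons, a Welch t-test is one reasonable choice (the abstract does not name the specific test); a minimal sketch with hypothetical score vectors:

```python
# Minimal sketch: comparing EDS scores between REM and NHW students.
# File names are hypothetical; Welch's t-test is an assumed choice.
import numpy as np
from scipy.stats import ttest_ind

eds_nhw = np.loadtxt("eds_nhw.txt")  # hypothetical score vectors
eds_rem = np.loadtxt("eds_rem.txt")
t, p = ttest_ind(eds_rem, eds_nhw, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```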

PMID:37053586 | DOI:10.1080/07448481.2023.2198021