Radiol Med. 2025 Aug 23. doi: 10.1007/s11547-025-02062-3. Online ahead of print.
ABSTRACT
PURPOSE: This study sought to investigate the presence of residual myocardial hyperemia during the recovery phase in patients undergoing stress CMR.
MATERIAL AND METHODS: Fifty patients with clinical indication for stress CMR underwent quantitative perfusion imaging in resting conditions, after regadenoson-induced hyperemia (400 mcg, 5 mL), and 10 min after recovery with euphylline. Studies showing hypoperfusion due to ischemia and/or prior myocardial infarction were excluded. Global myocardial blood flow during rest (MBFrest), stress (MBFstress) and recovery (MBFrecovery) and MPR indices (MPRstress/rest and MPRstress/recovery) were calculated using automated pixel-wise quantitative myocardial perfusion mapping.
RESULTS: A total of 30 patients (22 males, mean age 62.7 ± 1 years) were included in the analysis. Global MBFrest and MBFstress were 0.83 ± 0.2 mL/g/min and 2.1 ± 0.6 mL/g/min, respectively. After recovery with euphylline, myocardial perfusion did not return to resting values (MBFrecovery 0.92 ± 0.3 mL/g/min) and differed statistically from MBFrest (p < 0.01), indicating residual myocardial hyperemia. Consequently, MPRstress/recovery (2.43 ± 0.7) was abnormally low relative to MPRstress/rest (2.56 ± 0.7) (p = 0.03). A linear mixed-effects model accounting for repeated measures revealed statistically significant group differences over time in global MBF (mean difference 0.1, 95% CI 0.02-0.17, p = 0.01) and global MPR (mean difference -0.13, 95% CI -0.25 to -0.02, p = 0.02).
CONCLUSION: Despite the use of euphylline to counteract the vasodilator effect, MBF does not completely revert to resting values and MBFrecovery cannot be used as a substitute for MBFrest when regadenoson is used. Consequently, a rest/stress protocol is advised for quantitative CMR perfusion to obtain accurate MBF and MPR parameters.
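The MPR indices above are simple ratios of the global MBF values, which makes the bias easy to see arithmetically: residual hyperemia inflates the recovery-phase denominator. The sketch below is illustrative Python using the group means reported in the abstract, not the authors' perfusion-mapping software; note that ratios of group means differ slightly from the per-patient averages the abstract reports.

```python
# Illustrative sketch (not the study's pipeline): residual hyperemia after
# euphylline raises the denominator of MPRstress/recovery, lowering it
# relative to MPRstress/rest. MBF values are the abstract's global means.

def mpr(mbf_stress: float, mbf_baseline: float) -> float:
    """Myocardial perfusion reserve: stress MBF over baseline MBF."""
    return mbf_stress / mbf_baseline

mbf_rest, mbf_stress, mbf_recovery = 0.83, 2.1, 0.92  # mL/g/min

mpr_stress_rest = mpr(mbf_stress, mbf_rest)          # ≈ 2.53
mpr_stress_recovery = mpr(mbf_stress, mbf_recovery)  # ≈ 2.28, systematically lower
```

Because MBFrecovery > MBFrest, any MPR built on the recovery phase underestimates the true reserve, which is the abstract's argument for a rest/stress protocol.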
PMID:40848227 | DOI:10.1007/s11547-025-02062-3
Int Ophthalmol. 2025 Aug 23;45(1):348. doi: 10.1007/s10792-025-03730-z.
ABSTRACT
PURPOSE: Glaucoma and cataract, the most frequent causes of blindness worldwide, have very distinct etiologies and pathogeneses. Sleep disturbances have been reported in both conditions, with their etiology attributed not only to the particular underlying eye condition but also to comorbid conditions such as chronic diseases and old age. This study compares sleep quality in fifty primary open-angle glaucoma patients and fifty cataract patients with similar vision loss in order to determine the comparative impact of these eye disorders on sleep quality.
METHODS: The glaucoma group comprised 50 patients with bilateral, moderate-stage primary open-angle glaucoma (38 men and 12 women, mean age = 62.94 years, SD = 4.99 years). The cataract group comprised 50 gender-matched cataract patients with either cortical sclerotic or posterior sclerotic types of cataracts (mean age = 62.38 years, SD = 4.62 years). All cataract patients were receiving a pre-surgery evaluation at the time of the study and had bilateral cataract involvement that necessitated cataract surgery. All patients were administered the Pittsburgh Sleep Quality Index (PSQI), a self-administered questionnaire designed to subjectively evaluate sleep quality over the preceding month, and their findings were statistically compared.
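For context on the outcome measure: the PSQI aggregates seven component scores, each rated 0-3, into a global score from 0 to 21, with higher scores indicating worse sleep quality and a global score above 5 conventionally flagging poor sleep. A minimal sketch of that scoring rule (illustrative only, not the questionnaire or the study's analysis):

```python
# Illustrative PSQI scoring sketch: seven component scores (0-3 each)
# sum to a global score (0-21); global > 5 conventionally flags poor sleep.

def psqi_global(components: list[int]) -> int:
    """Sum the seven PSQI component scores into the global score."""
    if len(components) != 7 or any(not 0 <= c <= 3 for c in components):
        raise ValueError("expected seven component scores, each in 0-3")
    return sum(components)

def poor_sleeper(components: list[int]) -> bool:
    """Conventional cutoff: global PSQI > 5 indicates poor sleep quality."""
    return psqi_global(components) > 5
```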
RESULTS: Both groups showed overall poor sleep quality, but glaucoma patients had worse PSQI total scores (p = .042), longer sleep latency (p = .005), and greater sleep disturbance (p = .002).
CONCLUSION: These findings suggest that, among patients with comparable vision loss, glaucoma patients may be even more severely affected by disordered sleep than cataract patients. Treatment modalities for chronodisruption in both patient groups need to be developed and tested.
PMID:40848200 | DOI:10.1007/s10792-025-03730-z
Carbon Balance Manag. 2025 Aug 23;20(1):34. doi: 10.1186/s13021-025-00313-4.
ABSTRACT
Agricultural greenhouse gas emissions threaten food security and accelerate climate change. The United Nations calls for food security and sustainable agriculture to end hunger by 2030, and Sustainable Development Goal 2.4 addresses resilient agricultural practices that combat climate change and produce food sustainably. Resilient agricultural practices are only possible with agricultural technologies (AgriTech) that drive a digital transformation in agriculture. AgriTech can meet growing food demand by raising production efficiency, and can improve resource efficiency in the face of problems such as climate change and water scarcity. The aim of this study is to examine the impacts of AgriTech use on sustainable agriculture in Sub-Saharan African (SSA) countries. The analyses were conducted using panel data from 20 SSA countries for 2000-2022. MMQR (Method of Moments Quantile Regression) provided consistent results across quantiles for the variable interactions, while GMM (Generalized Method of Moments) and KRLS (Kernel Regularized Least Squares) approaches were used to check the robustness of the results. The findings confirm that AgriTech (ATECH) and agricultural value added (AGRW) contribute significantly to sustainable agriculture in SSA countries: the coefficients of the ATECH and AGRW variables are negative and statistically significant in all quantiles, showing that as AgriTech use and agricultural value added increase in SSA, emissions from agriculture decrease and the environment improves. However, agricultural credits (ACRD) are insufficient to reduce agricultural emissions. Furthermore, agricultural workers (AEMP) and internet use (INT) help reduce agricultural emissions up to the 60th and 50th quantiles, respectively, while this effect disappears at higher quantile levels.
These results emphasize the importance of integrating green procurement and green production technologies, supported by green credits, into agricultural production in order to achieve sustainable agricultural development goals in SSA. Policies that ease farmers’ access to agricultural green credits should be adopted, infrastructure that expands farmers’ internet access should be built out, and agricultural workers should be made aware of green production and sustainability.
Highlights: The results show that agricultural technologies, agricultural growth, agricultural labor, and internet use reduce agricultural emissions in SSA countries, while credit use increases agricultural emissions. AgriTech use (ATECH) and agricultural value added (AGRW) have statistically significant negative coefficients in all quantiles, indicating that increases in AgriTech use and value added reduce agricultural greenhouse gas emissions. The emission-reducing potential of AgriTech is higher in the low-emission quantiles (10-30%), while the effect is relatively weaker in the high-emission quantiles. Agricultural credits (ACRD) provide environmental improvements only in the low-emission quantile (25%) and are insufficient to reduce emissions in higher quantiles. Agricultural labor (AEMP) and internet use (INT) significantly reduced emissions at the 10-50% quantiles, while this effect disappeared at higher quantiles; farmers’ success in reducing emissions is directly dependent on their internet access. Method of Moments Quantile Regression (MMQR) was preferred to capture heterogeneous interactions, and the robustness of the results was confirmed with the GMM and KRLS approaches.
PMID:40848194 | DOI:10.1186/s13021-025-00313-4
Mol Diagn Ther. 2025 Aug 23. doi: 10.1007/s40291-025-00805-6. Online ahead of print.
ABSTRACT
BACKGROUND AND OBJECTIVE: Colorectal cancer remains a major global health challenge, necessitating the development of accurate non-invasive diagnostic tools. Circulating and excretory microRNAs (miRNAs) are promising biomarkers owing to their stability and regulatory roles in tumorigenic pathways. While single miRNA assays often lack sufficient diagnostic accuracy, panels combining multiple miRNAs have shown enhanced performance. This systematic review and meta-analysis evaluated the diagnostic accuracy of multi-miRNA panels and explored their mechanistic relevance to colorectal cancer pathogenesis.
METHODS: A comprehensive search of PubMed, Embase, Web of Science, and Scopus was conducted through March 2025 following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. The study protocol was registered with PROSPERO (CRD420251060655). Eligible studies assessed the diagnostic accuracy of multi-miRNA panels for colorectal cancer using extractable data on sensitivity, specificity, and area under the curve. Data were extracted independently by two reviewers. A bivariate random-effects model was used to calculate pooled diagnostic estimates. Study quality was assessed with the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool, and heterogeneity was evaluated using I2 statistics. Subgroup analyses were conducted by sample type (e.g., plasma, serum, stool) and panel size. Target genes of recurrent miRNAs were mapped to canonical colorectal cancer-related pathways.
RESULTS: Twenty-nine studies comprising 5497 participants (3070 colorectal cancer cases and 2427 controls) and 35 multi-miRNA panels were included. Pooled sensitivity was 0.85 (95% confidence interval 0.80-0.88), specificity was 0.84 (95% confidence interval 0.80-0.88), and the area under the curve was 0.90, despite substantial heterogeneity (I2 > 77%). Panels derived from plasma samples showed the highest balanced performance (sensitivity 0.88; specificity 0.87), while three-miRNA panels exhibited the best diagnostic trade-offs. Mechanistic analysis of 42 recurrent miRNAs revealed consistent involvement in key colorectal cancer pathways, including PI3K/AKT, Wnt/β-catenin, epithelial-mesenchymal transition, angiogenesis, and immune regulation.
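The pooled estimates above come from a bivariate random-effects model, which is beyond a short sketch; but the per-study inputs it pools are just sensitivity and specificity from each panel's 2×2 table. The sketch below illustrates that step only, with invented counts for a hypothetical study of 100 cases and 100 controls (chosen to land on the review's pooled values); it is not the meta-analytic model itself.

```python
# Illustrative only: per-study diagnostic accuracy from a 2x2 table for one
# hypothetical multi-miRNA panel. Counts are invented for illustration.

def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and Youden's J from confusion-matrix counts."""
    sens = tp / (tp + fn)  # true-positive rate among CRC cases
    spec = tn / (tn + fp)  # true-negative rate among controls
    return {"sensitivity": sens, "specificity": spec,
            "youden_j": sens + spec - 1}

# hypothetical study: 100 CRC cases, 100 controls
panel_metrics = diagnostic_accuracy(tp=85, fp=16, fn=15, tn=84)
```

A meta-analysis then pools these per-study pairs while modeling their correlation and between-study heterogeneity, which is what the bivariate random-effects model adds over simple averaging.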
CONCLUSIONS: Multi-miRNA panels derived from diverse biospecimen sources demonstrate high diagnostic accuracy for colorectal cancer and are mechanistically linked to fundamental oncogenic pathways. Future efforts should focus on panel standardization, biospecimen-specific validation, and integration into clinical workflows to advance precision oncology.
PMID:40848187 | DOI:10.1007/s40291-025-00805-6
J Autism Dev Disord. 2025 Aug 23. doi: 10.1007/s10803-025-07022-4. Online ahead of print.
ABSTRACT
In 2024, the United States House of Representatives passed H.R. 7213, the Autism CARES Act, which, if passed by the Senate, will reauthorize funding for extant national autism research programs, with an emphasis on including autistic individuals significantly affected by the disorder. This shift toward research inclusion across the autism spectrum highlights the lack of representation in the past. In the field of multisensory integration, it is well documented that autistic individuals integrate stimuli across sensory modalities differently, and that atypical (multi)sensory processing is related to the core features of autism. However, much of this research relies on samples of autistic individuals with high cognitive, verbal, and functional ability. The purpose of this review is to draw attention to disparities in the samples used in multisensory research in autism. We conducted a systematic review of all studies examining multisensory function in autism to date and provide basic descriptive statistics of the studies. We observed that the vast majority of multisensory research focuses on young, low-support-needs autistic individuals, with very little investigation of autistic individuals with high support needs (HSN). Additionally, we found investigation of the effects of sex and comorbidities to be lacking. We propose methodological improvements addressing these gaps in order to make multisensory research in autism more inclusive of HSN autistics.
PMID:40848184 | DOI:10.1007/s10803-025-07022-4
Ophthalmol Ther. 2025 Aug 23. doi: 10.1007/s40123-025-01222-y. Online ahead of print.
ABSTRACT
INTRODUCTION: To evaluate and characterize adverse events (AEs) associated with EVO and EVO+ implantable collamer lenses (ICLs) using real-world post-marketing surveillance data from the Food and Drug Administration (FDA)’s MAUDE database.
METHODS: A retrospective analysis was conducted on AE reports related to EVO and EVO+ ICLs, including both spherical and toric models, submitted between 2015 and 2023. After excluding duplicate entries and incomplete records, reports were stratified by lens model and optical type into four groups: spherical EVO, toric EVO, spherical EVO+, and toric EVO+. Each report was independently reviewed by two senior ophthalmologists to classify the associated complications. Descriptive statistics were used to evaluate the proportional distribution of complications across subgroups and to assess the annual trend in reported AEs.
RESULTS: A total of 17,482 AE reports were analyzed. Across all subgroups, over half of the reports documented no clinical signs or symptoms. Blurred vision was the most frequently reported visual complaint, with a relatively higher reporting frequency in the EVO+ groups. Events involving elevated intraocular pressure and glaucoma were more commonly reported among EVO+ recipients. In addition, several rare but clinically significant complications were documented, including hemorrhage, hyphema, decreased intraocular pressure, endophthalmitis, and toxic anterior segment syndrome. The annual number of reported AEs showed a consistent upward trend throughout the study period.
CONCLUSION: This real-world data analysis provides insights into the distribution of major complications associated with ICL implantation in clinical practice. Comprehensive identification and reporting of rare adverse outcomes may help surgeons broaden their perspectives, enhance surgical preparedness, and provide more personalized and informed preoperative counseling.
PMID:40848167 | DOI:10.1007/s40123-025-01222-y
Eur Spine J. 2025 Aug 23. doi: 10.1007/s00586-025-09275-0. Online ahead of print.
ABSTRACT
PURPOSE: To improve the scientific validity and accuracy of this study.
METHOD: We examined the article's research methods through careful reading and a review of the relevant literature.
RESULT: The authors questioned the routine use of MRI as a follow-up tool by statistically comparing the degree of change in MRI findings in patients with chronic low back pain over a period of less than or equal to two years.
CONCLUSION: To further enhance the scientific rigor of the study, we suggest refining certain aspects, such as including a more comprehensive discussion of patient characteristics, adjusting the time limit for study inclusion, and conducting more detailed categorical before-and-after comparisons across patient populations.
PMID:40848162 | DOI:10.1007/s00586-025-09275-0
Virchows Arch. 2025 Aug 23. doi: 10.1007/s00428-025-04230-2. Online ahead of print.
ABSTRACT
Malignant histiocytoses are rare histiocytic neoplasms that exhibit aggressive clinical and histopathological features. One of these entities, Langerhans cell sarcoma (LCS), shares some histopathological features with Langerhans cell histiocytosis (LCH) but is distinguished by its overtly malignant cytologic features. The literature on LCS is mostly limited to short reports and a few reviews, and a complete revision of its nosology is lacking. This study aims to fill this gap in the knowledge on LCS, explore potential prognostic factors, and propose a clinical subclassification for better patient stratification, which could guide future treatment investigations. A systematic review of the literature was conducted following PRISMA guidelines. A complete set of clinical and pathological features was collected for each included patient. Descriptive and association statistics, as well as survival analysis, were performed using R Studio. A cohort of 88 patients was analyzed, the majority being adult males with multisystem presentations often involving the skin and lymph nodes. pERK pathway gene mutations were reported in around half of cases. Overall prognosis was poor, and association with another hematological neoplasm had a significant negative prognostic impact (p = 0.0017). Moreover, in primary cases, a significant difference was observed between single-system and multisystem disease (p = 0.012). Although treatment modalities were highly heterogeneous, statistical analyses provided insights into the relevance of treating patients according to disease spread (e.g., treating localized masses with surgery alone frequently leads to complete remission, p = 0.0002). This study provides an extensive analysis of LCS nosology and prognostic factors, underscoring the importance of distinguishing LCS from LCH and other histiocytoses, as well as adopting a unified system to define disease spread and guide therapeutic management.
PMID:40848146 | DOI:10.1007/s00428-025-04230-2
Int Urogynecol J. 2025 Aug 23. doi: 10.1007/s00192-025-06282-z. Online ahead of print.
ABSTRACT
INTRODUCTION AND HYPOTHESIS: Diabetic bladder dysfunction (DBD) is a prevalent but underrecognized complication of type 2 diabetes mellitus (T2DM), affecting 25 to 87% of patients and significantly impairing quality of life. The specific risk factors for DBD remain poorly understood due to inconsistent findings in prior studies. This study aims to systematically identify the risk factors associated with DBD among Chinese T2DM patients.
METHODS: A case-control study was conducted from March 2019 to January 2024 in two tertiary comprehensive hospitals in Shenzhen, China, involving patients with T2DM. Patients were categorized into DBD and non-DBD groups. Comparative analysis used the Mann-Whitney U test and the χ2 test, and significant variables were subsequently entered into logistic regression analysis.
RESULTS: In this study, 35.5% of patients with T2DM had DBD. Comparative analysis between the DBD and non-DBD groups revealed that 11 of 60 candidate variables were significantly associated with DBD development (P < 0.05). Significant predictors identified in logistic regression included age (OR 1.03, 95% CI 1.01-1.04), gender (OR 0.47, 95% CI 0.31-0.72), duration of T2DM (OR 1.08, 95% CI 1.04-1.11), urine microalbumin/creatinine ratio (UA/CR) (OR 1.01, 95% CI 1.01-1.01), and insulin use (OR 2.27, 95% CI 1.30-3.96).
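The odds ratios and intervals above are the standard exponentiated form of logistic-regression output. The sketch below (illustrative, not the study's code) shows that transformation; the coefficient and standard error are back-calculated for illustration from the abstract's insulin-use estimate (OR 2.27, 95% CI 1.30-3.96).

```python
# Hedged sketch: how a logistic-regression coefficient and its Wald interval
# map to an odds ratio and 95% CI. Beta and SE are reverse-engineered from
# the abstract's insulin-use OR for illustration only.
import math

def or_with_ci(beta: float, se: float, z: float = 1.96):
    """Exponentiate a logit coefficient and its Wald interval to the OR scale."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

beta_insulin = math.log(2.27)                                # ≈ 0.820
se_insulin = (math.log(3.96) - math.log(1.30)) / (2 * 1.96)  # ≈ 0.284

odds_ratio, ci_lo, ci_hi = or_with_ci(beta_insulin, se_insulin)
```

The round trip recovers the published OR of 2.27 with bounds close to 1.30 and 3.96, which is also a quick consistency check when reading reported CIs.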
CONCLUSIONS: This study identified a total of five significant risk factors, offering robust evidence for DBD intervention and providing critical insights for reducing its incidence and enhancing patient quality of life.
PMID:40848143 | DOI:10.1007/s00192-025-06282-z