Categories
Nevin Manimala Statistics

Efficacy and Safety of Bronchoscopic Lung Volume Reduction With Endobronchial Valves: A Systematic Review and Meta-analysis

Open Respir Arch. 2025 May 17;7(3):100443. doi: 10.1016/j.opresp.2025.100443. eCollection 2025 Jul-Sep.

ABSTRACT

INTRODUCTION: Emphysema is a phenotype of chronic obstructive pulmonary disease (COPD) that causes air trapping and lung hyperinflation and, consequently, dyspnea, reduced exercise tolerance, and poor health-related quality of life. Several randomized controlled clinical trials have shown that bronchoscopic lung volume reduction (BLVR) with endobronchial valves (EBV) achieves clinically relevant improvements in dyspnea, pulmonary function, exercise capacity and quality of life 12 months after valve implantation in patients with heterogeneous emphysema without collateral ventilation. The goal of our meta-analysis is to examine the efficacy and safety of BLVR in patients with COPD.

MATERIAL AND METHODS: A literature search was performed with PubMed, Embase and Cochrane to identify randomized controlled trials on BLVR with endobronchial valves published from 2005 onwards.

RESULTS: Nine studies with a total of 1352 patients were included; 827 received EBV therapy and 525 standard of care (SOC) medications. The first group showed statistically significant improvements in forced expiratory volume in 1 second (FEV1), Saint George Respiratory Questionnaire (SGRQ) score, modified Medical Research Council (mMRC) dyspnea scale, and 6-minute walk distance (6MWD), and a statistically significant reduction in residual volume (RV). The incidence of pneumothorax and exacerbations in the EBV arm increased significantly, and there was no significant difference in mortality rates.
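The abstract reports pooled between-arm improvements but not the pooling mechanics. As an illustration only, a fixed-effect inverse-variance pooling of trial-level mean differences (a standard approach for continuous outcomes such as FEV1) can be sketched as follows; the trial numbers below are made up, not the review's data.

```python
import math

def pool_fixed_effect(estimates):
    """Inverse-variance fixed-effect pooling of per-trial mean differences.

    estimates: list of (mean_difference, standard_error) tuples.
    Returns (pooled_estimate, 95% CI lower, 95% CI upper).
    """
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * md for (md, _), w in zip(estimates, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Illustrative FEV1 mean differences (L) from three hypothetical trials
trials = [(0.10, 0.03), (0.14, 0.05), (0.09, 0.04)]
pooled, lo, hi = pool_fixed_effect(trials)
print(f"Pooled MD = {pooled:.3f} L (95% CI {lo:.3f} to {hi:.3f})")
```

Each trial is weighted by the inverse of its variance, so more precise trials dominate the pooled estimate.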

CONCLUSIONS: Patients with heterogeneous emphysema and no collateral ventilation showed significant improvements in lung function, exercise capacity, dyspnea score, and health-related quality of life after BLVR-EBV, although the risk of exacerbations and pneumothorax in the first 6 months increased compared with the group receiving standard care.

PMID:40530361 | PMC:PMC12173060 | DOI:10.1016/j.opresp.2025.100443


Factors related to dosing frequency and route of administration in methotrexate intolerance among patients with rheumatoid arthritis: a cross-sectional study

Ther Adv Drug Saf. 2025 Jun 14;16:20420986251349449. doi: 10.1177/20420986251349449. eCollection 2025.

ABSTRACT

BACKGROUND: Methotrexate is central to the management of rheumatoid arthritis (RA). However, its use is often limited by methotrexate intolerance.

OBJECTIVES: This study aims to explore the association between alternative methotrexate dosing methods and methotrexate intolerance.

DESIGN: A cross-sectional study.

METHODS: A cross-sectional survey was conducted on patients with RA receiving methotrexate for at least 3 months at the outpatient clinic of King Saud University Medical City, Riyadh, Saudi Arabia. The electronic survey collected data on demographics, marital and educational status, methotrexate use, Methotrexate Intolerance Severity Score (MISS), and Health Assessment Questionnaire. Statistical analyses (univariate and linear or logistic regression) were conducted to evaluate the associations between the administration methods and methotrexate intolerance (MISS ⩾6).

RESULTS: The study included 154 patients, predominantly female (89%; mean age (standard deviation, ±SD): 50 (±12) years). Methotrexate tolerance was observed in 64% of the participants, while 36% had a MISS at or above the cutoff of 6, indicating intolerance. Methotrexate-intolerant patients were younger (mean age (±SD): 47 (±12) years) than tolerant patients (mean age (±SD): 54 (±12) years; p = 0.005). No significant differences were found between methotrexate-tolerant and methotrexate-intolerant patients regarding dose, frequency, relation to meals, and time of day.

CONCLUSION: Methotrexate tolerance was not associated with different administration methods: split-dose versus single weekly dose, or subcutaneous versus oral administration.

PMID:40530356 | PMC:PMC12171254 | DOI:10.1177/20420986251349449


Degenerated nerve grafts provide similar quality and outcome in reconstructing critical nerve defects as compared to fresh nerve grafts

Front Cell Dev Biol. 2025 Jun 3;13:1568935. doi: 10.3389/fcell.2025.1568935. eCollection 2025.

ABSTRACT

INTRODUCTION: Brachial plexus injuries are commonly caused by stretch-traction injuries. The clinical standard is timely anatomic reconstruction with autologous nerve grafts and/or intra- or extraplexal nerve transfers. Commonly used nerve grafts are the sural nerves and/or grafts taken from the affected side. If the lower trunk has been affected, the latter nerves, however, are predegenerated. In this animal experiment, we investigated whether a degenerated nerve graft provides the same quality of regeneration as a non-degenerated graft.

METHODS AND MATERIALS: In this animal study, a 2 cm lesion of the right common peroneal nerve was created, and the ipsilateral sural nerve was cut or left intact to later serve as a graft. Nerve reconstruction was carried out 3 weeks later using the fresh or degenerated graft. After 6 weeks, either a retrograde labeling of the common peroneal nerve or muscle force testing was performed.

RESULTS: A total of 34 male SD rats were included: Group A (n = 13) and Group B (n = 21). In Group A, the retrograde labeling of the spinal motor neurons showed an average of 66.05 (±17.03) neurons in animals with a fresh graft and 41.19 (±10.47) neurons in animals with a degenerated graft. In two animals with a fresh graft, no motor neurons could be labeled. No statistical inferiority was observed (p = 0.071). In Group B, regeneration was expressed as a recovery ratio. The fresh graft group had a mean maximum evoked contraction of 8.2 (±7.1), compared to 8.5 (±4.9) in the degenerated graft group (p = 0.462). The mean maximum twitch force was 5.2 (±3.5) and 6.4 (±4.4), respectively (p = 0.577). The mean muscle weight, comparing injured to uninjured side, was 0.32 (±0.06) in the fresh graft group and 0.32 (±0.04) in the degenerated graft group (p = 0.964).

CONCLUSION: The use of predegenerated nerve grafts for critical nerve reconstruction showed no statistical inferiority compared to fresh grafts in any of the evaluated outcomes. Overall, these results are promising, particularly in the context of critical nerve defects involving multiple nerves, where degenerated grafts often remain the only additional source of graft material.

PMID:40530330 | PMC:PMC12170638 | DOI:10.3389/fcell.2025.1568935


Comparing fusion and complication rates after instrumented versus uninstrumented fusion for lumbar spondylolisthesis: A systematic review and meta-analysis of randomized controlled trials with trial sequential analysis

J Orthop. 2025 May 27;68:175-184. doi: 10.1016/j.jor.2025.05.059. eCollection 2025 Oct.

ABSTRACT

INTRODUCTION: Spinal fusion is a common treatment for degenerative or isthmic lumbar spondylolisthesis (LS) in adult patients, where vertebral slippage can lead to significant neurological impairment. However, debate exists regarding the exact fixation method, as fusion and complication rates may differ between instrumented fusion (IF) and uninstrumented fusion (UIF). Therefore, the purpose of this study is to investigate the high-level literature on the fusion and complication rates associated with IF versus UIF to guide decision-making.

METHODS: This systematic review and meta-analysis utilized PubMed, SCOPUS, and Web of Science through September 15th, 2024, to assess the fusion and complication rates associated with IF versus UIF for LS. Inclusion criteria included randomized controlled trials (RCTs) only. The primary outcomes were fusion, reoperation, and complication rates. Statistical analysis included relative risk (RR) with 95 % confidence intervals (CI) along with trial sequential analysis (TSA) and assessment of fragility index (FI).

RESULTS: A total of five RCTs were included in this study out of the 799 articles initially retrieved. Included patients (n = 286; 72.02 % female) had a mean age of 60.97 years and underwent either IF (n = 150) or UIF (n = 136) for LS with a mean follow-up of 2.31 years. Roughly 26.92 % of patients had isthmic LS (n = 77) and 73.08 % of patients had degenerative LS (n = 209) with 98.91 % of patients (n = 182/184) having a grade 1 or 2 LS. Patients who underwent IF had a statistically significant higher rate of fusion as compared to patients who underwent UIF for LS (90.7 % versus 48.5 %; RR: 1.96; 95 % CI: [1.23, 3.13]; p = 0.005) with robust evidence (FI: 11 patients). However, there was no statistically significant difference in reoperation rates (10.4 % versus 2.7 %; RR: 1.05; 95 % CI: [0.97,1.13]; p = 0.264) or total complication rates (7.3 % versus 2.4 %; RR: 0.90; 95 % CI: [0.90, 1.02]; p = 0.228) between patients who underwent IF versus UIF for LS. TSA for all primary outcomes demonstrated a Z-curve that did not cross the required information size, suggesting more research is needed for definitive conclusions on this topic. Qualitatively, two RCTs reported greater operative time (OT) and estimated blood loss (EBL) for IF as compared to UIF for LS.
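The RR and 95 % CI figures above follow the standard log-relative-risk construction. A minimal sketch, using hypothetical single-trial counts rather than the review's data:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b):
    """Relative risk with a 95% CI via the log-RR standard error."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR) from the 2x2 cell counts
    se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Hypothetical single-trial counts: fusion in 45/50 IF patients vs 24/50 UIF
rr, lo, hi = relative_risk(45, 50, 24, 50)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Because the CI is built on the log scale and exponentiated back, it is asymmetric around the point estimate, as in the intervals reported above.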

CONCLUSION: Among adult patients with LS, IF resulted in a robust and statistically significant higher fusion rate as compared to UIF, although more research is needed for definitive conclusions. However, there was no statistically significant difference in reoperation or total complication rates for IF versus UIF for LS.

PMID:40530324 | PMC:PMC12167837 | DOI:10.1016/j.jor.2025.05.059


Comparison of load-to-failure in pre-shaped versus surgeon-shaped Achilles tendon allograft bone blocks

J Orthop. 2025 May 27;68:163-170. doi: 10.1016/j.jor.2025.05.055. eCollection 2025 Oct.

ABSTRACT

PURPOSE: Traditional bone-plug allografts in reconstruction of anterior cruciate ligament (ACL) tears require shaping of the bone plug by surgeons, yielding inconsistent results, greater costs, and increased operative time. We compare the load-to-failure between pre-shaped and surgeon-shaped Achilles allografts with calcaneal bone blocks to assess their use in ACL reconstruction.

METHODS: Six pre-shaped Achilles allograft tendons with calcaneus bone grafts were compared to 6 surgeon-shaped allografts. Calcaneal grafts were inserted into artificial saw bone while the opposite ends were fixed to a linear-torsion dynamic test machine for cyclic and load-to-failure testing. Loading began with a preconditioning phase, followed by uniaxial cycles. The failure load and mechanism of failure were identified for each graft.

RESULTS: Of the 6 pre-shaped bone grafts, 3 (50 %) experienced a failure at the sawbone/screw interface, 2 (33 %) experienced a bone graft fracture, and 1 (17 %) a tendon avulsion during cycling. Of the 6 surgeon-shaped bone grafts, 3 (50 %) experienced failure at the sawbone/screw interface, and 3 (50 %) experienced a bone block fracture. No significant differences in biomechanical properties measured during load-to-failure testing or failure modes were detected between the two graft types.

CONCLUSION: Pre-shaped grafts exhibited a trend towards higher load and displacement at failure, although this was not statistically significant. These findings, along with potential cost and time savings, warrant further study on their impact on surgical efficiency and outcomes.

PMID:40530321 | PMC:PMC12167822 | DOI:10.1016/j.jor.2025.05.055


Application of 1-Stage and 2-Stage Total Hip Arthroplasty in Managing Active Hip Tuberculosis Osteoarthritis of Varying Severity

Arthroplast Today. 2025 Jun 4;33:101722. doi: 10.1016/j.artd.2025.101722. eCollection 2025 Jun.

ABSTRACT

BACKGROUND: Total hip arthroplasty (THA) has emerged as a valuable strategy for managing hip tuberculosis (TB) osteoarthritis, but the optimal surgical approach, 1-stage or 2-stage THA, for patients with hip TB of varying severity remains debated. The purpose of this study was to investigate whether different surgical protocols differ in their effect on hip TB treatment.

METHODS: A retrospective cohort study was conducted on 43 patients who underwent THA for hip TB at our institution between 2010 and 2020. Twenty-three patients received a 1-stage THA, while 20 underwent a 2-stage procedure. Infection control, functional status, complications, blood loss, and transfusion volume were evaluated at a mean 4-year follow-up.

RESULTS: Both surgical approaches demonstrated favorable outcomes. No significant differences were observed between the 1-stage and 2-stage groups in terms of infection control (P = .35), functional improvement as measured by the Harris Hip Score (P = .42), or complication rates (P = .61). The mean Harris Hip Score improved significantly in both groups from baseline (P < .01 for both), with a slightly higher score at 1 year in the 1-stage group (P = .04). The differences in both blood loss and transfusion volume were statistically significant (P < .01 and P = .01, respectively).

CONCLUSIONS: For patients with mild disease, 1-stage THA may be an appropriate choice, while 2-stage THA is recommended for severe cases. Within their respective indications, both approaches demonstrate good outcomes in terms of infection control and functional restoration.

PMID:40530300 | PMC:PMC12172307 | DOI:10.1016/j.artd.2025.101722


The potential habitat of Phlomoides rotata in Tibet based on an optimized MaxEnt model

Front Plant Sci. 2025 Jun 3;16:1560603. doi: 10.3389/fpls.2025.1560603. eCollection 2025.

ABSTRACT

INTRODUCTION: Phlomoides rotata, an important Tibetan medicinal plant, has garnered significant attention due to its remarkable medicinal value and ecological functions. However, overharvesting and climate change have progressively reduced its distribution range, threatening its survival.

METHODS: This study employed an optimized MaxEnt model, integrating field survey data and multiple environmental variables, to predict and analyze the potential suitable distribution of P. rotata in Tibet.

RESULTS: The model achieved high predictive accuracy, with a true skill statistic (TSS) of 0.87 and a Cohen’s kappa coefficient (Kappa) of 0.81. Under current climatic conditions, the suitable habitat area of P. rotata is 33.31 × 10⁴ km², primarily distributed in alpine meadows and sparse shrublands in regions such as Lhasa, Nyingchi, Qamdo, Shannan, and eastern Nagqu. Analysis of key environmental factors revealed that land cover type (30.7%), temperature seasonality (19.9%), and vegetation type (10.2%) are the most significant drivers influencing the distribution of P. rotata. Under future climate change scenarios, the distribution of suitable habitats exhibits notable dynamic trends. In the low-emission scenario (SSP126), the suitable habitat area shows an overall expansion. In contrast, under medium- and high-emission scenarios (SSP245 and SSP585), the suitable habitat area gradually shrinks. The distribution centers consistently migrate northwestward, with the longest migration distance observed under SSP585 (89.55 km).
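Both TSS and kappa are derived from a 2×2 confusion matrix of predicted versus observed presence/absence. A minimal sketch with made-up counts, not the study's validation data:

```python
def tss_and_kappa(tp, fp, fn, tn):
    """True skill statistic and Cohen's kappa from a 2x2 confusion matrix."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)   # fraction of true presences predicted present
    specificity = tn / (tn + fp)   # fraction of true absences predicted absent
    tss = sensitivity + specificity - 1
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return tss, kappa

# Hypothetical presence/absence predictions against occurrence records
tss, kappa = tss_and_kappa(tp=90, fp=8, fn=10, tn=92)
print(f"TSS = {tss:.2f}, kappa = {kappa:.2f}")
```

Unlike raw accuracy, TSS is insensitive to prevalence, and kappa corrects for chance agreement, which is why both are favored for evaluating species distribution models.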

DISCUSSION: This study identifies the critical driving factors for the distribution of P. rotata and elucidates its response patterns to climate change. These findings provide a theoretical foundation for the resource management, ecological conservation, and sustainable utilization of Tibetan medicinal plants while offering valuable references for the study of other alpine plants.

PMID:40530282 | PMC:PMC12170608 | DOI:10.3389/fpls.2025.1560603


Deciphering maize resistance to late wilt disease caused by Magnaporthiopsis maydis: agronomic, anatomical, molecular, and genotypic insights

Front Plant Sci. 2025 Jun 3;16:1566514. doi: 10.3389/fpls.2025.1566514. eCollection 2025.

ABSTRACT

INTRODUCTION: Magnaporthiopsis maydis, the causal agent of late wilt disease (LWD), poses a significant threat to maize production by reducing grain yield and quality. Identifying and developing resistant genotypes adapted to different environments is essential for sustainable crop improvement.

METHODS: Fifteen maize genotypes were evaluated for their response to LWD across three growing seasons at two experimental locations-Gemmeiza and Sids. Disease incidence, agronomic performance, anatomical features, and antioxidant enzyme activities were assessed. Gene expression analysis of PR1 and PR4 was conducted using RT-qPCR. Genotype × environment interaction (GEI) was analyzed using combined ANOVA and the additive main effects and multiplicative interaction (AMMI) model.
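The abstract names RT-qPCR but not the quantification method; assuming the common Livak 2^-ΔΔCt approach, the relative expression of a target gene such as PR1 against a reference gene can be sketched as follows, with hypothetical Ct values:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the Livak 2^-ddCt method.

    Assumes ~100% primer efficiency for both target and reference genes.
    Ct = qPCR cycle threshold; lower Ct means more transcript.
    """
    d_ct_treated = ct_target_treated - ct_ref_treated     # normalize to reference
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control                   # treated vs control
    return 2 ** (-dd_ct)

# Hypothetical Ct values: PR1 in an infected plant vs a mock-inoculated control
print(fold_change(ct_target_treated=22.0, ct_ref_treated=18.0,
                  ct_target_control=25.0, ct_ref_control=18.0))  # → 8.0
```

A fold change above 1 indicates upregulation after infection, consistent with the PR1/PR4 induction reported in the results.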

RESULTS: Significant differences were observed among genotypes, environments, and their interactions (GEI) for disease incidence (DI%) and yield-related traits (p < 0.05). AMMI analysis confirmed substantial GEI effects on DI% and hundred kernel weight. Genotypes TWC1100, SC30K9, and SC2031 consistently showed the lowest disease incidence and the highest resistance rating index (RRI > 8.3) across both locations, while the susceptible check Boushy recorded the highest DI% and lowest RRI. TWC1100 and SC30K9 also achieved the highest kernel weights at Gemmeiza (42.8 g and 41.5 g, respectively). Stability analysis using AMMI stability value (ASV) identified TWC1100, SC30K9, TWC324, and SC130 as the most stable genotypes. Biochemical analysis revealed that resistant genotypes exhibited higher peroxidase activity and lower electrolyte leakage. Anatomical examination showed superior root structure in resistant genotypes, particularly SC2031. Molecular analysis confirmed the upregulation of PR1 and PR4 genes post-infection, with TWC1100 showing robust expression, while Boushy exhibited minimal gene activation.

DISCUSSION: The integration of agronomic, anatomical, biochemical, and molecular analyses revealed promising maize genotypes with enhanced resistance to late wilt disease (LWD) and stable performance across diverse environments. These findings highlight the potential of these genotypes as valuable candidates for inclusion in breeding programs targeting improved disease resistance and yield stability under varying environmental conditions.

PMID:40530266 | PMC:PMC12170576 | DOI:10.3389/fpls.2025.1566514


Hemiarthroplasties via Posterior Trochanter Osteotomy for Treating Femoral Neck Fractures in Post-Cerebrovascular Disease

J Multidiscip Healthc. 2025 Jun 12;18:3391-3401. doi: 10.2147/JMDH.S515576. eCollection 2025.

ABSTRACT

OBJECTIVE: This study examined the clinical outcomes of hemiarthroplasties using posterior femoral trochanter osteotomy for the treatment of femoral neck fractures in patients at the sequelae stage of cerebrovascular disease.

METHODS: A retrospective analysis was conducted on the data of 53 patients who had been admitted to the Department of Orthopedics at Yan’an University Affiliated Hospital between May 2020 and May 2023. These patients had been diagnosed with femoral neck fractures and concurrent muscle weakness at the sequelae stage of cerebrovascular disease. The patients were divided into two groups: the osteotomy group (20 cases), which underwent hemiarthroplasties via an L osteotomy of the posterior femoral trochanter, and the conventional group (33 cases), which received hemiarthroplasties through the posterolateral approach of the greater trochanter. The two groups were compared on various parameters, including incision length, operation duration, intraoperative blood loss, postoperative drainage, blood transfusion rates, length of hospitalization, early mobilization post-surgery, hip joint function scores at follow-up visits (3 and 12 months), and the rate of postoperative dislocation of the femoral head.

RESULTS: No significant differences were observed between the two groups regarding incision length (P=0.06), operation duration (P=0.284), intraoperative blood loss (P=0.925), blood transfusion rate (P=0.489), postoperative drainage (P=0.831), and length of hospital stay (P=0.341). However, the early mobilization time following surgery was shorter in the osteotomy group compared to the conventional group (P<0.001). Additionally, the Harris hip joint function scores for the osteotomy group were significantly higher than those for the conventional group at both the 3- and 12-month postoperative assessments (P=0.003, P=0.004, respectively). The dislocation rate of the femoral head in the osteotomy group was lower than that in the conventional group, although the difference was not statistically significant (P=0.521).

CONCLUSION: The use of hemiarthroplasties via posterior femoral trochanter osteotomy demonstrates favorable clinical outcomes in the treatment of femoral neck fractures.

PMID:40530244 | PMC:PMC12170355 | DOI:10.2147/JMDH.S515576


Predicting Short-Term Risk of Cardiovascular Events in the Elderly Population: A Retrospective Study in Shanghai, China

Clin Interv Aging. 2025 Jun 12;20:825-836. doi: 10.2147/CIA.S519546. eCollection 2025.

ABSTRACT

INTRODUCTION: Cardiovascular disease (CVD) represents a leading cause of morbidity and mortality worldwide, including in China. Accurate prediction of CVD risk and implementation of preventive measures are critical. This study aimed to develop a short-term risk prediction model for CVD events among individuals aged ≥60 years in Shanghai, China.

METHODS: Elderly individuals were recruited through stratified random sampling. Retrospective data (2016-2022) were analyzed using Lasso-Cox regression, followed by a multivariable Cox regression model. The risk scoring was visualized through a nomogram, and model performance was assessed using calibration plots and receiver operating characteristic curves.

RESULTS: A total of 9,636 individuals aged ≥60 years were included. The Lasso-Cox regression analysis showed that male gender (HR=1.482), older age (HR=1.035), higher body mass index (HR=1.015), lower high-density lipoprotein cholesterol (HR=0.992), higher systolic blood pressure (HR=1.009), lower diastolic blood pressure (HR=0.982), higher fasting plasma glucose (HR=1.068), hypertension (HR=1.904), diabetes (HR=1.128), and lipid-lowering medication (HR=1.384) were related to higher CVD risk. The C-index in the training and validation data was 0.642 and 0.623, respectively. Calibration plots indicated good agreement between predicted and actual probabilities.
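The C-index values reported above quantify the model's discriminative ability. A minimal sketch of Harrell's concordance index on toy survival data (not the cohort's):

```python
def c_index(times, events, risk_scores):
    """Harrell's concordance index: the fraction of usable pairs in which the
    subject with the shorter observed time has the higher predicted risk.

    times: follow-up times; events: 1 = CVD event, 0 = censored;
    risk_scores: higher score = predicted higher risk.
    """
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is usable if subject i had an event before j's follow-up ended
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5   # ties count as half-concordant
    return concordant / usable

# Toy data: 5 subjects whose risk scores perfectly track earlier events
print(c_index(times=[2, 4, 5, 6, 7],
              events=[1, 1, 0, 1, 0],
              risk_scores=[0.9, 0.7, 0.3, 0.5, 0.2]))  # → 1.0
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the cohort's C-index of ~0.64 reflects the moderate discriminative ability the conclusion describes.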

CONCLUSION: This short-term predictive model for CVD events among the elderly population exhibits good accuracy but moderate discriminative ability. More studies are warranted to investigate predictors (gender, high-density lipoprotein cholesterol, systolic blood pressure, diastolic blood pressure, hypertension, and lipid-lowering medication) of CVD incidence for the development of preventive measures.

PMID:40530237 | PMC:PMC12170357 | DOI:10.2147/CIA.S519546