Trends in the Use of Oral Anticoagulants for Adults With Venous Thromboembolism in the US, 2010-2020

JAMA Netw Open. 2023 Mar 1;6(3):e234059. doi: 10.1001/jamanetworkopen.2023.4059.

ABSTRACT

IMPORTANCE: The introduction of direct oral anticoagulants (DOACs) has transformed the treatment of venous thromboembolism (VTE). Large health care databases offer valuable insight into how oral anticoagulants (OACs) are used in clinical practice and may aid in understanding reasons for changes in therapy.

OBJECTIVES: To evaluate prescribing patterns of OACs for patients with VTE and identify clinical events that precede treatment changes.

DESIGN, SETTING, AND PARTICIPANTS: This retrospective cohort study used data from a public (Medicare fee-for-service) and a commercial (IBM MarketScan) health insurance database on 298 609 patients initiating OACs within 90 days of index VTE hospitalization from January 1, 2009, to December 31, 2020. Statistical analysis was conducted from April to August 2022.

EXPOSURES: Warfarin and the DOACs rivaroxaban, apixaban, dabigatran, and edoxaban.

MAIN OUTCOMES AND MEASURES: Characteristics of patients initiating different OACs, along with trends over time of patients initiating OACs, were compared. Time receiving continuous anticoagulant therapy, patterns of anticoagulant discontinuation (treatment gap of ≥30 days), and treatment switches were assessed. Clinical events in the 30 days preceding treatment modifications were identified.
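
The operational definitions above (treatment gap of ≥30 days, switch to a different OAC) lend themselves to a simple claims-processing sketch. The Python/pandas example below is a hypothetical illustration of that kind of logic, not the authors' code; the column names and toy fill records are invented.

```python
import pandas as pd

# Hypothetical fill-level pharmacy data: one row per dispensing
fills = pd.DataFrame({
    "patient_id":  [1, 1, 1, 2, 2],
    "drug":        ["warfarin", "warfarin", "apixaban", "rivaroxaban", "rivaroxaban"],
    "fill_date":   pd.to_datetime(["2015-01-01", "2015-02-01", "2015-04-20",
                                   "2016-03-01", "2016-05-15"]),
    "days_supply": [30, 30, 30, 30, 30],
})

fills = fills.sort_values(["patient_id", "fill_date"]).reset_index(drop=True)
fills["supply_end"] = fills["fill_date"] + pd.to_timedelta(fills["days_supply"], unit="D")

grp = fills.groupby("patient_id")
fills["prev_end"] = grp["supply_end"].shift()
fills["prev_drug"] = grp["drug"].shift()
fills["gap_days"] = (fills["fill_date"] - fills["prev_end"]).dt.days

# Discontinuation: >= 30 days with no anticoagulant supply before the next fill
fills["discontinued_before"] = fills["gap_days"] >= 30
# Switch: the next fill is a different OAC than the previous one
fills["switched"] = fills["prev_drug"].notna() & (fills["drug"] != fills["prev_drug"])

print(fills[["patient_id", "drug", "fill_date", "gap_days",
             "discontinued_before", "switched"]])
```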

RESULTS: A total of 203 378 individuals with Medicare (mean [SD] age, 76.9 [7.6] years; 122 554 women [60.3%]) and 95 231 with commercial insurance (mean [SD] age, 57.6 [15.8] years; 47 139 women [49.5%]) were included (N = 298 609). Warfarin was the most frequent OAC prescribed (163 044 [54.6%]), followed by rivaroxaban (66 882 [22.3%]) and apixaban (65 997 [22.1%]). The proportion of patients initiating DOACs increased from 0% in 2010 to 86.8% (22 420 of 25 817) in 2019 for patients with Medicare and 92.1% (4012 of 4357) in 2020 for commercially insured patients. Patients with chronic kidney disease were more likely to initiate warfarin (35 561 [11.9%]) or apixaban (16 294 [5.5%]) than rivaroxaban (10 136 [3.4%]), and those with a history of bleeding were more likely to initiate apixaban (5424 [1.8%]) than rivaroxaban (3007 [1.0%]). Overall, patients received persistent OAC treatment for approximately 6 months (Medicare: median, 175 days [IQR, 76-327 days]; commercial insurance: median, 168 days [IQR, 83-279 days]). A total of 33 011 patients (11.1%) switched anticoagulant therapy within a year. Switching to another anticoagulant was preceded most frequently by codes for a VTE diagnostic procedure (27.2% of all switchers [8983 of 33 011]).

CONCLUSIONS AND RELEVANCE: This cohort study using data from 2 US health insurance databases suggests that most patients with VTE continued oral anticoagulant treatment for approximately 6 months. Clinical reasons for modifying anticoagulant therapy were identified in one-third of patients. Identifying reasons for treatment modification is crucial for generating valid evidence on drug safety and effectiveness.

PMID:36947039 | DOI:10.1001/jamanetworkopen.2023.4059

Prevalence of Myopia in Children Before, During, and After COVID-19 Restrictions in Hong Kong

JAMA Netw Open. 2023 Mar 1;6(3):e234080. doi: 10.1001/jamanetworkopen.2023.4080.

ABSTRACT

IMPORTANCE: Childhood myopia increased during the COVID-19 pandemic. Limited evidence exists about whether myopia development was reversed or worsened after the lockdown.

OBJECTIVE: To determine the prevalence of myopia and its associated factors before, during, and after COVID-19 restrictions.

DESIGN, SETTING, AND PARTICIPANTS: This population-based, repeated cross-sectional study evaluated children aged 6 to 8 years from the Hong Kong Children Eye Study between 2015 and 2021 in 3 cohorts: before COVID-19 (2015-2019), during COVID-19 restrictions (2020), and after COVID-19 restrictions were lifted (2021).

EXPOSURES: All the children received ocular examinations, including cycloplegic autorefraction and axial length. Data about the children’s lifestyle, including time spent outdoors, near-work time, and screen time, were collected from a standardized questionnaire.

MAIN OUTCOMES AND MEASURES: The main outcomes were the prevalence of myopia, mean spherical equivalent refraction, axial length, changes in lifestyle, and the associated factors over 7 years. Data were analyzed using descriptive statistics, logistic regression, and generalized estimating equations.
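
As a rough illustration of the kind of model named above (not the authors' analysis), a logistic model fit by generalized estimating equations with clustering can be sketched in Python with statsmodels; the variable names, clustering unit, and simulated data below are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated child-level data; "covid_period" marks the pandemic-era survey waves
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "myopia": rng.integers(0, 2, n),          # 1 = myopic (outcome)
    "age": rng.uniform(6, 8, n),
    "male": rng.integers(0, 2, n),
    "parental_myopia": rng.integers(0, 2, n),
    "covid_period": rng.integers(0, 2, n),
    "school_id": rng.integers(0, 25, n),      # hypothetical clustering unit
})

# GEE logistic model with an exchangeable working correlation within clusters
model = sm.GEE.from_formula(
    "myopia ~ covid_period + age + male + parental_myopia",
    groups="school_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()

# Odds ratios with 95% CIs (exponentiated coefficients)
or_table = np.exp(pd.concat([res.params, res.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```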

RESULTS: Of 20 527 children (mean [SD] age, 7.33 [0.89] years; 52.8% boys and 47.2% girls), myopia prevalence was stable from 2015 to 2019 (23.5%-24.9%; P = .90) but increased to 28.8% (P < .001) in 2020 and 36.2% (P < .001) in 2021. The mean (SD) time spent outdoors was much lower in 2020 (0.85 [0.53] h/d; P < .001) and 2021 (1.26 [0.48] h/d; P < .001) compared with pre-COVID-19 levels (1.40 [0.47]-1.46 [0.65] h/d). The trend was reversed for total near-work time and screen time. Higher myopia prevalence was associated with the COVID-19 pandemic (odds ratio [OR], 1.40; 95% CI, 1.28-1.54; P < .001), younger age (OR, 1.84; 95% CI, 1.76-1.93; P < .001), male sex (OR, 1.11; 95% CI, 1.03-1.21; P = .007), lower family income (OR, 1.05; 95% CI, 1.00-1.09; P = .04), and parental myopia (OR, 1.61; 95% CI, 1.52-1.70; P < .001). During the pandemic, mean (SD) near-work and screen times in children from lower-income families were 5.16 (2.05) h/d and 3.44 (1.97) h/d, respectively, higher than in children from higher-income families (4.83 [1.85] and 2.90 [1.61] h/d, respectively).

CONCLUSIONS AND RELEVANCE: The findings of this cross-sectional study revealed that after COVID-19 restrictions were lifted in Hong Kong, myopia prevalence among children was higher than before the pandemic, and lifestyle did not return to pre-COVID-19 levels. Younger children and those from low-income families were at a higher risk of myopia development during the pandemic, suggesting that collective efforts for myopia control should be advocated for these groups.

PMID:36947037 | DOI:10.1001/jamanetworkopen.2023.4080

Organ Transplants From Deceased Donors With Primary Brain Tumors and Risk of Cancer Transmission

JAMA Surg. 2023 Mar 22. doi: 10.1001/jamasurg.2022.8419. Online ahead of print.

ABSTRACT

IMPORTANCE: Cancer transmission is a known risk for recipients of organ transplants. Many people wait a long time for a suitable transplant; some never receive one. Although patients with brain tumors may donate their organs, opinions vary on the risks involved.

OBJECTIVE: To determine the risk of cancer transmission associated with organ transplants from deceased donors with primary brain tumors. Key secondary objectives were to investigate the association that donor brain tumors have with organ usage and posttransplant survival.

DESIGN, SETTING, AND PARTICIPANTS: This was a cohort study in England and Scotland, conducted from January 1, 2000, to December 31, 2016, with follow-up to December 31, 2020. This study used linked data on deceased donors and solid organ transplant recipients with valid national patient identifier numbers from the UK Transplant Registry, the National Cancer Registration and Analysis Service (England), and the Scottish Cancer Registry. For secondary analyses, comparators were matched on factors that may influence the likelihood of organ usage or transplant failure. Statistical analysis of study data took place from October 1, 2021, to May 31, 2022.

EXPOSURES: A history of primary brain tumor in the organ donor, identified from all 3 data sources using disease codes.

MAIN OUTCOMES AND MEASURES: Transmission of brain tumor from the organ donor into the transplant recipient. Secondary outcomes were organ utilization (ie, transplant of an offered organ) and survival of kidney, liver, heart, and lung transplants and their recipients. Key covariates in donors with brain tumors were tumor grade and treatment history.

RESULTS: This study included a total of 282 donors (median [IQR] age, 42 [33-54] years; 154 females [55%]) with primary brain tumors and 887 transplants from them, 778 (88%) of which were analyzed for the primary outcome. There were 262 transplants from donors with high-grade tumors and 494 from donors with prior neurosurgical intervention or radiotherapy. Median (IQR) recipient age was 48 (35-58) years, and 476 (61%) were male. Among 83 posttransplant malignancies (excluding nonmelanoma skin cancer) that occurred over a median (IQR) of 6 (3-9) years in 79 recipients of transplants from donors with brain tumors, none were of a histological type matching the donor brain tumor. Transplant survival was equivalent to that of matched controls. Kidney, liver, and lung utilization were lower in donors with high-grade brain tumors compared with matched controls.

CONCLUSIONS AND RELEVANCE: Results of this cohort study suggest that the risk of cancer transmission in transplants from deceased donors with primary brain tumors was lower than previously thought, even for donors considered to be at higher risk. Long-term transplant outcomes are favorable. These results suggest that it may be possible to safely expand organ usage from this donor group.

PMID:36947028 | DOI:10.1001/jamasurg.2022.8419

Distinct health care use patterns of patients with chronic gastrointestinal diseases

Am J Manag Care. 2023 Mar 1;29(3):e71-e78. doi: 10.37765/ajmc.2023.89332.

ABSTRACT

OBJECTIVES: Patients with complex chronic conditions have varying multidisciplinary care needs and utilization patterns, which limit the effectiveness of initiatives designed to improve continuity of care (COC) and reduce utilization. Our objective was to categorize patients with complex chronic conditions into distinct groups by pattern of outpatient care use and COC to tailor interventions.

STUDY DESIGN: Observational cohort study from 2014 to 2015.

METHODS: We identified patients whose 1-year hospitalization risk was in at least the 90th percentile in 2014 who had a chronic gastrointestinal disease (cirrhosis, inflammatory bowel disease, chronic pancreatitis) as case examples of complex chronic disease. We described frequency of office visits, number of outpatient providers, and 2 COC measures (usual provider of care, Bice-Boxerman COC indices) over 12 months. We used latent profile analysis, a statistical method for identifying distinct subgroups, to categorize patients based on overall, primary care, gastroenterology, and mental health continuity patterns.
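
For readers unfamiliar with the two continuity measures named above, the following small Python sketch (illustrative only, not the study code) computes the usual provider of care index and the Bice-Boxerman COC index from a list of visit-level provider identifiers.

```python
from collections import Counter

def upc_index(provider_ids):
    """Usual provider of care: share of visits made to the most-seen provider."""
    counts = Counter(provider_ids)
    n_total = sum(counts.values())
    return max(counts.values()) / n_total

def bice_boxerman_coc(provider_ids):
    """Bice-Boxerman COC index: (sum(n_j^2) - N) / (N * (N - 1)),
    where n_j is the number of visits to provider j and N the total number of visits.
    Ranges from 0 (every visit to a different provider) to 1 (single provider)."""
    counts = Counter(provider_ids)
    n_total = sum(counts.values())
    if n_total < 2:
        raise ValueError("COC is undefined for fewer than 2 visits")
    return (sum(c ** 2 for c in counts.values()) - n_total) / (n_total * (n_total - 1))

# Example: 6 visits split across a PCP, a gastroenterologist, and a psychiatrist
visits = ["pcp", "pcp", "pcp", "gi", "gi", "psych"]
print(upc_index(visits))          # 0.5
print(bice_boxerman_coc(visits))  # (9 + 4 + 1 - 6) / (6 * 5) = 8/30 ~= 0.267
```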

RESULTS: The 26,751 veterans in the cohort had a mean (SD) of 13.3 (8.6) office visits and 7.2 (3.8) providers in 2014. Patients were classified into 5 subgroups: (1) high gastroenterology-specific COC with mental health use; (2) high gastroenterology-specific COC without mental health use; (3) high overall utilization with mental health use; (4) low overall COC with mental health use; and (5) low overall COC without mental health use. These groups varied in their sociodemographic characteristics and risk for hospitalization, emergency department use, and mortality.

CONCLUSIONS: Patients at high risk for health care utilization with specialty care needs can be grouped by varying propensity for health care continuity patterns.

PMID:36947019 | DOI:10.37765/ajmc.2023.89332

Real-World Evaluation of Disease Progression After CDK 4/6 Inhibitor Therapy in Patients With Hormone Receptor-Positive Metastatic Breast Cancer

Oncologist. 2023 Mar 22:oyad035. doi: 10.1093/oncolo/oyad035. Online ahead of print.

ABSTRACT

BACKGROUND: Cyclin-dependent kinase 4/6 inhibitors (CDKi) have changed the landscape for treatment of patients with hormone receptor-positive, human epidermal growth factor receptor 2-negative (HR+/HER2-) metastatic breast cancer (MBC). However, next-line treatment strategies after CDKi progression are not yet optimized. We report here the impact of clinical and genomic factors on post-CDKi outcomes in a single-institution cohort of patients with HR+/HER2- MBC.

METHODS: We retrospectively reviewed the medical records of patients with HR+/HER2- MBC who received a CDKi between April 1, 2014, and December 1, 2019, at our institution. Data were summarized using descriptive statistics, the Kaplan-Meier method, and regression models.
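
As a generic sketch of the Kaplan-Meier step (not the authors' code), median progression-free survival can be estimated from durations and event indicators, for example with the lifelines package; the data below are invented.

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Hypothetical PFS data: months on CDKi until progression (event=1) or censoring (event=0)
durations = np.array([2.1, 4.5, 6.0, 6.2, 7.1, 9.8, 11.3, 12.0, 15.4, 18.2])
events    = np.array([1,   1,   1,   0,   1,   1,   0,    1,    0,    1])

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="CDKi cohort")

print(kmf.median_survival_time_)   # median PFS estimate
print(kmf.survival_function_)      # full Kaplan-Meier curve
# kmf.plot_survival_function()     # optional: plot the curve
```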

RESULTS: We identified 140 patients with HR+/HER2- MBC who received a CDKi. Eighty percent of patients discontinued treatment due to disease progression, with a median progression-free survival (PFS) of 6.0 months (95% CI, 5.0-7.1), whereas those who discontinued CDKi for other reasons had a PFS of 11.3 months (95% CI, 4.6-19.4) (hazard ratio [HR], 2.53; 95% CI, 1.50-4.26; P = .001). The 6-month cumulative incidence of post-CDKi progression or death was 51% for the 112 patients who progressed on CDKi. Patients harboring PTEN mutations before CDKi treatment had poorer clinical outcomes compared with those with wild-type PTEN.

CONCLUSION: This study highlights post-CDKi outcomes and the need for further molecular characterization and novel therapies to improve treatments for patients with HR+/HER2- MBC.

PMID:36946994 | DOI:10.1093/oncolo/oyad035

Application of Retrocolic Approach with Uncinate Process Priority in Laparoscopic Pancreaticoduodenectomy

J Laparoendosc Adv Surg Tech A. 2023 Mar 22. doi: 10.1089/lap.2022.0491. Online ahead of print.

ABSTRACT

Background: Pancreaticoduodenectomy (PD) is a complex operative procedure that remains the primary curative treatment for pancreatic, distal bile duct, and periampullary cancers. In recent years, with the continuous development of laparoscopic technology and equipment, laparoscopic pancreaticoduodenectomy (LPD) has been gradually adopted in many high-volume surgical centers. However, LPD remains challenging even for experienced pancreatic surgeons, and as surgical experience accumulates, different surgical approaches continue to be discussed.

Methods: We retrospectively analyzed the clinical data of 323 patients who underwent LPD at a single institution. Of these, 200 patients underwent the retrocolic approach and 123 the traditional approach. We analyzed perioperative data and compared survival time for patients with pancreatic cancer in the two groups.

Results: Compared with the traditional approach, the retrocolic approach with uncinate process priority had a shorter operative time (94.25 ± 6.46 minutes versus 116.43 ± 4.78 minutes, P = .009) and less intraoperative blood loss (80 mL versus 150 mL, P = .562). However, there were no statistically significant differences between the two groups in the incidence of postoperative complications (Clavien-Dindo [CD] grade ≥III) (65 [32.5%] versus 45 [36.58%], P = .871), R0 resection rate (41 versus 38, P = .826), or number of lymph nodes harvested (16.64 ± 5.93 versus 15.37 ± 4.65, P = .785). The median survival time of patients with pancreatic cancer was longer in the retrocolic approach group than in the traditional approach group (30.34 months versus 28.54 months), but the difference was not statistically significant (P > .05).

Conclusion: The retrocolic approach with uncinate process priority is a feasible method for pancreatic cancer that may reduce operative time and intraoperative bleeding without increasing the incidence of postoperative complications, and it can be applied in larger patient populations.

PMID:36946976 | DOI:10.1089/lap.2022.0491

Multi-mode fiber-based speckle contrast optical spectroscopy: analysis of speckle statistics

Opt Lett. 2023 Mar 15;48(6):1427-1430. doi: 10.1364/OL.478956.

ABSTRACT

Speckle contrast optical spectroscopy/tomography (SCOS/T) provides a real-time, non-invasive, and cost-efficient optical imaging approach to mapping cerebral blood flow. By measuring many speckles (n >> 10), SCOS/T has an increased signal-to-noise ratio relative to diffuse correlation spectroscopy, which measures one or a few speckles. However, current free-space SCOS/T designs are not ideal for large field-of-view imaging in humans because the curved head contour cannot be readily imaged with a single flat sensor and hair obstructs optical access. Herein, we evaluate the feasibility of cost-efficient multi-mode fiber (MMF) bundles for use in SCOS/T systems. One challenge with speckle contrast measurements is the potential for confounding noise sources (e.g., shot noise, readout noise), which contribute to the measured standard deviation and corrupt the speckle contrast estimate that is central to SCOS/T systems. However, for true speckle measurements, the histogram of pixel intensities from light interference follows a non-Gaussian distribution, specifically a gamma distribution with non-zero skew, whereas most noise sources have pixel intensity distributions that are Gaussian. By evaluating speckle data from static and dynamic targets imaged through an MMF, we use statistical analysis of pixel intensity histograms to assess whether the statistical properties of the speckles are retained. We show that flow-based speckle can be distinguished from static speckle and from sources of system noise through measures of skew in the pixel intensity histograms. Finally, we illustrate in humans that MMF bundles relay blood flow information.
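
To make the histogram-based reasoning concrete, the following Python sketch (illustrative only, with simulated frames) computes the quantities the abstract relies on: the speckle contrast (standard deviation divided by mean intensity), the skewness of the pixel intensity histogram, and a gamma-distribution fit. A positively skewed, gamma-like histogram is consistent with speckle, whereas Gaussian readout noise yields skew near zero.

```python
import numpy as np
from scipy import stats

def speckle_stats(frame):
    """Speckle contrast, skewness, and gamma fit of a pixel-intensity frame."""
    pix = np.asarray(frame, dtype=float).ravel()
    contrast = pix.std() / pix.mean()                 # speckle contrast K = sigma / mean
    skew = stats.skew(pix)                            # ~0 for Gaussian noise, >0 for speckle
    shape, loc, scale = stats.gamma.fit(pix, floc=0)  # gamma fit with location fixed at 0
    return contrast, skew, shape, scale

rng = np.random.default_rng(1)

# Toy "speckle" frame: gamma-distributed intensities (positively skewed)
speckle_frame = rng.gamma(shape=4.0, scale=25.0, size=(128, 128))
# Toy "noise-only" frame: Gaussian readout noise around a constant offset
noise_frame = rng.normal(loc=100.0, scale=10.0, size=(128, 128))

print(speckle_stats(speckle_frame))  # skewness clearly > 0
print(speckle_stats(noise_frame))    # skewness near 0
```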

PMID:36946944 | DOI:10.1364/OL.478956

Effect of Squeeze, Cough, and Strain on Dynamic Urethral Function in Nulligravid Asymptomatic Women: A Cross-Sectional Cohort Study

Urogynecology (Phila). 2023 Mar 13. doi: 10.1097/SPV.0000000000001345. Online ahead of print.

ABSTRACT

IMPORTANCE: In the past, urethral shape, mobility, and urodynamics have been used to retrospectively demonstrate correlations with stress urinary incontinence. Our previous work has shown a relationship between urethral function and shape in symptomatic women.

OBJECTIVE: This study aimed to characterize the effect of pelvic floor squeeze and strain maneuvers on urethral shapes and pressure in a cohort of patients without pelvic floor disorders.

STUDY DESIGN: In this cross-sectional study, volunteers underwent a dynamic pelvic floor ultrasound examination and a modified urodynamic study. Urethral length, thickness, and proximal and distal swing angles were measured at rest, squeeze, and strain. The midsagittal urethral walls were traced so that statistical shape modeling could be performed. Means and standard deviations of imaging and urodynamic measures were calculated.
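
As a generic illustration of statistical shape modeling (not the authors' pipeline, which would also include shape alignment such as Procrustes analysis), traced contours can be analyzed by principal component analysis of landmark coordinates; the landmark counts and data below are invented.

```python
import numpy as np

# Hypothetical landmark matrix: one row per subject/maneuver,
# columns are flattened (x, y) coordinates of traced urethral-wall points.
rng = np.random.default_rng(2)
n_shapes, n_landmarks = 57, 40            # e.g., 19 subjects x 3 maneuvers
X = rng.normal(size=(n_shapes, 2 * n_landmarks))

mean_shape = X.mean(axis=0)
Xc = X - mean_shape

# PCA of the shape covariance via singular value decomposition
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_var = s**2 / np.sum(s**2)

# Fraction of total shape variance captured by the first mode
print(f"Mode 1 explains {explained_var[0]:.1%} of shape variance")

# Scores of each shape along the first two modes (used to compare maneuvers)
scores = Xc @ Vt[:2].T
print(scores.shape)   # (57, 2)
```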

RESULTS: Data from 19 participants were analyzed. On average, during squeeze compared with rest, urethral length increased by 6%, thickness decreased by 42% (distal, P < 0.001) and 10% (middle), and urethral pressure increased by 14%. Opposite shape changes (length decreased by 10% [P = 0.001]; thickness increased by 57% [distal, P < 0.001] and 20% [middle, P < 0.001]) and increased urethral mobility were observed during strain, along with larger pressure increases (29%, P < 0.001). Fifty-one percent of the total shape variance described the differences between maneuvers. These differences were statistically significant (P < 0.001 for these comparisons; all others, P > 0.05).

CONCLUSIONS: Dynamic ultrasound and urodynamics allow for the establishment of baseline ranges in urethral metrics (2-dimensional measures, shape, and pressure) and how they are altered during maneuvers. These data can allow for a more objective identification of incontinence via ultrasound and urodynamic testing.

PMID:36946905 | DOI:10.1097/SPV.0000000000001345

Left Atrial Appendage Volume Predicts Atrial Fibrillation Recurrence after Radiofrequency Catheter Ablation: A Meta-Analysis

Arq Bras Cardiol. 2023 Mar;120(3):e20220471. doi: 10.36660/abc.20220471.

ABSTRACT

BACKGROUND: The influence of left atrial appendage volume (LAAV) on the recurrence of atrial fibrillation (AF) following radiofrequency catheter ablation remains unclear.

OBJECTIVES: We performed a meta-analysis to assess whether LAAV is an independent predictor of AF recurrence following radiofrequency catheter ablation.

METHODS: The PubMed and Cochrane Library databases were searched up to March 2022 to identify publications evaluating LAAV in association with AF recurrence after radiofrequency catheter ablation. Seven studies fulfilling the specified criteria of our analysis were identified. We used the Newcastle-Ottawa Scale to evaluate the quality of the studies. Pooled effects were estimated as standardized mean differences (SMDs) or hazard ratios (HRs) with 95% confidence intervals (CIs). P values < 0.05 were considered statistically significant.
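
For context, pooling study-level SMDs of this kind is commonly done with a DerSimonian-Laird random-effects model; the Python sketch below is a generic illustration with invented study data, not the authors' code.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study effects (e.g., SMDs) with DerSimonian-Laird tau^2."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2, i2

# Hypothetical SMDs and their variances from 6 studies
smd = [-0.55, -0.80, -0.40, -0.70, -0.35, -0.95]
var = [0.04, 0.06, 0.05, 0.08, 0.03, 0.07]
pooled, ci, tau2, i2 = dersimonian_laird(smd, var)
print(f"Pooled SMD {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), I^2 = {i2:.0f}%")
```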

RESULTS: A total of 1017 patients from 7 cohort studies with a mean follow-up of 16.3 months were included in the meta-analysis. Data from 6 studies (943 subjects) comparing LAAV showed that baseline LAAV was significantly higher in patients with AF recurrence than in those without recurrence (SMD: -0.63; 95% CI: -0.89 to -0.37; all P values < 0.05; I2 = 62.6%). Moreover, higher LAAV was independently associated with a significantly higher risk of AF recurrence after radiofrequency catheter ablation (HR: 1.10; 95% CI: 1.02 to 1.18).

CONCLUSIONS: The meta-analysis showed that there is a significant correlation between LAAV and AF recurrence after radiofrequency catheter ablation, and the role of LAAV in AF patients should not be ignored in clinical practice.

PMID:36946857 | DOI:10.36660/abc.20220471

Food and Nutrition Surveillance System (SISVAN) coverage, nutritional status of older adults and its relationship with social inequalities in Brazil, 2008-2019: an ecological time-series study

Epidemiol Serv Saude. 2023 Mar 20;32(1):e2022595. doi: 10.1590/S2237-96222023000100003. Print 2023.

ABSTRACT

OBJECTIVE: to analyze the temporal trend of Food and Nutrition Surveillance System (Sistema de Vigilância Alimentar e Nutricional – SISVAN) coverage and the nutritional status of older adults, and its correlation with indicators of social inequality in Brazil between 2008-2019.

METHODS: this was an ecological study using records from SISVAN for the population aged 60 years and older; the temporal trend of coverage and the correlation between indicators of social inequality and the rate of change in nutritional status were analyzed; the slope index of inequality and the concentration index were used to measure absolute and relative inequalities.
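
As a generic illustration of the two inequality measures named above (not the authors' code), the slope index of inequality and the concentration index can be computed from unit-level data as sketched below; the variables and data are simulated, and real analyses of grouped data would also weight by population size.

```python
import numpy as np

def concentration_index(outcome, rank_var):
    """Concentration index: C = 2 * cov(y, r) / mean(y),
    where r is the fractional rank of the socioeconomic variable."""
    y = np.asarray(outcome, dtype=float)
    order = np.argsort(rank_var)
    r = (np.arange(len(y)) + 0.5) / len(y)      # fractional rank, lowest to highest
    y_ranked = y[order]
    return 2.0 * np.cov(y_ranked, r, bias=True)[0, 1] / y_ranked.mean()

def slope_index_of_inequality(outcome, rank_var):
    """SII: slope from regressing the outcome on the fractional socioeconomic rank."""
    y = np.asarray(outcome, dtype=float)
    order = np.argsort(rank_var)
    r = (np.arange(len(y)) + 0.5) / len(y)
    slope, intercept = np.polyfit(r, y[order], 1)
    return slope

# Hypothetical unit-level data: overweight prevalence (%) and an HDI-like ranking variable
rng = np.random.default_rng(3)
hdi = rng.uniform(0.5, 0.85, 200)
overweight = 40 - 10 * hdi + rng.normal(0, 3, 200)   # built-in inverse gradient

print(concentration_index(overweight, hdi))
print(slope_index_of_inequality(overweight, hdi))
```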

RESULTS: 11,587,933 records were identified; national coverage increased from 0.1% (2008) to 2.9% (2019), with a statistically significant upward trend; a moderate inverse correlation was found between the annual increment rate of overweight and both the human development index and gross domestic product per capita.

CONCLUSION: there was an increasing trend in SISVAN coverage; the increase in overweight was associated with social inequality.

PMID:36946834 | DOI:10.1590/S2237-96222023000100003