Stud Health Technol Inform. 2026 Feb 12;334:73-77. doi: 10.3233/SHTI260019.
ABSTRACT
This environmental scan examined commercially available AI clinical decision support solutions (AI-CDSS) across three domains: knowledge base, AI methodology, and privacy. Over half of vendors disclosed some information about their knowledge base, yet few demonstrated rigorous appraisal or alignment with Quality Standards or other evidence-based guidelines. Transparency on AI methods was limited: most vendors cited proprietary algorithms but rarely described their training data. Privacy information was more commonly reported but often high-level, with limited detail on compliance, storage location, or restrictions on secondary use. These gaps reveal the obstacles facing decision makers: without standardized, transparent information, organizations and governments cannot reliably evaluate AI-CDSS or provide the support clinicians need for responsible and informed implementation.
PMID:41685476 | DOI:10.3233/SHTI260019