
Development of a Contextualized, Research-Based Flemish Assessment Framework for Digital Care, Assistance, and Support: Delphi Study

JMIR Form Res. 2026 Apr 15;10:e88512. doi: 10.2196/88512.

ABSTRACT

BACKGROUND: The rapid evolution of digital technologies has transformed health, mental health, and social care, offering new modalities of digital care, assistance, and support through web-based platforms, mobile apps, extended reality, wearables, and artificial intelligence systems. Despite this proliferation, there is little consensus on what constitutes “high-quality” digital care. Challenges persist regarding data security, interoperability, accessibility, sustainability, and professional competence, whereas existing standards and regulations provide fragmented guidance.

OBJECTIVE: This study aimed to develop a contextualized, consensus-based quality assessment framework for digital care, assistance, and support in Flanders, Belgium, integrating perspectives across technology, organizational processes, and professional competencies.

METHODS: The study used a multiphase design comprising (1) 10 expert interviews with Flemish government officials; (2) a narrative literature review of 303 peer-reviewed and gray literature sources; (3) a 3-round Delphi study with 50 experts across 5 domains (end users, facilitators, technology developers, deontology and ethics experts, and digital inclusion and media literacy experts); and (4) 4 complementary focus groups and 3 interviews with specialists in artificial intelligence, regulation, social work, mental health, and IT. The Delphi rounds gathered iterative feedback through open-ended elicitation, structured rating, and classification of quality criteria. Quantitative data were analyzed using descriptive statistics, whereas qualitative feedback was subjected to thematic analysis.

RESULTS: A total of 50 experts participated in round 1; 40 (80%) participated in round 2, and 27 (54%) in round 3. Round 1 generated 577 unique quality criteria, consolidated into 26 clusters organized under 3 pillars: technology, organization, and professional competencies. The relative importance across pillars was balanced (mean score 37.29, SD 12.38 for technology; 33.33, SD 10.39 for professional competencies; and 29.80, SD 10.45 for organization). Accessibility, reliability, and safety ranked highest for technology; vision, quality monitoring, and infrastructure ranked highest for organization; and support, digital competencies, and ethics ranked highest for professional competencies. The finalized framework included 112 criteria, of which 35 (31.3%) were designated as optional and 77 (68.8%) as minimum requirements. Focus groups and interviews validated the framework’s comprehensiveness and usability, emphasizing proportional implementation, user centrality, and alignment with European Union regulations. Stakeholders highlighted the need for tools, training, and governance mechanisms to ensure adoption and sustainability.

CONCLUSIONS: This study produced a codeveloped, context-sensitive quality assessment framework that balances technological robustness, organizational readiness, and professional competence in digital care, assistance, and support. The framework can serve both as a quality safeguard and a developmental road map. Accompanying self-assessment and governance tools enhance practical applicability. Implementation success will depend on governmental support, resource allocation, and structured feedback loops. Future research should pilot the framework in real-world settings, assess its impact, and establish mechanisms for continuous updates to maintain relevance in a rapidly evolving digital landscape.

PMID:41984529 | DOI:10.2196/88512

By Nevin Manimala
