
Restoring Engagement in Digital Self-Control Tools Using Nudge Reconfiguration Prompts: Quasi-Experimental Study

JMIR Form Res. 2026 Apr 28;10:e85349. doi: 10.2196/85349.

ABSTRACT

BACKGROUND: Digital self-control tools (DSCTs) have emerged as technological interventions to address excessive smartphone usage and promote digital well-being. However, these tools face persistent challenges with user attrition and sustained engagement, compromising their long-term effectiveness. Current literature lacks an understanding of how observable behavioral indicators, as opposed to self-reported measures, are associated with user engagement and readiness to change in DSCTs.

OBJECTIVE: This study addresses three research questions (RQs): (RQ1) whether prompting passive DSCT users to reconfigure nudges increases subsequent user-nudge interaction, (RQ2) how engagement evolves over time and what behavioral divergence emerges between accepting and rejecting users, and (RQ3) whether observable in-app behavioral indicators are more strongly associated with intervention acceptance than traditional self-reported measures.

METHODS: We conducted a quasi-experimental study (N=252) targeting users who had disabled nudges. Participants were randomly assigned to receive a prompt to reconfigure their nudge settings during daily check-ins (n=138, experimental group) or to a control condition (n=114, no intervention). The experimental group was further classified into acceptance and rejection subgroups based on their response to the intervention. Data collection included DSCT configuration logs, usage-triggered nudge logs, and self-reported questionnaire responses. We analyzed user-nudge interaction ratios using difference-in-differences estimates with permutation tests (RQ1); compared nudge configuration parameters and manual app blocking between subgroups using independent-samples t tests with Cohen d (RQ2); and compared behavioral indicators against self-reported measures using t tests and chi-square tests (RQ3).
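The difference-in-differences analysis with a permutation test (RQ1) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, the per-user aggregation of interaction ratios, and the two-sided label-shuffling scheme are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def did_permutation_test(pre_treat, post_treat, pre_ctrl, post_ctrl, n_perm=10_000):
    """Difference-in-differences estimate with a permutation p-value.

    Each argument is a per-user array of mean interaction ratios in the
    pre- or post-intervention window. The null distribution is built by
    shuffling group labels while keeping each user's pre/post pair intact.
    """
    def did(pt, qt, pc, qc):
        # (post - pre) change in the treated group minus the same change
        # in the control group.
        return (qt.mean() - pt.mean()) - (qc.mean() - pc.mean())

    observed = did(pre_treat, post_treat, pre_ctrl, post_ctrl)

    # Stack per-user (pre, post) pairs, then permute group membership.
    pairs = np.column_stack([
        np.concatenate([pre_treat, pre_ctrl]),
        np.concatenate([post_treat, post_ctrl]),
    ])
    n_treat = len(pre_treat)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pairs))
        t, c = pairs[idx[:n_treat]], pairs[idx[n_treat:]]
        if abs(did(t[:, 0], t[:, 1], c[:, 0], c[:, 1])) >= abs(observed):
            count += 1
    # Add-one correction so the p-value is never exactly zero.
    return observed, (count + 1) / (n_perm + 1)
```

Shuffling group labels rather than assuming a parametric null is what makes the permutation test robust to the skewed, bounded distributions typical of interaction-ratio data.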

RESULTS: Of the experimental participants, 46% (63/138) accepted the nudge reconfiguration prompt. Post intervention, the acceptance subgroup’s 7-day average user-nudge interaction ratio increased from 29.7% to 58.5% (peak of 65% on day 1), a significant increase even after controlling for the temporal decline observed in the control group (difference-in-differences=+36.3 percentage points, P<.001). The rejection subgroup’s decline was not significantly different from the control group’s decline (P=.82). The acceptance subgroup showed preexisting behavioral indicators of higher readiness to change, including consecutive usage thresholds 21.53% shorter than the rejection subgroup’s (P=.03), with a directionally consistent but nonsignificant difference in cooldown length (+20.56%). Behavioral divergence in consecutive usage thresholds widened post intervention, with Cohen d increasing from -0.47 to -0.67 (P=.002). Acceptance subgroup participants demonstrated a significantly lower tendency to select leisure-oriented daily goals (15.6% vs 26.2%; chi-square P=.001, Cramer V=0.13). Self-reported measures of screen time goals and scrolling regret were not significantly associated with intervention acceptance (P>.10).
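The effect sizes reported above (Cohen d for the subgroup comparisons, Cramér V for the chi-square test) follow standard formulas; a minimal NumPy sketch, with hypothetical helper names:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d for independent samples, using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(
        ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
        / (na + nb - 2)
    )
    return (np.mean(a) - np.mean(b)) / pooled_sd

def cramers_v(table):
    """Cramér's V effect size for a contingency table of observed counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    # Expected counts under independence of rows and columns.
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()
    k = min(table.shape) - 1
    return np.sqrt(chi2 / (n * k))
```

For a 2x2 table (accept/reject vs leisure/non-leisure goal), Cramér V reduces to the phi coefficient; the reported V=0.13 would correspond to a small effect by conventional benchmarks.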

CONCLUSIONS: Observable in-app behavioral indicators, rather than self-reported measures, effectively differentiate intervention receptiveness. Study results suggest that effective DSCT design should incorporate adaptive strategies that recognize and respond to users’ readiness to change, as evidenced by their in-app behaviors, while preserving autonomy. Such systems are likely to outperform static interventions or designs that rely solely on self-reported preferences.

PMID:42048520 | DOI:10.2196/85349

By Nevin Manimala
