VerHealth: Vetting Medical Voice Applications through Policy Enforcement.

Updated: Sep 27

Shezan FH, Hu H, Wang G, Tian Y. VerHealth: Vetting Medical Voice Applications through Policy Enforcement. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2020;4(4): Article No. 153:1-21. https://doi.org/10.1145/3432233


Summary: This study introduces VerHealth, a system designed to evaluate health-related voice applications on platforms such as Amazon Alexa for compliance with privacy and safety policies. VerHealth combines static and dynamic analysis modules and uses machine learning to detect policy violations in application interactions. The study analyzed 813 health-related apps, sending over 855,000 queries and evaluating 863,000 responses. It found that 86.36% of the apps lacked disclaimers when offering medical information, and 30.23% stored user health data without permission. While the medical advice given was often factually correct, domain experts judged it to be lacking in relevance and thoroughness. The research identifies areas for improving the safety and quality of health-related voice apps.
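To make the kind of policy check concrete, the sketch below shows a minimal, rule-based version of the disclaimer check: flag a response that appears to give medical information but contains no disclaimer. This is an illustration only; the keyword lists and function names are hypothetical, and the actual VerHealth system relies on trained machine-learning classifiers rather than keyword matching.

```python
# Illustrative sketch only: a naive keyword-based check for medical
# disclaimers in a voice app's response text. All phrases and function
# names here are hypothetical, not taken from the paper.

DISCLAIMER_PHRASES = (
    "not a substitute for professional medical advice",
    "consult your doctor",
    "consult a physician",
    "for informational purposes only",
)

MEDICAL_INFO_HINTS = (
    "symptom", "diagnosis", "treatment", "dosage", "medication",
)


def gives_medical_info(response: str) -> bool:
    """Crude heuristic: does the response appear to offer medical information?"""
    text = response.lower()
    return any(hint in text for hint in MEDICAL_INFO_HINTS)


def has_disclaimer(response: str) -> bool:
    """Crude heuristic: does the response contain a recognizable disclaimer?"""
    text = response.lower()
    return any(phrase in text for phrase in DISCLAIMER_PHRASES)


def flag_missing_disclaimer(response: str) -> bool:
    """Flag responses that offer medical information without any disclaimer."""
    return gives_medical_info(response) and not has_disclaimer(response)


if __name__ == "__main__":
    example = "For a headache, a common treatment is ibuprofen every six hours."
    print(flag_missing_disclaimer(example))  # True: medical info, no disclaimer
```

In the study, checks like this were run at scale over hundreds of thousands of app responses, which is why the violation rates above are reported as percentages of apps rather than individual answers.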
