Data and privacy laws, such as the GDPR, require mobile apps that collect and process the personal data of citizens within their jurisdiction to have a legally compliant privacy policy. Because these apps are hosted on distribution platforms such as the Google Play Store and the App Store, the platform operators also require developers who submit a new app or update an existing one to be transparent about the privacy practices governing sensitive user data guarded by sensitive permissions such as calendar, camera, and microphone. To verify compliance with privacy regulators and app distribution platforms, app privacy policies and permissions are examined for consistency. However, little has been done to investigate GDPR completeness checking within the Android permission ecosystem. In this paper, we investigate design-time and runtime approaches to completeness checking of sensitive ('dangerous') Android permission policy declarations against the GDPR. Leveraging the MPP-270 annotated corpus, which describes permission declarations in application privacy policies, six natural language processing and language modelling algorithms are developed to measure permission completeness at runtime, while a proof-of-concept Unified Modeling Language (UML) class diagram tool is developed to generate GDPR-compliant permission policy declarations from UML diagrams at design time.
This paper makes a significant contribution by identifying appropriate permission policy declaration methodologies that a developer can use to target particular GDPR provisions, increasing GDPR compliance by 12% at runtime using BERT word embeddings, measuring GDPR compliance in permission policy sentences, and providing a UML-driven tool to generate compliant permission declarations.
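The abstract attributes the runtime compliance gain to BERT word embeddings but does not spell out the scoring step. The snippet below is a minimal sketch, under the assumption that completeness is measured as semantic similarity between a permission policy sentence and a GDPR-derived requirement sentence; the model name (bert-base-uncased), the example sentences, and the 0.7 threshold are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: scoring a permission policy sentence against a
# GDPR-derived requirement with mean-pooled BERT embeddings.
# Model, threshold, and sentences are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()


def embed(sentence: str) -> torch.Tensor:
    """Return a mean-pooled BERT embedding for one sentence."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape (1, tokens, 768)
    return hidden.mean(dim=1).squeeze(0)            # shape (768,)


policy_sentence = (
    "We access your device camera to let you scan documents, "
    "and the images are processed only with your consent."
)
gdpr_requirement = (
    "The data subject shall be informed of the purpose of processing, "
    "and processing shall be based on consent."
)

# Cosine similarity between the two sentence embeddings.
score = torch.nn.functional.cosine_similarity(
    embed(policy_sentence), embed(gdpr_requirement), dim=0
).item()

THRESHOLD = 0.7  # illustrative cut-off, not from the paper
print(f"similarity = {score:.3f}, complete = {score >= THRESHOLD}")
```

In practice, a runtime checker of this kind would compare each policy sentence against the GDPR provisions triggered by the dangerous permissions the app actually requests, and report provisions with no sufficiently similar sentence as missing declarations.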