Privacy Is Not an Afterthought: Towards a Holistic Privacy-Driven Software Development
Feb 28, 11-12, WWH 335
Sepideh Ghanavati, Department of Computer Science, University of Maine
The rapid rise of generative AI and mobile/IoT applications has made it crucial to ensure that AI models and software applications adhere to ethical guidelines and protect privacy. Despite recent advances in privacy and software engineering research, developers still face significant challenges, including the complexity and ambiguity of legal requirements, a communication gap between developers and legal experts, and a lack of privacy expertise in smaller companies, to name a few. Most research focuses primarily on detecting violations and non-compliance post-development rather than proactively mitigating and resolving them throughout the development lifecycle. In this talk, Dr. Ghanavati will present her team's approach to integrating privacy into software development by leveraging advances in large language models (LLMs) and privacy-aware software engineering techniques. She will first discuss their empirical findings regarding software and AI team members’ privacy and ethical expertise and challenges, and their approach to aligning technical implementations with ethical and legal standards. Next, she will describe their work to address some of these shortcomings and obstacles, with a particular focus on detecting, classifying, and localizing privacy behaviors in software documentation and source code at various stages of development, as well as techniques to translate these behaviors into natural language statements for privacy notices and/or privacy requirements (i.e., privacy stories).
Sepideh Ghanavati is an associate professor of Computer Science at the University of Maine and the director of the Privacy Engineering – Regulatory Compliance Lab (PERC_Lab). Her research interests lie at the intersection of human-centered privacy and security, software engineering (SE), and natural language processing (NLP). She uses research methods including deep learning, large language models, program analysis, and empirical human studies to help developers design privacy-preserving software and build trustworthy AI systems. She is the recipient of the NSF CAREER Award in 2023 and Google Faculty Research Awards in 2018 and 2021, and has more than 15 years of academic and industry experience in privacy and regulatory compliance.