PUPS: Practical Usable Privacy and Security Research Lab

PUPS is a group of researchers at the University of Alberta organized by Bailey Kacsmar. The group's work sits at the intersection of digital privacy, security, machine learning, and HCI.

Technical privacy systems define what is being protected, from whom, and under what conditions the protection holds. We focus on developing privacy-enhancing systems using human-centered design. That is, when developing technical privacy protocols, we take a wider view of privacy: we aim to build privacy-enhancing technologies informed by the end privacy goals of the people who can be affected by whether their data is used in such systems.

PUPS News

  • 🏆 September 2025, PUPS student Rabeya Bosri awarded the Alberta Graduate Excellence Scholarship (AGES)!
  • 🏆 May 2025, PUPS student Afari Darfoor won first place in the research poster competition at the Responsible AI track of the Canadian AI Conference!
  • 🏆 March 2025, Bailey Kacsmar (PI) recognized with the 2025 U of A Award for Outstanding Mentorship in Undergraduate Research & Creative Activities!
  • 🏆 January 2025, PUPS student Jialiang Yan recognized with an Honorable Mention for the 2025 CRA Outstanding Undergraduate Researcher Award!

People in PUPS

PhD Students

Afari Darfoor, Privacy and Bionic Limbs, co-supervised with Patrick Pilarski

Masters Students

Rabeya Bosri, Design tradeoffs in privacy and machine learning, (MSc)

Miriam Bakija, (MSc)

Samuel Feldman, (MSc)

Afrida Hossain, Software engineering in privacy and AI, (MSc)

Research Assistants

Castor Shem, Privacy Problems in Theory versus Media, (UGRA, 2025)

Alumni

Afari Darfoor, Identifying Privacy Threat Vectors in AI-Enhanced Upper-Limb Bionic Devices, (MSc, Fall 2025)

Gwen Delos Santos, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Sasha Dudiy, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Qiantong Guo, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Naone Kim, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Khoi Le, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Lina Saha, Privacy Problems in Theory versus Media, (UGRA, Summer 2025)

Alireza Hodae, Cryptography for Privacy, (Visiting researcher, Summer 2025)

Jialiang Yan, Privacy and Mobile Notices, (UGRA 2024, next stage: George Washington University PhD Student)

Farishta Kabir, (RA 2024)

What is PUPS?

What is practical usable privacy and security? Well, breaking it down a bit, it is made up of the following:

Practical privacy? On one side, practicality means that technical guarantees are enforced and that this can be done with "reasonable" time and resource requirements. A practical privacy system must also have sufficient utility: for instance, if a "privately trained" machine learning model does not achieve a certain level of success (e.g., at classification or generation), it cannot serve its purpose.
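As a generic illustration of this privacy–utility tension (not drawn from any specific PUPS project), consider a differentially private mean computed with the Laplace mechanism: a small privacy budget epsilon gives a stronger guarantee but a noisier, less useful answer. The function name, dataset, and parameters below are all hypothetical.

```python
import math
import random

def dp_mean(values, lower, upper, epsilon, rng=random):
    """Differentially private mean of `values` via the Laplace mechanism.

    Each value is clamped to [lower, upper], so replacing one record
    changes the sum by at most (upper - lower), and the mean of n
    records has sensitivity (upper - lower) / n.
    """
    n = len(values)
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / n
    scale = (upper - lower) / (n * epsilon)
    # Sample Laplace(0, scale) noise via the inverse-CDF transform.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_mean + noise

# Hypothetical dataset: 10,000 ages between 18 and 90.
rng = random.Random(0)
ages = [rng.randint(18, 90) for _ in range(10_000)]
exact = sum(ages) / len(ages)

# A looser budget (epsilon = 1.0) yields an answer close to the truth;
# a very strict budget (epsilon = 0.001) adds far more noise.
loose = dp_mean(ages, 18, 90, epsilon=1.0, rng=rng)
strict = dp_mean(ages, 18, 90, epsilon=0.001, rng=rng)
```

If the strict answer is too noisy for the task at hand, the "private" computation no longer serves its purpose, which is exactly the utility side of practicality described above.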

Usable privacy? The usability of a privacy system has several facets. It includes what users actually do and what they want to do, but it also includes the accessibility of the system. Privacy tools may require additional effort (from the various participating agents) compared to non-privacy-preserving tools, and therefore require clear motivation before entities, whether individuals or companies, will choose to use them. Thus, to effectively design privacy tools that users feel encouraged to use, it is necessary to study users' awareness, understanding, and motivations. While usability can include efficiency and practicality from a technical standpoint, private computation must also inspire trust and match the expectations of data subjects to ensure their continued consent to the use of their data in such computations.

Privacy and security? Privacy and security are not equivalent concepts, but in technical systems an insecure system can lead to privacy violations. Therefore, when designing technical systems for privacy, we must also consider their security.