Towards user-oriented privacy for recommender system data

A personalization-based approach to gender obfuscation for user profiles

Journal Article (2021)
Author(s)

M. Slokom (TU Delft - Multimedia Computing)

Alan Hanjalic (TU Delft - Intelligent Systems)

M.A. Larson (Radboud Universiteit Nijmegen, TU Delft - Multimedia Computing)

Multimedia Computing
Copyright
© 2021 M. Slokom, A. Hanjalic, M.A. Larson
DOI
https://doi.org/10.1016/j.ipm.2021.102722
Publication Year
2021
Language
English
Issue number
6
Volume number
58
Pages (from-to)
1-24
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In this paper, we propose a new privacy solution for the data used to train a recommender system, i.e., the user–item matrix. The user–item matrix contains implicit information, which can be inferred using a classifier, leading to potential privacy violations. Our solution, called Personalized Blurring (PerBlur), is a simple yet effective approach to adding and removing items from users' profiles in order to generate an obfuscated user–item matrix. The novelty of PerBlur lies in personalizing the choice of obfuscation items to each individual user profile. PerBlur is formulated within a user-oriented paradigm of recommender system data privacy that aims at making privacy solutions understandable, unobtrusive, and useful for the user. When obfuscated data is used for training, a recommender system algorithm is able to reach performance comparable to what it attains when trained on the original, unobfuscated data. At the same time, a classifier can no longer reliably use the obfuscated data to predict the gender of users, indicating that implicit gender information has been removed. In addition to introducing PerBlur, we make several key contributions. First, we propose an evaluation protocol that creates a fair environment in which to compare different obfuscation conditions. Second, we carry out experiments that show that gender obfuscation impacts the fairness and diversity of recommender system results. In sum, our work establishes that a simple, transparent approach to gender obfuscation can protect user privacy while at the same time improving recommendation results for users by maintaining fairness and enhancing diversity.
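To make the core idea concrete, the sketch below illustrates personalized gender obfuscation on a toy binary user–item matrix. It is a minimal reconstruction from the abstract alone, not the paper's actual PerBlur algorithm: item "indicativeness" is approximated here by the difference in per-gender consumption rates, each user's profile is blurred by adding its top items indicative of the opposite gender, and the paper's item-removal step and its recommender-based candidate selection are omitted. The function names and the scoring rule are illustrative assumptions.

```python
import numpy as np

def gender_indicativeness(R, gender):
    """Toy indicativeness score per item (an assumption, not the paper's measure).

    R:      binary user-item matrix, shape (n_users, n_items)
    gender: 0/1 label per user
    Returns scores where > 0 means the item is consumed relatively more
    by gender 1, and < 0 means relatively more by gender 0.
    """
    p1 = R[gender == 1].mean(axis=0)  # consumption rate among gender-1 users
    p0 = R[gender == 0].mean(axis=0)  # consumption rate among gender-0 users
    return p1 - p0

def perblur_sketch(R, gender, k=2):
    """Blur each profile by adding up to k unconsumed items that are
    indicative of the *opposite* gender, most indicative first."""
    scores = gender_indicativeness(R, gender)
    R_obf = R.copy()
    for u in range(R.shape[0]):
        # Flip the sign so high values point toward the opposite gender.
        opp = scores if gender[u] == 0 else -scores
        added = 0
        for i in np.argsort(-opp):  # candidates, most indicative first
            if added == k:
                break
            if R_obf[u, i] == 0 and opp[i] > 0:
                R_obf[u, i] = 1  # personalized addition to this profile
                added += 1
    return R_obf
```

Because the candidate items are ranked per user against that user's own profile, each user receives a different set of added items, which is the sense in which the obfuscation is personalized rather than applied uniformly across the matrix.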