Targeted advertising has been subject to many privacy complaints from both users and policy makers. Despite
this attention, users still have little understanding of what data the advertising platforms have about them and why they are shown particular ads. To address such concerns, Facebook recently introduced two transparency mechanisms: a “Why am I seeing this?” button that provides users with an explanation of why they were shown a particular ad (ad explanations), and an Ad Preferences Page that provides users with a list of attributes Facebook has inferred about them and how (data explanations).
In this paper, we investigate the level of transparency provided by these two mechanisms. We first define a number of key properties of explanations and then evaluate empirically whether Facebook’s explanations satisfy them. For our experiments, we develop a browser extension that collects the ads users receive every time they browse Facebook, their respective explanations, and the attributes listed on the Ad Preferences Page; we then run controlled experiments where we create our own ad campaigns and target the users who installed our extension. Our results show that ad explanations are often incomplete and sometimes misleading, while data explanations are often incomplete and vague. Taken together, our findings have significant implications for users, policy makers, and regulators as social media advertising services mature.
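As a rough illustration of the kind of data collection described above, the sketch below shows a hypothetical browser-extension content script (in TypeScript) that watches the feed for sponsored posts and forwards each one to the extension's background script. The SPONSORED_MARKER selector, the CollectedAd record shape, and the "ad-collected" message are illustrative assumptions; this is a minimal sketch, not the authors' actual AdAnalyst implementation.

```typescript
// Hypothetical content-script sketch for collecting sponsored posts and their
// explanations. Selectors and message names are illustrative assumptions only.

interface CollectedAd {
  advertiser: string;          // name shown on the sponsored post
  adText: string;              // visible text of the ad
  explanation: string | null;  // text from "Why am I seeing this?", if captured later
  collectedAt: string;         // ISO timestamp
}

// Assumed marker; the real feed DOM differs and changes frequently.
const SPONSORED_MARKER = '[aria-label="Sponsored"]';

function extractAd(post: Element): CollectedAd {
  return {
    advertiser: post.querySelector('a[role="link"]')?.textContent?.trim() ?? 'unknown',
    adText: (post.textContent ?? '').slice(0, 500),
    explanation: null, // filled in only if the user opens the explanation dialog
    collectedAt: new Date().toISOString(),
  };
}

// Watch the feed for newly inserted posts and pick out the sponsored ones.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of Array.from(mutation.addedNodes)) {
      if (!(node instanceof Element)) continue;
      if (node.querySelector(SPONSORED_MARKER)) {
        // Hand the record to the extension's background script for storage.
        chrome.runtime.sendMessage({ type: 'ad-collected', payload: extractAd(node) });
      }
    }
  }
});

observer.observe(document.body, { childList: true, subtree: true });
```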
The full paper is available here:
Investigating Ad Transparency Mechanisms in Social Media: A Case Study of Facebook’s Explanations
Bio:
Oana Goga has been a CNRS researcher at the LIG lab in Grenoble, France, since 2017. Oana received her Ph.D. from the University Pierre et Marie Curie (France) in 2014 and was a postdoc at the Max Planck Institute for Software Systems (Germany) before joining CNRS.
Oana’s research interests are in the security, privacy, and data mining aspects of social computing systems (online systems that provide services to people and that have at their core the data produced by them, e.g., Twitter, Amazon, Netflix). Her latest work is on bringing transparency to social media targeted advertising. Oana’s team (with the support of a Data Transparency Lab grant) has developed AdAnalyst (https://adanalyst.mpi-sws.org/), a tool that lets users know why they have been targeted with a particular ad on Facebook.