Algorithms have a significant impact on our lives, yet we often know little about the way they work. The Dutch Ministry of Justice and Security engaged the ILP Lab to perform a study on transparency with regard to automated decision making. Student researchers used a case study on dating apps to examine to what extent users desire transparency, and what their rights are in this respect.
The researchers conclude that several obligations under the GDPR require dating app publishers to be transparent about automated decision making. It is uncertain, though, whether there is an obligation to provide information about the logic and consequences of certain algorithms. They also found that many dating apps do not comply with these requirements. Finally, they conclude that many of the users they interviewed are unaware of how algorithms work in these apps, while some users would indeed like to know more.
The researchers provide a number of policy recommendations. First, they suggest raising awareness of algorithms and their potential risks. Second, they suggest improving compliance with transparency obligations under the GDPR. Finally, they recommend ensuring clarity on the scope of these obligations. The report was written by Meredith Hom, Noortje van Hoorn and Femke Schotman. It was presented to the Ministry in November 2021.