Service Providers Request Too Much Personal Data
Waag Futurelab is part of ACROSS, a project funded by the European Commission that researches digital identity, personal data, and privacy in the context of cross-border movement.
When a person moves across European borders, they must provide large amounts of personal data to service providers such as housing companies, banks, and public authorities. Essential service providers request so much personal data in order to establish trust – for example, trust that a person will consistently pay their rent on time, or trust that a person will not damage an apartment. However, this method of establishing trust is at odds with the EC’s approach to personal data privacy and data minimisation: Article 5(1)(c) of the GDPR and Article 4(1)(c) of Regulation (EU) 2018/1725 state that personal data must be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed".
According to European law, requests for personal information must be minimised and limited to what is strictly necessary. In practice, however, we find that service providers request data to establish trust, but in doing so, go far beyond the boundaries of what is strictly necessary and respectful of privacy.
Housing agencies and other service providers request too much data
Finding a new house is not easy for anyone. For people who are moving to a new country, like those we work with in ACROSS, accessing basic services can be a data privacy nightmare. For example, a housing application in Amsterdam requires:
- Copy of ID
- Statement from employer
- Payslips from the last 3 months
- Certified copy of employment history and wage data from the Employee Insurance Agency
- Copy of bank statement showing the last salary received
- Copy bank statement showing current rent/mortgage payments
- One bank statement of the bank account from which the rent will be collected
- Proof of basic registration (from the municipality)
- Recent landlord declaration from current landlord
- Copy of bank cards of all tenants
This means that in order to apply for a house (without any guarantee of being selected from among many applicants), a person is required to share their residency status, photo, date of birth, nationality, sex, full name, immigration number, employer, salary, banking details, and other personal financial information. It also requires applicants to notify their current landlord that they intend to leave. It is not clear how this information is used, how long it is held, or who has access to it.
Why is requesting too much data a problem?
There are various inherent and potential problems that result from service providers requesting too much data:
- Invasion of privacy – Europeans have a right to privacy under Article 8 of the European Convention on Human Rights.
- Legal risks – In gathering personal data, service providers become liable for potential violations of the GDPR and other data protection laws.
- Security risks – The more that data is shared, the more vulnerable it is to falling into the wrong hands, for example through leaks, hacks, and cyberattacks.
- Exclusion and profiling – Over-requesting of data excludes people who do not have access to required documentation, which can be especially problematic for people moving to a new country. Personal data can also (deliberately or inadvertently) profile and exclude people on the basis of factors like nationality, sex, age, and income.
- Coerced consent – People should have the freedom to decide when and with whom they share their data. Power imbalances often remove this choice for desperate housing seekers, who are compelled to share whatever data is requested.
Again, requesting large amounts of personal data is the means by which service providers establish trust. What counts as ‘necessary’ for this purpose is a matter of judgement, but there are ways to establish trust that do not require such invasive exposure of personal data. As explained below, digital credentials are a technically feasible way to minimise personal data exchange; however, these and other data minimisation practices have not yet been widely adopted by service providers, who lack the resources and incentives to implement them.
Attribute-based credentials may help, but cannot solve the problem
ACROSS, MGOV4EU, and other European research projects are exploring attribute-based credentials as a way to support data minimisation, privacy, and control over one’s own digital identity when information is shared with service providers. Attribute-based credentials are a digital way to share no more data about yourself than a specific scenario requires. For example, the DECODE project demonstrated how a group of people from a neighbourhood could issue a credential vouching for someone’s residency there. This credential could then be used to access an online neighbourhood platform (GebiedOnline) without any additional identification – not even the name of the credential holder. While technically feasible, attribute-based credentials are not yet widely implemented.
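To make this flow concrete, the sketch below shows a minimal issue/present/verify cycle for a single-attribute credential. It is a hypothetical illustration: real attribute-based credential schemes (such as Idemix or BBS+ signatures) rely on public-key cryptography and zero-knowledge proofs, whereas this sketch uses a shared-secret HMAC purely as a stand-in, and all names and values are invented.

```python
import hmac
import hashlib

# Hypothetical sketch of an attribute-based credential flow. A real scheme
# (e.g. Idemix, BBS+) would use public-key signatures and zero-knowledge
# proofs; the shared-secret HMAC below is only a stand-in for the data flow.

ISSUER_KEY = b"neighbourhood-group-secret"  # invented; held by the issuer

def issue_credential(attribute: str, value: str) -> dict:
    """The issuer signs exactly one attribute – no name, no date of birth."""
    payload = f"{attribute}={value}".encode()
    signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"attribute": attribute, "value": value, "signature": signature}

def verify_credential(credential: dict) -> bool:
    """The verifier (e.g. a neighbourhood platform) checks the signature;
    it learns only the single disclosed attribute."""
    payload = f"{credential['attribute']}={credential['value']}".encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

# The holder presents only this credential to gain access.
credential = issue_credential("resident_of", "NeighbourhoodX")
print(verify_credential(credential))  # True
```

The point of the sketch is what the verifier never sees: it receives one signed attribute and nothing else about the holder.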
We can imagine a scenario in which attribute-based credentials are applied to requesting and sharing data as part of a housing application. A trusted third party (like the government, an employer, or a previous landlord) issues a single credential certifying that ‘the holder of this credential is eligible to rent an apartment within x value range’. The person could then present this single credential to a housing agency to verify their rental eligibility, while keeping personal data about their identity, employment, finances, and more private. Another possibility is to use multiple credentials; for example, credentials issued by the government as a trusted third party stating that the holder is ‘at least x years old’ or ‘a legal resident’. Sharing multiple credentials in this way could offer some improvement over the status quo, but still carries risks of over-identification: requesting and sharing certain credentials can still expose personal information, or wrongfully exclude certain people from consideration.
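The rent-eligibility scenario above can be sketched in the same spirit: the issuer derives a yes/no predicate from private data it already holds and certifies only the outcome. Everything in this sketch is assumed for illustration – the three-times-rent affordability rule, the names, and the HMAC stand-in for a real signature scheme.

```python
import hmac
import hashlib

# Hypothetical sketch: a trusted third party (e.g. an employer) checks an
# eligibility predicate against private data and certifies only the result.
# The housing agency never sees the underlying salary.

ISSUER_KEY = b"employer-issuer-secret"  # invented stand-in for a signing key

def issue_eligibility(salary: int, max_rent: int):
    """The issuer evaluates the predicate privately, then signs the outcome."""
    if salary < 3 * max_rent:          # assumed affordability rule
        return None                    # predicate fails: no credential issued
    claim = f"eligible_to_rent_up_to={max_rent}".encode()
    sig = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "signature": sig}

def verify_eligibility(cred: dict) -> bool:
    """The housing agency verifies the claim; it learns eligibility only."""
    expected = hmac.new(ISSUER_KEY, cred["claim"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["signature"])

cred = issue_eligibility(salary=4500, max_rent=1400)
print(cred is not None and verify_eligibility(cred))  # True
```

The design choice worth noting is that the predicate is evaluated by the issuer, so the sensitive input (the salary) never leaves the party that already holds it.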
Attribute-based credentials and other technical solutions can facilitate interactions that minimise data sharing, but they do not directly address the underlying problem of trust. In our current context, service providers are more likely to trust (and prefer) multiple primary sources of documentation, like bank statements, over minimised credentials.
Who can address the problem of over-requesting data, and how?
Service providers, like housing agencies, have the primary responsibility and capability to address the problem because they are generally the party that decides to request too much personal data. Service providers likely over-request data for a number of reasons: perceived risk mitigation; lack of trust in single sources of information; the absence of enforcement; habit; or a desire to filter out applicants who lack the willingness or capability to complete an extensive and invasive application process. Operating in a competitive, high-demand, for-profit environment in which they have leverage over desperate clients like housing seekers, these service providers have little incentive to minimise data requests.
Policymakers and regulators therefore have a responsibility to incentivise housing agencies and other service providers to request less data. Policymakers can regulate what service providers may and may not request in specific situations. Existing legislation like the GDPR, DSA, and DMA can be better enforced. Government bodies can impose privacy-by-design requirements on the technology they fund, and can provide resources and support to help service providers comply with both the letter and the spirit of data protection regulations.
Citizens are not responsible for addressing the problem, but they are negatively affected by it. People who are moving (and especially those moving to a new country) are generally not in a position to refuse opportunities from a housing company, bank, or other essential service provider on the basis of data privacy – they have to ‘take what they can get,’ and there simply are not enough alternative options to do otherwise. It is possible that people moving to a new house might collectivise in some way to demand better from service providers – but in reality, a person in the midst of moving is likely far too busy to take on the role of a data rights advocate, and it seems unfair to place such a heavy political burden onto the general public. Citizens who are passionate about such issues might consider becoming early adopters and beta-testers of new technologies such as wallets, in order to help improve specific applications and raise awareness and demand for privacy-focused technology in general.
ACROSS and other publicly funded projects have an inherent mission to advocate for public interests. ACROSS (which helps people moving to a new country to identify the right service providers) could address the problem of over-requesting data via the technology that we build; for example, by filtering out service providers that do not comply with strict guidelines for requesting data. While this is already done to an extent (service providers must meet technical and legal requirements as defined in project deliverable D4.2 and other technical reports), it is not realistic to find a sufficient number of service providers that would meet our ideal expectations of data protection. Furthermore, ACROSS does not have the necessary market clout to encourage service providers to adopt better practices by excluding them. It is nonetheless worthwhile for ACROSS partners to identify further ways to encourage service providers on our platform to adopt more robust data minimisation practices, and to share our findings about the contextual factors inhibiting data minimisation.
Public research projects as platforms for citizen inclusion
A strong option for ACROSS and other public tech projects is to provide a public platform to debate, discuss, and raise awareness about issues surrounding personal data protection. We can advocate for citizens’ rights to data protection by bringing them into the fold along with service providers and policymakers.
What do citizens want for their own data? What do they want to change about the current landscape of data sharing in Europe? How can people moving across borders work as a community to support one another and strengthen their position in relation to service providers and policymakers? What do they want their role to be? Such questions are at the core of ACROSS’s participatory research as we explore how to protect personal data.