In a report issued on May 27, 2014, the U.S. Federal Trade Commission (FTC) called for increased transparency and accountability from companies that collect and sell personal data. In light of the FTC’s findings, the time is ripe for Congress to pass legislation that increases users’ control over their personal data and decreases data brokers’ control over data profiles. Access is encouraged by the FTC report and its proposals to Congress for legislative reform; however, the report fails to address several critical issues and reforms needed to enhance consumers’ control.
Data brokers and their business model
In the United States and elsewhere, data brokers make money by amassing consumer information and drawing inferences about individuals, which they organize into profiles. These profiles are then sold and used for a variety of commercial purposes. For example, companies use data to “score” consumers and to predict the likelihood of fraud or default, or the probability of a purchase or in-store visit. Consumers do not know how companies are profiling their activities, or even that such inferences are being made.
The information data brokers use to create profiles includes public records, content you choose to share, and contact information (such as warranty registrations and voter registration records), but also information you may not be aware you are generating through your passive digital footprint, such as data about the sites you visit online. Because consumers cannot access their profiles, they have no way to correct errors or counter the profiles’ detrimental effects.
For the most part, users never see and are not even aware of the profiles data brokers create. US consumers’ trust in online services’ handling of their privacy is already very low, and it is likely that if users knew how their information was being aggregated and used in secret, they would be shocked.
A good first step
The FTC report outlines potential benefits of data collection, including convenience and targeted marketing, while noting that the way data is currently aggregated by these companies denies users control over their information. The report surveys nine of the biggest data brokers in the U.S.: Acxiom, Corelogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf, and Recorded Future. The report extensively details, among other things, these brokers’ information collection practices, what they do with the collected data, and the dearth of opportunities for consumers to access, correct, or delete their own data profiles. The report concludes by offering both legislative recommendations and recommendations for best practices.
The FTC’s legislative recommendations center on the need for increased data broker transparency and accountability. Access lauds these goals and suggests that sensitive user data can only be protected by empowering users to access and control their data profiles, as well as by implementing tighter legal restrictions on the collection and use of data by commercial brokers.
The FTC specifically recommends legislation to provide consumers the ability to access the profiles compiled by data brokers. According to the FTC, legislation could require the establishment of a centralized online portal that would allow consumers to view the information data brokers have collected about them. The report also advocates a right for consumers to opt out of data collection and to correct mistakes in their profiles. The best practices section of the report offers a number of recommended commercial practices for data brokers, such as the implementation of sound data collection and disposal practices.
Gaps in the FTC’s report
The lack of transparency and regulation in predictive analytics creates a real risk of even more insidious discriminatory practices, including practices that affect users’ ability to obtain loans, employment, housing, and insurance. Though the FTC report recognizes many dangers of data broker business models, it often stops short of calling for increased safeguards to protect users from harmful and discriminatory practices.
For example, the report notes that discriminatory practices are already illegal in the United States, but fails to consider the high barriers to enforcing those laws in a secretive commercial environment. In the case of data brokers, regulation and review are necessary to ensure that “consumer scores” are not being used as proxies for race- or gender-based discrimination.
Another major problem with data brokers in the U.S. is that, while they spin publicly available information into marketing gold, they are not required to set up commensurate safeguards to protect their treasure troves of compiled user data. Once data brokers collect user information and package it for clients, they have little to no legal obligation to monitor the use of that data. As a result, after it leaves a data broker’s hands, the use and distribution of personal information is largely unchecked and unmonitored. Further, users can be placed at risk when personal information is stored in unprotected formats that allow easy access by unauthorized third parties. This lack of security makes data brokers especially attractive targets for cyber criminals, as well as for government surveillance.
Moreover, governments may not need to exploit security weaknesses to access the profiles and data collected by data brokers: we now know that data brokers have been working closely with several U.S. government agencies. For example, the FBI has paid ChoicePoint for access to its extensive database to screen for terrorist threats and for other purposes. Acxiom also worked with authorities after September 11th to help track down eleven of the nineteen hijackers, and it has continued to provide assistance to government agencies such as the Transportation Security Administration. Given the current lack of transparency and accountability, further cases of such cooperation, or of governments tapping into data brokers’ holdings, may well come to light in the future.
Developing consumers’ control
Following the report, the FTC should further develop a proposal for legislation that mandates high standards for data protection, improves user controls, facilitates data portability, allows collection of data only for specified, explicit, and lawful purposes, and introduces privacy by design and by default. Such measures are currently proposed in the European Parliament’s version of the new Data Protection Regulation, which would allow data protection authorities to impose hefty fines (up to 5% of global annual revenue) on companies that violate the law. When collecting the data of European citizens, data broker companies must already take into account this different legal framework and respect the fundamental right to personal data protection. These same protections should be available to all users.
Regulators can also look to privacy models found elsewhere in US law, such as the Health Insurance Portability and Accountability Act (HIPAA), which governs health care data and regulates its transfer and control in order to protect patient confidentiality. There are currently no similar requirements for data brokers or their third-party clients, even though the data these companies hold can be even more sensitive and revealing than health records (and is likely to include personal health information as well).
Companies that profit by creating detailed and sensitive user profiles should be obligated to protect that data by every means possible. Access continues to advocate for strong protections for user data and to promote user awareness of rights infringements in this area. Until stricter protections are in place, users can advocate for better data protection by supporting Encrypt All the Things, Access’ campaign to promote the Data Security Action Plan to protect private networks and communications information.
Users can also take advantage of a number of opt-out tools to minimize the amount of information collected about them. Options include disconnect.me, a browser extension that allows users to block some websites from tracking personal data, and EFF’s new Privacy Badger (currently in alpha), which automatically blocks some spying ads and invisible trackers as you browse.
Access will keep you updated on future developments related to the legislative proposals outlined in this report.