July 18, 2014
Previously published on July 9, 2014
If you are one of the approximately 1.3 billion people who use Facebook, you’ve likely experienced the phenomenon in which a single event (like Luis Suárez biting that Italian guy, or pretty much anything involving the TSA) raises the ire of a large number of your Facebook friends, flooding your timeline with single-issue user rage. Another recent event you likely heard about both on the news and through numerous status updates is Facebook’s 2012 experiment, in which user timelines were manipulated to gauge users’ responses to changes in the number of positive or negative posts. After the results of the study were published in June, many users became upset at the idea of having unknowingly taken part in it. Now, the Electronic Privacy Information Center (EPIC) has filed a formal complaint asking the Federal Trade Commission (FTC) to investigate Facebook’s use of user data for research purposes as a deceptive trade practice.
There are ultimately two questions at play here. The first is whether Facebook violated its own Terms of Service in processing user data for the study’s purposes. EPIC’s complaint alleges that Facebook’s sharing of data and research with universities was not contemplated or disclosed by Facebook’s Terms of Service. At the time of the study, Facebook’s Terms provided that user data could be used to improve Facebook’s products. The aim of the study could arguably fit within that broad description, although, not surprisingly, Facebook’s Terms have since been updated to specifically state that user data may be used for research purposes. If the FTC agrees with EPIC that the use of data in the 2012 study exceeded the disclosures in Facebook’s Terms, then the FTC may find that Facebook has violated its 2012 consent order, which bars Facebook from misrepresenting its data collection practices.
The second question, however, is how much any of the foregoing matters. FTC actions are not to be taken lightly, and the penalties that result can be substantial. Facebook’s larger problem, however, may be the shift in public perception of its products and services caused by the negative press. The general public understands that Facebook is a business whose profit margin depends on increasing use of the service by its users. Nor is the use of behavioral science principles to drive engagement with products like online games or mobile apps particularly new or surprising. And yet the public response to Facebook’s behavioral study has deepened its ongoing problem with user distrust: although reasonable Facebook users expect and understand that Facebook uses member data to generate revenue, many didn’t expect Facebook to use member data in this particular way. In other words, the issue isn’t necessarily what Facebook was trying to accomplish with the data, but rather that this type of data use was, to quote Facebook Chief Operating Officer Sheryl Sandberg, “poorly communicated.”
The lesson to draw from Facebook’s data research troubles is that, as we’ve previously written, consent isn’t the only consideration. Users who distrust an online service like Facebook are far less likely to provide it with valuable personal information. When developing and designing an online service that collects personal information from its users, it is important to stop and consider which uses or sharing of that information would not be apparent to those users. If the information collected, the way it is used, or the third parties it is shared with may come as a surprise, consider just-in-time notifications or a similar form of enhanced notice so that users are fully informed. Although that may dissuade some users from signing up, the consequences of a failure to communicate could be much worse.