The world hasn’t been the same since we all got on social media. In the roughly 15 years since these sites gained widespread popularity, we’ve seen Twitter and Facebook play a major role in everything from the Arab Spring and #MeToo to the rise of hate crimes and political divisiveness.
It’s hard to imagine a future where social media doesn’t continue to strongly influence our lives. However, that doesn’t mean we have to accept the negatives in order to receive the benefits. Nikolas Guggenberger, a clinical lecturer in law and executive director of the Information Society Project at Yale, thinks social media’s problems can be fixed.
Science Node sat down with him before his talk at the Indiana University Center for Applied Cybersecurity Research (CACR) Security Speaker Series to discuss his thoughts on the future of social media.
You describe our attention, privacy, and behavioral autonomy as common goods. What do you mean by that?
Common goods are something that we share as a society. It goes beyond concepts of personal property or personal liberty. It’s something that we share and that we need to share as a society to function.
This is relevant because of the ways that we have chosen to protect our privacy, personal attention, and autonomy. We have chosen to protect these values by assigning rights to individuals. We have created rules that require an individual's consent, or at least her notice, when we process her data.
I'm arguing that this is not sufficient because privacy also has a common dimension. It’s not enough to ask an individual whether she wants to give up her privacy, because by doing so she does not only waive her own rights. She does not only consent to something that affects her; she also consents to an interference with society at large. This dimension is something that gets lost in the current system.
Can you give an example of how this plays out?
You’re probably familiar with 23andMe, the company that allows you to analyze your DNA sequence. The general perception seems to be that if I want to do that, then I should go ahead and do it.
But people don’t realize that when they have their DNA sequence analyzed, they impact the privacy of their siblings, their cousins, their grandmothers, their grandfathers, even their second cousins.
If you have a DNA sample and nothing else, there's a very high probability that you can find and identify that individual. That’s because so many other people have had their DNA analyzed and put it in a database, so there is a high probability that a second cousin or a grandmother of that individual can be found immediately within that database.
Then we can simply ask, ‘Who's your grandchild? Who's your second cousin?’ And soon we have identified that person. People need to realize this when they engage with all digital platforms.
How does this play into the current social media business model?
Social media platforms have an incentive to capture as much of your attention as they can because they sell your attention. The more attention they capture, the more they can sell to advertisers. And so, the incentive structure is such that they have an incentive to create applications and websites that are addictive in nature.
One way to keep somebody engaged and on a platform is by enraging them. There is an incentive to polarize debates online because polarization and partisanship itself draws in people's attention. Then more of that attention can be sold to advertisers.
I wouldn't want to suggest that any social media company deliberately wants to enable and support hate crimes. But, again, their business model creates economic incentives to design platforms in ways that facilitate the spread of threats and inciting content.
What do we need to see on a global or national level to combat this?
I think we need to start seeing meaningful privacy regulation. I mean something that goes beyond the type of notice and choice regimes that we presently have. The California Consumer Privacy Act is certainly a step, but it's only a tiny step in the right direction.
If we look to environmental law for inspiration, we see multiple instruments that I think can be put to use. One instrument is taxing behavior that we think has harmful consequences. For example, we could tax advertising revenues.
But we also need to define boundaries as to what companies can do irrespective of whether individuals consented to that practice. We need to bring the interests of society as a whole into the equation of privacy and platform regulation.