The FDA should regulate Instagram’s algorithm as a drug

Daniel Liss
Contributor

Daniel Liss is the founder and CEO of Dispo, the digital disposable camera social network.

The Wall Street Journal on Tuesday reported Silicon Valley's worst-kept secret: Instagram harms teens' mental health; in fact, its impact is so negative that for some teens it introduces suicidal thoughts.
Thirty-two percent of teen girls who feel bad about their bodies report that Instagram makes them feel worse. Among teens with suicidal thoughts, 13% of British users and 6% of American users trace those thoughts to Instagram, the WSJ report said. This is Facebook's own internal data. The truth is surely worse.
President Theodore Roosevelt and Congress created the forerunner of the Food and Drug Administration in 1906 precisely because Big Food and Big Pharma had failed to protect the general welfare. As its executives parade at the Met Gala in celebration of the unattainable 0.01% of lifestyles and bodies that we mere mortals will never achieve, Instagram's unwillingness to do what is right is a clarion call for regulation: The FDA must assert its codified right to regulate the algorithm powering the drug of Instagram.
The FDA should consider algorithms a drug impacting our nation's mental health: The Federal Food, Drug, and Cosmetic Act gives the FDA the authority to regulate drugs, defined in part as "articles (other than food) intended to affect the structure or any function of the body of man or other animals." Instagram's own internal data shows that its technology is an article that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what Facebook and Instagram's algorithms prioritize. Our government is equipped to oversee clinical trials of products that can physically harm the public, and researchers can likewise study what Facebook privileges and the impact those decisions have on our minds. How do we know this? Because Facebook is already doing it; it's just burying the results.
In November 2020, as Cecilia Kang and Sheera Frenkel report in "An Ugly Truth," Facebook made an emergency change to its News Feed, putting more emphasis on "News Ecosystem Quality" scores (NEQs). High-NEQ sources were trustworthy; low-NEQ sources were not. Facebook altered the algorithm to privilege high NEQ scores. As a result, for five days around the election, users saw a "nicer News Feed" with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it reduced engagement and might provoke a conservative backlash. The public suffered for it.
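To make the mechanics concrete, here is a minimal sketch in Python of the kind of score-based re-ranking that change describes. Everything in it is an illustrative assumption: the Post fields, the 0-to-1 NEQ values and the linear blend are invented for this example, not drawn from Facebook's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class Post:
        headline: str
        engagement: float  # predicted clicks, shares and comments (hypothetical)
        neq: float         # hypothetical News Ecosystem Quality score, 0.0 to 1.0

    def rank_feed(posts: list[Post], neq_weight: float) -> list[Post]:
        # neq_weight = 0.0 gives a pure engagement ranking; raising it
        # privileges trustworthy sources, as the election-week change did.
        def score(p: Post) -> float:
            return (1.0 - neq_weight) * p.engagement + neq_weight * p.neq
        return sorted(posts, key=score, reverse=True)

    feed = [
        Post("Lurid conspiracy theory", engagement=0.9, neq=0.1),
        Post("Sober wire-service report", engagement=0.4, neq=0.9),
    ]

    print([p.headline for p in rank_feed(feed, neq_weight=0.0)])  # conspiracy first
    print([p.headline for p in rank_feed(feed, neq_weight=0.7)])  # report first

The point of the toy example is that a single weight, chosen privately, decides which of those two feeds millions of users see.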

Facebook likewise has studied what happens when the algorithm privileges content that is “good for the world” over content that is “bad for the world.” Lo and behold, engagement decreases. Facebook knows that its algorithm has a remarkable impact on the minds of the American public. How can the government let one man decide the standard based on his business imperatives, not the general welfare?
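The tradeoff described above can be sketched just as simply. In the hypothetical comparison below, every number is invented; it only illustrates why demoting content flagged "bad for the world" lowers the engagement totals a growth-driven company optimizes.

    # Each post is (predicted_engagement, probability it is "bad for the
    # world"); both values, like the classifier itself, are invented.
    posts = [(0.9, 0.8), (0.7, 0.6), (0.5, 0.1), (0.3, 0.05)]

    def total_engagement(ranking, attention=(1.0, 0.6, 0.3, 0.1)):
        # Users spend most attention on top slots, so ordering changes totals.
        return sum(w * e for w, (e, _) in zip(attention, ranking))

    control = sorted(posts, key=lambda p: p[0], reverse=True)
    treatment = sorted(posts, key=lambda p: p[0] * (1 - p[1]), reverse=True)

    print(f"engagement-only feed: {total_engagement(control):.2f}")      # 1.50
    print(f"good-for-the-world feed: {total_engagement(treatment):.2f}") # 0.98

Because the most inflammatory posts are often the most engaging, any ranking that demotes them shows up internally as a loss, which is exactly the result Facebook buried.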
Upton Sinclair memorably uncovered dangerous abuses in "The Jungle," which led to a public outcry. The free market had failed; consumers needed protection. The 1906 Pure Food and Drug Act for the first time promulgated safety standards, regulating consumable goods impacting our physical health. Today, we need to regulate the algorithms that impact our mental health. Teen depression has risen alarmingly since 2007, and suicide among those ages 10 to 24 rose nearly 60% between 2007 and 2018.
It is of course impossible to prove that social media is solely responsible for this increase, but it is absurd to argue it has not contributed. Filter bubbles distort our views and make them more extreme. Bullying online is easier and constant. Regulators must audit the algorithm and question Facebook’s choices.
When it comes to the biggest issue Facebook poses — what the product does to us — regulators have struggled to articulate the problem. Section 230 is correct in its intent and application; the internet cannot function if platforms are liable for every user utterance. And a private company like Facebook loses the trust of its community if it applies arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit duty to uphold the First Amendment, but public perception of its fairness is essential to the brand.
Thus, Zuckerberg has equivocated over the years before belatedly banning Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. Deciding what speech is privileged or allowed on its platform, Facebook will always be too slow to react, overcautious and ineffective. Zuckerberg cares only for engagement and growth. Our hearts and minds are caught in the balance.
The most frightening part of "An Ugly Truth," the passage that got everyone in Silicon Valley talking, was the memo that gives the book its name: Andrew "Boz" Bosworth's 2016 "The Ugly."
In the memo, Bosworth, Zuckerberg’s longtime deputy, writes:
So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.
Zuckerberg and Sheryl Sandberg made Bosworth walk back his statements when employees objected, but to outsiders, the memo represents the unvarnished id of Facebook, the ugly truth. Facebook's monopoly, its stranglehold on our social and political fabric, its growth-at-all-costs mantra of "connection," is not de facto good. As Bosworth acknowledges, Facebook's drive to connect can cost lives and can help terrorists organize. This much power concentrated in the hands of one corporation, run by one man, is a threat to our democracy and way of life.
Critics of FDA regulation of social media will claim this is a Big Brother invasion of our personal liberties. But what is the alternative? Why would it be bad for our government to demand that Facebook account to the public for its internal calculations? Is it safe for the number of sessions, time spent and revenue growth to be the only results that matter? What about the collective mental health of the country and world?
Refusing to study the problem does not mean it does not exist. In the absence of action, we are left with a single man deciding what is right. What is the price we pay for “connection”? This is not up to Zuckerberg. The FDA should decide.
