Facebook should hire me to audit their algorithm
There’s lots of post-election talk that Facebook played a large part in the election, despite Zuckerberg’s denials. Here are some of the theories going around:
- People shared fake news on their walls, and sometimes Facebook’s “trending algorithm” also messed up and shared fake news. Much of this fake news was created in Russia or by Eastern European teenagers; it distracts and confuses people, and it goes viral.
- Political advertising on Facebook had deep influence, and it worked even better for Trump than it did for Clinton.
- The echo chamber effect, called the “filter bubble,” made people hyper-partisan, and the election became all about personality and conspiracy theories instead of actual policy stances. This has been borne out by a recent feed-swapping experiment.
If you ask me, “all of the above” is probably most accurate. The filter bubble effect is the underlying problem: at its most extreme it produces fake news and conspiracy theories, with a broad middle ground of plain misleading, decontextualized headlines that have a cumulative effect on your brain.
Here’s a theory I have about what’s happening and how we can stop it. I will call it “engagement proxy madness.”
It starts with human weakness. People might claim they want “real news,” but they are actually very likely to click on garbage gossip rags with pictures of Kardashians or “like” memes that appeal to their already-held beliefs.
From the perspective of Facebook, clicks and likes are proxies for interest. Since we click on crap so much, Facebook (and the rest of the online ecosystem) interprets that as a deep interest in crap, even if it’s simply exposing a weakness we wish we didn’t have.
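To make the proxy mechanism concrete, here’s a toy simulation (all names and numbers are invented for illustration; this is a sketch of the dynamic, not Facebook’s actual ranker). A feed that greedily shows whatever has the highest observed click-through rate ends up funneling nearly all impressions to the item readers value least:

```python
import random

random.seed(0)

class Item:
    def __init__(self, name, click_prob, long_term_value):
        self.name = name
        self.click_prob = click_prob            # chance a shown item gets clicked
        self.long_term_value = long_term_value  # what readers say they want
        self.shows = 0
        self.clicks = 0

    def observed_ctr(self):
        # The proxy: observed click-through rate, with a small prior to avoid 0/0.
        return (self.clicks + 1) / (self.shows + 2)

items = [
    Item("investigative report", click_prob=0.05, long_term_value=0.90),
    Item("celebrity listicle",   click_prob=0.30, long_term_value=0.10),
    Item("partisan meme",        click_prob=0.40, long_term_value=0.05),
]

for _ in range(10_000):
    # Rank purely by the proxy and show the top item.
    top = max(items, key=Item.observed_ctr)
    top.shows += 1
    if random.random() < top.click_prob:
        top.clicks += 1

for item in items:
    print(f"{item.name:22s} shows={item.shows:6d} ctr={item.observed_ctr():.2f}")
```

Run it and the partisan meme soaks up essentially all the impressions, because clicks, the only signal the ranker sees, reward it.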
Imagine you’re trying to cut down on sugar, because you’re pre-diabetic, but there are M&M’s literally everywhere you look, and every time you stress-eat an M&M, invisible nerds exclaim, “Aha! She actually wants M&M’s!” That’s what I’m talking about, but where you replace M&M’s with listicles.
This human weakness now combines with technological laziness. Since Facebook doesn’t have the incentive, commercial or otherwise, to dig deeper into what people really want over the longer term, our Facebook environments eventually fill with the media equivalent of junk food.
Also, since Facebook dominates the media advertising world, it creates a feedback loop: newspapers are stuck churning out junky clickbait stories so they can beg for crumbs of advertising revenue.
This is really a very old story, about how imperfect proxies, combined with influential models, lead to distortions that undermine the original goal. And here the goal was, originally, pretty good: to give people a Facebook feed filled with stuff they’d actually like to see. Instead they’re subjected to immature rants and conspiracy theories.
Of course, maybe I’m wrong. I have very little evidence that the above story is true beyond my own experience of Facebook, which is increasingly echo chamber-y, and my observation of hyper-partisanship overall. It’s possible this was entirely caused by something else, and I’d keep an open mind if there were evidence that Facebook’s influence on this system is minor.
Unfortunately, Facebook’s data is private, so I cannot audit their algorithm for this effect as an interested observer. That’s why I’d like to be brought in as an outside auditor. The first step in addressing this problem is measuring it.
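To be concrete about “measuring it,” here’s one crude metric I might start with as an auditor (my own illustrative construction, not anything Facebook publishes): score how far a single user’s feed drifts from the overall mix of sources, using KL divergence as an echo-chamber score.

```python
import math
from collections import Counter

def echo_chamber_score(user_feed, population_feed):
    """Crude illustrative audit metric: KL divergence between the mix of
    sources in one user's feed and the mix across all users.
    Higher = more filter-bubbled. A sketch, not an established measure."""
    pop = Counter(population_feed)
    user = Counter(user_feed)
    total_pop = sum(pop.values())
    total_user = sum(user.values())
    score = 0.0
    for source, count in user.items():
        p = count / total_user                                  # user's exposure share
        q = (pop.get(source, 0) + 1) / (total_pop + len(pop))   # smoothed baseline share
        score += p * math.log(p / q)
    return score

# Hypothetical feeds: one source name per story shown.
everyone = ["AP", "Reuters", "PartisanBlog", "AP", "LocalPaper", "Reuters"] * 100
bubbled  = ["PartisanBlog"] * 50 + ["AP"] * 2
balanced = ["AP", "Reuters", "LocalPaper", "PartisanBlog"] * 13

print(f"bubbled user:  {echo_chamber_score(bubbled, everyone):.2f}")
print(f"balanced user: {echo_chamber_score(balanced, everyone):.2f}")
```

A real audit would run something like this over actual feed data, which is exactly why outside access matters.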
I already have a company, called ORCAA, which is set up for exactly this: auditing algorithms and quantitatively measuring effects. I’d love Facebook to be my first client.
As for how to address this problem if we conclude there is one: we improve the proxies.
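As a hedged sketch of what “improving the proxies” could look like (every signal name and weight below is an assumption for illustration, not a known Facebook feature): instead of scoring items on clicks alone, blend in signals that track longer-term interest, like completed reads and later return visits, and penalize user reports.

```python
def engagement_score(item):
    # The current proxy: clicks alone.
    return item["clicks"] / max(item["impressions"], 1)

def improved_score(item, w_click=0.1, w_read=0.4, w_return=0.4, w_report=1.0):
    """Illustrative blended proxy (signals and weights are assumptions):
    downweight raw clicks, reward completed reads and later return
    visits, and penalize user reports."""
    impressions = max(item["impressions"], 1)
    ctr = item["clicks"] / impressions
    read_rate = item["completed_reads"] / impressions
    return_rate = item["return_visits"] / impressions
    report_rate = item["reports"] / impressions
    return (w_click * ctr + w_read * read_rate
            + w_return * return_rate - w_report * report_rate)

clickbait = {"impressions": 1000, "clicks": 400, "completed_reads": 30,
             "return_visits": 5, "reports": 40}
longread  = {"impressions": 1000, "clicks": 80, "completed_reads": 60,
             "return_visits": 25, "reports": 1}

print(f"clicks-only: clickbait={engagement_score(clickbait):.3f} "
      f"longread={engagement_score(longread):.3f}")
print(f"blended:     clickbait={improved_score(clickbait):.3f} "
      f"longread={improved_score(longread):.3f}")
```

Under the clicks-only proxy the clickbait item wins; under the blended proxy the long read does. The specific weights matter less than the structural point: measure something closer to what people say they want, not just what they click.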