
Facebook should hire me to audit their algorithm

November 17, 2016

There’s lots of post-election talk that Facebook played a large part in the election, despite Zuckerberg’s denials. Here are some of the theories going around:

  1. People shared fake news on their walls, and sometimes Facebook’s “trending algorithm” also messed up and shared fake news. This fake news was created by Russia or by Eastern European teenagers and it distracts and confuses people and goes viral.
  2. Political advertisements had deep influence through Facebook, and it worked for Trump even better than it worked for Clinton.
  3. The echo chamber effect, called the “filter bubble,” made people hyper-partisan and the election became all about personality and conspiracy theories instead of actual policy stances. This has been confirmed by a recent experiment on swapping feeds.

If you ask me, I think “all of the above” is probably most accurate. The filter bubble effect is the underlying problem: at its most extreme you get fake news and conspiracy theories, and in the broad middle ground you get plain old misleading, decontextualized headlines that have a cumulative effect on your brain.

Here’s a theory I have about what’s happening and how we can stop it. I will call it “engagement proxy madness.”

It starts with human weakness. People might claim they want “real news” but they are actually very likely to click on garbage gossip rags with pictures of Kardashians or “like” memes that appeal to their already-held beliefs.

From the perspective of Facebook, clicks and likes are proxies for interest. Since we click on crap so much, Facebook (and the rest of the online ecosystem) interprets that as a deep interest in crap, even if it’s actually simply exposing a weakness we wish we didn’t have.

Imagine you’re trying to cut down on sugar, because you’re pre-diabetic, but there are M&M’s literally everywhere you look, and every time you stress-eat an M&M, invisible nerds exclaim, “Aha! She actually wants M&M’s!” That’s what I’m talking about, but where you replace M&M’s with listicles.

This human weakness now combines with technological laziness. Since Facebook doesn’t have the interest, commercially or otherwise, to dig deeper into what people really want in a longer-term sense, our Facebook environments eventually get filled with the media equivalent of junk food.

Also, since Facebook dominates the media advertising world, it creates feedback loops in which newspapers churn out junky clickbait stories so they can beg for crumbs of advertising revenue.

This is really a very old story, about how imperfect proxies, combined with influential models, lead to distortions that undermine the original goal. And here the goal was, originally, pretty good: to give people a Facebook feed filled with stuff they’d actually like to see. Instead they’re subjected to immature rants and conspiracy theories.
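
To make the “engagement proxy madness” story concrete, here’s a toy simulation in Python. To be clear, it’s purely my own sketch with invented numbers, nothing resembling Facebook’s actual ranking code: a feed that simply reinforces whatever gets clicked drifts toward junk, because junk gets clicked more often, regardless of what readers say they value.

    # Toy model of "engagement proxy madness" -- a hypothetical sketch with
    # made-up numbers, not Facebook's real system. Clicks are the proxy;
    # "value" stands in for what readers say they want in the longer term.
    import random

    random.seed(0)

    ITEMS = {
        "junk": {"click_prob": 0.30, "value": 0.1},  # listicles, the M&M's
        "news": {"click_prob": 0.10, "value": 0.9},  # the stuff we claim to want
    }

    def build_feed(weights, size=10):
        """Sample a day's feed in proportion to the learned weights."""
        kinds = list(weights)
        return random.choices(kinds, weights=[weights[k] for k in kinds], k=size)

    def reinforce(weights, feed, lr=0.1):
        """The proxy step: whatever gets clicked gets shown more tomorrow."""
        for kind in feed:
            if random.random() < ITEMS[kind]["click_prob"]:
                weights[kind] += lr
        return weights

    weights = {"junk": 1.0, "news": 1.0}  # start out neutral
    for _ in range(365):
        weights = reinforce(weights, build_feed(weights))

    total = sum(weights.values())
    print({k: round(w / total, 2) for k, w in weights.items()})
    # After a simulated year, junk dominates the feed -- not because anyone
    # values it, but because clicks are what the loop rewards.

The point of the toy isn’t the numbers; it’s that the proxy, not the reader’s stated preference, decides where the loop ends up.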

 

Of course, maybe I’m wrong. I have very little evidence that the above story is true beyond my experience of Facebook, which is increasingly echo chamber-y, and my observation of hyper-partisanship overall. It’s possible this was entirely caused by something else. I’d keep an open mind if there were evidence that Facebook’s influence on this system is minor.

Unfortunately, Facebook’s data is private, so I cannot audit their algorithm for this effect as an interested observer. That’s why I’d like to be brought in as an outside auditor. The first step in addressing this problem is measuring it.

I already have a company, called ORCAA, which is set up for exactly this: auditing algorithms and quantitatively measuring effects. I’d love Facebook to be my first client.
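
For flavor, here’s the kind of first-pass measurement I have in mind, sketched in Python with invented labels and numbers since the real data is private: compare how much junk an engagement-ranked feed surfaces relative to some neutral baseline, say a chronological feed.

    # Hypothetical audit statistic with invented sample data. The "junk" vs.
    # "substantive" labels would have to come from human raters.
    from collections import Counter

    def junk_share(feed_sample):
        """Fraction of a feed sample labeled as junk."""
        counts = Counter(label for _item, label in feed_sample)
        return counts["junk"] / sum(counts.values())

    # Toy samples: (item_id, rater_label) pairs.
    engagement_ranked = [(1, "junk"), (2, "junk"), (3, "substantive"), (4, "junk")]
    chronological = [(5, "junk"), (6, "substantive"), (7, "substantive"), (8, "substantive")]

    skew = junk_share(engagement_ranked) - junk_share(chronological)
    print(f"Engagement ranking surfaces {skew:+.0%} more junk than the baseline.")

A real audit would obviously need much more than this (a defensible labeling scheme, a sensible baseline, and Facebook’s cooperation), but the point is that the question is measurable.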

As for how to address this problem if we conclude there is one: we improve the proxies.
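
What would that look like? Here’s one entirely hypothetical version, continuing the toy example above: instead of ranking on predicted clicks alone, blend them with a longer-term quality signal that users actually endorse, and then check whether the blend moves the feed in the right direction.

    # A hypothetical "improved proxy": blend short-term click probability with
    # a longer-term quality signal instead of optimizing clicks alone. The
    # weights and signals are invented for illustration.

    def blended_score(click_prob, quality, alpha=0.5):
        """Rank items on a mix of engagement and endorsed long-term quality."""
        return alpha * click_prob + (1 - alpha) * quality

    items = [
        {"name": "celebrity listicle", "click_prob": 0.30, "quality": 0.1},
        {"name": "city budget story", "click_prob": 0.10, "quality": 0.9},
    ]

    ranked = sorted(items, key=lambda it: blended_score(it["click_prob"], it["quality"]),
                    reverse=True)
    for it in ranked:
        print(it["name"], round(blended_score(it["click_prob"], it["quality"]), 2))
    # With clicks alone (alpha = 1) the listicle wins; with the blend the budget
    # story does. The hard questions are who sets alpha and where the quality
    # signal comes from -- which is exactly what an audit should scrutinize.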

  1. Jean
    November 17, 2016 at 7:21 am

    I just shared your posting on FB. Hope you don’t mind?


  2. Gordon Henderson
    November 17, 2016 at 9:07 am

    What outcome are you trying to achieve?

    With your M&M example, there’s an obvious goal – help people to make choices that will allow them to avoid diabetes. Not many people would argue the point if you made it impossible for high-risk people to get hold of any candy at all, but I struggle to extend that metaphor to individuals’ leisure choices.

    It sounds as though you want to see Facebook feeds that are full of stuff that’s “good” for them. If you were to audit (and edit) the FB algorithm, what would feeds look like afterwards, and how would they differ from the way they look today? Are you going to let the algo decide what’s “good” for them, as opposed to what they like? Who decides what’s “good” in that world? Is “good” an individually tailored quality, or is it absolute?


  3. Paul Schäfer
    November 17, 2016 at 9:45 am

    Why should my long-term reasonable preferences be my true preferences? It’s always easy to think that you shouldn’t have eaten the candy in the past, because the pleasure is in the past and you just have to deal with the consequences in the present.

    It could also be the case that people don’t want to read “good” news, because the world is terrible and thinking is hard. So Facebook optimizes exactly in the right way to serve their customers. Of course, having knowledgeable people who care about politics could be considered a public good. But in this case Facebook should not want to hire you (if they don’t care about public goods) but maybe should be forced to hire you or an equally good competitor.


  4. Lars
    November 17, 2016 at 10:07 am

    “There’s lots of post-election talk that Facebook played a large part in the election”

    It’s actually very ironic (and funny) that some people claim that “fake news” and “conspiracy theories” (i.e., rumor) on Facebook played a part in the election outcome but provide no evidence for it.

    This claim is in itself a form of rumor.

    People would do well to stick to things for which there is actual evidence.

    But, of course, that’s no fun.


  5. Lars
    November 17, 2016 at 10:09 am

    I’d rather audit Zuckerberg’s tax return than his algorithm.


  6. ~
    November 17, 2016 at 11:04 am

    When someone stress-eats M&M’s, it really does mean that they want M&M’s at that moment! No one is forcing them. People are responsible for their decision to eat M&M’s or not.


    • November 17, 2016 at 11:59 am

      Sure! But if you asked them they’d probably say they’d prefer not being surrounded by M&M’s all day.


  7. Lars
    November 17, 2016 at 11:18 am

    The distinction between “real” and “fake” news is not cut and dried.

    How does one decide? The news source? On that basis, virtually all the pre-Iraq invasion news about WMD was considered “real”. But we know that it was not, and at least some of it was actually “fake” (purposefully false).

    Also, just because people (perhaps even a majority) call something a “conspiracy theory” does not make it fake or even false.

    Finally, one can make a pretty good argument that Facebook itself is little more than a giant rumor mill, so it seems a little silly (to me, at least) to worry about “fake news” and “conspiracy theories” on Facebook.


    • Lars
      November 18, 2016 at 10:17 am

      I just discovered how one decides what is a fake news site.

      An assistant professor of communications does it for us. Melissa Zimdars has put together a handy list.

      Thank goodness for that. I always thought “The Onion” (which is on “Zimdars’ List”) was real news.


  8. O. Kässi
    November 17, 2016 at 11:27 am

    To be honest, I am not entirely convinced about the filter bubble story. To my knowledge most attempts at quantifying the existence of such things have come to the conclusion that filter bubbles are not very filtering (http://science.sciencemag.org/content/348/6239/1130, https://5harad.com/papers/bubbles.pdf, http://www.nber.org/papers/w15916). That is, against the popular narrative, people tend to be exposed to a variety of opinions in social media.

    Of course, none of these papers represents the final word on the topic as there are huge measurement problems related to what counts as a varied opinion, whether clicking a link affects your opinion on a particular policy issue, etc… Nonetheless, my N=1 suggests that because of FB, I am exposed to a much wider variety of world views than I would be without it.


  9. Guest2
    November 18, 2016 at 10:41 pm

    Filter bubble? Kinda exotic. What about those still reading the newspaper?

    http://www.goodreads.com/book/show/1328886.How_the_News_Makes_Us_Dumb
    Years before Trump, this book describes the transformation of presidential campaigns into entertainment in 2 chapters.

    Also, Neil Postman’s Amusing Ourselves to Death: Public Discourse in the Age of Show Business (1985).
    https://en.wikipedia.org/wiki/Amusing_Ourselves_to_Death


  10. Ralph Trickey
    November 19, 2016 at 11:19 am

    I’m not sure that I’m comfortable with Facebook trying to shape people towards ‘long term happiness’ or whatever; it’s a bit too paternalistic for me. I would be happy with them filtering out the alarmist headlines. I DO think that Facebook should try to let more sunshine in and provide 10% or so of the opposing viewpoints instead of being as good an echo chamber as they can be, giving people the choice of what to view.


    • November 19, 2016 at 11:23 am

      Dude. They are already trying to do something. And that something is awful. You can change what they are doing but you cannot have them do “nothing” because they always have a default.


  11. Crprod
    November 21, 2016 at 4:49 pm

    I tend to ignore anything political on Facebook because I mainly use it to keep up with my in-laws and relay personal updates about them to my wife.

