
Lethal Autonomous Weapons and the Occupy Book Club

Bloomberg View

My newest Bloomberg piece is out, in which I consider the problem of false negatives in the context of love and war:

False Negatives Can Be a Matter of Life and Death

Algorithms will repeat our mistakes unless we know what we’re missing.
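
If you want the thesis in code, here's a toy sketch (all the data below is invented): a false negative is a good case the model turns away, and since we typically only observe outcomes for the cases we accepted, those mistakes never appear in the data we retrain on.

    # Toy illustration of false negatives; all data here is invented.
    # actual:    1 = would have turned out well, 0 = would not (unknowable in practice)
    # predicted: 1 = the model says yes, 0 = the model says no
    actual    = [1, 1, 0, 1, 0, 0, 1, 0]
    predicted = [1, 0, 0, 0, 0, 1, 1, 0]

    false_negatives = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))
    false_positives = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))

    print("false negatives (good cases rejected):", false_negatives)  # 2
    print("false positives (bad cases accepted):", false_positives)   # 1

    # The trouble: we usually only see outcomes for the cases we said yes to,
    # so the false negatives stay invisible -- and retraining repeats the mistake.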

You can see all of my Bloomberg View pieces here.

Book Club!

I also wanted to announce that my Occupy group Alt Banking is starting a book club. We’re meeting this Sunday from 2-3pm at Columbia (room 409 of the International Affairs Building at Amsterdam and 118th), and we’re discussing the introductions and first chapters of the two following books:

  1. Capitalism and Freedom by Milton Friedman, available for free online
  2. Capitalism and Slavery by Eric Williams, which also seems to be available online.

Please join us; we welcome anyone and everyone!

  1. Lars
    December 1, 2017 at 10:30 am

    The assumption that “algorithms give wrong answers because they inadvertently miss information” is not necessarily valid.

    Sometimes algorithms are designed to “miss” information, and the answers they give are in fact the “right” ones for those who design and use them.

    I know Cathy has made this point many times, but it’s worth hammering home that, with the exception of algorithms that deal purely with mathematical objects (e.g., sorting integers), algorithms are NEVER objective because there are ALWAYS built-in assumptions programmed in from the get-go.

    Acknowledging that simple truth immediately raises the question of precisely what those assumptions are. For example, with an algorithm to count “civilian deaths”, who is considered a civilian?
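
    To make that concrete, here's a toy sketch; the rule and the age threshold are invented, but something like them has to be chosen in any real system:

        # Toy sketch: the definition of "civilian" is itself a programmed-in
        # assumption. The rule and the age threshold below are invented.
        MILITARY_AGE_MIN = 16  # chosen by whoever wrote the code

        def is_civilian(person):
            # This one line decides what the "civilian death count" means:
            # males of "military age" are simply not counted as civilians.
            return not (person["sex"] == "male" and person["age"] >= MILITARY_AGE_MIN)

        deaths = [{"sex": "male", "age": 17},
                  {"sex": "female", "age": 30},
                  {"sex": "male", "age": 12}]

        print(sum(is_civilian(p) for p in deaths))  # 2 -- raise the threshold to 18 and it becomes 3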

    To modify a famous phrase:

    All algorithms that deal with humans are wrong, but nonetheless useful to someone.

    Algorithms are very efficient at masquerading as objective and at removing the humans who design and use them from the direct line of accountability for the outcomes.

    It’s harder to claim that a computer algorithm used for hiring is racist than to claim that a person is, which is a big plus for those doing the hiring.
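
    For example (again a made-up sketch), a hiring screen can avoid ever mentioning race and still encode it through a proxy variable, which is exactly what makes the "it's just the algorithm" defense so convenient:

        # Toy sketch: a "neutral" screening rule with a built-in proxy.
        # The zip codes here are invented for illustration.
        PREFERRED_ZIPS = {"10001", "94105"}  # often strongly correlated with demographics

        def screen(applicant):
            # No protected attribute appears anywhere in this rule, so its
            # outcomes are easy to defend as "objective" -- the assumption
            # hides in the choice of which zip codes count as preferred.
            return applicant["zip"] in PREFERRED_ZIPS

        print(screen({"name": "A", "zip": "10001"}))  # True
        print(screen({"name": "B", "zip": "60601"}))  # False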
