
White House report on big data and civil rights

May 17, 2016

Last week the White House issued a report entitled Big Risks, Big Opportunities: the Intersection of Big Data and Civil Rights. Specifically, the authors were United States C.T.O. Megan Smith, Chief Data Scientist DJ Patil, and Cecilia Munoz, who is Assistant to the President and Director of the Domestic Policy Council.

It is a remarkable report, and covered a lot in 24 readable pages. I was especially excited to see the following paragraph in the summary of the report:

Using case studies on credit lending, employment, higher education, and criminal justice, the report we are releasing today illustrates how big data techniques can be used to detect bias and prevent discrimination. It also demonstrates the risks involved, particularly how technologies can deliberately or inadvertently perpetuate, exacerbate, or mask discrimination.

The report opens with an abstract discussion of algorithms, which, for example, debunks the widely held assumption that algorithms are objective, and goes on to discuss problems of biased training data, opacity, unfairness, and disparate impact. It's similar in its chosen themes, if not in length, to my upcoming book.

The case studies are well-chosen. Each one is split into three sections: the problem, the opportunity, and the challenge. They do a pretty good job of warning of things that could theoretically go wrong, though they are spare on examples of how such things are already happening (for such examples, please read my book).

Probably the most exciting part of the report, from my perspective, is in the conclusion, which discusses Things That Should Happen:

Promote academic research and industry development of algorithmic auditing and external testing of big data systems to ensure that people are being treated fairly. One way these issues can be tackled is through the emerging field of algorithmic systems accountability, where stakeholders and designers of technology “investigate normatively significant instances of discrimination involving computer algorithms” and use nascent tools and approaches to proactively avoid discrimination through the use of new technologies employing research-based behavior science. These efforts should also include an analysis identifying the constituent elements of transparency and accountability to better inform the ethical and policy considerations of big data technologies. There are other promising avenues for research and development that could address fairness and discrimination in algorithmic systems, such as those that would enable the design of machine learning systems that constrain disparate impact or construction of algorithms that incorporate fairness properties into their design and execution.

Readers, this is what I want to do with my life, after my book tour. So it’s nice to see people calling for it like this.

  1. Jon Lubar
    May 17, 2016 at 3:28 pm

    What’s interesting to me is that there are some folks with very deep pockets and major influence in DC who are going to suffer financially if the report’s suggestions are implemented well and quickly. The K-12 testing industrial complex comes to mind as one under a more permanent threat from this. It’s likely they will have little recourse against a contraction of the market they have built, since, once past the partisan arguments, it will be their once sympathetic and helpful cronies in government who are driving the changes, changes which should in theory end the use of many of their products.


    • May 17, 2016 at 3:33 pm

      I expect many of them to just jump ship to the industry that will pop up on the other side!


      • Auros
        May 31, 2016 at 3:03 pm

        Maybe you need to get yourself some “cronies in government”, who can hire you as a consultant to help ensure that the next generation of big-data services to government are actually helpful. 🙂

