
A good use of big data: to help struggling students

November 1, 2016

There’s an article that’s been forwarded to me by a bunch of people (I think first by Becky Jaffe): Anya Kamenetz’s How One University Used Big Data To Boost Graduation Rates.

The article centers on an algorithm being used by Georgia State University to identify students in danger of dropping out of school. Once identified, the school pairs those wobbly students with advisers to try to help them succeed. From the article:

A GPS alert doesn’t put a student on academic probation or trigger any automatic consequence. Instead, it’s the catalyst for a conversation.

The system prompted 51,000 in-person meetings between students and advisers in the past 12 months. That’s three or four times more than was happening before, when meetings were largely up to the students.

The real work was in those face-to-face encounters, as students made plans with their advisers to get extra tutoring help, take a summer class or maybe switch majors.

I recently wrote a book about powerful, secret, destructive algorithms that I called WMDs, short for Weapons of Math Destruction. And naturally, a bunch of people have written to me asking whether I think the algorithm in this article qualifies as a WMD.

In a word, no.

Here’s the thing. One of the hallmark characteristics of a WMD is that it punishes the poor, the unlucky, the sick, or the marginalized. This algorithm does the opposite – it offers them help.

Now, I’m not saying it’s perfect. There could easily be flaws in this model, and some people who really need help may not be offered it. That can be seen as a kind of injustice, if others are receiving that help. But that’s the worst-case scenario, it’s not exactly tragic, and it’s a mistake that might well be caught if the algorithm is retrained over time on new data.

According to the article, the new algorithmic advising system has resulted in quite a few pieces of really good news:

  • Graduation rates are up 6 percentage points since 2013.
  • Graduates are getting that degree an average of half a semester sooner than before, saving an estimated $12 million in tuition.
  • Low-income, first-generation and minority students have closed the graduation rate gap.
  • And those same students are succeeding at higher rates in tough STEM majors.


But to be clear, the real “secret sauce” in this system is the extraordinary amount of advising that’s been given to the students. The algorithm just directed that work.
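The alerts the article describes are simple rule checks: an adviser is flagged when a student gets an unsatisfactory grade in a course needed for their major, misses the recommended window for a required course, or signs up for a class outside their major. Here’s a minimal sketch of that kind of rule-based alert logic — not GSU’s actual system, and all the field names and the grade threshold are my own illustrative assumptions:

```python
# A rough sketch of rule-based advising alerts, as described in the
# article. The alert doesn't trigger any automatic consequence; it just
# surfaces reasons for an adviser to start a conversation.
# Data layout and the 2.0 grade cutoff are hypothetical.

def advising_alerts(student):
    """Return a list of reasons an adviser should reach out."""
    reasons = []
    # Rule 1: unsatisfactory grade in a course needed for the major.
    for course in student["completed_courses"]:
        if course["required_for_major"] and course["grade"] < 2.0:
            reasons.append(f"unsatisfactory grade in {course['name']}")
    # Rule 2: required course not taken within the recommended time.
    for name in student["overdue_required_courses"]:
        reasons.append(f"required course {name} not taken on time")
    # Rule 3: signed up for a class not relevant to the major.
    for course in student["current_courses"]:
        if not course["relevant_to_major"]:
            reasons.append(f"{course['name']} not relevant to major")
    return reasons  # an empty list means no alert

student = {
    "completed_courses": [
        {"name": "MATH 1111", "grade": 1.7, "required_for_major": True},
    ],
    "overdue_required_courses": ["ENGL 1101"],
    "current_courses": [
        {"name": "ASTR 1000", "relevant_to_major": False},
    ],
}
print(advising_alerts(student))
```

As one commenter notes below, there’s nothing fancy here — the value isn’t in the prediction, it’s in what happens after the flag is raised.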

A final word. This algorithm, which identifies struggling students and helps them, is an example I often use to explain that an algorithm is not inherently good or evil.

In other words, this same algorithm could be used for evil, to punish the badly off, and a similar one nearly was in the case of Mount St. Mary’s College in Virginia. I wrote about that case as well, in a post entitled The Mount St. Mary’s Story is just so terrible.

  1. Tina
    November 1, 2016 at 1:02 pm

    I just finished your book, WMD, and wanted to say THANK YOU for being both intelligent and concerned for those whom WMD can affect.
    I taught HS math for 14 years, but at one point in time was fascinated with the field of operations research, thinking it to be a wonderful tool for resource allocation. Never did it occur to me that it could be used in the ways you described.
    All the best to you!


  2. Michele Lee
    November 2, 2016 at 2:13 am

    “…an algorithm is not inherently good or evil.” So true! Couldn’t help but think of that statistics joke about lying. But that bit is well known. Thank you Cathy for raising awareness of the ethical use of big data / algorithms! Too often I’ve seen data scientists get so happy that their algos work, that they cannot accept that the model needs to be updated for new data. Or even to fight against creating an algo that could be used for “evil” purposes.


  3. Lars
    November 2, 2016 at 9:08 am

    From the description of the “predictive algorithm”

    “an adviser gets an alert when:

    A student does not receive a satisfactory grade in a course needed in his or her major.
    A student does not take a required course within the recommended time.
    A student signs up for a class not relevant to his or her major.”

    That must be a mighty powerful algorithm. What person would ever have thought to include such things?

    How about “A second semester senior has not yet selected a major”? Is that on their list?


    • November 3, 2016 at 10:47 pm

      I agree. Does this even count as “big data”? It’s really just a single “if” statement on a single piece of data at a time. I don’t see any fancy predictions or correlations happening here. More like someone just slapped the phrase “big data analytics” on this for PR purposes because that’s the current buzzword.

      The fact that this wasn’t already being done highlights what a scam most college “advising” is.


  4. Lars
    November 2, 2016 at 9:20 am

    “…an algorithm is not inherently good or evil.”

    Well, some conservatives might take issue, considering AlGorisms* pure evil, especially Al’s tendency to call climate change deniers “deniers”.


  5. November 3, 2016 at 11:09 am

    So cool. GSU is doing all kinds of innovative things in many different areas. One small correction: Mt. Saint Mary’s is in Maryland, not Virginia.

