The issue is not people doing research on this topic. The issue is twofold: 1) the horrors of the publication cycle (how long it takes to get something published from when the work is done) and 2) the lack of funding for publicizing the results. And that is before we get into the issues of how science is being discounted in the public sphere these days. Yes, there is likely a need for more people doing this work, if only to drive up the volume. But there are entire industries and ideologies out there supporting the status quo with publicity and dollars. Academics in all areas struggle to fight against that.
I agree.
Congratulations! And thank you for continuing to raise the alarm on this issue.
Congrats! Also, there is a typo in paragraph 3: “when it does exists”
I’m happy for you, and appreciate that you’ve raised a significant issue. Next time, please don’t blame the innocent for it. As you yourself say: academia is where this should be happening – but then you point out that faculty cannot get the data, the university cannot compete with industry for talent, and there is no defined field of study yet. In addition, the federal administration is slashing research funding and threatening to undermine student financial aid, endowment returns, and charitable giving. Those of us in higher education who share your passion for empowering the people could use some help in dealing with the barriers instead of taking blame for something you agree we are unable to do. The current environment could be the end of higher education altogether if those of us who know better can’t turn it around. The 1% want the rest of us fat, dumb, and easily manipulable, and they don’t mind a bit if/when we kill each other off. Let’s be on the same side.
Dear Cathy,
Congratulations on the op-ed…but I wish you’d done a bit of research first: Science, Technology & Human Values (which I edit) published a special issue on algorithms, edited by Malte Ziewitz (a professor in the STS Department at Cornell). ST&HV has been around for 45 years (it began in the Harvard STS Program), publishing academic work that examines the sorts of issues that concern you. ST&HV is an official journal of the Society for Social Studies of Science, which also studies technologies. There is also the Society for the History of Technology, and substantial work has been published on the politics and philosophy of technology.
Apologies for scolding, but it is unfortunate to have an entire field of scholarship overlooked in such a prominent publication.
Best wishes,
Ed Hackett
Vice Provost for Research and Professor, Brandeis University
Editor, Science, Technology & Human Values
I enjoyed reading your op-ed, but in highlighting a need for research on the complex interactions technology and AI can have with our society, I was wondering why you opted to call for an entirely new research institute to be established instead of calling for increased funding for the research already happening within the Human-Computer Interaction community. A quick scan of the proceedings of the annual conference on computer-human interaction (https://chi2017.acm.org/) shows a range of papers that cover many of the topics you mention in your article. Interested in how AI-powered vehicles will fare within our current driving ecosystem? Check out “The trouble with autopilots: Assisted and autonomous driving on the social road” by Barry Brown and Eric Laurier. Want to learn how technologists and technology researchers could better collaborate with health care experts? Check out “HCI and Health: learning from interdisciplinary interactions” by Aneesha Singh et al. Wondering how the next generation views data privacy? Check out “Youth Perspectives on Critical Data Literacies” by Samantha Hautea et al. HCI programs at Carnegie Mellon, the University of Washington, Stanford, Georgia Tech, and many other places have been training graduates in your proposed areas. I appreciate your advocacy for increased consideration of these technology issues from policy, research, and industry perspectives. However, I think you did the existing technology community a disservice by citing findings that are emerging from thought leaders in this very community while leaving readers with the impression that this work is not already happening. It seems we do not need to START asking these questions; instead, we need more technologists considering these questions in their work and paying attention to the research that has been conducted in this area for the past several decades.
I regret missing the opportunity to talk about the people working in this field. However, my main target was administrations that sideline such research rather than support it.
I hope we can agree that this is a major problem. It means that the average engineering/CS major doesn’t get this perspective, and the people talking to the media and policy makers end up being much too pro-tech, practically lobbyists for big tech.
My goal was to get more support to people doing critical work, not to ignore their work.
From a friend who is in Natural Language Processing:
“It’s absolutely the case that there is an urgent need for conversation between academics and policy makers around “algorithmic accountability” and ethics in AI (including NLP). But this op-ed is misleading in suggesting that the conversation is not happening within academia.
Among things that are happening:
(1) The FATML (Fairness and Transparency in Machine Learning) series of conferences
http://fatml.mysociety.org/
(2) The Workshop on Ethics in Natural Language Processing at EACL 2017, which hopefully will see a second edition in 2018:
http://fatml.mysociety.org/
(3) Courses on Ethics in NLP at various institutions:
https://aclweb.org/aclwiki/Ethics_in_NLP
(4) … including at UW:
http://faculty.washington.edu/ebender/2017_575/
(5) The new AI Now Institute at NYU
https://ainowinstitute.org/
(6) The established Tech Policy Lab at UW:
http://techpolicylab.org/”
Thanks, Kathy. You raise some excellent points. I appreciate your knowledge and hope that some academics out there will take you seriously. We are all way too trusting of the mysterious algorithms that have so much power.
Nice op-ed! You may be interested in taking a look at (http://fairlyaccountable.org/satc/), a recent major research collaboration between CMU, Cornell Tech, and UCB that seeks to address privacy and fairness in decision-making systems.
Some previous comments have already chided you for ignoring the small but robust field of Science, Technology and Society (STS), which has been doing exactly the type of work you propose since the 1960s. In the future, perhaps a better point to make is that most computer scientists, and the academic programs that trained them, have largely ignored the work of technological ethicists, social scientists, and policy researchers until forced to by the public.
Sensible regulation of technology rarely comes without a public backlash and government intervention (or at least the threat of it). Your popular media writings and presentations on this topic can only help us move in the right direction. Keep up the good work.
Dear Cathy, While I absolutely agree that the current status quo is totally unacceptable and allows for enormous damage to some of the most disadvantaged in our communities, I am not convinced that academic institutes (“Ivory Towers”) are empowered to become effective audit or enforcement agencies. For the public sector this needs to come directly from the Senate and the GAO, who seem to have the authority to inspect and report on technology/algorithmic abuses. In the private sector, governments need to enforce strict standards as part of the annual audit process to show that algorithms are transparent “white boxes” as part of an open system architecture with cleansed data, and that they “do no harm”. Your article captures the passion to make fundamental changes, and we all need to press hard to raise this subject to the highest levels or we will soon become trapped inside our own Orwellian nightmare. Our academic institutions are extremely important places to research the inherent problems and propose policy, but they cannot become the Algorithm Audit police 😉