I was just reading Cathy O’Neil’s (@mathbabedotorg) New York Times piece on the tech industry and academia, which argues that academics have not done enough to study the problems caused by recent technology, including filter bubbles and big data. Others have already critiqued the tone and oversights of the piece, with varying degrees of sass, but I want to look at it as a rallying cry. While I think the piece could give more credit to current researchers, it recognizes a dangerous gap between this research and the tech industry.
A few of O’Neil’s points are especially important. For one, she notes that big data is often cloistered within companies, limiting academics’ access to it. She also notes that private companies hire academics away, and she describes how the funding that drives engineering and computer science programs may not cover more humanities-tinged concerns about the ethical and social dimensions of technology.
More contentiously, O’Neil also says, “There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology — and specifically, the algorithms that are responsible for so many decisions — in our lives.” While a distinct field of study may be harder to name and locate, plenty of sub-fields and inter-disciplinary work address this exact issue. For example, in rhet-comp, Kevin Brock and Dawn Shepherd discuss algorithms and their persuasive power, and Jessica Reyman has analyzed issues of authorship and copyright with big data. Beyond rhet-comp, danah boyd continues to write on these issues, alongside work from the University of Washington.
But a gap remains to some extent, despite this research.
Personally, I see two potential reasons: hubris and tech’s failure to consider social media more critically. Regarding hubris, George Packer’s “Change the World” (2013) explores Silicon Valley’s optimism and its skepticism of Washington. After describing how few start-ups invest in charity, for instance, Packer writes:
At places like Facebook, it was felt that making the world a more open and connected place could do far more good than working on any charitable cause. Two of the key words in industry jargon are “impactful” and “scalable”—rapid growth and human progress are seen as virtually indistinguishable. One of the mottoes posted on the walls at Facebook is “Move fast and break things.” Government is considered slow, staffed by mediocrities, ridden with obsolete rules and inefficiencies.
After Russia’s propaganda push and amid ongoing issues, like Facebook’s role in genocide, this optimism seems naive and dangerous. Zuckerberg’s trip to the Midwest, the hiring of more fact checkers, and increasing government scrutiny seem to point to a change. But I’m not sure how much is actually changing in tech–or in larger structures like education and law.
This leads me to my second thought. In Being and Time, Martin Heidegger distinguishes between the ready-at-hand and the present-at-hand. The former refers to how we normally go through life, interacting with objects without much reflective thought, while the latter refers to the way a scientist or philosopher may look at things. In his hammer example, Heidegger says that we normally use a hammer without a second thought, but once the hammer breaks, we reflect on what it is or does.
Similarly, with the ugly realities of social media surfacing more often, we are more apt to examine and reflect. Before it “broke,” we used it as a neutral tool to communicate and pontificate digitally. As long as we continue to see social media as a neutral tool, or a tool just needing tweaks or fixes, we miss considering what social media is within a broader context of culture, economics, and society. We may be waking up to these deeper questions now, but we can’t fall back on ready-at-hand approaches to use and design.
As Lori Emerson (2014) argues, companies rush toward intuitive designs and ubiquitous computing, but we must consider how these trends black-box the values and potentials of our tools. Emerson and others suggest we can challenge these trends with firmer technological understanding, more democratized development, and the resistance of hackers and activists.
But with tech having so much power, I am not optimistic about change without a broader attitudinal shift in tech and elsewhere. I only see incremental changes coming, like more fact checkers and algorithmic tweaks. These are good and may lead to significant change in time, but fundamental outlooks in tech–what philosophers may call instrumental rationality–will likely stay the same. Many critique the Ivory Tower for its obsession with present-at-hand abstraction, but the Silicon Tower seems just as dangerous with its ready-at-hand reduction.
[Image: “Hacker” by the Preiser Project, via Creative Commons]