The poet Charles Olson wrote, “Whatever you have to say, leave/ The roots on, let them/ Dangle/ And the dirt/ Just to make clear/ Where they come from.” Words are grimed, caked, and clotted with decades of use and wrinkled with age. Some words and phrases become anachronistic, like “winding” a window down in a world of electric windows. Others carry an explosive politics. Many get bleached by the endless passing of palms, losing a clear meaning.
But in a deeper sense, Olson’s lines remind me that we need to inspect our language in all its dirty history and daily use. To take it a step further: Words impact our world, etching our reality like the steady run of water on rock or blowing it up like dynamite.
As George Orwell wrote, “if thought corrupts language, language can also corrupt thought.” His classic 1984 also stresses the coercive and meaning-making power of language through “newspeak,” the official language of Oceania that uses simplicity and structure to limit free thought. For example, “bad” no longer exists; instead, one has “ungood.” By limiting expression, one limits thought. This, among other reasons, gets at the danger of censorship and its popularity among totalitarian regimes.
I was just reading Cathy O’Neil’s (@mathbabedotorg) New York Times piece on the tech industry and academia, which argues that academics have not done enough to study issues caused by recent technology, including filter bubbles and big data. Others have already critiqued some of the tone and oversights of the piece, with varying degrees of sass, but I want to look at it as a rallying cry. While I think the piece could give more credit to current researchers, it recognizes a dangerous gap between this research and the tech industry.
A few of O’Neil’s points are especially key. For one, she notes how big data is often cloistered in companies, reducing access to academics. She also notes how private companies hire academics, and she describes how funding that drives engineering and computer science programs may not include more humanities-tinged concerns for the ethical, social dimensions of technology.
More contentiously, O’Neil also says, “There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology — and specifically, the algorithms that are responsible for so many decisions — in our lives.” While a distinct field of study may be harder to name and locate, plenty of sub-fields and interdisciplinary projects hit at this exact issue. For example, in rhet-comp, Kevin Brock and Dawn Shepherd discuss algorithms and their persuasive power, and Jessica Reyman has analyzed issues of authorship and copyright with big data. Beyond rhet-comp, danah boyd continues to write on these issues, along with work from the University of Washington.
But a gap remains to some extent, despite this research.
Personally, I see two potential reasons: hubris and tech’s failure to consider social media more critically. Regarding hubris, George Packer’s “Change the World” (2013) explores Silicon Valley’s optimism and their skepticism of Washington. After describing how few start-ups invest in charity, for instance, Packer writes:
At places like Facebook, it was felt that making the world a more open and connected place could do far more good than working on any charitable cause. Two of the key words in industry jargon are “impactful” and “scalable”—rapid growth and human progress are seen as virtually indistinguishable. One of the mottoes posted on the walls at Facebook is “Move fast and break things.” Government is considered slow, staffed by mediocrities, ridden with obsolete rules and inefficiencies.
This leads me to my second thought. In Being and Time, Martin Heidegger distinguishes between the ready-to-hand and the present-at-hand. The former refers to how we normally go through life, interacting with objects without much reflective thought, while the latter refers to the way a scientist or philosopher may look at stuff. In his hammer example, Heidegger says that we normally use a hammer without much second thought, but once the hammer breaks, we reflect on what it is or does.
Similarly, with the ugly realities of social media surfacing more, we are more apt to examine and reflect. Before it “broke,” we used it as a neutral tool to communicate and pontificate digitally. As long as we continue to see social media as a neutral tool, or a tool just needing tweaks or fixes, we miss considering what social media is within a broader context of culture, economics, and society. We may be waking up to these deeper questions now, but we can’t fall back on ready-to-hand approaches to use and design.
As Lori Emerson (2014) argues, companies rush to intuitive designs and ubiquitous computing, but we must consider how these trends blackbox the values and potentials of our tools. As Emerson and others argue, we can challenge these trends with firmer technological understanding, more democratized development, and the resistance of hackers and activists.
But with tech having so much power, I am not optimistic for change without a broader attitudinal shift in tech and elsewhere. I only see incremental changes coming, like more fact-checkers and algorithmic tweaks. These are good and may lead to significant change in time, but fundamental outlooks in tech–what philosophers may call instrumental rationality–will likely stay the same. Many critique the Ivory Tower for its obsession with present-at-hand abstraction, but the Silicon Tower seems just as dangerous with its ready-to-hand reduction.
[Image: “Hacker” by the Preiser Project, via Creative Commons]
It’s been a while since I’ve blogged, and an especially long time since I’ve blogged “for fun” outside of a class requirement, but with the semester starting up again, I wanted to start off with positive habits, creating a space to think through things. For now, I’ve been thinking a lot about politics and what my own interest in play can bring.
Wary of becoming another “It’s Time for Some Game Theory” guy or the writer of a naive think piece that praises some creepy gamifying tactic, I nevertheless think that play, games, etc., have a lot to offer our thinking about politics.
When reading Plato’s Gorgias and Phaedrus dialogues, as well as the Dissoi Logoi and Gorgias’ “Encomium,” three motifs struck me: the role of relativism, the act of teaching rhetoric, and the power of language. I also couldn’t help but meld some of these readings with where my head is at lately, so I think I’ll start there.
I’ve been carrying these ideas around for a while now and am still thinking through them. With Trump, Brexit, Orlando, anti-trans bathroom laws, and other issues cycling through the media–or at least my media–lately, I keep coming back to Hannah Arendt’s The Human Condition, written in 1958 as a defense of philosophy’s role in “the active life” and a critique of its preference for “the contemplative life.”
Arendt opens the book discussing Sputnik. Being the first human-made object to leave the Earth, Sputnik represented, in the words of one reporter, the first “step from men’s imprisonment on Earth.” Arendt goes on to argue that science and technology have increasingly tried to make human life “artificial.” Extending lifespans, splitting the atom, in vitro fertilization, etc., for Arendt, “offer a rebellion against human existence as it has been given.”
I’m not as concerned with this “rebellion” and would side with others in the post-human view that technology and artifice have always been part of the human condition. Instead, what interests me more is Arendt’s next critique: “The trouble concerns the fact that the ‘truths’ of the modern scientific worldview, though they can be demonstrated in mathematical formulas and proved technologically, will no longer lend themselves to normal expression in speech and thought.” In other words, we’re getting ahead of ourselves. We can do things, like split an atom or raise an embryo in vitro, but we can’t talk about them as a public.
Mostly, I’ve just been trying to think through a few gun control-related things. I see opinions all over. Memes. Tweets. Enraged Facebook statuses. This may be part and parcel to that storm, but I wanted to take the whole thing slowly.
I really have nothing major to gain or lose in this debate personally. While I live in a violent city (Syracuse), I’m rarely in harm’s way directly. Perhaps now and then, but gun violence is not a daily reality in my physical proximity. I don’t own guns, but I also don’t have anything against gun ownership. I’m friends with hunters and gun enthusiasts, and consider them fine people. I also recognize that gun ownership is a constitutional right. More than that, it is part of the Bill of Rights, alongside things like freedom of speech and protection from double jeopardy.
But as Colbert said, when things like mass shootings keep happening, we should look at changing. I suppose the alternative would be to not change and take things how they are, which is an option. Moreover, I don’t think the idea of “change” needs to be threatening or draconian. Middle ground exists. Places for dialogue. Places for compromise. So mostly I want to point to conversations that I don’t see much in the mainstream media or on social media, including the stakes and confines of the debate itself.
These days, YouTube is still a great place to find videos of cats or middle school students acting out the Scarlet Letter, equipped with shaky camera shots and wind-buried dialogue. But it’s also a fascinating place to find some videos for a quick brain snack, a short (3-15 minute) video about an “educational” topic, often released weekly. Not only are such videos great background noise for morning routines, they can add some pep and multimedia to a lesson.
As Rouner puts it, “There’s a common conception that an opinion cannot be wrong.” In many cases, this is fine. I may have an opinion on certain music or food. Having that opinion relies on aesthetic judgment, which may be informed, but has a different standard than scientific “fact.”
As the article points out, however, many people have “opinions” that seem to contradict “fact.” Bringing in the usual suspects–climate change deniers, people who connect autism to vaccines, people who doubt privilege–the article tries to argue that such “opinions” are simply wrong. They are misconceptions. Factual errors.
I think the brusque way the article deals with the problem, typical of most contemporary mainstream rhetoric, dodges some of the deeper complications. In reality, I think we have a major epistemological issue afoot, where our sense of fact, truth, or opinion, and the standards we use to judge these words have become really messy.
I like YouTube. I like it more than television. Sometimes more than reading. It has plenty of strange alcoves and diverse pickings, from “weird YouTube” with its singing manikins and smashed together YouTube poop to the comedy skits of Mega64 and others. And this just scratches the surface.
I’ve noticed an interesting figure in some of these places. I call it the YouTube Intellectual. An ever-growing smattering of YouTube channels center on intellectual topics or deal with popular topics, like video games, in intellectual ways.
Playdough. Tiny hands tweak, pinch, stretch the dough into tinsels, meaty threads, snakes curling into snail shells–suddenly smashed flat, “like pancakes,” and rolled smooth in young palms into spheres. Perhaps, with a few gentle, well-placed tugs, the children tease out arms and legs, or a simple face, then the fingers close, vise-like, dough peeking slightly from the spaces between, molding and shaping it into a small brain, nooked and crannied, and grained with palm lines.
Then, at the end of the day, it all goes back in the plastic can, smashed, once more, into an uneven cylinder. “Don’t forget,” say the teachers, “or else you won’t be able to play with it anymore.” Sealed behind primary-colored lids and walls, the malleable plaything remains withdrawn and dormant, waiting.
“Writing is revision.” A teacher I once shadowed said this a few times. So did I to my own students, thinking it a properly provocative, axiomatic phrase. Something White Lotus from Kung Fu might say if he taught first year composition.
But, to be honest, I don’t really know what it means. Is it a reference to something like Linda Flower and John R. Hayes and their “cognitivist approach to writing,” in which revision and pre-writing are part of the “writing process”? Or perhaps it’s a more political adage, on the “revision” of ideological entrenchments and social structures. Writing allows one to “revise” the state of things, both inside our heads and outside, in the world.
Or it may stretch the never-ending inventive tweaking that revision entails over the whole of writing. In other words, writing is a constant “revision” of sorts, a constant trying to get words out as best as we can. We are never done. The moment we pick up our pens, we are already revising. The moment we “finish,” we are still revising.
If one mentions (or Googles) Albert Camus, the word “absurdity” is not far behind–neither is “existentialist,” which is a whole other issue. But, as with most cases of historical association, things are more complicated.
The “absurd” is the first of three philosophical progressions for Camus. During WWII, Camus wrote the trilogy of the absurd: the play Caligula (1944), the novel The Stranger (1942), and the essay The Myth of Sisyphus (1942). This, he said, would be his guiding process, tackling his ideas with a play, a novel, and an extended philosophical essay.
His second trilogy centered on revolt, inserting human values in the face of nihilism. Writing the book-length essay The Rebel (L’Homme révolté, 1951), Camus received a wave of criticism. For one, he attacked the French left, which included his friends Sartre and Beauvoir, because they knew about the atrocities of the GULAG and still supported Stalin.
But more pointedly, Camus also changed his thoughts. He no longer was the “prophet of the absurd,” but the spokesman of revolt. While some argue this shift was a complete rejection and others say it forms a “continuum” with absurdity, both represent a shift.
As Camus writes in his essay “Enigma,” “Everyone wants the man who is still searching to have already reached his conclusion. A thousand voices are already telling him what he has found, and yet he knows that he hasn’t found anything.”
Camus was still searching, still stumbling and exploring his ideas, flashlight in hand, but his public name was already solidified–and, in many ways, remains so.
Playdough is revision. You’re never done tweaking or sculpting it. As its name suggests, playdough is always “play,” never product. Pure process, pure doing, all about feeling the grainy pliant substance stick and fold with your fingertips. And each time, it goes back in the container, like an artist who scrubs away his canvas just to start again.
It’s not “art for art’s sake,” but creative construction and exploration without a clear endpoint. Like a sand box or a “sand box” game, playdough provides a space to explore the space. That’s its end and means.
In a sense, it even differs from a “game,” our usual sites of play, as playdough has no constraints. No “rules” that structure the game. For example, in soccer (i.e. football), because you can’t use hands and arms, the “game” is to use one’s other body parts to head, dribble, kick, cross, and score.
Playdough has no “rules,” except, perhaps, a parent saying you can’t stick it on the rug.
Camus also wrote that writing is a “daily fidelity,” a daily act of holding onto and working one’s ideas and images into something that may take years. For some reason, Camus often latches on to five years, saying that one must have an idea five years before one starts writing about it.
Camus’ often forgotten first novel A Happy Death is a steppingstone to The Stranger. The character is a cold, detached Algerian named Mersault (a one-letter difference from The Stranger’s Meursault). It evokes similar images, similar echoes and feelings, though the novels differ profoundly.
Scrapped and unpublished in his lifetime, A Happy Death may be a failure in some ways. Or a mere writing exercise, a book-length warm up for a new writer. But still, the question remains: how much does it stand on its own? How much is it part of The Stranger? And how much does the distinction matter?
I always remember that our English “essay” comes from “essai,” the French word for “trial” or “to test the quality of” (like metal in a furnace), echoing Michel de Montaigne’s Essais, which he viewed in a similar light. They were not meant to be polished, finished pieces, but “trials” and “attempts,” sketches or studies in a sense that tested his ideas.
Like Camus’ daily fidelity and playdough’s unfinished pliancy, Montaigne’s Essais were searching, roving, and unfinished–despite receiving countless edits, read-throughs, and revisitings. And like Camus’ writing, the Essais offer profound political and philosophical insights. Here, writing is revision, and revision is powerful.
Encountering most essays, however, we often see them as static and finished. We also see them as discrete and separate–or when not separate, as “derivative” or “remixed.” But technology provides a possible return to Montaigne’s Essais or a possible shift into the realm of playdough, of productive play, as our “interfaces” are often not static. Here, writing is much like revision.
Only I shudder to use the word “productive,” because it has become an instrumentally focused word, layered with nasty, anxiety-inducing overtones that make me wonder if I’m “doing enough,” and “keeping up,” and not “wasting time.”
So, in a sense, technology allows us to have interfaces of co-authorship, interaction, constant change, new mechanics of invention, etc., but we also need a culture that can explore this. We may have playdough interfaces, but we need a playdough culture, a culture that isn’t telling us what we have “found,” to paraphrase Camus, but relishes the play of the finding. Doing so, we may further liberate our technology and creativity to innovate and express. But most of all, it may bring more freedom and joy back into the creative process.
As I said above, it’s not art for art’s sake, but doing for doing’s sake. It’s about turning revision into invention and vice versa. It’s about taking our tacky, doughy language and playing with it, seeing what comes out as we stretch and flatten it into compositions.