When reading Plato’s Gorgias and Phaedrus dialogues, as well as the Dissoi Logoi and Gorgias’ “Encomium,” three motifs struck me: the role of relativism, the act of teaching rhetoric, and the power of language. I also couldn’t help but meld some of these readings with where my head has been lately, so I think I’ll start there.
I don’t have much time today for a longer post, but as I was working through some older e-mail, I ran across the NPR weekly I get and saw this powerful image:
Though I had seen it before, circulating on social media, it struck me once again when I saw it this morning. A few things in particular stick with me.
The magazine showed a scarred, abandoned street. The sort of ruin porn that surfaces from Pripyat, Centralia, or some other orphaned collection of concrete and steel that once constituted “a city,” or at least something human. Shifting earth had torn ditches into the blacktop, like broken bread. Softwoods studded cracks with prickly, anemic limbs. Rubble and rocks piled outside stripped, sagging walls. Cloud-dimmed gray permeated the cityscape.
“18 WAYS TO SURVIVE THE APOCALYPSE” the headline said, in bold, sans-serif font.
The magazine was on a K-Mart rack, like a bruised piece of skin in an otherwise Willy-Wonka-bright palette of checkout-line candies, play dough containers, and glossed-up celebrities. The rest of the store was pretty quiet beyond the usual ambience of carts, footsteps, distant telephones, and distant arguments.
I think my dad and I were there to buy a couch cover.
I’ve been carrying these ideas around for a while now and am still thinking through them. With Trump, Brexit, Orlando, anti-trans bathroom laws, and other issues cycling through the media–or at least my media–lately, I keep coming back to Hannah Arendt’s The Human Condition, written in 1958 as a defense of philosophy’s role in “the active life” and a critique of its preference for “the contemplative life.”
Arendt opens the book discussing Sputnik. Being the first human-made object to leave the Earth, Sputnik represented, in the words of one reporter, the first “step toward escape from men’s imprisonment to the earth.” Arendt goes on to argue that science and technology have increasingly tried to make human life “artificial.” Extending lifespans, splitting the atom, in vitro fertilization, etc., for Arendt, “offer a rebellion against human existence as it has been given.”
I’m not as concerned with this “rebellion” and would side with others in the post-human view that technology and artifice have always been part of the human condition. Instead, what interests me more is Arendt’s next critique: “The trouble concerns the fact that the ‘truths’ of the modern scientific worldview, though they can be demonstrated in mathematical formulas and proved technologically, will no longer lend themselves to normal expression in speech and thought.” In other words, we’re getting ahead of ourselves. We can do things, like splitting an atom or raising an embryo in vitro, but we can’t talk about them as a public.
Mostly, I’ve just been trying to think through a few gun control-related things. I see opinions all over. Memes. Tweets. Enraged Facebook statuses. This may be part and parcel of that storm, but I wanted to take the whole thing slowly.
I really have nothing major to gain or lose in this debate personally. While I live in a violent city (Syracuse), I’m rarely in harm’s way directly. Perhaps now and then, but gun violence is not a daily reality in my physical proximity. I don’t own guns, but I also don’t have anything against gun ownership. I’m friends with hunters and gun enthusiasts, and consider them fine people. I also recognize that gun ownership is a constitutional right. More than that, it is part of the Bill of Rights, alongside things like freedom of speech and protection against double jeopardy.
But as Colbert said, when things like mass shootings keep happening, we should look at changing. I suppose the alternative would be to not change and take things as they are, which is an option. Moreover, I don’t think the idea of “change” needs to be threatening or draconian. Middle ground exists. Places for dialogue. Places for compromise. So mostly I want to point to conversations that I don’t see much in the mainstream media or on social media, including the stakes and confines of the debate itself.
Today, I saw an article floating around social media called “No, It’s Not Your Opinion. You’re Just Wrong” by Jef Rouner. It comes on the heels of similar articles, like this one from Vox about professors being afraid of liberal students or this cogent blog post about Twitter by Alex Reid.
As Rouner puts it, “There’s a common conception that an opinion cannot be wrong.” In many cases, this is fine. I may have an opinion on certain music or food. Having that opinion relies on aesthetic judgement, which may be informed, but has a different standard than scientific “fact.”
As the article points out, however, many people have “opinions” that seem to contradict “fact.” Bringing in the usual suspects–climate change deniers, people who connect autism to vaccines, people who doubt privilege–the article tries to argue that such “opinions” are simply wrong. They are misconceptions. Factual errors.
I think the brusque way the article deals with the problem, typical of most contemporary mainstream rhetoric, dodges some of the deeper complications. In reality, I think we have a major epistemological issue afoot, where our senses of fact, truth, and opinion, and the standards we use to judge these words, have become really messy.
With the protests in Baltimore taking place, I’ve been thinking a lot about race, equality, and social media. I don’t post much on the subject, as I’m not sure how I fit into the whole debate, but prompted by some conversations and thinking from this past week, I feel compelled to sort out my thoughts in a public or semi-public place. I don’t want to point any fingers or espouse any solution. Instead, I mainly just want to clarify.
In terms of labels, and the privileges traditionally ascribed to them, I’m high on the chart: straight, white, male, able-bodied, and upper middle class. I’m not part of the 1%, but I have never had to face any real discrimination. At most, I’ve had to curb decisions and trim back dreams due to constraints on money.
That said, I have felt “other” at points, whether through mingling in counterpublics or traveling to Egypt, where I was the minority at the station, in the markets, on the streets, etc. It is an odd feeling. A certain embodied and spatial self-consciousness, a sense of uncanny distance–even rejection–from the stuff and people around you. A sense of the exiled, alien, or intrusive.
With that in mind, I can’t say my experience connects to the experience of those alienated by the public sphere at large. My experience is vastly different from theirs, and I think that’s the key: find a multiculturalism that doesn’t try to erase the meaningful differences that do exist, but that provides a place where conversation can take place. Recognizing difference, but also recognizing relationship.
For example, Cornel West points out how many Black leaders become watered down or “deodorized,” in his words. We see this with MLK, honoring his “I Have a Dream” speech without recognizing his widespread attacks on capitalism or militarism. Or, as with Malcolm X and Nelson Mandela, many white Americans–and the media and textbooks I’ve encountered at large–stress the nonviolent turn they took, as if their violent actions were merely a muddy memory and not a meaningful response to the oppression they faced.
In more contemporary contexts, many white Americans might stress the “I recognize my privilege” narrative, while the victimized narratives of minorities–and the anger they express–become secondary. Some may attack the looting without reflecting on the anger, frustration, and institutional poverty that cause it, left over from hundreds of years of race relations, struggle, and oppression. Conversely, others may fixate on the violence without looking at the nonviolence and cooperation that may take place on the sidelines. Or impose borders that leave people alienated and uncertain. Or rely on unchallenged cultural assumptions and group dynamics, like the classic “rugged individualism” narrative.
Simply put, some narratives are more marketable or palatable than others. Some are easier to grasp, easier to hold onto, easier to repeat, easier to spread, or shout, or celebrate. Or, more bleakly, our own inborn biases–our confirmation biases, Dunbar’s numbers, and in-group proclivities–and our cultivation as individual people with limited access and viewpoints in an American milieu prevent us from seeing the whole stage. We get stuck, in a sense, caught up in simplification because the broader picture is so messy, uncertain, ugly, and inconvenient.
We are caught in a paradox: in constant relation, with constant separation. We are all in this together, but alone.
As I said, I don’t want to propose a solution, and though I unintentionally point my finger at certain broad perspectives, I’m pointing the finger at myself. I’m flawed. Even this thinking-through or “essai” may be fraught with errors, may be dangerous, may be constraining and insensitive.
If anything, though, I think we need a certain kind of sympathy, a certain kind of identification that traditional in-group out-group dynamics and bland #AllLivesMatter multiculturalism do not meet. Once upon a time, Americans were called “The People of Feeling.” Sympathy and “fellow-feeling” were a bedrock of politics and social relations, treated as scientific fact and filling all sorts of writings. Granted, as many scholars–like Julia Stern and Andrew Burstein–note, it often excluded many black Americans and women, but I think it still has some use for us today.
I know of few other things that highlight these affective ties and social relations as clearly and viscerally as sympathy. As Diane Davis argues, in order for unifying symbols–like language or culture–to develop, one must first consider an “always prior relation to the foreign(er) without which no meaning-making or determinate (symbolic) relation would be possible.” In other words, we are always in relation. We are always “being-with.”
With this in mind, we always have an ethical obligation to recognize our inherent relationship. But at the same time, that relationship always recognizes the inherent difference or “foreign” element of the other. Through “sympathetic imagination” and “emotional contagion” we may break down borders, but a separation always persists for most of us. “I” can never be fully “you.”
So perhaps I lied (again), and I am putting forth a position, but it is a pretty basic one: we are all different, each with different (inaccessible) worlds and stories, but we are also in the same communities, the same country, the same world.
With this in mind, I see the role of activism. I do not see myself fitting that role personally, but I think activism and the laws and tradition that allow such activism provide key resources. They are necessary. They are beneficial. But they also have a profound ethical dimension, because activism always addresses “our” world, not “my” world or “your” world. All of our individual actions ripple through the whole.
As one of my teachers would always say, “It’s complicated.” But just because it’s complicated doesn’t mean we shouldn’t seek progress or clarity. I think it just means we must do so with a clear sense of sympathy and “humility,” mindful of the “ground” where each of us stands and why.
I know I said I was taking a hiatus, but if you haven’t noticed, a few posts have been creeping up on the blog. I guess I’ve been looking for an outlet lately, and blogging provides an easy one. So while my clothes are in the wash, and I take a break from grading, I might as well post what’s on my mind.
I haven’t given much thought to the role of philosophy on this blog. I think my most extensive treatment was in this post, where I consider the paradox of “diseases of civilization,” or here, where I reblog a cartoon about questions. But last week’s video from Olly at Philosophy Tube gave me pause.
Here’s the video:
I agree with Olly. If one wants to define philosophy as a critical enterprise, composed of rigorous thought, engaged discourse, and reasonable (generally logical) standards of judgement, philosophy has relevance. So does the philosophy of Seneca, Epicurus, and others who challenge assumptions and habits to live a happier, more meaningful life.
The only “philosophies” that may require skepticism are the “new age” assertions that often creep into philosophy sections at bookstores and the glib retorts that people may palaver while sipping a beer or answering a question on television.
I say these deserve skepticism because they generally do not police themselves. As Kant said, “I have therefore found it necessary to deny knowledge, in order to make room for faith.” Just so: we should see where reason ends and where faith begins. Faith, too, can be meaningful, but it is different from most “philosophy,” even the Eastern type, which has standards, self-criticism, and limitations.
I have little to add to Olly’s own thoughts–and little time to add anything–but I think two things are particularly important regarding even the most mundane and rudimentary philosophical thinking.
I said to myself, “Look, I have increased in wisdom more than anyone who has ruled over Jerusalem before me; I have experienced much of wisdom and knowledge.” Then I applied myself to the understanding of wisdom, and also of madness and folly, but I learned that this, too, is a chasing after the wind. For with much wisdom comes much sorrow; the more knowledge, the more grief.
I once acted in a series of one act plays, and when I wasn’t running lines or rehearsing, I watched the other shows. One particular line has stood out from the experience: “Why be better?” I almost missed it, but hearing that line over and over, I finally realized how nihilistic it was. Yet, some days, I ask myself the same thing.
For the most part, it seems to be a modern question. Ennui, hysteria, and melancholy became common, even expected, medical diagnoses for the growing middle class in the 18th and 19th centuries as prosperity and public reform democratized leisure. Prior to that, some historians argue, people didn’t have the resources for ennui.
Couple this with growing cities, rising industry, and increased skepticism toward religion and morality–Darwin’s work being one cause–and one can see the anxiety and hopelessness that spur such questions, especially by the start of the 20th century.
We all have those moments when we say or do something foolish because our emotions “made us.” You decide to wait until the last minute to finish your work because you’d rather watch Breaking Bad, resulting in a last-minute panic. Or you lean in to kiss a friend because it “felt right,” only to be pushed away.
If anything, emotions make life interesting.
But for the most part, we like to think that we’re rational decision makers. To make choices, we consider our options and choose the one that makes the most sense. We’re not willy-nilly about such things. And those foolish, emotion-based decisions are a rarity, not the norm. As Samuel Johnson once said, “We may take Fancy for our companion, but must follow Reason as our guide.”
Moreover, most of our public discourse assumes that we are rational. Our economy’s dominant theory is “rational market theory,” and the framers designed our political system according to Enlightenment ideals of rational government.
Philosophy, in particular, has tended to focus on logic and reason. The Stoics are one famous example, but Socrates also prized logic over emotion, even to the point of death. Some exceptions exist, like Nietzsche and Rousseau, but they are precisely that: exceptions.
Indeed, most of us like to think that we control our destiny with rational choice–whether it’s in buying a car or choosing a profession–but research shows we may not be as rational as we think.