Last spring, I finished my undergrad, where I dragged myself through a severely disoriented and disorienting thesis. Among the many lessons I learned in the process, I discovered something that deeply hindered my academic writing: I hated it. This revelation surprised me because I entered that research project believing I liked it and did it well. Now, I could barely sustain either of those beliefs.

Among the many qualities of scholarly writing I now found deplorable: it was infinite, and its vastness no longer promised to enchant me but to consume me whole. Every book or article contained a bibliography of books and articles that contained their own bibliographies, too, in an endless chain of intellectual prowess that I could not master yet had to pretend to. Worse still, this interminable ocean of academic text was dominated by voracious critique and obscure theory which knew no bounds.

Right in time for (everyone else’s) fall semester, that undergraduate dread and nihilism are returning to my mind as I’ve been considering applications to grad schools. Amidst this uncertainty, I discovered an old article in The Point Magazine, “When Nothing Is Cool,” which gave voice and validity to my anxieties about critical theory. The writer, Lisa Ruddick, is herself an academic, an English professor at UChicago, so her sympathetic essay feels like a wink from on high. She describes interviewing about 70 grad students in English departments (my turf) to ask about the current state of their discipline. The results were decidedly negative.

Among those Ruddick interviewed, many described “unaccountable feelings of confusion, inhibition and loss” as they attempted to situate themselves in academic conversations. Reading those words, I felt like she had surveyed me. And amateur academics were not the only ones feeling such intense gloom. Ruddick spends most of the article supporting this sense with her own experience and a wide-ranging analysis of journal articles. It’s not just me, then: especially regarding critical theory, many people think the humanities are inhumane.

Ruddick summarizes her claim: “Repeatedly, we will find scholars using theory—or simply attitude—to burn through whatever is small, tender, and worthy of protection and cultivation.” This tactic she calls “academic cool,” and it “disdains interpersonal kindness, I-thou connection, and the line separating the self from the outer world and the engulfing collective.” In other words, what matters most is theoretical abstraction, not the living persons who are vulnerable to it. As Ruddick continues, “Novices subliminally absorb the message that they have no boundaries against the profession itself. The theories they master in graduate school are such as to make their own core selves … look suspect and easy to puncture analytically.” Yet critical theory is held to be “untouchable,” as it “supports a new and enhanced professional self.”

Ruddick’s article gives coherence to that nagging, unnamable disturbance I had felt for so long. I remember first beginning to recognize it during a brief conversation with an English professor, who specialized in Romantic poetry. I loved his class and subject, but I doubted that I could justify that desire. So I asked him how he did it.

“But why do you study poetry?” I asked him.

“Because I fell in love,” he responded.

And I replied, “Is that enough?”

I realized then that the academy could be a difficult place to pursue a “passion.” In nearly every endeavor, institutions require members to rationalistically vindicate their work, to justify the work’s (and their own?) existence and value with arguments and well-documented proofs. And scholarly competition—to enroll, earn high marks, connect with faculty, advance to further study, publish, reach tenure—demands performance of the highest order, which must be the very best, the most useful, the most objectively valuable, according to any number of critical theories. For some time, I had pondered this state of affairs and been at a loss to defend my interest in obscure literary studies, so I took the matter to my English professor expecting reassurance. But when he told me that he studied poetry for love, all my hopes of justification dissolved. All of his—all of my—reasoned, objective, significant, and sequential actions rested on something almost entirely subjective, of meaning and consequence only personally: my aesthetic attraction, my curiosity, my vacillating interest, my “love” of my subject. But how can any work founded ultimately on what I merely enjoy ever satisfy the academy?

I have considered some possible responses to my dilemma. My cynical inclination cites Nietzsche at me. Upbraiding “objective” philosophers, he rages,

They all pose as if they had discovered and reached their real opinions through the self-development of a cold, pure, divinely unconcerned dialectic … while at bottom it is an assumption, a hunch, indeed a kind of ‘inspiration’—more often a desire of the heart that has been filtered and made abstract—that they defend with reasons they have sought after the fact. (Beyond Good and Evil, 5)

That is, personal affection does not merely spark academic work; it pervades and determines it, and logical explanation is an afterthought. 

Hearing Nietzsche’s tone, or simply knowing this is coming from Mr. God-is-dead, may put some readers on guard. However, his goal is to call out not subjective philosophy but the pretense of pure objectivity in philosophy. All philosophy, on his view, is subjectively influenced, since it is written not by machines but by people, with likes, dislikes, and organic lives. (For my purposes, I’m reading “philosophy” in the broadest sense, as the “love of wisdom,” regardless of the boundaries of academic fields. Whatever I study and write, I’m doing it in the midst of all of my other experiences and desires, not in an abstract bubble of pure reason.)

As Ruddick points out, this is exactly the distinction some academics seem to have missed. Whereas Nietzsche emphasized the individual person’s subjective, internal experience, many contemporary scholars (specializing in critique and theory) reject such an emphasis or condemn it as a product of a host of social evils: bourgeois ownership, capitalist consumption, patriarchal privilege, etc. If selfhood is linked to these concepts that theory opposes, then theory must also oppose selfhood. Ruddick explains:

Let us assume a proposition that most American psychoanalysts would find uncontroversial, namely that human beings have inner lives—ideally rich ones—and a degree of self-cohesion. As students are brought into our profession [English literary studies], they typically learn to see this view as that of “mainstream psychology,” which in turn is fraught with bourgeois ideology. … they are assigned theories arguing, at an extreme, that the very border between inner and outer worlds is (as Judith Butler has argued) “maintained for the purposes of social regulation and control.” They will also occasionally encounter work that uses the profession’s radical critique of interiority and autonomy to make the shattering of selves look edgy and progressive. I nowhere mean to suggest that the profession does not offer good criticisms of U.S. ideology. The problem is the scorn for self-cohesion that has wound itself in with the project of social critique.

In this setting, many novice intellectuals find themselves disoriented, as their very “selves” are taken apart by their intellectual pursuits. By this reasoning, internal feeling is worthy of suspicion because it is supported by oppressive regimes of thought. And if internal feeling deserves doubt, what is left to guide academic inquiry? Ruddick describes interviewing “many young academics who say that their theoretical training has left them benumbed,” no longer able to remember the way their subject first tugged at their imaginations.

Even as a recent undergrad, not quite initiated into the system, I can see myself in these words. The problem I find is that my every inclination, interest, attachment, or curiosity (which could guide productive inquiry and criticism) is itself the object of critique before any inquiry can even occur. And I feel this preemptive critique because I’m a “good” student—I have allowed the discipline, the theory, the critique to engulf my whole mind so that, for me, no boundary separates academia and the rest of my life. Perhaps ironically, Nietzsche would agree with the theorists that no boundary separates the two arenas, but he would say the influence runs in the wrong direction. We ought not expect the “objective” academy to infuse and control our “subjective” lives; our “subjective” experiences and desires already and inherently inform our “objective” study, whether we recognize it or not.

To be clear, I don’t oppose academic theory any more than I wholesale champion individualism. But what if (and I think Ruddick is suggesting this, too) there is a slightly more considerate position? Academic theory does necessary, perhaps even prophetic, work by exposing the exploitation of the small and the weak in our social systems. (E.g., one important critical theory, “deconstruction,” ultimately derives its name from Luther quoting Paul quoting Isaiah.) And on the other hand, individualism can do great harm as well, if it distorts or inhibits community, so essential to Christian life. What if we were to see clearly, but not to see everything clearly, since the latter option, though seemingly attractive, is its own problem? As C.S. Lewis writes in The Abolition of Man,

But you cannot go on ‘explaining away’ forever: you will find that you have explained explanation itself away. You cannot go on ‘seeing through’ things for ever. The whole point of seeing through something is to see something through it. It is good that the window should be transparent, because the street or garden beyond it is opaque. How if you saw through the garden too? It is no use trying to ‘see through’ first principles. If you see through everything, then everything is transparent. But a wholly transparent world is an invisible world. To ‘see through’ all things is the same as not to see.

From their Greek roots, theory derives its name from seeing, and critique, from judging. These capacities are good gifts, when ordered rightly in God. That last phrase is a sad joke, though—on a good day, I see with a little less envy and judge with slightly less bias. But I look forward to seeing no longer through a glass darkly but face to face, to being judged not by my own ambivalence but by the Judge who takes my penalty. And these eschatological hopes are not irrelevant additions to the conversation, even in classrooms and libraries stacked with volumes of critical theory. “Thy kingdom come” is the groan of God’s own Spirit with his whole creation, always and everywhere, until it finally does come, and our theories shall be made sight, and our critiques shall be taken up into God’s pardon.