Over at the New York Times’ Opinionator, Zachary Fine ponders the millennial predicament of pluralism, and the pressure all 20- and 30-somethings face to inherit opinions that can most easily fit into the “new orthodoxy of multiculturalism.” Fine notes that pluralism is often gracefully self-described as “faithfully disinterested” or “energetically engaged with diversity,” but that its impact has created a kind of analysis paralysis. What can one say, we wonder, without awakening the beehive of multicultural non-violence? How can one have an opinion, when having one means being a bigot? The generosity of pluralism, in theory, seems to create a law of freedom that really only binds. In a political age when partisan catchphrases and nonsensical jargon seek to leave no one offended (but everyone disillusioned), I think Fine has described the truth.

However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.

Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which was unavailable to humans for all of history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”

These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.

For instance, a student in one of my English classes was chastened for preferring Shakespeare to the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.

I am not trying to tackle the challenge as a whole or indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?

Fine then moves into the category of taste, artistic and otherwise, and the inner scrutiny that this kind of pluralism creates. What does what I like mean about me, if everything else is just as valid, just as “good”? What hegemonies, what structures, are implicating me by liking this Coldplay song? What postcolonial oppression keeps me from applying a more generous regard to Lil’ Wayne?

You can see where he’s going. The matters of taste, of individual preference and desire, no longer just describe the person, but his or her entire global posture. Paradoxically, words like “awareness” and “celebrated diversity” tend to turn the “big issues” into an enormous arena of personal ones, where looking beyond yourself becomes the ultimate motivation in turning inward, a chance to posture your “likes” in a sea of moral rights and wrongs. Isn’t it easier to say “I don’t know”?

Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.

This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.

It seems especially odd because in our “postcritical” age, as the critic Hal Foster termed it, a diffusion of critical authority has elevated voices across a multitude of Internet platforms. With Facebook, Twitter and the blogosphere, everyone can be a critic. But for all the strident young voices heard across social media, there are so many more of us who abstain from being openly critical: Every judgment or critique has its weakness, making criticism seem dangerous at worst and impotent at best.

…We millennials often seek refuge from the pluralist storm in that crawlspace provided by the expression “I don’t know.” It shelters the speaking-subject, whose utterances are magically made protean and porous. But this fancy footwork will buy us only so much time. We most certainly do not wish to remain crippled by indecision and hope to one day boldly stake out our own claims, without trepidation.