EGO LECTOR

Mental marginalia & cerebral effluvia


  • (T)werkin’ for the Weekend

    In her book Working Girl: On Selling Art and Selling Sex, Sophia Giovannitti writes:

    Of course, I can do wage labor, and I have, and I do. I have held many jobs; many are fine. But fine is not how life should be. I firmly believe that no one should have to work to live, that the imperative to sell one’s labor in exchange for the fulfillment of basic survival needs is a foundational violation.

    The objection begins to form in the mind of the average citizen, trained through endless capitalist propaganda to accept that this is just how the world works: you should grow up and do your time. Most of us have to go to work, and as much as we may hate it ourselves, we tend to suppose that people who advance notions like this are living in some kind of Candyland fantasy world; our response is to think that everyone just needs to suck it up and suffer as humans were meant to.

    And yet, consider this: most people, if asked what they would do upon receiving a massive financial windfall such as winning the lottery, would likely say that they would want to ensure that some person or people in their lives never had to go to work again. We know that wage slavery is unpleasant and degrading, and many of us fantasize about escaping it ourselves and granting that release to the people dear to us. But when such a notion is applied to society at large, the abstraction of people into statistics leads to a rejection of empathy in favor of what passes for disinterestedly rational realism.

  • Rectitude vs. the Commonwealth

    Ronald Syme, The Roman Revolution (pp. 104–105):
    “The memory of Antonius has suffered damage multiple and irreparable. The policy which he adopted in the East and his association with the Queen of Egypt were vulnerable to the moral and patriotic propaganda of his rival. Most of that will be coolly discounted. From the influence of Cicero it is less easy to escape. The Philippics, the series of speeches in which he assailed an absent enemy, are an eternal monument of eloquence, of rancour, of misrepresentation. Many of the charges levelled against the character of Antonius – such as unnatural vice or flagrant cowardice – are trivial, ridiculous, or conventional. That the private life of the Caesarian soldier was careless, disorderly, and even disgraceful, is evident and admitted. He belonged to a class of Roman nobles by no means uncommon under Republic or Empire, whose unofficial follies did not prevent them from rising, when duty called, to services of conspicuous ability or the most disinterested patriotism. For such men, the most austere of historians cannot altogether suppress a timid and perhaps perverse admiration. A blameless life is not the whole of virtue, and inflexible rectitude may prove a menace to the Commonwealth.”

  • Vanity (Book)fair

    Enrique Vila-Matas, Bartleby y compañía (p. 26):
    “The entirety of Walser’s work, including his ambiguous silence of twenty-eight years, served as a commentary on the vanity of every enterprise, and even the vanity of life itself.”

    Toda la obra de Walser, incluido su ambiguo silencio de veintiocho años, comenta la vanidad de toda empresa, la vanidad de la vida misma.

    Samuel Johnson, The Rambler (No. 106):
    “No place affords a more striking conviction of the vanity of human hopes than a public library; for who can see the wall crowded on every side by mighty volumes, the works of laborious meditation and accurate inquiry, now scarcely known but by the catalogue and preserved only to increase the pomp of learning, without considering how many hours have been wasted in vain endeavours, how often imagination has anticipated the praises of futurity, how many statues have risen to the eye of vanity, how many ideal converts have elevated zeal, how often wit has exulted in the eternal infamy of his antagonists, and dogmatism has delighted in the gradual advances of his authority, the immutability of his decrees, and the perpetuity of his power?”


  • Smash the Sets!!!

    Whether you view him as a crank or a prophet, you have likely seen Marshall McLuhan cited ad nauseam in discussions of media and its effects on the mind. I was particularly taken with this set of quotes:

    McLuhan once said to his friend and colleague Tom Langan, while watching television, “Do you really want to know what I think of that thing? If you want to save one shred of Hebraeo-Greco-Roman-Medieval-Renaissance-Enlightenment-Modern-Western civilization, you’d better get an ax and smash all the sets.” And he was no more accommodating to the electronic beast in his advice to his son Eric regarding one of Eric’s daughters in a 1976 letter: “Try not to have Emily exposed to hours and hours of TV. It is a vile drug which permeates the nervous system, especially in the young.”

    It’s often said that McLuhan was far better at diagnosing the social effects of media and even prophesying its total capture of our attention (see: ‘the global village’) than at proposing solutions to these ills once diagnosed or prophesied. But here is a solution, spelled out in simple language and presented in summary form. (“Smash all the sets.”) ‘But,’ the objection begins to form in the reader’s mind, ‘it’s simply unrealistic to imagine that we would ever smash all the sets and retreat to a pre-mediatized state of existence.’ This is to assume that the word ‘solution’ necessarily entails feasibility. In fact, there are many problems which admit of clear solutions that are deemed, at a very democratic level, beyond the pale of what civilization is willing to countenance. Examples include:

    • Gun violence in America: It is beyond question that entirely banning the sale & manufacture of guns, combined with a full recall/confiscation program and draconian punishments for possessing them, would staunch the flow of blood from our society. But this is never seriously proposed in public.
    • Nuclear anxiety: As Will Self notes, we live in a horrific double bind in which the institution which is meant to protect us (the government) also possesses the instruments for effacing human life from the planet and occasionally suggests its willingness to employ them. Obviously, the only real solution to this fear would be complete global disarmament. But this is not possible.
    • Environmental disaster: This is an interesting case, because in this situation even the solutions for management/mitigation of the crisis to a level of tolerable death, displacement, and disorder have been freely discussed in public for decades, yet in actual practice we collectively continue to do exactly the opposite of what we ought if our stated goal is to avoid catastrophe.

    One could likely think of several more examples. But to return to McLuhan: his solution to the problem of civilizational degradation in the age of mass media and excess entertainment is to revert to a pre-televisual world. Certainly he never expected that society would ever follow through on this plan, but it seems to me that it would effectively be the only way to reverse the broad civilizational decline which McLuhan delineates.

    Debates about the dangers of social media or the internet more broadly often see at least one citation of the old go-to, “People said the same thing about the telegraph/radio/TV, and they were wrong.” But perhaps they were right. There is no way for us to assess whether we are doing better collectively than we would have without decades of mediatization of our entire experience of the world. Consider: a patient with advanced Alzheimer’s is the one person without a clear sense of how much of their mental power has dissipated. Similarly, we cannot really know what course our collective consciousness or our individual brains may have taken in an alternative world in which these technologies never affected us. Someone who used drugs heavily in their teens will never know in middle age whether they did irrevocable damage to their cognitive capacity because there is no real comparison point.

    We are adaptable creatures, but I suspect that each generation which complained as it witnessed some new technology warping the experience of life in real time was right to do so. It may be the case that every technological development from the Industrial Revolution onward was a mistake from a purely phenomenological perspective. Trains did distort our experience of time by allowing for such rapid movement and by compelling us to follow standardized “clock time,” an imperative which we are born into but which at one time caused a rupture in human experience. So too did swift global communication by telegraph inaugurate an age of anxiety: so began the mass addiction to news from everywhere, allowing people to extend their range of worry beyond their communities and focus on the problems of the entire world.

    I cannot take seriously anyone who argues that television was not a serious problem for the late-20th-century mind. The kind of ultra-stimulating and highly engrossing entertainment on display tended over time to shorten our attention spans, to lower our threshold for boredom, and to remake the real world in its own image. Even in elementary school I recall my teachers talking about the importance of image in the Kennedy/Nixon debates. Those who think that Trump’s presidency would have been impossible without the internet get it wrong: it would have been impossible without the tyranny of the gaudily visual imposed upon us by TV. Much of the internet, after all, owes its instant and addictive appeal to the fact that it’s really just a distillation of the style of television. Video essays are just Andy Rooney at length; videogame streaming is just ESPN for geeks; TikTok is just sketch comedy or music videos in which regular people get to star. And of course, it all runs on the magic juice that kept the TV on, kept the radio broadcasting, and kept the news presses running: advertisement. Not only did TV warp politics into a contest of televisual charisma; it also accelerated our addiction to consumer culture. The more we were entertained, the more we would be shown things to desire. So it simply isn’t the case that fears of TV were overblown: our minds work differently, our politics was rendered more superficial, and we were driven to want & demand much more by way of material goods than people were before the sets turned on. (The fact that, in typing that last sentence, I immediately thought of the scene in which Blofeld tells James Bond, “I am the author of all your pain,” [Blofeld here is mass media] is a perfect demonstration of my point: my mind is so saturated with televisual detritus that even primarily verbalized thoughts about abstract historical trends can suddenly conjure up a neuron-projected clip in my head.)

    Even on the internet, it is not uncommon to hear or read a certain popular sentiment directed against the internet itself. Not just its current incarnation (insiliconization?), but its very existence. While I myself am prone to wax nostalgic about a simpler time of basic forums, chat rooms, and text-heavy HTML sites, those who wish to return to that earlier state don’t see that its very structure was simply waiting to be commercialized and, in the parlance of our times, sloppified. Barlow’s declaration that governments and global capital were unwelcome on the web was already too late. Something about the medium itself primes our brains for the very kind of exploitation that is ruining them. I’m glad that we live in a post-ironic age, lest the irony of my posting this on the internet be put to me. We should have smashed all the sets, and we should have smashed all the modems too; but now, you’re integrated into the world and culture of the internet whether you’re on it or not, because it shapes public opinion and the popular psyche, and has in fact become the default source for epistemic and cultural validation. Just as Fredric Jameson said that it is easier to imagine the end of the world than the end of capitalism, so too is it impossible to imagine a life without the internet. We could probably recover something of civilization if we gave it up; but we won’t.

  • The Nightmare of the Past

    Karl Marx, The Eighteenth Brumaire of Louis Bonaparte (part 1):

    The tradition of all dead generations weighs like a nightmare on the brains of the living.

    Die Tradition aller toten Geschlechter lastet wie ein Alp auf dem Gehirne der Lebenden.

    James Joyce, Ulysses (2.377):
    History, Stephen said, is a nightmare from which I am trying to awake.

  • Like Old Man Like Yells At Sky

    In my more peevish moods, I give vent to my spleen. Years ago, I found that I could no longer listen to NPR for a simple reason: too many of the hosts sounded too inarticulate to be taken seriously as representatives of an organization belonging broadly to the intellectual logosphere of our country. As much as I enjoy podcasts, I’ve found that even many of the smartest writers whose work I admire seem to have yielded entirely to the tyranny of the word ‘like.’ Now I struggle to find pleasure in any of the ones which aren’t framed as comedy because I find myself frustrated at the progressive degradation of conversational sentence structure.

    In Taylor Lorenz’s latest interview with Charlie Warzel, it struck me that it is now par for the course for professional journalists to conduct interviews employing syntactic structures that might once have passed for a parody of an inarticulate high schooler in the ’90s or early aughts. Both Lorenz and Warzel expressed their thoughts with a combination of gestural/affective utterance, memetic suggestion, and a mass of likes strewn about (319 total). As a mode of communication, this puts heavy demands on the listener, who has to supply most of the reasoning underlying their assessment of the Twitter influencer location reveal scandal. (I should note that I like Lorenz’s tech reporting and feel entirely sympathetic to her views, which is precisely why I listen regularly, but also why I find myself annoyed by this.)

    Somehow, I cannot take “they’re not just like bad sometimes like it’s like fundamentally poisonous to society” seriously. I recognize that the epistemic turbulence which the internet has wrought is indeed fundamentally poisonous to society, but it’s hard not to think that the poison has spread so far that even the internet’s sharpest critics don’t realize what a horrific effect the inundation of brain rot and the ceaseless insipid chatter online have had on their capacity for verbal articulation.

    I realize that this complaint will make me appear a crank to many readers who feel far less uncomfortable with this aural assault than I do. I know that linguists like to burnish their progressive credentials by explaining this away as a filler or a hedge and then proceeding to suggest that being annoyed by its use is a sign of a socially reactionary or prejudiced disposition. But I don’t find the phenomenon so galling when people are having casual conversations. It seems, however, that serious ideas requiring serious airing ought to be clearly, if not forcefully, expressed. A discussion on an important topic, conducted by two journalists who follow that topic, and produced for intentional dissemination to a regular audience requires a different register than chitchat does.

    Not to belabor the point, but perhaps the reason why many public intellectuals are undervalued today is that they don’t, even to the lay ear, express themselves like intellectuals. The internet is less democratizing than demoticizing. Now that most of those obnoxiously dubbed ‘thought leaders’ are as halting and inarticulate as the average citizen, they have ceded one of the distinguishing tokens by which they might be recognized as thoughtful people.

    According to an old literary anecdote that often circulates unsourced and variably quoted, Karl Kraus suggested, in reaction to the Japanese devastation of Manchuria, that such atrocities would not be happening if only people had cared more for the proper placement of the comma. While the point of the story is to paint him as a caricature of the pedantic stickler, it also hints at a greater application: the health of a civilization is tied to its modes of expression. As ours becomes more telegraphic, more reliant upon memetic reference, and generally more driven by the conveyance of affect than by a chain of linear reasoning, so too do people become less able to discern what is being done to their brains.

  • The Eyes Have It

    In his Confessions, Augustine explores how visual metaphors tend to be applied regardless of the sense properly involved in a given perception, though the relation is never reversed: we do not borrow the language of the other senses to describe seeing. Though some of his Latin examples don’t map onto English idiom perfectly, it is nevertheless interesting to observe that roughly the same relation holds for us as it did for him. Could it be that our own age is enthralled to the tyranny of the visual precisely because there is a deep-seated dominance of sight in our nature? Augustine is, naturally, concerned with the ability of the eyes to excite humans to the pursuit of carnal pleasure, but it’s hard not to note that most of the commercial & political (as well as sexual) appeals to our attention draw so heavily on visual stimulus.

    Augustine, Confessions 10.35:

    huc accedit alia forma temptationis multiplicius periculosa. praeter enim concupiscentiam carnis, quae inest in delectatione omnium sensuum et voluptatum, cui servientes depereunt qui longe se faciunt a te, inest animae per eosdem sensus corporis quaedam non se oblectandi in carne, sed experiendi per carnem vana et curiosa cupiditas nomine cognitionis et scientiae palliata. quae quoniam in appetitu noscendi est, oculi autem sunt ad noscendum in sensibus principes, concupiscentia oculorum eloquio divino appellata est. ad oculos enim proprie videre pertinet, utimur autem hoc verbo etiam in ceteris sensibus, cum eos ad cognoscendum intendimus. neque enim dicimus, ‘audi quid rutilet,’ aut, ‘olefac quam niteat,’ aut, ‘gusta quam splendeat,’ aut, ‘palpa quam fulgeat’: videri enim dicuntur haec omnia. dicimus autem non solum, ‘vide quid luceat,’ quod soli oculi sentire possunt, sed etiam, ‘vide quid sonet,’ ‘vide quid oleat,’ ‘vide quid sapiat,’ ‘vide quam durum sit.’ ideoque generalis experientia sensuum concupiscentia (sicut dictum est) oculorum vocatur, quia videndi officium, in quo primatum oculi tenent, etiam ceteri sensus sibi de similitudine usurpant, cum aliquid cognitionis explorant.

    There is, furthermore, another more manifold and dangerous form of temptation. For, beyond the desire for the flesh, which inheres in the enjoyment of all senses and pleasures, and serving which those who have cut themselves off from You regularly perish, there is a certain vain and curious desire not of delighting oneself through the flesh, but of experiencing through it, which is cloaked in the name of thought and knowledge. Since this is part of the appetite for learning, while the eyes are the leaders among the senses in that pursuit, it is called the ‘greed of the eyes.’ For seeing properly pertains to the eyes, but we use the language of sight even when dealing with the other senses when we use them for learning. For we never say, ‘Hear what reddens’ or ‘smell how it shines’ or ‘taste how it gleams’ or ‘feel how it glows’, because all of these things can be said to be seen. We say, however, not just ‘look at what shines’ because only the eyes are able to sense it, but even ‘see what sounds’, ‘see what smells’, ‘see what tastes good’, ‘see how hard it is.’ And so, the general experience of our senses is called the concupiscence of the eyes, because the job of seeing, where the eyes have the chief office, is a thing usurped by the other senses for themselves on the basis of their similarity when they are engaged in acquiring knowledge.

  • The Ivory Tower Gets a Trust Fund

    At the beginning of his Vocation Lectures (issued by NYRB as Charisma and Disenchantment), Max Weber explains that under the German university system of the late 19th century, only the rich could realistically pursue academic careers. At the time, a junior scholar who had completed a second book and received his Habilitation from an elder academic rented out his services in a transparent way by receiving fees directly from the students who attended his lectures. (This may be usefully compared to the current situation of the precariat under the gig economy.) This went on until the young lecturer was called to a salaried chair. Weber contrasts this with the American university system’s tendency to hire junior faculty members as assistant professors who are expected to work through a relatively predictable system on their way to becoming full professors.

    Of course, anyone even moderately familiar with American academia today knows that this no longer holds. Not only has the work of junior academics been rendered more precarious through the extension of visiting assistant professorships (perhaps better described as ‘transient academic peonage’), post-doctoral fellowships, and other cost-saving labor arrangements, but many tenure lines have been excised entirely from the university’s books. Not only must the young scholar work far harder at teaching in an increasingly anti-intellectual environment, but all of that hard precarious labor also has no larger goal than the perpetuation of itself. Realistically, most hopeful professors will have to settle for the vassalage of adjuncting forever in exchange either for the intellectual satisfactions of the scholar’s life or for emolument in the form of enhanced social prestige. [This latter point can be contested: really, how much prestige does our society accord even to a full professor today? Though, on the other hand, I as a high school teacher am often asked why I don’t ‘just go teach college instead,’ a linguistic formulation only possible for someone who has no idea what the project entails. Nevertheless, it’s clear that they mean to suggest that it’s a better job than teaching high school, presumably because they reckon it to be more lucrative or more prestigious.] But almost all of them are smart enough to see the truth: at the end of all of those years of grunt work at the ivory tower’s foundations, they will never secure that full professorship, if for no other reason than that it simply doesn’t exist anymore.

    The system under which junior scholars labor is, in effect, just as exclusive as the old German system which Weber criticized. Academic careers, along with other associated lines that might be considered intellectual vocations, such as journalism, curatorship, professional criticism, etc., are now potentially ruinous career choices for those who have no financial safety net to depend on. Though by temperament extremely inclined toward the ivory tower myself, I never pursued the dream of a university career, for the same reason that I don’t continue living my vivid sleep-borne fantasies upon waking: I can see that they were naught but dreams, without substance or reality. Though that personal note may suggest that this is written purely out of animus toward a system which I was, ultimately, too scared to join, it strikes me as an objectively undesirable state of affairs. Should both scholarship and higher education become pursuits only for those with a trust fund or the kind of soul comfortable with living life on the economic margins, then academia will be nothing more than another of our institutions entirely captured by the elite and their mode of thinking.

  • Ennius on Literary Celebrity

    Nemo me lacrimis decoret, nec funera fletu
    faxit. Cur? Volito docta per ora virum.

    Let no one honor me with their tears or perform my funeral rites with weeping. Why? I flit about through the mouths of learned men.

    [Ennius, fragment cited in Cicero’s Tusculan Disputations 1.117]

  • Logical Phallusy: On the Silliness of ‘Male Reader’ Discourse

    Opinion pieces and reporting about the decline in male readership are afflicted at the outset by a basic epistemological hang-up: how would we even assess this? Neither survey data nor shopping analytics seems scientifically sound, given that both will be skewed by some form of selection bias. In the case of surveys, both the distribution of and response to surveys are potential fault points. As for sales data, it seems obvious upon reflection that there is no necessary correlation between buying books and reading them in a world where a) some purchases are purely performative and b) one needn’t purchase new books to be a serious or thoughtful reader. I suspect that the every-few-months greenlighting of a new article about the death of reading among men (or the youth) rests largely on what contemporary parlance nebulously calls vibes.

    A cursory glance at the world’s most famous men does indeed reveal a want of literary interest. It’s hard to imagine any of the men in the broader manosphere settling down with a chunk of literary heft, nor are our testosteronal titans of industry and penis-swinging politicians the great exemplars of bibliomania that we wistfully imagine previous, more enlightened lords of the world to have been. On a more proletarian scale, you may struggle to think of many ‘regular guys’ among the dudes, chads, chuds, bros, and douches encountered in daily life who burn with desire for ocular application to the printed word. Indeed, of the men whom I know, I can think of only one who has any thought for literature at all.

    But this is all anecdotal impressionism, not the kind of foundation on which to construct a superstructure of moral panic. Perhaps, after all, what is really happening is that reading and literature have ceased to be a locus of virility in the way that they were in the pre-digital age. Could one imagine Samuel Johnson’s declaration about his reading being taken at face value if made today?

    “What he read during these two years, he told me, was not works of mere amusement, ‘not voyages and travels, but all literature, Sir, all ancient writers, all manly: though but little Greek, only some of Anacreon and Hesiod; but in this irregular manner (added he) I had looked into a great many books, which were not commonly known at the Universities, where they seldom read any books but what are put into their hands by their tutors; so that when I came to Oxford, Dr. Adams, now master of Pembroke College, told me I was the best qualified for the University that he had ever known come there.”

    I myself was taken aback by Martin Amis’ note in Experience that Iris Murdoch was his favorite ‘woman writer.’ What could this imply other than that, even in the mind of a self-declared feminist writing at the end of the 1990s, literature still seemed like a primarily male preserve?

    Tyler Cowen is a man. Moreover, he is a man who reads a lot, and I have seen him discuss literature with sensibility and a certain depth of appreciation. Nevertheless, he has also claimed that he finds Twitter more ‘information dense’ than books. This seems like an odd argument to make in light of how much disinformation and frivolity has always characterized Twitter, even well before the days of Musk. (There is a tendency now to nostalgically sentimentalize an earlier incarnation of the platform as a great place for journalists and academics to engage in open information sharing, but I seem to recall three distinct phases of Twitter: 1) mid-aughts dismissal of the platform as geared toward triviality; 2) mid-teens mass adoption characterized by childish arguments, blocking, pile-ons, and retreats from the platform for the recomposure of mental health; 3) degradation into mostly porn, misogyny, and Musk-favored talking points.) Perhaps Cowen’s point is that the limitations of attention online and the old restrictions imposed by character limits meant that there was less tolerance for fluff than in other media, but given that these nuggets of information density were surrounded by irrelevant mental effluvia, it’s not clear that there would have been any more ‘density’ here than in any of the baggiest monsters.

    But why am I even addressing this point? No book, excepting works of reference, is meant simply to carry information. In the case of books on science, economics, philosophy, history, sociology, etc., the information is presented as part of a larger argument which you are meant to follow as you progress through the volume. Returning to Martin Amis, and his novel aptly titled for our purposes The Information: he once said, when asked what he was trying to say with the novel, that it took him several hundred pages to say it – that is, the novel is what he was ‘trying to say.’

    In the case of literature, though – whether it be poetry, novels, or even narrative non-fiction – the point is not to receive information, but to have an experience. You may learn something when reading a novel, or you may not. Learning is incidental to the process. No one assesses music on the basis of sonic density, on the amount of sound that can be crammed into a given amount of time. No one looks at a painting for chromatic density. It is the mark of the most debased philistinism to suppose that the point of reading is simply to extract encoded information and download it into your own neural server. I have spent more of my waking hours as an adult reading than on any other individual activity, and yet I remember only a trivial fraction of the ‘information’ contained in those books, even if I vividly recall the experience that I had while reading them. But in truth, I have also forgotten most of the actual events which I have experienced in my own life. If I thought of reading as an activity structured around the acquisition of information, I too might come to the conclusion that it was, if not wasted time, then at least a poor use of it.

    So, are men reading less than they used to? Perhaps. I am not entirely unpersuaded by the vibes-based argument, given that it does seem that men talk about sports, videogames, TV, and working out far more than they do about books. But then again, so does all of society. There was a brief period spanning a good chunk of the 19th and 20th centuries during which literacy was increasing and literature had a certain operative force within the world. This period was also one in which the literary firmament was dominated by titanic male egos who seemed to have the same power to distort reality that tech bros have today – maybe men just flocked to books because it seemed like a manly enough thing to do. But as our dominant paradigm for exploring, commenting upon, and shaping the world shifted from the literary to the technical, maybe the boys’ club just shifted its venue.

    This is all by way of saying: who really cares whether men are reading or not? They tyrannized over the literary kingdom for quite a long time. Moreover, it’s hard to recall a time when hand-wringing over some demographic/intellectual trend did much to alter the trend in question. Whether driven by data or purely by vibes, the panic over male readership misses a salient point: reading is a quiet and fundamentally solitary activity, something which, wherever one does it, is still taking place entirely within one’s head. In a world driven by sharing evidence of our identity through immediately intelligible outward display, such a lonely pursuit within one’s private phenomenology cannot compete for attentional space. You can watch a livestream of people eating, working out, playing videogames; but one would be hard-pressed to imagine tuning in to someone reading quietly to themselves, and indeed the very act of televising it would cast suspicion on one’s commitment to the activity of reading itself. Perhaps we’ll regret the day when mass male reading makes a return and it becomes just another obnoxious pissing contest.