The Gaming Press Is A Sorry Tool For Wokescolds

In last week’s edition of “Playing Politics,” I talked about an article from The Gamer written by a journalist pathetically trying to claim that “Resident Evil 5,” a game set in Africa, was horribly, irredeemably racist because it starred a white man. The week before that, I mentioned an article from IGN where the author expressed her deep disappointment that “Final Fantasy 16,” a game inspired by medieval Europe, didn’t look like modern-day San Francisco. 

And late last week, I had the great misfortune of stumbling upon an article from Polygon claiming that Link from the “Zelda” series is a gay icon, and that “The Legend of Zelda’s queer themes are more than just subtext.”

In all these situations, gaming itself took a backseat to the politics of the writer. Graphics, gameplay, and fun be damned, modern-day games journalists are more concerned about what kind of message the game sends or how they can use our favorite characters to further their leftist agenda.

Video game journalism has been in a sorry state for a while now, but I think we can place the beginning of its decline precisely in 2014, with the advent of Gamergate. Other than which starter Pokemon to pick — the correct answer, by the way, is Squirtle — Gamergate is perhaps the most controversial topic in gaming.

For those not in the know, Gamergate was a campaign to push for ethics in video game reporting. It started when a female games developer failed to disclose a sexual relationship with a journalist who positively reviewed her game. 

The whole thing collapsed into cyber warfare, with each side accusing the other of spewing vitriol and sending death threats. Very messy. To this day, Wikipedia refers to Gamergate as “a loosely organized misogynistic online harassment campaign and a right-wing backlash against feminism, diversity, and progressivism in video game culture.”

Gamergate was so scarring to the leftist psyche in the gaming sphere that it ignited the same obsession with activism in the culture-focused press that we see with legacy outlets like The New York Times and The Washington Post. But I don’t want to rehash Gamergate too much; that’s a deep, deep rabbit hole that I would really rather not stumble down right now.

What really matters here is what happened after Gamergate: the massive uptick in politics posing as games journalism. Leftist journalists saw what happened and felt they had to use their platforms to fight for the “right” politics instead of, you know, informing gamers whether a game was fun or not.

Hang on, you’re saying I should review the new Mario game? Sorry, but I’m too busy writing about how Princess Peach is actually a symbol of white supremacy and Luigi is trans. 

To drive the point home, let’s take another recent example of the gaming press attempting to push a political message instead of doing their jobs. I’ve talked about “Hogwarts Legacy” before and how the alphabet people and their sycophants hate it because of J.K. Rowling’s views on trans issues. But from the way journalists reported on this game when it came out, you’d think Voldemort himself had made it.

From Inverse: “The Real-World Cost of Hogwarts Legacy Is Unforgivable”

From The Gamer: “Hogwarts Legacy Made Us Ignore Transphobia, So What’s Next?”

From GameRant: “Hogwarts Legacy Should Have Done More To Support Trans Players”

From Polygon: “Why I’m not buying the Harry Potter game: Hogwarts Legacy is tainted by J.K. Rowling’s transphobic views”

Notice anything missing here? Like an actual description of the game itself? Anything relating to whether it’s good or bad? Is it even fun? Nah, who cares, let’s just accuse Rowling of being a bigot and whine about the people who buy it.  

The gaming press has become yet another tool in the arsenal of wokescolds. And it is legitimately making gaming worse. There absolutely should be a press that talks about games in an objective, non-political way. Gamers deserve serious analysis, not just leftist talking points.

But, like the “mainstream” outlets they’re apparently trying to ape, truth has been replaced by dogma. And journalism has been replaced by activism. 


Douglas Blair is a producer and special correspondent with “O’Connor Tonight” on the Salem News Channel.


Washington Post Found A Lot Of Happy Transgender People, If You Don’t Count Those Who’ve Committed Suicide

The Washington Post must be very proud of itself for tackling one of the biggest problems with the transgender debate (lack of data) and concocting a chipper outcome (most trans people are happy!).

On Thursday the paper published what it called “one of the largest randomized samples of U.S. transgender adults to date about their childhoods, feelings and lives.” The survey took place last year from Nov. 10 to Dec. 1, among 515 U.S. adults who identified as trans and another 823 U.S. adults who did not.

There were a lot of depressing statistics about respondents who are less likely than regular people to say their childhoods were happy and who are more likely to say they often feel anxious. But the ultimate takeaway was that the vast majority of those who said they had “transitioned” away from their natural sex, 78 percent, were “more satisfied” with their lives than before the transition process.

That’s great news for the Post and the rest of the national media trans-champions because it would mean that all the concern and controversy over people who want to irreversibly alter their bodies and appearances with hormones and surgical operations is about nothing!

See? This isn’t about mental illness. Transitioning is real and they’re happier for it!

Like every other one of the media’s new “findings” that fly in the face of common sense and reality (poor diet and lack of exercise have absolutely no bearing on Covid-related morbidity!), it’s nonsense.

Virtually everything about the survey renders it useless, starting with the way it defined “transition.” The Post, in partnership with the Kaiser Family Foundation, set up the survey so that “transgender” includes anyone from those who have undergone surgical operations (just over 15 percent) down to a person who simply says, “I’m transgender,” regardless of whether he does anything at all that distinguishes him from a regular man. By that standard, Janelle Monae, a beautiful actress whose sex is unambiguous in every way, is “transgender,” solely because she calls herself “nonbinary.” (Look! Another wholly satisfied trans person!)

Then there is the utterly pointless one-time questioning of an individual who is dealing with a lifelong, highly fluid conflict. A person who genuinely feels that his mental and emotional state are not in line with his biology may very well feel different one day. He may revert back to his original feelings. They may change yet again. It could be sooner or it could be later. It happens all the time.

A person who decides to live in ways that make him feel more like the opposite sex isn’t a one-and-done. That’s something that has to be examined over an extended period of years. As far as actual genital surgery goes — in theory, the most fulfilling type of “transition” — only one study to date has examined outcomes over that kind of time span.

Over the course of 30 years, six Swedish doctors and scientists tracked the outcomes of 324 transgender people who had received sex surgeries — 191 male subjects and 133 female subjects. Each subject was cross-referenced with 10 random control subjects of the sex that the trans subject was impersonating. For example, a woman would be a control subject for a man who had had surgery to make himself look more like a woman.

The results were devastating.

Subjects who underwent surgery were more likely than the control subjects to receive inpatient care for a psychiatric disorder, to be convicted of a crime, and to develop cardiovascular disease. The mortality rate by suicide was most striking: Transgender subjects were roughly 20 times more likely to have committed suicide within 10 years of their operations.

“This study found substantially higher rates of overall mortality, death from cardiovascular disease and suicide, suicide attempts, and psychiatric hospitalisations in sex-reassigned transsexual individuals compared to a healthy control population,” the authors concluded. “This highlights that post surgical transsexuals are a risk group that need long-term psychiatric and somatic follow-up.” They noted that surgery and experimental cross-sex hormones may provide some relief for transgender people, but that such treatment is “apparently not sufficient to remedy the high rates of morbidity.”

I don’t doubt that many transgender people are happy. It’s just too bad that The Washington Post can’t survey the ones who killed themselves.



A Professor Abandoning A Spouse And Kids For A College Student Isn’t Brave, But Wicked


Can a professor’s affair with a student diagnose what ails our culture? When that professor gets a celebratory profile in The New Yorker, the answer is a resounding — and depressing — yes.

Agnes Callard is a married University of Chicago professor who left her husband for a male student, divorcing the former and marrying the latter. Because Callard reported the affair to the administration before it had a chance to investigate, the celebrated philosopher managed to keep her tenured position.

Professors have been leaving their long-suffering wives for their students for a very long time. Callard’s case reverses the typical gender roles, but it’s 2023. Most of us recognize that, given enough liberty to do so, some women behave just as badly as men. Isn’t that part of what feminism fought for? The freedom of women to make the same reckless, consequence-free decisions as their husbands and brothers?

Callard, of course, takes it a little farther than the standard-issue account of an affair. In a long and worshipful profile by Rachel Aviv, Callard argues that her divorce and second marriage are part of an Aristotelian pursuit of the Good, the Noble, and the True. Just six weeks after falling in love with her student, she came clean to her classes, not pleading for forgiveness but asking them to join her on a philosophical investigation of the nature of love:

After the talk, a colleague told Agnes that she was speaking as if she thought she were Socrates. “I was, like, ‘Yeah, that’s what it felt like,’ ” she said. “I felt like I had all this knowledge. And it was wonderful. It was an opportunity to say something truthful about love.”

Almost exactly ten years ago, I resigned from my tenured position at Pasadena City College, where I had taught history since 1993. Like Callard, I had an affair with a student. Like Callard, I made the affairs (there was more than one) known to the college. Like Callard, I was never accused of sexual harassment or misconduct by either my student lovers or third parties. Like Callard, I was married with two young children. Like Callard, the revelations ended my marriage.

Unlike Callard, I resigned from my job. In the early 2000s, I chaired the committee that wrote the policy forbidding “consensual romantic relationships” between students and faculty at Pasadena City College. I violated that policy on more than one occasion. I deserved to have my 20-year teaching career brought to an unceremonious end.

When you lose your career in a very public way, as I did, people remember. Over the past 10 years, I’ve had countless conversations online and in real life about whether what I did merited my resignation. No one thinks what I did was acceptable, but there’s disagreement as to whether it should have been career-ending.

That’s a question of great interest to me personally, but it doesn’t have broader implications. What does have real implications, I think, is the increasingly wide gap between how my friends on the left and on the right assess and interpret my actions.

Reckless Lecher or Unfaithful Spouse

Millennial and Gen Z lefties are famously suspicious of any romance that has even the slightest hint of a power imbalance. There’s a new and marked hostility towards age-gap relationships.

When I taught sex education in the 1990s, we focused on the importance of “enthusiastic consent.” The young puritans on the left question whether meaningful consent is even possible unless the two parties are exactly the same age and enjoy precisely the same status.

A student who enthusiastically consents to an affair with their professor may feel powerful, or at least equal; they may believe they enjoy the whirlwind romance. They are wrong, the left says. They have a “false consciousness” that deludes them into thinking they are a predator’s equal. My friends on the left are glad I lost my job, as in their mind, it removed a reckless lecher from a campus filled with vulnerable young people.  

My friends on the right tend to be much more concerned about the betrayal of my marriage vows. They are much more likely to say that my real victims were not the young women who willingly took me into their beds, but my son and my daughter.

My kids were 4 and 1 when the scandal broke. Their mother and I have had a blessedly amicable divorce, but even the most civil of separations is devastating to small children. Some of the students I slept with remain my friends; others are out of contact. None accused me of abuse, or of doing them any real harm.

My children, though? No matter how devoted a non-custodial papa I may be, the harm my affairs inflicted resonates in their lives in ways that they still can barely grasp. It is my friends on the right who look at my ex-wife and children and say, “This is the thing for which you most need to repent.”

Marriage Is Not a Private Affair

It’s obviously possible to think that I behaved badly towards multiple people and institutions. Betrayal is not a zero-sum game. It’s telling, though, that the left tends to dismiss infidelity as a private matter while seeing the affairs with students as a matter for public concern.

Marriage is hardly a private matter. I signed a marriage license issued by the county, and that was just as public a legal document as the offer of employment I signed at a college. The state clearly does have a vested interest in marriage.

The left pushed so hard for same-sex marriage because they understood the incomparable importance of the institution. We can disagree as to whether permitting gay couples to wed merely expands or actively degrades marriage, but there’s no question that all sides consider the issue important. The remarkable haste with which a Democrat-controlled Senate pushed a repeal of the “Defense of Marriage Act” through last year’s lame-duck Congress makes it clear: redefining marriage matters to the left.

Let me qualify that: making sure that marriage is open to everyone matters to the left. It is the right, though, who seem the only ones concerned with the health of those marriages. It is the right that is more likely to recognize that divorce, while sometimes inevitable, justified, and necessary, is invariably a tragedy.

There Is No Enlightened Affair

For Callard, divorcing her first husband was no tragedy. It was, as she tells us in the fawning New Yorker piece, a vital step towards self-discovery. Instead of acknowledging that her first husband and young children were collateral damage of her affair, she insists she has given them useful lessons about their own possibilities for happiness. Instead of apologizing to her students for taking one of their number to bed, she lectures them on the insights the affair has given her and encourages them to follow in her footsteps.

To be sure, there’s still the pesky matter of college policies, but Callard and her student lover played that part perfectly: “In accordance with university guidelines, they declared their desire to have a relationship to the chair of the philosophy department.”  The medieval church had papal indulgences for sin; modern university campuses have sympathetic administrators ready to absolve horny faculty who are calculating enough to confess an affair with a student in advance.

I knew that it was wrong to cheat on my wife. I knew that it was wrong to sleep with students, even if they were enthusiastic and willing. I knew that it was desperately wrong to risk my children’s happiness. I did it anyway.

I fell into despair, had a complete mental breakdown, and ended up hospitalized for months. Wracked with guilt, I gave ill-advised interviews to the press. When these were published, the embarrassment of my family was compounded. “We love you, but you have shamed all of us,” a cousin said.

Long, Horrible Effects on the Whole Family

I’ve done the best I can with the decade since. I’ve worked in retail and as a ghostwriter. My large extended family has, slowly, welcomed me. I know I can never return to teaching. There is no way back, but there is always a way forward, and I have walked that way forward. Shame, however justified, is not an excuse for despair.

I have remarried, and my wife and the mother of my children are not merely civil, but genuine friends. That’s not quite the same as Callard’s arrangement, in which her student-husband and her former husband both live with her. The bloom is off that rose; Agnes admits that she’s a little disillusioned with her second marriage as well.

Her dazzling intellect gave her words to defend and elevate a sordid fling, but while words endure on paper, the feelings they describe tend not to last. The reader is left with the distinct feeling that the professor may not have had her last divorce. (Her sons are now older, and not available for the interview, which one suspects is lucky for Callard. I would like to hear their views, someday.)

Insisting that Evil Is Actually Good

“Mostly this woman is just not a good person, and the men around her are pathetically weak.” That’s how a friend, in an email, described Callard and her two husbands. That’s harsh, but it’s also true. What makes her “not good” isn’t that she had an affair. It’s wrong to cheat, it’s wrong to break your promises, and it’s wrong to sleep with your students, but these are things that humans do even though they shouldn’t. What makes her “not good” is the same thing that makes the moral agenda of the contemporary left “not good” — it doesn’t just tolerate vice, or forgive it; it insists on redefining vice as virtue.

It is very human to try to justify our worst impulses. We are all good at making up reasons why we do things we shouldn’t. At some point, if we have a conscience, we realize that the excuses aren’t working. We confess our sins, ask forgiveness, and try to do better.

Callard is a gifted philosopher, so her defense of her own grubby impulses is lofty and eloquent. She’s fooled herself, and two weak men have bought into her vision. That’s their problem, of course, but it’s ours too. Vice that is dressed up as virtue becomes an example.

Callard constructed a philosophical defense of the indefensible and then peddled it to the world. In her self-absorption, she hasn’t connected her moral vision to a broader politics. But her rationalization of id and impulse is standard doctrine on the contemporary American left.

A culture that celebrates leaving your spouse for your student is a culture that has no concept of what marriage is, or what biological sex is, or when life begins. All these contemporary cultural battles are so intense precisely because one side has effectively — and monstrously — redefined basic truths.

Even many of my friends on the left rolled their eyes at The New Yorker piece. Callard comes across to almost everyone as someone too smart for her own good, surrounded by men too weak to tell her “No.” What these friends don’t see is the extent to which Callard’s justification of the morally reprehensible is, in fact, part and parcel of an entire movement to redefine society, responsibility, and virtue.


Hugo Schwyzer was a professor of history and gender studies at Pasadena City College from 1993-2013. He is now a ghostwriter living in Los Angeles.


How Ditching Social Norms Guarantees Failure, Not Freedom

For Baby Boomers, this famous quote, attributed to Joan Collins, humorously captures a change many of them have observed in American society during their lifetimes: “Grandmother used to take my mother to the circus to see the fat lady and the tattooed man — now they’re everywhere.”

The reality behind the quote is that over the past few decades, American society has largely abandoned standard norms — of appearance, language, and behavior — in favor of a far more à la carte, each-to-their-own choice of how individuals should or can present and conduct themselves in society.

Fashions, standards, and mores evolve in all societies. What differs about the sea-change in this area since the late 1960s, however, is that rather than having one set of conventions gradually replace another, there has been a progressive withdrawal from the very idea of externally perceivable norms as the unwritten framework for what mainstream society considers acceptable and unacceptable.

The new “normal” is a near-absence of norms.

To hark back for a second to the Joan Collins quote, the example of the general spread of obesity and tattoos — once so marginal as to be limited to unfortunate occurrences of glandular dysfunction in the former case, and sailors and convicts in the latter case — is emblematic of not only those things now accepted or even celebrated, but of their co-existence with every other possible manner in which people today choose to present themselves in society, from Goth fashion, to sportswear as everyday attire, to the fetishization of bulbous buttocks, to social media selfies photoshopped to the outer extremes of the possibilities of the human physique.

And we have witnessed, in the past few years, the deconstruction of perhaps the oldest societal norm of all — the distinction between the two biological sexes — into the current very loud controversy over gender fluidity and pronouns.

“What’s the harm in that?” many will ask. And indeed, at the surface and individual level, this kaleidoscope of tastes, fashions, and projected identities can be seen as the simple continuation of the trend toward individualism that started as far back as the late Middle Ages.

But, at the collective level, real dangers come when a society moves from a framework of standards to a culture with virtually no common standards at all.

Shaped gradually over centuries by collective and implicit agreement on what was seen as desirable or healthy for both the individual and for society, norms have traditionally expressed a society’s values. For the most part, they were not enforced by the law, but rather by the sense of shame that society’s disapproval would trigger in those who transgressed such norms.

The softer side of these once-ubiquitous societal norms was called “etiquette,” or, more generally, basic good manners. Their traditional function was to ensure a certain decorum and civility among people. One intent was to shield those perceived at the time as more vulnerable or delicate from unpleasantness and vulgarity, as seen in the lost norm that men should avoid foul language in front of women and children.

Proper etiquette also served to prevent distasteful and annoying behavior in social interactions (not chewing with an open mouth, not speaking with a mouth full of food, moderating volume of speech) as well as to demonstrate respect or appreciation for others or for places of hospitality (not wearing hats indoors, bringing a small gift when invited to dinner). These types of observances were seen as the tangible outcome of a “good upbringing,” thus constituting positive societal reinforcement for the parents of the well-mannered individual.

The infringement of a norm implies a judgment is being passed. And in today’s culture, being accused of being “judgmental” is akin to being labeled intolerant, hateful, bigoted, or worse. Passing judgment is the antithesis of what seems to be the only universal virtue left in modern society: being non-judgmental, or, to put it simply, being “nice.”

Disapproval of pretty much anything has been replaced by the need to be seen as positive, caring, understanding, tolerant, empathetic, open-minded, solicitous, and generally kind-hearted. Hence the virtue-signaling flooding social media and the decline of fire-and-brimstone religion, which has been increasingly replaced by the “positivity” of its modern-day substitutes such as meditation, yoga, and “holistic wellness.”

Culturally, being “nice” equates with the judgment-free acceptance of all forms of fashion, art, music, personal identity, and language, no matter how aesthetically unappealing, cacophonic, vulgar, or simply preposterous. Politically, the mantra of niceness has given rise to forms of moral relativism in public policy that have resulted, for example, in cash-free bail, the decriminalization of certain forms of theft and other crimes, the widespread legalization of marijuana, cities flouting federal immigration law, and the focus on historical revisionism.

But what has replaced the self-conscious self-regulation once underpinned by societal norms?

The prevalence today of morbid obesity, the universal spread of profanity, the wearing of unflattering or inappropriate clothes, the popularity of extensive tattoos and piercings, the rise of widespread drug abuse, the brawling in theme parks and aboard commercial aircraft, and the problem of increasing criminal violence all point to a culture off track. Our culture has abandoned the standards that existed two or three generations ago — standards of what was considered healthy, positive, and generally appealing for both individuals and for society as a whole.

Those with a “progressive” bent will decry this line of reasoning, arguing that aesthetics is by its very nature subjective, that “fat shaming” is a reprehensible form of social cruelty, that formerly illegal drugs have merely taken their place alongside tobacco and alcohol, and that crime and violence are merely symptoms of an unjust society. No amount of evidence or logic will sway such views.

But the abandonment of standards brings a far more damaging societal consequence, and it is hiding in plain sight: the creation of a highly visible and most often irreversible distinction between elite society and, for want of a better term, “the masses.”

This phenomenon can be witnessed most easily in those places where the affluent and the privileged, as well as the moderately successful — let’s call them “the 10 percent” — live in close proximity to the remaining 90 percent, and particularly to those fellow citizens considered “the underclass.”

In these tony neighborhoods — from New York’s Upper East Side to Los Angeles’ Beverly Hills, from Washington, D.C.’s Georgetown to Chicago’s Gold Coast — the obese, the heavily tattooed, the eyebrow-studded, the inarticulate, and the badly dressed are rarely seen and are readily identified as “not belonging” by these neighborhoods’ slim, elegant, and urbane local residents, as well as by the doormen, security staff, and shop assistants they employ.

The resulting subtle apartheid demonstrates the law of unintended consequences: the loosening of societal standards, thought to increase freedom of lifestyle choices and self-expression, has also resulted in the creation of a quasi-caste system in which the plus-sized, the inked, the unintelligible, and the shabby or sloppy are — de facto if not de jure — denied the ability to compete for jobs, for mates, and generally for opportunity in the residential enclaves and in the professional sectors of the rich, svelte, chic, and highly educated.

This is the real tragedy of a pendulum that has swung too far from conformity to “anything goes.” For these readily identifiable non-elites, social and professional upward mobility is today blocked by the very jettisoning of standards that was advertised as the path to a more modern, tolerant, free, and fluid society.

Instead, we have a culture that is undoubtedly coarser, more unattractive, and more confrontational than it was in the recent past and, most importantly, one in which a large social class of “plebeians,” constituting an absolute majority of the population, is now limited in its aspirations — not just by education, generational wealth, and family connections — but by language, behavior, and physical appearance as well.

The result feels like a cruel trick played by the elites on their less fortunate fellow citizens, preventing anyone from below from joining their ranks. Yet the abandonment of normative societal standards has not been imposed from above. It is instead a cultural spin-off of the wider anti-establishmentarianism of the late 1960s.

It “felt right” at the time to throw off the shackles of convention across a broad range of cultural domains. But the trendsetters of the time did not anticipate what happens in a cultural vacuum.

Societal standards — whether in dress, language, body shape, or behavior — have existed since the beginning of civilization for a reason. We abandoned them, for the first time in human history, against logic and at great cost. And those who can least afford it pay the highest price.


Paul Maglione is a writer and education entrepreneur who spends his time between the U.S. and Europe. He writes about politics, culture, and everyday life.


‘Woke’ Effectively Describes The Left’s Insanity, And That’s Why They Hate When You Say It


When was the last time you were called racist? When was the last time you actually cared about being called racist? Odds are you get called it quite often and care (or should care) about being called it very little.

That’s because lobbing accusations of racial bigotry at anyone who gets in their way is second nature for the left. So when people stopped taking these accusations seriously — realizing it is simply impossible for everything to be racist — the left began decrying “white supremacy,” semantically invoking Nazism.

When accusations of racism failed to coerce enough action, the left moved on to a pejorative with far worse aesthetics while maintaining the same message. Accusing people and institutions of “racism” had lost its utility due to rhetorical inflation, and the era of “systemic white supremacy” had begun.

According to some, the conservative movement and the American right writ large are experiencing a similar ongoing dilemma with the word “woke.” Many suggest the word has come to mean nothing due to right-wing over-saturation, while others insist it has taken on a far more nefarious tone.

Nevertheless, the question remains: Why has the word “woke” become so problematic?

Bad Faith

On Tuesday, Bethany Mandel, co-author of “Stolen Youth: How Radicals Are Erasing Innocence and Indoctrinating a Generation,” appeared on The Hill’s “Rising” to discuss leftism’s role in damaging American families. 

During the discussion, Briahna Joy Gray, co-host of the “Bad Faith” podcast, inquired if Mandel would “mind defining ‘woke,’ ’cause it’s come up a couple [of] times, and I just want to make sure we’re all on the same page.” What followed was a brief moment of self-consciousness in which the author stumbled over her words before offering a generally accepted definition of the term.

Despite this, the moment was clipped, and the author was lambasted as both a bigot and buffoon across the web. 

The whole point of this exercise was to humiliate someone offering a coherent definition of woke-ism that was insufficiently deferential to the whims of leftist ideologues. However, this attempt was unsuccessful. 

What Is Woke?

Dragging Mandel through the digital public square did not result in the typical groveling struggle session that has come to be expected whenever people explain their opinions in public, but it did inspire many to inquire about the nature of the term “woke.”

The term started to increase in prevalence in the early-to-mid-2010s, back when “Black Lives Matter” referred to a hashtag, not an organization, and when the hot-button social issue du jour was the legalization of homosexual marriage. Despite its original meaning, used in common parlance simply to refer to personal vigilance, “woke” quickly took on social and political meanings. Just as every other community uses specific language to signify in-group allegiance, “woke” was used to align oneself with the broader cause of the burgeoning leftist cultural hegemony and, by extension, the Democrat Party.

But as the term became more and more associated with the party, it became less specifically connected with racial protest movements and more so a shibboleth for supporting the party platform — “stay woke,” the slogan went.

It is undeniable that woke-ism and the people who get protective of the identifying label “woke” have an influential presence on the political and cultural left. There was even a short-lived Hulu series titled “Woke” that chronicled a previously apolitical black cartoonist’s journey through the intersectional landscape of identity politics. And in 2018, “Saturday Night Live” poked fun at the concept of corporate fashion brands using woke-ism to market schlock to well-intentioned hipsters.

Woke-ism came to define a movement so insurgent among the institutionalized powers of the left that even its vanguards, like former President Barack Obama and Rep. Hakeem Jeffries, who undeniably had a role in ushering it in, bemoaned its rancorous presence and how it distracts from the Democrat Party’s larger goals.

This was something the Democrats fully embraced until they could no longer fully control the semantics around it.

It’s a Good Bad Word

Woke-ism is simultaneously a persistent ideological framework and a general inclination — it depends on the person or institution in question at the time. But both rely upon a consistent smorgasbord of Marxian dialectics and ideological accouterment — gender theory, critical race theory, et al. — that seeks to usurp the ideals of the American founding and impose contemporary whims. 

The word has become as commonplace among the current-day conservative movement as MAGA hats and “lock her up” chants were at 2016 Trump rallies. And this is, to be fair, totally warranted; what other slogany-sounding word really works as a catch-all for what leftism has become? 

Sure, it would help if the right had a more tactical approach to diagnosing and labeling each and every radical change introduced to our society at breakneck speed, but that’s not how people work. The right can and should identify the unique threats of identitarian Marxism, managerialism, and contemporary Lysenkoism, but is separately labeling each of these things useful?

Using “woke” as a catch-all label for radical leftism is effective. That’s one of the major reasons why the left hates it. They have completely lost control of the English language, and the word they used to indicate their radicalism to one another is being used to expose that radicalism to the rest of the world.

Woke-ism is an intentionally ambiguous framework that is meant to keep out interlopers and reward its advocates. Therefore, simply describing it for what it is is anathema to those who wish for its intentions to remain ambiguous.

Simply saying “woke” works.


Samuel Mangold-Lenett is a staff editor at The Federalist. His writing has been featured in the Daily Wire, Townhall, The American Spectator, and other outlets. He is a 2022 Claremont Institute Publius Fellow. Follow him on Twitter @Mangold_Lenett.


‘Christian Nationalism’ Isn’t Cultural Coercion, It’s A Moral Imperative


A governor of a highly populous Western state has erected billboards in other states adorned with a verse from the Bible and advocating specific policy positions. A U.S. senator running for reelection equates voting to “a kind of prayer for the world we desire” and defines democracy as “the political enactment of the spiritual idea that each of us was created, as the scriptures tell us, in the ‘Imago Dei,’ the image of God.”

Is this the sinister emergence of Christian nationalism — the right-wing, fascist, “Handmaid’s Tale” hellscape that’s supposedly lurking just around the corner? No, in fact, that’s far from the case.

The first vignette actually speaks to a recent push by California Gov. Gavin Newsom, who had pro-abortion billboards installed in multiple red states, with ones in Mississippi and Oklahoma featuring Jesus’ words from Mark 12:31: “Love your neighbor as yourself. There is no greater commandment than these.” And in the second example, these words were spoken on the campaign trail by Georgia Sen. Raphael Warnock.

The usual takeaway is to point out the hypocrisy behind the adulation that’s typically showered only on the left’s public use of Christianity. But the deeper point is that Newsom and Warnock both show that using Christian arguments and verses from Scripture for the purpose of securing political victories is unexceptional — and even good. As Stephen Wolfe argues in his pathbreaking and provocative book “The Case for Christian Nationalism,” Christians should follow suit, though certainly not in enacting those particular policies. 

In a rigorously and relentlessly argued book, Wolfe uses the freighted term “Christian nationalism,” a phrase often deployed as a cudgel against evangelicals, to rally Christians behind a positive conception of public life that is grounded on the rich doctrines of 16th and 17th-century Reformed theology and the American political tradition. He builds on the important work of ad fontes, or a return to the sources, that Protestant scholars and institutions have undertaken in recent decades.

Above all, Wolfe aims to cultivate “a collective will for Christian dominion in the world” — a will that has been crushed by a combination of elite evangelical rhetoric that buttresses 21st-century pieties, a bicoastal ruling class that is hostile to orthodox Christians, a conservative movement that has mostly failed to preserve American institutions, and a suffocating psychological malaise that has gripped the West. He gives Christians the intellectual tools to break through the nearly impregnable wall created by a combination of “third way” politics, neo-Anabaptism, and unprincipled pluralism and reestablish a way of life that is conducive to Christian flourishing.

Christian Nationalism Explained

Wolfe’s simplest definition of the controversial term “Christian nationalism” is a “Christian people acting for their own good in light of their Christian nationhood.” It encompasses the myriad ways Christians should be working to establish Christian conditions not only in their homes and churches — but also in their villages, towns, cities, states, and, yes, nations. Key to this project is recovering the solid ground of what the Reformers and their heirs frequently called the book of nature, which they saw as containing truths consistent with the book of revelation. They understood that God gave us minds to act within the confines of the created order — that Christians do not need a positive command from the Bible for every action they take.

Wolfe teaches that the concept of the nation flows from man’s very anthropology. Standing with Thomas Aquinas and the New England Puritan Samuel Willard, he contends that even if Adam and Eve hadn’t followed the serpent’s wiles, mankind would still have “formed distinct civil communities — each being culturally particular.” This is because woven into man’s nature are social and political faculties that irresistibly “lead him to the fundamental things of earthly life, such as family formation and civil society,” writes Wolfe. “The nation, therefore, is natural to man as man, and the matured earth would be a multiplicity of nations.”

Implicit in this argument is the Reformed teaching that while Adam’s fall infused man’s entire nature with sin, it “did not eliminate the natural gifts,” as Wolfe notes. This doctrine is popularly known as total depravity, the often misunderstood first point in the TULIP acronym (an anachronistic 19th-century pedagogical device). As Dutch theologian Herman Bavinck wrote, though “numerous institutions and relations in life of society such as marriage, family, child rearing” and “man’s dominion over the earth through science and art” have “undoubtedly been modified by sin … they nevertheless have their active principle and foundation in creation, the ordinances of God.”

The cornerstone of Wolfe’s project is the well-known theological doctrine that grace does not destroy nature but instead perfects it. In other words, Christianity does not overthrow civil order, the non-sinful traditions of the people, and general decorum — the natural sinews that preserve society for posterity. 

As John Calvin taught in a sermon on 1 Corinthians: 

Regarding our eternal salvation, it is true that one must not distinguish between man and woman, or between king and a shepherd, or between a German and a Frenchman. Regarding policy, however, we have what St. Paul declares here; for our Lord Jesus Christ did not come to mix up nature or to abolish what belongs to the preservation of decency and peace among us.

Grace elevates the natural gifts, completing them because they are now aimed at both earthly and heavenly goods. For example, once a husband puts his faith in Christ, he and his family receive baptism and his work is directed to his home and then outward to the temporal world, what the Reformers called the civil kingdom. Grace does not make him into an androgynous being or cause him to leave his family behind to live in a church with other autonomous Christians. 

One of the many controversial aspects of Wolfe’s project for modern readers involves his teachings on civil laws and magistrates. Laws should reflect the natural law, protect natural rights, and, as legal historian Timon Cline has taught, direct “men to virtue,” pointing him to “higher truths.” Though the civil magistrate “cannot legislate or coerce people into belief,” Wolfe argues that he can “punish external religion — e.g., heretical teaching, false rites, blasphemy, sabbath-breaking, etc. — because such actions can cause public harm.” In fact, he proposes that the magistrate can even point citizens toward Christianity as the true religion. 

For dissenting Christians, Wolfe counsels that “wide toleration is desirable.” While non-Christians should be “guaranteed a basic right to life and property,” he contends that they should not be allowed to undertake activities that could harm Christianity. 

Though these were standard features of Christendom throughout Christian history, modern Christian statesmen would need to exhibit careful judgment in applying them today.

Christendom and America

Wolfe’s project is not a theocratic endeavor, with the church lording its power over the civil realm. Instead, he writes that the “classical Protestant position is that civil authorities” should “establish and maintain the best possible outward conditions for people to acquire spiritual goods.” And these goods are acquired through the church, whose ministers preach the Word and administer the sacraments. This doesn’t imply that Christianity is naturally weak absent state support. Rather, it means Christianity should infuse all of life, causing the magistrates of all nations to guide their citizens toward the highest ends.

In fact, as Joe Rigney has noted, civil government favoring and promoting Christianity “has been the dominant position in the history of the church for the last 1500 years.” Key confessions and catechisms of the Reformed tradition, including the original Westminster Confession and the Second Helvetic Confession, teach the good of religious establishments and charge those in political authority to uphold both tables of the Ten Commandments.

Early Americans were influenced by this understanding of Christian political order. According to Davenant Institute President Brad Littlejohn, the Founding Fathers “were certainly ‘Christian nationalists’ by the contemporary definition — that is, people who believed it important that America should publicly describe and conduct itself as a nation within a Christian framework.” Most state constitutions privileged Christianity — in most cases specifically a Protestant kind — and featured mentions of God, religious tests for public office, taxpayer funding of clergy and churches, Sabbath laws and laws against blasphemy, and Christian prayer and instruction in public schools well into the mid-20th century.

Christianity in a Negative World 

What about the place of “cultural Christianity,” an important pillar of Christian nationalism that has been heavily criticized by public theologians such as Russell Moore and Ray Ortlund? Wolfe contends that the critics commit a category error because it was never intended “to bring about anyone’s salvation.” Having a robust culture infused with Christian themes and imagery instead prepares citizens “for the reception of the Gospel.” It is a social power that internalizes the normal patterns of life that revolve around regular participation in Christian practices. 

As Wolfe rightly asks, would these critics look to subject families to “relentless hostile social forces” such as drag queen story hours, transgender ideology being taught in public schools, rampant porn use, and worse? Are active hostility and open persecution — that is, the circumstances first-century Christians faced — the only cultural conditions suited for the spread of Christianity? The history of Christendom renders a rather clear verdict on these questions.  

Christians are not called to conserve mid-20th century Supreme Court rulings. Begging for the table scraps of religious liberty carve-outs will not suffice, and “prudence” that is actually capitulation to the regnant cultural ethos will only hasten our nation’s slide into anarchy. To appropriate a famous G.K. Chesterton quote, the business of Christians “shouldn’t be to prevent mistakes from being corrected.”

In a “negative world,” to use Aaron Renn’s useful taxonomy, in which our magistrates oversee an establishment complete with a “regime-enforced moral ideology” that is hostile to Christianity, Wolfe gives Christians a coherent intellectual foundation that can withstand the gale force winds of our age. But political theory cannot enact itself. Christians must have the courage, manliness, fortitude, and strength to lay the groundwork in the decades ahead for what will assuredly be a multi-generational effort.


Mike Sabo is the editor of RealClear’s American Civics portal. He is a graduate of the Van Andel Graduate School of Statesmanship at Hillsdale College. He and his wife live in Cincinnati, Ohio.


Education Today Is Barbarism

Professor James Tooley, Vice-Chancellor at the University of Buckingham, discusses the role of education in the modern age, and the relationship between parental and government oversight in our education systems.

Buckingham University is the oldest of only six private higher education institutions in the UK that award degrees. It started with an idea in 1969 that the country should have a university free from state funding and state regulation.

The Birth of a Private University

Seven years later, the university was opened by Margaret Thatcher in 1976. At the time, she was the leader of the opposition, three years out from becoming prime minister of the United Kingdom. This is part of what she said in her inauguration speech for the university:

To a free people, accustomed to a great richness of private initiative, there is something undesirable, indeed debilitating about the present mood in the country in which so many look not to themselves or their fellows for new initiatives but to the state…

I, as a politician must not prescribe to you. Independence is not a gift, it is not something that governments confer but something that the people enjoy and use…

Unless we are worthy and able to take advantage of a freedom not yet extinguished in our land, we shall become pale shadows like civilisations before us who are eventually thrust aside and disposed of by more vigorous rivals.

What a powerful philosophy! It is wonderful to hear a politician extolling the virtues of freedom of thought and action, with no need or desire for government control over education.

In Loco Parentis

James extolled the virtue of ‘in loco parentis’:

The term “in loco parentis” is a Latin phrase that translates as “in place of a parent” or “instead of a parent” and refers to how schools and school administrators are expected to act with reference to students and other minors. In other words, the employees of a school are charged, by the parents of the students, to act on their behalf while the students are there.

In other words, the educational institution is primarily responsible for carrying out the wishes of the parent. The phrase is used to help the teacher make a judgement call: ‘What would the parent do in this situation? They are at work, I have the responsibility for their child — I need to act in loco parentis’.

John followed up with his phrase that ‘governments should be downstream from culture, not the other way about’. In other words, the government’s primary responsibility is to listen to the culture and enact what is best, rather than to dictate what they believe is best for the people.

The Difference Between Boys and Girls

Then the conversation moved on to the proportion of young people that should go to university in any given society. From my own experience, I have seen governments seek to push this proportion as high as possible to minimise youth unemployment statistics. But is university the best avenue for every young person?

James talks about the shortage of skilled tradespeople: electricians, plumbers, and welders. He argues that university is not the best pathway for these invaluable members of society. Their discussion goes on to cover the alarming dropout rate among white males in education in the USA, with the effect that this push towards university has now created an underclass of disaffected, underproductive young white males intent on destabilising society.

The discussion also covers the styles of learning that have become most prevalent in universities. There are now fewer high-pressure summative exams that favour boys’ learning styles, and more small-scale continuous assessments favoured by girls, arguably disenfranchising the boys.

Low-Cost Private Education

In addition to his role as Vice-Chancellor of his university, James has also pioneered some astonishing work on the demand for and viability of low-cost private education, firstly in the developing world and more recently in the West.

His research, in some of the most dangerous places on earth, found that upwards of 70% of children in developing world nations are being educated in private schools. These schools might have been started by a passionate mother with a teaching gift who then established a school around her as her own children grew.

Or the school might have been a tutor group, set up to help more senior students pass public exams, that then grew into a school backwards, down through the grades. Still other schools have been established as small businesses, so-called ‘for profit’, but generate only enough income for the founder to clothe, house and feed his own family.

James recounts his own experience in Durham, England, where he established a low-cost private school. The fees are £3,000 per student per year and class sizes are 20, so each class generates £60,000 per year: £30,000 for the teacher’s salary (typical for the UK) and £30,000 for buildings, utilities and resources. The low-cost model works, and they don’t need or want government involvement or interference.
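To make that arithmetic concrete, here is a minimal sketch in Python using only the figures quoted above; the function name and structure are illustrative assumptions, not anything from Tooley’s work.

    # A minimal sketch of the per-class budget arithmetic described above.
    # The figures (fees, class size, salary) come from the article; the
    # function itself is an illustrative assumption, not from the source.
    def class_budget(fee_per_student: int, class_size: int, teacher_salary: int) -> dict:
        """Split one class's annual fee revenue into salary and overheads."""
        revenue = fee_per_student * class_size
        overheads = revenue - teacher_salary  # buildings, utilities and resources
        return {"revenue": revenue, "salary": teacher_salary, "overheads": overheads}

    # Figures quoted for the Durham school: £3,000 fees, 20 pupils, £30,000 salary.
    print(class_budget(3_000, 20, 30_000))
    # {'revenue': 60000, 'salary': 30000, 'overheads': 30000}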

Education Should Be the Foundation of Civilisation

The developing world demonstrates that without the intervention of governments, parents can make education happen. James Tooley has demonstrated that it can also happen in the West without the intervention of the state. And in this regard, Margaret Thatcher extolled the virtues and advantages of freedom of choice for parents in the education of their children.

It seems to me that the only motivation for state control of education is to control and homogenise society by dumbing down education and removing debate.

Education must provide the tools to think and learn, rather than telling children what is right and wrong. The latter is for the parents to teach in the home.

Sadly, Mrs Moira Deeming MP, Liberal Member for the Western Metropolitan Region, Melbourne, Victoria, had to give up her teaching career on account of excessive state control of education. In her maiden speech to parliament, she highlights the excesses and evils that result from disproportionate state control of education.


If You Take The Elevator To The Second Floor, You Don’t Deserve Legs


I went on a cruise across the Caribbean last week, and if you’ve ever been on one of those ships, you know the elevators are usually right next to wide, grand staircases that go all the way up and down the boat. You can’t miss them.

After boarding the boat from a day at port in the Dominican Republic, I hopped on the elevator to hit the buffet 10 floors up. I was back on the boat a bit early, so I was the only one in the elevator waiting for it to close. When the doors began to shut, a couple ripped them open at the last second to climb in — which was fine at first. I’m happy to be courteous, and we were all on vacation, after all. That was until the elevator stopped on the second floor to let this couple off despite the stairs being 10 feet from the elevator entrance.

I’m sorry, but at that point, why even have legs? Nothing better captures the laziness of modern Americanism than those who feel compelled to ride the elevator up or down a single floor because they can’t be bothered to find the stairs, let alone use them.

There are few reasons why any able-bodied person should ever ride the elevator up or down a single floor instead of tackling a flight of stairs, which in America is only about 12 steps. Unless a person is seriously disabled, they can take 12 steps. Unless a person is moving a big, awkward object such as furniture, they can take 12 steps. Unless a person is carrying a large load of groceries (and I mean large, three bags minimum), they can take 12 steps. There is no excuse for requiring that everybody else in a building wait for the elevator to start and stop so a lazy person can avoid the apparently Herculean task of walking 12 or, on the cruise ship, 14 inclined steps.

The couple who interrupted my hangry trip to the buffet was not disabled, not carrying anything, and not even overweight (unlike most people on the gluttony-fueled voyage). If I were king for a day, I might have had their legs chopped off. They don’t appear to need them.

