How Ditching Social Norms Guarantees Failure, Not Freedom

For Baby Boomers, this famous quote, attributed to Joan Collins, humorously captures a change many of them have observed in American society during their lifetimes: “Grandmother used to take my mother to the circus to see the fat lady and the tattooed man — now they’re everywhere.”

The reality behind the quote is that over the past few decades, American society has largely abandoned standard norms — of appearance, language, and behavior — in favor of a far more à la carte, each-to-their-own choice of how individuals should or can present and conduct themselves in society.

Fashions, standards, and mores evolve in all societies. What differs about the sea-change in this area since the late 1960s, however, is that rather than having one set of conventions gradually replace another, there has been a progressive withdrawal from the very idea of externally perceivable norms as the unwritten framework for what mainstream society considers acceptable and unacceptable.

The new “normal” is a near-absence of norms.

To hark back for a second to the Joan Collins quote, the example of the general spread of obesity and tattoos — once so marginal as to be limited to unfortunate occurrences of glandular dysfunction in the former case, and sailors and convicts in the latter case — is emblematic of not only those things now accepted or even celebrated, but of their co-existence with every other possible manner in which people today choose to present themselves in society, from Goth fashion, to sportswear as everyday attire, to the fetishization of bulbous buttocks, to social media selfies photoshopped to the outer extremes of the possibilities of the human physique.

And we have witnessed, in the past few years, the deconstruction of perhaps the oldest societal norm of all — the distinction between the two biological sexes — into the current very loud controversy over gender fluidity and pronouns.

“What’s the harm in that?” many will ask. And indeed, at the surface and individual level, this kaleidoscope of tastes, fashions, and projected identities can be seen as the simple continuation of the trend toward individualism that started as far back as the late Middle Ages.

But, at the collective level, real dangers come when a society moves from a framework of standards to a culture with virtually no common standards at all.

Shaped gradually over centuries by collective and implicit agreement on what was seen as desirable or healthy for both the individual and for society, norms have traditionally expressed a society’s values. For the most part, they were not enforced by the law, but rather by the sense of shame that society’s disapproval would trigger in those who transgressed such norms.

The softer side of these once-ubiquitous societal norms was called “etiquette,” or, more generally, basic good manners. Their traditional function was to ensure a certain decorum and civility among people. One intent was to shield those perceived at the time as more vulnerable or delicate from unpleasantness and vulgarity, as seen in the lost norm that men should avoid foul language in front of women and children.

Proper etiquette also served to prevent distasteful and annoying behavior in social interactions (not chewing with an open mouth, not speaking with a mouth full of food, moderating volume of speech) as well as to demonstrate respect or appreciation for others or for places of hospitality (not wearing hats indoors, bringing a small gift when invited to dinner). These types of observances were seen as the tangible outcome of a “good upbringing,” thus constituting positive societal reinforcement for the parents of the well-mannered individual.

The infringement of a norm implies a judgment is being passed. And in today’s culture, being accused of being “judgmental” is akin to being labeled intolerant, hateful, bigoted, or worse. Passing judgment is the antithesis of what seems to be the only universal virtue left in modern society: being non-judgmental, or, to put it simply, being “nice.”

Disapproval of pretty much anything has been replaced by the need to be seen as positive, caring, understanding, tolerant, empathetic, open-minded, solicitous, and generally kind-hearted. Hence the virtue-signaling flooding social media and the decline of fire-and-brimstone religion, which has been increasingly replaced by the “positivity” of its modern-day substitutes such as meditation, yoga, and “holistic wellness.”

Culturally, being “nice” equates with the judgment-free acceptance of all forms of fashion, art, music, personal identity, and language, no matter how aesthetically unappealing, cacophonous, vulgar, or simply preposterous. Politically, the mantra of niceness has given rise to forms of moral relativism in public policy that have resulted, for example, in cashless bail, the decriminalization of certain forms of theft and other crimes, the widespread legalization of marijuana, cities flouting federal immigration law, and the focus on historical revisionism.

But what has replaced the self-conscious self-regulation once underpinned by societal norms?

The prevalence today of morbid obesity, the universal spread of profanity, the wearing of unflattering or inappropriate clothes, the popularity of extensive tattoos and piercings, the rise of widespread drug abuse, the brawling in theme parks and aboard commercial aircraft, and the problem of increasing criminal violence all point to a culture off track. Our culture has abandoned the standards of two or three generations ago — standards of what was considered healthy, positive, and generally appealing for individuals and for society as a whole.

Those with a “progressive” bent will decry this line of reasoning, arguing that aesthetics is by its very nature subjective, that “fat shaming” is a reprehensible form of social cruelty, that formerly illegal drugs have merely taken their place alongside tobacco and alcohol, and that crime and violence are merely symptoms of an unjust society. No amount of evidence or logic will sway such views.

But the abandonment of standards brings a far more damaging societal consequence, and it is hiding in plain sight: the creation of a highly visible and most often irreversible distinction between elite society and, for want of a better term, “the masses.”

This phenomenon can be witnessed most easily in those places where the affluent and the privileged, as well as the moderately successful — let’s call them “the 10 percent” — live in close proximity to the remaining 90 percent, and particularly to those fellow citizens considered “the underclass.”

In these tony neighborhoods — from New York’s Upper East Side to Los Angeles’ Beverly Hills, from Washington, D.C.’s Georgetown to Chicago’s Gold Coast — the obese, the heavily tattooed, the eyebrow-studded, the inarticulate, and the badly dressed are rarely seen and readily identified as “not belonging” by these neighborhoods’ slim, elegant, and urbane residents, as well as by the doormen, security staff, and shop assistants they employ.

The resulting subtle apartheid demonstrates the law of unintended consequences: the loosening of societal standards, thought to increase freedom of lifestyle choices and self-expression, has also resulted in the creation of a quasi-caste system in which the plus-sized, the inked, the unintelligible, and the shabby or sloppy are — de facto if not de jure — denied the ability to compete for jobs, for mates, and generally for opportunity in the residential enclaves and in the professional sectors of the rich, svelte, chic, and highly educated.

This is the real tragedy of a pendulum that has swung too far from conformity to “anything goes.” For these readily identifiable non-elites, social and professional upward mobility is today blocked by the very jettisoning of standards that was advertised as the path to a more modern, tolerant, free, and fluid society.

Instead, we have a culture that is undoubtedly coarser, more unattractive, and more confrontational than it was in the recent past and, most importantly, one in which a large social class of “plebeians,” constituting an absolute majority of the population, is now limited in its aspirations not just by education, generational wealth, and family connections, but by language, behavior, and physical appearance as well.

The result feels like a cruel trick played by the elites on their less fortunate fellow citizens, preventing anyone from below from joining their ranks. Yet the abandonment of normative societal standards has not been imposed from above. It is instead a cultural spin-off of the wider anti-establishmentarianism of the late 1960s.

It “felt right” at the time to throw off the shackles of convention across a broad range of former norms in a range of cultural domains. But the trendsetters of the time did not anticipate what happens in a cultural vacuum.

Societal standards — whether in dress, language, body shape, or behavior — have existed since the beginning of civilization for a reason. We abandoned them, for the first time in human history, against logic and at great cost. And those who can least afford it pay the highest price.

Paul Maglione is a writer and education entrepreneur who spends his time between the U.S. and Europe. He writes about politics, culture, and everyday life.


‘Woke’ Effectively Describes The Left’s Insanity, And That’s Why They Hate When You Say It


When was the last time you were called racist? When was the last time you actually cared about being called racist? Odds are you get called it quite often and care (or should care) about being called it very little.

That’s because lobbing accusations of racial bigotry at anyone who gets in their way is second nature for the left. So when people stopped taking these accusations seriously — realizing it is simply impossible for everything to be racist — the left began decrying “white supremacy,” semantically invoking Nazism.

When accusations of racism failed to coerce enough action, the left moved on to a pejorative with far worse aesthetics while maintaining the same message. Accusing people and institutions of “racism” had lost its utility due to rhetorical inflation, and the era of “systemic white supremacy” had begun.

According to some, the conservative movement and the American right writ large are experiencing a similar ongoing dilemma with the word “woke.” Many suggest the word has come to mean nothing due to right-wing over-saturation, while others insist it has taken on a far more nefarious tone.

Nevertheless, the question remains: Why has the word “woke” become so problematic?

Bad Faith

On Tuesday, Bethany Mandel, co-author of “Stolen Youth: How Radicals Are Erasing Innocence and Indoctrinating a Generation,” appeared on The Hill’s “Rising” to discuss leftism’s role in damaging American families. 

During the discussion, Briahna Joy Gray, co-host of the “Bad Faith” podcast, inquired if Mandel would “mind defining ‘woke,’ ’cause it’s come up a couple [of] times, and I just want to make sure we’re all on the same page.” What followed was a brief moment of self-consciousness in which the author stumbled over her words before offering a generally accepted definition of the term.

Despite this, the moment was clipped, and the author was lambasted as both a bigot and buffoon across the web. 

The whole point of this exercise was to humiliate someone offering a coherent definition of woke-ism that was insufficiently deferential to the whims of leftist ideologues. However, this attempt was unsuccessful. 

What Is Woke?

Dragging Mandel through the digital public square did not result in the typical groveling struggle session that has come to be expected whenever people explain their opinions in public, but it did inspire many to inquire about the nature of the term “woke.”

The term started to increase in prevalence in the early-to-mid-2010s, back when “Black Lives Matter” referred to a hashtag, not an organization, and when the hot-button social issue du jour was the legalization of homosexual marriage. Despite its original meaning in common parlance, simple personal vigilance, “woke” quickly took on social and political meanings. As with any community that uses specific language to signify in-group allegiance, “woke” was used to enlist oneself in the broader cause of the burgeoning leftist cultural hegemony and, by extension, the Democrat Party.

But as the term became more and more associated with the party, it became less specifically connected with racial protest movements and more so a shibboleth for supporting the party platform — “stay woke,” the slogan went.

It is undeniable that woke-ism and the people who get protective of the identifying label “woke” have an influential presence on the political and cultural left. There was even a short-lived Hulu series titled “Woke” that chronicled a previously apolitical black cartoonist’s journey through the intersectional landscape of identity politics. And in 2018, “Saturday Night Live” poked fun at the concept of corporate fashion brands using woke-ism to market schlock to well-intentioned hipsters.

Woke-ism came to define a movement so insurgent among the institutionalized powers of the left that even vanguard figures like former President Barack Obama and Rep. Hakeem Jeffries, who undeniably had a role in ushering it in, bemoaned its rancorous presence and how it distracts from the Democrat Party’s larger goals.

This was something the Democrats embraced fully until they could no longer control the semantics around it.

It’s a Good Bad Word

Woke-ism is simultaneously a persistent ideological framework and a general inclination — it depends on the person or institution in question at the time. But both rely upon a consistent smorgasbord of Marxian dialectics and ideological accouterment — gender theory, critical race theory, et al. — that seeks to usurp the ideals of the American founding and impose contemporary whims. 

The word has become as commonplace among the current-day conservative movement as MAGA hats and “lock her up” chants were at 2016 Trump rallies. And this is, to be fair, totally warranted; what other slogany-sounding word really works as a catch-all for what leftism has become? 

Sure, it would help if the right had a more tactical approach to diagnosing and labeling each and every radical change introduced to our society at breakneck speed, but that’s not how people work. The right can and should identify the unique threats of identitarian Marxism, managerialism, and contemporary Lysenkoism, but is labeling all of these things useful? 

Using “woke” as a catch-all label for radical leftism is effective. That’s one of the major reasons why the left hates it. They have completely lost control of the English language, and the word they once used to signal their radicalism to one another is now being used to expose that radicalism to the rest of the world.

Woke-ism is an intentionally ambiguous framework that is meant to keep out interlopers and reward its advocates. Therefore, simply describing it for what it is becomes anathema to those who wish for its intentions to remain ambiguous.

Simply saying “woke” works.

Samuel Mangold-Lenett is a staff editor at The Federalist. His writing has been featured in the Daily Wire, Townhall, The American Spectator, and other outlets. He is a 2022 Claremont Institute Publius Fellow. Follow him on Twitter @Mangold_Lenett.


‘Christian Nationalism’ Isn’t Cultural Coercion, It’s A Moral Imperative


A governor of a highly populous Western state has erected billboards in other states adorned with a verse from the Bible and advocating specific policy positions. A U.S. senator running for reelection equates voting to “a kind of prayer for the world we desire” and defines democracy as “the political enactment of the spiritual idea that each of us was created, as the scriptures tell us, in the ‘Imago Dei,’ the image of God.”

Is this the sinister emergence of Christian nationalism — the right-wing, fascist, “Handmaid’s Tale” hellscape that’s supposedly lurking just around the corner? No, in fact, that’s far from the case.

The first vignette actually speaks to a recent push by California Gov. Gavin Newsom, who had pro-abortion billboards installed in multiple red states, with ones in Mississippi and Oklahoma featuring Jesus’ words from Mark 12:31: “Love your neighbor as yourself. There is no greater commandment than these.” And in the second example, these words were spoken on the campaign trail by Georgia Sen. Raphael Warnock.

The usual takeaway is to point out the hypocrisy behind the adulation that’s typically showered only on the left’s public use of Christianity. But the deeper point is that Newsom and Warnock both show that using Christian arguments and verses from Scripture for the purpose of securing political victories is unexceptional — and even good. As Stephen Wolfe argues in his pathbreaking and provocative book “The Case for Christian Nationalism,” Christians should follow suit, though certainly not in enacting those particular policies. 

In his rigorously and relentlessly argued book, Wolfe takes the freighted term “Christian nationalism,” a phrase often deployed as a cudgel against evangelicals, and uses it to rally Christians behind a positive conception of public life grounded in the rich doctrines of 16th- and 17th-century Reformed theology and the American political tradition. He builds on the important work of ad fontes, or a return to the sources, that Protestant scholars and institutions have undertaken in recent decades.

Above all, Wolfe aims to cultivate “a collective will for Christian dominion in the world” — a will that has been crushed by a combination of elite evangelical rhetoric that buttresses 21st-century pieties, a bicoastal ruling class that is hostile to orthodox Christians, a conservative movement that has mostly failed to preserve American institutions, and a suffocating psychological malaise that has gripped the West. He gives Christians the intellectual tools to break through the nearly impregnable wall created by a combination of “third way” politics, neo-Anabaptism, and unprincipled pluralism and reestablish a way of life that is conducive to Christian flourishing.

Christian Nationalism Explained

Wolfe’s simplest definition of the controversial term “Christian nationalism” is a “Christian people acting for their own good in light of their Christian nationhood.” It encompasses the myriad ways Christians should be working to establish Christian conditions not only in their homes and churches but also in their villages, towns, cities, states, and, yes, nations. Key to this project is recovering the solid ground of what the Reformers and their heirs frequently called the book of nature, which they saw as containing truths consistent with the book of revelation. They understood that God gave us minds to act within the confines of the created order — that Christians do not need a positive command from the Bible for every action they take.

Wolfe teaches that the concept of the nation flows from man’s very anthropology. Standing with Thomas Aquinas and the New England Puritan Samuel Willard, he contends that even if Adam and Eve had not followed the serpent’s wiles, mankind would still have “formed distinct civil communities — each being culturally particular.” This is because woven into man’s nature are social and political faculties that irresistibly “lead him to the fundamental things of earthly life, such as family formation and civil society,” writes Wolfe. “The nation, therefore, is natural to man as man, and the matured earth would be a multiplicity of nations.”

Implicit in this argument is the Reformed teaching that while Adam’s fall infused man’s entire nature with sin, it “did not eliminate the natural gifts,” as Wolfe notes. This doctrine is popularly known as total depravity, the often misunderstood first point in the TULIP acronym (an anachronistic 19th-century pedagogical device). As Dutch theologian Herman Bavinck wrote, though “numerous institutions and relations in life of society such as marriage, family, child rearing” and “man’s dominion over the earth through science and art” have “undoubtedly been modified by sin … they nevertheless have their active principle and foundation in creation, the ordinances of God.”

The cornerstone of Wolfe’s project is the well-known theological doctrine that grace does not destroy nature but instead perfects it. In other words, Christianity does not overthrow civil order, the non-sinful traditions of the people, and general decorum — the natural sinews that preserve society for posterity. 

As John Calvin taught in a sermon on 1 Corinthians: 

Regarding our eternal salvation, it is true that one must not distinguish between man and woman, or between king and a shepherd, or between a German and a Frenchman. Regarding policy, however, we have what St. Paul declares here; for our Lord Jesus Christ did not come to mix up nature or to abolish what belongs to the preservation of decency and peace among us.

Grace elevates the natural gifts, completing them because they are now aimed at both earthly and heavenly goods. For example, once a husband puts his faith in Christ, he and his family receive baptism and his work is directed to his home and then outward to the temporal world, what the Reformers called the civil kingdom. Grace does not make him into an androgynous being or cause him to leave his family behind to live in a church with other autonomous Christians. 

One of the many controversial aspects of Wolfe’s project for modern readers involves his teachings on civil laws and magistrates. Laws should reflect the natural law, protect natural rights, and, as legal historian Timon Cline has taught, direct “men to virtue,” pointing them to “higher truths.” Though the civil magistrate “cannot legislate or coerce people into belief,” Wolfe argues that he can “punish external religion — e.g., heretical teaching, false rites, blasphemy, sabbath-breaking, etc. — because such actions can cause public harm.” In fact, he proposes that the magistrate can even point citizens toward Christianity as the true religion.

For dissenting Christians, Wolfe counsels that “wide toleration is desirable.” While non-Christians should be “guaranteed a basic right to life and property,” he contends that they should not be allowed to undertake activities that could harm Christianity. 

Though these were standard features of Christendom throughout Christian history, modern Christian statesmen would need to exhibit careful judgment in applying them today.

Christendom and America

Wolfe’s project is not a theocratic endeavor, with the church lording its power over the civil realm. Instead, he writes that the “classical Protestant position is that civil authorities” should “establish and maintain the best possible outward conditions for people to acquire spiritual goods.” And these goods are acquired through the church, whose ministers preach the Word and administer the sacraments. This doesn’t imply that Christianity is naturally weak absent state support. Rather, it means Christianity should infuse all of life, causing the magistrates of all nations to guide their citizens toward the highest ends.

In fact, as Joe Rigney has noted, civil government favoring and promoting Christianity “has been the dominant position in the history of the church for the last 1500 years.” Key confessions and catechisms of the Reformed tradition, including the original Westminster Confession and the Second Helvetic Confession, teach the good of religious establishments and charge those in political authority to uphold both tables of the Ten Commandments.

Early Americans were influenced by this understanding of Christian political order. According to Davenant Institute President Brad Littlejohn, the Founding Fathers “were certainly ‘Christian nationalists’ by the contemporary definition — that is, people who believed it important that America should publicly describe and conduct itself as a nation within a Christian framework.” Most state constitutions privileged Christianity — in most cases specifically a Protestant kind — and featured mentions of God, religious tests for public office, taxpayer funding of clergy and churches, Sabbath laws and laws against blasphemy, and Christian prayer and instruction in public schools well into the mid-20th century.

Christianity in a Negative World 

What about the place of “cultural Christianity,” an important pillar of Christian nationalism that has been heavily criticized by public theologians such as Russell Moore and Ray Ortlund? Wolfe contends that the critics commit a category error because it was never intended “to bring about anyone’s salvation.” Having a robust culture infused with Christian themes and imagery instead prepares citizens “for the reception of the Gospel.” It is a social power that internalizes the normal patterns of life that revolve around regular participation in Christian practices. 

As Wolfe rightly asks, would these critics look to subject families to “relentless hostile social forces” such as drag queen story hours, transgender ideology being taught in public schools, rampant porn use, and worse? Are active hostility and open persecution — that is, the circumstances first-century Christians faced — the only cultural conditions suited for the spread of Christianity? The history of Christendom renders a rather clear verdict on these questions.  

Christians are not called to conserve mid-20th century Supreme Court rulings. Begging for the table scraps of religious liberty carve-outs will not suffice, and “prudence” that is actually capitulation to the regnant cultural ethos will only hasten our nation’s slide into anarchy. To appropriate a famous G.K. Chesterton quote, the business of Christians “shouldn’t be to prevent mistakes from being corrected.”

In a “negative world,” to use Aaron Renn’s useful taxonomy, in which our magistrates oversee an establishment complete with a “regime-enforced moral ideology” that is hostile to Christianity, Wolfe gives Christians a coherent intellectual foundation that can withstand the gale force winds of our age. But political theory cannot enact itself. Christians must have the courage, manliness, fortitude, and strength to lay the groundwork in the decades ahead for what will assuredly be a multi-generational effort.

Mike Sabo is the editor of RealClear’s American Civics portal. He is a graduate of the Van Andel Graduate School of Statesmanship at Hillsdale College. He and his wife live in Cincinnati, Ohio.


Education Today Is Barbarism

Professor James Tooley, Vice-Chancellor at the University of Buckingham, discusses the role of education in the modern age, and the relationship between parental and government oversight in our education systems.

The University of Buckingham is the oldest of only six private higher education institutions in the UK that award degrees. It started with an idea in 1969 that the country should have a university free from state funding and state regulation.

The Birth of a Private University

Seven years later, in 1976, the university was opened by Margaret Thatcher, then leader of the opposition and three years away from becoming prime minister of the United Kingdom. This is part of what she said in the inauguration speech for the university:

To a free people, accustomed to a great richness of private initiative, there is something undesirable, indeed debilitating about the present mood in the country in which so many look not to themselves or their fellows for new initiatives but to the state…

I, as a politician, must not prescribe to you. Independence is not a gift, it is not something that governments confer but something that the people enjoy and use…

Unless we are worthy and able to take advantage of a freedom not yet extinguished in our land, we shall become pale shadows like civilisations before us who are eventually thrust aside and disposed of by more vigorous rivals.

What a powerful philosophy! It is wonderful to hear a politician extolling the virtues of freedom of thought and action, with no need or desire for government control over education.

In Loco Parentis

James extolled the virtue of ‘in loco parentis’:

The term “in loco parentis” is a Latin phrase that translates as “in place of a parent” or “instead of a parent” and refers to how schools and school administrators are expected to act with reference to students and other minors. In other words, the employees of a school are charged, by the parents of the students, to act on their behalf while the students are there.

That is, the educational institution is primarily responsible for carrying out the wishes of the parent. The phrase helps the teacher make a judgement call: ‘What would the parent do in this situation? They are at work, I have the responsibility for their child — I need to act in loco parentis’.

John followed up with his phrase that ‘governments should be downstream from culture, not the other way about’. In other words, the government’s primary responsibility is to listen to the culture and enact what is best, rather than to dictate what they believe is best for the people.

The Difference Between Boys and Girls

Then the conversation moved on to the proportion of young people that should go to university in any given society. From my own experience, I have seen governments seek to push this proportion as high as possible to minimise youth unemployment statistics. But is university the best avenue for every young person?

James talks about the shortage of skilled tradespeople: electricians, plumbers, and welders. He argues that university is not the best pathway for these invaluable members of society. The discussion goes on to cover the alarming dropout rate among white males in the American education system, with the effect that the push towards university has created an underclass of disaffected, underproductive young white men intent on destabilising society.

The discussion also covers the styles of learning that have become most prevalent in universities. There are now fewer high-pressure summative exams that favour boys’ learning styles, and more small-scale continuous assessments favoured by girls, arguably disenfranchising the boys.

Low-Cost Private Education

In addition to his role as Vice-Chancellor of his university, James has also pioneered some astonishing work on the demand for and viability of low-cost private education, firstly in the developing world and more recently in the West.

His research, in some of the most dangerous places on earth, found that upwards of 70% of children in developing world nations are being educated in private schools. These schools might have been started by a passionate mother with a teaching gift who then established a school around her as her own children grew.

Or the school might have begun as a tutor group set up to help senior students pass public exams, which then grew into a school backwards, down through the grades. Still other schools were established as small businesses, so-called ‘for profit’, yet generate only enough income for the founder to clothe, house and feed his own family.

James recounts his own experience in Durham, England, where he established a low-cost private school. The fees are £3,000 per student per year and class sizes are 20, generating £60,000 per class per year: £30,000 for the teacher’s salary, typical for the UK, and £30,000 for buildings, utilities and resources. The low-cost model works, and they don’t need or want government involvement or interference.
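The arithmetic of that model can be sketched in a few lines. The figures are those quoted above; the script itself is only an illustrative back-of-envelope, not anything from Tooley’s own analysis:

```python
# Back-of-envelope model of the low-cost school economics described above.
fee_per_student = 3_000   # pounds per student per year (figure from the article)
class_size = 20           # students per class (figure from the article)

revenue = fee_per_student * class_size  # income generated by one class per year
teacher_salary = 30_000                 # typical UK teacher salary, per the article
overheads = revenue - teacher_salary    # left for buildings, utilities, resources

print(f"Revenue per class: £{revenue:,}")     # £60,000
print(f"Teacher salary:    £{teacher_salary:,}")
print(f"Overheads budget:  £{overheads:,}")   # £30,000
```

On these numbers the model depends on full classes: at 15 students, revenue falls to £45,000 and the overheads budget halves.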

Education Should Be the Foundation of Civilisation

The developing world demonstrates that without the intervention of governments, parents can make education happen. James Tooley has demonstrated that it can also happen in the West without the intervention of the state. And in this regard, Margaret Thatcher extolled the virtues and advantages of freedom of choice for parents in the education of their children.

It seems to me that the only motivation for state control of education is to control and homogenise society by dumbing down education, alongside the removal of debate.

Education must provide the tools to think and learn, rather than telling children what is right and wrong. The latter is for the parents to teach in the home.

Sadly, Mrs Moira Deeming MP, Liberal Member for the Western Metropolitan Region, Melbourne, Victoria, had to give up her teaching career on account of excessive state control of education. In her maiden speech to parliament, she highlighted the excesses and evils that result from disproportionate state control of education.



If You Take The Elevator To The Second Floor, You Don’t Deserve Legs

I went on a cruise across the Caribbean last week, and if you’ve ever been on one of those ships, you know the elevators are usually right next to wide, grand staircases that go all the way up and down the boat. You can’t miss them.

After boarding the boat from a day at port in the Dominican Republic, I hopped on the elevator to hit the buffet 10 floors up. I was back on the boat a bit early, so I was the only one in the elevator waiting for it to close. When the doors began to shut, a couple ripped them open at the last second to climb in — which was fine at first. I’m happy to be courteous, and we were all on vacation, after all. That was until the elevator stopped on the second floor to let this couple off despite the stairs being 10 feet from the elevator entrance.

I’m sorry, but at that point, why even have legs? Nothing better captures the laziness of modern Americanism than those who feel compelled to ride the elevator up or down a single floor because they can’t be bothered to find the stairs, let alone use them.

There are few reasons why any able-bodied person should ever ride the elevator up or down a single floor instead of tackling a flight of stairs, which in America is only about 12 steps. Unless a person is seriously disabled, they can take 12 steps. Unless a person is moving a big, awkward object such as furniture, they can take 12 steps. Unless a person is carrying a large load of groceries (and I mean large, three bags minimum), they can take 12 steps. There is no excuse for requiring that everybody else in a building wait for the elevator to start and stop so a lazy person can avoid the apparently Herculean task of walking 12 or, on the cruise ship, 14 inclined steps.

The couple who interrupted my hangry trip to the buffet were not disabled, not carrying anything, and not even overweight (unlike most people on the gluttony-fueled voyage). If I were king for a day, I might have had their legs chopped off. They didn’t appear to need them.


What My Son’s Soccer Taught Me About the Gender Revolution

The question hit me with fresh force as I read it:

‘What gender is your child?’

I had read and answered similar questions before, at least about my gender. But this question from my son’s local soccer club about his gender gave the following options:

‘Male’, ‘female’, ‘prefer not to say’, or ‘gender fluid’.

Gender fluid. 

For a local soccer club (not to mention Football Australia) to ask whether your child is ‘gender-fluid’ made me realise something: the transgender revolution is not something out there amongst inner-city secular progressives who accept gender ideology. It’s now affecting everyone, including my children. Sure, we haven’t, as a family, signed onto the (trans)gender revolution. But our culture has, and in a big way.

Cultural Change

Just think about it: Five years ago, for a soccer club to ask if your child is gender fluid would have raised parental eyebrows. And ten years ago, such a question would have been unthinkable (most of us then didn’t even have a category for ‘gender fluid’).

But today, most soccer mums and dads shrug their shoulders and move on.

And across our culture, the number of gender non-conforming children has skyrocketed. Writing about the British Tavistock Centre and its Gender Identity Development Service (GIDS), author Hannah Barnes writes:

‘Since 2007 [GIDS] had grown from a small team that saw 50 young people each year to a nationally commissioned service treating thousands.’

And just as disturbingly, the people presenting at the clinic had changed:

‘Whereas most of the literature on gender non-conforming children was about boys who had a life-long sense of gender incongruence, GIDS’s waiting room was overpopulated with teenage girls whose distress around their gender had only started in adolescence.’

It’s the cultural sea we’re swimming in.

While these changes raise urgent questions (such as why the sudden increase in gender non-conforming adolescent girls, and how we can care well for gender non-conforming people, especially children), my question is more basic:

How did such a moral revolution happen so quickly? 

It’s made me think of the various steps of moral revolutions outlined by English writer and thinker Theo Hobson. In his view, for a full moral reversal — a moral revolution — to take place, three conditions must be met:

1) What Was Condemned Must Be Celebrated

Until around 20 years ago, the Biblical (and historical) view of marriage as between one man and one woman was widely celebrated.

Heterosexuality was the norm, and there were only two genders (aligned with our sex). Anything outside that was seen as being outside the norm.

But today, homosexuality and gender fluidity aren’t merely tolerated as equal views: they are actively celebrated and promoted — even to schoolchildren. From Wear It Purple days to the mass marketing of Pride Week, all LGBTIQ identities are honoured and held up as good, true and beautiful.

2) What Was Celebrated Must Now Be Condemned

The Biblical understanding of marriage and gender is now condemned as oppressive and harmful.

Teaching these values to children is increasingly considered suspect, as the recent Australian Law Reform Commission report argues. If you haven’t heard, the Federal Government tasked the Australian Law Reform Commission with reporting on religious schools and how to handle religious freedom. Its recent report argues for removing religious freedom protections for religious schools.

As Neil Foster, an Associate Professor of Law and expert on religious freedom, points out:

‘[The Report] effectively recommends the removal of protections enjoyed by religious educational institutions which have been designed to safeguard the ability of these organisations to operate in accordance with their religious beliefs. The “fences” protecting these bodies from being forced to conform to majority views on sexual behaviour and identity (and hence losing their distinctiveness as religious bodies) are to be knocked down, the ALRC says.’

If the ALRC had its way, religious schools would no longer be allowed to be… religious. At least not where sexuality and gender are concerned.

What was celebrated must now be condemned — culturally and, increasingly, legally.

3) Those Who Will Not Join in Celebrating the Moral Revolution Must Be Condemned

The transgender moral revolution doesn’t believe in ‘live and let live’ disagreement.

You must be condemned if you have the audacity to raise some basic questions about the moral revolution. Just ask J.K. Rowling.

She was cancelled a few years ago for raising the concern that ‘trans-women’ (i.e. biological males identifying as women) are different from biological women.

But it’s not just celebrities and public figures who face cancellation if they speak up. Any parent who dares raise questions about why their daughter has to play against biological boys in a girls-only soccer competition will not be popular with the likes of the ABC or The Age.

Any religious leader who promotes Biblical sexuality is at risk of attack.

And if you’re an employee who doesn’t wear purple on said days, the questions from colleagues and HR will soon come, if they’re not already coming.

What Might Be Next?

While it’s impossible to know what’s next in the moral revolution, there are signs that it has overreached, at least regarding gender ideology. Like all revolutions that try to overturn God’s good creation order (communism, anyone?), reality has a way of pushing back and making itself known.

Many medical practitioners are raising questions about the ethics of carte blanche gender-affirming care. ‘De-transitioners’, those who have transitioned but now regret it, are making their voices known. Some sporting bodies are pushing back against rules that allow biological males to compete against females. And even secular-left newspapers like The New York Times have run articles questioning gender-affirming therapies for trans kids (earning the ire of LGBTIQ activists, in line with point #3 above).

Those are encouraging signs. But, on the other hand, perhaps this revolution has a long way to go before it burns itself out.



How Focusing On Prayer This Lent Could Lead You To Redemption

Redemption is actually not a religious term. The act of redeeming someone goes back to ancient times. It was the practice of buying back a servant or loved one who had been kidnapped. This payment was the ransom.

Redemption is needed when someone or something was taken. Today, this is applicable to our lives in a real way. What has been taken from us? What have we lost?

If you are like most Americans, you are too busy. Our calendars are filled with responsibilities regarding our careers and activities for the family. So many people explain that they cannot attend worship services or pray because they are simply out of time. In the time they do have, they are wiped out. 

Many Americans also question the meaning or purpose of their lives. According to Lifeway Research, 63 percent of Americans wonder if their life can have more meaning on a regular basis. The seemingly infinite human “To Do List” might fill our time, but it does not satisfy the human heart. 

The answer to redeeming America’s busyness and doubts about purpose actually resides in a meaningful Lent. Once a year, the church doubles down on what it means to be a follower of Jesus. What does it mean to be in a relationship with God, and how does one grow in it? After honest reflection, most people would admit they do not focus enough on the big questions that revolve around God and life.

Questions like: Is there a God? Can we know God? Does God care about me? Does God interact with human beings? What happens when we die?

Lent can redeem us because it forces us to give space to what is most important. Forty days can truly change us if we commit. That is a critical key to redemption: commitment. This is summarized by the famous invitation of Jesus to his followers: “Follow me” (Matthew 4:19). God constantly invites us to a relationship with him, but things only begin to change when we commit.

Churches all over the nation have a commitment in mind. They offer more Masses, other services, and different opportunities for people to grow in their faith by committing to something so simple in Lent: prayer. Several studies have shown that prayer brings about many practical benefits.

A study from Columbia University suggests that the spiritual practice of meditation actually strengthens the brain’s cortex. This study posits that prayer helps guard against many illnesses and fights tendencies toward depression. Prayer can literally protect us against feelings of loneliness and purposelessness because it unites us with the God of love. Oregon State University found that prayer leads to less addiction, and it helps people regulate their emotions. 

In our world today, there are so many proposed answers to a person’s lost sense of self and purpose. Exercise, diet, sports, leisure — the list goes on and on. These are proposed as possible remedies for human heartache. The answer, however, is so simple that it is overlooked. A focus on the supernatural, on God, is the best way for a person to be placed in contact with the source of it all and experience true loving acceptance and a sense of relationship. 

One of the longest research studies on record was conducted by Harvard University. It has made headlines in a variety of newsrooms recently. It is a multigenerational study on happiness: more than 700 men were chosen for the study, which later expanded to include their children and grandchildren. The takeaway was astounding. The No. 1 leading cause of happiness was meaningful relationships.

We know happiness is immaterial; it is not a physical thing in the universe. That is why money cannot buy happiness. That is why you cannot purchase happiness on Amazon. God is also immaterial, for if God were material, He would have a beginning. Instead, He is the source of the universe, existing outside of it.

God is also a relationship: Father, Son, and Holy Spirit. If the scientifically proven, No. 1 leading cause of happiness is relationships, and God is a relationship, it follows that we must lean on God if we desire our lives to be restored to the fullness they were made for.

So this Lent, focus on prayer. It could change your life and lead you to true happiness. Let us commit to focusing on God so we can receive Jesus, the true Ransom, at the end of these 40 days. That, and only that, can lead to a life restored and an America redeemed.

Thomas Griffin teaches in the Religion Department at a Catholic high school and lives on Long Island with his wife and son. He has a master’s degree in theology and is currently a masters candidate in philosophy. Follow his latest content at


How The Diversity Industrial Complex Dominated Everything And Fixed Nothing

Little more than a decade ago, DEI was just another arcane acronym, a clustering of three ideas, each to be weighed and evaluated against other societal values. The terms diversity, equity, and inclusion weren’t yet being used in the singular, as one all-inclusive, non-negotiable moral imperative. Nor had they coalesced into a bureaucratic juggernaut running roughshod over every aspect of national life. 

They are now. 

Seemingly in unison, and with almost no debate, nearly every major American institution — including federal, state, and local governments, universities and public schools, hospitals, insurance, media and technology companies, and major retail brands — has agreed that the DEI infrastructure is essential to the nation’s proper functioning.

From Amazon to Walmart, most major corporations have created and staffed DEI offices within their human resources bureaucracy. So have sanitation departments, police departments, physics departments, and the departments of agriculture, commerce, defense, education, and energy. Organizations that once argued against DEI now feel compelled to institute DEI training and hire DEI officers. So have organizations that are already richly diverse, such as the National Basketball Association and the National Football League.  

Many of these offices in turn work with a sprawling network of DEI consulting firms, training outfits, trade organizations, and accrediting associations that support their efforts. 

“Five years ago, if you said ‘DEI,’ people would’ve thought you were talking about the Digital Education Initiative,” Robert Sellers, University of Michigan’s first chief diversity officer, said in 2020. “Five years ago, if you said DEI was a core value of this institution, you would have an argument.”   

Diversity, equity, and inclusion is an intentionally vague term used to describe sanctioned favoritism in the name of social justice. Its Wikipedia entry indicates a lack of agreement on the definition, while the Associated Press online style guide has no entry (the AP offers guidance on related terms).

Yet however defined, it’s clear DEI is now much more than an academic craze or corporate affectation.

“It’s an industry in every sense of the word,” says Peter Schuck, professor emeritus of law at Yale. “My suspicion is that many of the offices don’t do what they say. But they’re hiring people, giving them titles and pretty good money. I don’t think they do nothing.”  

It’s difficult to know how large the DEI Industrial Complex has become. The Bureau of Labor Statistics hasn’t assessed its size. Two decades ago, MIT professor Thomas Kochan estimated that diversity was already an $8 billion-a-year industry. Yet along with the addition of equity, inclusion, and like terms, the industry has surely grown an order of magnitude larger. Six years ago, McKinsey and Company estimated that American companies were spending $8 billion a year on diversity training alone. DEI hiring and training have only accelerated in the years since.  

“In the scope and rapidity of institutional embrace,” writes Martin Gurri, a former CIA analyst who studies media and politics, “nothing like it has transpired since the conversion of Constantine.”  

Yet in our time, no Roman Emperor has demanded a complete cultural transformation. No law was passed mandating DEI enactment. No federal court ruling has required its implementation. There was no clarion call on the order of President Dwight D. Eisenhower’s “military industrial complex” warning. No genuine public crisis matched the scale of the response.  

The sources of this transformation are both deep and fairly recent. On one level, they can be traced back to the egalitarian movements that have long shaped American history — from the nation’s founding, through the Civil War and Reconstruction to the battles for women’s suffrage, the civil rights movement, and same-sex marriage. In other ways, the rapid transformation can seem no more explicable than an eccentric fashion trend, like men of the late 18th century wearing periwigs. However, a few pivot points of recent history bent its arc in DEI’s direction.  

The push for affirmative action is the most obvious influence, a program first conceived during the Reconstruction era but then abandoned for nearly a century. Although triumphs for social justice, the Civil Rights and Voting Rights Acts of the late 1950s and 1960s didn’t stop discrimination; the country would need to take more affirmative steps toward assisting minority groups and achieving more equitable outcomes, proponents argued. A controversial policy from the start (with the Supreme Court expected to curb its use in college admissions this term), affirmative action was further complicated by immigration reforms that allowed for more non-European immigrants, setting off a seismic demographic shift that continues to reverberate.  

The diversity movement of the early 1990s was in part an attempt to capitalize on the new multicultural reality. Stressing individual and institutional benefits rather than moral failings, early corporate diversity training programs hewed to traditional values of equality and meritocracy. Creating a diverse workplace, R. Roosevelt Thomas wrote in the Harvard Business Review, in 1990, “should always be a question of pure competence and character unmuddled by birth.”  

And in many ways it appears to have worked. Just look at the tech industry, where immigrants from East and South Asia have flourished. Nigerian immigrants are perhaps the most successful group in America, with nearly two-thirds holding college degrees. Doors have opened wide to the once-closeted LGBT community.  

But in other ways, the recent explosion of DEI initiatives reflects shortcomings of earlier efforts, as suggested by the headline of a 2016 article in the Harvard Business Review, “Why Diversity Fails.” Even as high-achieving first- and second-generation immigrants have thrived in certain industries, particularly STEM fields, people of color remain scarce in senior institutional positions. There is also the deeper issue of what many in the post-George Floyd era have taken to calling systemic or structural racism, citing major disparities for black Americans in education, health care, homeownership, arrests, incarceration, and household wealth. 

More recently, a spate of widely publicized police killings of unarmed African Americans has galvanized a growing belief, especially among progressives and especially since Donald Trump’s election, that America is an irredeemably racist nation. In 2020, in the wake of the Floyd murder and in advance of a fraught election, a moral panic set in. Having increased their ranks, social justice entrepreneurs and bureaucrats were poised to implement an ideological agenda and compound their institutional power. 

Although no hard numbers exist on the exact size of the industry, the “DEIfication” of America is clear. From Rochester, New York, to San Diego, California, cash-strapped municipalities have found the funds to staff DEI offices. Startups and small companies that once relied on their own employees to promote an inclusive culture now feel compelled to hire diversity consultants and sensitivity trainers to set them straight.

The field is so vast it has borne a sub-field: recruiting agencies for DEI consultants. So-called “authenticity readers” tell publishing companies what are acceptable depictions of marginalized groups and who is entitled to tell their stories. Master’s degree and certificate programs in DEI leadership at schools like Cornell, Georgetown, and Yale offer new and lucrative bureaucratic careers. 

At Ohio State University, for example, the average DEI staff salary is $78,000, according to public information gathered by economist Mark J. Perry of the American Enterprise Institute — about $103,000 with fringe benefits. Not to be outdone by its Big Ten conference rival, the University of Michigan pays its diversity officers $94,000 on average — about $124,000 with benefits. Until he retired from the position last summer, Michigan’s chief diversity officer, Robert Sellers, was paid over $431,000 a year. His wife, Tabbye Chavous, now has the job, at the vice provost rank and a salary of $380,000.  

For smaller organizations that cannot afford a full-time equity officer, there are other options for shoring up social justice bona fides — namely, working with any of the hundreds of DEI consulting agencies that have risen like mushrooms after a night’s rain, most of them led by “BIPOC” millennials. With some firms, the social justice goals are unmistakable. The Racial Equity Institute is “committed to the work of anti-racist transformation” and challenging “patterns of power” on behalf of big-name clients like the Harvard Business School, Ben & Jerry’s, and the American Civil Liberties Union. With others, the appeal has less to do with social change than exploring marketing opportunities and creating a “with-it” company culture, where progressive politics complement the office foosball tables and kombucha on tap.

“Diversity wins!” declares the management consultancy McKinsey & Company. Certainly diversity officers have been winning, although opposition is building in Florida and elsewhere, where the wider woke agenda that includes DEI has advanced. Even minimally trained practitioners are in high demand, and signs of their influence abound.   

Wells Fargo offers cheaper loans to companies that meet racial and gender quotas. Private equity and venture capital firms like BlackRock and KKR declare their commitment to racial “equity.” Bank of America tells its employees they are implicated in a white supremacist system. Lockheed Martin asks its executives to “deconstruct their white male privilege.” 

Major tech companies like Google publicly chart the “Black+ and Latinx+” people they’ve hired and assure the public that Artificial Intelligence will prioritize the DEI political agenda. ChatGPT, an AI model that can generate remarkably cogent writing, has been designed with a liberal bias, summarily rejecting requests that don’t conform to the algorithm’s notions of “positivity, equality and inclusivity.” 

Disney instructs employees to question colorblind beliefs espoused by the Rev. Martin Luther King Jr. and others. Fire departments are told to lower their physical fitness requirements for women. Similarly, universities are dropping standardized tests to yield more admissions of certain minorities (typically not Asians). And the Academy of Motion Picture Arts and Sciences, hoping to award more “films of color,” inspects Oscar-nominated films for cast and crew diversity. (Netflix has been a notable exception, last May laying off dozens of employees working on such issues. Under Elon Musk, Twitter is also flouting woke orthodoxies.) 

In education, college students are required to take DEI-prescribed courses. Community college employees in California are evaluated on their DEI competencies. Loyalty oaths to the DEI dogma are demanded of professors. Applicants to tenure-track positions, including those in math and physics, are rejected out of hand if their mandatory DEI statements are found wanting. Increasingly, DEI administrators are involved in hiring, promotion, and course content decisions.  

“Academic departments are always thinking, ‘We need to run this by Diversity,’” says Glenn Ricketts, public affairs officer for the National Association of Scholars.  

The industry’s reach can also be seen in the many Orwellian examples of exclusion in the name of inclusion, of reprisals in the name of tolerance. Invariably, they feature an agitated clutch of activists browbeating administrators and executives into apologizing for an alleged trespass against an ostensibly vulnerable constituency. When that has been deemed insufficient or when senior executives have sensed a threat to their own legitimacy, they’ve offered up scapegoats on false or flimsy pretexts. That might be a decades-long New York Times reporter, a head curator at a major art museum, an adjunct art history professor, a second-year law student, or a janitor at a pricey New England college. (The list is long.) 

Often enough, the inquisitions have turned into public relations debacles for major institutions. But despite the intense criticism and public chagrin, the movement marches on. 

The expansion “happened gradually at first, and people didn’t recognize the tremendous growth,” Perry says. “But after George Floyd, it really accelerated. It became supercharged. And nobody wanted to criticize it because they would be seen as racists.”  

Not playing along with the DEI protocols can end an academic career. For example, when Gordon Klein, a UCLA accounting lecturer, dismissed a request to grade black students more leniently in 2020, the school’s Equity, Diversity, and Inclusion office intervened to have him put on leave and banned from campus. A counter-protest soon reversed that. However, when Klein also declined to write a DEI statement explaining how his work helped “underrepresented and underserved populations,” he was denied a standard merit raise, despite excellent teaching evaluations. (He is suing for defamation and other alleged harms.)  

Scores of professors and students have also been subject to capricious, secretive, and career-destroying investigations by Title IX officers, who work hand-in-glove with DEI administrators, focusing on gender discrimination and sexual harassment. As writer and former Northwestern University film professor Laura Kipnis recounts in “Unwanted Advances,” individuals can be brought up on charges without any semblance of due process, as she was, simply for “wrongthink” — that is, for having expressed thoughts that someone found objectionable.

With activist administrators assuming the role of grand inquisitors, “the traditional ideal of the university — as a refuge for complexity, a setting for free exchange of ideas — is getting buried under an avalanche of platitudes and fear,” she writes. And it would appear that students and professors would have it no other way. By and large, they want more bureaucratic intervention and regulations, not less. 

As more institutions create DEI offices and hire ever more managers to run them, the enterprise inevitably becomes self-justifying. According to Parkinson’s Law, bureaucracy needs to create more work, however unnecessary or unproductive, to keep growing. Growth itself becomes the overriding imperative. The DEI movement needs the pretext of inequities, real or contrived, to maintain and expand its bureaucratic presence. As Malcom Kyeyune, a Swedish commentator and self-described Marxist, writes: “Managerialism requires intermediation and intermediation requires a justifying ideology.”

Ten years ago, Johns Hopkins University political scientist Benjamin Ginsberg found that the ratio of administrators to students had doubled since 1975. With the expansion of DEI, there are more administrators than ever, most of whom have no academic background. On average, according to a Heritage Foundation study, major universities across the country currently employ 45 “diversicrats,” as Perry calls them. With few exceptions, they outnumber the faculty in history departments, often two or three to one. 

At Michigan, Perry wasn’t able to find anyone with the words “diversity,” “equity,” or “inclusion” in his job title until 2004; and for the next decade, such positions generally remained centralized at the provost level, working for the university as a whole. But in 2016, Michigan president Mark Schlissel announced that the university would invest $85 million in DEI programs. Soon after, equity offices began to “metastasize like a cancer,” Perry says, across every college, department, and division, from the college of pharmacy to the school’s botanical garden and arboretum, where a full-time DEI manager is now “institutionalizing co-liberatory futures.” All the while, black enrollment at Michigan has dropped by nearly 50 percent since 1996.  

Despite the titles and the handsome salaries, most DEI administrative positions are support staff jobs, not teaching or research positions. In contrast with the provisions of Title IX, DEI is not mandated by law; it is entirely optional. DEI officers nevertheless exert enormous influence, in part because so few people oppose them. The thinking seems to be that if you’re against the expanding and intrusive diversity, equity, and inclusion agenda, you must be for the opposite — discrimination, inequality, and exclusion.  

“By telling themselves that they’re making the world a better place, they get to throw their weight around,” says Ricketts. “They have a lot of money, a lot of leverage, and a lot of people who just don’t want to butt heads with them — people who just want to go along to get along. People who are thinking, ‘If we embrace DEI, nobody can accuse us of being racist or whatever.’ They’re trying to cover their backsides.” 

Some organizations, it seems, are merely trying to keep up with cultural trends.  

Consider Tucson, Arizona, where diversity is not a buzzy talking point but an everyday reality. With a population that is 44 percent Hispanic, 43 percent white, and only 4.6 percent black, the city has had no major racial incidents in decades. Yet like hundreds of other communities, Tucson suddenly decided, in direct response to the Floyd murder 1,600 miles away, that it needed an office of equity.

To many observers, it seemed that the city was just “getting jiggy with it,” pretending to solve a problem that didn’t exist. After a two-year search, it hired Laurice Walker, the youngest chief equity officer in the country, at age 28, with a salary of $145,000 — nearly three and a half times what Tucson’s mayor, Regina Romero, earns. 

Not that the mayor is complaining. “I think this position is about putting an equity lens into all that we do,” Romero said in May, by which she means — well, nobody is quite sure what “equity” means, particularly with respect to federal legislation clearly prohibiting positive and negative discrimination alike.  

But trying to get out in front of the DEI train can also result in getting run over by it.  

When the city council of Asheville, North Carolina, hired Kimberlee Archie as its first equity and inclusion manager, its members probably didn’t anticipate being accused of having a “white supremacy culture.” After all, city manager Debra Campbell is black, as are three of the seven women making up the city council. The council had cut police funding and unanimously approved a reparations resolution.

Archie nevertheless complained that her colleagues still weren’t doing enough to advance racial equity. “What I describe it as is kind of like the bobblehead effect,” she said in 2020. “We’d be in meetings … and people’s heads are nodding as if they are in agreement. However, their actions didn’t back that up.”  

The drama in western North Carolina illustrates a dilemma that organizations face going forward. They can pursue an aggressive political agenda in which white supremacy is considered the country’s defining ethos (per The New York Times’ “1619 Project”) and present discrimination as the only remedy to past discrimination (see Ibram X. Kendi). Or they can take the path of least resistance, paying rhetorical tribute to DEI enforcers as the “bobbleheads” Archie disparages but doing little more than that. After all, they still have universities, businesses, and sanitation departments to run, alumni and investors to satisfy, students to teach, research to pursue, roads to pave, sewage to treat, costs to minimize, and profits to maximize.

Perhaps, too, senior administrators and executives are beginning to realize that, despite the moral panic of 2020, the most culturally diverse country in the world might not be irredeemably racist, even if it’s no longer acceptable to say so. The United States twice elected an African American man named Barack Hussein Obama as president. His first attorney general was a black man, who would be replaced by a black woman. His vice president would pick a woman of mixed race as his running mate. The mayors of 12 of the 20 largest U.S. cities are black, including the four largest.

Likewise, many of the people whom Americans most admire — artists, athletes, musicians, scientists, writers — are black. Lately, most winners of MacArthur Foundation “genius” grants are people of color. Gay marriage is legal, and enjoys wide public support, even among conservatives. The disabled, neurodivergent, and gender-divergent are applauded for their courage and resilience. And nonwhite groups, particularly Asians, Latinos, and African immigrants, have been remarkably upwardly mobile (often without official favoritism). 

Clearly, troubling disparities persist for African Americans. What’s much less clear is that racism, systemic or not, remains the principal cause of these disparities, or that a caste of equity commissars will reverse them. Indeed, it would seem that narrowing those disparities runs counter to the equity bureaucracy’s own self-interest.

“I don’t want to deny that there’s genuine goodwill on the part of some of these programs,” says Prof. Schuck, stressing that he hasn’t examined their inner workings. “But some of these conflicts are not capable of being solved by these gestures. They have to justify their own jobs, their own budgets. And that creates the potential for a lot of mischief. They end up trafficking in controversy and righteousness, which produces the deformities we’ve been seeing in policies and conduct.”

Still, to hear DEI officers, it’s they who are beleaguered and overwhelmed. Yes, they have important-sounding jobs and rather vague responsibilities. They are accountable to nobody, really. Rather than fighting “the man,” they now are the man, or at least the gender-neutral term for man in this context. But this also means that they are starting to catch flak, particularly as the evidence mounts that the institutions they advise and admonish aren’t actually becoming more fair, open, and welcoming. They’re not even becoming more ethnically diverse.  

Like other DEI advocates, the National Association of Diversity Officers in Higher Education has declined to answer questions for this article. Its officers are too busy traveling to conferences to do so, a spokeswoman said.  

But at a recent association meeting, Anneliese Singh of Tulane University invoked Rosa Parks’ refusal to take a back seat to discrimination. Although Parks was a housekeeper and diversicrats have comfortable university sinecures, their struggles are analogously distressing, Singh suggested. The latter, too, are on the “front lines” in a harrowing war. However, she said, her colleagues needed to remember what mattered most: looking out for themselves.

“It is not self-indulgence,” she said, now quoting the feminist and civil rights activist Audre Lorde. “It is self-preservation. And that is an act of political warfare.”

For the moment, it’s a war Singh and her DEI colleagues are clearly winning.

This article was originally published by RealClearInvestigations.