What do we celebrate on the Fourth of July?
Our national independence! (As we subvert the national independence of Iran, North Korea, and Venezuela, as well as assist in the brutal occupation of Palestine and Yemen – none of which threaten us with any harm?)
Our national sovereignty! (As unregulated legal immigration reaches new highs, unstoppable illegal immigration overwhelms the border, while the apolitical “Jarvanka” shills for the cheap-labor lobby and the bipartisan Durbin-Graham “DREAMer Team” shill for the amnesty lobby?)
Our military supremacy! (As our military grows more impotent as its $1.25-trillion-and-counting budget grows more exorbitant, the “War on Terror” – with its staggering cost of nearly $6 trillion and about a half-million lives – has nothing to show for itself, and our “superpower” status is overstretched?)
Our political unity! (As partisan polarization gets so severe that the government is constantly breaking down in gridlock and shutdowns – occasionally interrupted by tribalistic crises like the Kavanaugh confirmation – while fake news like the “Covington Kids” and “Jussie Smollett” scandals are insta-politicized by radicalized techno-mobs lynching each other online?)
Our liberal democracy! (As political scientists conclude that “economic elites and organized groups representing business interests have substantial independent impacts on U.S. government policy, while average citizens and mass-based interest groups have little or no independent influence” – in other words, that the government is an “oligarchy,” not a “democracy”?)
Our constitutional liberty! (As economic and social terrorism by “the Fourth Estate” and “Silicon Valley” repress our freedoms of expression far more pervasively than old-fashioned state propaganda and surveillance ever could?)
Our free press! (As the media becomes less informative and more provocative – from a public institution essential to self-government to a consumer product targeting various demographics with fake news appealing to their prejudices – while journalists defend Jim Acosta’s grandstanding as an expression of First-Amendment freedoms and condemn Julian Assange as a spy for reporting on war crimes?)
Our “Judeo-Christian” morality! (As our corrupt, decadent, “Weimerikan” culture embodies each and every one of the Seven Deadly Sins while “progressive occultism” and literal “devil worship” are the fastest-growing religions?)
Our economic prosperity! (As 78% of our workers live paycheck to paycheck, real wages have not risen since the 1970s, two-thirds of bankruptcy cases are tied to medical bills, the (under-)employment rate is over 7%, job growth is absorbed by immigrants, another financial bubble is inflating, and 40% and 20% of the wealth belongs to the top 1% and 0.01% of the people?)
Our historical identity! (As “antifa”-iconoclasm creeps from Confederate history to all of American history, even overthrowing figures as harmless as the songwriter Stephen Foster and the singer Kate Smith?)
Our racial diversity! (As minority groups become so triumphalist about demographic trends toward a “majority-minority” population – with skeptics pathologized as paranoid, prejudiced, and psychopathic – that “diversity is our strength” becomes “diversity is your punishment”?)
Our rule of law! (As “anarcho-tyranny” victimizes the lawful and peaceful while empowering the lawless and brutal, the “RussiaGate” conspiracy theory risks a soft-coup at home and a hard-war abroad, and a partisan judiciary interferes with the authority of other branches of government?)
What have the conservatives conserved? What have the progressives progressed? Nothing. (The less said about the libertarians, the better.)
National holidays such as “Independence Day” presuppose the existence of a sovereign nation-state – a coherent, distinct nation, for one, and a state which is the political expression of a nation, for another. What “Weimerika” is, however, is a nation against itself and a state against the nation – that is, a people who all hate one another and a government which hates its own people.
On July 5th, 1852, the freedman-abolitionist Frederick Douglass gave a speech to the Ladies’ Anti-Slavery Society of Rochester, New York, titled “What to the Slave is the Fourth of July?” In his speech, which has since gone down in history, Douglass opened with a conventional tribute to the American Revolution but ended with a bitter twist:
What, to the American slave, is your Fourth of July? I answer: a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim. To him, your celebration is a sham; your boasted liberty, an unholy license; your national greatness, swelling vanity; your sounds of rejoicing are empty and heartless; your denunciations of tyrants, brass-fronted impudence; your shouts of liberty and equality, hollow mockery; your prayers and hymns, your sermons and thanksgivings, with all your religious parade and solemnity, are, to him, mere bombast, fraud, deception, impiety, and hypocrisy – a thin veil to cover up crimes which would disgrace a nation of savages. There is not a nation on earth guilty of practices, more shocking and bloody, than are the people of these United States, at this very hour.
As rhetoric, Douglass’ speech is sublime; as history, it is dreadful. Douglass’ speech was one of the earliest ideological revisions of the Declaration of Independence from an act of secession which had more to do with self-interest than human rights into a veritable human-rights manifesto. According to Douglass, the American Revolution was not a colonial reaction against a faraway and meddlesome government, but an enlightened, universal expression of “the rights of man.” Thus, slavery, which denied that “all men are created equal,” was a “contradiction” to and “inconsistency” with the “principles” on which American independence was founded.
Douglass did not conceive of this literalist interpretation of the Declaration of Independence on his own, although his speech was one of the most eloquent and impassioned articulations of it. Such interpretations of the phrase “all men are created equal” arose as soon as those words were written (though notably not from Thomas Jefferson, John Adams, or any of the other authors or signatories of the Declaration), but it was not until Abraham Lincoln incorporated this idea into his chiliastic and gnostic rhetoric that “all men are created equal” fundamentalism was canonized.
On July 4th of last year, leading “Black Lives Matter” agitprop-activist Shaun King shared Douglass’ speech. “Frederick Douglass was a prophet. Every single word of this is as true today as it was over 150 years ago. Read it all. The Fourth of July was always a sham.” In response, mewling “conservatives” over at Breitbart (and even at American Greatness) recited their “proposition nation” catechism. (They did not even challenge the big lie at the heart of the “Black Lives Matter” protest movement: The reason that blacks are arrested and even shot by police in disproportionate numbers is, sadly, because blacks commit crime in disproportionate numbers.)
Instead of continuing to cling to a corny, saccharine civic religion which offers them nothing, Americans should do as Douglass did and ask themselves the hard question, “What, to me, is the Fourth of July?”
What is the point of celebrating the independence of your country when your country has been reduced to lines on a map with nothing holding it together but the money and power of a central government? (Indeed, even “secession,” a question which for over 150 years has supposedly been “settled” by the so-called “Civil War” – the same way that the Indians’ titles and treaties were “settled” by the Indian Removal Act? – is now making a comeback!)
What is the point of celebrating the independence of your country when everything for which your country stands at home and abroad disgusts you? (Check out publicly funded NPR reporting on the U.S. State Department promoting feminism in Iran and the U.S. government’s Radio Free Europe / Radio Liberty criticizing Russia for censoring homosexual content in American media!)
What is the point of celebrating the independence of your country when your country is so unrepresentative of the people that it might as well be occupied by aliens? (And I, for one, would be interested in hearing what any prospective extraterrestrial overlords had to say!)
As painful as it is to doubt something about which you have always been certain, it is foolish and even dangerous to continue to cling to a belief that is clearly based on a falsehood. False knowledge is worse than mere ignorance.
As the country as a whole unravels – that is, as the bourgeois, “neo-liberal” institutions of capitalism and democracy lose their legitimacy in the face of economic stagnation, environmental degradation, and mass-migration, not to mention anomie – what does the ruling class of this country do? Instead of listening to their fellow citizens and doing anything differently, the elite continues to tug on the threads of the cultural, economic, political, and social fabric while vilifying those who realize that the status quo is neither normal nor moral as “Communists” (if they are on the left) or “Nazis” (if they are on the right). On issue after issue – playing nice with other countries rather than bullying them or reducing immigration numbers rather than boosting them, not to mention the bogeyman of “democratic socialism” – the people are ignored and their will is rendered irrelevant. The U.S.A. does not belong to “us”; it belongs to “them.”
The whole point of national independence is that a country’s interests are supposed to be more accurately and effectively represented by its own people – “self-government” – but independence is worthless if no one actually represents the people. The lobbyists to whom politicians listen closely and whom they understand very well have good reason to celebrate independence, perhaps, but not the people who are overruled whenever they try to take back their country and are otherwise ignored.
Thomas Jefferson, for example, did not support independence for independence’s sake alone, but supported it as a small step towards the larger goal of self-government. In fact, independence was so beside the point to Jefferson that he would have rather been back home in Virginia framing her new government than in Philadelphia drafting the Declaration of Independence! “It is a work of the most interesting nature and such as every individual would wish to have his voice in,” Jefferson wrote to a friend about the constitutional convention in Virginia. “In truth it is the whole object of the present controversy; for should a bad government be instituted for us, in future it had been as well to have accepted at first the bad one offered us from beyond the water without the risk and expense of contest.” As Jefferson’s biographer Dumas Malone explains, “He regarded political independence not as an end but as a means, and was more deeply concerned about what should follow the formal separation than about the action itself.”
The philosophical statesman Edmund Burke once observed, “To make us love our country, our country ought to be lovely.” Well, lately, our country – if it can even be called “ours” – has been anything but lovely. Indeed, whether it is bombing, looting, or polluting the rest of the world, Weimerika has warped into a downright demonic force for chaos and evil.
The Iranians call the U.S.A. “the Great Satan.” (Americans would do well to check their kneejerk-jingoism and remember that to Iranians, the U.S.A. is a country which in 1953 deposed their progressive government in order to install a subservient dictatorship, since 1979 has isolated them diplomatically and economically, from 1980 to 1988 backed Saddam Hussein as Iraq invaded their country and unleashed chemical weapons on them, since 2002 has vilified them as a “martyr state” and “state sponsor of terrorism” for helping neighboring countries quell rebellions and repel invasions, since 1948 has been allied with a “Zionist regime” which menaces the Middle East, and just recently broke their denuclearization deal on false pretenses, legitimized a criminal group which was exiled from their country, labeled a branch of their armed forces a terrorist group, and is now threatening – in all-caps – to “end” their country.) Given the U.S.A.’s utter mendacity and malevolence toward Iran – which it has never forgiven for “declaring independence” itself – it is no more of a wonder that the Iranians chant “Death to America!” and burn American flags than it was a wonder that the Americans of 1776 chanted “Death to King George!” and burned the British in effigy.
In a sermon against the “Great Satan” and “Zionism,” the Islamic scholar Yahya Jafari described the nature of these Western beasts:
They do not understand the language of reason. They do not follow international law, nor do they follow moral and humane laws. They have no religion and do not accept God. They are a bunch of profane people, but seemingly, some are called Christians and some are called Jews. If the prophet Jesus were among us, would he justify the criminal measures America takes today in the name of promoting democracy and freedom?
As insulting as such an epithet is, what makes it even worse is that it is all too true. The utter stupidity and savagery of Weimerika is shameful – and Donald Trump is merely a symptom, not the disease.
Does Weimerika listen to reason? No, it rejects any information which contradicts its belief in its own “exceptionalism,” “indispensability,” and other chauvinistic conceits.
That imperial arrogance is why it broke the denuclearization deal with Iran despite ample documentation of compliance which it had even certified itself.
That imperial arrogance is why it broke the Paris Agreement despite the dire ecological consequences of climate change.
Weimerika is, then, irrational and unreasonable.
Does Weimerika follow international law? No, because of its chauvinistic belief in its own “exceptionalism” and “indispensability,” it pursues a unilateral foreign policy which not only defies its own allies and treaties, but also defies reality itself.
That imperial arrogance is why it not only broke the denuclearization deal with Iran in bad faith, but also punished Iran and its former P5+1 partners for continuing to abide by the deal that they all negotiated in good faith.
That imperial arrogance is why it broke the Paris Agreement, not only setting an example of bad faith in an agreement based on good faith, but also undoing environmental regulations and undermining climate science at home.
Weimerika is, then, lawless and reckless.
Does Weimerika follow human-rights laws? No, it perversely weaponizes human rights, calling out abuses in so-called “rogue states” (e.g. Iran and Syria), covering up abuses in its client states (e.g. Israel and Saudi Arabia), and carrying out its own abuses whenever and wherever it wants (which is allowed because of its “exceptionalism” and “indispensability”).
Israel can snipe Palestinian children, journalists, and medics at their border or proclaim itself a “Jewish supremacist” ethno-state and Weimerika will respond by blithely reaffirming the Israeli-American unholy alliance. But if someone attacks a Japanese ship in the Persian Gulf (during a diplomatic mission to Iran from Japan) or Iran shoots down an American drone (encroaching on Iranian airspace), Weimerika will immediately play the victim and threaten violence.
Saudi Arabia can attack Yemeni marketplaces and weddings or strangle and dismember a Saudi dissident working for The Washington Post, and Weimerika will respond by blithely reaffirming the Saudi-American unholy alliance. But if poison gas supposedly goes off somewhere in Syria, Weimerika will immediately launch airstrikes on Syria before figuring out what really happened and then lie about it once it does.
Weimerika is, then, unfair and untrustworthy.
Does Weimerika follow religion? No, its traditional Christian religion having been purged from public spaces and private hearts, if it has any gods at all, they are profane “gods” of consumerism, egalitarianism, hedonism, individualism, materialism, narcissism, permissiveness, selfishness, and transgressiveness. (Overweening delusions of “exceptionalism” and “indispensability” are chauvinistic expressions of this underlying narcissism.)
Prestigious cultural and social events like the “Met Gala” and “Time 100” epitomize this godlessness. Supposedly held in honor of influential public figures (artists, entrepreneurs, intellectuals, philanthropists, statesmen, and the like), these events have instead degenerated into platforms for moronic and narcissistic pop-stars to exhibit – and in some cases literally expose – themselves for attention.
Weimerika’s only gods, then, are false gods.
Would Jesus promote “democracy” and “freedom” in other poorer countries as Weimerika does – by isolating them from the global economy and dropping bombs on them until the crisis reaches a breaking point, then exploiting the crisis to impose unpopular and predatory economic policies? Would Jesus even approve of the nature of “democracy” and “freedom” here in the so-called “exceptional and indispensable” Weimerika, where abortion and pornography are protected as quasi-sacred constitutional rights, where corporate greed and governmental corruption sustain an opioid epidemic, and where sporting events glorify militaristic propaganda?
Weimerika, then – its cesspool of a culture, its house-of-cards economy, its gangster-style government, and the insanity of its society, all of which it also exports around the world – is indeed “the Great Satan.” Indeed, in Islam and the other Abrahamic faiths, Satan is, above all else, “the Deceiver.” What better word, then, for a country whose current chief representative to the rest of the world in one moment jokes, “We lied, we cheated, we stole,” and in the next moment preaches, “It reminds you of the glory of the American experiment” – all to the applause of his audience of college students?
What better word for a country whose chief representative to the rest of the world, in the midst of overthrowing another secularist “regime” with more jihadist “rebels” in Libya, blathered about “humanitarian intervention” and “smart power,” only to burst out cackling “we came, we saw, he died!” upon learning that the Libyan head of state had been lynched?
What better word for a country whose chief representative to the rest of the world, despite invading Iraq under false pretenses, still boasted, “Our record of living our values and letting our values be an inspiration to others I think is clear, and I don’t think I have anything to be ashamed of or apologize for with respect to what America has done for the world”?
What better word for a country whose chief representative to the rest of the world, when asked about the human cost of economic sanctions against Iraq (that is, a half-million dead children), retorted, “We think the price is worth it,” yet when later asked to justify military force against Iraq, rhapsodized, “We are America…we stand taller and we see further than other countries into the future”?
What better word for a country whose chief representative to the rest of the world, in order to reassure the Soviet Union, promised, “If you remove your troops and allow unification of Germany in NATO, NATO will not expand one inch to the east,” but after the dissolution of the USSR, expanded NATO so far to the east that it is now massing forces on Russia’s very border (all the while calling Vladimir Putin “the next Hitler” or “the next Stalin”)?
Remember when Ron Paul was booed at a “GOP” debate hosted by FOX News and The Wall Street Journal for suggesting that our foreign policy should be based on the “Golden Rule”? (By contrast, Newt Gingrich received a roar of approval from the audience for responding, “Andrew Jackson had a pretty clear-cut idea about America’s enemies: Kill them.”) Needless to say, booing the Golden Rule is, quite literally, “Satanic.”
The only other word for Weimerika besides “Deceiver” that comes to mind is yet another anti-American Iranian epithet: “The Capital of Global Arrogance.” As Weimerika’s current chief representative to the rest of the world just declared, “What’s good for the United States is good for the world,” and “America’s aggressiveness” is justified by “America’s essential rightness.” The pseudo-idealistic language with which Weimerika traditionally cloaks its imperial arrogance is simply insufferable, and thus such “swagger” – in this jingoist-turned-diplomat’s words – is at least somewhat refreshing for its lack of pretense.
Yet as a certain ex-President (who, just like a “deceiver,” followed up winning the Nobel Peace Prize with setting records for drone-strikes and prosecuting journalists) would say, “Let me be clear.” So let me be clear: As much as I hate all that is chaotic and evil about “Weimerika,” I love all that is lawful and good about “America.” Just as there cannot be light without darkness, there cannot be love without hatred. Thus, I hate the plastic, toxic Weimerika that has supplanted the organic America that I love and to which I am loyal. Weimerika is to America what McDonald’s is to Bern’s.
“What to the American is the Fourth of July?” Good question. I answer: A day that reveals to us, more than all other days in the year, the gross injustice and cruelty to which we – and the rest of the world – are the constant victim.
To us, our so-called national independence is a sham.
Our much-ballyhooed “values,” an unholy license.
Our self-assurances about ruling a “unipolar world,” swelling vanity.
Our redefined identity and retconned history as a “nation of immigrants,” empty and heartless (to native citizens, at least!).
Our self-proclaimed status as “a city upon a hill” and “a light unto the nations,” brass-fronted impudence.
Our self-righteous shouts of “fighting terrorism” and “spreading freedom,” hollow mockery.
Our platitudes about “the character of this nation,” “the core values of this nation,” “the soul of this nation,” “our standing in the world,” “our very democracy,” and “who we are,” are, to us, mere bombast, fraud, deception, impiety, and hypocrisy – a thin veil to cover up crimes which would disgrace a nation of savages.
There is not a nation on earth less independent – as Eugene J. McCarthy put it, “a colony of the world” where businesses control economic policy, foreign states control foreign policy, and immigrants control immigration policy, all in their own self-interest at public expense – than are we the people of these United States, at this very hour.
Watchmen, 2009. What if superheroes were real – I mean really real?
Directed by Zack Snyder.
Starring Jackie Earle Haley, Patrick Wilson, Malin Akerman, Billy Crudup, Matthew Goode, and Jeffrey Dean Morgan.
Written by David Hayter and Alex Tse.
Scored by Tyler Bates.
If Christopher Nolan’s “Batman” movies are the best supervillain movies (meaning that the supervillains challenge the superhero not just physically, but mentally and morally as well), then Zack Snyder’s “Watchmen” is the best superhero movie (meaning that the superheroes actually act like Übermenschen would, and do not just echo the humanistic pieties of the Untermenschen who hate and fear them). In fact, the one superhero in the movie who does echo those pieties turns out to be a supervillain. The superheroes themselves, however, are “watchmen” fighting the law-breaking and defending the law-abiding, which is actually quite subversive in today’s “anarcho-tyranny.”
The most important thing about “Watchmen” is its characters – who, however fantastic, have interesting and realistic personalities. As Alan Moore, the author of the graphic novel on which the movie is based (but who refused any involvement in the movie), explained of his characters, “We tried to set up four or five radically opposing ways of seeing the world and let the readers figure it out for themselves; let them make a moral decision for once in their miserable lives! Too many writers go for that ‘baby-bird’ moralizing, where your audience just sits there with their beaks open and you just cram regurgitated morals down their throat…What we wanted to do was show all of these people, warts and all. Show that even the worst of them had something going for them, and even the best of them had their flaws.”
One character, the Comedian (played by Jeffrey Dean Morgan), is a deep-state agent who carries out assassinations, coups, and other black-ops activities that would warm Elliott Abrams’ heart. He acts cynical and irreverent out of increasing disillusionment, which peaks when he finally realizes that the liberal decadence which he has fought for all of his life is not a deviation from American ideals, but rather those ideals come to fruition. “Whatever happened to the American Dream?” one character asks the Comedian, in the midst of nationwide riots. “It came true,” the Comedian replies, sadly.
Another character, Rorschach (played by Jackie Earle Haley), is a vigilante who is absolutely uncompromising in his pursuit of justice. He wears a mask that looks like a Rorschach test – and much like those inkblot tests, how viewers respond to the black-and-white Rorschach says something about who they are and what they value. Where are they on the x-axis from “lawful” to “neutral” to “chaotic,” and on the y-axis from “good” to “neutral” to “evil”?
Dr. Manhattan (played by Billy Crudup) is a physicist who was destroyed in an experiment but who returned, miraculously, as a godlike figure. He grows increasingly detached from, if not disgusted with, humanity, and it is his lover, “Silk Spectre” (played by Malin Akerman), who must remind him of the value of life.
Adrian Veidt (played by Matthew Goode) is a capitalist and philanthropist who, before superheroes were outlawed, was “Ozymandias,” named after Percy Bysshe Shelley’s poem about the megalomaniacal Egyptian pharaoh. Veidt is a materialist (who believes that unlimited resources will bring universal peace) as well as a utilitarian (who thinks in terms of the greatest good for the greatest number, not right and wrong).
Silence, 2016. The story of two Jesuit priests who travel to Japan to discover the fate of their mentor, who is rumored to have renounced his faith and gone native.
Directed by Martin Scorsese.
Starring Andrew Garfield, Adam Driver, and Liam Neeson.
Written by Martin Scorsese.
Scored by Kim and Kathryn Kluge.
Like “Gangs of New York,” this movie was a passion project of Martin Scorsese’s which he had wanted to make for decades. It is an intensely religious story about faith and doubt – God’s apparent “silence” as his followers suffer. When two Jesuit priests in Portugal, Sebastião Rodrigues (played by Andrew Garfield) and Francisco Garupe (played by Adam Driver), hear that their mentor, Father Cristóvão Ferreira (played by Liam Neeson), has become an apostate, they embark on a mission to redeem him, even though Japan has closed itself off to foreigners. When Rodrigues and Garupe arrive in Japan, they are horrified by the official inquisition against Christian converts (who, if exposed, must either apostatize or be tortured to death).
The Japanese point of view is similar to that of the Greek philosopher Celsus and the Roman Emperor Julian, both of whom opposed the Christianization of their peoples. According to Celsus and Julian, religion is not a matter of individual choice or free will, but is linked to particular peoples as expressions of their organic uniqueness. To the Japanese authorities in the movie, the Christian missionaries are a threat to the identity of their nation and the sovereignty of their state. Although the Japanese inquisition is portrayed, unflinchingly, as brutal (indeed, even diabolical, as the Japanese understand the Christians better than the Christians understand the Japanese, and are able to trick them into renouncing Christianity out of Christian motives), it is hard not to sympathize with the Japanese, who are simply trying to resist Western colonialism. Yet the Jesuit priests are not portrayed as chauvinists, either, but as earnest believers who are trying to save souls and are conflicted over the persecution that their presence causes. Indeed, to prepare for their roles, Garfield and Driver immersed themselves in the Jesuit lifestyle, and with the help of practicing Jesuits, even underwent the Jesuit rite of a seven-day silent prayer vigil.
True to its name, “Silence” is a quiet movie without much in the way of music. Instead of a score, there are the sounds of nature – of waves crashing on a rocky shore, of a nighttime forest buzzing with life, and so on. It is an utterly immersive experience.
Star Trek: Deep Space Nine, 1993-1999. The breakup of Yugoslavia and the Yugoslav wars – in space!
Of all the Star-Trek series, “Deep Space Nine” is the most mature. “Next Generation” perfected the adventure format of the original series (uneven in quality and dated by sci-fi kitsch) and is enhanced by the acclaimed performance of Patrick Stewart. “Voyager” had Seven of Nine and “Enterprise” had Porthos. Yet “Deep Space Nine,” according to cast member Rene Auberjonois, “is the one that is almost like a Russian novel.” Indeed, as the show follows a sizable cast of well-developed characters through war and peace, it is Tolstoyesque. While other Star-Trek series can be highly episodic, “Deep Space Nine” explored overarching themes of ethics, faith, identity, and more. While other Star-Trek series prioritized world-building over character-building, “Deep Space Nine” did both, building its own history and mythology as well as a highly individualized cast of heroes and villains. Speaking of villains, “the Dominion” is one of the most sinister villains in fiction: a galactic empire overseen by the “Vorta” (a genetically engineered managerial class) and “Jem’Hadar” (a genetically engineered warrior class), though ruled in secret by “the Founders” (a cabal of shapeshifters with a persecution complex who infiltrate and undermine other states which they have targeted for subjugation).
The Sopranos, 1999-2007. An Italian mob boss in New Jersey struggles to balance his two “families” – his fellow gangsters in the city with his wife and children in the suburbs.
This show blends thrilling criminal intrigue, moving relationship drama, dark comedy, and a skillfully deployed classic-rock soundtrack. It pioneered the often-imitated, rarely duplicated concept of making the protagonist an otherwise unsympathetic individual who gives the audience a transgressive thrill. Mafia movies, along with Westerns, are a unique American art form, and “The Sopranos” is a Mafia story for this age of rootlessness, meaninglessness, and hopelessness. As Tony Soprano states at the very beginning of the show, “It’s good to be in something from the ground floor. I came too late for that, I know. But lately, I’m getting the feeling that I came in at the end. The best is over…Take my father. He never reached the heights like me. But in ways he had it better. He had his people. They had their standards. They had pride. Today, what do we got?”
Firefly, 2002-2003. “Space Opera” meets “Wild West.”
After reading The Killer Angels (Michael Shaara’s Pulitzer-winning novel on the Battle of Gettysburg), Joss Whedon was inspired to tell a story about people who had fought on the losing side of a war and were seeking freedom on the frontier, like many ex-Confederates did in the American West after the Civil War. The result was “Firefly.” In the future, after Earth’s resources are depleted, the human race colonizes another star system. When the central planets (“the Anglo-Sino Alliance”) try to take control of the outer planets (“the Independents,” who are distinctly Wild West), war breaks out, and in the end, the Independents are badly beaten. Captain Malcolm Reynolds, a veteran of the Independents, lives outside the law on the edge of civilization (known as “the black”), though he abides by his own code of honor. The rest of Reynolds’ crew are all seeking their own forms of freedom, too: a city-slicker doctor, a high-class prostitute, a doe-eyed mechanic, a dim-witted mercenary, a fast-talking pilot, a kind-hearted preacher, and more. “Firefly” does what Whedon does best: tells a story filled with light humor around real depth of feeling. The cast is a salad bowl of different personalities and yet also a melting pot of one family. Just when the show seems not to be taking itself too seriously, it subverts expectations with a dose of sincerity, and when things are getting too serious, it subverts expectations with a dose of irony. Even the music stands out, which is rare for anything on television. Unfortunately, Americans would rather have a smorgasbord of never-ending sitcoms and procedurals, and thus “Firefly” was canceled midway through its first season. The show has a loyal fan-base, however (known as “Browncoats,” the unofficial name for Independents soldiers, and reminiscent of Confederate “Graybacks” and Union “Bluebellies”), and the show was continued as a movie, “Serenity,” which I also highly recommend.
The Wire, 2002-2008: The story of modern-day Baltimore told from the perspective of different municipal institutions (the schools, the press, the ports, the politicians, etc.), particularly through their relation to law-enforcement.
“The Wire” is the product of David Simon (a reporter for The Baltimore Sun) and Ed Burns (a Baltimore police officer and schoolteacher), who had collaborated on a book about inner-city life in Baltimore. They prided themselves on the realism of their storylines and characters, drawing from their own experiences in the city and often using non-professional actors from the city. “The Wire,” while cynical about the effect that institutions (whether bureaucracies or gangs) have on individuals, is optimistic about the humanity of individuals themselves (whether police officers or drug dealers). “The Wire” is, perhaps most popularly, a vehicle for virtue-signaling politically correct opinions, such as “the need for criminal-justice reform.” While such liberal platitudes were, without a doubt, the intention of Simon and Burns (and they are not wrong that the “War on Drugs” does more harm than good), because of their integrity as artists and journalists, that was not the only message conveyed. In fact, “The Wire” was a spectacular “Kinsey Gaffe” in that it unsparingly portrayed the downright feral behavior – sorry, but is there any other word for this? or this? or this? – of Baltimore’s “black community.” The transformation of Baltimore from “Charm City” and “Monumental City” to “The City That Bleeds” parallels what has happened to many American cities in the wake of the civil-rights revolution. In just a few generations, Baltimore transformed from a thriving metropolis into a blighted slum. The city’s demographics went from 63% white in 1950 to 76% black in 2010. Like most urban areas that have undergone similar demographic transformation, the city is also depopulating – by 35% between 1950 and 2010, with a 75% drop in the white population.
The result is the world of “The Wire.” As someone whose family is from Maryland (my father’s side is from Baltimore and my mother’s side is from Annapolis), I find that “The Wire” bleakly illustrates what my family – and our nation and civilization as a whole – has lost.
Battlestar Galactica, 2004-2009: After an apocalyptic attack destroys humanity’s home worlds, what is left of the human race must preserve not only its very existence, but its ideals – what makes it human.
The 2000s “Battlestar Galactica” is a remake of an earlier series from the 1970s, and is a rare example of a remake that is actually an improvement on the original. While the original is pure sci-fi kitsch, the remake is an intelligent show. Unlike the futuristic world of “Star Trek,” the world of “Battlestar Galactica” feels real. It is not just the aesthetics which feel real, however, but the issues faced by the remnant of the human race. Whether it is election fraud, martial law, acts of terrorism, vigilante justice, collective punishment, or show trials, the question that is asked again and again is, “Can the ends ever justify the means?” (Anyone with a categorical answer to that question is someone who either believes in nothing at all or in only one thing, both equally dangerous.) “Battlestar Galactica,” which aired in the midst of the “War on Terror,” is clearly influenced by the debate between “national security” and “individual privacy,” what is “necessary” versus what is “legal,” and whether perceived threats are real or fake. Unfortunately, as the show ends, it takes a self-indulgent dive, dropping its complex politics for an over-the-top mysticism that is head-scratching and eye-rolling.
The Tudors, 2007-2010. The story of the life and times of King Henry VIII, with particular emphasis on his six wives.
The appearance of this show is deceiving. On the surface, it may seem like a mass-market historical romance, where everyone is gorgeous and eager to rip off each other’s clothes. For instance, Henry VIII, rather than the husky man that we know from his portraiture, is played by a svelte Jonathan Rhys Meyers, who looks like he belongs in a cologne advertisement. It is a bit of a bodice-ripper, to be sure, but it is much more than that. It is also a compelling portrayal of the English Reformation. While many historical liberties are taken, it is always necessary to cut and condense material when adapting a story – especially a true one. Catholics, in particular, will appreciate the unflinching portrayal of the Protestant revolution. “Idols,” such as art in churches or relics in shrines, were vandalized. Customs and traditions, such as festivals for saints, were outlawed as “idol-worship.” Homes were ransacked in search of “idols” (such as an image of a saint), and visiting a family graveyard to offer prayers for the departed was suspected as “idolatry.” Monasteries were abolished and their property confiscated. Rebellions by peasants who wanted to worship in their old ways were double-crossed and stamped out. Catholic dissidents were executed. The Reformation, in short, was a period of hyper-fundamentalist repression. If the purpose of a historical adaptation is to create awareness of and interest in a particular event or period (even if every detail is not exactly right), then “The Tudors” succeeds marvelously.
True Detective (Season 1), 2014. Weird horror, cosmic terror, philosophical interludes, dark comedy, tempting women, and more.
In this Southern mystery, two detectives, Woody Harrelson’s Marty (an outwardly respectable family man, though inwardly a liar and a cheater) and Matthew McConaughey’s Rust (outwardly a misanthrope, but inwardly an honorable, lawful man), work to solve a murder mystery which has haunted them for years. The show is set in Louisiana, because that is what the showrunner, Nic Pizzolatto (born in New Orleans, raised in rural Louisiana, and a graduate of LSU), knows. It is no modern-day Southern gothic horror like “True Blood,” however, but spends most of its time out in the sticks or deep in the underworld. In literature, the only place better for a horror story than New England (with its hyper-repressive hatred and fear of the unknown) is the South (with its grim resignation to the existence of evil). Southern writers, influenced by an older Christianity which has nothing to do with the fanatical Hebraic-Puritanism of New England, know that there is an innate evil in humanity which no social systems or progressive reforms, however well-intentioned and well-administered, can ever fully repress. “Whenever I’m asked why Southern writers particularly have a penchant for writing about freaks,” quipped Flannery O’Connor, “I say it is because we are still able to recognize one.” As their suspenseful investigation uncovers darker and darker secrets, the detectives’ notions of morality and even reality are shaken to their core.
And now, for American history. I feel the same way about American history and movies that the late Tom Wolfe did about modern American life and novels: with so much potential material out there, how is it that historical movies are so scarce (and, when they do exist, usually meant to make us feel bad about ourselves)? Why is there no historical adventure movie featuring John Smith, who was a real-life swashbuckling hero? Why is there a historical romance about the fictional affair between Thomas Jefferson and Sally Hemings, but not one about the real-life romance between Thomas and his wife, Martha? The list of wasted opportunities is endless.
The Patriot, 2000. The story of the American Revolution in South Carolina, following a yeoman farmer trying to escape his reputation as a legendary Indian fighter and keep his family safe.
Directed by Roland Emmerich.
Starring Mel Gibson and Heath Ledger.
Written by Robert Rodat.
Composed by John Williams.
A brilliant but flawed movie which has become near and dear to my heart due to the fanatical, malicious criticism to which it has been subjected. Yes, the movie is, at times, melodramatic, and, at times, historically inaccurate, but other movies which grossly falsify history in order to be politically correct are not nearly as criticized as “The Patriot” was for its harmless fictional liberties here and there. For instance, 2016’s much-applauded “Birth of a Nation” rewrote the history of Nat Turner’s slave revolt: instead of a short-lived killing spree of white families in their sleep which was summarily crushed by the local militia, it was a heroic rebellion against white supremacy which held out until it was crushed by overwhelming numbers. In fact, criticism of “The Patriot” was one of my earliest encounters with the Cultural-Marxist Left: a young, teenaged me, “surfing the web” as we said back then, came across Salon’s hate-laced review, in which a neurotic, paranoid Jewish critic compared this conventionally patriotic movie to Nazi propaganda.
The protagonist, Benjamin Martin (played by a typically heroic Mel Gibson), is a composite character of the real-life South Carolinians Francis Marion, Andrew Pickens, and Thomas Sumter, and the antagonist, Col. William Tavington (played by a typically villainous Jason Isaacs), is based on the Briton Banastre Tarleton. The movie accurately depicts the internecine warfare between American revolutionaries and loyalists which took place in South Carolina, the backwoods “guerrilla” warfare which superseded pitched battles, and the decisive importance of French intervention. The Battle of Camden is briefly depicted, and the final battle is based loosely on the battles of Cowpens and Guilford Courthouse. John Williams’ martial score is full of bugles, drums, and fifes, sounding a lot like what the painting “The Spirit of ’76” looks like, and makes me long for the true sound of “The Star-Spangled Banner” – not a stylized R&B cover, but a poetic anthem backed by the full force of an orchestra.
The movie has been justly criticized for falsely depicting the British as vicious war criminals and minimizing the presence of slavery. Indeed, British tyranny was almost entirely a figment of the American imagination, even back in 1776. Prior to the American Revolution, the colonists were already the freest people in the world, and it was that freedom which gave them the self-consciousness and self-confidence to break away from their mother country and central government. When it comes to slavery, while I object to the morbid obsession with it which masquerades as “historical accuracy,” I also object to cowering from it in the name of “political correctness.” Slavery existed in the Americas because the Americas were colonized by Europe, and slavery existed in European colonies, as well as everywhere else in the world at the time and for all time. In fact, slavery’s roots in human history are so wide and deep that, technically speaking, it is arguably the natural state of humanity, and the idea that every individual is equal and has rights is the profoundly irregular, unnatural state. I refuse to live in fear of slavery – that is, of being called “racist” for refusing to erase my history, deny my existence, and abort my future – and encourage all other Americans to emancipate themselves.
Gangs of New York, 2002. The story of tribal politics in New York City, riven by mass-immigration and the Civil War, following one man’s quest to kill the man who killed his father.
Directed by Martin Scorsese.
Starring Leonardo DiCaprio and Daniel Day-Lewis.
Written by Jay Cocks, Steven Zaillian, and Kenneth Lonergan.
Composed by Howard Shore.
“Gangs of New York” is a superb movie, with the great director Martin Scorsese, the great actors Daniel Day-Lewis and Leonardo DiCaprio, and the great composer Howard Shore all at the top of their game. It is a Shakespearean tragedy set in New York City, 1863, with themes of honor, loyalty, and revenge. Portraying this period of American history was a passion project of Scorsese’s for decades; the movie is based on Herbert Asbury’s 1928 book Gangs of New York, a chronicle of the city’s immigrant-versus-native gang warfare.
The nominal antagonist, Day-Lewis’ Bill “the Butcher” Cutting, is, in my opinion, the real protagonist of the movie. At the very least, the nominal protagonist, DiCaprio’s Amsterdam Vallon, has few redeeming qualities, while the antagonist has many. In the interest of not spoiling anything, all that can be said is that, brutal as Bill is, he fights for his people and lives by a code of honor (just like his legendary enemy, “Priest Vallon,” and unlike Priest’s son, Amsterdam, who is driven by the low motive of revenge and willing to dishonor himself to have it). To borrow some lines from Bill himself, in a world of “base defilers,” it is only Bill who rises above the mob to be “a great man.” It helps, of course, that Day-Lewis is absolutely captivating as Bill, striding across the screen with charisma and machismo, unlike the skulking Amsterdam.
Unfortunately, most critics seem to have drawn the wrong conclusion from “Gangs of New York,” believing that it is merely another ethno-religious pageant of good immigrants versus bad nativists, in which democracy and diversity win out in the end. (Even contributors at VDare.com have criticized the movie along these lines, though not everyone agrees.) This is incorrect and unfair. The movie’s portrait of democracy is one of corruption and cynicism. Everyone is for democracy because everyone is cheating the system. Its portrait of diversity, likewise, is not one of vibrancy, but of squalor and despair. Immigrants are tearing down and burning the society of their host country, both figuratively and literally. Its portrait of the Civil War, even, is not one of the usual Yankee triumphalism: all of the main characters, protagonists and antagonists alike, are either uninterested in or outright opposed to the war. Scorsese, justly famous for his visual spectacles, pulls off a long tracking shot which follows Irish immigrants as they disembark from a ship and are conscripted into the Union army, then put back on a ship heading to the war front – a ship which is, at the same time, unloading coffins of Union soldiers. Far from empty-headed applause for schlock and schmaltz, “Gangs of New York,” much like Scorsese’s “Taxi Driver,” has a deeply subversive message.
Gods and Generals, 2003. The story of the first two years of the Civil War, focused on Gen. Stonewall Jackson, Gen. Robert E. Lee, and Col. Joshua Chamberlain.
Directed by Ron Maxwell.
Starring Stephen Lang, Robert Duvall, and Jeff Daniels.
Written by Ron Maxwell.
Composed by John Frizzell.
This movie is brilliant but flawed; it reminds me of a diamond with excellent color and cut, but poor carat and clarity. It is full of cumbersome and didactic writing, but it is also full of spectacular battle sequences (particularly Fredericksburg and Chancellorsville), as well as well-written, well-acted, and well-scored scenes which capture the pathos of the Civil War.
The critics do not hate it for its flaws, however, but for its virtues: it commits the thought-crime of humanizing the Southern people. In the fanatical “Battle Hymn of the Republic” school of history, there is no pathos to the Civil War: it was the smiting of evil incarnate, and the only tragedy is that the smiting was not bloodier and fierier. According to the historian Steven E. Woodworth, for instance, “Gods and Generals” is “the most pro-Confederate film since Birth of a Nation, a veritable celluloid celebration of slavery and treason.” To Woodworth, anything which presents the Confederacy in a positive light must, by definition, be “Lost-Cause mythology,” because, in his mind, there was literally nothing positive about the Confederacy – even depicting Confederate soldiers as a fearsome fighting force, a historical fact to which Union soldiers amply attested, is suspect.
All of this, of course, is quite insane. Granted, there is plenty of SCV-influenced historical revisionism in the movie (yes, slavery did play a role, and the sooner we understand how and why, the sooner we can more effectively defend our heritage and identity from anti-historical presentism and iconoclasm), but that is not the only reason critics like Woodworth hate it. They hate it because of Robert E. Lee’s reflection on what it means to fight for your homeland, the love and loyalty shown for one another by the white and black members of a Fredericksburg family, the performance of “The Bonnie Blue Flag,” and other such humanizing moments. Its predecessor, “Gettysburg” (which takes place after the events of this movie, yet was made first, in 1993), has all the same strengths and weaknesses as “Gods and Generals,” and our own Clyde Wilson, quite controversially, prefers Martin Sheen’s Lee to Robert Duvall’s.
The Alamo, 2004. The story of American folk-heroes Davy Crockett, Jim Bowie, and William Barret Travis in their last stand at the Battle of the Alamo.
Directed by John Lee Hancock.
Starring Dennis Quaid, Billy Bob Thornton, Jason Patric, and Patrick Wilson.
Written by John Lee Hancock.
Composed by Carter Burwell.
“The Alamo” is a big-budget battle movie. Prior to the battle itself, which is the movie’s set piece, there is a lot of expository dialogue on the historical background as well as the characters’ backstories, though it does not lay it on as thick as “Gods and Generals” and “Gettysburg.” Patrick Wilson, playing William Barret Travis, movingly portrays an idealistic but inexperienced young man struggling for the respect of older, harder men. Jason Patric, playing James Bowie, acts with the intensity of a rattlesnake about to strike, just like his character. Last, but not least, Billy Bob Thornton is perfect as the hard-bitten, wise-cracking Davy Crockett.
The Texas Revolution is one of many events in American history which cannot be squared with the cuckservatives’ Sunday-School lessons about our past. Where does a quasi-racial war between “Hispanic” Mexicans and “Anglo” Americans fit in with the bromide that “America” has no ethno-cultural identity or heritage, but is a “proposition nation”? Does anyone believe that Americans fought at the Alamo to “dedicate themselves to the proposition that all men are created equal”? David French, Jim Geraghty, and Kevin Williamson probably do, but imagine the laughter that such a notion would have elicited from Bowie, Crockett, and Travis!
Speaking of the Alamo and cuckservatives, an incident involving the two had a decisive impact on my then-young mind. In 2010, National Review, in an obituary for the actor Fess Parker, described Crockett (his most famous role) as a “gaudy self-promoter.” In response, “Carol L. Crockett” wrote a letter defending the memory of her ancestor from this completely uncalled-for dishonor. Ms. Crockett quoted from Jay Winik’s review of A Line in the Sand: The Alamo in Blood and Memory, which had been published in National Review nine years earlier. In that review, Winik quoted the founder and editor of National Review himself, William F. Buckley, who dismissed the revisionist history around Crockett as a “traditional debunking campaign” by “liberal publicists.” According to Buckley, “He’ll survive the carpers.” In reply to Ms. Crockett, the editors reiterated that while Crockett “died a hero’s death,” he was, in life, a “gaudy self-promoter.” “History is as simple as humanity,” they intoned, whatever that is supposed to mean. After that embarrassing exchange (there was no need to respond to her letter, and certainly not in such an arrogant and petulant manner), I promptly canceled my subscription. For years, I had read National Review with increasing dissatisfaction – mainly with its ideological neo-conservatism, but I had also come to hate the editors’ obnoxious habit of putting down well-meaning readers. They came off as schoolboys with more wit than wisdom (and not much of either, at that) who always had to get the last word. Read Chronicles and The American Conservative instead!
Like most movies about American history (at least those that are not guilt-fests over racism), “The Alamo” was critically panned, and, sadly, was a spectacular “box-office bomb.” In 2004, apparently, Americans would rather have watched yet another “Shrek” movie, yet another “Spider-Man” movie, yet another “Harry Potter” movie, and worse (mindless, tasteless garbage such as “Dodgeball,” “Starsky and Hutch,” and “Anchorman”), than a patriotic action movie about a true story so dramatic that it seems legendary – an American “Thermopylae.”
Sadly, mass-immigration – legal and illegal – has given away what was won in 1836 and 1848. Texas recently crossed the majority-minority demographic event horizon, which has already brought and will continue to bring cultural, political, and social changes. San Antonio itself, the site of the Alamo, is now a “sanctuary city” (and the kritarchy, running interference for the Left as usual, has prevented anyone from doing anything to stop that).
For what it is worth, the “Deguello de Crockett” scene is, in my opinion, one of the best portrayals of the spirit of the South. You’ll have to watch it to find out for yourself!
The New World, 2005. The founding of Jamestown, inspired by the legendary romance of John Smith and Pocahontas.
Directed by Terrence Malick.
Starring Colin Farrell, Q’orianka Kilcher, Christopher Plummer, and Christian Bale.
Written by Terrence Malick.
Composed by James Horner.
Terrence Malick, who directed and wrote “The New World,” was, refreshingly, not out to “deconstruct” the story of John Smith and Pocahontas, as many historians are eager to do these days. Just look at how “Pocahontas,” a charming children’s cartoon, is still sneered at for “neo-colonialism” and “cultural appropriation.” Whether literally true (in that their romance did indeed happen) or mythically true (in that their romance, like the Thanksgiving fable, is a metaphorical memory), Malick could not care less. To him, the story is a vehicle to illustrate how the American-Indian natives and the European colonists both discovered a “new world” at the same time. To the natives, “Europe” was as much a “new world” as “America” was to the colonists.
Speaking of “myths,” one of the most annoying and undying “myths” of American history is that the American-Indians were peaceful people before and after the arrival of Europeans. As John Smith puts it in the movie (in an adaptation of a passage written by an earlier English explorer, Arthur Barlowe), “They are gentle, loving, faithful, lacking in all guile and trickery. The words denoting lying, deceit, greed, envy, slander, and forgiveness have never been heard. They have no jealousy, no sense of possession.” This belief in the “noble savage” is a form of the “romantic primitivism” which can come over civilized people when they first encounter an uncivilized “Other.” The Romans viewed the Germanic barbarians in the forests as noble savages. The English viewed the Scottish barbarians in the highlands as noble savages. When Europeans encountered the Indians, they viewed them as noble savages, too. The idea has always been that “soft” civilized life corrupts people, while “hard” savage life ennobles people. Many Indians, probably unknowingly, have absorbed this sentimental fantasy into their own identity. (Nowadays, the “noble savage” myth is based not on philosophical beliefs about the innate virtue of humanity and the innate depravity of civilization, but rather on the innate virtue of non-white people and the innate depravity of white people.) The reality is that the Indians were anything but “noble savages.” Reviewing “The New World,” Dr. Cathy Schultz, a professor at the University of St. Francis, objected that Powhatan and his people “were far from the innocent, childlike creatures we see in the film,” but that they “ruled by conquest over the surrounding tribes.” Indeed, in 1622, the Powhatan Chiefdom, in a coordinated surprise attack across several English settlements in Virginia, massacred one-third of the white population – assuming, incorrectly, that the English would act as other Indians would after defeat in battle and simply leave.
Long before Europeans arrived, Indians fought wars of enslavement and extermination amongst themselves which, if most people knew about them, would drain their blood and chill their bones. The cause of these inter-tribal genocides? Land! In fact, most of what people think that they know about Indians comes from the “New Age” movement, which – to borrow a term – culturally appropriates them as symbols of its nebulous spirituality. Of course, none of this is to excuse the U.S. government’s own savagery and treachery in its relations with the Indians. When more than one civilization comes to occupy the same space, coexistence is impossible and conflict is inevitable, yet even so, the U.S. government often dishonored itself in that conflict.
“The New World” is a movie of little dialogue but lush visuals. According to one reviewer, the movie – a spellbinding spectacle of unspoiled sights and sounds – reflects Malick’s obsession with “Eden.” Malick has probably not read Louis B. Wright’s Colonial Search for a Southern Eden, but his portrayal of Virginia as an “Eden” is exactly how the “Cavaliers” who colonized Virginia perceived it, in stark contrast to the “Puritan” colonists up north, who feared the “howling wilderness.” Early in the movie (in a voiceover adapted from John Smith’s own words), Smith expresses his high hopes for the New World: “A world equal to our hopes, a land where one might wash one’s soul pure, rise to one’s true stature. We shall make a new start. A fresh beginning. Here all the blessings of the earth are bestowed upon all. None need grow poor. Here there is good ground for all and no cost but one’s labor. We shall build a true commonwealth, hard work and self-reliance our virtues. We shall have no landlords to rack us with high rents or extort the fruit of our labor. No man shall stand above any other, but all live under the same law.” While the Virginians were clearly optimistic – perhaps somewhat “utopian” – about what life would be like in the New World, they still identified as Englishmen and Christians and intended to maintain historical continuity with their country and church. Once again, it was the Puritans who were the real “utopians,” cutting themselves off from the world while also setting themselves above the world, in order to found a “City Upon A Hill,” or a “Christian Israel” and “Hebrew Republic.”
Lincoln, 2012. The story of the passage of the Thirteenth Amendment by the Congress near the end of Abraham Lincoln’s life.
Directed by Steven Spielberg.
Starring Daniel Day-Lewis.
Written by Tony Kushner.
Composed by John Williams.
Hear me out! The portrayal of Lincoln in the movie, which might seem hagiographic at first, is actually deceptively accurate to history. For one, the movie clearly portrays Lincoln bribing and lying (not just to politicians, but to the public) in order to get the Thirteenth Amendment through the Congress. He offers patronage to politicians in exchange for their votes. “I am the President of the United States of America, clothed in immense power!” Lincoln shouts. “You will procure me those votes!” He covers up negotiations with Confederate peace envoys (offering to restore the Union but leave slavery intact – a compromise which Northerners would have eagerly accepted), and when news of such a peace offer leaks, he lies and denies it. David Brooks, one of the “house conservatives” at The New York Times, wrote a whole column about how he hoped that “Lincoln” would not only inspire Millennials to believe in “the high vision” of politics again, but also teach them to accept the “low cunning” that politics requires. “It shows that you can do more good in politics than any other sphere,” gushes Brooks, “but you can achieve these things only if you are willing to bamboozle, trim, compromise, and be slippery and hypocritical.” I can think of few better tableaus of the corrupted, degenerated state of Conservatism, Inc. than this column on “Lincoln” by Brooks.
For another, the movie shows Lincoln’s signature rhetorical style of twisting the meaning of questions he is asked and avoiding straight answers to even the simplest of them. There were many moments when the script could have had Lincoln deliver a presentist pontification about how race is nothing more than skin color, or something, but instead he dodges the issue with a facile, folksy tale. This can easily be mistaken for homespun wisdom, but it is not hard to see that Lincoln is simply talking out of both sides of his mouth, which is annoying once it is noticed. In one scene, for example, a freedwoman who is a friend of Lincoln’s wife states, “White people don’t want us here” (to which Lincoln replies “many don’t”), and then asks him, “What about you?” Lincoln replies, “I don’t know you, Mrs. Keckley. Any of you. You’re familiar to me, as all people are. Unaccommodated, poor, bare, forked creatures such as we all are. You have a right to expect what I expect, and likely our expectations are not incomprehensible to each other. I assume I’ll get used to you. But what you are to the nation, what’ll become of you once slavery’s day is done, I don’t know.” That is another way of saying, “No, I don’t want you here, either. I’m one of those white people to whom you just referred. But I can’t just come out and tell you that.” In another scene, Lincoln lectures his Cabinet on presidential war powers, using lawyerly sophistry to deconstruct the Constitution. Afterwards, his own Secretary of the Interior comments, “You’re describing precisely the sort of dictator the Democrats have been howling about,” and asks the rest of the Cabinet, “What reins him in?” Last of all, in a fine example of Lincoln’s rhetoric, he invokes Euclidean geometry in support of his fundamentalist “all men are created equal” interpretation of the Declaration of Independence.
“Euclid’s first common notion is this: ‘Things which are equal to the same thing are equal to each other’…That’s a rule of mathematical reasoning. It’s true because it works; has done and always will do. In his book, Euclid says this is ‘self-evident.’ D’you see? There it is, even in that two-thousand-year-old book of mechanical law: it is a self-evident truth that things which are equal to the same thing are equal to each other. We begin with equality. That’s the origin, isn’t it? That’s balance, that’s fairness, that’s justice.” So if A and B are both equal to C, then, logically, A is equal to B and B is equal to A…therefore, “all men are created equal”?
To the movie’s credit, the primary antagonists (the opponents of the amendment, represented by George H. Pendleton and Fernando Wood) and secondary antagonists (the Confederate envoys Alexander Stephens, R.M.T. Hunter, and John Campbell) are not characterized as intellectually, physically, and spiritually defective, as if it were a medieval morality play, and are allowed to speak for themselves in key moments. The movie’s secondary protagonists (Republicans like William Seward, Thaddeus Stevens, and Francis Preston Blair, Sr.) are hardly lionized, either, but are characterized realistically, as a querulous politician, a fanatical ideologue, and a reluctant conservative, respectively. Daniel Day-Lewis, famously selective about his roles as well as renowned for his intense “method-acting,” transforms himself into the character and delivers a definitive Lincoln performance – not supreme, stentorian, and statuesque, but sensitive, soft-spoken, and stooped (not to mention slippery). John Williams’ score, however, is surprisingly unmemorable, though this is not a movie that really needs much of a score. All things considered, “Lincoln” is a historically accurate movie about a historically important event, whether or not we are happy with the way that everything happened.
The Witch, 2015. A New-England family, banished from their colony and living on the outskirts of civilization, is terrorized by a witch lurking in the woods.
Directed by Robert Eggers.
Starring Anya Taylor-Joy, Ralph Ineson, and Kate Dickie.
Written by Robert Eggers.
Composed by Mark Korven.
“The Witch” is an “atmospheric horror” movie that is as much about the psychological as it is the supernatural. The eponymous character does not merely prey on the family’s livestock and children, but on their fears as well, turning them against each other one by one.
The director and writer, Robert Eggers, grew up in New England. “Witches were a part of all my earliest nightmares,” he explains. “The 17th-century witch, the Puritan witch – she’s a lot more primal and a lot scarier than we ever would have imagined.” In order to make the audience really believe in the witch the way that the characters would believe in her, Eggers demanded absolute realism. The dialogue is written in an archaic style of English and spoken with heavy English accents. (Some of the actual lines are adapted from 17th-century documents, including passages from the writings of Cotton Mather and John Winthrop.) For authenticity, costumes, props, and sets were hand-constructed with period-appropriate tools. (For reference, museums were visited, archaeologists were consulted, and some of the more detailed work was outsourced to master craftsmen.) The movie was filmed out in the woods, not in a studio, and on a location so remote that cell-phone service was dead. The characters are not disdainfully “pathologized” à la Arthur Miller, either: they are not portrayed as dumb bumpkins or creepy fanatics, as religious folk are usually portrayed in entertainment media, but as the common folk of their time and place. The result was an immersive experience for the crew, the actors, and of course, for the audience.
What makes “The Witch” so frightening is not cheap jump-scares or gore-porn (there is, fortunately, neither in the movie), but the atmosphere of gloom and doom, which only grows more ominous as the story unfolds. The family lives in fear of the dark woods which surround their homestead, in fear of the devil preying on their weaknesses, in fear of a lawful god punishing them for their sins, and in fear of one another’s betrayals. “Fear itself” is not the only thing that this family has to fear, however. “The Witch” is, as Eve Tushnet comments at The American Conservative, “a powerful brew of family tragedy, religious drama, and horror show.”
I would like to name some of my favorite movies and shows, with a little bit of basic information and personal commentary (without much in the way of spoilers, of course). After that, I would like to do the same for what I think are a few of the best movies about American history – a woefully underserved genre, to say the least!
The Man Who Shot Liberty Valance, 1962. “‘Cause the point of a gun was the only law that Liberty understood…From out of the East a stranger came, a law book in his hand, the kind of a man the West would need to tame a troubled land…When the final showdown came at last, a law book was no good…The man who shot Liberty Valance, he was the bravest of them all.”
Directed by John Ford.
Starring John Wayne, James Stewart, Vera Miles, and Lee Marvin.
Written by James Warner Bellah and Willis Goldbeck.
Scored by Cyril J. Mockridge.
“The Man Who Shot Liberty Valance” has all of the features of a great Western, particularly the conflict between love and duty and the challenge of standing up for what is right even if it means standing alone. What makes this movie unique, however, is the philosophical question that it asks: on the frontier of civilization, where law has not yet been established, how is chaotic evil to be stopped – by lawful good or by chaotic good? The two protagonists, Ransom Stoddard (played by James Stewart) and Tom Doniphon (played by John Wayne), are philosophically opposed on how to deal with Liberty Valance (played by Lee Marvin), the ringleader of a gang terrorizing a frontier town. Stoddard is adamant that the establishment of law and order will put an end to Liberty’s reign of terror (and more importantly, that it is crucial to the establishment of law and order that men like Liberty not be dealt with extra-legally). Doniphon insists, however, that only brute force is capable of stopping Liberty (and that law and order cannot be established until men like Liberty have been dispatched). How does the movie answer the question? Watch and find out!
A Man For All Seasons, 1966. The story of Sir Thomas More’s refusal to accede to the English Reformation.
Directed by Fred Zinnemann.
Starring Paul Scofield, Leo McKern, Orson Welles, and Robert Shaw.
Written by Robert Bolt.
Scored by Georges Delerue.
Sir Thomas More was a philosopher, and although not a king himself, a philosopher to the king, so to speak. More was an advisor to King Henry VIII and helped him write public polemics against Martin Luther. For refuting Luther, Pope Leo X titled Henry “Defender of the Faith.” When the Catholic Church would not grant Henry the divorce he desired, this erstwhile “Defender of the Faith” broke with Rome and set himself at the head of his own church. More, however, who was serving as Lord Chancellor of England at the time, refused to endorse Henry’s divorce or acknowledge Henry’s supremacy, for which he was accused of heresy and treason.
Nowadays, there is a so-called “Resistance” against Pres. Donald Trump, which despite its insufferable self-regard is, in reality, merely the system mustering all of its money and power to destroy any real resistance. The Resistance is a corporatist, elitist, globalist counter-revolution to the nationalist, populist, traditionalist revolution which Trump unwittingly incited in 2015. No member of the Resistance is risking anything. Indeed, everyone from public figures to private individuals is free to defame the President in the vilest terms without any fear of consequences whatsoever. Even illegal aliens parade around in public, complaining about oppression as they flaunt their crimes and trumpet their rising numbers. At the same time, those loyal citizens who agree with the President’s “isolationist,” “nativist,” and “protectionist” agenda are subject to life-destroying harassment by alt-left goon squads, which often results in getting fired from their jobs, doxed on social media, and physically assaulted in the streets. What sort of “fascist regime” is this? On the contrary, it is unvarnished “anarcho-tyranny.”
More, in his day and age, was a part of a real “Resistance.” He and a few other Catholic individuals (who have all been sainted since) took conscientious, principled stands against a tyrannical king and religious fanaticism. “A Man For All Seasons” does this inspiring story justice. The script is essentially one long debate on conscience, ethics, and law, with enough wisdom to have been written by More himself. It is one of the few movies out there that is truly educational, edifying, and uplifting – intellectually, morally, and spiritually – to watch.
The Star Wars Trilogy, 1977-1983. A space opera drawn from world mythology.
Directed by George Lucas.
Starring Mark Hamill, Harrison Ford, and Carrie Fisher.
Written by George Lucas.
Scored by John Williams.
“Star Wars,” though a space opera, is not really of the science-fiction genre, but more of the fantasy. Science fiction is about exploring the consequences of scientific innovation, which is why it is alternatively known as “speculative fiction,” yet “Star Wars,” though outwardly futuristic, is not about the future, but the past. Specifically, “Star Wars” is modeled on what Joseph Campbell, a scholar of comparative mythology and religion, terms the “Hero’s Journey,” which is a story – or “monomyth” – that can be identified in all world cultures. George Lucas was heavily influenced by Campbell’s theories, and drew on the Hero’s Journey to tell his own story. As Lucas explained to Campbell’s biographers, “What’s valuable for me is to set standards, not to show people the world the way it is. Around the period of this realization, it came to me that there really was no modern use of mythology. The Western was possibly the last generically American fairy tale, telling us about our values. And once the Western disappeared, nothing has ever taken its place. In literature we were going off into science fiction, so that’s when I started doing more strenuous research on fairy tales, folklore, and mythology, and I started reading Joe’s books. Before that I hadn’t read any of Joe’s books. It was very eerie because in reading The Hero with a Thousand Faces, I began to realize that my first draft of ‘Star Wars’ was following classic motifs. I modified my next draft of ‘Star Wars’ according to what I’d been learning about classic motifs and made it a little bit more consistent.” So although “Star Wars” may take place “in a galaxy far, far away,” it also takes place “a long time ago.”
John Williams’ score, simply put, makes “Star Wars.” Without it, the movies would have been unable to overcome the sci-fi kitsch. It is hard to take some of the special effects of the movies seriously, but it is impossible not to take the score seriously. The music is present throughout most of each movie – each opens with a magnificent overture – and underscores much of what is actually happening at that moment. Indeed, Williams’ scores are so iconic that they often come to define whole movies themselves.
The original “Star Wars” trilogy, from 1977 to 1983, was lightning in a bottle. The archetypes, patterns, and themes of its story touch our “mythic imagination,” which is why they resonate so deeply. The writing and scoring of the movies are memorable, full of quotable lines and hummable tunes. The casting is perfect (could there be a wiser mentor than Alec Guinness, a darker-sounding adversary than the voice of James Earl Jones, or a more heroic-looking hero than Mark Hamill?), and the acting only gets better with each movie. The prequels that Lucas made from 1999 to 2005 were poorly written and acted, but were at least earnest in trying to tell a new – far more modern and less mythic – story. The ongoing sequels that Disney is making, however, do not seem to understand anything about what made “Star Wars” great, and feel like cash-ins and rip-offs.
Excalibur, 1981. The Legend of King Arthur.
Directed by John Boorman.
Starring Nigel Terry.
Written by John Boorman.
Scored by Trevor Jones.
“King Arthur,” unfortunately, is one of those stories which is endlessly adapted in bad faith, like Robin Hood (who has recently been turned into an antifa punk) and Sherlock Holmes (who has recently been turned into a man-child). “Excalibur,” however, is a faithful adaptation which actually wants to retell the story to a modern audience, not trade on its name to tell a different story altogether. Specifically, it is based on Thomas Malory’s Le Morte d’Arthur, which was first published in the 15th century.
“Excalibur” features a big cast of British actors before they became famous in the movies, including Gabriel Byrne (as King Uther Pendragon), Helen Mirren (as Morgana Le Fay), Liam Neeson (as Sir Gawain), Patrick Stewart (as King Leodegrance), Nicol Williamson (as Merlin), and more. In order to make the movie feel more mythic and less realistic, there is little in the way of characterization or dialogue, and much in the way of music and imagery.
Speaking of music, the score to “Excalibur” is fantastic. Richard Wagner and Carl Orff: who better for a score to one of the greatest myths of all time than the composers of “Siegfried’s Funeral March” and “Fortuna Imperatrix Mundi” (both of which feature prominently in the movie)?
American History X, 1998. A story of redemption and damnation, as an older brother recently released from prison struggles to save his younger brother following in his footsteps.
Directed by Tony Kaye.
Starring Edward Norton and Edward Furlong.
Written by David McKenna.
Scored by Anne Dudley.
“American History X,” set in Venice, Los Angeles, is the story of Derek Vinyard (played by Edward Norton) from the point of view of his younger brother, Danny (played by Edward Furlong). Derek is meant to be a sympathetic figure. He is a bright young man with a frightful temper who turns to the neo-Nazi movement after his father, a firefighter, is murdered by a drug dealer while putting out a fire at a drug den. Even after he becomes a neo-Nazi, however (complete with a shaved head and tattoos!), his bravery, charisma, and intelligence remain irresistible, in stark contrast to the other skinheads depicted in the movie, who are indeed mere bigots and cowards.
“American History X” can be, and often is, interpreted as a mere homily against racism, but it is much more than that. It is a tragedy. What happens to Derek is tragic. What happens to Derek’s family as a result of what happens to him – no spoilers! – is tragic. Yes, the message of the movie – “hate is baggage” – is somewhat moralistic and simplistic, yet rather than demonized, “haters” like Derek are humanized, which is what makes the movie a tragedy and not a medieval morality play. Incredibly, the black gangs of the movie are not portrayed as blameless, helpless victims, but just as thuggish as the white gangs. “American History X” is a refreshingly sensitive and thoughtful criticism of racism.
The Believer, 2001. The story of a prodigious and prodigal yeshiva student who hates his own people as a perverse act of love.
Directed by Henry Bean.
Starring Ryan Gosling.
Written by Henry Bean.
Scored by Joel Diamond.
“The Believer” is similar to “American History X” in that it is a story about a neo-Nazi (Danny Balint) who finds redemption, anchored by an absolutely captivating performance from its lead actor (Ryan Gosling). By day, Danny is a thug getting into trouble with his gang. By night, he is a Jewish boy living at home with his father. Danny’s identity crisis began in yeshiva, where he was expelled for blasphemy: he hated God for his cruelty and hated the Jews for their passivity, both exemplified in the infamous story of the Binding of Isaac.
Danny is not a paranoid anti-Semite, afraid of and angry at Jews merely because they happen to be different from him. On the contrary, as a Jew and an educated Jew at that, he has a sophisticated understanding of what it means to be Jewish, which informs his esoteric, intricate theories of anti-Semitism. (Much of what Danny says about Jews, in fact, is much of what Jews have said about themselves – for example, Yuri Slezkine’s The Jewish Century, which won the National Jewish Book Award in 2005.) Danny is articulate and intelligent (Gosling’s portrayal of his intensity and insecurity is irresistible), but he lacks self-awareness and self-control, and has a tendency to push his arguments too far. When he finally has a chance to take action, however, he is conflicted, and realizes that his hatred of his own people is, strangely, rooted in his love for them.
Alas, the compelling-yet-disturbing character of Danny was too much for the Simon Wiesenthal Center’s Rabbi Abraham Cooper, who accused the movie of anti-Semitism. As a result, “The Believer,” despite winning the Sundance Festival’s Grand Jury Prize, was dropped by Paramount Pictures. When the director (who was Jewish) complained about “Jewish paranoia” making it impossible for him to find another distributor, the SWC’s Rabbi Marvin Hier accused him of implying that Jews control the media. Incidents like this remind me of one of Norm MacDonald’s old jokes back when SNL was funny. “Marlon Brando said on ‘Larry King Live’ that Hollywood is ‘run by Jews,’” quipped MacDonald. “Brando met with Jewish leaders to apologize for his comments. They have accepted his apology and announced that he is now free to work again.”
The Lord of the Rings, 2001-2003. An adaptation of J.R.R. Tolkien’s monumental masterpiece, which was influenced heavily by his lifelong study of Germanic mythology and singlehandedly invented the genre of fantasy.
Directed by Peter Jackson.
Starring Elijah Wood, Viggo Mortensen, and Ian McKellen.
Written by Fran Walsh, Philippa Boyens, and Peter Jackson.
Composed by Howard Shore.
“The Lord of the Rings” movie trilogy is a remarkably faithful adaptation: the script is full of Tolkien’s own lyrical prose, follows the plotlines as closely as possible, and includes an unbelievable degree of attention to detail when it comes to lore. I recently learned, for example, that the choral singing in “The Revelation of the Ringwraiths” (the theme that plays in dramatic moments featuring the Ringwraiths, ancient kings of men corrupted by their rings of power) is a poem written by one of the screenwriters, Philippa Boyens, which she translated into Adunaic (an archaic human language which Tolkien invented) and to which the composer, Howard Shore, gave a choral and orchestral setting.
Speaking of the music, the trilogy’s score is suitably epic (not epic in the modern sense, e.g. “This pizza is epic,” but epic in the literal sense, i.e. worthy of a heroic saga). Although lots of Hollywood-style action was added to make the movies exciting to mass audiences, the action sequences are genuinely entertaining. Furthermore, live actors, location shooting, and physical effects are preferred to CGI, which is used only when appropriate. The casting is uniformly perfect, launching (Viggo Mortensen as Aragorn), revitalizing (Elijah Wood as Frodo), and crowning (Ian McKellen as Gandalf) careers across the board. Last, but not least, “The Lord of the Rings” contains no degenerate or subversive content (not even feminist tropes, e.g. the woman who is better than the men at everything, or token diversity, e.g. casting black actors in white roles), a restraint that would have pleased the traditionalist Roman-Catholic Tolkien.
Peter Jackson’s majestic “The Lord of the Rings” is the perfect alternative to HBO’s gory porno “Game of Thrones.” It is incredible that such a movie trilogy was even made, so savor it, because it will not happen again any time soon. Jackson’s recent adaptation of Tolkien’s “The Hobbit” (complete with oversaturated CGI, cartoonish action, feminist tropes, non-white tokens, and even a recycled score) is proof of that.
Alexander, 2004. The story of the man who united most of the known world under his rule.
Directed by Oliver Stone.
Starring Colin Farrell.
Written by Oliver Stone.
Scored by Vangelis.
“Alexander” is a strange departure for Oliver Stone, who is famous for counter-cultural movies like “Platoon,” “Wall Street,” and “JFK” (all good, by the way). There is nothing counter-cultural about “Alexander,” however; it is downright hagiographic. The movie is narrated by Ptolemy (who gained control of Egypt in the civil war that broke out among Alexander’s generals after his death) as he dictates the memoirs that would be lost to history in the burning of the Great Library of Alexandria. Colin Farrell, who plays Alexander, is a fine actor who is often savaged by critics for no apparent reason. Vangelis’ one-man electronic score is amazing, as usual, somehow managing to sound as if he is conducting the heavens themselves.
When I was a young, dumb, ugly libertarian, I hated Oliver Stone for what seemed like warmed-over “socialism” to me, though I loved his exposures of U.S. foreign policy (see especially his “Putin Interviews” and “Ukraine on Fire”). As I grew up, however, the “cultural contradictions of capitalism” became apparent to me (how the so-called “creative destruction” of capitalism actually undermines everything which conservatives supposedly wish to conserve), and I became far more tolerant of anti-capitalists, even if they have problems of their own. Likewise, I used to hate Alexander, a megalomaniacal warlord who burned and bled the world for no reason other than his own glory, yet became too much of a degenerate to rule effectively and left his generals to fight over his empire after he died. I distinctly remember walking back to my dormitory after a lecture on Alexander, troubled by how he could have destroyed a city like Persepolis. Nevertheless, Alexander was one of the most important figures in all of history. He was brave (he fought alongside his men) and intelligent (he was tutored by Aristotle), as well as charismatic and eloquent (he spurred his men onward and shut down mutinies). Where he destroyed, he also built: he was a founder of cities as well as a patron of the arts and sciences. By uniting the world, however briefly, he set the West to “Hellenizing” the East and the East to “Orientalizing” the West, creating a new “Hellenistic Civilization.” All of this is why Alexander is known as “the Great,” not “the Good.” A movie about someone like that cannot be anything other than interesting.
The Dark Knight Trilogy, 2005-2012.
Directed by Christopher Nolan.
Starring Christian Bale, Michael Caine, Gary Oldman, and Morgan Freeman.
Written by Christopher Nolan.
Scored by Hans Zimmer.
The “Dark-Knight” trilogy, which includes “Batman Begins,” “The Dark Knight,” and “The Dark Knight Rises,” is the greatest set of supervillain movies. Each movie features a supervillain who presents a radical critique of modern society and its secular-humanist beliefs, written so compellingly by Christopher Nolan that he cannot just be rejected as “psycho.” In “Batman Begins,” the supervillain is Ra’s al Ghul, the head of a shadowy cult which, throughout history, pushes decadent civilizations on the brink of collapse over the edge, in order to end the Dark Age and begin a new Golden Age. This pessimistic theory that history is essentially cyclical (“hard times make strong men – strong men make good times – good times make weak men – weak men make hard times”) is in stark opposition to the optimistic theory that history is essentially progressive (i.e. things are always getting better). In “The Dark Knight,” the supervillain is the Joker, who traps Batman in twisted experiments designed to challenge his faith in mankind. The Joker, pessimistically, believes that people are basically evil, not basically good, and tries to prove that by showing what happens when chaos disrupts the law. In “The Dark Knight Rises,” the supervillain is Bane, who combines the League of Shadows’ pessimistic theory of history with the Joker’s pessimistic view of humanity. To make an example of Gotham City – and the optimistic faith in historical progress and human goodness – Bane incites the worst of the underclass against the worst of the ruling class, pitting anarchy against tyranny, then stands back and lets the world watch as the revolution consumes itself.
Apparently, David Boreanaz (“Angel” from “Buffy the Vampire Slayer”) was originally cast as Batman, but Christian Bale edged him out in the end. The hunky Boreanaz looks the part more than the leaner Bale does, but Bale is such a good actor that it is hard to complain. Hans Zimmer, who has a team of composers working for him at a studio, produced a score for the trilogy, which has its moments, but is mainly there to punch up the trailers. Zimmer’s team has done much better work on other favorites of mine, such as “The Last Samurai” and “Inception.”
300, 2006. Elite Spartan warriors, led by their king, defend a mountain pass against the invading Persian army – a sacrifice which rallies all of Greece to resistance.
Directed by Zack Snyder.
Starring Gerard Butler.
Written by Zack Snyder.
Scored by Tyler Bates.
“300” is the story of the most famous last stand in history – perhaps the last stand that inspired them all. King Leonidas and his bodyguard were willing to give up their individual lives for the sake of their people’s continued existence. Their story is considerably “sexed up,” in this case, but it still pays tribute to their very real sacrifice and exemplifies their very real virtues.
As with “Watchmen,” Zack Snyder pulled off the rare feat of improving on his source material. Frank Miller’s graphic novel is full of ugly art (all the characters, even the Spartans, look downright simian) and neo-conservative propaganda (trying to equate the modern War on Terror with the ancient Graeco-Persian wars). Snyder’s movie, by contrast, has a beautiful cast (Gerard Butler, Lena Headey, and other heroes literally look like Greek statues), and makes it clear that the Spartans are fighting for their own people on their own land (not neocon abstractions like “reason” over “mysticism”). Tyler Bates’ score is a veritable wall of booming percussion, soaring choruses, and roaring brass worthy of such a legendary battle.
I matriculated at Bucknell University in 2006 (the year that “300” was released), already intending to major in Classics. Naturally, the movie was often the subject of heated discussion among the students and teachers alike, mostly having to do with its historical inaccuracy and fantastical style. I, however, was one of the few who defended the movie. For one, I thought that the historical accuracy of the movie was overlooked and underrated. For another, I thought that the movie’s fantastical style gave it a sort of “meta” historical accuracy. No, the Spartans were not godlike heroes who fought in the nude, but they were remembered as godlike heroes by the Greeks, who depicted their heroes as nude. The Spartans were, of course, a highly effective fighting force and the only Greeks who could have held out at Thermopylae. No, the Persians were not subhuman or even inhuman monsters, but they were remembered as such monsters – “barbarians” – by the Greeks. The Persians did, of course, have a very different civilization from the Greeks (who had never encountered so many different cultures before and could not conceive of a single centralized state ruling multiple nations). When it comes to mythic memory (and much of the history of the Battle of Thermopylae is mythic), therefore, “300” is historically accurate. In fact, the most quotable lines from “300” are ripped right from the pages of Herodotus.
Flags of Our Fathers, 2006. A companion movie to “Letters from Iwo Jima” – the story of the Battle of Iwo Jima told through the soldiers from the famous flag-raising photograph.
Directed by Clint Eastwood.
Starring Ryan Phillippe, Jesse Bradford, and Adam Beach.
Written by Paul Haggis.
Scored by Clint Eastwood.
The protagonists of this movie need no lionization: they are the subjects of one of the most iconic images of World War II. What this movie does is tell the story of who they were, what happened to them, and how their overnight celebrity changed them.
Letters From Iwo Jima, 2006. A companion movie to “Flags of Our Fathers” – the story of the Battle of Iwo Jima told through the letters that the Japanese soldiers wrote home.
Directed by Clint Eastwood.
Starring Ken Watanabe.
Written by Paul Haggis and Iris Yamashita.
Scored by Kyle Eastwood.
In World War II, the principal Axis Powers were Germany, Italy, and Japan – three of the most beautiful civilizations in world history. Whatever threat Hitler’s Nazis, Mussolini’s Fascists, and Tojo’s Imperialists posed to world peace (one which, I suspect, was trumped up by their geopolitical rivals, the Soviets, British, and Americans, cf. Pat Buchanan and Peter Hitchens, or Charles Beard and Herbert Hoover), I simply refuse to believe that the highly cultured Germans, Italians, and Japanese were such barbaric enemies. Certainly the rank-and-file soldiers themselves were not fire-breathing, blood-thirsty warmongers, but like men in all wars, just doing their duty when their country called. “Letters From Iwo Jima” is a significant step towards humanizing one of the most-demonized American enemies, about whom crude war propaganda is still widely believed. (Every August 6th, for instance, neo-conservative chickenhawks and other fat-headed jingoists ritually remake the case for vaporizing the city of Hiroshima.)
The score has an authentic Japanese sound – sensitive and minimalistic – and is absolutely haunting, befitting a story about a garrison that was practically killed to the last man. There were parts of the movie that, quite literally, moved me to tears.
Apocalypto, 2006. The story of one family’s struggle for survival amid the beginning of the end of Mayan civilization.
Directed by Mel Gibson.
Starring Rudy Youngblood and Dalia Hernandez.
Written by Mel Gibson and Farhad Safinia.
Scored by James Horner.
Mel Gibson and Farhad Safinia, who met while the latter was working as an assistant during the post-production of “The Passion of the Christ,” had a mutual love of the action-chase genre. “We wanted to update the chase genre by, in fact, not updating it with technology or machinery,” explained Safinia, “but stripping it down to its most intense form, which is a man running for his life, and at the same time getting back to something that matters to him.” At the same time, they wanted to tell a larger story about the fall of civilization set in Mesoamerica prior to the arrival of Europeans. They believed that the same forces that undermined the Maya – environmental degradation, overpopulation, internecine warfare, political corruption, and socioeconomic inequality – remained as relevant as ever, and wanted to use an ancient culture like the Maya to illustrate, starkly and shockingly, those parallels. As Gibson put it, the 15th-century setting is “merely the backdrop” for “civilizations and what undermines them.” They did not want the movie to be entirely pessimistic, however, and explained that the title (which does not make sense until the final scene) literally means “a new beginning or an unveiling – a revelation.” According to Gibson, “Everything has a beginning and an end, and all civilizations have operated like that.”
“Apocalypto” is remarkable for its authenticity. First, there is a complete absence of CGI effects and sets. The movie was shot entirely on location in the jungles of Veracruz, as well as on a set modeled after the sites of ancient Mayan cities which Gibson and Safinia had visited in the Mirador Basin. The effort that went into creating that cityscape, from the attention to detail in the various domestic and economic structures and materials, to the reconstruction of a plaza with a step-pyramid (modeled after that of Tikal), was monumental. In addition, a team of artists based all of the movie’s costumes, hairstyles, makeup (such as piercings and tattoos), and props on archaeological sources, such as ceramics and murals. Second, the cast is composed entirely of Indian actors and actresses, almost none of whom had any prior acting experience and relied heavily on Gibson’s skillful directing. Third, just as “The Passion of the Christ” was written in Aramaic, “Apocalypto” is written in Yucatec-Mayan, an obscure indigenous language which is the closest possible approximation of the language the characters would have spoken.
“Apocalypto” is also remarkable for its historical accuracy. Of course, a few historical liberties are taken here and there for dramatic effect, but nothing which misrepresents who the Mayans were and what their world was like. Nevertheless, as is the case for any movie that is not pornography about American slavery or the Holocaust (whoever fact-checked “Roots” or “Schindler’s List”?), the movie was nitpicked to death. Richard Hansen (a professor of Mesoamerican Studies and the historical adviser for “Apocalypto”) defended the movie from criticism, and in a published article argued that while there were a few anachronisms in the movie, the criticisms in question were rooted in “relativism,” “revisionism,” and “aboriginalism” among academics-turned-activists. For instance, while many of these critics claimed that the Maya did not practice ritual sacrifice, Hansen demonstrated that the archaeological and documentary evidence is to the contrary (and that the movie was inaccurate only insofar as it omitted even worse details, such as flaying the corpses for human decoration or butchering the corpses for human consumption). “Apocalypto will be judged in time as a cinema masterpiece, not only in its superb execution of film production, but also as an allegorical reference to the present,” argued Hansen. “The criticisms, which were both accurate and fallacious, will continue to surround this film due to its unique story, the extraordinary setting, the allegorical and metaphorical references, and the various levels of awareness that are inherent in the film regarding the human saga.”
“Apocalypto” begins with a quote by the historian Will Durant: “A great civilization is not conquered from without until it has destroyed itself from within.” Likewise, a great man is not conquered from without until he has destroyed himself from within. The summer before “Apocalypto” was released, Gibson was arrested for speeding along the Pacific Coast Highway with an open bottle of tequila in his car. Gibson’s profane, obscene, and above all drunken tirade had been recorded by the police, and when it was leaked to the media it was so humiliating that Hollywood blacklisted him and his wife divorced him. (To learn more about what happened that night, watch his interview with Diane Sawyer.) A few years later, Gibson’s partner claimed that he had been physically and verbally abusive, leaking recordings of the latter that worsened his already ruined reputation. Only recently, after a decade of alcohol-recovery and anger-management therapy, has this immensely talented actor and director begun to make a comeback, with the critically acclaimed and award-winning movie “Hacksaw Ridge.” Unfortunately, Gibson’s public self-destruction simultaneously overshadowed and tainted “Apocalypto,” which did not win many awards and is now out of production – a travesty against this artistic masterpiece.
No Country for Old Men, 2007. A suspenseful tale of hunter and hunted and good and evil, across the ghostly landscape of West Texas.
Directed by Joel and Ethan Coen.
Starring Josh Brolin, Javier Bardem, Tommy Lee Jones, Woody Harrelson, and Kelly Macdonald.
Written by Joel and Ethan Coen.
Scored by Carter Burwell.
“No Country for Old Men” is the Coen Bros.’ adaptation of a book of the same name by Cormac McCarthy, thus uniting two of our best modern directors with one of our best modern authors. The movie is also a combination of two distinctive American genres, noir and Western – a Noir-Western. The antagonist, Anton Chigurh (played chillingly by Javier Bardem), is a personification of “Unstoppable Evil” and a demonic “Angel of Death.” He shows no human empathy to those whose lives are in his hands and appears apathetic about the lives that he takes – accordingly, there is nothing sympathetic about him. The protagonist, Llewellyn Moss (played by Josh Brolin), is a morally gray man who, by chance, comes into conflict with very evil men. At first, Moss tries to run from the relentless Chigurh, but chance or fate seems to keep bringing them together, and Moss decides that his only hope of survival is to hunt the hunter. As Sheriff Bell (played by Tommy Lee Jones) tracks the trail of blood that they leave, trying to help Moss and stop Chigurh, he becomes increasingly disillusioned along the way. “No Country for Old Men” is an engrossing thriller.
The burning of Notre-Dame de Paris was a catastrophe and a tragedy – not just to Paris, not just to France, but to the world. “We are filled with emotion and our hearts are broken,” according to UNESCO Director-General Audrey Azoulay. “Notre Dame represents an architectural, cultural, and religious heritage, a unique literary heritage that speaks to the whole world.”
Notre-Dame de Paris (literally “Our Lady of Paris”) is a prime symbol of the French nation, the Catholic church, and Western civilization. Construction began in 1163 A.D. and was carried on generation after generation until completion in 1345. It is considered to be, next only to the Notre-Dame de Chartres and the Notre-Dame de Reims, one of the finest examples of French Gothic architecture, with its extensive and innovative usage of flying buttresses, rib vaults, stained-glass rosettes, and sculptural decorations.
In 1548, Protestant Huguenots vandalized it for its “idolatry,” and in 1793, atheistic Jacobins vandalized it for its “superstition,” but Notre Dame survived both of these revolutions. France was once the most faithful of all the kingdoms of Europe, known as “the eldest daughter of the church” for early Christian monarchs like King Clovis (who converted the Franks to Catholicism) and Emperor Charlemagne (who unified the Catholic church and the French state). Yet even as France in particular and Europe in general have undergone the strange post-WWII process of secularization/de-Christianization and diversification/Islamification, Notre Dame still stands proudly as a patriotic symbol of Paris and of France.
In “The Wound at the Heart of Paris,” Rachel Donadio, a Paris-based staff writer for The Atlantic, describes what it was like to witness the Notre Dame go up in flames:
To those of us who live in Paris, Notre-Dame is as familiar as a landscape, and as solid as a mountain. How could it have burned so fast? I walk past it so often. I like it best at night, when the sculptures on the outside come alive under the spotlights, the gargoyles and saints and the few fallen angels plunging upside down from heaven above the central door.
In barbaric “Amerika,” ethno-masochistic white majorities ceremoniously destroy works of art in order to appease the tribal grievances of ethno-centric non-white minorities which still hate them anyway. According to the Southern Poverty Law Center, at least 113 Confederate monuments have been destroyed since 2016 alone – irreparable and unforgivable damage to Southern heritage and identity. In civilized France, however, such hyper-ideological iconoclasm is unwelcome. On the contrary, the French government has an entire ministry – the Ministry of Culture and Communication – devoted to preserving and restoring the nation’s heritage sites.
When, at the height of the “gilets jaunes” protests in Paris, the Arc de Triomphe was vandalized, it became a serious scandal that made the news and got the French President personally involved. By contrast, when mobs of American “antifa” overthrew the 106-year-old “Silent Sam” statue at the University of North Carolina, the authorities defied the will of the people (70% and 50% of North Carolinians were against illegal and legal removal, respectively) by doing nothing to stop it and even going along with it.
In “The Cathedral: Mirror of the West, Then and Now” at National Review, the (somewhat-reformed) neo-conservative Victor Davis Hanson reacts to this angry, ugly American phenomenon:
The contemporary West is in an age not of builders but dismantlers. We topple statues by night and rename streets, squares, and buildings – now judged wanting by our postmodern, always metastasizing standards of race, class, and gender – to virtue-signal our angst over our preindustrial moral superiors. Most silently acknowledge that few of us could have endured the physical hardship, pain, or danger of guiding three tiny 15th-century caravels across the Atlantic or could have walked the length of California founding missions. Discovering the New World was difficult, but a dunce can topple Columbus’ statue. How many contemporary American monumental buildings will last for the next 800 years?
The French would never dream of desecrating the symbols of their history just to appease, say, Algerian, Moroccan, and Tunisian immigrants. In fact, it was in France where historical conservationism originated, and it was Notre Dame itself which was the subject of the first historical-conservationist campaign. The famous French author Victor Hugo was revolted by the demolition of classic, historic French architecture and its replacement with buildings of a derivative, pseudo-“classical” style – particularly the neglect of the “majestic and sublime” Notre Dame, “the aged queen of our cathedrals.” “All manner of profanation, degradation, and ruin are all at once threatening what little remains of these admirable monuments of the Middle Ages that bear the imprint of past national glory, to which both the memory of kings and the tradition of the people are attached,” lamented Hugo. “While who knows what bastard edifices are being constructed at great cost (buildings that, with the ridiculous pretension of being Greek or Roman in France, are neither Roman nor Greek), other admirable and original structures are falling without anyone caring to be informed, whereas their only crime is that of being French by origin, by history, and by purpose.” According to Hugo, “A universal cry must finally go up to call the new France to the aid of the old!” Hugo’s novel, Notre-Dame de Paris (“The Hunchback of Notre Dame” in English), combined with his essay, “Guerre aux Demolisseurs!” (“War on the Demolishers!”), galvanized public outrage over “the numberless degradations and mutilations which time and men have both caused the venerable monument to suffer,” leading to its first restoration in 1844.
Alas, in a cruel twist of irony, it was the latest round of restoration work that appears to have caused the fire, though exactly what happened is still unknown.
Thankfully, the fire was extinguished before all of Notre Dame was leveled, although it was a very close call. “I say this to you solemnly this evening: We will rebuild this cathedral,” swore Pres. Emmanuel Macron the night of the fire. “We will rebuild because it is what the French people expect, because it is what our history deserves, because it is our profound destiny.” The roof (made from a veritable “forest” of trees) and spire (reconstructed in the 19th century) are cinders. Yet the towers and much of the stained glass (such as the “South Rose Window”), along with many artworks and relics (such as the legendary “Crown of Thorns”), are safe. In 2010, an art professor at Vassar College, Andrew Tallon, laser-scanned the exterior and interior of Notre Dame on his own initiative – data which will be an invaluable resource in the reconstruction. As Sophie Gilbert, writing “All Isn’t Lost” in The Atlantic, observes, “The saving grace of Monday’s tragedy is that the stone structure of Notre-Dame still stands, that most of its treasures seem to have been saved in time, that none of the 400 firemen who fought the blaze for nine hours lost their lives, and that much of the interior to the cathedral seems to have survived, including the three astonishing rose windows.” According to Gilbert, “None of this makes the Notre-Dame fire less catastrophic, or less of a wound to the soul of Paris, but it’s comforting, maybe, to consider how many sites have recovered from the grievous damage of natural and man-made disasters.”
Yet despite the admirable French culture of historical conservationism, this disaster has brought forth many ambitious Ellsworth Tooheys. “France Debates How to Rebuild Notre-Dame, Weighing History and Modernity,” reports The New York Times:
Jean-Michel Wilmotte, a French architect who recently designed a Russian orthodox cathedral in Paris, told FranceInfo radio on Thursday that rebuilding a “pastiche” of the destroyed spire, which was added to the cathedral in the 19th century, would be “grotesque.”
Indeed, some of the architects who have entered France’s international reconstruction competition have proposed rebuilding the roof out of glass and steel (in the style of an Apple Store) and replacing the spire with a minaret (in the style of a mosque). Granted, modern materials and methods should probably be used for safety’s sake, and no reactionary criticism of modernity is intended here, but giving Notre Dame a “modern makeover” would be an act of sheer vanity. There is nothing wrong with modern buildings being built in modern styles. Architecture, like any other form of art, should be a reflection of its time and place, and not merely imitative. Notre Dame is not a modern building built in a modern style, however, but rather an old building built in an old style that reflects its time and place. Thus, the rebuilding of Notre Dame is not supposed to be an opportunity to “update” Notre Dame’s style, but simply to reconstruct it in its original style. It is that building in that style which has made Notre Dame one of the world’s “most popular tourist attractions” and a “world heritage site.” Reconstructing a French Gothic cathedral like Notre Dame, an absolutely exemplary reflection of its time and place, to reflect a different time and place altogether, robs it of its very identity and defeats the very purpose of its existence. What would be the point of reconstructing Notre Dame as something other than Notre Dame? In short (and pardon my French), “Build your own damn building!”
“Conservation,” as opposed to “creation,” requires the repression of these narcissistic impulses, a quality lacking in many modern artists and architects. As Steve Sailer quips in Taki’s Magazine, “Ironically, they don’t realize that contemporary architects with their egomaniacal hatred of tradition represent white maleness at its most Promethean and annoying.” Modernist narcissism is not the only threat to Notre Dame’s historical integrity, however. There is also “ethnic sado-narcissism,” which the Claremont-based American Greatness defines as “a demonstration of self-love that perversely manifests itself as a desire to denigrate and punish the broader ethnic group to which one belongs.” According to American Greatness, “Among all ethnic groups, only white liberals are biased against their own ethnic group.”
“Give Notre Dame a Modern Roof the Alt-Right Will Hate,” the ethnic sado-narcissist Erika Harlitz-Kern demands in The Daily Beast.
While Notre Dame was burning, conspiracy theories began to surface, and declarations were made of the fire portending the end of Western civilization and its Judeo-Christian values. Instead, what was happening was the destruction of a medieval past that has, in all honesty, never existed
What does Harlitz-Kern mean by “alt-right”? Does she mean the ghettoized Internet subculture of anonymous trolls with Nazi anime-chick avatars? (Because it is not worth redesigning a world-famous cultural, historical, and spiritual monument just so a “hate group” will “hate” it.) Or is she vilifying anyone who loves history and wants to see Notre Dame faithfully reconstructed as “alt-right”? (Because that is illogical and unfair.) Harlitz-Kern is trying to bolster her shoddy argument by implying that the opposing argument is “alt-right,” and thus evil.
Western civilization is a term that grew out of the creation of history as a topic of study at the universities in England, Germany, and France in the 19th century. In her book History. Why It Matters, historian Lynn Hunt states that “history grew as an academic discipline in tandem with nationalism and a growing conviction of European superiority over the rest of the world.” This conviction led to the West “being portrayed as the source of technical innovation and cultural advancement,” also known as “modernity.”
Harlitz-Kern is resorting to semantics – a form of sophistry – in an attempt to deconstruct Europe’s sense of self-consciousness and self-confidence. The fact that the terms used today to refer to Europe were only recently coined does not mean that Europe had no sense of itself prior to the 19th century or that its sense of itself was formed in colonialist, imperialist, and “white supremacist” contexts. Before there was “the West,” there was “Christendom,” which from the 11th to the 13th centuries waged “the Crusades” to defend Christian civilization from various Islamic caliphates and continued to fight against the Ottoman Empire until the 17th century. That is evidence of a sense of self. Furthermore, it is not even true that the idea of “the West” was invented by proto-Nazis in the late-1800s. It originated in the late 4th century A.D., when the Roman Empire split into “East” (where it survived until the Ottoman conquest in 1453) and “West” (where it collapsed from unstoppable “barbarian” immigrations and invasions, though it was reunited under Pater Europae Charlemagne in the 8th century). The fact that in 331 B.C. Alexander the Great, a Macedonian king from the west, crowned himself “King of Asia” when he conquered the Persian Empire to the east and made enemies trying to forge a “brotherhood of man” among the manifold nations under his rule, suggests that an “East-West” self-awareness originated even earlier, perhaps even stretching back to that first “clash of civilizations” – the Graeco-Persian Wars of the 5th century B.C. Would Harlitz-Kern ever argue that because the idea of “Africa” is largely a “Western” construct that, therefore, “Africa” has never really existed?
This brings us to the notion of Judeo-Christian values, where once again the Middle Ages are used to make an argument that actually misrepresents the time period. In the eyes of those who promote the ideas of Western civilization and so-called Judeo-Christian values, the Middle Ages stand out as the ideal time period when Europe was a white society united in a homogeneous Christian culture led by one single Christian institution.
Like Dinesh D’Souza’s crackpot historical revisionism, “multicultural Medieval Europe” relies on gross exaggeration and sheer obfuscation, as well as an all-around ignorant audience. Of course Europe was aware of and interacted with the outside world, but Harlitz-Kern twists every economic, diplomatic, or military “contact” that Europe had with another civilization to argue against the very existence of a distinct, coherent civilization in Europe. (Curiously, the reverse is never the case: Harlitz-Kern would never, ever claim, for example, that Marco Polo’s mission to Kublai Khan made Mongolian civilization any less Mongolian, any more than it made Italian civilization any less Italian.) Indeed, hucksters like Harlitz-Kern are constantly searching for any “Moorish” ambassador or merchant who visited Europe once whom they can then transmute and multiply into a subpopulation of black Africans living throughout Europe. (The Moors were Arab and Berber North Africans, not indigenous sub-Saharan Africans, but whatever.) Harlitz-Kern’s “multicultural Medieval Europe” is just as political as D’Souza’s “Democrats are the real racists,” too: it is meant to degrade and subvert Europe’s sense of self.
History is an ongoing process of human activity. Notre Dame is an example of history being a living, breathing thing. History is not static, and neither is Notre Dame. Or as medievalist Lisa Fagin Davis puts it, “Nothing makes it from the Middle Ages to the 21st century without being transformed along the way.”
Harlitz-Kern, an adjunct professor at Florida International University, is so shameless and soulless that she is actually taking advantage of the destruction of Notre Dame to publicize her malicious and mendacious revisionism.
In “How Should France Rebuild Notre Dame?” Rolling Stone hears out both sides of the debate:
Birignani and Harwood’s criticism is just modernist narcissism rearing its ugly head again. Why on earth should Notre Dame be rebuilt as “a reflection of France today”? The whole point of historical conservation is to preserve “reflections of an old France.” New buildings are what should reflect the France that is “currently in the making,” not old buildings. Birignani and Harwood are just like the “demolishers” whom Victor Hugo denounced as “ignoble speculators whose honor has been blinded by self-interest…so idiotic that they don’t even understand that they are barbarians!” As for the idea that “non-secular, white European France” is a “France that never was,” that is just ethnic sado-narcissism. Charles de Gaulle, France’s heroic wartime and peacetime leader, rejected this multi-cultural, multi-religious mirage at its inception, as “decolonized” North Africans began immigrating to France. According to de Gaulle, “We are still primarily a European people of the white race, Greek and Latin culture, and the Christian religion.” While de Gaulle believed that France could be “open to all races,” it was only open “on condition that they remain a small minority…otherwise, France would no longer be France.”
Although the author of the Rolling Stone article, E.J. Dickson, is clearly in favor of the deconstructionists, she gives the last word to the conservationists:
[Jeffrey] Hamburger, however, dismisses this idea as “preposterous.” Now that the full extent of the damage is being reckoned with – and is less than many initially feared – he sees no reason to not try to rebuild and preserve one of the few remaining wonders of medieval architecture. “It’s not as if in rebuilding the church one is necessarily building a monument to the glorification of medieval Catholicism and aristocracy. It’s simply the case that the building has witnessed the entire history of France as a modern nation,” he says. “You can’t just erase history. It’s there, and it has to be dealt with critically.”
What The Daily Beast and Rolling Stone are not going to publicize is the reverse-colonialist vengefulness revealed in the literal celebrations of the destruction of Notre Dame. “Hafsa Askar,” the vice-president of the French Student Union at Lille University, boasted, “I don’t give a damn about Notre Dame because I don’t give a damn about the history of France,” and asked, “How much are you people going to cry for some bits of wood?” According to “Sarah Sahim,” a writer for The Independent and The Guardian, “Notre Dame burning is cosmic karma for all the historical sites and artifacts France destroyed and stole when being colonialist scum.” Rabbi Shlomo Aviner, who recently emigrated from France to Israel, speculated that the burning of Notre Dame was “divine retribution” for French anti-Semitism in the 13th century. In a letter to The Atlantic, “Ken Mondschein” sneered that Notre Dame was a symbol of the nation “that exiled Captain Alfred Dreyfus and, in more recent times, has marginalized its Muslim citizens.” Mondschein asked, “Will the rebuilding use aped medieval materials to conjure an imagined past, or will it be built in the same spirit, but employing a diversity of modern materials, to bespeak the reality of contemporary France?” Vice interviewed a number of young Parisians about the fire, one of whom, an artist from Israel named “Jonathan,” praised the fire in the name of equality and diversity:
I thought it was sad but beautiful. It was an important lesson: Nothing lasts forever, everything comes to an end. We’re watching the beginning of a new modernity; religion and the church don’t have the same influence they did before, and it’s always a good thing when white men lose their power…I saw the fire with my own eyes. It was really beautiful, as though Satan was speaking to us humans to say, “The end of your world is coming!” I identify as queer or gay, I’m a spiritual person, and I believe in the same light that burned the cathedral. I think we need to burn all churches and get rid of all organized religions. We need to reconnect with spirituality, to understand that all human beings are equal so that we can finally all accept one another…In Muslim countries, when ISIS came into power, they destroyed Egyptian and religious places that were clearly more important, and nobody cared. Now this happens in Paris and the whole world feels invested; that makes me cynical. If you ask me, it’s crazy that in just a few hours, people have given more than 450 million Euros for a little f*****g church! We have plenty of other problems affecting actual human lives
Andy Ngo, an investigative journalist at Quillette, compiled a list of Twitter users cheering the fire. Many of these people, such as “El Buchón Mariwano,” who tweeted, “idgaf [i.e. “I don’t give a f**k”] about Notre Dame burning down because how many times have white colonizers burned or destroyed our religious structures?…F**k the Catholic church,” received thousands of “likes” and “retweets.”
For my own part, I came across one such cretin on Facebook – a friend of a friend – who had shared a picture of Notre Dame on fire with the caption, “When the Paris police are knocking and you need to get rid of your child-porn collection.” He was both befuddled and amused as to why anyone would be “offended” by his “making fun of the place getting torched,” because while “it sucks that it burnt…the religious aspects mean jack s**t to me.”
This hate – this bile, this poison – does not stop with mere words. “Catholic Churches Are Being Desecrated Across France – and Officials Don’t Know Why,” Newsweek reported a month before the fire at Notre Dame. “France has seen a spate of attacks against Catholic churches since the start of the year, vandalism that has included arson,” continued Newsweek. “Vandals have smashed statues, knocked down tabernacles, scattered or destroyed the Eucharist, and torn down crosses, sparking fears of a rise in anti-Catholic sentiment in the country.”
“And then, in the 21st century,” the philosophical Russian novelist Fyodor Dostoevsky foretold, “to the accompanying howl of the triumphant mob, a degenerate will pull a knife from his boot, climb the stairs to the marvelous image of the Sistine Madonna, and slash this image in the name of universal equality and brotherhood.”
The opening of Sir Kenneth Clark’s documentary series, “Civilisation: A Personal View,” features Notre Dame in a very moving way. Standing on the banks of the River Seine, the famous art historian and museum director posed a question. “What is civilization?” he asked. “I don’t know,” he answered. “I can’t define it in abstract terms.” Turning around and beckoning toward the majestic cathedral, Clark concluded, “But I think I can recognize it when I see it, and I’m looking at it now.”
In his same opening monologue, however, Clark includes a word of caution:
Looking at those great works of Western man and remembering all that he’s achieved in philosophy, poetry, science, law, it does seem hard to believe that European civilization could ever vanish. And yet it has happened once. All the life-giving human activities that we lump together under the word “civilization” have been obliterated once in Western Europe, when the barbarians ran over the Roman Empire. For two centuries, the heart of European civilization almost stopped beating. We got through by the skin of our teeth. In the last few years we’ve developed an uneasy feeling that this could happen again, and advanced thinkers (who even in Roman times thought it fine to gang up with the barbarians) have begun to question if civilization is worth preserving
Notre Dame may have survived the fire by the skin of its teeth, but modernist narcissism, ethnic sado-narcissism, and reverse colonialism, which question whether civilization is even worth preserving, will prove far more dangerous to that 800-year-old cathedral – and the even older civilization that it represents – than any fire.
Tom Wolfe, with his perceptive eye and eloquent pen, was a Charles Dickens and Mark Twain for our time. He not only satirized patrician vices and eulogized plebeian virtues to devastating effect, but also created many unique expressions which enriched the language. Wolfe was born on March 2nd, 1930, in Richmond, Virginia. He was accepted to Princeton but attended Washington and Lee because he wished to stay close to home; when he went on to graduate school at Yale, he hated it. Wolfe enjoyed writing and had already developed a signature style in college (his dissertation on the organizational activities of the Communist Party in American literature had to be rewritten because it was too colorful), but after doing some workmanlike reporting in Springfield, Massachusetts, and Washington, D.C., he grew bored and moved to New York City. There, writing for The New York Herald Tribune and New York Magazine, was where he broke out and wrote some of his most iconic pieces. Articles such as “Radical Chic and Mau-Mauing the Flak Catchers” and “The ‘Me’ Decade and the Third Great Awakening” quickly turned Wolfe into the sort of writer that people bought whole publications just to read. Of all Wolfe’s journalism during this period, The Right Stuff (originally serialized but ultimately republished as a book) is the most impressive: he spent years traversing the country talking to whomever he could, and the result was not just a timely history of Project Mercury, but an intimate portrait of the lives of the first astronauts as well. Wolfe’s first novel, The Bonfire of the Vanities, was a critically acclaimed bestseller, as was his much-anticipated second novel, A Man in Full.
In “Stalking the Billion-Footed Beast: A Literary Manifesto for the New Social Novel,” published in Harper’s Magazine shortly after The Bonfire of the Vanities, Wolfe responded to his critics, arguing that the role of a novelist was to “document” real life – an opinion popular with many readers but unpopular with many other writers. Sadly, after a well-lived life stretched over 88 years and packed full of adventure, Wolfe died on May 14, 2018.
In Wolfe’s last novel, Back to Blood, he asked the question, “Could Miami with its clash of cultures be the American city of the future?” In an interview with The Telegraph, Wolfe explained his motives behind writing such a book:
Cubans are one of the largest immigrant groups to the U.S.A., comprising 3% of the entire immigrant population. Mass-immigration from Cuba began in 1959 after the Cuban Revolution. Because Cubans were refugees from a Cold-War enemy, the U.S.A. expedited their immigration. In the 1960s alone, the Cuban-immigrant population in the U.S.A. increased more than fivefold – from 79,000 to 439,000. The Cuban-immigrant population continued to rapidly increase, to 608,000 by 1980, 737,000 by 1990, 873,000 by 2000, and 1,105,000 by 2010. Between 1995 and 2015, when the “wet foot, dry foot” policy was in effect, 650,000 Cubans immigrated to the U.S.A. Today, there are approximately 1,272,000 Cuban immigrants living in the U.S.A., with 78% in Florida and 64% in Miami.
Originally, Back to Blood was going to be about the Vietnamese in Orange County, California (which has also been demographically transformed by virtue-signaling and pathologically altruistic immigration policies from the Cold War), but Wolfe became interested in Miami when he learned about the scope of the population replacement that has taken place there and that its government is now entirely under the control of Cubans.
Wolfe brought his signature style of journalism-turned-fiction to Back to Blood, doing lots of legwork to learn about his subject from the bottom up; the result is lots of little details that make his big picture feel real. He was shown around the city by a Cubano reporter, an Anglo police chief, a Haitian anthropologist, and more. The reporter made a PBS documentary about Wolfe’s research process for the novel, “Tom Wolfe Gets Back to Blood.” Three of the novel’s most striking chapters – set in a regatta gala, an art show, and a strip club – are drawn from Wolfe’s own firsthand experiences.
Edward T. Topping IV, the Anglo editor of The Miami Herald, sets the stage as he recollects how and why he ended up in this city where he is so uncomfortable:
Indeed, the novel’s title and theme also come from Ed’s stream of consciousness:
Ed is a thoroughly conventional liberal who wallows in fear and self-hatred. In quasi-1984 fashion, Ed self-censors his politically incorrect thoughts, although as a writer, he often winces at the damage that political correctness has done to the English language. “Aw, shit, the kid is PC…the way he almost said ‘him’ and switched it to ‘person’ on the edge of a cliff…and then gave up on ‘person’ for ‘they,’ so he wouldn’t have to deal with the gender in the singular, the ‘hims’ and ‘he’s,’” Ed thinks when talking to a young reporter. “I fucking don’t want to believe it was Yale that made my man here mangle the goddamn English language this way.” When his wife gets into a racially charged shouting match with a Latina who stole their parking space, Ed cringes. “‘Herald Editor’s Wife in Racist Rant,’” he imagines the headlines. “He could write the whole thing himself.” Anglos like Ed are the only people in Back to Blood who have little to no sense of self-consciousness or self-confidence.
In Back to Blood, figurative human sacrifices are necessary to keep the peace in Miami. For following orders, risking his life, and saving the life of a Cuban refugee, Nestor Camacho (a Cubano cop) is actually punished, just to appease the outraged Cubanos, to whom any deportation of one of their own is an unforgivable offense.
In one of the novel’s most striking passages, the mayor, Dionisio Cruz (who is Cubano), explains to the police chief, Cyrus Booker (who is black), Miami’s conflict between democracy and diversity:
Nestor is, privately, medaled for his bravery and reassigned from the Marine Patrol to the Crime Suppression Unit. When a recording of Nestor insulting a subdued black criminal spreads online, Dio demands his head. Cy, however, argues that the recording was taken out of context: Nestor had just saved the life of another cop, whom the criminal had nearly killed. “You know very well that one of the main reasons you were made chief was that we thought you were the man to keep the peace with all these – uh, uhhh – communities,” Dio snaps back at Cy. “So you think I’m gonna stand by and let you turn racial friction into a goddamn conflagration on my watch?”
Cy knows that amongst themselves, Cubanos like Dio differentiate themselves from the Anglos, but amongst other races – such as “African-Americans” like Cy – the Cubanos define themselves as white:
Miami is now known as “the Capital of Latin America,” or as Wolfe puts it, “Plan B for everyone in Latin America.” When Ed’s wife yells at the Spanish-speaking Latina that she is in America and so should speak English, the Latina laughs in reply, “No, mia malhablada puta gorda [you impudent fat bitch] – we een Mee-ah-mee now! You in Mee-ah-mee now!” Later, when the Cubano cop Nestor and the Anglo reporter John Smith drive out of Miami into Broward County (which is hardly the American heartland), they can both sense a change:
(By the way, the character “John Smith” is always referred to by his full name, “John Smith” – never “John” or “Smith.” It is not just that it is an archetypical name among “Anglos” (i.e. White Anglo-Saxon Protestants), but that it is also the name of the original American and Virginian, Captain John Smith.)
In Back to Blood, everybody does indeed hate everybody, as Mayor Dio explained. Everybody does not just hate everybody, but everybody hates everybody over everything, too – not just over cultural and social differences or economic and political conflicts, but even over the most superficial things, such as how other races dress or speak. Nestor, for instance, is irritated by the clothes that his partner, John Smith, wears. “The Americano stood there dressed so Americano, it was annoying,” he thinks to himself. This instinctive awareness of and animus toward other races is exemplified by Magdalena Otero, a Cubano nurse who is prejudiced against any Americano she meets, no matter how friendly or helpful:
Wolfe’s portrait of Miami’s Cubans makes a mockery of Americano confidence in immigrant assimilation, particularly Republicans’ belief that Latinos are “natural conservatives.” If any Latinos were going to assimilate, it would be the Cubans, who are given extra-preferential immigration status (they are automatically legalized as soon as they enter the U.S.A., even illegally), are eligible for affirmative-action privileges (originally intended as one form of reparations to blacks for white discrimination), and are one of the most powerful foreign-policy lobbies in the U.S.A. (as vengeful to their homeland as the Israel Lobby is to the Palestinians). Yet Wolfe shows that these Cuban-“Americans” do not identify as Americans at all. “Americano” is not a term that they ever apply to themselves – they are Cubanos! – but is basically a racial epithet against “Anglos.”
While patrolling Biscayne Bay with two other cops – Americanos – Nestor considers the irrationality of these racial categories:
As far as Miami’s Cubanos are concerned, the sole business of the U.S. government should be to transfer the population of Cuba to Miami and overthrow the Cuban government (and, as a voter bloc, they present a united and uncompromising front on this issue). There is no gratitude toward the Americanos for rescuing them, however, just entitlement and resentment.
Indeed, Ed not-so-fondly recalls the Cubano blowback to one of his first big stories at The Miami Herald:
Amongst themselves, moreover, the Cubanos expect and enforce tribal loyalty. When Nestor saves the life of a Cuban refugee (but, in the process, deprives him of automatic asylum due to the U.S.A.’s “wet foot, dry foot” policy), he is cast out not just by other Cubanos, but by his own family as well. “DETENIDO! 18 METROS DE LIBERTAD,” announces the front page of El Nuevo Herald. “A Cuban refugee, reportedly a hero of the dissident underground, was arrested yesterday on Biscayne Bay just eighteen meters from the Rickenbacker Causeway – and asylum – by a cop whose own parents had fled Cuba and made it to Miami and freedom in a homemade dinghy.” Nestor briefly takes some pride in The Miami Herald’s more neutral headline (which highlights the fact that what happened was a “rescue,” after all) only to remember what the Cubanos say: “Yo no creo el Miami Herald” – “I don’t believe The Miami Herald.”
When Nestor comes home to Hialeah after he is on the news, he is told that he has dishonored the family name:
Later, at his grandmother’s birthday party, Nestor is berated even by distant relatives:
Back to Blood, though about race in the main, also includes the other American anxieties of class and sex which Wolfe so often satirized. Magdalena, a Cubano nurse, is an ambitious social climber who uses her body to get what she wants from men, yet is unaware that men are only using her for her body to get what they want. While she is still with Nestor, she starts sleeping with Norman; while she is with Norman, she starts sleeping with the gangster-turned-philanthropist Sergei Korolyov; and after she realizes that Sergei has disposed of her, she tries to get back with Nestor again. Norman, an Americano psychiatrist who treats sexual disorders, is an ambitious social climber who uses his rich and famous clients to increase his own public profile. Norman is cocky, petty, and creepy: he lies to make himself seem like more of a celebrity than he is, puts down Magdalena’s weaker grasp of the English language, and is in denial about his own sex addiction. One of the most interesting characters in the novel is Professor Antoine Lantier, a light-skinned Haitian-American linguist. Lantier, who identifies as French (he is apparently descended from old Norman aristocracy) and considers himself to be a bearer of Western Civilization, is disgusted with the primitive black Haitians whence he came. Lantier’s greatest hope is for his light-skinned daughter, Ghislaine (who is cultured, educated, and innocent), to “pass” as white, while his greatest disappointment is that his darker-skinned son, Philippe, identifies as black. “Right now he wants to be a Neg, a black Haitian,” Ghislaine says of Philippe, “and they want to be like American black gangbangers…and I don’t even know what American black gangbangers want to be like.”
Wolfe’s journalism, in addition to influencing how he wrote, also influenced what he wrote. For years, Wolfe followed the emerging field of “sociobiology” as well as rapid advances in the field of neuroscience (e.g. “Sorry, But Your Soul Just Died” and “Digibabble, Fairy Dust, and the Human Anthill”). Wolfe, who had himself been the target of ideological purges by the literati (because of his populist literary philosophy which criticized elitist literature), took particular pleasure in ridiculing the malcontents of neuroscientific progress. “If I were a college student today,” admitted Wolfe, “I don’t think I could resist going into neuroscience.” Accordingly, in I Am Charlotte Simmons, published a few years after these essays, the title character takes classes on neuroscience, which she sees as the most cutting-edge and wide-open field of study.
Likewise, parts of Back to Blood are clearly influenced by Wolfe’s earlier journalism on the faddishness of modern art and architecture, summed up in his books, The Painted Word and From Bauhaus to Our House. In his chapter on Art Basel Miami Beach, “The Super Bowl of the Art World,” ignorant wealthy collectors, advised by self-interested art dealers, mindlessly compete for tasteless and otherwise worthless works of art. Maurice Fleischmann, a pornography-addicted Jewish billionaire, spends $17 million in 15 minutes on “No-Hands” and “De-Skilled” art – the artist had hired someone else to take pictures of him having sex with a prostitute, then hired someone else to etch the photographs into glass, and did not even touch the photographs when they were sent or the etchings when they were received. “And there you’ve got the very best, the most contemporary work of the whole rising generation,” Fleischmann’s art dealer, “A.A.,” tells him. “Maurice…you…have…really…scored this time.” Magdalena thinks to herself, “Fleischmann looked very pleased, but his smile was the baffled smile of someone who can’t explain his own good fortune.”
Wolfe was sometimes labeled a reactionary railing against changes which he did not understand. Oftentimes, however, his critics did not understand the changes that they were idly accepting and applauding. For instance, Slate’s Stephen Metcalf denounced I Am Charlotte Simmons as “an eminently foolish book, by an old man for whom the life of the young has become a grotesque but tantalizing rumor.” Metcalf, with all the wit of the sophomore-cum-philosopher, pronounced, “The stupidity here may actually be boundless.” What Metcalf seems not to have known, however (probably because, to him, any moral criticism of anything is puritanical), is how degenerate campus life had actually become. By contrast, when the judge-turned-professor Richard Posner first read The Bonfire of the Vanities, he commented that “it didn’t strike me as the sort of book that has anything interesting to say.” Yet after the Tawana-Brawley rape hoax, the arrest of the bond-trader Michael Milken, the Crown-Heights riot, the Rodney-King riot, and the O.J.-Simpson trial (all of which Wolfe had anticipated in some form in that 1987 novel), Posner changed his opinion, admitting that he had been “ungenerous and unperceptive.”
Wolfe was never a reactionary, however, neither a nativist nor a xenophobe. On the contrary, Wolfe, like many members of the Baby-Boomer generation, celebrated the U.S.A.’s post-1965 immigration wave as proving, once and for all, that Americans were not the racists that American “Rococo Marxists” (or “sweaty little colonials forever trying to keep up with Europe and, above all, France”) claimed that they were:
In fact, Wolfe celebrated mass-immigration as a natural outgrowth of “The American Idea,” reified in the open, equal seating arrangements at Pres. Thomas Jefferson’s state dinners (described by the British ambassador and his wife as “pell-mell”):
Ironically, however, the “Rococo Marxists” whom Wolfe so deftly ridiculed got the last laugh, benefiting enormously from the worldwide immigration wave to the U.S.A. If mass-immigration discredited their paranoia about non-existent American “isms” and “phobias,” as Wolfe rather optimistically predicted, then they have been too busy enjoying the importation of loyal activists and obedient voters to realize it.
As a recent article in The New York Times, “Why the Announcement of a Looming White Minority Makes Demographers Nervous,” notes, the policies that are changing the U.S.A.’s demographics have made the Left complacent and triumphalist about the future:
The new Rococo-Marxist Congress (which, if not “the blue wave” of expectation, is still “the most diverse ever”) is a prelude to that “demographic destiny,” with the Rococo-Marxist Alexandria Ocasio-Cortez as its arrogant, ignorant face. Conservatives and libertarians criticize Ocasio-Cortez’s “Democratic Socialism,” “Green New Deal,” and “Modern Monetary Theory” as if she is driving these ideas herself, but she is merely representative of the demographic shift that is driving the political shift. Now that “all politics is identity politics” (this is a truth which diversity brings to light), it does not really matter if Ocasio-Cortez represents a “rotten borough,” lies about her upper-middle-class background, and dismisses any criticism as prejudice. All that really matters, in her own words, is whether she represents her tribe (“intersectional working-class Ocasio voters”) against other tribes (“homogenous working-class Trump voters”). If Back to Blood is correct that Miami is “the city where America’s future has arrived first,” then America’s future has arrived in Ocasio-Cortez. (Ocasio-Cortez, whose high opinion of herself actually seems to increase with each passing blunder, would have made a memorable character in one of Wolfe’s novels.)
Only late in his life, in Back to Blood, did Wolfe seem to consider the consequences of “people of every land, every color, every religion…pouring into the United States.” As Wolfe put it, “This is a book about immigrants in America and the way in which immigrants change life in America.” Yet the question remains: why would anyone want to live in such a city, let alone such a country?
What took place, perhaps, with Wolfe (and what is certainly taking place with Americans) is a “great relearning” on the issue of immigration – a vital issue which has, lately, been governed more by schmaltz than sense. In “The Great Relearning,” Wolfe argued that the 20th century was defined by intellectual movements and political activism which “swept aside all rules and tried to start from zero.” In architecture, there was the Bauhaus School, which resulted in sterile buildings ugly to look at and uncomfortable to live in. In culture, there was the hippie subculture, whose communal living quickly resulted in outbreaks of rare diseases. In morality, there was the Sexual Revolution, which resulted, frankly, in AIDS. In politics, there was Communism, which resulted in dysfunction and repression. Optimistically, as was his nature, Wolfe predicted that Americans were looking back on “the amazing confidence, the Promethean hubris, to defy the gods and try to push man’s power and freedom to limitless, god-like extremes” and learning their lesson.
Likewise, in 1965, the qualitative and quantitative regulations which had controlled immigration since 1924 were ceremoniously abolished. The Johnson-Reed Act of 1924, itself the culmination of a series of increasingly selective and restrictive acts from the late-1800s and early-1900s, established a system of national-origins quotas which ensured that immigrants would be of assimilable character and number. In 1952, the McCarran-Walter Act repealed some of the outright racial discrimination in the 1924 act (which had limited naturalized citizenship to whites), but otherwise retained the national-origins quota system. Yet by 1965, the idea that a foreign country’s immigrant quota should be proportional to the percentage of its people among the host country’s population was considered “discriminatory,” at least according to the politicians. Not so much the people, however: “U.S. Public is Strongly Opposed to Easing of Immigration Laws,” reported The Washington Post, citing a Harris survey of 58% to 24%, but the politicians had already set their brains to zero.
It was not all a “Great Unlearning,” however; there was plenty of base politicking. For one, there was Rep. Michael A. Feighan of Ohio, who as chairman of the House’s immigration subcommittee agreed to stop obstructing the act in exchange for prioritizing family reunification over skills, so that he could pander to the labor unions (which did not want to compete with skilled foreign labor) and Eastern-European blocs (which wanted to import more of their own people) in his district. For another, there was Sen. James O. Eastland of Mississippi, who as chairman of the Senate Judiciary Committee also agreed to stop obstructing the act, seemingly for no other reason than as a personal favor to the President and the Kennedys. More common, however, was the well-meaning gullibility and high-minded sanctimony typical of a “Great Unlearning.” Sen. Eugene J. McCarthy of Minnesota, for example, declared that the act was “a recognition of the great contribution made to the development of our nation by peoples from all regions of the world” and “reflects a fundamental principle in our laws and traditions: that of the equality of nature of all men.” He also believed that it “would not greatly increase the number of immigrants, but it would provide that those admitted would be judged on the basis of a national-origins quota.” (McCarthy later admitted that he and his colleagues had “never intended to open the floodgates” and that they had made a terrible mistake.) In a ceremony at the Statue of Liberty, Pres. Lyndon B. Johnson signed the Hart-Celler Act into law with reassuring words. “The bill that we sign today is not a revolutionary bill,” he explained.
“It will not reshape the structure of our daily lives, or really add importantly to our wealth or power.” According to Johnson, “The days of unlimited immigration are past, but those who do come will come because of what they are, and not because of the land from which they sprung.” (In the same speech, Johnson announced that Cuban refugees would continue to receive their privileged treatment.)
The “Great Relearning” began when the character of immigrants started changing and the number of immigrants started rising, which was never supposed to happen. The bill’s sponsors, Rep. Emanuel Celler of New York and Sen. Philip A. Hart of Michigan, along with many other confident Congressmen, had predicted that their reform would have no such effect and had dismissed their opponents as mere contrarians and cynics. Sen. Edward Kennedy of Massachusetts and Sen. Robert Kennedy of New York had made an effort to “set to rest any fears that this bill will change the ethnic, political, or economic makeup of the United States,” while also attacking criticism of the bill as “emotional, irrational, and with little foundation in fact…out of line with the obligations of responsible citizenship.” LBJ-Administration officials, such as Attorney-General Nicholas D. Katzenbach, Secretary of State D. Dean Rusk, and Secretary of Labor W. Willard Wirtz, had been called to the Congress to testify to the same: there would be no noticeable difference in the quality or quantity of immigrants, and anyone who suggested so was suspicious.
John F. Kennedy himself, in his highly romanticized and sentimentalized Nation of Immigrants (published by the Anti-Defamation League in 1958), had avowed that he “does not seek to make over the face of America.” Yet the Hart-Celler Act is remembered today for exactly that – for making over the face of America. The Pew Research Center’s timeline of the racial composition of the American population, “The Changing Face of America,” begins in 1965, the year Hart-Celler became law. In a report on Hart-Celler, “1965 Immigration Law Changed Face of America,” NPR’s Jennifer Ludden described how “it marked a radical break with previous policy and has led to profound demographic changes in America.” At the Migration Policy Institute’s 2015 symposium in honor of Hart-Celler, Muzaffar Chishti explained that it “literally changed the face of America.” According to Chishti, Hart-Celler “ushered in far-reaching changes that continue to undergird the current immigration system, and set in motion powerful demographic forces that are still reshaping the United States today and will in the decades ahead.” A recent book by Peggy Orchowski on the 50th anniversary of Hart-Celler, “The Law that Changed the Face of America,” describes “this historic law that made the United States the highly diverse nation of immigrants that it is today.”
The American people quickly learned about the unintended consequences of policies such as “birthright citizenship,” “chain migration,” “diversity lotteries,” “refugee protocols,” “worker visas,” and more. Reforms were passed in 1986 and 1996, but they were half-hearted and simple-minded, and only worsened the problems by delaying the necessary solutions. Further reforms were attempted in 2007 and 2013, but these were so transparently exploitative that they did not even manage to pass. In 2012, Pres. Barack Obama took it upon himself to decree amnesty for the children of all illegal immigrants, resulting in a surge of illegal immigration at the southern border that still ebbs and flows seasonally. In 2016, Donald Trump (a world-famous businessman, celebrity, and demagogue) ran a self-funded presidential campaign with an “isolationist,” “protectionist,” and “nativist” message, including not only the revocation of Obama’s amnesty but also the construction of a wall along the southern border. Since his stunning victory, however, he has struggled against bipartisan resistance at every turn, even in his by-the-book efforts to enforce the law at the national border. Early in 2018, when Pres. Trump was making a major attempt to negotiate a deal with the Congress, a Harvard-Harris poll found that not only did a substantial majority of 65% (cutting across dividing lines of class, race, and sex) support his position of trading amnesty for reforming the system and boosting border security, but also that an even greater majority of 81% supported reducing immigration levels. One year later, just days before Pres. Trump caved on the government shutdown, another Harvard-Harris poll found that immigration, at 38%, was the single most important issue to the public (and while his proposed border wall was opposed by 55% to 45%, there was more support for other border-security measures, including a “security barrier”).
Yet when it comes to immigration reform, the Democrats and the Republicans are less responsive to and less representative of public opinion than the Bourbons and the Romanovs.
What the American people are “relearning” – and Back to Blood is a sign of the times – is that nothing comes free, especially not immigration. Like every other policy, immigration has costs as well as benefits, losers as well as winners, and so on. Indeed, how could a policy which literally determines the composition of a country’s population – and populations are not merely interchangeable masses – be anything less?
William Porcher Miles was one of the “Fire-Eaters,” an influential faction of academics, publishers, and statesmen who advocated Southern secession in response to irreconcilable differences – cultural, economic, political, and social – with the North. Although he and his comrades have gone down in history as extremists, he was not some hothead who blindly led his people to a war out of ideological fanaticism or personal ambition, but an intellectual who thought seriously about the problems facing his people and how to preserve their established, inherited way of life.
Indeed, the neo-abolitionist historians who denounce Fire-Eaters like Miles as conspiratorial, demagogic, paranoid ranters and ravers – the self-hating Charles B. Dew and the smart-aleck William W. Freehling come to mind – contradict themselves by subscribing to the “irrepressible-conflict” school of history. If it really were true that there was an “irrepressible conflict” between the North and the South, then it was the Fire-Eaters (not to mention secessionist abolitionists like George W. Bassett, William Lloyd Garrison, Wendell Phillips, and Lysander Spooner) who saw the future the clearest.
Miles was born in Walterboro, South Carolina, on July 4th, 1822. Unlike most other South-Carolinian men of his generation, Miles was uninterested and uninvolved in politics, despite growing up in the Colleton District (the seat of the Nullification Crisis) and living through the Bluffton Movement. Miles was a professor of mathematics at his alma mater, the College of Charleston, from 1843 to 1855, where he befriended the future editor James De Bow and the future diplomat William Henry Trescot. It was during this period of intensifying sectional conflict that Miles began speaking out and standing up for the South.
Miles’ political debut was at a celebration of the Fourth of July in Charleston, where he criticized the Wilmot Proviso for attempting to exclude slavery from territory acquired in the Mexican War. “As a Southern man,” he explained, “I was bound, on such an occasion, in honor and conscience, to express myself in the strongest and fullest manner.” While Southerners were considering the issue as an “abstract question of constitutional right,” Northerners were neither “contending for an abstract principle,” nor “influenced by a mere spirit of fanatical opposition to slavery,” but rather “are deliberately, intentionally, and advisedly aiming a deadly blow at the South.” According to Miles, such a blow was “intended to repress her energies – to check her development – to diminish and eventually destroy her political weight and influence in this confederacy.”
In 1855, Miles was elected Mayor of Charleston, thanks to the politicking of his friends back home, who publicized his volunteer work treating an outbreak of yellow fever in Norfolk, Virginia. Miles was a progressive, reformist mayor who created a number of facilities and resources for the improvement of public health, safety, and welfare (including aid for free blacks), as well as developed a schedule for the reduction of public debt.
Miles was elected to the U.S. House of Representatives in 1856, where he served until 1860. William Gilmore Simms, the famous author whom Miles had befriended while governing Charleston, apprised him of other Southerners in the Congress (such as Mississippi’s John A. Quitman and South Carolina’s James H. Hammond), but above all, Simms advised, “Let all your game lie in the constant recognition and assertion of a Southern nationality!” By this point, Miles was already an avowed secessionist, and in the opinion of some of his more conservative friends, by taking maximalist positions on impractical issues – such as the admission of Kansas under a pro-slavery constitution – he was trying to force the moment to its crisis. “The issue has been made, the battle joined, and though it be on an abstract principle which does not at present promise to result in any practical advantage to us, I am willing to stand by the guns and fight it out,” argued Miles. “The South may not dissolve the Union on the rejection of Kansas, but such rejection would, assuredly, sever still another of the cords – rapidly becoming fewer – which the course of events has been snapping one by one.” Forcing the issue of Kansas would “at least have the effect of opening the eyes of the Southern people to the startling fact that they have no hope in the future of maintaining their equality in the Union,” as well as “compel them to ponder the question whether they will choose subjugation or resistance, colonial vassalage or separate independence.”
Miles boasted that Southern secession was imminent:
It is not on a question of dollars and cents that the South would dissolve the Union. The history of long weary years of unjust and unequal legislation has sufficiently proved that point. But when it shall be proved to the South not only that the scepter has forever departed from her; that she can never, concurrently with the North, rule the common country; but that she must forever occupy an inferior and subordinate position; that she can never expand, never occupy her just share of the common territory; that her institutions and civilization are at the mercy of a sectional majority which tolerates them only to the end that her people, as “hewers of wood and drawers of water” [a quote from Joshua 9:23 describing the Israelites’ enslavement of the Canaanites], may minister to its prosperity – then, I believe, she will imitate the example of our revolutionary sires, and take her destinies into her own hands.
In the presidential election of 1860, Miles opposed Southern support for Sen. Stephen A. Douglas, suspicious of Northern Democrats and certain that Southerners could only be represented by a strictly sectional party. Miles predicted to the Charleston Mercury (the secessionist newspaper edited by the Fire-Eater Robert B. Rhett) that the coming election would pit “power against principle – the majority against the minority, regardless of all constitutional barriers.” (Indeed, 60% of the electorate would cast a vote against Abraham Lincoln – the other three candidates being varying degrees of anti-Republican – but by carrying the Northern states Lincoln won the electoral plurality that he needed.) Even before the Democrats’ sectional schism and Lincoln’s sectional election, however, Miles pressed for preemptive secession, concluding that the North and the South neither should nor could live together under the same government. “They actually invade our borders and endeavor to apply the ‘knife’ and ‘actual cautery,’ fire and sword, to what in their folly they consider ‘a sore’ in our body politic!” exclaimed Miles. “Can the Southern people endure this without degradation and ruin?” he asked. “Impossible,” he answered. “We only desire to be let alone, and yet we are constantly told that we are aggressors and agitators.” The Southern states were “the sole judges of what is best for their own interests, and for their own peace and security,” and thus could “whenever they choose, take their destinies into their own hands.” Indeed, the South possessed “all the elements of wealth, prosperity, and strength, to make her a first-class power among the nations of the world,” argued Miles, who was “weary of these eternal attempts to hold out the olive branch, when we ought to be preparing to grasp the sword.”
After Abraham Lincoln’s election, Miles and Laurence M. Keitt (another South-Carolinian Fire-Eater) met with Pres. James Buchanan to negotiate the status of federal property in South Carolina, the secession of which was imminent. According to Miles and Keitt, they had an “understanding” with Buchanan that no military action would be taken against U.S. forts in Charleston Harbor so long as no attempt was made to reinforce them. “After all, this is a matter of honor among gentlemen,” Miles and Keitt said that Buchanan had told them. “I do not know that any paper or writing is necessary.” Shortly after Lincoln took command, however, he skillfully manipulated the officially unresolved issue of U.S. forts (namely, Fort Sumter, in Charleston Harbor) as a pretext for war.
Miles was elected as a delegate to South Carolina’s secession convention where the Union was dissolved, a delegate to the Montgomery Convention where the Confederate Constitution was framed, and a representative to the Confederate Congress. Miles also served, along with Louis Trezevant Wigfall (a Texan Fire-Eater), as an aide-de-camp to Gen. P.G.T. Beauregard at Charleston and the Battle of First Manassas, but found that his lack of military training made him unhelpful. It was Miles who, as the chairman of the House Military Affairs Committee, designed what is recognized today as “the Confederate flag,” which became a popular battle flag and was later incorporated into the national flag. “There is no propriety in retaining the ensign of a government which, in the opinion of the states composing this Confederacy, had become so oppressive and injurious as to require their separation from it,” Miles reported to his committee. “It is idle to talk of ‘keeping’ the flag of the United States when we have voluntarily seceded from them.”
Like most Fire-Eaters, however, Miles found himself out of power in the Confederacy. Much to his chagrin, the conservatives who had always opposed disunion, such as Pres. Jefferson Davis and Gen. Robert E. Lee, had become far more powerful and influential than secessionists like him. Miles believed in “our great struggle for liberty, independence, and even existence as a people,” but that struggle did not seem to believe in him. The Confederate military’s last-ditch effort to enlist and emancipate slaves epitomized Miles’ disappointment and frustration. “It is not merely a military, but a great social and political question,” he explained, “and the more I consider it the less is my judgment satisfied that it could really help our cause to put arms into the hands of our slaves.”
After the war, the unreconstructed Miles, disgusted with other Fire-Eaters for disavowing their secessionist politics, retreated to his original professorial life. “Politics must be more a trade and less a pursuit for an honorable man than it ever was before,” reflected Miles, and “for a time cannot be a path which any high-toned and sensitive – not to say honest and conscientious man – can possibly tread.” Miles was equally disgusted with the North for the infamous corruption of the postbellum Gilded Age, believing that “Monopolists” (such as parasitic, predatory railroad tycoons) and “Demagogues” (such as anarchists, populists, and socialists) had made a mockery of “just principles of government.” (Historians rarely make the connection between the outcome of the so-called Civil War and the ensuing generation of degeneracy among the victors.) Miles applied unsuccessfully for the position of president at the new Johns Hopkins University in Baltimore, Maryland, but in 1880 he was accepted as president of South Carolina College (now the University of South Carolina). Since all white and black men had been given the right to vote during Reconstruction, Miles believed that it was necessary for good governance that whites and blacks alike receive free primary education. “The whole population should be educated,” he recommended, “trained to the just discharge not only of the right of suffrage, but of all duties of citizenship.” Miles had married a Virginian heiress in 1863, and in 1882, after the death of his father-in-law, Miles resigned from South Carolina College to manage the extensive plantation network that he had inherited. Miles’ inheritance included sugar plantations in Louisiana, but when fellow sugar planters asked him to help them lobby for high sugar tariffs, he refused, reminding them that he was “an old-fashioned, straight-out, ‘strict-construction’ Democrat, bred in the South-Carolina school of John C. 
Calhoun and State Rights,” and thus would oppose anything unconstitutional on principle even if it opposed his own financial interests. Miles died in 1899 and was buried at Green Hill Cemetery in Union, West Virginia, the home of his wife’s family.
In 1852, while still a mathematics professor, Miles spoke to the Alumni Society of Charleston College on commencement day. The title of his speech was “Republican Government not Everywhere and Always the Best; and Liberty not the Birthright of Mankind.” Miles’ speech was so popular that the Alumni Society requested a copy from Miles for publication. “The opinions embodied in it are my deliberate convictions, as I believe they are those of educated men and gentlemen throughout the country,” Miles wrote in reply to Henry M. Bruns, Henry D. Lesesne, and S.P. Ravenel, “although it has seemed to me that there has been rather too much hesitancy on their part in the open avowal of them.” Miles’ speech is hardly historically dated, and is, in fact, a remarkable refutation of the modern-day American civic religion of universalism and equalitarianism.
To Miles, the zeitgeist of Western Civilization in the mid-1800s was the rise of reason over tradition. “Progress is the watchword of the day,” announced Miles. “Mind rules.” In what Miles described as “restless activity of thought,” everything in life – society and politics, science and religion, and so on – had become pressing “problems” to be solved, and, as a result, “the prestige of old forms and associations is daily becoming weaker.” In other words, instead of accepting the accumulated wisdom of preceding generations and adding improvements where necessary, traditions were tested for conformity to the latest ideological fad and purged if they proved deviant.
The media (which at the time was starting to grow into “the mass-media” due to the invention of the telegraph, the formation of news-sharing syndicates, the rise of prestigious national papers, and the proliferation of localized as well as specialized publications) accelerated and amplified the propagation of new ideas and “public opinion.” Yet this new medium, which prioritized sensationalism and simplicity, was influencing the message as much as it was propagating it, with the negative result that the most ill-formed and ill-understood ideas were often the most influential. “Their daring speculations, from the almost incredible facilities of publication and transmission, are thrown out with a rapidity, and, in consequence, a crudity, which, while it prevents a healthy and proper digestion, excites a morbid and craving appetite for novelty,” explained Miles. “All is excitement and confusion, and there is little time for careful weighing and reflection.”
It was the duty of the “educated and intelligent classes” (such as the distinguished alumni society which Miles was addressing, long before affirmative action, grade inflation, curricular degradation, and student debt made college degrees worthless) to refute the fallacies and heresies that were influencing public opinion. Unfortunately, intellectuals had not just failed in this duty, but were actually the prime offenders. “How grievous an account will posterity exact of their memories for evils which may have arisen, whether through their mistakes or supineness,” Miles imagined, “and on the other hand, how glorious will be their reward who have manfully discharged their duty as thinking men and citizens – stemming the tide of false political doctrines and disorganizing social theories which, in these latter days, are sweeping like a torrent over the face of the earth.”
“In this country,” warned Miles, “freedom of thought, freedom of action, and freedom of the press run riot, until they often degenerate into the grossest license, and where, in consequence, the widest field lies open for sowing the seed of every hurtful weed of doctrine.” The tendency of liberty to degenerate into licentiousness had been a constant problem in Western political philosophy since Plato, and Miles had opened his address with a quote from the English poet John Milton, whose epic “Paradise Lost” could be interpreted as an allegory of that very problem: “License they mean when they cry liberty!” Miles could never have fathomed how much worse these “freedoms” would get, however, as a perusal of recent headlines will demonstrate.
Ezra Klein, the founder of Vox, either does not comprehend or simply does not care about the U.S.A.’s federal and republican form of government, and is attacking equal representation in the Senate and the electoral college as “undemocratic.” (Ironically, although the popularity of “Hamilton,” a minstrel show for white left-liberals, has convinced audiences that the eponymous hero was “woke” like them, it was he who defended the electoral college in the Federalist, and, in fact, supported a proto-“Trumpian” program against “free trade” and “open borders.”) Klein does not endorse rethinking the oversized and divided U.S.A. in order to ensure representative government for all, but rather moving aggressively to secure a one-party Democratic state by breaking apart single blue states into multiple blue states (like California) and creating new blue states out of territory (like Washington, D.C., and Puerto Rico).
The New York Times recently hired Sarah Jeong, a feminist blogger who vented her spleen against white people daily on social media for years. Jeong resorted to the illiterate excuse of “satire,” but her tweets are not satirizing anything, and instead reveal a humorless hatred. Aja Romano, a writer for Vox, explains that Jeong was merely “discussing and responding to the oppressive mentality of white culture,” as well as “commenting on the idea that white people often believe they are being discriminated against when they aren’t.” According to Romano, “To equate ‘being mean to white people’ with the actual systemic oppression and marginalization of minority groups is a false equivalency.” (The sort of systemic oppression and marginalization faced by Jeong, an immigrant from South Korea – a country which American soldiers died to save from North Korea – who went on to attend two of her host country’s most prestigious universities, Berkeley and Harvard, before she was even naturalized, and who now writes for one of her host country’s most prestigious newspapers?)
Jonathan Chait, a writer for New York Magazine, unwittingly satirized the utter absurdity of “RussiaGate” in an essay that can only be described as a conspiracy theory. According to Chait, Pres. Donald Trump may be a sleeper agent turned by the Soviets back in 1987 (the year he began publicly criticizing the free ride that the U.S.A. was giving its Cold-War allies) who remained dormant until recently activated by his handler, Vladimir Putin. The evidence for Chait’s theory, like all the other evidence for RussiaGate, is nil. (RussiaGate is, on the one hand, mere xenophobic propaganda for Outer-Party proles – the “birtherism” of the Democratic Party – but on the other hand, a political ploy by Inner-Party elites to prevent the critical normalization of American foreign policy on which Trump campaigned.)
Pres. Donald Trump told The Sun, “I have a great love for countries in Europe,” and added (in reference to the mass-migration of Africans, Arabs, and Asians initiated by German Chancellor Angela Merkel) that “I think what has happened to Europe is a shame.” According to Trump, “I think it changed the fabric of Europe and, unless you act very quickly, it’s never going to be what it was, and I don’t mean that in a positive way.” At a news conference, Trump continued, arguing, “I just think it’s changing the culture…and I think they better watch themselves because you are changing culture, you are changing a lot of things.” Philip Bump, a national correspondent for The Washington Post, pronounces this to be “white-nationalist rhetoric,” and denies not only that immigration has any effect on existing culture (unless it is making it better, of course), but also that Europe even has a culture (besides mayonnaise on white bread and other bland things, of course). Apparently, Western Civilization is now nothing more than white nationalism, and anyone who wants Europe to remain European is nothing more than a white nationalist.
“First Man,” a true-to-life biopic about Neil Armstrong during the Apollo-11 mission, was derided by The New Yorker’s film critic, Richard Brody, as a “right-wing fetish object…a film of deluded, cultish longing for an earlier era of American life” which is “whiter than a Fred-and-Ginger ballroom set.” Meanwhile, Brody applauds “Hidden Figures” (a laughable “affirmative-action” falsification of history that looks like it should have aired on the Hallmark Channel) as “a subtle and powerful work of counter-history, or, rather, of a finally and long-deferred accurate history,” and seriously believes that “were it not for the devoted, unique, and indispensable efforts of three black women scientists, the United States might not have successfully sent people into space or to the moon and back.” At least “Hidden Figures” was trying to tell a wholesome story, however, unlike most of the degeneracy out of Hollywood. Due to the highly graphic and exploitative simulated sex scenes on “The Deuce,” HBO’s new show about the mainstreaming of pornography in the 1970s (finally, someone is telling this long-overdue story!), Rolling Stone reports that the network is hiring an “intimacy coordinator” to keep the actresses “safe.” (Remember, the U.S.A. has a holy mission to evangelize its self-evident “exceptionalism” to the heathen world.)
David Greenberg, a Rutgers historian reviewing David Frum’s Trumpocracy: The Corruption of the American Republic in Yale Alumni Magazine, is praising neo-conservatives for their integrity. “Writers like Max Boot, Eliot Cohen, Jennifer Rubin, James Kirchick, Bret Stephens, Cathy Young, and Bill Kristol have resisted the pressures of tribalism at a moment when few dare to leave their partisan tents.” (Does Greenberg not get the joke? These neo-cons are all highly ethnocentric – that is, “tribalist” – Jews who, as pundits, act as a fifth column in their host country for racial and religious hardliners in their home country, Israel.)
Kevin Sullivan, a senior correspondent at The Washington Post, is touting McAllen, Texas, as “an all-American city that speaks Spanish.” McAllen is the site of a U.S. Customs and Border Protection facility where “families crossing the border illegally have been separated and children have been housed under the administration’s ‘zero-tolerance’ policy” (translation: adults and children are detained separately for a brief period of time while the adults are legally processed). Sullivan found that the people of McAllen, who are 84.6% Latino, cannot comprehend why anyone would worry about immigration. “Immigration isn’t a problem for this Texas town,” remarks Sullivan, “it’s a way of life.” (Perhaps because, in recent years, McAllen, supposedly a city of the immigrant future, has been ranked the least-educated city, a city with the highest poverty rate in its class, the worst city for residents feeling safe, the worst city for young people finding work, and the third most-obese city.)
Alexis Grenell, a Democratic-Party strategist writing in The New York Times, is attacking white women as “gender traitors” for not showing “solidarity” with non-white women and for making a “blood pact” with white men. According to Grenell, white women, who “put their racial privilege ahead of their second-class gender status,” are “expected to support the patriarchy by marrying within their racial group, reproducing whiteness, and even minimizing violence against their own bodies.” Grenell sneers at the traditional (or “patriarchal”) belief that women are to be “cherished and revered,” arguing that they are instead “denied basic rights.” (As evidence of this dystopia, Grenell cites “The Handmaid’s Tale,” which is a sort of feminist-1984.)
When John McCain, the U.S. Senator from Arizona who made a career out of disloyalty to party and country, died after clinging to power for a year, his public funeral became an obscene display of political grandstanding, with melodramatic and grandiloquent eulogies not just from ex-Presidents, but from the media (which McCain often described as his “base”). Tribute was paid not only to McCain the soldier (he fought bravely in the Vietnam War and then spent his life entangling new generations in new Vietnams), but also to McCain the statesman (that is, his espousal of schmaltzy, shlocky immigration myths and his advocacy of chauvinistic, jingoistic imperialist policies). “We gather to mourn the passing of American greatness,” intoned his daughter, who co-hosts The View, “the real thing, not cheap rhetoric from men who will never come near the sacrifice, those that live lives of comfort and privilege while he suffered and served.” (Is she talking about Donald Trump, or Joe Biden, George W. Bush, and Bill Clinton?)
Amy Harmon, The New York Times’ science reporter, is criticizing advances in scientific research on DNA for “carrying with it the inescapable message that people of different races have different DNA.” Geneticists themselves are eager to deny that their research, which demonstrates that race is based in human biology, means what it manifestly means. While Harmon’s predecessor, Nicholas Wade, once explained the politically incorrect results of genetics research with honesty, she cries, “Can Somebody Please Debunk This?” in response to such inconvenient truths. (When “climate skeptics” criticize scientific research which conflicts with their ideologies and policies, that is “science denialism,” but when Harmon does it, that is debunking “white supremacism.”)
As Harmon’s interviews with geneticists reveal, postmodernism is now beginning to infect the sciences, but the humanities have long been irredeemably contaminated “hot zones.” Donna Zuckerberg, the editor of the journal Eidolon and the author of Not All Dead White Men, is attacking the Classics. “Classics as a discipline has deep roots in fascism and reactionary politics and white supremacy,” argues Zuckerberg, who calls for “a Classics that is ethical, diverse, intersectional, and especially feminist.” Zuckerberg and the contributors to Eidolon are constantly struggling between the contradictory-yet-complementary goals of deconstructing or denouncing ancient Greece and Rome. “Ancient Athens, though a democracy, does not fit a narrative of openness to immigrants and refugees no matter how we try to dress it up,” frets one contributor after reading about the nativist Periclean Citizenship Law of 451 B.C. “What, then, is a woke classicist to do?” Her answer is “to stop pretending that the worst thing the Athenians ever did was to execute Socrates” and to openly engage “the true dark side of Classical Athens’ anti-immigration policies and the obsession with ethnic purity that lies at the heart of its literature, history, and philosophy.” (I majored in Classics because it was a way to study the founding history, literature, and philosophy of Western Civilization at the same time, but now I realize that I should have majored in something that I hated in order to subvert the subject.)
The academia and media of 1852 are paragons of intellectual integrity and responsibility next to the ideologues and partisans of 2018, who have mastered the Ministry of Truth’s practices of “doublethink,” “duckspeak,” and “crimestop.”
Miles identified two overarching fallacies and heresies which had captivated and contaminated public opinion. First? “That a republican form of government is not only the best form of government, abstractly, but that it is necessarily, everywhere and always the only good, and tolerable, and true form of government.” Second? “That liberty is the birthright of mankind.”
“Republican Government not Everywhere and Always the Best”
Miles was quick to clarify that while republican government (which he defined broadly as “self-government”) was not the only acceptable form of government and not a universally applicable form of government, it was certainly the only form of government acceptable and applicable to Americans:
And this is a notion so widely spread among us, that I fear that in stating it as an error – even before this intelligent and educated audience – I will excite at first some little surprise. But I do not fear that you will misunderstand me. As an American addressing Americans, it is hardly necessary for me to say that I believe our form of government to be not only the best possible form for us, but really the only one that could ever measurably secure to us those blessings whose security must be the prime object of all government. No thinking man, whatever abstract theory he may hold – whatever may be his predilections and prejudices – can seriously doubt this. In fact, there is little room for choice. For ruling the Anglo-Saxon race, no national constitution which does not recognize the great principles of the responsibility of the rulers to the ruled, and the right of the citizen to assist in framing the laws and to tax himself, can be available. The English Constitution and our own – parent and child historically – are really the only two which practically do this, and are consequently the only two between which even the speculative theorist would have to decide. But a government similar to that of England cannot be formed by legislative enactments and a paper constitution. We cannot create a ruling dynasty, nor, its necessary support, a hereditary nobility. They are not matters of a few generations, nor of a few centuries. A thousand years would scarcely be sufficient to give them that hold upon national sentiment which would ensure stability. No one, therefore, whatever may be his theoretic views, can believe a monarchical form of government practicable or possible in our country. It remains, then, that our republican form is, as I said, not only for us the best, but the only practicable one.
The origins of the U.S.A. as British colonies demonstrate why self-government was right for Americans. “Our own great experiment in America had everything in its favor, whether we look to the character of the people, the geographical position of the country, or the adventitious circumstances of the case.” For one, Americans, living in colonies across the Atlantic Ocean, were remote from their mother country and any sort of central government. “Three thousand miles of ocean – equivalent to almost thrice that distance in these days of rapid steam navigation – separated us from the intriguing influences of jealous and powerful governments.” For another, aristocratic and ecclesiastical orders had never been established in the American colonies, nor was there an industrial, urban proletariat. “The nice and perplexing questions involved in the adjustment of the claims of antagonistic orders and classes – of immense and conflicting social interests – of the insolent, apathetic rich on the one hand, and the hungry-eyed, desperate-minded poor on the other – all those momentous and terrible problems which impede reform and fetter progress in the old and densely crowded communities of Europe, were with us happily and entirely wanting.” As a result, Americans were accustomed to governing themselves by their own traditions and institutions, and while distance made the governmental authority of their mother country weak, Americans still strongly identified with the heritage of their mother country, and the traditions and institutions by which they governed themselves were ones which they had inherited from her. 
“The Anglo-Saxon race seem to possess in an eminent degree those qualities requisite for national self-government: an inflexible love of justice, great tenacity of purpose; a certain instinctive reverence for existing institutions which makes them averse to fickle changes; a constitutional equanimity and moderation; in a word, a steady equilibrium resulting from the due mixture of attributes peculiarly their own, sound judgment and practical common sense.”
By their historical experience, therefore, Americans had developed the characteristics and capacities which made self-government possible:
Never perhaps before in the history of the world had such an opportunity been afforded – never perhaps in history will such an opportunity again offer – of trying fairly and thoroughly on a noble and majestic scale, the theory which throughout all ages – from Plato to [Algernon] Sydney – had been the cherished dream of philosophers and statesmen. We have tried it. And yet even among us there are not wanting wise and good men who look upon our experiment as still but an experiment. Our existence as a nation has been but for little more than three quarters of a century (a very small fraction in the life of a people) and already there are distracting forces at work which not only threaten to break up what the founders of it regarded as the essential framework of our government, but to convert it into an absolute democratic despotism in the hands of a numerical majority!
While conceding that self-government was, indeed, optimal for Americans, Miles held that it was far from perfect in its own right (for instance, it was too prone to the instabilities of “public opinion”) and that because of those problems (which only the most enlightened people were capable of overcoming) it was far from optimal for everyone else:
But while sincerely and firmly believing this, I am by no means prepared to allow that what is best for us is always and everywhere possible. A republican form of government is morally and intellectually the highest form, inasmuch as it presupposes the highest moral and intellectual development of the people. Where, therefore, such a development exists, or rather such an approximation to it as human frailty admits – there the people are capable of self-control and self-government. But even there, from its very fundamental assumption, it lacks some of those counterbalances and checks which exist in governments founded upon a lower theory of the perfectibility of human nature. It is more liable to sudden changes. Its moral tone is more directly influenced by the prevailing modes of thought and manners of the people. It wants more of that innate, recuperative force, which after every departure can bring it back to, and make it conform with, the original type. So that waiving the discussion of the general question, as to which is upon the whole the best theory upon which to construct the constitution of a government – we are at least safe in assuming that it is not every condition of a people – nor even every people in their best condition – which admits of or can support republican or self-government.
In Miles’ opinion, the fate of France “stands out as a beacon to warn those who too rashly and enthusiastically assume that the republican form is the truest or the simplest.” The French Revolution, which began as the long-overdue reform of the government into the First Republic, degenerated into the Reign of Terror, in which counter-revolutionary classes were mass-executed and pre-revolutionary culture was mass-purged. The Reign of Terror was overthrown by the Directory, which was less diabolical though no less dictatorial. The Directory was overthrown by the militaristic Napoleon Bonaparte, who quickly rose from consul to emperor amid the disorder and came very close to conquering all of Europe. Napoleon was ultimately defeated and deposed, and the Bourbon dynasty was restored to power. The Bourbons, however, were soon overthrown in favor of the “Citizen-King” Louis-Philippe (from another branch of the same family), who was soon overthrown in order to establish the Second Republic, which was soon overthrown by Napoleon III (the nephew of Napoleon Bonaparte). “What will be the next act in the drama or the farce, no one can with any confidence predict,” remarked Miles. “The next mail may bring us information of some new emeute [riot] which has entirely overthrown the existing order of things.” (In fact, in 1870, during the Franco-Prussian War, Napoleon III would be defeated and captured in battle, leading to his exile and the establishment of the Third Republic, which survived World War I but not World War II.)
No other country had contributed as much to Western Civilization as the French, though for all of their achievements, they had not developed the particular qualities necessary for self-government:
So much for the French attempts at republican government. And yet what nation stands higher than France in civilization, intelligence, and refinement? What people have contributed more to the advancement of science, philosophy, and all the social arts? Into every department of human knowledge – into every region of thought and speculation, they have pressed forward among the foremost. But they have yet to learn the art of self-government.
After Christian evangelicals and Enlightenment ideologues began forcefully criticizing slavery in the mid-1700s, Southerners responded, no less forcefully, with what they called “the pro-slavery argument.” First, they argued, slavery was a system of labor not just recognized, but in some cases protected, under the Constitution, which contained no delegated and enumerated powers over slavery. Second, they argued, slavery grew out of racial inequalities between blacks and whites, manifested in the contrast between the primitive conditions of the former in their native Africa (or American freedman communities) and the civilized conditions of the latter in their native Europe (or worldwide colonies). Third, they argued, slave societies – by making all freemen equal to one another regardless of birth or wealth, treating liberty as a privilege rather than a right, and combining capital and labor into a single class with a common interest – were morally superior to free societies. According to William Sumner Jenkins, the author of Pro-Slavery Thought in the Old South, this second “ethnological” argument for slavery was crucial to the supporting constitutional and moral arguments for slavery. Miles completely agreed, of course (this was a day and age of literal white supremacists, not the lame bogeymen of today), but that was not his point in this particular speech. Miles’ point was that even among civilized “races” (by which he meant European ethno-nationalities, such as the “Anglo-Saxons”) there were still highly determinative cultural, social, and political differences. Even though he singled out France for the instability of its so-called republican governments, Miles (who was himself of Huguenot ancestry) did not intend to be condescending. On the contrary, republican government was not necessarily better than any other form of government – just better-suited to some people more than others, like tastes in fashion or food.
For example, Miles asked, why should Italy, Spain, England, and Lapland (i.e. Finland), all much-older countries with their own unique cultures, societies, and governments, have the same form of government as the much-younger U.S.A., the form of government of which was adapted from its own unique historical experience?
There is not any one specific form of government into which, as the bed of Procrustes, you can force the body politic. The form of government is but the outward development of the inner life of a people. It not only may differ – but must differ – with different people and different social organizations. Nations have grown great and powerful, and fulfilled their missions in furthering civilization and the elevation of man’s nature under various and opposite forms of government. The social and political requirements of one people are not less distinct from those of another than their physical requirements. The Italian or the Spaniard does not need, nor would he be nourished by, the beef and beer of the Englishman, nor would the latter thrive on the train oil [whale oil extracted from blubber] of the Laplander; so, too, the clothing requisite in the one case would be insufficient or an encumbrance in the others. How irrational and quixotic, then, must be the attempt to go about the world in a spirit of political propagandization, making proselytes to republicanism among the nations?
Miles cited Lajos Kossuth as an example of this political proselytism. Kossuth was a Hungarian émigré who traveled around the North, where he was hailed as a sort of European George Washington. “We know that he is a man of fluent eloquence, of fascinating address, of potent fortitude, possessed in a remarkable degree of what is certainly a main element in all true greatness, faith in himself and in what he believes to be his mission,” admitted Miles. “But to what extent is he a true representative of Hungary – a true exponent of her national sentiment?” The Bathyani and Esterhazy families (two of the most “powerful, influential, and patriotic in Hungary…prominent for their earnest and disinterested love of their country – for their zealous advocacy of liberal and enlightened measures likely to elevate and improve her condition”) disapproved of Kossuth’s international activities, for instance. Centuries before Europeans even colonized the American continent, Hungary had been ruled by aristocratic and ecclesiastical orders, and was united by marriage with Austria. The conflict between Austria and Hungary (the former “oppressive and exacting,” the latter “unruly and unreasonable”) was utterly foreign to Americans. “Is it not downright impertinence in us – ignorant of the political relations of the two countries – of their mutual obligations – of the very gist of the dispute – to thrust ourselves forward as umpires in their quarrel?” asked Miles.
The U.S.A. did not have any responsibility, much less any right, to intervene in Europe in order to separate Hungary from Austria. (In fact, in 1867, Austria and Hungary would, without any American interference, resolve their conflict by uniting as a “Dual Monarchy.”) Miles did not believe that Kossuth was truly interested in Hungarian independence, however, and suspected that the “Red Republicans, socialists and communists, abolitionists, free-soilers, and barnburners” who hailed him in the North would not have sympathized with him if he were. What Kossuth was promoting in the U.S.A. was, rather, “to revolutionize and democratize Hungary,” which would, in Miles’ opinion, be a gross violation of the law of nations and perhaps even tantamount to an act of war.
Miles was a proponent of a sort of “Prime Directive,” if you will, opposing any interference in the internal development of a foreign nation:
The form of government of a people ought not to be determined by foreign and extrinsic influences. I do not mean to lay down the broad principle that the intervention of one nation in the affairs of others is never justifiable or even necessary. But political crusades are impolitic and dangerous, and are apt to be productive of as small permanent results as the religious crusades of the Middle Ages. The true policy of a nation is to a great degree selfish. Let her chief aim be to conserve or to perfect her own constitution; to elevate and improve the condition of her own people. The finite intellect of man is incapable of marking out for them the destinies of all races and peoples. The Supreme Ruler of the Universe “shapes the ends” of nations as of individuals. In his good time he will elevate the earnest and struggling spirit to reach the light. Let us then, while thankfully enjoying a form of popular government, which seems most advisable for us, leave other nations to work out for themselves, as we have done, the problem as to what political form is best-adapted to their peculiar growth and development.
After World War I, World War II, the Cold War, and now in this post-1991 “unipolar moment,” when any government which resists democratic capitalism and managerial liberalism can be “regime-changed,” can Americans even comprehend Miles’ opposition to the U.S.A. interfering in the politics of other countries? Indeed, Miles’ concern that merely playing host and giving voice to an exiled dissident would be out of line must, to thoroughly jingoized and chauvinized Americans, seem “unpatriotic.”
A seemingly small story, completely ignored by the American media, is an excellent illustration of the U.S.A.’s retarded and suicidal foreign policy. With all the threats to American political unity, cultural identity, economic stability, and national security – indeed, literal “caravans” of illegal aliens marching toward our border – the government is preoccupied with securing foreign borders and selling arms to foreign governments.
Pres. Donald Trump’s National Security Adviser, John R. Bolton (just recently described by Rep. Jimmy Duncan of Tennessee as “far too eager for others to go to war so these chickenhawks can think of themselves as modern Winston Churchills”) has been on a diplomatic tour of the South Caucasus, visiting Georgia, Azerbaijan, and Armenia after traveling to Russia to announce that the U.S.A. was ending a Reagan-Gorbachev arms-control treaty which had significantly reduced the risk of nuclear warfare. Bolton was there to convince countries to cooperate with the Trump Administration’s heightened sanctions against Iran. “We want to put maximum pressure on Iran because it has not given up the pursuit of nuclear weapons,” Bolton told the U.S.-funded news service in Armenia. “It remains the world’s central banker of international terrorism, and we’re concerned about its ballistic-missile programs and its active conventional operations in Syria and Iraq and elsewhere.” Armenia has been trying to lessen its dependence on Russia by deepening ties with the European Union and Iran (with which it shares a southern border). Yet now Bolton is telling Armenia to isolate itself diplomatically and undermine its national interests by joining with the U.S.A. against the EU, Russia, and Iran. Armenia’s borders with Azerbaijan have been closed due to the ongoing territorial dispute in Nagorno-Karabagh, and Turkey has closed its borders in solidarity with Azerbaijan. If Armenia closed its borders with Iran to appease the Trump Administration, then its only trade route would be through Georgia to the north. In order to open up Armenia’s borders, then, Bolton recommended that the new prime minister “show leadership” and take “decisive action” to resolve the conflict with Azerbaijan, presumably in a way which benefits the latter, if past American positions are any indication. In return, Bolton suggested that the Trump Administration would be willing to “look at” arms sales to Armenia. 
“If it’s a question of buying Russian military equipment versus buying U.S. military equipment, we’d prefer the latter,” said Bolton. “And I think that it increases Armenia’s options when it’s not entirely dependent on one major power.” Bolton suggested the same thing in Azerbaijan, however, and, in fact, in the interest of finding any country in the area willing to support the U.S.A.’s 17-year occupation of Afghanistan, has recommended repealing legislative restrictions on aid to Azerbaijan and Turkey, which were put in place in 1992 due to their blockade against Armenia. Last, but not least, Bolton endorsed “the ongoing democratic changes in Armenia, that outline a more prosperous, freer, and more independent future,” and appraised Armenia’s chances of becoming a “stable democracy” as “really fundamental to Armenia exercising its full sovereignty and not being dependent on – or subject to – excessive foreign influence.”
Iran is not pursuing nuclear weapons, and has not since 2003 (and even then it was pursuing them for the same reason that nuclear weapons proliferated during the Cold War: “deterrence”). Yet despite the assessments of American and even Israeli intelligence (which have, in multiple “national intelligence estimates,” found no evidence of an active weapons program), American and Israeli politics demand jihad against Iran. The denuclearization accords reached at Lausanne were a triumph of old-fashioned realpolitik over Bushite “we-don’t-talk-to-evil” moralism, ensuring the dismantlement of Iran’s nuclear-weapons infrastructure in exchange for the removal of economic sanctions. The International Atomic Energy Agency, as well as all signatories to the deal, including even the U.S.A., had verified Iran’s ongoing compliance with the denuclearization process. Pres. Trump’s unilateral and illegal exit from the accords was a disgrace, further discrediting already-discredited American diplomacy and further destabilizing the already-destabilized Middle East. Iran is not “the world’s central banker of international terrorism,” either, which is yet another political lie contradicted by informed intelligence. Iran does support national-liberationist militias, such as Hamas in Palestine, Hezbollah in Lebanon, and the Houthis in Yemen, but there is a difference between the terrorist attacks that these groups carry out against invaders in their homelands and “international terrorism” as waged by al-Qaeda and ISIS. It is, in fact, Saudi Arabia and other Arab sheikhdoms such as Bahrain and Qatar (all American “allies”) which are the world’s central bankers of international terrorism, not only providing direct support to groups such as al-Qaeda and ISIS, but also building Wahhabist mosques throughout Europe and North America (creating cells and lone wolves which have terrorized cities like Brussels, Orlando, Paris, and San Bernardino).
As far as Iran’s military activities in Syria go, Iranian forces, unlike American forces, are there by invitation of the Syrian government, and also unlike American forces, have fought on the front lines against the international terrorists in Syria supported by American allies. In short, although Bolton accuses Iran of a “malign” role in the region, it is the U.S.A. which has been truly malign.
“After Bolton takes aim at Russia and Iran, is Armenia the collateral damage?” asks EurasiaNet. Other headlines were no less ominous, such as Asia Times’ “Trump administration plants U.S. flag in Armenia” and Asbarez’s “With Bolton’s Visit, U.S. Reasserts its Heavy Hand on Armenia.” The U.S.A. is attempting to interfere with Armenia’s national borders – Bolton warned that the Armenian-Iranian border would be “a significant issue” – and is expecting Armenia to risk reorienting its foreign policy to be subject to American priorities with little to nothing offered in return. Indeed, in his last interview, the U.S. Ambassador to Armenia took the Azeri-Turkish position that “any settlement is going to require the return of some portion of the occupied territories” – a position which Bolton, when pressed, did not disavow. That same ambassador also lectured Armenians that if they want the Armenian Genocide to be taken seriously by the rest of the world (which already takes it seriously, anyway) then they must join the U.S.A. (which is so beholden to Turkey that it refuses even to recognize the genocide) and other human-rights champions (like the House of Saud and the Likud Party) to sanction Iran (a policy which the rest of the world opposes). The possibility of arms sales to Armenia, though meant to be enticing, is instead disturbing, since the U.S.A. often uses arms sales to turn a country’s foreign policy to American interests, without any regard for whatever that country’s own interests may be. Bolton’s theory that buying arms from the U.S.A. as well as Russia would lessen foreign influence over Armenia is also incorrect: it would only increase the foreign influence in Armenia, as both countries fight for the money and the power. 
The possibility of accompanying American arms sales to Azerbaijan raises the alarm and could upset the parity of power in the region, as oil-rich Azerbaijan has a much bigger military budget than Armenia and would stand to benefit more from the entry of another seller in the arms market. Bolton’s warning about “foreign influence” interfering with “democracy” is meant to drive a wedge between Armenia and Russia, which are not just tied together by history and geography, but have influenced each other culturally and politically, and depend on each other militarily and economically. In response to Bolton’s criticism of “excessive foreign influence” in Armenia, the Russian Foreign Ministry quipped, “It would be good for John Bolton to ponder the meaning of his own words.” If Bolton finds Armenia resistant to his plans, however, he will not hesitate to do what other neoconservatives did to Yugoslavia, Georgia, Ukraine, and Kyrgyzstan: a “color revolution,” in which American intelligence and international NGOs ally with a country’s marginalized, alienated minorities to propagate disinformation and manufacture a crisis, with the ultimate goal of regime change. “A ‘Color Revolution’ in Armenia?” asked the U.S.-funded news service RadioFreeEurope. “Mass Protests Echo Previous Post-Soviet Upheavals.” In short, Bolton’s heavy-handed overture to Armenia is utterly self-serving and would politically and economically destabilize, culturally and socially disintegrate, as well as diplomatically isolate and militarily endanger Armenia.
What would Miles have made of a man like Bolton, whose public “service” has consisted of interfering in the affairs of other countries and entangling his own country in pointless conflicts? How does Bolton celebrate Memorial Day in good conscience?
“Liberty not the Birthright of Mankind”
“But forms of government, after all,” remarked Miles, moving on to his second great fallacy and heresy, “are of less importance than that which is ostensibly the end of all government – the true interests and wellbeing of the governed.” According to the Founding Fathers, the role of the government was to uphold the “liberty” of the people, but the meaning of liberty had been degraded by the individualist and egalitarian zeitgeist. Liberty once meant “freedom…of development, each in his legitimate sphere, and equality, as far as may be, in the conditions under which it is to be put forth,” but had now come to mean “a naked and absolute freedom from all but self-imposed control, and entire and unlimited equality in social privileges and political power.” This was nothing more than “selfishness and vanity – the one tending to overthrow every barrier to personal indulgence – the other begetting a self-complacency that makes each one’s individual opinion the sole and infallible test of truth and right.” With this anti-social reinterpretation of liberty and equality, society itself would “disintegrate and dissolve,” as “the mutual concessions and sacrifices upon which its healthy organization depends would be swept away and replaced by separate, discordant, and conflicting interests.”
Miles was not a “Quark,” labeling any and every change “the end of Ferengi civilization,” as simple-minded presentists, who have never had an unapproved or original thought in their whole lives, programmatically dismiss so-called reactionaries. In order to make any collective body work, whether a tribal village or a nation-state, all of its members have to believe that they are on the same team, in one way or another – that they all follow the rules of the group and that they all have the best interests of the group at heart/in mind. Without that sense of civic compromise and consensus, the members of the group will become atomized and isolated, and the group will decompose and collapse amid alienation and suspicion. The breakdown of society rarely ends in anarchy, however, because out of the chaos emerge revolutionary regimes which seize power, impose order, and punish their enemies. This is what happened, more or less, in 1789 France, 1917 Russia, and 1933 Germany.
“Political liberty, I say it boldly, is not an inalienable right, but an acquired privilege,” declared Miles. “To regard it in any other light is to lower its value and debase its nature.” Miles acknowledged that his statement was contradicted by the phrase “all men are created equal,” often plucked from the Declaration of Independence in lieu of an actual argument, but he rejected that phrase as a “monstrous and dangerous fallacy,” anyway. “But peace to the ashes of one who, with all his errors, still, in his day and generation, ‘did the state some service,’” Miles added, quoting Shakespeare in defense of Thomas Jefferson. Although the notion that “all men are created equal” was a fallacy “which melts away at the first breath of logic, which vanishes at the first glance of reason and good sense,” it was, nevertheless, becoming the universal standard for good government.
Indeed, the American example of self-government was “democratizing” and “revolutionizing” Europe. Classical liberals, such as the aforementioned Kossuth, were influenced by the American example and were attempting to introduce it in their own countries, albeit without much success. “Any people anywhere being inclined and having the power have the right to rise up and shake off the existing government, and form a new one that suits them better,” declared a young Congressman from Illinois in 1848 (ironically, Abraham Lincoln). “This is a most valuable, a most sacred right – a right which we hope and believe is to liberate the world.” In 1821, John Quincy Adams, serving as Secretary of State under Pres. James Monroe, gave this policy of leading by example its most famous formulation: the U.S.A. “goes not abroad, in search of monsters to destroy,” and while she is “the well-wisher to the freedom and independence of all,” she is “the champion and vindicator only of her own.” Today, of course, the soft glove of American influence has been replaced by the hard fist of American intervention: the U.S.A., drunk on victory after the fall of its international enemy, the Soviet Union, now compulsively turns to economic pressure, political agitation, and military action to “liberate” the world. To paraphrase Lincoln, Americans believe that they have the right to rise up and shake off the existing governments of any other people and form new ones that suit them better. To paraphrase Adams, Americans are monsters which go abroad in search of free and independent countries to destroy. That was the purpose of John Bolton’s aforementioned visit to Armenia, which the U.S.A. is now targeting as part of its program of global assimilation/homogenization.
As if Armenia, an ancient civilization which dates back nearly three thousand years, which was the world’s very first Christian state, and which has been influenced by the other ancient civilizations of Byzantium, Persia, and Russia, has anything to learn from the U.S.A., an upstart country which is, after not even three centuries, already in decline, if not collapse! Indeed, as if the Armenians, a people who outlived all of their neighbors from the ancient “Cradle of Civilization,” who survived the reigns of terror under the Ottomans and the Communists, and who just recently retook control of their government in a “Velvet Revolution,” have anything to learn from the Americans, whose dysfunctional politics are represented by the Stupid Party and the Evil Party shutting down the government – whose degenerate culture is defined by Sarah Silverman’s jokes and Nicki Minaj’s butt – whose unstable society is producing mass-shootings in schools and sanctuaries – whose imbalanced economy is built on importing “coolie” labor from the Third World and on banks printing money – and whose bloated, flaccid military cannot win wars anymore! In fact, the individualist and egalitarian ethos on which the U.S.A. was supposedly founded – which Miles knew was a mirage leading to licentiousness – is simply running its course.
The notion that “all men are created equal” was, in the classical-liberal 1800s, “the cornerstone of almost all the political fabrics which the restless imaginations of men, in so many countries, are striving to erect,” but Miles cautioned that any form of government based on the manifest myth of universal equality was doomed to end either in anarchy or tyranny. “Men are neither born free nor equal,” and while “they may, and ought to, aspire to be free – they cannot, nor is it desirable that they should all become equal.”
Equality was manifestly non-existent among every other living being, so why should it be any different among human beings?
Perfect equality is contrary to all the analogies of nature. “One star differeth from another in glory” [a verse from 1 Corinthians]. Animals of the same species vary in size and strength. No two trees or plants enjoy the same advantages of sun and soil, and they have not, in consequence, the same growth and luxuriance. Rivers differ in size and depth, and are tributary to one another. Throughout the entire cycle of the material creation we see contrast, difference, inequality. It is the connection of all the parts – the interdependence between the great and the small, the strong and the weak – that brings about the beautiful harmony of nature.
Likewise, men were born with “the most various and opposite mental and moral constitutions – all degrees of difference in ability and power – all shades of contrast in natural or acquired privilege.” Even if opportunity were equalized, outcomes would still be unequal: due to the inborn traits of men, which were not equally distributed, some would do better than others and inequality would, quite naturally, reestablish itself. “Non fit Mercurius e quovis ligno,” pronounced Miles, quoting an old Latin proverb (i.e. “A [statue of] Mercury is not to be fashioned from just any piece of wood”). As Miles put it, just as a statesman cannot be made into a ploughman “by the magic of any political legerdemain,” so a ploughman cannot be made into a statesman.
Miles concluded his speech with a fitting horticultural metaphor. “True liberty,” he argued, “is no exotic, no hot-house plant,” but rather “must be indigenous and spring from the soil…must be rooted in the nature, manners, and habits, no less than the thoughts and affections, of a people.” Liberty cannot be artificially cultivated “under the bell-glass of a mere written constitution,” but rather “must inspire the free air of its native plains…must expand under the genial warmth of its native sun…must be fanned by the sighs of patriots and watered with their tears and blood.” Quoting the English poet Henry Taylor, Miles explained that liberty was a tree which “sucks kindlier nurture from the soil enriched by its own fallen leaves.” Liberty cannot be transplanted “to an ungenial clime without its drooping and dying, or becoming dwarfed and insignificant,” and although it may be “scarred by despotic violence,” “nipped by the frosts of faint-heartedness and treason,” and “almost prostrated by the rude blasts of popular fury and passion,” as long as “its germ was in the soil – if it is no ‘chance-sown sapling,’ but a native of the land – it will grow.”
What Miles meant was that “liberty” (which he equated with “self-government”) was a sort of historical stage to which a people must rise on their own, and that revolutions which tried to bypass the process of natural development by historical experience would backfire. (Consider the U.S.A.’s wrong-headed and heavy-handed attempts to “democratize” and “liberalize” the Middle East, despite the fact that it is riven by ethno-religious tribalism incompatible with “liberal democracy.”) Some people would probably never rise to that stage of development, but that was not an injustice to be remedied by those who had. It was better to leave each people at their natural stage of development than to try and force a higher stage of development on them before they were ready. As Miles put it, “self-government” required “self-culture.”
Liberty was not just a technical matter of choosing and calibrating the right institutions, either, but required a preexisting political culture which would sustain those institutions. (Consider Liberia, a colony for freedmen founded on their mother continent, technically governed under a copy of the U.S. Constitution, yet which, 199 years later, is worse off than many other African countries and is still regularly propped up/bailed out by the U.S.A.) This is true of any organization, whether a neighborhood or a nation-state: the quality of the people is more important than the quality of the institutions, and, in fact, the former determines the latter just as inputs determine outputs. Indeed, cities like Baltimore have the same institutions as ever, yet demographic shocks have, in just fifty years, transformed them into slums notorious for corruption, poverty, and violence. As Miles put it, “self-government” required “self-control.”
In short, Miles meant that propositions on pieces of paper were not going to protect liberty on their own, but grew out of and were only as good as the people themselves. To return to his horticultural metaphor, liberty must be organic, so to speak.
“The Cant Phraseology of the Day”
Miles was one of the many Americans who had concerns about the effect of mass-immigration – “a serious and increasing evil,” as he put it – on political unity, cultural identity, and social stability, yet who, at least according to the Sunday-School story of American history, never existed until Donald Trump came on the scene. If Miles was worried that the Irish and the Germans might be incompatible with American political culture, then imagine what he would have thought of Somalis and the Hmong!
Scoffing at the “cant phraseology of the day,” Miles warned that the U.S.A., supposedly an “asylum for the oppressed of every clime,” was “in danger of becoming a sort of lazar-house for all the social and political diseases of Europe.” It was not just that immigrants introduced foreign ideas to their host country (which, in small numbers, would be manageable, if not desirable), but that they came in such large numbers that they made their ideas mainstream. “What is to be the fate of a nation whose nationality is daily diluted by such copious foreign streams, many of them but the drains and the sewers of the Old World, it is difficult to say,” wondered Miles. “Whether there is good enough to leaven the mass and keep it sound and wholesome – or whether the bad will gradually vitiate and corrupt the whole – time only can show.”
Miles had nothing against immigrants who, in addition to coming for a better life, would become citizens and contribute to their host country: “the honest, industrious, and order-loving emigrant who seeks on our shores a refuge from misery and suffering in his own land, and a new field for the earnest development of his energies; who grateful for the blessings we freely extend him, works quietly and heartily with us in elevating and improving the condition of what is thenceforth for us and for him, a common country.” Yet Miles would exclude immigrants who did not care how things were done in their host country and were mainly interested in patronage for their ethno-religious group: “him who thanklessly grasps, without any feeling of gratitude, the boon of civil liberty as if it were merely his due; who, uninformed and without preparation, at once thrusts himself with clamorous bullyism into every contest; and who, while casting a revengeful scowl across the waters at the rulers who he thinks have so long tyrannically kept him out of his natural inheritance, chafes at the restraints of law and society in his new home as partaking in some sort of the same tyranny.” Given “the truckling in so many of our larger cities to what is openly and unblushingly called ‘the foreign influence,’ and the attendant bribery and corruption which mark their municipal elections” (e.g. New York City, the corrupt, violent tribal politics of which are portrayed in Martin Scorsese’s “Gangs of New York”), Miles suspected that this latter class of immigrant – angry, entitled, and ungrateful – comprised “no small portion of those who flock to ‘the asylum.’”
If mass-immigration continued unregulated, without any qualitative or quantitative standards, Miles predicted that it would “denationalize us as a nation” and “degrade us as a people.” It was, therefore, the duty of “the educated and intelligent classes” (again, not the dopey, snarky “elite” of late-night show hosts and their audiences) to “say authoritatively these things shall no longer be.” Miles was not primarily interested in the mass-immigration of incompatible and antagonistic elements, however, but only regarded it as a contributing factor to the other problems which he had already detailed.
“Republican Government not Everywhere and Always the Best; and Liberty not the Birthright of Mankind” is a timeless speech which has a lot to say to those who are wise enough to listen to and learn from the past. Unfortunately, the two great fallacies and heresies which Miles tried to refute – universalism and equalitarianism – have, on the contrary, become dogma in the American civic religion. To presentists, such a speech must be a monument to the self-evidently backward head and heart of Miles’ time and place: such retrograde thinking – “isms” and “phobias” and “hate speech,” oh my! – must be ritually denounced and never reasonably discussed. At least since Abraham Lincoln, pointing-and-sputtering at anything which contradicts the Declaration of Independence (that is, the fatuous interpretation of that famous sentence fragment, “all men are created equal”) has been a favorite pastime in American political culture, yet it is not at all clear that the Declaration even means what it is commonly said to mean, or that the meaning now commonly ascribed to it is even true.
Indeed, Abraham Lincoln’s simplistic, sentimental understanding of that famous sentence fragment can only be sustained by a decontextualized and politicized reading of the Declaration of Independence. The purpose of the Declaration was not to christen a Hebraic-Puritan “city upon a hill” and “last best hope of earth,” or inaugurate a Marxist-Leninist “permanent revolution” led by an “international vanguard” – not a single delegate in Philadelphia was so commissioned by his colony or personally conceived of the document as such – but to serve notice to the British state and appeal to the British nation. The Declaration’s purpose was to promulgate an act of secession which had already been voted on in the Continental Congress, and thus meant no more than what its antecedent, the “Lee Resolution,” meant: “Resolved, that these United Colonies are, and of right ought to be, free and independent States, and that they are absolved from all allegiance to the British Crown, and that all political connection between them and the State of Great Britain is, and ought to be, totally dissolved.”
Anyone familiar with the intellectual biographies of Thomas Jefferson and John Adams, the two Founding Fathers most involved with the Declaration of Independence, knows that whatever they meant by “all men are created equal,” they did not mean it literally. Throughout Thomas Jefferson’s long life of reading, thinking, and writing (including his radical “libertarian” youth), he never wavered in the belief that the white and black races could not coexist freely and equally. “Why not retain and incorporate the blacks into the state?” he asked in Notes on the State of Virginia. “Deep-rooted prejudices entertained by the whites; ten thousand recollections, by the blacks, of the injuries they have suffered; new provocations; the real distinctions which nature has made; and many other circumstances, will divide us into parties, and provoke convulsions which will probably never end but in the extermination of one or the other race,” he answered. In addition to these “political” objections to equality, Jefferson added “physical and moral” objections, too. “It is not against experience to suppose, that different species of the same genus, or varieties of the same species, may possess different qualifications,” he argued. “Will not a lover of natural history then, one who views the gradations in the races of animals with the eye of philosophy, excuse an effort to keep those in the department of man as distinct as nature has formed them?” John Adams cheerfully mocked the prospect of other social revolutions invoking the supposed spirit of the American Revolution in support of their causes. “As to your extraordinary code of laws, I cannot but laugh,” he replied to his wife when she requested that the Continental Congress institute women’s rights. “We have been told that our struggle has loosened the bands of government everywhere,” he continued.
“That children and apprentices were disobedient – that schools and colleges were grown turbulent – that Indians slighted their guardians and negroes grew insolent to their masters, but your letter was the first intimation that another tribe more numerous and powerful than all the rest were discontented.” While Adams was merely teasing his wife for her feminism, he wrote a much more concerned letter to James Sullivan criticizing equal voting rights even for men alone: “Depend upon it, sir, it is dangerous to open so fruitful a source of controversy and altercation, as would be opened by attempting to alter the qualifications of voters,” he argued. “It tends to confound and destroy all distinctions, and prostrate all ranks, to one common level.” Jefferson and Adams, in correspondence with each other years after writing and signing the Declaration, did not explore the ever-expanding interpretations of “all men are created equal,” but rather agreed that “there is a natural aristocracy among men, the grounds of which are virtue and talents.”
Fire-Eaters like Miles knew what the Founding Fathers believed in and fought for, which was why they invoked their memory and laid claim to their legacy with such confidence:
Carolinians! Will you consent to this? Will you quietly and without a determined struggle allow this seal of infamy to be set upon you? Will you allow this stab to be made at the great principle of constitutional liberty, for which our fathers struggled so hard…and not throw your whole moral weight and force as a guard before it? Or is that principle no longer as dear to us as it was to the men of the revolution? Or in this utilitarian age is all principle sneered at as a “metaphysical abstraction” – and the profoundest questions in politics and constitutional law to be settled solely on the basis of dollars and cents? If so, let us pause and reflect; for all our institutions, our liberties, nay, our very existence are endangered. If so let us pause and reflect for we are already degenerating from the spirit of ’76! (Charleston, 1849)
Americans are expected to ignore the actual intentions and objectives of the Founding Fathers, and instead give the Declaration of Independence the same reading as Frederick Douglass, Martin Luther King, and Ta-Nehisi Coates, pretending that it establishes some mystical standard by which the U.S.A. is to forever judge itself (or, for neo-conservatives like Robert Kagan, Bill Kristol, and John Podhoretz, a mystical standard by which the U.S.A. is to forever judge others).
Are all men created equal, even? Of course not! At the micro level, every individual human being is different from one another, not only in a “nature” sense (in that each person is genetically unique – or unequal), but also in a “nurture” sense (in that each person has unique – or unequal – circumstances). “Equality of opportunity” is a fallback position for so-called conservatives faced with the obvious absurdity of “all men are created equal” but who want to keep kosher in their civic religion, yet just as a boxing match between a male heavyweight champion and a female pinweight challenger would not be considered “equal,” so “equal opportunity” requires that everyone be cut down to the same size and into the same shape – in other words, “equality of condition.” At the macro level, people also differ from each other in average ways: different population groups living in different parts of the planet have adapted differently to their different environments, which is the same process of evolution to which every other animal and plant species is subject. To put it one way, if intelligent alien life ever visits Planet Earth and researches human civilization, it is certainly not going to report back to its home-world that “all men are created equal.”
Are all men endowed with natural rights, even? Of course not! Enlightenment-era intellectuals like John Locke, who felt the need to construct theoretical justifications for the state, philosophized about “natural rights” and “social contracts,” yet their theories, in spite of their systematic logic and appealing principles, were but castles in the air. An individual’s “natural rights” are mere abstractions outside of a collective community which legally recognizes such rights. In that sense, then, rights do not inhere in individuals, but rather in a people; they are the particular values of a particular people, ideally embodied in their particular traditions and institutions. There is nothing “natural,” furthermore, about “rights” such as “habeas corpus” or a trial by jury: these are not the products of reason, but of experience, and they are not enforced by nature, but by law. Rights, in other words, do not – and cannot – exist in a state of nature, but only in a state of society.
Yuval Noah Harari, a historian at the Hebrew University of Jerusalem, sounds like he could have been channeling the spirit of George Fitzhugh in this passage from Sapiens: A Brief History of Humankind:
Is there any objective reality, outside the human imagination, in which we are truly equal? Are all humans equal to one another biologically? Let us try to translate the most famous line of the American Declaration of Independence into biological terms…
Harari’s argument, though reductionist and pedantic at times, is a necessary corrective to the superstition and fundamentalism that surrounds the Declaration of Independence. Similar correctives exist for other American symbols clouded in miasmas of mysticism, such as H.L. Mencken’s essay about the Gettysburg Address, Thomas Bailey Aldrich’s poem about the Statue of Liberty, Mark Twain’s parody of “The Battle Hymn of the Republic,” as well as the critical comedy of Ambrose Bierce and George Carlin. As the great Mencken observed, “One horse-laugh is worth ten thousand syllogisms.”
The Hamiltonian (de)construction of the Constitution, by writing out States’ rights, has subverted the very government – that is, the very federal republic, the very voluntary union – which its framers and ratifiers intended to establish in 1787-1789. Yet the Lincolnian (de)construction of the Declaration of Independence, by rewriting the American founding myth, has been even more subversive, mainstreaming a “Hebraic-Puritan” chiliasm and gnosticism. Combined, those twin acts of subversion have transformed the U.S.A. into a revolutionary state warring against its own nation and the world at large – a fate which Miles clearly foresaw and was willing to stop by any means necessary. “Revolution…is a serious thing, a terrible thing,” he warned, “but to noble natures there are things more serious and terrible than revolutions.” According to Miles, “The slow, undermining process by which the high spirit of a free people is sapped, their strength destroyed, their faith in themselves crushed out, their progress paralyzed, is far more appalling to the true statesman and patriot than the temporary, though critical, fever of revolution.”
It is hard to accept that the Democrats, “the Evil Party,” after campaigning for the past two years as “post-American,” if not “anti-American,” were rewarded by voters with a new majority in the House of Representatives. Perhaps endlessly gullible White Americans fell for the old-fashioned “bread-and-butter” economic and social issues, believing that they were electing moderates who were going to fund schools and parks instead of radicals who will end up voting with Rep. Nancy Pelosi and Sen. Chuck Schumer to abolish ICE and impeach Trump? Or did demographic changes in these swing districts reach tipping points in the past two years – the importation of a new electorate which a leaked memo from one of Hillary Clinton’s directors revealed is “a critical component of the Democratic Party’s future electoral success”? Or, worst of all, are White Americans so self-hating and ethno-masochistic that they do not care if they are literally replaced – if their past is erased and their future is aborted? It is probably some combination of all three. There was hardly the “Blue Wave” which the Democrats have been trumpeting for the past two years, although that is small comfort given the gridlock (goodbye, border wall) and grandstanding (hello, impeachment circus) to come.
It is not as if the Republican-controlled Congress did much to earn reelection, anyway. After all, what did they do for two years? They passed a temporary tax cut and failed spectacularly to repeal/replace “ObamaCare.” This lack of legislative achievements is mainly the result of Pres. Donald Trump's naiveté and the Republicans’ duplicity. Several times, for instance, Trump played Charlie Brown to the Republicans’ Lucy, with funding for the border wall as the football. Republicans shoveled money to every conceivable special interest and pressure group (especially the military-industrial complex) yet obstinately refused to fund a border wall. So Trump should be criticized for getting fooled so many times and coming away with nothing on his signature issue. He wasted crucial time on Rep. Paul Ryan’s unpopular and irrelevant “fiscal-conservative” agenda. The “Stupid Party” continues to find new ways to lose.
On other issues, such as broader immigration reform, there was never much hope of legislative progress anyway, given the cohort of Republican “cuckservatives” in the Senate (most of whom are now either retired or dead), as well as the Democrats’ electoral dependence on legal and illegal mass-immigration. Now there is certainly no hope of legislative progress, as Democrats in the House will not support any bill which does not include unconditional amnesty for the entire illegal population of the U.S.A. (recently revised upward from 11.3 million to 22.8 million).
Nevertheless, Trump can continue to accomplish a good deal unilaterally. He can keep renegotiating stupid American “free-trade” deals (which effectively subsidize foreign imports while penalizing domestic exports) to protect industries and communities from displacement. He can keep enforcing the laws at the border (which until now were more honored in the breach than in the observance) to deter the crime, poverty, and disease of illegal immigration. Unfortunately, Trump’s foreign-policy/national-security team is horrendous, made up of bloodthirsty neocons who think that the biggest mistake in the Iraq War was withdrawing too soon. Indeed, Trump in particular and the Republican Party in general is bankrolled by the casino magnate Sheldon Adelson, a hardcore Zionist who cares more about where the U.S.A. puts its embassy in his home country than the political, cultural, social, and economic crises in his host country. “Regime change” in Syria and Iran, while certainly desired by Trump’s neocons, may be off the table at the moment, but all it would take is another false-flag attack (like the “gassings” in August 2013, April 2017, and April 2018), to prompt another kneejerk retaliation and potentially ignite a wider war.
Trump’s prospective executive order on “birthright citizenship” is a good example of what he could still do even with a Democrat-controlled House. Birthright citizenship is, contrary to all the sputtering journalists and pundits, not in the Constitution: it is a highly interpretive reading of the Fourteenth Amendment pieced together by a series of disparate Supreme-Court rulings, some as late as 1983. Birthright citizenship has not only led to huddled masses of Mexicans sneaking across the border to have babies which they can use as “anchors” for the “chain migration” of the rest of their family (hence the term “anchor baby”), but also to “birth tourism” by wealthy Russians, Chinese, and Saudi Arabians, who visit the country as tourists in order to have babies who automatically inherit all the perks of American citizenship. For what it is worth, even the framers of the Fourteenth Amendment had nothing like birthright citizenship in mind. In short, birthright citizenship is an unlawful and harmful policy – and thus, fittingly, not once put to a vote, but rather ruled into “law” by judges. “We’re the only place, just about, that’s stupid enough to do it,” remarked Trump. If Trump issues such an executive order, it will immediately prompt lawsuits by pro-immigration activists like the ACLU, and although lower-court judges will eagerly strike it down, the Supreme Court (recently fortified by conventionally conservative “originalists” like Neil Gorsuch and Brett Kavanaugh) should uphold it, just as it earlier upheld Trump’s objectively legal “travel ban.” Trump is now in a position where he can do what the Democrats have been doing for decades and use the judiciary as a backstop for policies that cannot get through the legislature.
The Republicans not only gained seats in the Senate, but also replaced Republican cuckservatives who were disloyal to their party and their country (such as Sens. John McCain, Jeff Flake, and Bob Corker) with loyal Republicans. The only bad news in the Senate is that the cuckservative Willard “Mitt” Romney won in Utah. (Ironically, Romney, who relocated from Massachusetts to Utah in order to run among his Mormon co-religionists, won by employing the sort of “identity politics” for which he criticizes Trump.) Even with Romney’s virtue-signaling and tone-policing, however, a more cooperative Senate means that Trump should be able to get a stronger Cabinet confirmed. Forcing out Attorney General Jeff Sessions (who was not only the sole Cabinet member doing his job, but also one of the first Republicans to campaign with Trump back in 2015) because he perhaps erred in letting the “Russian-collusion” investigation get out of control, however, is not a good start.
The Democrats in the House may vote to impeach Trump, and while that will go nowhere in the Senate, just as the investigation of non-existent “Russian meddling” distracted and handicapped Trump, so an impeachment circus could further detract from his effectiveness. His “tweeting” would be endless, acrimonious, and useless.
A political realignment is underway. Starting with Barack Obama, the Democrats have been pursuing their “coalition-of-the-ascendant” strategy, which is, simply put, to abandon white voters (particularly ruralite white men), who are demographically “descending,” in favor of a majority-minority coalition (including suburbanite white women, however), who are demographically “ascending.” The Republicans, on the other hand, have wasted a lot of time with an outdated Reaganite strategy, pontificating about deregulation, privatization, and constitutionalism, which has little relevance to contemporary issues and fails to appeal to the prime Middle-American demographic which the Democrats have abandoned. Trump, however, has revolutionized the Republican Party, and is remaking it to be, in a word, more “nationalist” – that is, more “nativist,” “protectionist,” “isolationist,” and so on, all of which are healthy expressions of “patriotism.” Republican “inreach” to its white base is far more effective than “outreach” to non-white fringes, because making small gains in the share of the white vote is worth more than gaining much larger shares of the non-white vote. This is the so-called “Sailer Strategy,” named after VDare and Taki’s Steve Sailer, which columnist Ross Douthat recently referenced in the New York Times. The retirement of 45 Republican cuckservatives (in other words, the forfeiture of 45 incumbency advantages) was crucial to the Democrats’ midterm success, yet it may prove to be one step backward and two steps forward, as the party realigns to secure its future.
Nothing of value was lost, for instance, in the unexpected defeat of Rep. Steve Russell of Oklahoma. “America’s immigration problem is not with immigrants, but with Americans,” Russell recently declared in a speech applauded by The Oklahoman as “a reasonable approach to immigration.” According to Russell, the Founding Fathers always envisioned open borders for their country (proof of this is a poem on a plaque that was fixed to a stairway landing at the Statue of Liberty in 1903), and besides, without immigrants (who are clearly superior to stupid, lazy Americans), there would be no economic growth. Russell added that anyone who disagreed with him was probably “sitting on the couch eating his cheese puffs…pecking out hatred and vitriol.” A retired U.S. Army Lt. Col., Russell authored We Got Him!, a memoir trumpeting the capture of Saddam Hussein (who never posed a threat to, or stood much of a chance against, the U.S.A.) as “a triumph of military strategy” which “opened the door for the most recent and essential victory in the War on Terror.” The defeat of such feckless cuckservatives (who would never fund a border wall and end birthright citizenship, or disentangle and deescalate foreign conflicts) will make Republicans in the Congress stronger. As Commissar Ninotchka quipped of Stalin’s reign of terror, “The last mass-trials were a great success! There are going to be fewer but better Russians.”
Trump must continue to push – and be pushed, if necessary – toward the white-inreach strategy, because the alternative is national dispossession as the U.S.A. becomes a majority-minority country. The corporate media will continue to cry that such a strategy is “white nationalism,” but it is merely a reaction to the Democrats’ own strategy of non-white, if not anti-white, identity politics. Indeed, in an op-ed about the Georgia gubernatorial race titled “We Can Replace Them,” The New York Times’ columnist Michelle Goldberg states that while “America is tearing itself apart as an embittered white conservative minority clings to power, terrified at being swamped by a new multiracial polyglot majority,” voters can “show them they’re being replaced” by electing the black female candidate, Stacey Abrams. (Although the Democrats lost that election, legal efforts to reverse the result are already underway there, as well as in Florida.) A seemingly stupid, but actually quite sinister symbol of these left-wing identity politics is the “reimagining” of Norman Rockwell’s art, in which so-called artists replace his working-to-middle-class white characters with new non-whites. It is not enough, apparently, for modern-day art to reflect modern day diversity; old art which does not reflect the non-existent diversity of its time and place must be “retconned,” too. (If there is a better illustration of “presentism,” I have not seen it.)
While anyone white and right-wing who is concerned about mass-immigration’s threats to political unity, cultural identity, economic stability, and national security must carefully guard his or her language for fear of harassment, leftists like Goldberg may publicly vent their deepest, darkest hatreds and fears without any fear of personal or professional consequences. For instance, an “anti-fascist” mob recently showed up at the house of Tucker Carlson (a FOX-News host who has committed the civic heresy of questioning whether diversity really is “our greatest strength”) to chant and riot, nearly beating down his door in the process. No arrests have been made, even though one of the rioters mentioned a pipe bomb, while left-wing pundits like Vox’s Matthew Yglesias and Think Progress’ Adam Peck are in such a privileged position that they actually applauded the mob. No neo-Nazi mobs are going to show up at Goldberg’s house performing anti-Semitic chants and beating down her door, but if one did, it would be swiftly crushed by the police and unanimously condemned by the media.
Yet even in the face of this racial/civilizational revolution and war of state against nation, incompetent, cowardly, or corrupt cuckservatives (who at this point are too contemptible to be named) continue to wring their hands and furrow their brows about “tribalism” among white right-wingers. To them I say simply that for every action there is an equal and opposite reaction.
“The Declaration of Independence is to be taken with a great qualification. It declares those men have an inalienable right to life; yet we hang criminals – to liberty, yet we imprison – to the pursuit of happiness, yet we must not infringe upon the rights of others. If the Declaration of Independence is taken in its fullest extent, it will warrant robbery and murder, for some may think those crimes necessary to their happiness.” – Rep. Joseph Clay of Pennsylvania on the slave trade (1806)
“I should not have noticed this strange and ridiculous vision, that the Declaration of Independence was a decree of universal emancipation, had it not issued from respectable sources, and been seriously enforced upon the credulity of the public. Instead of attempting to answer or refute these visions of a disturbed imagination, let us recur to principles and facts.” – Rep. John Holmes on the Missouri Crisis (1820)
“I cannot, in the first place, believe that Mr. Jefferson ever intended to give the meaning or force which is attempted now to be applied to this language when he said, ‘We hold these truths to be self-evident, that all men are created equal.’ I hold it to be a self-evident lie…I speak what is the judgment of all men, if they dare say it, that neither morally, mentally, socially, physically, nor politically, does equality exist in any country on the earth. It cannot exist in the nature of things. God himself has not created them equal. It is not, therefore, a truism, as Jefferson put it forth, but is false in form, and false in fact.” – Sen. John Pettit of Indiana on the Kansas-Nebraska Act (1853)
“The object of that war was to disenthrall the united colonies from foreign rule, which had proved to be oppressive, and to separate them permanently from the mother country. The political result was the foundation of a federal republic of free white men of the colonies, constituted, as they were, in distinct and reciprocally independent State governments. As for the subject races, whether Indian or African, the wise and brave statesmen of that day, being engaged in no extravagant scheme of social change, left them as they were, and thus preserved themselves and their posterity from the anarchy and the ever-recurring civil wars which have prevailed in other revolutionized European colonies of America.” – Pres. Franklin Pierce of New Hampshire in the State of the Union (1855)
John Dickinson, known as the “Penman of the Revolution” for his “Letters from a Pennsylvania Farmer” essays, as well as his “Petition to the King,” “Olive Branch Petition,” and “Declaration of the Causes and Necessity of Taking Up Arms,” ultimately refused to sign the Declaration of Independence. His objections had nothing to do with the ways in which the Declaration could be perverted by posterity, but were entirely pragmatic concerns about whether the American Colonies were prepared for war and whether their supposed allies in Europe would actually come to their aid. If he had any idea, however, that the Declaration – at the time, nothing more than an act of secession – would one day be cited to justify transforming his free and independent country into a “Colony of the World,” where immigration policy was conducted in the interest of alien immigrants first, economic policy was conducted in the interest of multi-national businesses first, and foreign policy was conducted in the interest of foreign states first – never “America First” – then Dickinson would not have been the only delegate who refused to sign it; not a single delegate would have signed such a suicide pact.
The so-called “equality clause” of the Declaration of Independence is neither about “equality” (at least not in the equalitarian “capital-E” sense) nor even a “clause.” Calling this phrase plucked from the second paragraph a “clause” is meant to invest it with constitutional authority – to make it a mandate, so to speak.
The Development of Neo-Conservatism in America
It is the neo-conservatives, or “neocons,” who want to make this “equality clause” a mandate. The neocons are a clique of ideologues who have become highly influential in both parties – the Evil Party and the Stupid Party (you figure out which is which). They are not normally politicians, but rather advisers and speechwriters influencing elected officials, or appointees to powerful but unseen positions. They are journalists spinning and slanting stories, pundits supplying prepackaged opinions and fueling ready-made outrage, financiers funding campaigns and foundations, wonks ensconced in think tanks inventing policy initiatives, and so on. They infiltrate institutions, subvert their values, and appropriate their identities.
“Neocon” is a term that most Americans first heard during the Bush Administration, when that clique was at the height of its power. Dimwitted “Dubya” was desperate to “do something” after 9/11, and the neocons were ready and waiting with a plan, the same plan that they had been pushing on presidents since the end of the Cold War: now that the Soviet Union was no longer a rival empire, the U.S.A. and Israel could unilaterally intervene in the Middle East. Paul Wolfowitz, Douglas Feith, Richard Perle, Elliott Abrams, and David Wurmser were a few of Bush’s highest-placed neocons, though many more were crawling in and out of the administration. Most of them were members of Project for the New American Century (a think tank which advocated “a Reaganite policy of military strength and moral clarity” and obsessively urged “the removal of Saddam Hussein’s regime from power”) and some of them were authors of “A Clean Break: A New Strategy for Securing the Realm” (the report of an Israeli study group which recommended reestablishing the “principle of preemption” to “contain, destabilize, and roll back” Iran, Iraq, Lebanon, Syria, and Palestine).
The neocons were the architects of “the War on Terror,” which should go down in history as the most notorious example of what the Old-Right thinker and writer Samuel T. Francis called “anarcho-tyranny” – a dystopia unforeseen even by Orwell, Huxley, and Bradbury, in which the government does not do what it should do but does do what it should not do. In other words, anarcho-tyranny is when the government, whether through mere incompetence or sheer malice, permits widespread lawlessness (“anarchy”) and commits abuses of power (“tyranny”). While the beneficiaries of anarcho-tyranny are the dysfunctional, dependent underclass and the established, elite managerial class, the victims of anarcho-tyranny are law-abiding, tax-paying, “Middle-American” citizens trapped between the anarchy from below and tyranny from above.
9/11 was committed by radical Sunni fundamentalists, so what did the neocons do? They declared war on the secular powers in the Middle East who had been on the front lines fighting radical Sunni fundamentalists for years. 9/11 was committed by illegal immigrants who overstayed their visas, so what did the neocons do? Instead of establishing an entry/exit visa system to prevent such oversights, they have repeatedly undermined the enforcement of immigration laws and even tried to sneak through amnesty. 9/11 was committed by Muslims from foreign countries, so what did the neocons do? Instead of using statistical probabilities to “profile” potential terrorists, they subjected all American citizens to a regime of snooping and bullying. A real “War on Terror” would have meant forming a grand alliance between Iraq, Syria, and Iran (a triumph of statesmanship of which the jingoistic, chauvinistic neocons would certainly have been incapable) to wage a moral crusade against the two chief destabilizers of the Middle East, Saudi Arabia and Israel.
The neocons were also the architects of the Iraq War – not just the waging of the war itself (although in many cases these eggheads and pinheads did indeed overrule the actual military brass), but more importantly, the case for the war. The fraudulent case for the Iraq War – that if Saddam Hussein, an evil dictator and state sponsor of terrorism, were overthrown, then the Middle East would become safer and freer – was the fault not only of the neocons’ arrogance and ignorance, but also of their dishonesty and disloyalty. Just recently, the political party of Moqtada al-Sadr (a radical Muslim cleric who led the insurgency against American occupation), won a plurality of seats in the Iraqi elections. The Iraq War is going to go down in history as the greatest catastrophe in the Middle East since the collapse of the Ottoman Empire.
Just recently, the neocons have been leading resistance to Donald Trump from within the Republican Party, not because they have any moral qualms about his character or his competence, but because Trump is an old-fashioned populist and nationalist who, in the course of the primaries and the general election, clearly rejected their designs of “world policing” and “nation building.”
The founding generation of the neocons came from two factions of the American Old Left: Communist exiles (specifically, Trotskyites who were ousted by Stalin in the power struggle after Lenin’s death) and Cold-War liberals (Democrats like Sen. Henry Jackson of Washington).
The proto-neocon Trotskyites were organized around Max Schachtman, a Polish-Jewish immigrant involved with the AFL-CIO and the Democratic Party. The far-left Schachtmanites, alarmed by Stalinist imperialism and “anti-Semitism,” supported the U.S.A. countering the USSR militarily and creating the State of Israel as a national home for Jews. The Schachtmanites began as Trotskyites who believed that “the maintenance of the dictatorship in one land was dependent on the extension of the proletarian revolution on a world scale,” but ended up as anti-Communists who believed that “U.S. power could be used to promote democracy in the Third World” (and thus supported the bungled invasion of Cuba and the Vietnam War). Interestingly enough, a superpower imposing its ideology on other countries is a constant throughout every phase of Schachtmanite thought, from hardcore Communism to hardcore anti-Communism.
The proto-neocon Cold-War liberals were organized around Leo Strauss, a German-Jewish immigrant who became a sort of rabbinical figure to “social democrats.” They were for supporting “the New Deal” at home and fighting Communism abroad, which to them meant not only militarily countering Soviet expansion, but also promoting mass-immigration to the U.S.A. in order to signal to the Third World that they were more humanitarian than the USSR. Irving Kristol and Norman Podhoretz, both Straussians, founded the now-neocon Commentary and The Public Interest as center-left, social-democratic publications for liberals who wanted “to return to the original sources of liberal vision and liberal energy so as to correct the warped version of liberalism that is today’s orthodoxy.”
There were two events in the 1960s and 1970s which resulted in these different factions of the Old Left coming together to reinvent themselves as “conservatives.”
The first was when the USSR became officially anti-Zionist, aligning with the Arab states against Western powers and their ally, Israel. In the 1967 war, for instance, the Soviets were on the side of the Arabs, whom they had been arming and advising, and represented them in cease-fire negotiations with the U.S.A. The 1967 war – and the fear of another “Holocaust” – reinvigorated the ethno-religious roots of these proto-neocons, who realized that they cared less about their ideology than they did their identity and that the future of their ethno-state depended on the U.S.A.
The second was when the “New Left” displaced the “Old Left” in the Democratic Party. The New Left cared less about the “hard” economic and social doctrines that animated the Old Left, and more about “soft” cultural issues such as diversity, pluralism, and tolerance. The candidacies of the anti-war George McGovern and the race-baiting Jesse Jackson disgusted these proto-neocons, who longed for the presidencies of Franklin D. Roosevelt and Harry S. Truman. In addition, and probably most importantly, the New Left was pro-Soviet and anti-Zionist, expressing solidarity not just with the Palestinians against Israel, but with any Third-World revolutionaries against capitalist and imperialist powers.
Reacting to Israel’s geopolitical situation and their loss of influence in the Democratic Party, the Old Left defected, so to speak, to the Old Right, where they immediately clashed with existing conservatives (now known as “paleo-conservatives,” i.e. “old conservatives”). These “neo-conservatives” (i.e. “new conservatives”) did not want to assimilate to the Old Right, but rather to deconstruct and redefine it into a host for their preexisting center-left, social-democratic, anti-Communist, pro-Zionist ideologies. The “paleo” and “neo” prefixes should indicate which side won.
While paleocons were traditionalists, neocons were ideologues. Paleocons believed, in Russell Kirk’s phrase, that they had a responsibility “to preserve a particular people, living in a particular place during a particular time.” Neocons, like right-thinking Marxists, believed in carrying certain principles – or “propositions” – to their logical conclusions, no matter the human cost. Paleocons believed, again in Kirk’s phrase, in maintaining “custom, convention, and continuity.” Neocons believed in the moral abstractions devised by “classical liberals” during the Enlightenment: “free minds, free markets, and free people.” Paleocons did not have detailed “policy prescriptions,” as did the neocons. As Kirk put it, conservatism was properly an “adjective,” describing “a state of mind, a type of character, a way of looking at the social order,” and was thus “the negation of ideology.” By contrast, the “think tank,” an industrialized ivory tower which manufactures opinions like a factory manufactures widgets, is a distinctly neocon invention. Neocons were “painfully deficient,” according to Kirk, “in the understanding of the human condition and in the apprehension of the accumulated wisdom of our civilization,” instead preferring “to engage in ideological sloganizing.”
The paleocons never really stood a chance against this Catilinarian conspiracy. The neocons were far shrewder and fiercer than the paleocons, who were, chiefly, scholarly and gentlemanly types unprepared for the neocons’ verbal brawls and power grabs. “Eager for place and preferment and power, skillful at intrigue, ready to exclude from office any persons who might not be counted upon as faithful to the neo-conservative ideology,” Russell Kirk summed up and put down the neocons. “Often, backstairs, they have seemed more eager to frustrate their allies than to confute those presumptive adversaries the liberals and radicals.” Indeed, this network of neocon pundits, politicos, professors, and outright plutocrats has remade the Right into something unrecognizable – a cartoonish band of free marketeers like Jack Kemp and Paul Ryan, armchair militarists like John Bolton and Dick Cheney, religious fundamentalists like Pat Robertson and Jerry Falwell, and amoral “consultants” like Karl Rove and Steve Schmidt (with the “traditionalist” paleocons blackened as “extremists”).
The fate of M.E. Bradford exemplifies the neocon takeover. In 1981, President Ronald Reagan announced that he was nominating Bradford, a professor of English at the University of Dallas, to chair the National Endowment for the Humanities. The neocons, who had long coveted the cultural influence of the NEH, organized a smear campaign against Bradford, calling him a “neo-Confederate” for identifying with his Southern heritage and criticizing Abraham Lincoln. The neocon columnist George Will was particularly snide and insincere in his attacks on Bradford. “I’m through,” Bradford told his friend and colleague, Thomas H. Landess. “If they want it that bad to do something like this, then let them have it.” Ultimately, Bill Bennett, a Democrat who was friends with Bradford’s detractors, was appointed instead.
Paul Gottfried, who also lost an academic position due to neocon lobbying, described the neocons as “ideologically motivated pursuers of power” who “have never made a secret of their fear and loathing of that part of the Right which they cannot reshape or convert to their views.”
The Lincolnian Revolution and the Renegade Neo-Conservatives
Before the neocons politicized American history, the Declaration of Independence was simply a significant, symbolic historical document with a clear textual and contextual meaning, to be studied by students and scholars as the culminating point of the American Revolution and honored by the public on the Fourth of July. Now it is endlessly “interpreted” and “reinterpreted” to divine mystical new “meanings,” as if it were the Talmud. The Straussian Harry V. Jaffa, repackaging the anti-slavery rhetoric of Abraham Lincoln for post-WWII Americans, invented what has become the kosher meaning of the Declaration, and by extension, the meaning of America herself – because countries cannot simply be “givens” anymore, but must be “propositions.”
The neocon interpretation of the Declaration is, in fact, derivative of Lincoln’s own reinterpretation, and thus descended from what the biographer Edgar Lee Masters called “Hebraic-Puritanism” – the Manichaean, Millenarian, and “Old Testament” worldview of the Puritans. By the time of the Civil War, this Hebraic-Puritan mission had been secularized, but had (and has) lost none of its crusading zeal. Accordingly, “The Battle Hymn of the Republic” quite literally sanctifies the U.S.A. and demonizes its enemies: the abolitionist author sees “the glory of the coming of the Lord” in the Union military (“burnished rows of steel” are a “fiery gospel”), which is marching forward to “trample out the grapes of wrath,” “crush the serpent,” “sift out the hearts of men,” and “die to make men free.”
According to Lincoln, the meaning of the Declaration is contained in the sacred words – “the Proposition” – that “all men are created equal.” From this sentence fragment, which he reinterpreted at once obtusely and abstrusely, the modern-day Lincolnites (i.e. neocons) have deduced, no less obtusely and abstrusely, what they call “the American Creed,” which proceeds as follows:
James Madison, Alexander Hamilton, and John Jay never mentioned any of the above in the Federalist. It is as if they never took any courses at Hillsdale College or PragerU!
All that matters to the doctrinaire neocons is that one sentence fragment from the second paragraph – no need to read the rest of the document, and really no need to learn about the historical context of the Colonial Crisis, the Continental Congresses, the Revolutionary War, or really to learn about any American history for that matter. Once “the Proposition” is understood, all other truths are, mystically, revealed. The neocons are always lecturing about the “lessons” and “laws” of history, like right-thinking Marxists, yet it seems that the only history they know is this sole sentence fragment from the Declaration of Independence, the Gettysburg Address, the “New Colossus” poem on the Statue of Liberty, and, of course, Munich and Auschwitz.
The neocon reinterpretation of the Declaration of Independence is a sort of historical fundamentalism – a simplistic and sentimental civic religion (or, to use a term more familiar to these Marxists, “false consciousness”) with little to no relation to human experience or even human reason. “They repeat, as the fundamental maxim of our civil policy, that all men are born free and equal, and quote from our Declaration of Independence,” Chancellor Harper explained as early as 1838. “It is not the first time I have had occasion to observe that men may repeat with the utmost confidence, some maxim or sentimental phrase, as self-evident or admitted truth, which is palpably false, or to which, upon examination, it will be found that they attached no definite idea.”
It is indeed simplistic and sentimental: all men are most certainly not created equal. As individuals, human beings each have different capacities and characteristics determined by their genetic inheritance. For instance, cognitive ability, or “IQ,” has been demonstrated to be strongly predictive of life outcomes, and while not entirely heritable, still strongly heritable. Even the idea of “equality of opportunity” (as opposed to “equality of outcome”) is an illusion: just as a footrace in which some runners started ahead and some behind would not be considered “fair,” true “equal opportunity” requires an equal starting point for everyone, or “equality of outcome.” Yet in spite of the empirical untruth of the proposition “that all men are created equal,” Americans are seriously expected to live and die by it simply because Thomas Jefferson, writing under the heady influence of the Enlightenment, pronounced it so. But Jefferson was a member of a committee tasked with drafting a public statement of a resolution that the Continental Congress had already passed, not a Promethean lawgiver.
According to John C. Calhoun, “that all men are born free and equal” was “a proposition which originated in a hypothetical truism, but which, as now expressed and now understood, is the most false and dangerous of all political errors.” Calhoun continued that he was “not afraid to attack error, however deeply it may be entrenched, or however widely extended, whenever it becomes my duty to do so,” and proceeded to refute this heresy in every particular:
Taking the proposition literally (it is in that sense it is understood), there is not a word of truth in it. It begins with “all men are born,” which is utterly untrue. Men are not born. Infants are born. They grow to be men. And concludes with asserting that they are born “free and equal,” which is not less false. They are not born free. While infants they are incapable of freedom, being destitute alike of the capacity of thinking and acting, without which there can be no freedom. Besides, they are necessarily born subject to their parents, and remain so among all people, savage and civilized, until the development of the intellect and physical capacity enables them to take care of themselves. They grow to all the freedom of which the condition in which they were born permits, by growing to be men. Nor is it less false that they are born “equal.” They are not so in any sense in which it can be regarded; and thus, as I have asserted, there is not a word of truth in the whole proposition, as expressed and generally understood.
All of the above is indeed contrary to the American civic religion of absolute individualism and egalitarianism, as preached by the Clintons and the Bushes, or the Washington Post and the Wall Street Journal, or Samantha Bee and John Oliver, or Jon Meacham and Eric Foner, or “The West Wing” and “The Newsroom,” but an “appeal to authority” is a logical fallacy (especially an appeal to such pathetic authorities). In what way was Calhoun wrong? “The act was, in fact, but a formal and solemn annunciation to the world that the colonies had ceased to be dependent communities, and had become free and independent States,” Calhoun wrote in his 1850 masterpiece. Even John Quincy Adams of Massachusetts, a proto-Lincolnite, quintessential Hebraic-Puritan, and one of Calhoun’s nemeses, did not disagree with the South Carolinian on this question. “The Declaration of Independence, in its primary import, was merely an occasional state paper,” Adams said on July 4th, 1821. “It was a solemn exposition to the world, of the causes which had compelled the people of a small portion of the British Empire, to cast off their allegiance and renounce the protection of the British king, and to dissolve their social connection with the British people.”
This politicization of the Declaration of Independence can be traced back to the Missouri Crisis. When Southerners pointed out that the Congress had no constitutional authority to prohibit slavery in a new State, Northerners like Sen. Jonathan Roberts of Pennsylvania and Sen. David L. Morril of New Hampshire pointed out that slavery itself seemed to violate the so-called “equality clause” in the Declaration.
Sen. Rufus King of New York, in particular, incensed Southerners with a forceful speech arguing that slavery was a violation of “the law of nature” and “the law of God,” which were, as he put it, one and the same. According to King, this law was “established by the Creator, which has existed from the beginning, extends over the whole globe, is everywhere, and at all times binding upon mankind: a law which applies to nations, because their members are still men; a law which is the foundation of all constitutional, conventional, and civil laws, none of which are valid if contrary to the law of nature; that according to this law all men are born free.” In his Second Inaugural Address (written by the neocon Michael Gerson, in consultation with other neocons and even the Israeli official Natan Sharansky), Dubya took this same ideology and extended it globally – a perfect example of what Burke, in reference to the Jacobins, called an “armed doctrine.”
Rep. John Tyler of Virginia (who would be President in 1841) argued that the phrase “all men are created equal” was a fallacy:
Gentlemen have exultingly read to us the Declaration of Independence. From it they have gathered that which, as an abstract truth, I am not disposed to deny: “That all men are, by nature, equally free, sovereign, and independent.” Can this proposition admit of application to a state of society? Does not this fallacy meet you in every walk of life? Distinctions will exist. Virtue and vice, wealth and poverty, industry and idleness, constitute so many barriers, which human power cannot break down, and which will ever prevent us from carrying into operation, in extenso, this great principle. Take this principle and preach it up to the monarchs of the world; will they descend from their lofty eminences, or raise mankind to a level with themselves? No, sir, the principle, although lovely and beautiful, cannot obliterate those distinctions in society which society itself engenders and gives birth to. Liberty and equality are often captivating sounds; but they often captivate to destroy. England had her Jack Cades and Levelers. Look, I pray you, to revolutionary France. These were the principles of that day. Mark the consequences! Murder and rapine stalked over the land, and the guillotine, the work, too, of a philanthropist of that day, was the sad monument of this fallacy. Liberty and equality was proclaimed by Robespierre and his associates, at the very moment that they were enriching the fields of France with the blood of her citizens. Nor was the doctrine confined to political institutions, but, advancing with a daring step, fought even with the Creator, and mocked at the immutable truth of religion. Turn your eyes also to South America. The throne of the Incas was washed from under them by the tide which flowed in from Spain. The native of the forest was deprived of his freedom, and made to toil for his new master. Then, too, sprung up a philanthropist, who claimed for the Indian an equal rank in creation with the inhabitants of Spain. 
His claim was admitted, and Africa mourned over the mistake, and her deepest curses may still be uttered against the memory of Las Casas.
“Although I do not believe that this principle of equality can be applied to man in extenso, yet I love it, and admire it as an abstract truth, and will carry it into operation whensoever I can,” explained Tyler. “If we cannot raise the black man up to the level with the white – and that we have not the constitutional power to do so none here have denied – let us raise, at least, the white man up to this level,” he concluded. “Extend an equality of rights to the people of Missouri.”
Sen. William Pinkney of Maryland argued that it was self-evident that the phrase “all men are created equal” was never meant to be taken so literally:
Of the Declaration of Independence, which has also been quoted in support of the perilous doctrines now urged upon us, I need not now speak at large. The self-evident truths announced in the Declaration of Independence are not truths at all, if taken literally; and the practical conclusions contained in the same passage of that declaration prove that they were never to be so received.
According to Pinkney, “the infinite perfectibility of man and his institutions, and the resolution of everything into a state of nature,” as preached by King, were “sentiments the most destructive, which, if not borrowed from, are identical with, the worst visions of the political philosophy of France when all the elements of discord and misrule were let loose upon that devoted nation.”
Sen. William Smith of South Carolina condemned King’s near-religious belief in natural law and natural rights, or “the religion of nature,” as a recipe for a revolution:
This religion of nature, and the application of it, which the gentleman has recommended for your consideration, is the very system which gave rise to that state of things which so lately convulsed Europe to its center. This was the religion preached up in the French Convention in the days of Robespierre. They, like the gentleman from New York, were not bound by written systems. They were too limited for the great projects of revolution. They presented in their gallery the Goddess of Liberty, draped in transparencies… declared the laws of nature to be the laws of God and of religion, by which all men were born free and equal. This theory intoxicated the nation, and the reform in their government, which was their great object, was lost in the designs of aspiring ambition; and the fairest portion of that nation was sacrificed on its altar. Robespierre paved his way with blood, until the nation sickened at the sight. In the midst of those scenes of horror and dismay, Napoleon, for the pious purpose of securing the liberty, and promoting the tranquility of the nation, assumed the reins of government, and in his career would have prostrated all Europe, if all Europe had not combined to prostrate him.
James Rutledge Roesch lives in Florida. He is a member of the Sons of the American Revolution, the Sons of Confederate Veterans, and the Military Order of the Stars and Bars, as well as the author of From Founding Fathers to Fire-Eaters: The Constitutional Doctrine of States' Rights in the Old South.