The Most Promising Post-Election Revelation: Dems, Media Are Never Going to Learn
When Donald Trump won the presidency in 2016, there was no shortage of things to blame that didn’t involve Hillary Clinton, her positions, or America’s disenchantment with the political establishment.
It was the Russians. It was fake news on social media. It was the “alt-right,” whoever they were. It was the Electoral College thwarting the will of the people’s popular vote. It was some unconfirmed tape involving Trump, some ladies of the night and an unmentionable bodily fluid that Putin may or may not have had. It was James Comey. It was any and all of these things.
Then, in 2020, the Democrats won, and all was right with the world, according to the Democrats and the media. The vote was free and fair, and America would heal. Forgetting the first part, America definitely didn’t heal — unless rampant inflation, attempted vaccine mandates and unprecedented illegal immigration are signs that the body politic is healing.