Woke Culture Is Killing America!
The World According To Ben Stein

2021-05-05
Woke culture is killing America. Kids are being indoctrinated in schools and through all forms of media at home. Adults are relinquishing their parental obligations and staying silent. And if this weren't frightening enough, critical race theory has now infiltrated the already-corrupted agencies responsible for protecting us. We are witnessing government-controlled subversion, and yes, your humble servant is scared.