Discussion about this post

Johann Goergen

The West (i.e. the USA and its various vassals) was always closely aligned with ideas of fascism, racism and colonialism. That is not new. I grew up in Germany and studied the history and end of the Third Reich closely. I was always fascinated by the fact that the late leaders of this death cult were sure that they could make peace with the U.S. and G.B., as these were similarly fascist countries. They already understood how things really worked. The U.S. since WWII has easily been the greatest purveyor of violence around the globe. But as you say, the average Westerner is saturated with Capitalist State propaganda and does not directly experience said violence. Plus there are many distractions. What has changed in the last 20 or so years is that other, more humane societies have gained the military power required to resist the fascist, so-called democracies of the West. The outcome of this is uncertain, but if history is any guide, a certain amount of pessimism is in order.

Gary H

Revelation 13:11 KJV

And I beheld another beast coming up out of the earth; and he had two horns like a lamb, and he spake as a dragon.

2 more comments...