The Myth of Germany as an Evil Nation
"The image of Germany as a sinister, predatory, warlike nation only took root in the twentieth century. Nineteenth century Germany, by contrast, was seen as a place of peace and enlightenment."
Although this article suffers, in my estimation, from its none-too-subtle anti-Jewish bias, what the author has to say about the demonization of Germany is squarely on the mark.
I have shared elsewhere on this blog ("Who's to blame for World War One?") how the idea that Germany is solely, or even primarily, to blame for the First World War is entirely counterfactual: the result of Allied propaganda created to mask the Allies' own complicity and to justify a horrific and completely unnecessary war. It is widely recognized by sober and objective historians that the rise of Hitler and the Nazi party was a direct, and perhaps inevitable, result of the draconian punitive measures leveled against Germany by those same Allies once victorious.
Prior to World War One, "Germany was admired by the world as a center of learning, for its high culture and for its achievements in every field; but also for its culture of honesty, hard work, orderliness and thrift, which existed even at the lowest level of society. British scholars and journalists had been very favorably disposed toward all things German, including their history, culture, and institutions throughout the nineteenth century," and "British author Thomas Arnold (June 13, 1795 – June 12, 1842) saw Germany not as a nation with a unique predisposition toward authoritarianism and regimentation, but rather as a 'cradle of law, virtue, and freedom,' and considered it a 'distinction of the first rank' that the English belonged to the Germanic family of peoples."
However, "This view of Germany was to change almost overnight with the outbreak of World War I. After the war began in 1914 a grotesque image of a rapacious, bloodthirsty and uniquely aggressive Germany quickly took form and became the stereotypical image of Germany in Europe and America. This new image of Germany was the direct result of a virulent anti-German propaganda campaign conducted by the British government and later joined by the United States government in which deliberate and systematic lies, distortions and false atrocity stories were disseminated to the British and American publics.
"The emotions of both the British and American publics were deliberately whipped up to a fever pitch of hatred for the “Hun.” A pathological hostility towards all things German, which later became such a familiar and integral part of Western thinking about Germany, had its birth in this skillful propaganda campaign. After World War Two, Historian Harry Paxton Howard examined this transformation of Germany’s reputation which began immediately after the start of WWI. It was made out, he said, that Germany was not only evil but had always been that way, and that Germany, contrary to the facts [emphasis added], had always been the historical enemy of Europe and America."
The lingering effects of this villainization, which appears to have been largely internalized by the German people themselves, can be all too plainly seen in the paralysis gripping Germans today in the face of mass immigration, and even in the impetus to allow that mass immigration in the first place. There seems to be an almost masochistic desire to avoid even the semblance of "racism" or "intolerance," even in the face of an obvious existential threat, which prevents Germans from facing up to the perils of runaway Islamic immigration and formulating a clear-eyed and constructive response to it.
The result – pending a popular re-awakening to the fact that Germany is not and must not be defined by a single brief period, however terrible that period may have been – is that the Germany of kings and kaisers, of Bach and Beethoven, of Goethe and Gemütlichkeit, and of much else that is good, true, and beautiful, is well on its way down the road to self-immolation.
It is sad to see, in a nation which was once so great.