‘New’ Germany: Disturbing Direction or Revitalized Nation?

Postwar Germany, traditionally viewed as a bastion of consensus politics and democratic liberalism, is very much a changed nation; one could call it a “New Germany.” Many Germans feel that the country is moving in a disturbing direction, while others see the signs of a more assertive, revitalized nation.