Would you agree that the Allies' treatment of Germany after World War I paved the way for Hitler and his government?
In some ways, yes, but I believe Hitler's rise to power, the Nazi regime, and ultimately WWII would have happened eventually regardless of how the Allies treated Germany after WWI. Hitler was no doubt one of the worst people in history, but he was cunning, charismatic, and knew politics very well. Unfortunately, he would have come to power eventually, even without WWI.
Unquestionably. It put unneeded hardship on the German people and bred resentment against the nations that forced it upon them. Granted, they would still have been dealing with the Great Depression, like the rest of the world, but it wouldn't have been as severe, and it wouldn't have given them a specific, ongoing reason for vengeance.