lol, care to explain that? Germany was treated very poorly. The main wound inflicted on Germany, the one that enabled Hitler to preach his nationalist message, was the seizure of its territory. Without that punishment, WWII would never have happened, Hitler would never have come to power, and the Holocaust would never have occurred. Our President tried to warn the French, but they did not listen. Had people listened to the US then, there would have been no WWII. :P (sorry, I had to put a nationalist spin on it for HoreTore's sake)