I believe Europe had trouble understanding what the new "democratic" era at the beginning of the 1900s actually meant. The old way of kings and emperors dividing land and people between each other as they liked was over, and the peace terms after WWI didn't just offend a spoiled royalty, but the entire German people. This made it impossible to reconcile the peoples of Europe, and WWII was all but inevitable.
A note should be made here that the US actually respected the German people after their defeat. I believe the US war effort in WWII was rather marginal, while its rebuilding of Germany saved Europe from a WWIII. (We leave the Cold War out of this, since that was not so cleverly handled.)

So, bottom line: WWII was to be expected after WWI.