It is arguable that the world would have been better off had Germany been the victor in WWI. A victorious Germany, once the war in the West had ended, would likely have crushed the Bolsheviks in Russia, sparing the Russian people, and later Eastern Europe, the pain and suffering that Soviet rule imposed.