Life in Germany After World War 2

Aug 12, 2022 | History, Military/War, Videos

Germany lost World War II, and as the defeated power, it faced heavy reparations and punitive restrictions. But what was life actually like in Germany after World War II?

Between the two world wars, Germany was one of the largest and most developed economies in the world. Hitler used the economic crisis that followed World War I to rally the people of Germany behind him.

He promised them jobs, and he delivered on that promise. But he created those jobs largely in the military sector, more precisely, in weapons manufacturing. After World War II, Germany faced strict limits on its military and weapons industries.

As a result of those limitations, the economic, social, and political situation in the country after World War II was far from bright.

One thing that was evident after the end of World War II is how Britain tried to teach Germany about democracy, economics, and politics.


Riyan H.