How was America transformed by the New Deal?

1 Answer
Jan 25, 2018

The role of the federal government changed drastically.

Explanation:

The New Deal marked the beginning of a new era. America was founded upon the principles of liberty and personal responsibility, which implied self-help and individualism. The New Deal broke with this vision by introducing a much stronger role for the federal government in the economy.

Earlier periods had seen comparable transformations (the Civil War and the Progressive Era), but the New Deal established a lasting trend: it inspired liberals such as Kennedy in the sixties and still shapes the Left today.