Throughout its history, the automobile has played an important role in society and has transformed the landscape of the United States. Its development began in the late 1700s with early steam-powered vehicles; the gasoline internal combustion engine followed in the late 1800s.
By the 1920s, the gasoline-powered automobile dominated the streets of the United States and Europe. By the 1970s, however, American manufacturers were losing ground to Japanese automakers.