Most Americans are aware that France helped us win independence from Britain, but few appreciate the profound influence France continued to exert during our nation’s formative years. In its infancy, the United States was a scarecrow of a country — mostly farms and a few hamlets scattered among diverse, semi-independent states — with meager economic development.