In the United States, it has long been common for political terms to deviate considerably from their original European meaning, or even to acquire entirely new meanings. After the New Deal, for example, liberal came to represent those who advocated social democratic policies (e.g., universal health care, income and wealth redistribution, social security, strong labor unions) in America, while the same word has long denoted a more center-right political ideology elsewhere in the world (think Adam Smith-style classical economic liberalism).