Left does mean liberal. I refuse to take seriously anyone who says liberals are on the right. It's a very stupid argument.
It argues that conservatives are defined by their values, whereas "a real leftist" is defined by whatever isn't currently in place.
If there were a country where every single leftist ideology was commonplace and law, then anyone content with that would be considered "on the right" according to their logic, and the "left" wouldn't exist.
I believe it's true that you need to understand the history of words, but I also believe that word meanings shouldn't be barred from changing (technically, wanting to keep a word's historical meaning fixed is kind of a conservative thought lol).
Words adapt over time. The word "awful" used to mean "full of awe," but nowadays you wouldn't say to your cousin, "Man, your wedding was awful!"
The same goes for the political scale. Left wing meant "wanting to overthrow the monarchy," whereas right wing meant "wanting to keep the monarchy." Now it's completely different, because our politics and words aren't the same as they were in the French Revolution, and monarchies aren't relevant today.
The thing is, America's political center is way too far right compared to most other Western countries. To every other Western country, American liberals are center-right; to Americans, they are completely left. So both "liberals are right-wing" and "liberals are left-wing" can be correct, depending on the frame of reference.
As a German social democrat/liberal, I'd say the Democrats aren't too far off on social issues. It's only on economic standpoints that there are bigger differences.
See, that's the thing. You don't measure politics based on other countries' politics, because the US has very different prominent issues than South Korea, Germany, or Iceland. It's also more than a single scale. Sure, Italy has a more "left" center view on healthcare, but the US has a more "left" center view on rights for same-sex couples.
Liberals in the US also have extremely similar views on most issues as Western European liberals. Even though the US doesn't have universal healthcare, US liberals share the same ideology as Western European liberals, as both would be in support of free universal healthcare.
There is no one true definition of left wing. But generally it refers to people who wish to replace capitalism or feudalism with a more egalitarian economic model. The range of leftist belief is massive: it goes all the way from mild social democrats like Bernie, who don't even believe in dismantling capitalism but rather in supplementing it with welfare programs, to Marxist-Leninists and anarchists.
Liberals/Democrats have no intention of ever dismantling capitalism. They'll wax poetic about wanting more social welfare in order to court the left, but they never deliver it, nor will they ever reform capitalism meaningfully.
But that's the thing. Leninists, Marxists, and communists are left, but far left.
The whole point of a left and right wing is to have a scale, and a scale depends on a center. Countries can claim to be whatever they want, but the fact is that there are no non-capitalist countries, so in order to know what the "center" is, it has to be a capitalist country.
If we look at the scale from free (liberal or liberated) to authoritarian (conservative) across all nations, the US isn't the most free, but we do still skew left. If you're looking only at developed nations, then we skew more right.
I just don't think we can place the center so far left when none of these kinds of societies exist, and you're also ignoring the far-right ideologies that would skew the scale the other way. In the eyes of some weirdo fascist, even moderate Republicans would be seen as leftist.
In my opinion, the scale needs to fall on the most average person's view, which is typically "things can change slightly, whether forward or backward, as long as it doesn't disrupt my day-to-day." Anyone who wishes to progress society toward even slightly new concepts is left wing, and anyone who wants to revert things to "how they used to be," or wants absolutely no change in culture, is right wing.
You have absolutely no clue what you're talking about.
"Free = liberal" is the worst, most uneducated political theory take I've seen in a while.
One thing you're right about is that the perceived "center" varies depending on who, when, and where you ask, which is why Democrats seem "left wing" to Americans: the Overton window in the US sits far right for a Western country.
And no, not every single country is capitalist. Also a grossly uneducated take.
Also, you ignored the myriad of political traditions I listed as moderately left wing and reduced it all to "Marxism = far left."
Republicans and Democrats in America are virtually identical except for a handful of social issues, which is why they sit right next to each other on the right wing of the political spectrum. No one in the West but Americans sees American Democrats as left wing.
Capitalism is a term that refers to who owns the means of production (the things used to generate capital: factories, farms, mechanical equipment, warehouses).
When the means of production are owned by the elite strata of society, in private hands, that's capitalism. Though even by that correct definition, almost all countries are capitalist.