For as long as I can remember, there has been a ritual declaration spoken after every election: "America is a center-right nation." It doesn't seem to matter who wins or by what margin; the refrain has become something of a tradition among the pundit class, and traditions are not to be dispensed with lightly.
Yet I submit that it has become increasingly clear that America is, in actuality, a center-left nation.
Now, to some extent, this depends on your baseline. Compared to Sweden, we're still quite conservative; compared to Russia, we look a lot more progressive. But judged on the general spectrum of American politics, the fact is that Democrats have won the national popular vote in seven of the last eight presidential elections. A Republican has won a popular plurality twice in my lifetime, and one of those times was when I was two years old. Certainly, the margins aren't overwhelming, and even the median "Democratic voter" does not seem to want the sort of full-throated left-progressivism that some activists would prefer. But given a choice where their voices count equally, Americans have been relatively consistent in their preferences over the past few decades: they want to be led by Democrats -- not necessarily the most progressive wing of the Democratic Party, but Democrats. Hence: center-left.
It would be nice if, somewhere between the seven hundredth and eight hundredth essay on what Democrats need to do to reach out to Trump voters, the media spent some time internalizing this state of affairs, and contemplating what it means for a GOP whose response to it has been to dispense with the idea that it should be forced to do anything as crass as "win more votes" in favor of burrowing ever deeper into anti-democratic quasi-authoritarianism.