The West is far from being saved.
You guys cheer for Trump and Brexit as if it's a right-wing revolution, but it's not at all. People are noticing the increasingly destructive effects of SJWs and, to an extent, socialism, but that's all. We also have to keep in mind that the universities have already shifted the minds of millennials, and the only reason Brexit/Trump happened is that the adults of this generation are generally right-leaning. In about 50 years, when they're all dead and gone, the young socialists of today will become the overwhelming majority, and they will revert anything we do now.

Even now, most of the normal people who supported Trump and Brexit aren't even right-wing; they're just egalitarians. They're just moving center-left, which doesn't save the West, its culture, or its rapidly declining population. The only thing it does is preserve the free market and individual rights, which are very important, but they don't cover other important societal issues like first-generation feminism, sexual dimorphism, and the obvious and undeniable differences between the races. The only way the West will ever be truly, socially right is through a dark age or a serious time of hardship that forces the population to do what has fucking worked for eons, which is, of course, right-wing values.
What are your thoughts?