Since Liliaeth probably isn't over-familiar with American history, I'll step in and note that the KKK came out of the South about a century and a half ago, following the Civil War. Although it's true that the South was Democratic at the time, a lot has changed since then. The Democrats became the liberal party, and when it comes to race, it is now the Republicans, especially today, who harbor white supremacists.