Here's a question for the political historians among us.
When I was in school, we learned about the "Solid South." It referred to the fact that ever since the War of Northern Aggression, the southern states almost always voted Democratic. I suppose it had to do with the fact that Lincoln was a Republican.
Now, it seems to be just the opposite. With the possible exception of Florida, the South, including Texas, pretty much votes Republican in national presidential elections. When did that change happen?
Was it during the 60s civil rights era, with Martin Luther King and LBJ's Great Society, or did it happen later? I can remember being told in the 70s that the Republican Party was very small and weak in Texas, and most of our governors were Democrats. I wasn't especially politically astute then, so I just didn't notice or pay attention. Now, though, the Republicans definitely have the upper hand.
I suppose I could Google it, but I thought it better to let someone more informed than I am answer this.