I think the South is the least racist of all the regions in the US, simply because whites here are more prone to actually living around, interacting with, and growing up around blacks. It's not like that outside of the South. Whites outside the South, in my experience, are more likely to have never met a black person, and to have only seen blacks through their portrayal in music videos and TV.
I feel this way too. People talk shyt about racism in Florida, but unless you're in a real hick town, it's not so apparent. Yes, there are still Confederate flags and shyt... but there's a familiarity there. Blacks and whites work together, there are black and white cops, black and white businesses, etc. Places like Cali, Oregon, Washington, etc., I'd imagine would be worse, given that they're almost entirely white.