Outside of SoCal and Vegas, what else does the West Coast have to offer? Nevada, which is mostly desert and the poor man's Florida with no water? Utah, mountains and Mormons? Seattle, Washington, where it rains all year? Portland, Oregon, which is all white folks, hipsters, and KKK secret hideouts? Arizona, where you'll fry yourself in the heat? Living around retirees who carry guns on the hip at all times? Idaho? Montana? Wyoming? I mean, sure, I bet the cost of living is great in New Mexico, but it's fukking New Mexico.
I just don't see what's so special about the West Coast, even if I were rich.