I live in Illinois, but I want to move to a state with a warmer climate because I don't like the winters very much. I'm considering moving to the South in the future, but I'm worried that it will make my skin tanner than I want it to be. Does living in a southern state darken your skin significantly? Which US state would you recommend?
I've found that using sunblock or sunscreen helps.
States I'm considering include Arizona, California, Texas, Florida, and North Carolina.