I grew up in a small town in southeastern North Carolina, where running around barefoot was a normal part of growing up for just about everybody. I can remember going to the grocery store or to the mall with my mom and two sisters, and all four of us would be barefoot. I pretty much went barefoot every summer until I was in my early 20s, and then it just seemed to go out of fashion. You never see teenagers or younger adults out and about with bare feet anymore, which is unfortunate, because that was one of the nicest things about summer. I wonder if part of the explanation isn't the aggressive marketing techniques employed by the big-name shoe manufacturers? Certain types of footwear have become status symbols among younger people.