Reading the NY Times style section, I came across an article on feet. Apparently, a majority of women in the US hate their feet. Huh?
I remember an ad from the '60s for an exfoliant foot cream that showed a woman, apparently nude, sitting with her arms and legs crossed strategically, under the headline: What's the ugliest part of your body? Whoa, I found the ad on Google Images; you can see for yourself:
The first time I saw that ad, I have to admit, "feet" did not come to mind...
According to the article, so many women hate their feet that there are apparently "I hate feet" groups on Facebook! Sheesh! I'm not sure why, but I've always liked my feet, and I've always taken better care of them than I have of my hands. Admittedly I now spend nine months of the year in open-toed sandals, but even when I lived in the frozen north and wore socks for nine months a year, I gave myself regular pedicures.

I read the comments posted on the NY Times article, and there were the usual killjoys, upset and critical that anyone would spend time or money having a pedicure, yada yada yada, whining that feet are purely functional. Yeah, but so are teeth, and yet most of us brush and floss daily and see a dentist a couple of times a year. Years ago I had a neighbor who loved to harass me about the fact that I colored my hair, while she'd allowed herself to go grey. One day I'd had enough and pointed out that she permed her hair on a regular basis, so really, although we were doing different things to achieve it, both of us were changing what nature had given us in an attempt to improve our appearance.

Having well-groomed feet is more than just a matter of personal style, though. Ask any diabetic, or anyone caring for a diabetic, about the importance of taking care of their feet.
I've always liked my feet, and at 59, I still like 'em!