Why? Why do we have to grow into adults and run into problems in life before we learn the power of positive affirmations, confidence, and believing in ourselves?
Why are these simple messages of truth that can change our lives so hidden in the culture and the environment that raise us?
Why do we have to discover them by accident and feel as though we have struck a new gold mine, when almost all of it can be found in ancient messages that still hold true?
I wish I understood why our culture abstains from teaching us how to think positively while we are growing up. Or is this rampant in all the cultures of this big, great world?
anya1212 · Jan 22, 2013