Just curious, why is it that our culture tells us to respect women but doesn't mention respecting men?
Is it something to do with the nature of women and how they are to be treated? I'm getting the vibe that they want to be in control and want dominance in everything. I know women are a lot more sensitive than men, so maybe that's it. I do believe in respecting women, but I've always wondered why that is brought up more than respecting men.