I think everyone seems more rude and unfriendly.
I have spent a lot of time in the hospital lately, and most of the nurses I have met have been nice, friendly, and caring. I especially liked the ones in the ICU who tended to me while I spent three months in a coma on total life support. The doctors may have the skills, but the nurses provide the care that really heals people.
I can only speak as I find, and as a carer I am dealing with nurses all the time. 99% of them are fantastic at what they do, caring and full of empathy.
I hear what you are saying my friend.
I think that a nurse should get paid (for example) 30% of what the doctor gets paid. The doctor would then have a better-paid, smiling nurse, and the nurse would work harder because he/she would indirectly benefit.
I find this talk quite funny and somehow related. Check it out.
I agree 100%.
For about the past ten years, I have noticed that every time I have gone to the doctor's office or hospital, my nurses are incredibly rude and act like they hate their jobs. The last time I went to the gyno, my nurse practically called me a moron because I couldn't pee on command. I'm sorry my bladder and urethra won't cooperate with YOUR HIGHNESS. Geez.
Hating your job is common in the world, but if you're gonna be a nurse or work in any other caretaker profession, you need to make sure you always make an effort to be cordial and nurturing, NOT a grumpy, passive-aggressive b*tch.