2 answers
Camie’s Answer
Great question, and I think the answer is both "yes" and "no." Generally speaking, far more women go into nursing, so people often assume that female healthcare workers are "nurses." For this reason, if you are a woman wearing scrubs, chances are patients will assume you are a "nurse," even though that may not be true and you may actually be a doctor, pharmacist, or another kind of healthcare worker. Women are also paid less than men in general, and we see this in healthcare as well.
Estelle’s Answer
As a woman and surgeon, I have been both harmed and helped during my career.
People didn't think I could be as tough as the men in my surgical program, and that was a terrible frustration. However, I was able to prove them wrong.
I was also discouraged from certain fields, like surgery, but that didn't stop me from following my passion.
On the other hand, I was sometimes spared some of the more aggressive rants because of my gender and perceived weakness.
Thankfully, I think things have become much fairer over the last 30 years, but there are still frustrations.