
In medical careers, are women and men treated differently?

#medicine #healthcare


2 answers



Camie’s Answer

Great question, and I think the answer is both "yes" and "no". Generally speaking, many more women go into nursing, so people often assume that female healthcare workers are nurses. For this reason, if you are a woman wearing scrubs, chances are patients will assume you are a "nurse", even though you may actually be a doctor, pharmacist, or other healthcare professional. Women are also paid less than men in general, and we see this happen in healthcare as well.

Estelle’s Answer

As a woman and a surgeon, I have been both harmed and helped during my career.

People didn't think I could be as tough as the men in my surgical program, which was terribly frustrating. However, I was able to prove them wrong.

I was also discouraged from certain fields, like surgery, but that didn't stop me from following my passion.

However, I was sometimes spared some of the more aggressive rants because of my gender and perceived weakness.

Thankfully, I think things have become much fairer over the last 30 years, but there are still frustrations.