I just came across an article that piqued my interest, and I simply must put my two cents into it. The article was about whether female or male doctors are better. Apparently, research shows that women enjoy female doctors who can truly relate and be empathetic, but don't like their male doctors to act the same way, and that men generally have no preference. I don't know how this study was conducted! Maybe women just said that to empower other women and to completely throw out the idea of having a relationship with a man other than their boyfriend or husband, and maybe men just said they didn't care either way so they wouldn't be called sexist and looked at the wrong way.
Personally, I prefer any doctor who is empathetic and takes me seriously. My rheumatologist is the perfect match for me, as he is very laid back and my care is tailored to my needs and preferences. Not to say I've never had a bad experience: I've had terrible experiences with doctors. My eye doctor practically climbs on top of me, and she tells me to suck it up even when I'm visibly scared stiff or in pain. I could go on with good and bad times, but I won't, so let's leave it at 'it depends on the person.'