fire_breeze
This, I hope, will initiate a good, well-thought-out and enlightening debate - not a flame war or a fight about politics.
I have been checking out the extensive How-To library on a particular health issue, and couldn't help but notice some disparaging remarks about the 'mainstream' medical community in general.
I have come across quite a few posts that include claims like "most mainstream doctors are not aware of X condition", "doctors don't know/accept/acknowledge said condition", or "mainstream doctors just want to sell drugs". Such behaviour would run counter to medicine and the Hippocratic Oath and be highly unethical - but these blanket claims are also, in a way, doing the very thing the posters accuse doctors of.
I understand that family or general physicians are not specialists. They are there to keep a patient healthy, to narrow an ailment down to a field, and to refer the patient to the appropriate experts. I also know, from my year of pre-med before I left, that doctors are well-versed in medicine and that they extensively read the journals and reviews that discuss new breakthroughs, diagnoses, etc.
What confuses me is that my doc, for example, will tell me flat out if she doesn't know about a particular condition and will then refer me to a specialist or look it up herself. She also knows how to analyze data and symptoms; I don't, and I've learned not to rely on Dr. Google. While I applaud taking control of one's health, I also know that we are NOT medical experts. MDs go through years of in-depth, extensive and exhaustive training, a multi-year residency, and gruelling, brutal board exams, not to mention a high level of accountability and rigorous refreshers and re-examinations.
Furthermore, she will not prescribe me medications for no reason. If there is an alternative medication or path that is equally or more effective, that's what she will choose. A quick, informal survey of my local friends - even those without a PCP or family doctor - turned up similar experiences.
Please remember that I am Canadian, where preventative health care is highly encouraged and where most people do not go bankrupt when they have an illness and/or condition. Paying for basic health care - particularly preventative health care - is very alien to me, so perhaps I simply lack that perspective.
Is it true that, unlike doctors in the universal health-care system, doctors in for-profit systems are not really in the business of healing - that they 'really don't know anything', or are only interested in textbook cases? Has the medical community, in general, been solely interested in profit rather than in keeping the population healthy? Do doctors, in fact, get money for prescribing certain medications? Are 'mainstream' doctors really so dismissive of the alternative route - despite extensive studies published in peer-reviewed journals - even when it is combined with conventional medicine?
Thoughts? Comments? Or am I just naive?
To quote Ed, my comments and thoughts yada yada yada.
