Why Doctors Are Not the “Bad Guys”

It is almost impossible to visit the hospital with a health issue and leave without surgery or a prescription to treat the diagnosis. Medical interventions happen. Doctors continually suggest new ways to intervene in our bodies, and we gradually forget how to trust our body's natural signals or rely on our own judgment about how we might heal. When people try to relearn natural remedies, they tend to blame the medical profession for forcing the change in lifestyle. It is easy to blame someone else for something harmful, to say "never trust doctors," when in fact there are many outstanding and caring doctors in the profession. The important question is not why doctors are "bad professionals" but rather "why do doctors provide treatments the way they do?"