So I just finished watching this documentary about healing practices around the world and across different cultures. And while even I found a few of the techniques to be a bit "different," I found one thing they said to be really interesting...
One of the points made was that even "western" medicine requires some level of faith to work. You call the doctor because you have faith that they have some knowledge or understanding of your ailment that you do not, and that they may have the answers you seek... You wouldn't allow someone to cut into your body if you didn't have faith that they knew what they were doing... And you wouldn't take a pill full of chemicals if you didn't have faith that it was actually going to fix your issue...
The idea was that while we see these cultural medicine or faith healing techniques as odd or taboo, even our cultural NORM of western medicine requires a similar kind of faith.
Answer by dullscissors at 6:43 PM on Nov. 4, 2010
I wouldn't say it was faith, especially when the diplomas hanging on the office wall are proof that the doctor has the required knowledge and training. I would call it trust.