Of the devout Christians I know personally, essentially all of them say that God has transformed them inwardly. They often describe how, before they "met God", they held onto anger and resentments, and could not let go of various inward conflicts that led to unhappiness in their lives. They emphasize that it has been essential to "maintain a relationship" with God in order to be their best selves. However, I have also met people who have made these same inward transformations without "God's help". For some it was therapy that helped them change, for others it was a certain philosopher or other well-known speaker, and for some it was simply a change they decided to make for themselves. So why is it that some people insist they need God in order to be virtuous, loving, forgiving, etc., while others say God is not necessary for this?
Asked by Anonymous at 4:40 PM on Mar. 24, 2010 in Religion & Beliefs
If God were taken away from them, they would revert to their old selves, because religion is a band-aid, not a cure. They need to heal from the inside out: learn to love, trust, and have faith in themselves. That way, they will be strong enough to take whatever comes their way and will not need to turn to myths for comfort or answers.
Answer by IhartU at 7:59 PM on Mar. 24, 2010