The modern philosophy of justice and morals stemming from Western Civilization is rooted in Christian ideals. If you think otherwise, you are just willfully ignoring reality; there are no two ways about it.
That having been said, I agree with you, Filth. Being Christian doesn't make you moral; it's something you have to work at, and I think every good and decent Christian realizes this.
And I'm not limiting morality solely to Christianity, just speaking in the context of Western Civilization.
__________________
To win a war you must serve no master but your ambition.
Last edited by Mojo_PeiPei; 12-06-2004 at 12:57 PM..