Our homie Mydnite Sun sent us this video. At first I was taken aback by the thought, but as I really thought about it… it seems this country is not as Christian as many of us think it is. Yuppp… I said it! I would be interested to know what you guys think. Is the United States of America a Christian nation? Were we founded on Christian principles? If so, are we straying away from them?
Obama In Turkey “We Do Not Consider Ourselves A Christian Nation”