I keep coming across snarky comments on the internets claiming that Obama has offended the Christians in this country by announcing that the U.S. "is no longer a Christian country." Most people are responding as if Obama had kicked Jesus out of the United States and revoked the right to practice religion.
What Obama actually said was, "Although ... we have a very large Christian population, we do not consider ourselves a Christian nation, or a Jewish nation, or a Muslim nation. We consider ourselves a nation of citizens who are bound by ideals and a set of values."
In other words, Obama was being inclusive and acknowledging everyone in this country, pointing out that we are not EXCLUSIVELY Christian. And I think that is a good thing. I, personally, am getting pretty annoyed with Christians who think their beliefs should be imposed on everyone. Go to church, practice your religion, and even share your religion with others who are interested, but stop thinking the rest of the country should be bound by your precepts.