This Washington Post article examines what is meant by the assertion that America is a Christian nation. It covers the 1790s treaty stating that the United States was not founded as a Christian nation, Andrew Jackson's efforts to resist the formation of a Christian political party, and Lincoln's shelving of a proposed Christian amendment to the Constitution.
The article also describes the statements made by Obama that led some to view him as a non-Christian, as well as Sarah Palin's assertions that we are a Christian nation.
The article closes by asking: what do we mean when we say America is a Christian nation? What are your thoughts?