America's Christian Influence

There has been much debate over whether America was originally founded as a Christian nation. The Christian influence on our founding, however, is undeniable. What follows are some articles that illustrate that link, along with book references documenting Christianity's strong influence on our founding as a nation.