When the church consistently teaches people about God's righteousness and how it applies to every sphere of life, the impact on culture is strongly evident. Those who founded the United States accepted the Bible as authoritative over everything, holding that there is not a single area of human endeavor untouched by the Word of God. They were unwavering in that belief!