America has been changing for the worse. I am not saying that out of mere pessimism; it is plain to see. The reason is that God's principles in the founding of America have been subverted, and in their place stands the reasoning of greedy, power-hungry men. When God's principles are displaced from government and culture, the consequences are disastrous. What will it take for America to learn this lesson?