You know it, and I know it. Far too many Christians are content to ignore the moral decay in our state and nation. Instead of boldly exposing evil, we tolerate it and, in some instances, even affirm it. Should the Church confront cultural decay? What does it mean to be "the salt of the earth" and "the light of the world"?