For nearly half a century, American Christians have, to greater and lesser degrees, embraced the role of culture warriors. As evangelicals began to stake a claim on American culture and politics, they invoked the language of rights while lamenting the purported decline of “Christian America.” They pushed back against encroaching secularization and federal government “overreach” [Read More...]