9 of the Biggest Lies Christianity Tells Us About Sex and Marriage
By Eliel Cruz | January 27, 2015
Let's talk about sex, baby. Let's talk about you and me — and the church.
For centuries, the Christian church has held what amounts to a monopoly on Western conversations about sex and marriage — and in that time it has given a lot of bad advice. While some religious institutions have used their platforms to preach tolerance and respect, all too often their more conservative counterparts have ended up perpetuating patriarchy, rape culture and heteronormativity.
A lot of religious institutions — especially conservative ones — forgo sexual education in favor of blanket statements like, "Sex is impure; don't have it until marriage." In America, these dogmatic and outdated beliefs may be taking their toll on the faithful. According to the Pew Research Center, more Americans than ever before say they aren't affiliated with any one religion, including nearly a quarter of Millennials. And it's not just in the U.S.
If religious institutions truly want to reverse this trend, they must change the perception that they are culturally stuck in a rut. One way to do that would be to contribute healthy ideas about sex and marriage. But first, they have to stop telling lies.
Here are a few ways Christian leaders could stop being part of the problem when it comes to sexual stigmatization and shaming, and instead help their audiences become more enlightened and empowered about sexuality.