I don't think so. Unfortunately, Christianity looks upon sex and sexuality as Satan's work. The whole issue of sexual fun and enjoying sexual pleasure has no place in the Christian lexicon, which is one of the reasons Christians speak from a bully pulpit. Enjoy your carnal pleasures... welcome to the world of carnal pleasure, like thousands of other men and women.
I don't believe that is true at all. I am a Christian and grew up going to a rather conservative church, but we were never taught that sex was bad or evil; rather, it is part of a loving relationship. Sex is a gift from God to be enjoyed. Certainly some churches or denominations preach certain restrictions, but those come from those churches, not from the word of God.