I am really not trying to offend you; I am trying to understand. I have always thought you believed a woman's place was at home, raising her children, and that matters like running countries should be left to men. That seems to be the message I have heard your leaders preach for many years now. Did I hear it wrong? I am directing this question specifically to evangelicals, as I know there are millions of other Christians who don't think this way. But I thought evangelicals did. Help me understand.