The Western world has become increasingly "broken," and traditional values seem to mean nothing. All this because women think they should be free to perform the male role. Would society be healthier if women went back to their proper roles, i.e., having and raising children, keeping a nice home for the provider to come home to, etc.? Or is it just too late now?
This question has 31 answers.