Do you think Western society, propelled by political correctness and public liability, is becoming a victim culture?
That is, it is "normal" to be a victim: advocacy groups and individuals who take offense at literally anything, or those who make wildly excessive claims that whatever happened to them is everyone else's fault.
Case in point: ethnic, sexual, or religious groups that create an outcry at seemingly every opportunity?