Show of Hands

Political | April 12th, 2019, 7:36pm

Does America have a rape culture? Rape culture defined: a culture that encourages and/or normalizes rape.

