In American culture, are men or women more likely to exhibit sexist attitudes and behaviors?
It's considered acceptable for women to be sexist but not for men, which is an unfortunate truth about society.
That depends on how you define "sexist."
Does it? Explain.