I’ve been at a handful of companies, and there are always a few people who seem more involved in the diversity groups than in the actual work their job title implies. And in many cases, my colleagues and I (white males, not that it should matter, but context is context) felt uneasy about some of the general rhetoric.
Some saw the groups as just lip service, while others saw them as vehicles for enacting global change.
I personally haven’t sorted it out; I’m still thinking it through. But it seems odd how heavily some college curricula, and high school curricula for that matter (I was in college not long ago, and my brother is in high school), focus on being an activist.
Sure, it’s great to want change in your community, but I feel the desire to be an activist for something was taught in place of the desire to foster a local, welcoming community.