Our culture is the dominant culture. The United States culture is pervasive, touching all countries and cultures that it comes in contact with. The dominant mindset of American culture is that ours is the best way to do things and all other ways are crap.
When I see that written out, it becomes such an arrogant statement. America has its problems: the current government treats its citizens like crap (unless you have money). It is the only Western country that doesn’t have some sort of real universal healthcare. We’re crap on women’s issues (child care, abortion, healthcare in general), crap on racism, trans issues, the environment, and many other things.
We are NOT the best country in the world. According to this study, the US is #10; Switzerland was #1. That suggests our culture, and the imposition of our culture, may not necessarily be the best way to do things.
It makes me wonder why our culture is still the dominant culture. I suppose it could be the idea and the hope of the “American Dream,” yet even here, that dream is quickly becoming a thing of the past. Most of the time it seems to come down to the fact that we are one of only two really large, influential countries left. We have Hollywood, New York City, Silicon Valley, and are the anchor country for many technology ventures.
When I’m thinking about this, I always wonder why. It just never seems to me that our culture is particularly special overall. Sure, there are areas in America that have special cultural things, but overall, we’re not much better than any other culture, and in fact, we’re one of the youngest on the planet.
I really wish our country had more humility about who we are.