Something that my therapist said today stuck with me.
America has never been a white country.
This got me thinking. The United States is fundamentally different from Europe. First there were Native people. And then came the immigrants, and the European enslavement of African people, and immigrants, immigrants, immigrants.
It is very hard to claim ownership of the American identity when so many of us, no matter how long we've lived here, do not feel welcome. Yes, the United States is a center of colonialism. But it is also one of the first postcolonial states. We're not going to have the linearity of European history. We do not serve a king. Monarchy has no place here.
There are always going to be people who go too far, because immigration and exploitation are what created America, and all immigrants are a little desperate and all exploited people are at the very least angry. It's not something that goes away, even after generations. But it matters that America as a nation-state exists in spite of that. It matters that our defining stories start with "in spite of."
I don't have legal citizenship and I probably never will. But that doesn't stop me from saying I have a home here. That doesn't make it less true that I am a person from here. I am American in the way I smile at the people behind the counter in the immigrant-run deli that sells nice sandwiches. I am American in the way I think about personal boundaries and in the everyday effort I make to exist alongside people different from me.
There is such a huge spectrum of ways to be American. Evangelical Christianity is not the definition. Rich white suburbia is not the norm. The people in power right now have no understanding of what it means to be American. We live on unceded land, in cities built by enslaved people and impoverished immigrants. Everything of note that was achieved here happened "in spite of."
America has never been a white country.