American culture is generally better when whites are the majority
As whites have always been the majority, how would we know?
My last home was in Memphis TN which is majority black. Things ran pretty much the same as anywhere else I’ve lived. I think our shared American lifestyles and values are far more important than any concerns over people's complexions.
At the rate the population is changing soon whites will be in the minority which is fine by me. That means they’ll have to find someone else to blame for all their problems.
America was founded by whites for whites.
Racist much? And that's not very "Christian" of you.
God doesn't judge us for being racist, He judges by how we have worshipped Him and served Him.
He’s not actually wrong, America was founded by white men for white men. That’s why it’s so hard even now for women and people of color to get ahead and be treated as equal in America. Our country was created on inequality.
However, I think our culture is improving greatly as we diversify!
Yes He does. Being racist violates the command to love your neighbor as you love yourself.
Mac, God demands love, not racial equality specifically as a form of love. You have no argument.
I like America as the melting pot it’s become. I don’t care what color people are, and I don’t view people as white, black, or anything else when I’m dealing with them.
That’s not been true in all areas and times. In fact, to name just one example, the Harlem Renaissance provides an instance of increased cultural production in New York’s African-American community.
No, many different races have made unique and important contributions to American culture. No one race has a monopoly on the achievements of the American experiment.