Girls these days are extremely liberal and woke. According to liberals, everything wrong with the world is caused by white men. Using common sense, they must mean that a lot of women have an extreme dislike of white men and thus would probably rather not date them. Is it fair to say that being white is seen as a bad thing in the eyes of most women these days? It's fair to say that a lot of women feel ashamed of having a white bf these days.