Hollywood is teeming with White Feminists who range from annoying to problematic to downright toxic.
Why do women have to shave their legs? I can understand shaving armpits and private areas, since that helps with hygiene, but I don't get the legs. Why?