Thanks to feminists, I’ve come to an epiphany: rape is bad.
I know, fellow men, that this statement might surprise you, and that's completely understandable. I mean, why haven't we been told that rape is bad all along? Women, you must understand that it's not our fault that we didn't know. We were never taught that rape is bad. Not in school, not by our parents, not by society, not by culture, not by you…
Now that we know that rape is morally reprehensible, we must spread the word. We should have seminars at college campuses, educational programs in elementary schools, and an incredibly strict legal system where men are charged on the mere accusation of rape. We can finally take down those silly flyers warning women not to walk alone at night, and replace them with ones that read “DON’T RAPE, MEN” in size 1046 font.
Just like we have stopped murder completely by teaching men that it's an immoral act, I'm positive that by teaching men not to rape (since men are the only ones who do it) we can end this epidemic.
Thank you, feminists, for enlightening me on the morality of rape.