Why Women Should Own The Shit Out Of Their Sexuality


For as long as I can remember, the idea of a woman being comfortable with both sex and sexuality was treated as taboo. Women never talked about what they liked, and if they did talk about sex, it was in a negative light. It wasn’t until my second year of college that I realized women could not only enjoy sex, but enjoy many different versions of the activity. Since then, I have explored the options available when it comes to sex, and I even started working in the sex toy industry as a digital marketing specialist.

While women are coming to terms with the idea that their pleasure should be a top priority in a relationship, the topic of sexuality within that relationship still isn’t generally talked about. That’s why I’m here today to share my thoughts on why you, and every other woman, should take full control of your sexuality.

First, I should explain what I mean by sexuality. Sexuality is, by definition, “a person’s sexual orientation and preferences.” While most of us know the term as it pertains to sexual orientation, we rarely talk about it as it pertains to preferences. In my opinion, everyone has different preferences when it comes to sex, foreplay, and even how they present themselves when they leave the house, and since all of this shapes what makes sex special, it should count toward the definition of sexuality.

In today’s version of western society, every single action that a woman does can and will be sexualized. So why shouldn’t said woman take control of her sexuality?

Granted, there is a giant double standard around a woman’s sexuality. Take the example of Jodie Whittaker. This British actress was comfortable enough with her body to pose nude during her career. Those images were then used against her by Rupert Murdoch’s tabloid press when she took the role of the 13th Doctor in Doctor Who. In this scenario, one of two things happens: she is shamed for being “inappropriate” in a society that values modesty, or people creepily lust after her because all they know about her is her naked body.

This brings up an interesting subject. We’re all familiar with the idea that sex sells, even though several studies have suggested otherwise. Think about it: do you remember the sexy woman or the burger in a Carl’s Jr./Hardee’s ad? You remember the woman, so the product is forgotten.

However, if the product happens to be the woman, then it will definitely sell. That may be why the woman-as-object has become such a popular female stereotype in the past few decades, and why a woman who breaks out of that mold is currently seen more negatively than positively.

This may have to do with the fact that a confident woman is portrayed in a negative light in Western media. There are plenty of songs in which the male singer tells a girl that she may not know she’s beautiful, but she is to him, and she needs to know it.

This gives many women the idea that you need someone else’s affirmation to know you’re beautiful, when in reality, the only person who needs to confirm a woman’s beauty is the woman herself. This goes back even further: much of the Disney media we grew up with casts confident women as the villains, while the insecure protagonist gets what she wants in the end.

Now, when it comes to sex, both men and women have the right to do what they want in a non-violent, consensual manner. In my professional opinion, if either party does not enjoy the sexual experience, why have sex in the first place? Sex should please both partners, but women are conditioned from an early age to believe that sex is designed to bring men pleasure rather than pleasure to be shared by both.

In the eyes of society, sex is seen more as a masculine pleasure than a feminine one: men are allowed to openly share the details of their sex lives and preferences, while women cannot do the same without being judged.

The idea of a woman having positive sexual experiences is still taboo in today’s mainstream perspective. Women in media are more likely to either not talk about sex at all, or to talk only about their negative experiences in the bedroom, though this trend is starting to fade in modern media. A woman who knows what she wants in bed is labeled a “slutty woman,” but that term tends to come out only when she rejects a man’s advances or breaks up with a man for valid reasons.

However, I believe the stereotypical man in our Western culture dislikes the idea of a woman enjoying sex for any reason beyond pleasing her partner. A woman who knows what she wants can be read as dominant, and the stereotypical man does not like the idea of a dominant or independent woman.

While we are trying to break out of them, women are still confined by these stereotypes because they have been ingrained into everyone’s minds. At this point, I have to ask: are you going to stay confined to these stereotypes, or are you going to break free from them? I can tell you that the best way to break the confines of society’s image of women is to take that image into your own hands and own the s**t out of it. That image could be modest, provocative, androgynous, feminine, gothic, or pastel.

As long as you’re okay with your body, how you look, and what you do with it, that’s all that matters.