Since Pride Month is coming to an end, I decided to discuss another topic that may not technically be an LGBTQ thing specifically, but is still important: gender roles. I think gender roles are something we all get sucked into. Kids are raised and taught that they should be doing specific things because they are a boy or a girl. They're also told certain things are just unacceptable, again, because of their gender.
Do I think gender roles are real or fiction? Personally, I think they're fiction. Here's why.
Who are we to tell people what they can or can't do because of their gender? It's insane. You hear things like: "A woman shouldn't do bodybuilding." "A man shouldn't show emotion." "A woman needs to be the housekeeper" (aka the one who cleans, cooks, etc.). "Boys shouldn't play with dolls." It's stuff like this that drives me insane and keeps people close-minded. Some people are old school and have their thoughts and opinions just as I have mine, and I respect that. There's no problem with that whatsoever, but I do think we as people should be a little more open about things. It's 2018; we shouldn't be surprised when we see things that may seem "out of the norm."
What's normal anyway, you know? It's just what society has depicted as normal over the centuries. In the end, trying to shove gender roles down your child's throat could result in an adult who's afraid to be who they really are. Let people live how they please, no matter their gender or what you may think isn't right. If they're not hurting themselves or others, then what's the harm?