Written by: Chantal Sood @chantalsood
The fashion industry is widely considered a female-centered industry, with large numbers of female models, bloggers, and consumers. It may seem, then, that women are the leading force in the industry; in reality, male designers dominate it. Fashion, or more specifically the men who control it, has long dictated what a woman wears and how she should dress.
All of this is changing, however, as more and more women-owned brands emerge. While fashion remains a male-dominated field, it is refreshing to see more labels run by women, and with their growing number, a shift in the industry was inevitable. A new era for fashion has arrived: male designers are no longer the primary decision makers of what’s in style. Now, women call the shots. They are relying less on male approval and creating trends that reflect their newfound confidence in the industry.

This doesn’t mean that women are no longer dressing provocatively. The difference is that instead of being overly sexualized, women are now empowered by their own sex appeal. A good example is American Apparel’s new executive team, made up entirely of women, which continues to use risqué pictures of female models to promote the brand, reinforcing the idea that women can be sexually liberated and comfortable in their own skin. The women in fashion are standing up for themselves, and the narrative has changed from exploiting women to empowering them.
Trends no longer revolve around what men consider sexy or how men want women to dress. The reason for this change is simple: the only person who should be in charge of how a woman dresses is herself.
Picture Credit: Thefashionsupernova.com