The term masculinity carries a somewhat negative connotation in today’s society: it’s usually accompanied by the word “toxic,” and it conjures up outdated imagery of a Schwarzenegger-esque macho man who closes himself off emotionally and always has the right answer for everything. At the end of the day, masculinity is a term given meaning by the views of society, and there’s no denying that society’s definition of masculinity has (thankfully) shifted away from the stereotype I mentioned above. But what exactly has it shifted to?