The definition that popped up when I searched Google pretty well sums up what I have always heard women say.
The advocacy of women's rights on the basis of the equality of the sexes
It's really that simple. It's not a women-over-men movement. It's a movement to receive the same respect, rights, and inclusion that men have enjoyed basically forever. They want the right to make decisions about their own bodies. They'd like to maybe not be victims of sexual assault and rape at staggering rates (about 1 in 6 American women will be raped in their lifetime). They'd like a better shot at corporate leadership (about 10% of Fortune 500 CEOs are women). They'd like more of a footprint in government (roughly 28% of the US Congress is female, and that's a record high).
They just want equity and respect and they deserve it.
As a man, I feel like the main people to blame for this are men, both now and historically. So many of us were raised with this macho, hold-it-all-in bullshit, and our parents' generation was even worse about it, and theirs worse still.