We live in a patriarchal society where men are basically forced to dominate, to lead, to "own" — otherwise they'll be seen as… well… not men. And I don't understand that.
There's a sense of pride in men when they have a partner who follows their every order.
“Don’t hang out with your bad friends!”
“Don’t wear revealing clothes!”
And all that fucking beta bullshit. When a man says that to a girl, she's a trophy, not a partner.