To me, feminism means that a woman has the right to choose for herself which life, job, family setup, etc. makes her happy.
If this means being a submissive to a man, so be it.
It should not mean "doing everything like a man" or "doing everything better than a man" or "hating men" or "not having sex with men" or any bullshit like that.
Self-proclaimed "feminists" of that sort were the only ones who gave me a really hard time when I discovered my sexual preference. They just piss me off!