Quote Originally Posted by LO1121:
I fully embrace the idea that society views need to change, but how does that change happen?
Like any other change, it happens through education. Teaching our kids that ALL people are equal, that women have the same rights and privileges as men, that no one, male or female, should be forced into submission by another.

And in my opinion, a large step in that direction would be achieved by eliminating religion from the public sphere. All of the major religions, and most of the minor ones, are patriarchal in nature. They preach the domination of men over women. That has to change, and to some degree it is changing. Almost every religious organization in the US is losing members, mostly the young. It's one of the reasons for some of the more inane (and insane) bloviating by religious and political leaders regarding the rights of women: their belief systems are crumbling in the face of the modern world, and they are lashing out blindly to defend their Iron Age mythologies. Sadly, women and minorities are the primary targets of those attacks.