Tuesday, October 30, 2018

What Do Women Really Want?


We are almost at the end, and it’s time to do some summing up. So what do the majority of women truly want? I think we can agree that, first and foremost, women want to be happy; they want their children to be safe, healthy, and happy, with opportunities for their future; and they want the basic things that make life worth living. These are fundamentals, and yet they are sadly lacking for many women around the world.

For things to change and for these fundamentals to be more widely achieved, it seems natural to assume that empowering more women in the political, business, and social arenas would help bring about such a result. In “first world” countries this has certainly happened, and not just because women are empowered, but because men have also seen that creating equality in ALL spheres makes for a better country.
Women don’t want to castrate men, nor do they want to wrest all power out of their hands; they simply want a seat at the table, and they want their voices to be heard. They want EQUALITY, not domination. They want control over their own destiny and their own bodies. They want to be respected as human beings, not objects.
"In the future, there will be no female leaders. There will just be leaders."—Sheryl Sandberg
