Empowerment to me means being able to stand up for yourself, speaking up, or fighting to make a change for something better, like raising awareness. It makes me think of women and how they are seen as *weaker* than men, and how many men take advantage of that. Women could fight back by protesting, writing books, and writing articles. To me it's like men think that women are objects, that they are like sex toys or only there for pleasure. But they are none of that. We are human; we have feelings. We weren't made for that. We were made to respect each other, but it seems like that never happened. For thousands and thousands of years women have been treated as the inferior gender. And a lot of women still treat their bodies like objects, and that's what makes it hard to change that image of women.