The Truth About Women Being Paid Less

Here’s the truth: Women are not getting paid less.

Anyone who has taken a course in logical reasoning can see that a single headline statistic can't possibly account for every variable.

The fact is, yes, on average women earn less than men.

Because fewer women work! In 2014, 47.4% of the civilian workforce was female. That extra 2.6% on the male side of the ledger adds up to quite a bit of money, making it look like men earned more. Which they did, in total. Because there are more of them.

Women are more likely than men to quit a job to take care of family. In a poll of non-working adults, 61% of women gave this as their reason, compared with only 37% of men.

Also, take into account maternity leave, during which women still hold a job but aren't earning from it.

Women are more likely to take part-time jobs so that they can also care for children, whereas more men take full-time jobs, consequently earning more money per year than women.
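To make that concrete, here is a minimal sketch in Python with entirely hypothetical numbers (the group sizes, hours, and pay rate are assumptions for illustration, not real labor statistics): even at an identical hourly rate, the group that averages fewer hours per year shows a lower average annual income.

```python
# Hypothetical illustration: same hourly pay, different hours worked.
# All numbers are made up for the example; none of this is real labor data.

hourly_rate = 25.0  # identical pay rate for everyone

# (number of people, hours worked per year) -- assumed splits
men = [(80, 2000), (20, 1000)]    # mostly full-time
women = [(50, 2000), (50, 1000)]  # half part-time

def average_annual_income(groups, rate):
    people = sum(count for count, _ in groups)
    pay = sum(count * hours * rate for count, hours in groups)
    return pay / people

print("Men:  ", average_annual_income(men, hourly_rate))    # 45000.0
print("Women:", average_annual_income(women, hourly_rate))  # 37500.0
```

In this made-up example the per-year averages differ by roughly 17% even though nobody is paid a different hourly rate; the gap comes entirely from hours worked.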

There are, of course, exceptions to every rule.

But if you look at the average per-year wage statistics, which is all we get, it certainly looks like women are discriminated against, when really they just work fewer hours or work lower-paying jobs.

More men seek the higher-paying full-time jobs so that they can support their families.

Middle-class wives, for the most part, don't absolutely have to work and consequently don't seek the highest-paying jobs. They tend to find jobs that they'll enjoy doing, or jobs where they can work from home, which, on average, won't pay as much.

The wage gap doesn’t exist. It’s lies being spoon-fed to you by a media that wants to get feminists riled up.

Do some research; don’t just accept what you see on the news.


Equality


This pink ribbon is everywhere this month.

It makes sense: it’s Breast Cancer Awareness Month.

It really is a great way to raise awareness… for one type of cancer that primarily affects women.

Not saying it is to be taken lightly! Not at all. The effects of breast cancer can be devastating and deadly.

But wouldn’t it be more effective to have an all-inclusive Cancer Awareness month?

Breast cancer gets its own month, its own color, its own catchphrases, and by far the most press and awareness. But what about pancreatic cancer? Lung cancer? Colon, prostate, liver?

I believe we should change October to Cancer Awareness Month so that everyone can be encouraged by awareness for their battle, not just those fighting breast cancer.

I realize that it may be difficult to really do everything justice, but I really don’t get why this one cancer deserves so much of our attention when so many others are hardly talked about.

I don’t know, maybe these are just silly thoughts of mine. Comments?