20 Signs That We’re Not Living In A Patriarchy

Feminists typically justify their beliefs by claiming that America (and the West in general) is a patriarchy, a society in which men dominate and women are subjugated. But does our world really privilege men over women? All the signs point to no. Here’s why…
1. More women than men are attending college and earning degrees

According to Pew Research Center, 71 percent of women enroll in college after graduating high school, while only 61 percent of men do, and the gap has been widening over time. Additionally, since 2006, women have earned the majority of college degrees at all levels, from associate’s degrees to doctorates.
Source: Return of Kings