Sexism in the United States



Sexism in the United States has existed since the colonial era. Historically, the country was dominated by a settler society of men, and as a result the major form of sexism, both past and present, has been discrimination against women, also known as misogyny. This discrimination has taken numerous forms, including exclusion from most careers and the legal denial of the right to vote. Owing to the feminist movement, women now have suffrage and can work in any occupation, but obstacles to full equality remain, including being paid less than men for the same work.
