World War I gave American women their first real opportunity to gain a foothold in occupations traditionally
dominated by men. As America's men left
to fight the war, it became necessary for
women to join the workforce in areas
where they had previously been unwelcome.
The need for combat materials opened a whole new world of jobs, and many women also stepped into the positions that men had left behind. Some employers found the work
done by women superior to that of the men
who had preceded them. This progress, however, proved too good to last. Men and the unions resisted women's entry; women were not taken seriously, and it was assumed that when the war was over, they would go back home where they "belonged."