Role of Women in the American West

by Gwen Bruno

In the 1800s, women were expected to stay in the "women's sphere" of society, caring for home and family under the protection of husbands and fathers. The balance of power changed as families moved west, and women expanded their roles.

In Business

Some banks in the West preferred lending to women starting businesses, considering them more reliable borrowers than men.


In Education
Because of the need for teachers, Western women were allowed to attend universities; many of them went on to become school administrators and serve on state boards of education. They were also instrumental in helping run missions, churches and schools for Native Americans.

Property Owners

Western women were encouraged to hold property in their own names so that families could increase their holdings. As a result, some women ran ranches and farms on their own, including supervising male employees.


In the Professions
The demand for professionals led people in the West to accept women as doctors, lawyers and business owners much sooner than did people in the Eastern United States.

The Negative Side

For many women, however, life in the West meant drudgery and loneliness. Because of the shortage of labor, women often had to do farm work in addition to keeping house and caring for children.

About the Author

Gwen Bruno has been a full-time freelance writer since 2009, with her gardening-related articles appearing on DavesGarden. She is a former teacher and librarian, and she holds a bachelor's degree in education from Augustana College and master's degrees in education and library science from North Park University and the University of Wisconsin.
