- Fifty years ago, nurses received much less formal education than they do today. Doctors treated nurses more as assistants and caregivers than as respected medical professionals, according to Nurse.org.
- Here's a visual look at how being a nurse has changed in the last 50 years.
A lot has changed since nurses wore white skirts and stockings to work.
Fifty years ago, doctors still treated nurses as assistants, and the role was seen as an extension of women's caregiving instead of as a career. The role required less formal education, and nurses had just a "rudimentary" understanding of scientific medical care, according to Minority Nurse.
Today, nurses make up the largest segment of the healthcare workforce, and the profession continues to grow as America's baby boomer population ages. Nurses can earn a range of advanced certifications and degrees, and can specialize in newer fields such as forensic nursing and informatics. Though the industry has made strides in gender and racial diversity, discrimination still exists.
Here's a visual look at some of the ways nursing has changed in the last 50 years: