The role of nurses in our society today
by Veronica Freeman
May 10, 2021
As we enter Nurses Week, it’s important to acknowledge how much the profession has expanded. Nurses are now researchers, health policy advocates, and educators, and they have advanced their careers far beyond the role as it existed at its inception. The work of nursing no longer consists only of caring for the sick and the public; nurses also advocate for wellbeing and drive positive patient outcomes.