Hollywood definitely portrays nursing in an unrealistic fashion. For the most part there are two different types of TV shows/movies out there right now, and they each shed a different light on nurses. Even though I am in love with it, shows such as Grey's Anatomy, in my opinion, do not show what it's like to actually be a nurse. For the most part, we are not running around hooking up with every doctor we see, and one thing that drives me crazy is that they show the doctors doing jobs that the nurses do! I have NEVER seen a doctor, not even a resident, start a peripheral IV or draw blood. Yet on Grey's they are starting IVs, drawing blood, pushing meds, and spending way more time than I have ever seen talking to patients. I know the show is focused on the doctors, but if I only knew about nursing through watching TV, I would think the doctors were the heroes who comfort patients 24/7 and that all the nurses did was stare at each other all day.
Sometimes I think even the news can affect how the public views nursing. Within the last few months, a med tech was responsible for a huge Hep C outbreak at my hospital. I know this negatively affected the way the public in my area views nurses; the way it was discussed on TV sort of lumped his job into the category of being an actual nurse, and people all over are talking about how bad the nurses are at the hospital. Now, in the public's mind, all the nurses are to blame for one guy's mistake. Recently I have started watching NY Med on ABC, and I think that show captures exactly what nursing is like: being puked on, yelled at because you can't give them the food they want, getting hit