Storytelling is intrinsic to human culture. We use it to explain the universe and to contextualize and resolve universal struggles—those that define the human condition. Seismic shifts occurred in the pattern of human existence during the Neolithic revolution, when larger, permanent settlements and agriculture replaced many hunter-gatherer societies. The stories told at that time undoubtedly changed, but we continued to tell them as a primary means of knowledge transfer. Over millennia, as societies became more complex and technologically advanced, organized religion took hold, and those stories changed again, but we continued to use storytelling as a principal means of communication.
The human brain has evolved for stories. Storytelling effectively populates our cerebral hard drives: we are hardwired to remember stories, not facts.
The perplexed human now has access to the Internet and boundless amounts of data and knowledge. YouTube videos have become a new form of storytelling and sharing, but we engage with them because of the content the YouTuber creates, not because we are excited by the technology or the channel. So, in the blink of an evolutionary eye, human society has changed dramatically, yet our brains remain configured the same way as our ancestors' were: stories resonated, inspired, explained, and educated then just as they do now. Why? Because stories drive our emotions, and emotions drive our behavior.
There is growing acknowledgment of the role and potential of storytelling in business. Marketers know that customers purchase specific products in recognition of the brand story. When consumers buy a pair of sneakers, they are buying more than shoes. They are adopting a form of identity anchored in the brand’s myths and meaning.
Scientists also embrace storytelling, not only using it to disseminate hard evidence to their audiences but also to make the evidence more likely to change behaviors or current practices. It is therefore not surprising that stories and narratives are gaining more traction in the medical profession.
Since the 1980s, American medical schools and hospitals have promoted so-called narrative medicine in healthcare curricula to facilitate reflection and reduce the distance between a technical understanding of disease and the patient's subjective experience of illness. And it works: a recent study of medical students demonstrated that the use of narrative medicine improved their ability to self-reflect and empathize, and enhanced patient-healthcare provider communication.1 Narratives change medical practice and patient outcomes.