Introduction
The role of a CEO, once defined by strategy charts and bottom lines, is undergoing a sea change. Constant technological advances, growing business complexity, and rising societal expectations require CEOs to expand their expertise beyond traditional business acumen. Today, a truly great CEO needs to master the art of social skills, demonstrating a keen ability to interact, coordinate, and communicate across multiple dimensions.
As the business landscape continues to grow more complex, the ability to navigate this intricacy has become a defining factor in effective leadership. This holds true for large, publicly listed multinational corporations and for medium to large companies operating in rapidly evolving marketplaces alike. Leaders must therefore possess the skills and acumen to make informed decisions and steer their organisations toward success.
Social Skills
Top executives in these firms are expected to harness their social skills to coordinate diverse and specialised knowledge, solve organisational problems, and facilitate effective internal communication. Further, the interconnected web of critical relationships with external constituencies demands that leaders demonstrate adept communication skills and empathy.
The proliferation of information-processing technologies has also played a crucial role in defining a CEO’s success. As businesses increasingly automate routine tasks, leadership must offer a human touch—judgment, creativity, and perception—that can’t be replicated by technology. In technologically intensive firms, CEOs need to align a heterogeneous workforce, manage unexpected events, and negotiate decision-making conflicts—tasks best accomplished with robust social skills.
Equally, with most companies relying on similar technological platforms, CEOs need to distinguish themselves through superior management of the people who utilise these tools. As tasks are delegated to technology, leaders with superior social skills will find themselves in high demand, commanding a premium in the labour market.
Transparency
The rise of social media and networking technologies has also transformed the role of CEOs. Moving away from the era of anonymity, CEOs are now expected to be public figures interacting transparently and personally with an increasingly broad range of stakeholders. With real-time platforms capturing and publicising every action, CEOs need to be adept at spontaneous communication and anticipate the ripple effects of their decisions.
Diversity & inclusion
In the contemporary world, great CEOs also need to navigate issues of diversity and inclusion. This calls for a theory of mind—a keen understanding of the mental states of others—enabling CEOs to resonate with diverse employee groups, represent their interests effectively, and create an environment where diverse talent can thrive. (See our article on the Chief Coaching Officer for an alternative solution to this issue)
Hiring strategies
Given this backdrop, it is essential for organisations to refocus their hiring and leadership development strategies. Instead of relying on traditional methods of leadership cultivation, companies need to build and evaluate social skills among potential leaders systematically.
Current practices, such as rotating through various departments, geographical postings, or executive development programs, aren’t enough. Firms need to design a comprehensive approach to building social skills, even prioritising them over technical skills. High-potential leaders should be placed in roles that require extensive interaction with varied employee populations and external constituencies, and their performance should be closely monitored.
Assessing social skills calls for innovative methods beyond the traditional criteria of work history, technical qualifications, and career trajectory. New tools are needed to provide an objective basis for evaluating and comparing people’s abilities in this domain. While some progress is being made with the use of AI and custom tools for lower-level job seekers, there is a need for further innovation in top-level searches.
Conclusion
In conclusion, the role of the CEO is more multifaceted than ever. The modern world demands that executives possess exceptional social skills, including effective communication, empathetic interaction, and proactive inclusion. Companies need to recognise this change and adapt their leadership development programs accordingly to cultivate CEOs who can effectively lead in the 21st century.
The persistent pulse of inquiry in history
Throughout history, our innate curiosity has been the heartbeat of progress, driving us from basic questions about nature, like “Why does it rain?” to profound existential inquiries, such as “Do we have free will?”. In today’s fast-paced world, the art of asking questions feels somewhat overshadowed by the avalanche of information available. Yet, recognising what we don’t know often serves as the true essence of wisdom.
One lasting method of exploring knowledge through questioning is the Socratic method, a tool from ancient Greece that aids critical thinking, helps unearth solutions, and fosters informed decisions. Its endurance for over 2,500 years stands as a testament to its potency. Plato, a student of Socrates, immortalised his teachings through dialogues or discourses. In these, he delved deep into the nature of justice in the “Republic”, examining the fabric of ideal societies and the character of the just individual.
Questions have not only transformed philosophy but also propelled innovations in various fields. Take, for instance, Alexander Graham Bell, whose inquiries led to the invention of the telephone, or the challenges to traditional beliefs during the Renaissance that led to breakthroughs in art, science, and philosophy. With their profound questions about existence and knowledge, the likes of Kant and Descartes have shaped the philosophical narratives we discuss today.
Critical questioning has upended accepted norms in the scientific realm, leading to paradigm shifts. For example, scepticism of the geocentric model, voiced in antiquity by Aristarchus and revived by Copernicus, paved the way for ground-breaking discoveries by Galileo, Newton, and Einstein. At its core, every scientific revolution was birthed from a fundamental question.
On the educational front, the importance of questioning is backed by modern research. Historically, educators have utilised questions to evaluate knowledge, enhance understanding, and cultivate critical thinking. Rather than simply prompting students to recall facts, effective questions stimulate deeper contemplation, urging students to analyse and evaluate concepts. This enriches classroom experiences and deepens understanding in experiential learning settings.
By embracing this age-old method and recognising the power of inquiry, we can better navigate the complexities of our contemporary world.
Questions through the ages: an enduring pursuit of truth
Throughout the annals of time, the act of questioning has permeated our shared human experience. Ancient civilisations like the Greeks laid intellectual foundations with their spirited debates and dialogues, and the sheer depth and diversity of their inquiries still stand out. These questions spanned from the cosmos’ intricate designs to the inner workings of the human soul.
Historical literature consistently echoed this thirst for understanding, whether in the East or West. It wasn’t just about obtaining answers; it celebrated the journey of arriving at them. The process of probing, introspection, and subsequent revelation holds a revered spot in our collective memory. The reverence with which we’ve held questions, as seen through the words of philosophers, poets, and thinkers, showcases the ceaseless human spirit in its quest for knowledge.
In today’s interconnected world, the legacy of these inquiries remains ever-pertinent. We live in an era of information, a double-edged sword presenting knowledge and misinformation. As we grapple with this deluge, the skills of discernment and critical inquiry, inherited from our ancestors, are invaluable. It’s no longer just about seeking answers but about discerning the truths among many voices.
With the current rise in misinformation and fake news, a sharpened sense of questioning becomes our compass, guiding us through the mazes of contemporary challenges. By honouring the traditions of the past and adapting them to our present, we continue our timeless pursuit of truth, ensuring that the pulse of inquiry beats strongly within us.
Understanding the Socratic Method
Having recognised the age-old reverence for inquiry, it becomes imperative to explore one of its most pivotal techniques: the Socratic method. Socrates, widely regarded as a paragon of wisdom, believed that life’s true essence lies in perpetual self-examination and introspection. His approach was unique in its time, as he dared to challenge societal norms and assumptions. When proclaimed the wisest man in Greece, he responded not with complacency but with probing inquiry.
The Socratic method transcends a mere question-answer paradigm. Instead, it becomes a catalyst, prompting deep reflection. This dialectical technique fosters enlightenment, not by spoon-feeding answers but by kindling the flames of critical thinking and understanding. The beauty of this method rests not solely in the answers it might yield, but in the journey of introspection and dialogue it necessitates.
Beyond philosophical discourses, this method resonates powerfully in contemporary educational spheres. It underscores that genuine knowledge transcends rote memorisation, emphasising comprehension and enlightenment. This reverence for knowledge stresses the imperative of recognising our limitations, fostering an ethos where learning is ceaseless and dynamic.
In our information-saturated age, the Socratic method’s principles are not just philosophical musings but indispensable tools. According to Statista, only about 26% of Americans feel adept at discerning fake news, while a concerning 90% inadvertently propagate misinformation. Herein lies the true power of the Socratic approach. It teaches us discernment, evaluation, and the courage to seek clarity continuously. By integrating this method into our lives, we are better equipped to navigate our intricate world, fostering lives marked by clarity, purpose, and profound understanding.
Why the question often surpasses the answer
Having delved into the rich tapestry of historical inquiry and the transformative power of the Socratic method, one may wonder: Why such an emphasis on the question rather than the answer?
We are often trained to seek definite conclusions throughout our educational journey and societal conditioning. Yet, as Socrates demonstrated through his dialogues, there’s profound wisdom in embracing the exploration inherent in questioning. His discussions rarely aimed for definitive answers, suggesting that the reflective process, rather than the conclusion, held deeper significance.
Imagine a complex puzzle. While the completed picture might offer satisfaction, aligning each piece, understanding its intricacies, and appreciating its nuances truly enriches the experience. Similarly, questions, even those without clear-cut resolutions, can expand our horizons, provoke self-assessment, and challenge our preconceived notions. This process broadens our perspectives and fosters a more holistic understanding of our surroundings.
By valuing the act of questioning, we equip ourselves with the tools to navigate ambiguity, confront our limitations, and engage with the world more thoughtfully and profoundly.
The Socratic Method in contemporary frameworks
Socratic questioning involves a disciplined and thoughtful dialogue between two or more people, and its methodologies, rooted in ancient philosophy, remain instrumental in today’s diverse contexts. In the realm of academia, especially within higher education, this collaborative form of questioning is a cornerstone. Educators don’t merely transfer information; they challenge students with introspective questions, compelling them to reflect, engage, and critically evaluate the content presented.
Beyond the classroom, the applicability of the Socratic method stretches wide. Business environments, such as boardrooms and innovation brainstorming sessions, harness the power of Socratic dialogue, pushing participants to confront and rethink assumptions. Professionals employ this method in therapeutic and counselling settings to guide clients in introspective exploration, encouraging clarity and self-awareness.
Through its emphasis on continuous dialogue, deep reflection, and the mutual pursuit of understanding, this age-old method remains a beacon, guiding us as we navigate the ever-evolving complexities of our modern world.
Conclusion: the timeless art of inquiry
From the cobbled streets of ancient Athens to contemporary classrooms, boardrooms, and counselling sessions, the enduring legacy of the Socratic method attests to the potent force of inquiry. By valuing the exploratory process as much as, if not more than, the final insight, we pave a path towards richer understanding, intellectual evolution, and the limitless possibilities of human achievement.
In today’s deluge of data and information, the allure of swift answers is undeniable. Yet, Socrates’ practice reminds us of the transformative power held in the act of questioning. Adopting such a mindset, as this iconic philosopher once did, extends an open invitation to a life punctuated by curiosity, wonder, and unending discovery.
Depending on who you listen to, working from home is either proof of a declining work ethic – evidence of and contributor to a global malaise that is hampering productivity, decimating work culture and amplifying isolation and laziness – or it’s a much-needed break from overzealous corporate control, finally giving workers the autonomy to do their jobs when, where and how they want to, with some added benefits to well-being, job satisfaction and quality of work baked in.
Three years on from the pandemic that made WFH models ubiquitous, the practice’s status is oddly divisive. CEOs malign it. Workers love it. Like most statements around WFH, though, that analysis is overly simplistic. So what’s the actual truth: is WFH good, bad or somewhere in between?
The numbers
Before the pandemic, Americans spent 5% of their working time at home. By spring 2020 the figure was 60% [1]. Over the following year it declined to 35%, and it has now stabilised at just over 25% [2]. A 2022 McKinsey survey found that 58% of employed respondents have the option to work from home for all or part of the week [3].
In the UK, according to data released by the Office for National Statistics in February, between September 2022 and January 2023, 16% of the workforce still worked solely from home, while 28% were hybrid workers who split their time between home and the office [4]. Meanwhile, back in 1981, only 1.5% of those in employment reported working mainly from home [5].
The trend is clear. Over the latter part of the 20th century and the earliest part of the 21st, homeworking increased – unsurprising given the advancements in technology over this period – but the increase wasn’t drastic. With Covid, it surged, necessarily, and proved itself functional and convenient enough that there was limited appetite to put it back in the box once the worst of the crisis was over.
The sceptics
Working from home “does not work for younger people, it doesn’t work for those who want to hustle, it doesn’t work in terms of spontaneous idea generation” and “it doesn’t work for culture.” That’s according to JPMorgan Chase CEO Jamie Dimon [6]. People who work from home are “phoning it in” according to Elon Musk [7]. In-person engineers “get more done,” says Mark Zuckerberg, and “nothing can replace the ability to connect, observe, and create with peers that comes from being physically together,” says Disney CEO Bob Iger [8].
Meanwhile, 85% of employees who were working from home in 2021 said they wanted a hybrid approach of both home and office working in future [9]. It seems there’s a clash, then, between the wants of workers and the wants of their employers.
Brian Elliott, who previously led Slack’s Future Forum research consortium and now advises executive teams on flexible work arrangements, puts the disdain for WFH from major CEOs down to “executive nostalgia” [10].
Whatever the cause, and whether merited or not, feelings are strong – on both sides. Jonathan Levav, a Stanford Graduate School of Business professor who co-authored a widely cited paper finding that videoconferencing hampers idea generation, received furious responses from advocates of remote work. “It’s become a religious belief rather than a thoughtful discussion,” he says [11].
In polarised times, it seems every issue becomes black or white and we must each choose a side to buy into dogmatically. Given the divide seems to exist between those at the upper end of the corporate ladder and those below, it’s especially easy for the WFH debate to fall into a form of tribal class warfare.
Part of the issue is that each side can point to studies showing the evident benefits of its own position and the evident flaws in its opponents’. It’s the echo-chamber effect. Some studies show working from home to be more productive. Others show it to be less. Each tribe naturally gravitates to the evidence that best suits its argument. Nuance lies dead on the roadside.
Does WFH benefit productivity?
The jury is still out.
An Owl Labs report on the state of remote work in 2021 found that 90% of those working from home said they were at least as productive at home as in the office, and 55% said they worked more hours remotely than they had at the office [12].
On the other end of the spectrum, a paper from Stanford economist Nicholas Bloom, which reviewed existing studies on the topic, found that fully remote workforces on average had a reduced productivity of around 10% [13].
Harvard Business School professor Raj Choudhury, studying government patent examiners who could work from anywhere but gathered in person several times a year, championed a hybrid approach. He found that teams who worked together between 25% and 40% of the time had the most novel work output – better results than those who spent less or more time in the office. The in-person gatherings didn’t have to be weekly, either; even just a few days together each month saw a positive effect [14].
It’s not just about productivity, though. Working from home can have a negative impact on career prospects if bosses maintain an executive nostalgia for the old ways of working. Studies show that proximity bias – the idea that being physically near your colleagues is an advantage – persists. A 2021 survey of 800 supervisors by the Society for Human Resource Management found that 42% said that when assigning tasks, they sometimes forget about remote workers [15].
Similarly, a 2010 study by UC Davis professor Kimberly Elsbach found that when people are seen in the office, even when nothing is known about the quality of their work, they are perceived as more reliable and dependable – and if they are seen off-hours, more committed and dedicated [16].
Other considerations
It’s worth noting other factors outside of productivity that can contribute to the bottom line. As Bloom states, only focusing on productivity is “like saying I’ll never buy a Toyota because a Ferrari will go faster. Well, yes, but it’s a third the price. Fully remote work may be 10% less productive, but if it’s 15% cheaper, it’s actually a very profitable thing to do” [17].
Other cost-saving benefits of a WFH or hybrid work model include potentially allowing businesses to downsize their office space and save on real estate. The United States Patent and Trademark Office (USPTO) estimated that increases in remote work in 2015 saved it $38.2 million [18].
Minimising the need for commuting also helps ecologically. The USPTO estimates that in 2015 its remote workers drove 84 million fewer miles than if they had been travelling to headquarters, reducing carbon emissions by more than 44,000 tons [19].
A hybrid model
Most businesses now tend to favour a hybrid model. Productivity studies, including Bloom’s, which found the 10% productivity drop from fully remote working, tend to concede there’s little to no difference in productivity between full-time office staff and hybrid workers. Some 47% of American workers prefer to work in a hybrid model [20]. In the UK, it’s 58% [21]. McKinsey’s American Opportunity Survey found that when given the chance to work flexibly, 87% of people take it [22].
However, as Annie Dean, whose title is “head of team anywhere” at software firm Atlassian, notes: “For whatever reason, we keep making where we work the lightning rod, when how we work is the thing that is in crisis” [23].
Choudhury backs this up, saying, “There’s good hybrid – and there’s terrible hybrid” [24]. It’s not so much about the model as the method. Institutions that put the time and effort into ensuring their home and hybrid work systems are well defined – with room still for discussion, training and brainstorming, all the things that naysayers say are lost to remote working – are likely to thrive.
That said, New Yorker writer Cal Newport points out that firms that have good models in place (what he calls “agile management”) are few and far between. Putting such structures in place is beyond the capability of most organisations. “For those not benefiting from good (“Agile”) management,” he writes, “the physical office is a necessary second-best crutch to help firms get by, because they haven’t gotten around to practising good management [25].”
The future
Major CEOs may want a return to full-time office structures, but a change seems unlikely. You can’t put the genie back in the bottle. Home and hybrid working is popular with employees, especially millennials and Gen Z. As of 2022 millennials were the largest generation in the workforce [26]; their needs matter.
The train is only moving in one direction – no amount of executive nostalgia is going to get it to turn back. It seems a hybrid model is the future, and a healthy enough compromise.
References
[1] https://www.economist.com/special-report/2021/04/08/the-rise-of-working-from-home
[2] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[3] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[4] https://www.theguardian.com/commentisfree/2023/feb/14/working-from-home-revolution-hybrid-working-inequalities
[5] https://wiserd.ac.uk/publication/homeworking-in-the-uk-before-and-during-the-2020-lockdown/
[6] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[7] https://hbr.org/2023/07/tension-is-rising-around-remote-work
[8] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[9] https://www.ons.gov.uk/employmentandlabourmarket/peopleinwork/employmentandemployeetypes/articles/businessandindividualattitudestowardsthefutureofhomeworkinguk/apriltomay2021
[10] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[11] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[12] https://owllabs.com/state-of-remote-work/2021/
[13] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[14] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[15] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[16] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[17] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[18] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement
[19] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement.
[20] https://siepr.stanford.edu/publications/policy-brief/hybrid-future-work#:~:text=Hybrid%20is%20the%20future%20of%20work%20Key%20Takeaways,implications%20of%20how%20and%20when%20employees%20work%20remotely.
[21] https://mycreditsummit.com/work-from-home-statistics/
[22] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[23] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[24] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[25] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[26] https://www.forbes.com/sites/theyec/2023/01/10/whats-the-future-of-remote-work-in-2023/
Introduction
Originally published in 2013, Ichiro Kishimi and Fumitake Koga’s The Courage to be Disliked quickly became a sensation in its authors’ native Japan. Its English-language translation followed suit, with more than 3.5 million copies sold worldwide.
The book is often shelved in the ‘self-help’ category, in large part due to its blandly overpromising subtitle: How to free yourself, change your life and achieve real happiness. In truth, it would be better suited to the philosophy or psychology section. The book takes the form of a discussion between a philosopher and an angsty student. The student is unhappy with his life, and often with the philosopher himself, while the philosopher is a contented devotee of Adlerian psychology, the key points of which he disseminates to the student over the course of five neatly chunked conversations. His proposed principles offer sound advice for life in general but also prove useful when integrated into a business setting.
Adlerian Psychology
Alfred Adler was an Austrian-born psychotherapist and one of the leading psychological minds of the 20th century. Originally a contemporary of Freud’s, Adler soon drifted away from him, and in many ways his theories can be defined in opposition to those of his old colleague; they are anti-Freudian at their core. Freud holds that our early experiences shape us. Adler counters that such sentiments strip us of autonomy in the here and now, seeing Freud’s ideas as a form of determinism. He instead proffers:
No experience is in itself a cause of our success or failure. We do not suffer from the shock of our experiences – the so-called trauma – but instead, we make out of them whatever suits our purposes. We are not determined by our experiences, but the meaning we give them is self-determining.
Essentially, then, the theories are reversed. Adler posits that rather than acting a certain way in the present because of something that happened in their past, people do what they do now because they chose to, and then use their past circumstances to justify the behaviour. Where Freud would make the case that a recluse doesn’t leave the house because of some traumatic childhood event, for example, Adler would argue that instead the recluse has made a decision to not leave the house (or even made it his goal not to do so) and is creating fear and anxiety in order to stay inside.
The argument comes down to aetiology vs teleology – more plainly, assessing something’s cause versus assessing its purpose. Using Adlerian theory, the philosopher in the book tells the student: “At some stage in your life you chose to be unhappy, it’s not because you were born into unhappy circumstances or ended up in an unhappy situation, it’s that you judged the state of being unhappy to be good for you”. He adds, in line with what David Foster Wallace referred to as the narcissism of self-loathing: “As long as one continues to use one’s misfortune to one’s advantage in order to be ‘special’, one will always need that misfortune.”
Adler in the workplace: teleology vs aetiology
An example of the difference between these theories in the workplace can be found by examining the sentence: “I cannot work to a high standard at this company because my boss isn’t supportive.” The viewpoint follows the Freudian cause-and-effect notion: your boss is not supportive, therefore you cannot work well. What Adler, and in turn Kishimi and Koga, argue is that you still have a choice to make. You can work well without the support of your boss but are choosing to use their lack of support as an excuse to work poorly (which, subconsciously, was your aim all along).
This is the most controversial of Adler’s theories for a reason. Readers will no doubt look at that sentence and feel blame being attributed to them. Anyone who has worked with a slovenly or uncaring boss might feel attacked and argue that their manager’s attitude most certainly did affect the quality of their work. But it’s worth engaging with Adler’s view, even if just to disagree with it. Did you work as hard as you could and as well as you could under the circumstances? Or did knowing your boss was poor give you an excuse to grow slovenly too? Did it make you disinclined to give your best?
Another example in the book revolves around a young friend of the philosopher who dreams of becoming a novelist but never completes his work, claiming he’s too busy. The theory the philosopher offers is that the young writer wants to leave open the possibility that he could have been a novelist if he’d tried, but doesn’t want to face the reality that he might produce an inferior piece of writing and face rejection. Far easier to live in the realm of what could have been. He will continue making excuses until he dies, because he does not want to allow for the possibility of failure that reality necessitates.
Many people decline to pursue careers along similar lines, staunch in the conviction that they could have thrived if only the opportunity had arisen, without ever actively seeking that opportunity themselves. Even within a role it’s possible to shrug off this responsibility: saying that you’d have been better off in X role at your company if only they had given you a shot, or that you’d be better suited to a client-facing position rather than sitting behind a desk doing admin if only someone had spotted your skill set and made use of it. But without asking for these things, without actively taking steps towards them, with whom does the responsibility lie? It’s a hard truth, but a useful one to acknowledge.
Adler in the workplace: All problems are interpersonal relationship problems
Another of the key arguments in the book is that all problems are interpersonal relationship problems. What that means is that our every interaction is defined by the perception we have of ourselves versus the perception we have of whomever we are dealing with. Adler is the man who coined the term “inferiority complex”, and that factors into his thinking here. He spoke of two categories of inferiorities: objective and subjective. Objective inferiorities are things like being shorter than another person or having less money. Subjective inferiorities are those we create in our mind, and make up the vast majority. The good news is that “subjective interpretations can be altered as much as one likes…we are inhabitants of a subjective world.”
Adler is of the opinion that: “A healthy feeling of inferiority is not something that comes from comparing oneself to others; it comes from one’s comparison with one’s ideal self.” He speaks of the need to move from vertical relationships to horizontal ones. Vertical relationships are based on hierarchy. If you define your relationships vertically, you are constantly manoeuvring between interactions with those you deem above you and those you deem below you. When interacting with someone you deem above you on the hierarchical scale, you will automatically adjust your goalposts to be in line with their perceptions rather than defining success or failure on your own terms. As long as you are playing in their lane, you will always fall short. “When one is trying to be oneself, competition will inevitably get in the way.”
Of course, in the workplace we do have hierarchical relationships. There are managers, there are mid-range workers, there are junior workers and so on. The point is not to throw away these titles in pursuit of some newly communistic office environment. Rather, it’s about attitude. If you are a boss, do you receive your underlings’ ideas as if they are your equal? Are you open to them? Or do you presume that your status as “above” automatically means anything they offer is “below”? Similarly, if you are not the boss, are you trying to come up with the best ideas you can, or the ones you think will be most in line with your boss’s pre-existing convictions? Obviously there’s a balance here – if you solely put forward wacky, irrelevant ideas that aren’t in line with your company’s ethos and have no chance of success, that’s probably not helpful – but within whatever tramlines your industry allows, you can certainly get creative and trust your own taste rather than seeking to replicate someone else’s.
Pivotal to this is whether you are willing to be disagreed with and to disagree with others or are more interested in pleasing everyone, with no convictions of your own. This is where the book’s title stems from. As it notes, being disliked by someone “is proof that you are exercising your freedom and living in freedom, and a sign that you are living in accordance with your own principles…when you have gained that courage, your interpersonal relationships will all at once change into things of lightness.”
Adler in the workplace: The separation of tasks
The separation of tasks is pivotal to Adlerian theory and its view of interpersonal relationships. It is how Adler, Kishimi and Koga suggest one avoids falling into the trap of defining oneself by another’s expectations. The question one must ask oneself at all times, they suggest, is: whose task is this? We must focus solely on our own tasks, not letting anyone else alter them and not trying to alter anyone else’s. This is true for literal tasks – a piece of work, for example – but also for more abstract ideas. How you dress, for instance, is your task. What someone else thinks of how you dress is theirs. Do not make concessions to their notions (or your perceptions of what their notions might be) and do not be affected by what they think, for it is not your task and therefore not yours to control.
This idea that we allow others to get on with their own tasks is crucial to Adler’s belief in how we can live rounded, fulfilling lives. The philosopher argues that the basis of our interpersonal relationships – and as such our own happiness – is confidence. When the student asks how the philosopher defines the “confidence” of which he speaks, he answers:
It is doing without any set conditions whatsoever when believing in others. Even if one does not have sufficient objective grounds for trusting someone, one believes. One believes unconditionally without concerning oneself with such things as security. That is confidence.
This confidence is vital because the book’s ultimate theory is that community lies at the centre of everything. The awareness that “I am of use to someone” allows one to act with confidence in one’s own life, to have confidence in others, and not to be reliant on the praise of others. The reverse is true too. As Kishimi and Koga state, “A person who is obsessed with the desire for recognition does not have any community feeling yet, and has not managed to engage in self-acceptance, confidence in others, or contribution to others.” Once one possesses these things, the need for external recognition will naturally diminish.
For high-level employees, then, it’s important to set a tone in the workplace that allows colleagues to feel that they are of use. But as the book dictates, do not do this through empty praise – all that will do is foster a further need for recognition (“Being praised essentially means that one is receiving judgement from another person as ‘good’”). Instead, foster this atmosphere by trusting colleagues and showing confidence in them.
The courage to be disliked
The Courage to be Disliked is at odds with many of the accepted wisdoms of the day. The modern cultural milieu suggests that we should at all times accept and validate others’ trauma as well as our own. Many may find solace in that approach and find that it suits them best. But there is no one-size-fits-all solution when it comes to fostering a successful workplace, and even less so when it comes to leading a fulfilling life. Anyone who feels confined by the idea that some past event, or some subjective inferiority harboured too long, sets parameters around what they can achieve should perhaps look at those interpersonal relationships, perhaps find the courage to be disliked, and in doing so hope to find a community they’re willing to support as much as it supports them. There is no need to be shackled to whatever mythos you’ve internally created.
As the book states: “Your life is not something that someone gives you, but something you choose yourself, and you are the one who decides how you live…No matter what has occurred in your life up to this point, it should have no bearing at all on how you live from now on.”
References
Kishimi, Ichiro & Koga, Fumitake. The Courage to Be Disliked: How to Free Yourself, Change Your Life and Achieve Real Happiness. Bolinda Publishing Pty Ltd, 2013.
Introduction
Consider a simple yet profound question: What does your work mean to you? Is it merely a task to be completed, or does it resonate with a deeper purpose in your life?
Viktor Frankl, a prominent Austrian psychiatrist and philosopher, grappled with these very questions, evolving them into a broader exploration of life’s meaning. Drawing from his harrowing experiences in Nazi concentration camps, he developed logotherapy—a form of psychotherapy that centres around the search for meaning and purpose. Through logotherapy, Frankl illuminated the idea that life’s essence can be found not just in joyous moments but also in love, work, and our attitude towards inevitable suffering. This pioneering approach underscores personal responsibility and has offered countless individuals a renewed perspective on fulfilment, even in the face of daunting challenges.
In this piece, we delve into the intricacies of Frankl’s teachings, exploring the symbiotic relationship he identified between work and our quest for meaning.
A Holistic Approach to Life and Work
In his seminal work, ‘Man’s Search for Meaning,’ Viktor Frankl delved deeply into the multifaceted nature of human existence. He eloquently described the myriad pathways through which individuals uncover meaning. For Frankl, while work or ‘doing’ is undoubtedly a significant avenue for deriving meaning, it isn’t the only one. He emphasised the value of love, relationships, and our responses to inevitable suffering. Through this lens, he offered a panoramic view of life, advocating for a holistic perspective where meaning is not strictly tethered to our work but is intricately woven through all our experiences and interactions.
Progressing in his exploration, Frankl sounded a note of caution about the perils of letting work become an all-consuming end in itself. He drew attention to the risks of burnout and existential exhaustion when one’s sense of purpose is confined solely to one’s occupation or the relentless chase for wealth. To Frankl, an overemphasis on materialistic achievements could inadvertently lead individuals into what he termed an ‘existential vacuum’ – a state where life seems starkly devoid of purpose. He argued that in our quest for success, we must continually seek a deeper, more intrinsic purpose. Otherwise, we risk being blinded to life’s profound significance and richness beyond material gains.
Delving deeper into the realm of employment, Frankl confronted the psychological and existential challenges of unemployment. He noted that without the inherent structure and purpose provided by work, many individuals grapple with a profound sense of meaninglessness. This emotional and existential void often manifests in a diminished sense of the significance of time, leading to dwindling motivation to engage wholeheartedly with the world. The ‘existential vacuum’ emerges again, casting its shadow and enveloping individuals in feelings of purposelessness.
Yet, Frankl’s observations were not merely confined to the challenges. He beautifully illuminated the resilience and fortitude of certain individuals, even in the face of unemployment. He showcased how, instead of linking paid work directly with purpose, some found profound meaning in alternative avenues such as volunteer work, creative arts, education, and community participation.
Frankl firmly believed that the essence of life’s meaning often lies outside the traditional realms of employment. To drive home this perspective, he recounted poignant stories, such as that of a desolate young man who unearthed profound purpose and reaffirmed his belief in his intrinsic value by preventing a distressed girl from taking her life. Such acts, as illustrated by Frankl, highlight the boundless potential for a meaningful existence, often discovered in genuine moments of human connection.
Work as an Avenue for Meaning and Identity
Viktor Frankl’s discourse on work transcended the common notions of duty and obligation. For him, work was more than a mere means to an end; it was a potent avenue to unearth meaning and articulate one’s identity. Frankl posited that when individuals align their work with their intrinsic identity—encompassing all its nuances and dimensions—they move beyond merely working to make a living. Instead, they find themselves working with a purpose.
This profound idea stems from his unwavering belief that our work provides us with a unique opportunity. Through it, we can harness our individual strengths and talents, channelling them to create a meaningful and lasting impact on the world around us.
In line with modern philosophical thought, which views work as a primary canvas for self-expression and self-realisation, Frankl also recognised its significance. He believed that work could serve as a pure channel, finely tuned to our unique skills, passions, and aspirations. This deep sense of accomplishment and fulfilment from one’s chosen profession, he asserted, is invaluable. However, Frankl also emphasised the importance of seeing the broader picture. While careers undeniably play a significant role in our lives, they are but a single facet in our ongoing quest for meaning.
Frankl reminds us that while our careers are integral to our lives, the quest for meaning isn’t imprisoned within their boundaries. He believed the core of true meaning emerges from our deep relationships, our natural capacity for empathy, and our virtues. These treasures of life, he asserted, can be manifested both within the confines of our workplace and beyond.
The True Measure of Meaning Through Work
For Viktor Frankl, our professional lives brim with potential for fulfilment. Yet for him, fulfilment was never solely defined by accolades. Instead, it was about aligning our work with our deepest values and desires. It wasn’t just the milestones that mattered, but how they resonated with our core beliefs.
Frankl’s logotherapy reshapes our perception of work, emphasising that even mundane tasks can hold significance when approached with intent. With the right mindset, every job becomes a step in our journey for meaning.
In Frankl’s writings, he weaves together tales of profound significance—a young man’s transformative act of kindness, a narrative not strictly tethered to work’s traditional realm. Yet, these stories anchor a timeless truth: In every endeavour, whether grand or humble, lies the potential for unparalleled meaning. Here, work isn’t just about designated roles—it becomes an evocative stage where profound moments play out. Beyond job titles and tasks, the depth, sincerity, and fervour we infuse into each act truly capture the essence of meaningful work.
Finding Fulfilment in Every Facet
Viktor Frankl’s profound insights into the human pursuit of meaning provide a distinctive lens through which we can evaluate both our daily tasks and life’s most pivotal moments. Through his exploration—whether addressing the ordinariness of daily life or the extremities of crisis—Frankl illuminated the profound interconnectedness of work and personal identity. He posited that our professions, while significant, are fragments of a vast tapestry that constitute human existence.
Navigating the journey of life requires continual adjustments to our perceptions of success and meaning. While our careers and professional achievements are significant, true fulfilment goes beyond these confines. It’s woven into our human experiences, the bonds we nurture, the challenges we face, and the joys we hold dear.
Frankl’s pioneering work in logotherapy urges us to approach life with intention and purpose. He beckons us to see the value in every moment, task, and human connection. As we delve into our careers and strive for success, aligning not just with outward accomplishments but with the very essence of who we are is vital.
Introduction
Humans have always been fascinated by the future. Prior to the era of computers and data, we sought insights from the stars, dreams, and even animal behaviour. The tale of the Delphic Oracle is etched in this tapestry of human curiosity. A simple goat herder named Coretas reportedly stumbled upon a fissure in the earth, releasing ethereal vapours. Drawn by these mysterious emissions, he perceived glimpses of the future. This mystical spot soon became legendary. Word spread, and people from distant lands journeyed there, drawn by the allure of prophecy. They came eager to hear the visions of the future, as interpreted by the chosen Pythia, a maiden who acted as the mouthpiece of Apollo. From mystical vapours to celestial patterns, humanity’s thirst for understanding tomorrow has perpetually pushed us to evolve our tools and methods, seeking ever-more sophisticated ways to peer into the future.
Throughout history, cultures around the globe have relied on a myriad of tools for forecasting the future. The Mayans, for instance, constructed elaborate calendars, meticulously tracking celestial bodies. Chinese sages consulted the I Ching, a revered text blending both philosophy and prediction. During the Middle Ages, figures like Nostradamus peered at the cosmos, firmly believing that the stars unveiled the secrets of events yet to unfold. Meanwhile, in their endless pursuit of the Philosopher’s Stone, alchemists hoped that their transformative experiments might also provide windows into future events. As the sands of time flowed, the rigours of science began to play an increasingly pivotal role in this age-old quest. Meteorologists harnessed accumulated data to forecast weather patterns, while demographers, attuned to shifts in population dynamics, used their insights to anticipate future demographic shifts.
Predictive analytics
Now, in this age, we’re navigating a golden era of prediction. Computers, hailed as our contemporary oracles, delve into vast data lakes – with less smoke and more code. With the aid of intricate algorithms and machine learning, they furnish insights about potential future events. Though technologically advanced, these modern tools have a mystique reminiscent of ancient methods. Indeed, their exceptional abilities often blur the lines between the arcane and the technological.
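At its simplest, the machinery behind such a forecast can be as plain as fitting a trend to historical data and extrapolating it forward. The sketch below is a minimal, entirely hypothetical Python illustration (the sales figures are invented): it fits a straight line to a year of monthly observations and projects the next quarter.

```python
import numpy as np

# Hypothetical monthly sales figures -- invented data, for illustration only.
months = np.arange(12)
sales = np.array([104, 109, 115, 118, 126, 131, 135, 142, 147, 151, 158, 164])

# Fit a degree-1 polynomial (a straight-line trend) to the history.
slope, intercept = np.polyfit(months, sales, 1)

# Extrapolate the fitted trend three months beyond the observed data.
future = np.arange(12, 15)
forecast = slope * future + intercept
print(np.round(forecast, 1))  # projected sales for the next three months
```

Real predictive-analytics pipelines replace the straight line with far richer models, but the shape of the exercise is the same: learn a pattern from the past, then project it forward, with all the attendant risk that the future declines to cooperate.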
Even though the settings have changed—with glass skyscrapers replacing ancient temples—our innate desire to predict the future remains unwavering. We’ve shifted from seeking guidance from oracles to heeding the insights of modern-day experts: economists, scientists, and statisticians. The unpredictable nuances of geopolitics and the intricate web of global economies underscore the challenges of forecasting. Despite our technological advances, no tool or expert can perfectly predict outcomes, as emphasised by the renowned financier Peter Lynch: “You never can predict the economy. You can’t predict the stock market.”
It’s against this backdrop of prediction challenges that Philip Tetlock’s work shines. Over decades, Tetlock undertook the meticulous task of analysing vast numbers of predictions, unravelling the intricacies of human foresight. He identified the ‘superforecasters’, a rare group that consistently demonstrated superior predictive abilities.
Superforecasters
Superforecasters stand apart from their peers, not simply through the accuracy of their predictions, but through their unique way of understanding and working with probabilities. Instead of confining themselves to somewhat nebulous terms like ‘likely’ or ‘certain’, they delve into a world of precision, where small differences matter. They employ an almost artistic attention to detail, carving out distinctions in probability estimates that most would overlook.
What’s noteworthy isn’t simply that they can perceive a difference between a 56% and a 57% probability, but the mindset this precision reflects. It speaks to a meticulousness and diligence that are often lacking in forecasting. This ability to finely calibrate their predictions sets them apart, transforming forecasting from a vague art into a refined science.
However, this is but one facet of their skills. Superforecasters also excel at dynamically updating their forecasts as new information comes to light, demonstrating humility in acknowledging and learning from their errors, and cultivating a probabilistic thinking mindset. Taken together, these skills contribute to their exceptional track record in the challenging realm of prediction.
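Tetlock’s tournaments made this precision measurable by scoring forecasters with Brier scores: the mean squared difference between the probabilities a forecaster stated and what actually happened (0 is perfect; a constant 50% hedge scores 0.25). A minimal sketch, with invented forecasts and outcomes purely for illustration, shows how a forecaster who commits to finely calibrated probabilities can decisively beat one who only ever says ‘likely’ or ‘unlikely’:

```python
import numpy as np

def brier_score(forecasts, outcomes):
    # Mean squared difference between stated probabilities and binary outcomes.
    # Lower is better: 0.0 is perfect; always answering 50% scores 0.25.
    f = np.asarray(forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((f - o) ** 2))

# Ten hypothetical yes/no questions (1 = the event happened) -- invented data.
outcomes = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

# A vague forecaster who only ever says "likely" (0.6) or "unlikely" (0.4)...
vague = [0.6, 0.4, 0.6, 0.6, 0.4, 0.6, 0.4, 0.4, 0.6, 0.6]
# ...versus a calibrated one committing to precise probabilities.
precise = [0.82, 0.13, 0.74, 0.91, 0.22, 0.68, 0.09, 0.31, 0.88, 0.77]

print(f"vague:   {brier_score(vague, outcomes):.3f}")    # 0.160
print(f"precise: {brier_score(precise, outcomes):.3f}")  # 0.045
```

The point is not the specific numbers but the discipline the scoring enforces: once every forecast is graded, hedging stops being free, and the habit of distinguishing 56% from 57% starts to pay.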
In the aftermath of the Iraq war, where intelligence missteps around weapons of mass destruction became evident, the US intelligence community sought Tetlock’s expertise. His findings were detailed in his book, “Superforecasting: The Art and Science of Prediction”, serving as an invaluable guide for anyone looking to refine their forecasting skills.
Beyond the realm of politics and global affairs, the implications of Tetlock’s research are profound. His techniques offer practical applications in diverse arenas, from deciphering economic trends to pivotal personal decisions, such as evaluating career trajectories or the potential of a business venture.
Ethical & economic challenges
Yet, while Tetlock’s findings are ground-breaking, they’re not infallible. Even the best predictions are fraught with uncertainties. As we harness these insights, it’s vital to maintain a balanced approach, merging strong convictions with a healthy dose of caution.
By integrating Tetlock’s teachings, we can achieve a heightened awareness of our cognitive biases, enabling more informed decisions and, potentially, a brighter future.
While predictive tools offer remarkable insights, overreliance on them introduces both ethical and economic challenges. Ethically, leaning too heavily on predictions can erode our adaptability and critical thinking, luring us into a false sense of security. Economically, this complacency can result in missed opportunities or misguided strategies. Just as the roll of a die is inherently unpredictable, so too are complex systems like economies. They’re influenced by countless variables, making them vulnerable to unexpected twists and turns. Predictions, while valuable, are best used as guiding lights, not as absolute certainties. After all, at their core, they’re imbued with an intrinsic element of unpredictability.
In the realm of forecasting, we find that with great predictive power comes great responsibility — and the inevitable debate over who truly holds the crystal ball. The craft, while teeming with potential, is not without its boundaries and ethical dilemmas. Foretelling the future transcends the realms of science and art; it’s a weighty task that beckons us to navigate with zeal and caution. Here lies our unparalleled chance to influence humanity’s trajectory, yet we must remember to gracefully balance our acquired wisdom against the vast, ever-present unknowns.
Conclusion
As we conclude, we’re reminded of the timeless rhythm of humanity’s quest: from the ethereal mists of the Delphic Oracle to the digital pulses of algorithms. This cyclical endeavour to decipher tomorrow underscores our unyielding curiosity, a reflection of our innate need to foresee, understand, prepare, and connect with the uncertain embrace of the future.
Introduction
Navigating life’s unpredictability often resembles the exhilarating world of alpine skiing. Mikaela Shiffrin, a superstar of the sport, is known for a guiding principle that sums up her high-performance mindset: hold yourself to the highest standards, but keep your expectations of outcomes in check.
While taken from the realm of competitive skiing, this guiding principle resonates profoundly beyond sports, offering the transformative potential to shape our personal and professional lives. It emphasises maintaining high quality and performance standards while tempering expectations around future outcomes. So, how can we cultivate this mindset, and what benefits can it bring?
Standards vs expectations
Fundamentally, standards are often seen as the internal benchmarks or criteria we set for ourselves, encompassing our definitions of quality, competence, or excellence. They are self-generated and typically align with our values, aspirations, and sense of identity. On the other hand, expectations represent our forecasts or assumptions about future events or outcomes. While our personal beliefs and experiences shape them, they are also susceptible to external influences such as societal norms, peer input, or past results. These predictions can significantly influence our emotional responses and subsequent actions, for better or worse.
Insights from the leadership and strategy expert Sydney Finkelstein align well with Shiffrin’s principle; Finkelstein stresses the value of embracing surprise and expecting the unexpected.
This mindset promotes the potent power of adaptability, urging us to welcome the unexpected with open arms. His emphasis on surprise complements Shiffrin’s philosophy and brings a new dynamic to it, teaching us that the keys to success lie in our ability to pivot, adapt and thrive amidst life’s most surprising turns.
Maintaining excellence and expectations
We should strive for excellence in our pursuits, whether it’s producing top-quality work or meeting project timelines. However, it’s crucial to remain aware that external factors like market fluctuations, organisational shifts, or managerial decisions could impact our anticipated outcomes.
Applying this perspective across various facets of our professional life can yield significant benefits. The following strategies amalgamate Shiffrin’s principle and Finkelstein’s insights:
- Foster a Growth Mindset: Shift the focus from the final outcome to the effort and process. Emphasise the value of consistent effort and dedication rather than setting unattainable, vague targets. This mindset can be reinforced by celebrating the consistent efforts and hard work involved in achieving professional milestones.
- Encourage Personal Bests: Remind everyone that success isn’t always about outperforming others but about personal growth, continuous learning, and achieving personal bests, irrespective of external markers of success.
- Allow Space for Mistakes: Encourage learning from failures. This approach cultivates resilience and adaptability, essential traits in any professional setting.
- Offer Continuous Support: Extend support during the process, not merely after achieving the outcome. This can involve listening empathetically, providing constructive feedback, or offering resources for professional development.
Striking a balance
Among these strategies, it’s vital to remember that balance is key, particularly when it comes to praise and reassurance. Excessive or unfounded praise can unintentionally communicate low expectations, undermining the motivational power of genuine appreciation and constructive feedback. It’s a delicate act of maintaining high standards and keeping expectations in check — a true testament to the wisdom of Shiffrin and Finkelstein in our professional pursuits.
Shiffrin’s approach to maintaining high standards while tempering expectations, coupled with Finkelstein’s emphasis on embracing surprise and adaptability, provides a robust framework to navigate the complex landscape of the professional world. This balanced methodology promotes growth, resilience, and adaptability amidst life’s unpredictable twists and turns, transforming us from passive observers to active, resilient participants in life’s dynamic game.
Exercise and positive expectations
The integration of this philosophy extends beyond professional life into our approach to exercise and overall well-being. A study by Hendrik Mothes and colleagues at the University of Freiburg highlights that individuals’ expectations and beliefs significantly influence the psychological and neurophysiological benefits arising from a single exercise session. Participants holding positive expectations about exercise’s benefits consistently reported greater psychological benefits, including increased enjoyment, mood enhancement, anxiety reduction, and a rise in alpha-2 brain waves, indicating relaxation.
Summary
Such findings underscore the profound impact our mindset, expectations, and internal narratives can have on our health journeys. In high-pressure environments—whether they’re sporting arenas or corporate boardrooms—the pressure to meet personal and external expectations can be overwhelming. Ambition can motivate and drive progress, but continuous high-pressure situations can lead to mental health issues like anxiety, stress, and depression.
Organisations must balance their success drive with care for their employees’ mental well-being to foster healthier and more productive environments. Initiatives like emotional well-being programmes provide structures to support employees’ mental health, offering varying levels of care and engagement tailored to individuals’ needs.
By embracing a mindset that unifies an understanding of mental health with Shiffrin’s high-standards-low-expectations approach, we can embark on a holistic path towards better physical and psychological well-being. This integrated approach can significantly enhance our quality of life and performance across multiple life spheres.
More on positivity
- We have yet to establish scientifically how and why optimism influences these diagnostics, but the empirical results are clear-cut. (Read more)
- “Regarding and building upon the last point, mindfulness is deriving positives from what is happening now or not allowing the negatives to alter the future detrimentally.” (Read more)
Introduction
In 1930, John Maynard Keynes, the man regarded as the founder of macroeconomics, from whom Keynesian economics takes its name, predicted that in one hundred years’ time the average human workweek would clock in at fifteen hours [1]. We’re still seven years away from that hundred-year milestone, but barring a remarkable turnaround it seems Keynes’ prediction will be proved wrong, and drastically so.
Not only are people generally working between 35 and 50 hours a week – depending on country, role and so on – but many are engaged in that uniquely 21st century phenomenon, the side-hustle. According to research for the Trades Union Congress, one in seven workers in Britain now partakes in gig-economy jobs like Uber driving or Amazon delivery at least once a week, many on top of full-time employment [2].
Meanwhile, digital tools have made it possible to work from pretty much anywhere, at pretty much any time. This was supposed to usher in a new age of liberation: the worker, no longer constrained by an office environment or nine-to-five schedule, would be free to live the life they always wanted. In reality, it has simply meant that the expectation of swift email correspondence has extended its reach into evenings, weekends and even holidays. That Edenic notion of free time signed off its suicide note with a customary “sent from my iPhone” footer.
The sense of never-ending malaise that occupies the modern employee is perhaps best captured by the TV show Severance. Centred on a fictional procedure that severs the work self from the free-time self, the show darkly and comically skewers the torturous undertakings a zombified worker self endures at the hands of the malevolent corporation that employs him, an inescapable labour prison whose ramifications naturally spill out of the office to bruise both selves equally. It’s not hard to see why viewers relate.
Keynes’ prediction was based on the myriad changes wrought upon 20th century work culture by technological innovation and societal adjustment in the wake of the industrial revolution. In Keynes’ lifetime, the average workday dropped from fourteen hours to eight [3]. Understanding that greater advancements were yet to come, Keynes posited that the trend would continue.
He was right that further innovations in tech would make working practices substantially easier, with everything from printers to Excel to Zoom as obvious examples. But while those advancements reduced the time it takes workers to complete everyday tasks, in practice they simply meant workers were expected to undertake more tasks within their allotted nine-to-five (or often longer) shifts.
Keynes’ great contemporary, the philosopher Bertrand Russell, diagnosed many of the issues with modern work culture in his 1932 essay “In Praise of Idleness”. Russell wrote, “A great deal of harm is being done in the modern world by belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work” [4].
Perhaps more presciently, and with an evergreen tinge, he wrote:
Modern methods of production have given us the possibility of ease and security for all; we have chosen instead to have overwork for some and starvation for others. Hitherto we have continued to be as energetic as we were before there were machines. [5]
With Keynes’ foreseen fifteen-hour week out the window, then, how much should we work, really? Provided we have tasks to fulfil, a sense of pride in our roles that dictates our output should be of a certain quality, and a life outside of work from which we hope to derive pleasure and meaning, what is the optimum time we should give to our professional endeavours? The answer is dependent on our role, abilities, temperaments and life circumstances, of course. But there are those advocating specific solutions, and it’s worth assessing the merits of each.
What a way to make a living
The nine-to-five is very much the status quo when it comes to our working schedules. The phrase has become shorthand for work itself: nine-to-five equals work, even as many employers drag the last of those numbers up and up and up.
The nine-to-five got its start in 1926 under Henry Ford at his namesake Ford Motor Company [6]. At the time, it represented a reduction in working hours and was celebrated for obvious reasons. Ford workers manned the assembly line; by putting them on eight-hour shifts, the company could cover the 24-hour day with three crews without placing undue demands on staff. Once Ford set the ball rolling and the new schedule proved successful, the system was adopted in many countries around the world and persisted almost unquestioned (in any meaningful sense) until the pandemic in 2020.
Covid disrupted a litany of accepted notions regarding working practices. Once the flexibility of home working was made commonplace (and even governmentally mandated), it was only a matter of time before workers started to question why they couldn’t add a little flexibility to their hours too.
The nine-to-five has some obvious flaws. In 1926, the expectation was that the man of the house would work while his wife stayed home to handle domestic and child-rearing duties. Things have obviously progressed since then: most households now contain two earners. Juggling parental obligations around an in-office nine-to-five is extremely difficult and often means sacrificing either valuable time with one’s child or professional progress.
The most damning argument against the nine-to-five is that studies show it to be inefficient. A 2016 survey of 1,989 UK office workers found that over the course of an eight-hour workday, the average employee does just two hours and 53 minutes of productive work, barely more than a third of the day [7]. The rest of the time is spent reading the news, browsing social media, eating, socialising, taking cigarette breaks, and searching for new jobs. Essentially, people are dragging out their tasks to fill the time, and are less fulfilled, less productive, less happy and less healthy for it.
In response to the limitations associated with the traditional nine-to-five five-day week, variations on the formula are becoming increasingly prevalent, as well as increasingly in-demand.
The four-day week
Four-day work weeks are becoming more common. Advocates claim that by providing employees with an extra day of rest, the four-day work week reduces employee anxiety and stress while facilitating better sleep and more time to exercise. Those benefits then pay dividends when it comes to the quality of employee output and increased productivity.
The biggest recent study on the subject was a report by the advocacy groups 4 Day Week Global and 4 Day Week Campaign, with the assistance of researchers from Boston College and the University of Cambridge. The report’s findings show that roughly 40% of respondents said they experienced less work-related stress, and 71% reported lower levels of burnout. More than 40% said their mental health had improved, with significant numbers of employees reporting decreases in anxiety and negative emotions [8].
Nearly half of workers involved said they weren’t as tired as they were before the experiment, and 40% said it was easier to get to sleep. In the end, 96% of employees said they preferred four-day schedules. At the same time, company revenue increased by an average of roughly 1% over the six-month period, while employee turnover and absenteeism went down. Almost all of the businesses in the programme said they planned to continue with a four-day work week once the experiment was over [9].
The data is striking, and backed up by other studies. In 2019, Microsoft Japan introduced a four-day working week and reported a 40% boost in productivity [10]. In Sweden, a two-year government study of retirement-home workers in Gothenburg, conducted from 2015 to 2017, found that by its end people were happier, less stressed, and enjoyed work more [11].
Another benefit of the four-day week is environmental. A study by the University of Massachusetts Amherst found that a 10% reduction in working hours cut an individual’s carbon footprint by 8.6% [12]. Minimising the number of days workers commute can have a drastic environmental impact, and should be a further consideration for those thinking of moving away from the five-day nine-to-five.
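A rough worked reading of that statistic, assuming, purely for illustration, that the relationship scales linearly to larger reductions (the study itself reports only the 10% figure):

% Assumption: linear scaling of the Amherst finding; the study does not claim this.
\[
\text{elasticity} \approx \frac{8.6\%}{10\%} = 0.86
\]
% A four-day week cuts hours by 20%, which on this naive reading implies:
\[
0.86 \times 20\% \approx 17\%\ \text{smaller individual footprint}
\]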
The 5-hour workday
Some argue that rather than removing a whole day from the week, it is more efficient to reduce the number of hours worked a day.
Alex Pang, founder of Silicon Valley consultancy Strategy and Rest, visiting scholar at Stanford University, and the author of Rest and The Distraction Addiction, notes that “research indicates that five hours is about the maximum that most of us can concentrate hard on something” [13].
The notion of the five-hour workday gained prominence through Tower Paddle Boards, an online, direct-to-consumer company that sells stand-up paddleboards. In 2015, the company’s CEO Stephan Aarstol offered his employees a deal: if they figured out how to do the same work in less time, they could keep the same salary and leave at 1pm. He also implemented a 5% profit-sharing plan, increasing hourly pay [14]. On the day the company announced the change on its website, it broke its previous daily sales record, booking $50,000 in sales for the first time. By the end of the month, it had sold $1.4m worth of paddleboards, breaking its previous monthly sales record by $600,000.
Inspired by what he saw, David Rhoads, CEO of Blue Street Capital, a California-based company that arranges financing for enterprise IT systems, decided to try the strategy for himself. Three months into Blue Street Capital’s five-hour workday trial, Rhoads found that while the workweek had been cut by three-eighths, the number of calls his employees made per person had doubled. He made the five-hour workday permanent after those three months. Three years in, revenues had risen every year (30% in the first year, 30% in the second) while the company grew from nine to seventeen employees [15].
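Those figures imply a striking jump in per-hour output. A minimal sketch of the arithmetic, assuming the standard 40-hour baseline that the three-eighths figure presupposes:

% Assumption: a 40-hour baseline week (8 hours x 5 days).
\[
40 \times \left(1 - \tfrac{3}{8}\right) = 25\ \text{hours per week}
\]
% Calls per person doubled while hours fell by 3/8, so calls per hour rose by:
\[
\frac{2 \times 40}{25} = 3.2\times
\]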
The five-hour workday, like all approaches, has its flaws. Research shows that people’s creativity fades after five hours of concentration – but not all jobs are creative. Taking the original Ford model as an example, assembly line workers have no reason (efficiency-wise) to shorten their workdays. The same is true for those in administrative roles, those in call centres, and all sorts of other professions.
Jan-Emmanuel de Neve, associate professor of economics and strategy at the University of Oxford’s Saïd Business School, is an advocate of the five-hour workday. He says his research reinforces the argument that five-hour working days lead to greater employee wellbeing, which in turn leads to greater productivity. But he also warns that working in these more limited bursts can actually result in greater employee stress [16].
Rita Fontinha, associate professor in strategic human resource management at the University of Reading’s Henley Business School, agrees: “While a shorter work day could result in better time management and promote concentration, individuals may feel an added pressure to complete tasks on time” [17].
The death of leisure
In his aforementioned 1932 essay, Russell observed that “The idea that the poor should have leisure has always been shocking to the rich” [18]. But 21st century society has gone one further: it now seems far-fetched that anyone at all might have leisure. Free time has been annexed by 24/7 work schedules and commercialised by social media, so that even the most lackadaisical of weekend pursuits are increasingly undertaken “for the gram” rather than for the inherent joy of the activity. The self-improvement zeitgeist has similarly snatched away any pastimes that could potentially be filed under ‘trivial’. As Wessie du Toit notes in the New Statesman:
Meditation and exercise look suspiciously like personal optimisation. Artistic vocations centre on tireless self-promotion to a virtual audience. A movement of “homesteaders” churning their own butter and knitting their own jumpers are simply cosplaying older forms of work, and probably posting the results on Instagram. [19]
What to do
In a society that places a premium on work and prizes workaholics, Russell’s praise for idleness feels more needed, and yet further away, than ever. Trends like the Great Resignation and “quiet quitting” show that worker dissatisfaction is permeating the workforce at large. Shifts to a four-day work week or a five-hour workday could be solutions, granting employees autonomy and opportunity for rest at little to no cost to business, potentially even improving productivity and profits.
But given it took a global pandemic to even vaguely move the world away from Henry Ford’s modus operandi first adopted some 97 years ago, it would be optimistic to think such large-scale changes are on their way any time soon.
References
[2] https://www.newstatesman.com/culture/2023/05/work-four-hours-a-day
[3] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[4] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[5] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[6] https://www.wired.co.uk/article/working-day-time-five-hours
[7] https://www.businessinsider.com/8-hour-workday-may-be-5-hours-too-long-research-suggests-2017-9
[8] https://time.com/6256741/four-day-work-week-benefits/
[9] https://time.com/6256741/four-day-work-week-benefits/
[10] https://www.weforum.org/agenda/2023/03/surprising-benefits-four-day-week/
[11] https://www.businessinsider.com/8-hour-workday-may-be-5-hours-too-long-research-suggests-2017-9
[12] https://www.weforum.org/agenda/2023/03/surprising-benefits-four-day-week/
[13] https://www.wired.co.uk/article/working-day-time-five-hours
[14] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[15] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[16] https://www.wired.co.uk/article/working-day-time-five-hours
[17] https://www.wired.co.uk/article/working-day-time-five-hours
[18] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[19] https://www.newstatesman.com/culture/2023/05/work-four-hours-a-day
Introduction
It was the summer of 1964 when the members of a burgeoning British band named The Rolling Stones found themselves on American soil. They were halfway through their first stateside tour when they made their way to Chess Studios in Chicago, keen to record the follow-up to their debut album. The studio was the hallowed hub of their musical heroes, the cradle of the blues and rock ‘n’ roll genres that shaped their sound. The anticipation was palpable as they stepped into the studio, the very place where legends like Howlin’ Wolf, John Lee Hooker, Bo Diddley, and Muddy Waters had crafted their biggest hits.
In a serendipitous twist of fate, their first encounter at Chess was not with a studio executive or an eager intern but with Muddy Waters himself. He was not wielding a guitar; he was clad in overalls, perched on a ladder, paintbrush in hand, whitewash streaking his face. The Stones were startled, and out of that confusion an opportunity emerged: a perfect juxtaposition of the seemingly mundane and its grand potential.
Keith Richards and the band did not just meet an idol that day; they built a relationship that would later see them tour and work with Muddy, learning first-hand from one of the greats. The Stones’ deep understanding and appreciation of blues music and readiness to learn propelled their career to unprecedented heights, leading them to their first number-one hit, ‘It’s All Over Now’.
Preparation meeting opportunity
This principle of preparation meeting opportunity, a formula often offered as a definition of luck, is equally applicable in the world of work. When individuals and organisations are mentally and practically prepared, they are far more likely to recognise and capitalise on opportunities.
Much like The Rolling Stones recognised the value in learning from a legend like Muddy Waters, forward-thinking companies understand that their talent is their scarcest resource. According to a McKinsey report titled “Organising for the future: Nine keys to becoming a future-ready company,” successful companies anchor their efforts on the principle that talent is indeed scarcer than capital. They continually ask themselves: What talent do we need? How can we attract it? And how can we manage talent most effectively to deliver on our value agenda?
Inclusion & diversity
Inclusion and diversity have surfaced as critical aspects of this talent strategy. A company that fosters an inclusive employee experience becomes an attractive destination for top talent and benefits from the increased profitability associated with diverse leadership.
The Rolling Stones, who had already seen early success, remained hungry for improvement and open to learning from the best in their field. Similarly, organisations and their employees can foster a culture of continuous learning and development, seeking out opportunities in the most unexpected places.
Summary
The story of The Rolling Stones’ encounter with Muddy Waters and their subsequent rise to global fame is not just a story of music and stardom. It’s a tale of recognising and seizing opportunity, preparation meeting chance, and the power of a creative, curious, and prepared mindset.
Whether you’re a fledgling band walking into a legendary recording studio or a company trying to navigate the rapidly changing business landscape, this story serves as a reminder that opportunity can present itself in the most unpredictable ways. The question is, are you ready to grasp it when it does?
Introduction
Coaching has long been viewed as a premium service, frequently offered only to the upper echelons of organisations, the C-suite executives. The potential benefits of coaching in enhancing leadership skills, strategic thinking, and overall effectiveness are well-documented (Gawande, 2011; Coutu & Kauffman, 2009). However, current research also underscores its broader utility across all tiers of an organisation, promoting it as an indispensable instrument for comprehensive personal and professional development (Grover & Furnham, 2016; Wang et al., 2021).
Contemplating a world where coaching benefits could be accessed by every individual within an organisation, irrespective of their position, is invigorating. Envision a Chief Coaching Officer (CCO) guiding this transformation, meticulously integrating coaching into every facet of the organisational structure. Such progressive thinking could trigger a paradigm shift in the corporate landscape.
Coaching now ranks among the top three development tools for modern organisations. A number of global organisations actively utilise coaching, and those that do show marked individual and team improvements.
Coaching Beyond Conventional Domains
Atul Gawande’s (2011) illuminating article “Personal Best” and accompanying TED talk narrate how the power of coaching can transcend traditional spaces and reach unexpected realms like the operating theatre. Gawande invited a retired colleague to observe his surgical technique and offer coaching, effectively bridging the coaching principles of sports and the performing arts with the medical field. This compelling narrative is a testament to the universality of coaching, emphasising its potential for ongoing self-improvement across professional disciplines.
Dispelling Misconceptions Around Coaching
To roll out a comprehensive coaching strategy effectively, we need to challenge the pre-existing association of coaching with the resolution of performance issues, particularly outside the C-suite. Coaching should be viewed not as a remedial measure but as a proactive tool for fostering personal and professional growth. This proactive view promotes an organisational culture where coaching becomes a regular aspect of professional development rather than a response to performance deficiencies.
Expanding the Horizon of Coaching
Consider an early career employee mastering technical skills while being coached to negotiate broader career challenges. Or a mid-level manager augmenting their leadership prowess through a customised development journey. The utility of coaching extends beyond conventional confines, offering numerous benefits, including amplified self-awareness, goal attainment, and improved stress management (Grant, 2013; Bozer & Sarros, 2012).
Introducing the Chief Coaching Officer
The advent of a Chief Coaching Officer (CCO) could revolutionise coaching. By nurturing a coaching culture within the organisation, a CCO can make coaching accessible to all, from entry-level professionals to senior executives. The CCO’s responsibilities would include overseeing the execution of coaching programmes, designing an overarching coaching strategy, and ensuring effective resource allocation. Crucially, the CCO would assess the impact of these initiatives on individual and organisational performance, thereby validating the effectiveness of the coaching interventions.
Addressing Potential Hurdles
The transition towards a coaching culture does not come without its challenges. These range from financial constraints and identifying apt coaches to the potential discomfort of professionals who may be reluctant to expose themselves to scrutiny. Nevertheless, these hurdles are not insurmountable. Retirement, for instance, need not symbolise the end of one’s career; the wealth of experience accumulated by retirees could be channelled into coaching roles. Furthermore, investing in coaching can yield significant returns, not just in the form of avoided mistakes but also through augmented performance (Gawande, 2011).
The Final Word
In our ever-competitive and rapidly evolving world, organisations must recognise the potential benefits of expanding the scope of coaching. Empirical evidence supports its effectiveness as a developmental intervention (Grover & Furnham, 2016; Sharma, 2017; Wang et al., 2021). Adopting an organisation-wide approach to coaching can catalyse individual potential and drive company-wide growth. The appointment of a Chief Coaching Officer can be a strategic move towards fostering a culture of continuous learning and improvement. Ultimately, the goal is to enable every professional to achieve their personal best, regardless of their position or field.
References
Bozer, G., & Sarros, J. C. (2012). Examining the effectiveness of executive coaching on coachees’ performance in the Israeli context. International Journal of Evidence Based Coaching and Mentoring, 10(1), 14–32.
Coutu, D., & Kauffman, C. (2009). What can coaches do for you? Harvard Business Review, 87(1), 92–97.
Gawande, A. (2011). Personal best. The New Yorker, October 3.
Grant, A. M. (2013). The efficacy of executive coaching in times of organisational change. Journal of Change Management, 13(4), 411–429.
Grover, S., & Furnham, A. (2016). Coaching as a developmental intervention in organisations: A systematic review of its effectiveness and the mechanisms underlying it. PLoS ONE, 11(7), e0159137.
Sharma, P. (2017). How coaching adds value in organisations: The role of individual level outcomes. International Journal of Evidence Based Coaching & Mentoring, 15.
Wang, Q., Lai, Y., Xu, X., & McDowall, A. (2021). The effectiveness of workplace coaching: A meta-analysis of contemporary psychologically informed coaching approaches. Journal of Work-Applied Management.
Introduction
Can I interest you in everything, all of the time? This is the question put to us by Bo Burnham’s carnival ringmaster of all things online in ‘Welcome to the Internet’, a song from Inside, his Covid-induced comedy special cum mental unravelling [1]. The song captures the imprisonment of the age: our shared, crooked addiction to the ever-flowing fountain of information that’s rarely more than a few metres from our fingertips. “Here’s a tip for straining pasta; here’s a nine-year-old who died,” he grins, every bit as manic and entrancing as the technology he portrays. We all scroll idly past such tragedies and worse every day on our phones, laptops and tablets. And of course it has an effect.
It’s hardly a secret that doomscrolling is bad for you. Or that too much time online is. Knowing these things does not make separation any easier. We are hooked. According to a paper indexed by the National Center for Biotechnology Information, doomscrolling “appears as a vicious cycle in which users find themselves stuck in a pattern of seeking negative information no matter how bad the news is” [2]. And the news is bad. Take your pick from the growing rolodex of global tragedies. The war in Ukraine. The threatened one in Taiwan. A food crisis in Yemen. Ongoing climate struggles. The world’s greatest living footballer shilling for oil money from a nation with a less-than-stellar human rights record [3]. Name your crisis; the news will find it for you. It is not low on stock.
The bad news
A 2020 Pew Research Center survey of more than 12,000 U.S. adults found that 66% felt worn out by the news. The same study shows that news fatigue is more widespread among the least engaged political news consumers: nearly three-quarters of those who follow political and election news “not too” or “not at all” closely feel exhausted by it (73%), a higher share than among those who follow political news “somewhat” (66%) or “very” closely (56%) [4]. In other words, political disengagement is not the answer to your prayers. Think of the news like Liam Neeson’s avenging father in Taken. It has a very particular set of skills. It will look for you. It will find you. And, to somewhat edit the final line, it will hit you with a debilitating fatigue that stalks you in your work and social life.
Unsurprisingly, this problem is more pronounced amongst the young. An American Press Institute survey found that more than 90% of Gen Z and Millennials report spending at least two hours a day online, including 56% who are online for more than five hours a day and 24% for more than nine [5]. The World Health Organization recommends the public “[tries] to reduce how much you watch, read or listen to news that makes you feel anxious or distressed” [6]. The Reuters Institute for the Study of Journalism at Oxford University, meanwhile, has coined a term for those who struggle with excess news consumption: the infodemically vulnerable [7].
Pandemic fatigue
The pandemic of course played a large role in our collective news fatigue. The Economist called Covid the most dominant news story since the Second World War [8]. I don’t think anyone with even a vague memory of the time would find that surprising. As early as April 2020, the World Health Organization was using the term “infodemic” to describe the abundance of pandemic coverage [9]. For many, disengagement became a vital tool of survival: a tightrope walk of well-being that a single further graph of infections versus hospitalisations threatened to tip off balance.
That said, Covid only served to exacerbate trends that had begun with social media’s growing prevalence and an age of polarisation best exemplified by the Brexit vote in the UK and the election of Donald Trump in the US. The online world moved from a social space to a partisan one. It was important to have a tribe. If you didn’t choose one, one would be ascribed to you, likely unfavourably.
This no doubt contributed to an increased sense of digital fatigue: no longer were people simply consumers of news, they were engagers with it. You did not read an article; you reacted to it. Like, comment, retweet, post. It takes mental energy not simply to stay engaged but to embody engagement, building a mini-brand around your beliefs, demonstrated through the content you choose to respond to and pass on to other like-minded consumers. Social media ceased to be that Edenic place where you would blissfully log on to see what your friends were up to. Instead, it became an algorithmically dictated carousel of partisan avatars looking to prove their moral and intellectual credentials, often at the expense of an equally engaged opposing force.
Staying engaged
As noted in the Athens Journal of Mass Media and Communications, “research has confirmed the mental health impact of news consumption. One study found heightened anxiety, even sadness, in people who watched negative news-related material, such as bulletins, after only minutes” [10]. For citizens who want to remain engaged, then, there exists a quandary: do you sacrifice your own well-being out of a sense of civic duty, or do you cut down on your consumption, willfully opting for the bliss of ignorance?
Of course, the choice is not actually so binary. Like most things, it’s about balance. If you sense that you are feeling overwhelmed by the news, especially if you are conscious that you spend more time following it than is recommended, take a step back. A recent Texas Tech University study of people with problematic or high levels of news consumption found that nearly 74% experienced stress or anxiety “quite a bit” or “very much”, while 61% reported feeling physically ill “quite a bit” or “very much” [11]. If you recognise yourself in those brackets, step back.
Targeted screen time
Time notes that “Excessive screen time has been shown to have negative effects on children and adolescents. It’s been linked to psychological problems, such as higher rates of depression and anxiety, as well as health issues like poor sleep and higher rates of obesity” [12]. The effects on adults are less well-documented, but are thought to be only mildly less potent. Yet as Yalda T. Uhls, assistant adjunct professor of psychology at UCLA, says, how much time you spend on your phone is far less pertinent than the content you’re consuming. To avoid news fatigue, you don’t need to throw your phone in the ocean and set up camp in Timbuktu. You can still use your phone. Just be sure to pay attention to what you’re paying attention to.
Cutting back on social media seems the best way to help yourself. A 2018 study in the Journal of Social and Clinical Psychology assessed the effects of Facebook, Instagram and Snapchat on the mental health of 143 college students. It found that when young people who showed depressive symptoms at the start of the study reduced their social media use to just 10 minutes per day on each platform, a total of 30 minutes per day, for three weeks, their symptoms of depression and loneliness decreased [13].
Melissa Hunt, associate director of clinical training in the department of psychology at the University of Pennsylvania and author of the above study, notes that, “It’s not that social media is in and of itself inherently problematic. It’s that using too much of it, or using it in the wrong way, is very problematic. My advice is if you’re going to use social media, follow friends for about one hour a day” [14]. A Canadian study during the pandemic found that the best way to boost mental and general health was to combine a reduction in screen time with increased outdoor exercise [15].
Switching off
Essentially, then, the solution is as simple as it is difficult: spend less time engaging with content that drains you. The obvious problem with that advice is that we are rarely engaging with such content blindly. Awareness that we are overdoing it does not preclude us from clicking on that next enticingly provocative link.
If you’re really struggling, going cold turkey might be the solution. Set limits on your phone so that there is at least some kind of barrier in place. Tell a friend or partner that you’re looking to disengage. Vocalising your intentions will likely help. If not, having someone willing to check in on you or hold you accountable is useful motivation.
And for those who don’t wish to scale back their engagement out of a sense of civic responsibility: know that you’re not helping anyone by draining yourself in the name of staying informed. Making a martyr of yourself is futile. Adopt the oxygen mask rule: save yourself first. Then, once you’re set, you’ll be that much better placed to help others.
References
[1] https://www.netflix.com/title/81289483
[6] https://www.who.int/docs/default-source/searo/bangladesh/2019-ncov/mental-health-covid-19.pdf
[7] https://www.athensjournals.gr/media/2022-8-3-1-Fitzpatrick.pdf
[9] https://www.athensjournals.gr/media/2022-8-3-1-Fitzpatrick.pdf
[10] https://www.athensjournals.gr/media/2022-8-3-1-Fitzpatrick.pdf
[11] https://www.tandfonline.com/doi/abs/10.1080/10410236.2022.2106086?journalCode=hhth20
[12] https://time.com/6174510/how-much-screen-time-is-too-much/
[13] https://time.com/6174510/how-much-screen-time-is-too-much/
[14] https://time.com/6174510/how-much-screen-time-is-too-much/
[15] https://www150.statcan.gc.ca/n1/pub/82-003-x/2020006/article/00001-eng.htm
Introduction
Nathaniel Hawthorne’s 1843 short story “The Birth-Mark” centres on a young scientific scholar who develops an unhealthy obsession with a small red birthmark on his wife’s cheek. His wife is noted for her beauty, but for the young scholar the issue lies in his bride’s tantalising proximity to perfection. This one tiny aberration proves too much for him to take. He ascribes the birthmark additional meaning, viewing it as a sign of the “fatal flaw of humanity” and his wife’s “liability to sin, sorrow, decay and death” [1].
His bride comes to internalise her husband’s feelings and so asks him to remove this single display of her physical fallibility – to “fix” her. He conceals her in a boudoir by his laboratory and subjects her to a variety of alchemical concoctions. The wife observes of her husband that “his most splendid successes were almost invariably failures, if compared with the ideal at which he aimed.” Eventually one of his potions succeeds in removing the birthmark. However, no sooner has it done so than his young bride passes away.
Hawthorne’s tale of the ruinous effects of perfectionism echoes louder today than ever. A study of over 41,000 people published by Thomas Curran and Andrew Hill in Psychological Bulletin found that perfectionism’s prevalence in society has increased [2]. Their study, the first of its kind to compare perfectionism across generations (from 1989 to 2016), found significant increases in rates of perfectionism among recent undergraduates in the US, UK and Canada compared with previous generations [3]. Kate Rasmussen, who researches child development and perfectionism at West Virginia University, says that today, “As many as two in five kids and adolescents are perfectionists…We’re starting to talk about how it’s heading toward an epidemic and public health issue” [4].
As Amanda Ruggeri notes, writing on the subject for the BBC, that rise in perfectionism “doesn’t mean each generation is becoming more accomplished. It means we’re getting sicker, sadder and even undermining our own potential” [5].
The Perfect Body
As Hawthorne’s 19th century story demonstrates, perfectionism is nothing new. But aspects of today’s society serve to exacerbate it. The parallel between the young scholar and his bride in “The Birth-Mark” and the prevalence of plastic surgery in today’s society is obvious. As the bride died in the name of her husband’s obsessive pursuit, so too does a fraction of today’s patients die under the knife in pursuit of plasticised perfection. Others irrevocably change their appearance again and again, never quite satisfied, always certain they’re just one operation away. They place their faith in tomorrows, which of course are only – but more crucially, always – a day away.
Cosmetic surgery deaths are the most extreme and garish example of the perfectionism phenomenon, but the rot runs far deeper. There’s no longer a need to fly to the Dominican Republic for cheap tummy-tuck surgery, after all. Nowadays, one can simply use a phone app to slice away the flab, add some colour to the skin or remove those pesky, unwanted pimples from photos. Any and all flaws you see in yourself can be concealed at the click of a button. Social media is awash with amended images of falsified selves living picturesque lives. These beautiful unrealities are designed to suppress insecurities (which they end up exponentially worsening) through the fleeting validation afforded by the likes of friends and strangers, who are themselves represented by equally falsified avatars.
Unsurprisingly, social media’s impact on body image is most harmful to the young, who spend disproportionate amounts of time online (a survey by the nonprofit research organisation Common Sense Media found that average screen time for 13-18 year-olds was eight hours and 39 minutes a day). Research published by the American Psychological Association found that “teens and young adults who reduced their social media use by 50% for just a few weeks saw significant improvement in how they felt about both their weight and their overall appearance compared with peers who maintained consistent levels of social media use” [6].
Meanwhile, the World Health Organization has stated that record numbers of young people are experiencing mental illness, with depression, anxiety and suicidal ideation more common in the US, Canada and UK today than a decade ago [7]. This is of course not all down to the ease of Photoshop botox or sepia Instagram filters. To think the nefarious effects of perfectionism are limited to the physical would be to miss the mark. Dissatisfaction with the body stems from the mind.
The Perfect Self
Curran and Hill ascribe the steep rise in perfectionism amongst today’s youth to “increasingly demanding social and economic parameters” as well as “increasingly anxious and controlling parental practices” [8]. In The Tyranny of Merit, the American philosopher Michael Sandel argues that meritocratic capitalism has created a permanent state of competition within society. This system sustains an order of winners and losers, “breeding hubris and self-congratulation among the former and chronically low self-worth among the latter” [9].
Millennials and younger generations are far more likely to have undertaken some form of higher education than their predecessors. And yet graduates, even those with highly specialised Master’s degrees, are struggling to find work in increasingly oversaturated markets. Those who do gain employment often settle for junior roles consisting of administrative duties that fail to make use of their (generally hugely expensive) education. They find themselves on the unrewarding bottom rung of a ladder that makes no guarantee of further ascent and pays them so little (if at all, in the era of the unpaid internship) that they often struggle to make rent, or rely on a litany of exhausting side-hustles to do so.
Josh Cohen, a psychoanalyst and professor of modern literary theory at Goldsmiths, University of London, notes that in such a culture “young people are likely to grow dissatisfied both with what they have and who they are.” Meanwhile, “social media creates additional pressure to construct a perfect public image, exacerbating our feelings of inadequacy” [10].
Concurrently, self-help gurus all around us preach the importance of betterment – educational, emotional, physical, financial – in droves. The idea that we must seek something better than what we have – something graspable if we just put the work in (or better yet, like and subscribe) – is pernicious, and fuels the fire of inadequacy. If we are seeking something better, it’s because something is lacking in what we have. No wonder we feel unfulfilled when we’re surrounded by false prophets promising they hold the key to fulfilment’s kingdom. Who knew satisfaction was just a pricey online self-help course away?
According to York St John University research examining 43 studies over 20 years, perfectionism is linked to burnout as well as depression, anxiety and even mortality [11]. That’s right: perfectionists die younger [12]. If perfectionism is linked to such disastrous outcomes, why can’t we kick the habit?
Positive perfection
Because we don’t want to. For all its obvious faults, most of us still attribute some kind of value to perfectionism. It’s become job interview parody to say that your greatest weakness is that you’re a perfectionist. As we all know, this is really a sneaky positive. Perfectionism has come to be associated with a strong work ethic, ambition, and high attainment.
Researchers argue that these benefits are illusory. Sarah Egan, a senior research fellow at Curtin University in Perth who specialises in perfectionism, eating disorders and anxiety, notes that “the difficult part of [perfectionism], and what makes it different than depression or anxiety, is that the person often values it. If we have anxiety or depression, we don’t value those symptoms. We want to get rid of them. When we see a person with perfectionism, they can often be ambivalent towards change. People say it brings them benefits.”
It becomes an endless loop. Perfectionism brings a person dissatisfaction – maybe even depression or suicidal ideation – so they go to a therapist to fix the problem. They want to get rid of the depression, they say, but they don’t want to lose the perfectionism that contributes to it as they believe it offers them something essential. It’s like going to a personal trainer and demanding they help you lose weight while telling them you have no intention of cutting the daily fast food, sugary drinks and excessive alcohol from your diet. Something’s gotta give.
Obviously ambition, diligence and high standards are positive traits. The problem is that these traits are wrongly associated with a kind of “healthy” perfectionism, when really they’re not perfectionism at all. They’re conscientiousness [13]. As Hill notes, “Perfectionism isn’t about high standards. It’s about unrealistic standards…Perfectionism isn’t a behaviour. It’s a way of thinking about yourself” [14]. Wanting to do well is good. Beating yourself up if you don’t is not.
Humanity’s quest for perfection
It’s possible that perfectionism is simply part of our nature. Cohen draws parallels between the human striving for perfection and religious and mythical tales of divine wrath. Prometheus and the Tower of Babel provide examples of what happens when man overextends his reach: the relevant divinity rains down punishment. According to Cohen, “Religious striving for moral and spiritual improvement goes in tandem with the sombre recognition that perfection belongs to God alone.” Or more strikingly, “In the religious imagination, the notion of human perfection is blasphemy” [15].
What to do?
As the late author David Foster Wallace noted, “if your fidelity to perfectionism is too high, you never do anything” [16]. The writer Rebecca Solnit is similarly succinct in her denigration of perfection’s pitfalls: “The perfect is not only the enemy of the good; it’s also the enemy of the realistic, the possible, and the fun” [17].
Breaking out of the perfectionism trap takes a shift in mindset: an acceptance of fallibility. Perfectionism is a false economy; you cannot achieve the perfect, so why keep trying to? Elizabeth Gilbert calls perfectionism nothing more than “fear in fancy shoes and a mink coat” [18]. And even that may be giving it too much credit.
Hard as society makes it, it’s important not to focus on the birthmark. Better to reserve your attention for the living, breathing woman on whose cheek it sits. Singer Jeff Rosenstock’s primal shouts on his album closer ‘Perfect Sound Whatever’ sum it up nicely: “Perfect always takes so long because it don’t exist. It doesn’t exist. It doesn’t exist. It doesn’t exist. It doesn’t exist.”
It can be worth remembering that.
References
[1] http://www.lem.seed.pr.gov.br/arquivos/File/livrosliteraturaingles/birthmark.pdf
[3] https://www.apa.org/pubs/journals/releases/bul-bul0000138.pdf
[4] https://www.bbc.com/future/article/20180219-toxic-perfectionism-is-on-the-rise
[5] https://www.bbc.com/future/article/20180219-toxic-perfectionism-is-on-the-rise
[6] https://www.apa.org/news/press/releases/2023/02/social-media-body-image
[7] https://www.bbc.com/future/article/20180219-toxic-perfectionism-is-on-the-rise
[8] https://www.economist.com/1843/2021/08/10/the-perfectionism-trap
[9] https://www.economist.com/1843/2021/08/10/the-perfectionism-trap
[10] https://www.economist.com/1843/2021/08/10/the-perfectionism-trap
[12] https://www.ncbi.nlm.nih.gov/pubmed/19383652
[13] https://www.bbc.com/future/article/20180219-toxic-perfectionism-is-on-the-rise
[14] https://www.bbc.com/future/article/20180219-toxic-perfectionism-is-on-the-rise
[15] https://www.economist.com/1843/2021/08/10/the-perfectionism-trap
[16] https://fs.blog/david-foster-wallace-on-ambition-and-perfectionism/
[17] https://howtoacademy.com/videos/elizabeth-gilbert-lets-call-perfectionism-what-it-really-is-2/
[18] https://howtoacademy.com/videos/elizabeth-gilbert-lets-call-perfectionism-what-it-really-is-2/