
Introduction
The UK government caused controversy recently by rolling back its commitments to net zero. Commenting on the backpedalling, Ford’s UK chair Lisa Brankin raised concerns, citing the sweeping changes Ford (and every other company in the automotive sector) has been making to comply with the now-reversed ban on the sale of new petrol and diesel cars by 2030.
Describing the UK 2030 target as “a vital catalyst to accelerate Ford into a cleaner future,” Brankin added: “Our business needs three things from the UK government: ambition, commitment and consistency. A relaxation of 2030 would undermine all three” [1].
In essence, the government has undermined the trust of Ford and every other leading player in the automotive sector. It said it was committed to something and then showed it was not. It had the whole sector aligned and working at breakneck speed to overhaul old practices, only for those companies to discover they had been wasting their time and money.
As the old saying goes, trust is hard to earn and easy to lose. And in business, a loss of trust is catastrophic.
The value of trust
Research published by Harvard Business Review found that workers at companies where trust is high report 106% greater energy in the office, 74% lower stress levels, 76% greater engagement, and 50% more productivity than their peers at low-trust businesses [2].
Meanwhile PwC reports that 91% of business executives say their ability to build and maintain trust improves the bottom line (including 50% who strongly agree), 58% of consumers say they have recommended a company they trust to friends and family, and 64% of employees say they recommended a company as a place to work because they trusted it [3].
Trust pays. It builds relationships – both internally and with clients – and only grows stronger with time. It produces happier, more productive employees and reaps dividends in profit. Evidently, then, it’s something worth investing in. But to do so, we first need to clarify what we mean by trust.
What is trust?
Writing for Forbes, John Hall, a motivational speaker and co-founder of the time and scheduling management app Calendar, says workplace trust relies on two fundamentals: “First, every team member is making their best effort to further the interests of the company; second, everyone assumes that fact about everyone else on the team unless they see evidence to the contrary” [4].
Set aside trust falls, office ping pong and other performative variants of workplace integration: trust boils down to something more fundamental, namely whether you are doing your best and giving everyone else on your team the courtesy of assuming they’re doing the same.
This second part can prove especially difficult. We can control our own work ethic, not others’. And within almost all office environments there’s a sense of competitiveness: the rate and quality of your output exist in constant competition with those of your colleagues. Who’s in the boss’s good books? Who’s getting the bonus? The promotion?
All these considerations can’t help but cultivate friction. Out of pride, or to build up our own sense of self-worth, we may like to think our colleagues aren’t working as hard or to as high a standard as we are. This is misguided. We need to bestow trust freely and unsparingly. On the question of how to decide who is trustworthy, Ernest Hemingway put the answer most succinctly: “The best way to find out if you can trust somebody is to trust them” [5].
It’s a leap of faith. That’s what trust is at its core. And until somebody gives you a reason not to trust them, your best bet is to give them the benefit of the doubt.
The science of trust
In an era marred by a seemingly endless carousel of corporate jargon and buzzwords, it’s possible to read about the notion of trust and think it’s more of the same – a benevolent, ultimately abstract notion that holds no quantifiable value but makes for a useful throwaway LinkedIn post or hastily churned-out blog. But there is a science to trust, as demonstrated by Paul J. Zak, founding director of the Center for Neuroeconomics Studies, professor of economics, psychology, and management at Claremont Graduate University, and CEO of Immersion Neuroscience.
Having seen in rodents that a rise in the brain’s oxytocin levels signified that another animal was safe to approach, Zak wondered if the same was true for humans. He conducted an experiment following the model of Nobel laureate in economics Vernon Smith [6]. In the experiment, a participant would choose an amount of money to send to a stranger via computer, knowing that the amount they chose to send would triple once they’d sent it. The recipient would then have the option of sharing this tripled amount with the sender or keeping all the cash for themselves. It was a trust exercise made of two parts. First, how much do you send? Second, do you share or steal?
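The exchange Zak borrowed from Vernon Smith’s work is a version of what economists call the “trust game”, and its payoff arithmetic is simple enough to sketch in a few lines of Python (an illustrative model only, not the researchers’ code; the function name and parameters are invented for this example):

```python
# Minimal sketch of the trust game described above: the sender's
# transfer triples in transit, and the receiver then chooses what
# fraction of the tripled pot to return.

def trust_game(endowment, sent, return_fraction):
    """Return (sender_payoff, receiver_payoff) for one round.

    endowment: money the sender starts with
    sent: amount the sender transfers (0 <= sent <= endowment)
    return_fraction: share of the tripled pot the receiver sends back
    """
    pot = sent * 3                     # the transfer triples once sent
    returned = pot * return_fraction   # receiver's choice: share or keep
    sender_payoff = endowment - sent + returned
    receiver_payoff = pot - returned
    return sender_payoff, receiver_payoff

# Full trust with an even split of the tripled pot:
print(trust_game(10, 10, 0.5))   # (15.0, 15.0)
# No trust: the sender keeps the endowment, the receiver gets nothing:
print(trust_game(10, 0, 0.5))    # (10.0, 0.0)
```

With full trust and an even split, both parties end up better off than if nothing had been sent, which is exactly why the amount sent serves as a clean behavioural measure of trust.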
To measure oxytocin levels during the exchange, Zak and his colleagues developed a protocol to draw blood from people’s arms before and immediately after they made decisions to trust others (if they were senders) or to be trustworthy (if they were receivers). The participants were not informed as to the content of the study (and even if they had been, they still would have had no control over the amount of oxytocin their bodies released).
They found that the more money people received (denoting greater trust on the part of senders), the more oxytocin their brains produced. The amount of oxytocin recipients produced then also predicted how trustworthy – that is, how likely to share the money – they would be. To prove that this was not just a result of the brain randomly generating chemicals, they performed further tests, administering doses of synthetic oxytocin into the brain through nasal spray and comparing participants who’d had a dose with those who’d had a placebo. They found that giving people 24 IU of synthetic oxytocin more than doubled the amount of money they sent to a stranger.
To ensure that the oxytocin spray did not cognitively impair the participants – and thus that their actions were born of trust rather than brain fog or psychosis – they performed other tests, this time replacing the money test with a gambling model. They found that increased oxytocin led to no rise in risk taking. In other words, the sole and genuine effect of increased oxytocin was to reduce the fear of trusting a stranger.
Over the following ten years, during which he conducted various further tests on oxytocin levels, Zak found that stress is a potent oxytocin inhibitor, and that oxytocin increases a person’s empathy – a vital tool for any act that requires collaboration.
How to develop trust
There is a gap between how executives see trust in business and how employees and customers see it. According to PwC, 84% of business executives think that customers highly trust their company, yet only 27% of customers say the same. Similarly, 79% of business executives say their employees trust the company, but only 65% of employees agree [7]. Clearly, then, the first step a higher-up can take to improve trust in the company is to recognise that it’s lacking.
Zak’s continued research shows that recognition and attainment of goals are the most proven ways of garnering trust. “The neuroscience shows that recognition has the largest effect on trust when it occurs immediately after a goal has been met, when it comes from peers, and when it’s tangible, unexpected, personal, and public” [8].
Setting goals that are difficult but achievable is crucial. The moderate stress of the task releases neurochemicals, including oxytocin and adrenocorticotropin, that intensify people’s focus and strengthen social connections. However, the challenges have to be achievable and have a clear endpoint. Research shows that vague goals cause employees to give up before they’ve even started.
Messaging and communication are pivotal to trust within an organisation. Internal trust networks are hard to maintain because the flow of communication is far looser and less restrained than in a strictly employee-client relationship. Organisations send their workers multiple, often contradictory messages every day. Different departments work towards distinct, sometimes conflicting goals. Maintaining alignment to a clear, single message is extremely difficult and does not happen by accident.
Inconsistent messaging, inconsistent standards and false feedback all contribute to the sense of a company unworthy of trust. If one boss is asking workers to pull one way while another boss asks them to pull the other, employees will lose faith in management. This is even more true when it is just one boss flip-flopping on the direction of travel, unsure of their own wants.
Regarding standards, if a boss sets a line, verbal or written, as to what counts as acceptable behaviour or the demanded standard of work, but then fails to live up to that standard themselves, trust will quickly dissipate. The same is true if they let others get away with clear or repeated breaches, especially if the boss is thought to be playing favourites. It is for managers to set the tone and take responsibility for their organisation. A leader’s words and actions are ascribed deep meaning by their employees, and will be scrutinised heavily. Trust starts at the top and filters down.
Former Herman Miller CEO Max De Pree once said, “The first responsibility of a leader is to define reality. The last is to say thank you. In between the two, the leader must become a servant” [9]. That ability to humble oneself is pivotal to good management.
One way leaders can achieve this is by being willing to ask their workers for help rather than just telling others what to do. This builds trust and connection with employees and signals a secure leader, far more trustworthy than one who pretends to have all the answers. As Steve Jobs said, “It doesn’t make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do” [10]. Ask questions and show a willingness to learn, and you can bet your employees will do the same in turn.
Trust today
Extending trust to employees is more important today than ever before, given the prevalence of home and hybrid working. Employers cannot see and monitor their employees throughout the day as they can in an office, and so must trust their teams to get the work done in a more autonomous fashion.
People can meet the same standards of in-office productivity from home on their own, less constrained schedule. The numbers back it up [11]. But still some companies are wary. We’ve all seen stories of organisations that want to remotely monitor the usage of their workers’ computers throughout the day to check that they are always at their desk during work hours. This draconian approach shows a total lack of trust. Who would want to work for a company that held them in such low regard? What kind of atmosphere does that cultivate? We talk a lot about company culture. Well, a culture that doesn’t trust its staff is unlikely to get the best out of them, and frankly doesn’t deserve to.
Workers will only grow more remote with time. The traditional 9-5 is unlikely to return. Employers need to bestow the requisite levels of trust to get their employees thriving no matter where they are.
Trust is money
Hall recommends we treat trust like we treat money: “Save it carefully, and spend it wisely. You may not be able to measure it like you can a bank balance, but sooner or later, you’ll see it there, too” [12].
Trust is pivotal to any team endeavour, and business is no different. Businesses need to cultivate trust with their consumers. To do so, they must first build it internally, starting from the top. That requires consistent messaging and open communication. It requires humility from leaders, not bullish overconfidence. It requires vulnerability and a willingness to trust someone until they prove you wrong, which inevitably some will. But for companies able to foster a truly trusting environment, one in which every worker is giving their best and working under the assumption that each of their colleagues is doing the same, the rewards are enormous.
References
[3] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html
[6] https://hbr.org/2017/01/the-neuroscience-of-trust
[7] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html
[8] https://hbr.org/2017/01/the-neuroscience-of-trust
[9] https://hbr.org/2017/01/the-neuroscience-of-trust
[10] https://businessfitness.biz/hire-smart-people-and-let-them-do-their-jobs/
[11] https://www.businessnewsdaily.com/15259-working-from-home-more-productive.html

Introduction
As the labour market evolves, organisations have been reconsidering the importance and relevance of degree qualifications in their hiring practices. A trend known as “degree inflation,” which saw an increase in job descriptions requiring degrees even when the roles hadn’t changed, was particularly evident in the early 2000s. However, the trend experienced a reset in the aftermath of the 2008-2009 Great Recession, reducing degree requirements across numerous roles.
This shift is particularly noticeable in middle-skill positions, which require some post-secondary education or training but not necessarily a four-year degree. The reset is also evident, though to a lesser extent, in higher-skill positions. Two waves have driven this trend. First came a structural reset, starting in 2017, characterised by a move away from degree requirements in favour of demonstrated skills and competencies. Second came a cyclical reset, beginning in 2020 and prompted by the Covid-19 pandemic, in which employers temporarily relaxed degree requirements to find skilled workers during the health crisis.
Impact on equality
In the case of Ireland, the shift away from degree requirements has been particularly impactful in increasing female participation in the workforce. According to the latest Labour Market Pulse published by IDA Ireland in partnership with Microsoft and LinkedIn, skills-based hiring and flexible working conditions are integral to increasing female participation in the Irish labour market. The adoption of a skills-first hiring approach has the potential to increase the overall talent pool in Ireland more than six-fold, with a 20% greater increase for women than for men in traditionally male-dominated occupations.
Hard skills vs soft skills
Despite the promising trends, it’s important to note that the degree inflation reset is a work in progress. A significant percentage of job descriptions still list degree requirements, effectively walling out a vast number of potential employees from the candidate pool. Additionally, while many companies have announced the removal of degree requirements, they often still exhibit a higher-than-average demand for college degrees in practice. This suggests that, while hard skills can be easily confirmed, degrees are still seen as a proxy for soft skills, which are harder to assess.
However, the shift away from degree-based hiring compels companies to think more carefully about the skills they need. More explicit descriptions of desired capabilities in job postings are increasing awareness among applicants about the importance of developing soft skills. This could influence skills providers to consider how they can update their curricula to include these skills.
Diversified talent pool
The elimination of inflated degree requirements is a critical step towards achieving equity in the labour market. Companies should reassess the assumptions underlying their recruitment strategies, reconsidering blunt and outdated instruments in favour of more nuanced, skills-focused approaches. This shift is already opening attractive career pathways for workers traditionally overlooked for their lack of a four-year degree. The potential result is a win-win: greater equity for job seekers and a more robust, diversified talent pool for companies to draw from.
Skills-first approach
This trend is particularly beneficial in the Irish context, where the government has set ambitious targets for gender equality and equal representation in leadership. A skills-first approach could be instrumental in activating the skills of underrepresented groups, including women, people with disabilities, and those without third-level education. Ireland can pave the way for a more inclusive, equitable future by eliminating barriers to well-paying jobs.
If you wish to introduce skill-based initiatives, it is critical to contextualise these ideas within your company’s unique circumstances, set clear objectives, and develop strategies for implementation. Here are some actionable insights based on the above points:
- Develop a Learning & Development Strategy: Understand your company’s current capabilities and identify the areas where there’s a skill gap. Invest in the creation of learning and development programs that target these gaps. These could be in-house training, online courses, or educational partnerships.
- Empower Employees to Shape Their Career Path: Create platforms or mechanisms that allow employees to share their interests and career goals. This could be an annual employee survey, open discussions, or a tool integrated into your HR system.
- Create Cross-functional Opportunities: Make it a point to allow employees to participate in projects or tasks outside their usual scope. This will not only allow them to broaden their skills but also to get a better understanding of the overall company operations.
- Incentivise Learning: Make learning an integral part of your company’s culture. Encourage employees to take time out of their work schedule to engage in training and learning activities. Offer rewards or recognition for those who actively participate in these programs or demonstrate new skills.
- Revamp Your Hiring Process: Transition from a credentials-based hiring approach to a skills-based one. Re-evaluate your job descriptions to focus more on the skills required to perform the job rather than academic or professional credentials.
- Introduce Skills Assessments: Implement mechanisms to measure a candidate’s skills during the hiring process objectively. This could include technical assessments, practical exercises, or situational judgement tests.
- Promote Lifelong Learning During Recruitment: During interviews, discuss the company’s learning and development programs and the opportunities for career growth within the organisation. This can make your company more attractive to potential hires.

Introduction
The role of a CEO, once defined by strategy charts and bottom lines, is undergoing a sea change. With constant technological advances, changing business complexities, and societal expectations, CEOs are required to expand their expertise beyond traditional business acumen. Today, a truly great CEO needs to master the art of social skills, demonstrating a keen ability to interact, coordinate, and communicate across multiple dimensions.
As the business landscape continues to grow more complex, the ability to navigate this intricacy has become a defining factor in effective leadership. This holds true for large, publicly listed multinational corporations and for medium to large companies operating in a rapidly evolving marketplace alike. Leaders must possess the skills and acumen to navigate this landscape, make informed decisions, and steer their organisations toward success.
Social Skills
Top executives in these firms are expected to harness their social skills to coordinate diverse and specialised knowledge, solve organisational problems, and facilitate effective internal communication. Further, the interconnected web of critical relationships with external constituencies demands that leaders demonstrate adept communication skills and empathy.
The proliferation of information-processing technologies has also played a crucial role in defining a CEO’s success. As businesses increasingly automate routine tasks, leadership must offer a human touch—judgment, creativity, and perception—that can’t be replicated by technology. In technologically-intensive firms, CEOs need to align a heterogeneous workforce, manage unexpected events, and negotiate decision-making conflicts—tasks best accomplished with robust social skills.
Equally, with most companies relying on similar technological platforms, CEOs need to distinguish themselves through superior management of the people who utilise these tools. As tasks are delegated to technology, leaders with superior social skills will find themselves in high demand, commanding a premium in the labour market.
Transparency
The rise of social media and networking technologies has also transformed the role of CEOs. Moving away from the era of anonymity, CEOs are now expected to be public figures interacting transparently and personally with an increasingly broad range of stakeholders. With real-time platforms capturing and publicising every action, CEOs need to be adept at spontaneous communication and anticipate the ripple effects of their decisions.
Diversity & inclusion
In the contemporary world, great CEOs also need to navigate issues of diversity and inclusion. This calls for a theory of mind—a keen understanding of the mental states of others—enabling CEOs to resonate with diverse employee groups, represent their interests effectively, and create an environment where diverse talent can thrive. (See our article on the Chief Coaching Officer for an alternative solution to this issue.)
Hiring strategies
Given this backdrop, it is essential for organisations to refocus their hiring and leadership development strategies. Instead of relying on traditional methods of leadership cultivation, companies need to build and evaluate social skills among potential leaders systematically.
Current practices, such as rotating through various departments, geographical postings, or executive development programs, aren’t enough. Firms need to design a comprehensive approach to building social skills, even prioritising them over technical skills. High-potential leaders should be placed in roles that require extensive interaction with varied employee populations and external constituencies, and their performance should be closely monitored.
Assessing social skills calls for innovative methods beyond the traditional criteria of work history, technical qualifications, and career trajectory. New tools are needed to provide an objective basis for evaluating and comparing people’s abilities in this domain. While some progress is being made with the use of AI and custom tools for lower-level job seekers, there is a need for further innovation in top-level searches.
Conclusion
In conclusion, the role of the CEO is more multifaceted than ever. The modern world demands that executives possess exceptional social skills, including effective communication, empathetic interaction, and proactive inclusion. Companies need to recognise this change and adapt their leadership development programs accordingly to cultivate CEOs who can effectively lead in the 21st century.

The persistent pulse of inquiry in history
Throughout history, our innate curiosity has been the heartbeat of progress, driving us from basic questions about nature, like “Why does it rain?” to profound existential inquiries, such as “Do we have free will?”. In today’s fast-paced world, the art of asking questions feels somewhat overshadowed by the avalanche of information available. Yet, recognising what we don’t know often serves as the true essence of wisdom.
One lasting method of exploring knowledge through questioning is the Socratic method, a tool from ancient Greece that aids critical thinking, helps unearth solutions, and fosters informed decisions. Its endurance for over 2,500 years stands as a testament to its potency. Plato, a student of Socrates, immortalised his teachings through dialogues or discourses. In these, he delved deep into the nature of justice in the “Republic”, examining the fabric of ideal societies and the character of the just individual.
Questions have not only transformed philosophy but also propelled innovations in various fields. Take, for instance, Alexander Graham Bell, whose inquiries led to the invention of the telephone, or the challenges to traditional beliefs during the Renaissance that led to breakthroughs in art, science, and philosophy. With their profound questions about existence and knowledge, the likes of Kant and Descartes have shaped the philosophical narratives we discuss today.
Critical questioning has upended accepted norms in the scientific realm, leading to paradigm shifts. For example, scepticism of the geocentric model, voiced in antiquity by Aristarchus and the Pythagoreans, revived by Copernicus and championed by Galileo, paved the way for ground-breaking discoveries by Newton and Einstein. At its core, every scientific revolution was birthed from a fundamental question.
On the educational front, the importance of questioning is backed by modern research. Historically, educators have utilised questions to evaluate knowledge, enhance understanding, and cultivate critical thinking. Rather than simply prompting students to recall facts, effective questions stimulate deeper contemplation, urging students to analyse and evaluate concepts. This enriches classroom experiences and deepens understanding in experiential learning settings.
By embracing this age-old method and recognising the power of inquiry, we can better navigate the complexities of our contemporary world.
Questions through the ages: an enduring pursuit of truth
Throughout the annals of time, the act of questioning has permeated our shared human experience. Ancient civilisations like the Greeks laid intellectual foundations with their spirited debates and dialogues, and the sheer depth and diversity of their inquiries stood out. These questions spanned from the cosmos’ intricate designs to the inner workings of the human soul.
Historical literature consistently echoed this thirst for understanding, whether in the East or West. It wasn’t just about obtaining answers; it celebrated the journey of arriving at them. The process of probing, introspection, and subsequent revelation holds a revered spot in our collective memory. The reverence with which we’ve held questions, as seen through the words of philosophers, poets, and thinkers, showcases the ceaseless human spirit in its quest for knowledge.
In today’s interconnected world, the legacy of these inquiries remains ever-pertinent. We live in an era of information, a double-edged sword presenting knowledge and misinformation. As we grapple with this deluge, the skills of discernment and critical inquiry, inherited from our ancestors, are invaluable. It’s no longer just about seeking answers but about discerning the truths among many voices.
With the current rise in misinformation and fake news, a sharpened sense of questioning becomes our compass, guiding us through the mazes of contemporary challenges. By honouring the traditions of the past and adapting them to our present, we continue our timeless pursuit of truth, ensuring that the pulse of inquiry beats strongly within us.
Understanding the Socratic Method
Having recognised the age-old reverence for inquiry, it becomes imperative to explore one of its most pivotal techniques: the Socratic method. Socrates, widely regarded as a paragon of wisdom, believed that life’s true essence lies in perpetual self-examination and introspection. His approach was unique in its time, as he dared to challenge societal norms and assumptions. When proclaimed the wisest man in Greece, he responded not with complacency but with probing inquiry.
The Socratic method transcends a mere question-answer paradigm. Instead, it becomes a catalyst, prompting deep reflection. This dialectical technique fosters enlightenment, not by spoon-feeding answers but by kindling the flames of critical thinking and understanding. The beauty of this method rests not solely in the answers it might yield, but in the journey of introspection and dialogue it necessitates.
Beyond philosophical discourses, this method resonates powerfully in contemporary educational spheres. It underscores that genuine knowledge transcends rote memorisation, emphasising comprehension and enlightenment. This reverence for knowledge stresses the imperative of recognising our limitations, fostering an ethos where learning is ceaseless and dynamic.
In our information-saturated age, the Socratic method’s principles are not just philosophical musings but indispensable. According to Statista, only about 26% of Americans feel adept at discerning fake news, while a concerning 90% inadvertently propagate misinformation. Herein lies the true power of the Socratic approach. It teaches us discernment, evaluation, and the courage to seek clarity continuously. By integrating this method into our lives, we are better equipped to navigate our intricate world, fostering lives marked by clarity, purpose, and profound understanding.
Why the question often surpasses the answer
Having delved into the rich tapestry of historical inquiry and the transformative power of the Socratic method, one may wonder: Why such an emphasis on the question rather than the answer?
We are often trained to seek definite conclusions throughout our educational journey and societal conditioning. Yet, as Socrates demonstrated through his dialogues, there’s profound wisdom in embracing the exploration inherent in questioning. His discussions rarely aimed for definitive answers, suggesting that the reflective process, rather than the conclusion, held deeper significance.
Imagine a complex puzzle. While the completed picture might offer satisfaction, aligning each piece, understanding its intricacies, and appreciating its nuances truly enriches the experience. Similarly, questions, even those without clear-cut resolutions, can expand our horizons, provoke self-assessment, and challenge our preconceived notions. This process broadens our perspectives and fosters a more holistic understanding of our surroundings.
By valuing the act of questioning, we equip ourselves with the tools to navigate ambiguity, confront our limitations, and engage with the world more thoughtfully and profoundly.
The Socratic Method in contemporary frameworks
Socratic questioning involves a disciplined and thoughtful dialogue between two or more people, and its methodologies, rooted in ancient philosophy, remain instrumental in today’s diverse contexts. In the realm of academia, especially within higher education, this collaborative form of questioning is a cornerstone. Educators don’t merely transfer information; they challenge students with introspective questions, compelling them to reflect, engage, and critically evaluate the content presented.
Beyond the classroom, the applicability of the Socratic method stretches wide. Business environments, such as boardrooms and innovation brainstorming sessions, harness the power of Socratic dialogue, pushing participants to confront and rethink assumptions. Professionals employ this method in therapeutic and counselling settings to guide clients in introspective exploration, encouraging clarity and self-awareness.
Through its emphasis on continuous dialogue, deep reflection, and the mutual pursuit of understanding, this age-old method remains a beacon, guiding us as we navigate the ever-evolving complexities of our modern world.
Conclusion: the timeless art of inquiry
From the cobbled streets of ancient Athens to contemporary classrooms, boardrooms, and counselling sessions, the enduring legacy of the Socratic method attests to the potent force of inquiry. By valuing the exploratory process as much as, if not more than, the final insight, we pave a path towards richer understanding, intellectual evolution, and the limitless possibilities of human achievement.
In today’s deluge of data and information, the allure of swift answers is undeniable. Yet, Socrates’ practice reminds us of the transformative power held in the act of questioning. Adopting such a mindset, as this iconic philosopher once did, extends an open invitation to a life punctuated by curiosity, wonder, and unending discovery.

Depending on who you listen to, working from home is either proof of a declining work ethic – evidence of, and contributor to, a global malaise that is hampering productivity, decimating work culture and amplifying isolation and laziness – or it’s a much-needed break from overzealous corporate control, finally giving workers the autonomy to do their jobs when, where and how they want to, with some added benefits to well-being, job satisfaction and quality of work baked in.
Three years on from the pandemic that made WFH models ubiquitous, the practice’s status is oddly divisive. CEOs malign it. Workers love it. Like most statements around WFH, though, that analysis is overly simplistic. So what’s the actual truth: is WFH good, bad or somewhere in between?
The numbers
Before the pandemic, Americans spent 5% of their working time at home. By spring 2020 the figure was 60% [1]. Over the following year it declined to 35%, and it has now stabilised at just over 25% [2]. A 2022 McKinsey survey found that 58% of employed respondents had the option to work from home for all or part of the week [3].
In the UK, according to data released by the Office for National Statistics in February, between September 2022 and January 2023, 16% of the workforce still worked solely from home, while 28% were hybrid workers who split their time between home and the office [4]. Meanwhile, back in 1981, only 1.5% of those in employment reported working mainly from home [5].
The trend is clear. Over the latter part of the 20th century and the early part of the 21st, homeworking increased – unsurprising given the advancements in technology over this period – but the increase wasn’t drastic. With Covid it surged, out of necessity, and proved itself functional and convenient enough that there was limited appetite to put it back in the box once the worst of the crisis was over.
The sceptics
Working from home “does not work for younger people, it doesn’t work for those who want to hustle, it doesn’t work in terms of spontaneous idea generation” and “it doesn’t work for culture.” That’s according to JPMorgan Chase CEO Jamie Dimon [6]. People who work from home are “phoning it in” according to Elon Musk [7]. In-person engineers “get more done,” says Mark Zuckerberg, and “nothing can replace the ability to connect, observe, and create with peers that comes from being physically together,” says Disney CEO Bob Iger [8].
Meanwhile, 85% of employees who were working from home in 2021 said they wanted a hybrid approach of both home and office working in future [9]. It seems there’s a clash, then, between the wants of workers and the wants of their employers.
Brian Elliott, who previously led Slack’s Future Forum research consortium and now advises executive teams on flexible work arrangements, puts the disdain for WFH from major CEOs down to “executive nostalgia” [10].
Whatever the cause, and whether merited or not, feelings are strong – on both sides. Jonathan Levav, a Stanford Graduate School of Business professor who co-authored a widely cited paper finding that videoconferencing hampers idea generation, received furious responses from advocates of remote work. “It’s become a religious belief rather than a thoughtful discussion,” he says [11].
In polarised times, it seems every issue becomes black or white and we must each choose a side to buy into dogmatically. Given the divide seems to exist between those at the upper end of the corporate ladder and those below, it’s especially easy for the WFH debate to fall into a form of tribal class warfare.
Part of the issue is that each side can point to studies showing the evident benefits of their point of view and the evident issues with their opponents. It’s the echo-chamber effect. Some studies show working from home to be more productive. Others show it to be less. Each tribe naturally gravitates to the evidence that best suits their argument. Nuance lies dead on the roadside.
Does WFH benefit productivity?
The jury is still out.
An Owl Labs report on the state of remote work in 2021 found that 90% of respondents working from home said they were at least as productive at home as in the office, and 55% said they worked more hours remotely than they did at the office [12].
On the other end of the spectrum, a paper from Stanford economist Nicholas Bloom, which reviewed existing studies on the topic, found that fully remote workforces on average had a reduced productivity of around 10% [13].
Harvard Business School professor Raj Choudhury, studying government patent examiners who could work from anywhere but gathered in person several times a year, championed a hybrid approach. He found that teams who worked together between 25% and 40% of the time produced the most novel work – better results than those who spent less or more time in the office. Nor did the in-person gatherings have to be weekly: even just a few days each month saw a positive effect [14].
It’s not just about productivity though. Working from home can have a negative impact on career prospects if bosses maintain an executive nostalgia for the old ways of working. Studies show that proximity bias – the idea that being physically near your colleagues is an advantage – persists. A survey of 800 supervisors by the Society for Human Resource Management in 2021 found that 42% said that when assigning tasks, they sometimes forget about remote workers [15].
Similarly, a 2010 study by UC Davis professor Kimberly Elsbach found that when people are seen in the office, even when nothing is known about the quality of their work, they are perceived as more reliable and dependable – and if they are seen off-hours, more committed and dedicated [16].
Other considerations
It’s worth noting other factors outside of productivity that can contribute to the bottom line. As Bloom states, only focusing on productivity is “like saying I’ll never buy a Toyota because a Ferrari will go faster. Well, yes, but it’s a third the price. Fully remote work may be 10% less productive, but if it’s 15% cheaper, it’s actually a very profitable thing to do” [17].
Other cost-saving benefits of a WFH or hybrid work model include potentially allowing businesses to downsize their office space and save on real estate. The United States Patent and Trademark Office (USPTO) estimated that increases in remote work in 2015 saved it $38.2 million [18].
Minimising the need for commuting also helps ecologically. The USPTO estimates that in 2015 its remote workers drove 84 million fewer miles than if they had been travelling to headquarters, reducing carbon emissions by more than 44,000 tons [19].
A hybrid model
Most businesses now tend to favour a hybrid model. Productivity studies, including Bloom’s, which found the 10% productivity drop from fully remote working, tend to concede there’s little to no difference in productivity between full-time office staff and hybrid workers. Some 47% of American workers prefer to work in a hybrid model [20]. In the UK, it’s 58% [21]. McKinsey’s American Opportunity Survey found that when given the chance to work flexibly, 87% of people take it [22].
However, as Annie Dean, whose title is “head of team anywhere” at software firm Atlassian, notes: “For whatever reason, we keep making where we work the lightning rod, when how we work is the thing that is in crisis” [23].
Choudhury backs this up, saying, “There’s good hybrid – and there’s terrible hybrid” [24]. It’s not so much about the model as the method. Institutions that put the time and effort into ensuring their home and hybrid work systems are well-defined – and that there’s still room for discussion, training and brainstorming, all the things that naysayers say are lost to remote working – are likely to thrive.
That said, New Yorker writer Cal Newport points out that firms with good models in place (what he calls “agile management”) are few and far between. Putting such structures in place is beyond the capability of most organisations. “For those not benefiting from good (‘Agile’) management,” he writes, “the physical office is a necessary second-best crutch to help firms get by, because they haven’t gotten around to practising good management” [25].
The future
Major CEOs may want a return to full-time office structures, but a change seems unlikely. You can’t put the genie back in the bottle. Home and hybrid working are popular with employees, especially millennials and Gen Z. As of 2022, millennials were the largest generation in the workforce [26]; their needs matter.
The train is only moving in one direction – no amount of executive nostalgia is going to get it to turn back. It seems a hybrid model is the future, and a healthy enough compromise.
References
[1] https://www.economist.com/special-report/2021/04/08/the-rise-of-working-from-home
[2] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[3] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[4] https://www.theguardian.com/commentisfree/2023/feb/14/working-from-home-revolution-hybrid-working-inequalities
[5] https://wiserd.ac.uk/publication/homeworking-in-the-uk-before-and-during-the-2020-lockdown/
[6] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[7] https://hbr.org/2023/07/tension-is-rising-around-remote-work
[8] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[9] https://www.ons.gov.uk/employmentandlabourmarket/peopleinwork/employmentandemployeetypes/articles/businessandindividualattitudestowardsthefutureofhomeworkinguk/apriltomay2021
[10] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[11] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[12] https://owllabs.com/state-of-remote-work/2021/
[13] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[14] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[15] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[16] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[17] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[18] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement
[19] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement.
[20] https://siepr.stanford.edu/publications/policy-brief/hybrid-future-work#:~:text=Hybrid%20is%20the%20future%20of%20work%20Key%20Takeaways,implications%20of%20how%20and%20when%20employees%20work%20remotely.
[21] https://mycreditsummit.com/work-from-home-statistics/
[22] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[23] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[24] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[25] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[26] https://www.forbes.com/sites/theyec/2023/01/10/whats-the-future-of-remote-work-in-2023/

Introduction
Originally published in 2013, Ichiro Kishimi and Fumitake Koga’s The Courage to be Disliked quickly became a sensation in its authors’ native Japan. Its English-language translation followed suit, with more than 3.5 million copies sold worldwide.
The book is often shelved in the ‘self-help’ category, in large part due to its blandly overpromising subtitle: How to free yourself, change your life and achieve real happiness. In truth, it would be better suited to the philosophy or psychology section. The book takes the form of a discussion between a philosopher and an angsty student. The student is unhappy with his life, and often with the philosopher himself, while the philosopher is a contented devotee of Adlerian psychology, the key points of which he disseminates to the student over the course of five neatly chunked conversations. His proposed principles offer sound advice for life in general but also prove useful when integrated into a business setting.
Adlerian Psychology
Alfred Adler was an Austrian-born psychotherapist and one of the leading psychological minds of the 20th century. He began as a contemporary and collaborator of Freud’s, but the two soon drifted apart. In many ways Adler’s theories can be defined in opposition to his old contemporary; they are anti-Freudian at their core. Freud is a firm believer that our early experiences shape us. Adler is of the view that such sentiments strip us of autonomy in the here and now, seeing Freud’s ideas as a form of determinism. He instead proffers:
No experience is in itself a cause of our success or failure. We do not suffer from the shock of our experiences – the so-called trauma – but instead, we make out of them whatever suits our purposes. We are not determined by our experiences, but the meaning we give them is self-determining.
Essentially, then, the theories are reversed. Adler posits that rather than acting a certain way in the present because of something that happened in their past, people do what they do now because they chose to, and then use their past circumstances to justify the behaviour. Where Freud would make the case that a recluse doesn’t leave the house because of some traumatic childhood event, for example, Adler would argue that instead the recluse has made a decision to not leave the house (or even made it his goal not to do so) and is creating fear and anxiety in order to stay inside.
The argument comes down to aetiology vs teleology – more plainly, assessing something’s cause versus assessing its purpose. Using Adlerian theory, the philosopher in the book tells the student: “At some stage in your life you chose to be unhappy, it’s not because you were born into unhappy circumstances or ended up in an unhappy situation, it’s that you judged the state of being unhappy to be good for you”. He adds, in line with what David Foster Wallace referred to as the narcissism of self-loathing: “As long as one continues to use one’s misfortune to one’s advantage in order to be ‘special’, one will always need that misfortune.”
Adler in the workplace: teleology vs aetiology
An example of the difference between these theories in the workplace can be found in the sentence: “I cannot work to a high standard at this company because my boss isn’t supportive.” The viewpoint follows the cause-and-effect Freudian notion: your boss is not supportive, therefore you cannot work well. What Adler, and in turn Kishimi and Koga, argue is that you still have a choice to make. You can work well without the support of your boss but are choosing to use their lack of support as an excuse to work poorly (which, subconsciously, was your aim all along).
This is the most controversial of Adler’s theories for a reason. Readers will no doubt look at the sentence and feel a prescription of blame being attributed to them. Anyone who has worked with a slovenly or uncaring boss might feel attacked and argue that their manager’s attitude most certainly did affect the quality of their work. But it’s worth embracing Adler’s view, even if just to disagree with it. Did you work as hard as you could and as well as you could under the circumstances? Or did knowing your boss was poor give you an excuse to grow slovenly too? Did it make you disinclined to give your best?
Another example in the book revolves around a young friend of the philosopher who dreams of becoming a novelist but never completes his work, citing that he’s too busy. The theory the philosopher offers is that the young writer wants to leave open the possibility that he could have been a novelist if he’d tried but he doesn’t want to face the reality that he might produce an inferior piece of writing and face rejection. Far easier to live in the realm of what could have been. He will continue making excuses until he dies because he does not want to allow for the possibility of failure that reality necessitates.
There are many people who don’t pursue careers along similar lines, staunch in the conviction that they could have thrived if only the opportunity had arisen without ever actively seeking that opportunity themselves. Even within a role it’s possible to shrug off this responsibility, saying that you’d have been better off working in X role in your company if only they had given you a shot, or that you’d be better off in a client-facing position rather than being sat behind a desk doing admin if only someone had spotted your skill sets and made use of them. But without asking for these things, without actively taking steps towards them, who does the responsibility lie with? It’s a hard truth, but a useful one to acknowledge.
Adler in the workplace: All problems are interpersonal relationship problems
Another of the key arguments in the book is that all problems are interpersonal relationship problems. What that means is that our every interaction is defined by the perception we have of ourselves versus the perception we have of whomever we are dealing with. Adler is the man who coined the term “inferiority complex”, and that factors into his thinking here. He spoke of two categories of inferiorities: objective and subjective. Objective inferiorities are things like being shorter than another person or having less money. Subjective inferiorities are those we create in our mind, and make up the vast majority. The good news is that “subjective interpretations can be altered as much as one likes…we are inhabitants of a subjective world.”
Adler is of the opinion that: “A healthy feeling of inferiority is not something that comes from comparing oneself to others; it comes from one’s comparison with one’s ideal self.” He speaks of the need to move from vertical relationships to horizontal ones. Vertical relationships are based in hierarchy. If you define your relationships vertically, you are constantly manoeuvring between interactions with those you deem above you and those you deem below you. When interacting with someone you deem above you on the hierarchical scale, you will automatically adjust your goalposts to be in line with their perceptions rather than defining success or failure on your own terms. As long as you are playing in their lane, you will always fall short. “When one is trying to be oneself, competition will inevitably get in the way.”
Of course, in the workplace we do have hierarchical relationships. There are managers, mid-range workers, junior workers and so on. The point is not to throw away these titles in pursuit of some newly communistic office environment. Rather, it’s about attitude. If you are a boss, do you receive your underlings’ ideas as if they are your equal? Are you open to them? Or do you presume that your status as “above” automatically means anything they offer is “below”? Similarly, if you are not the boss, are you trying to come up with the best ideas you can, or the ones you think will be most in line with your boss’s pre-existing convictions? Obviously there’s a balance here. If you solely put forward wacky, irrelevant ideas that aren’t in line with your company’s ethos and have no chance of success, that’s probably not helpful. But within whatever tramlines your industry allows, you can certainly get creative and trust your own taste rather than seeking to replicate someone else’s.
Pivotal to this is whether you are willing to be disagreed with and to disagree with others or are more interested in pleasing everyone, with no convictions of your own. This is where the book’s title stems from. As it notes, being disliked by someone “is proof that you are exercising your freedom and living in freedom, and a sign that you are living in accordance with your own principles…when you have gained that courage, your interpersonal relationships will all at once change into things of lightness.”
Adler in the workplace: The separation of tasks
The separation of tasks is pivotal to Adlerian theory and interpersonal relationships. It is how Adler, Kishimi and Koga suggest one avoids falling into the trap of defining oneself by another’s expectations. The question one must ask oneself at all times, they suggest, is: whose task is this? We must focus solely on our own tasks, not letting anyone else alter them and not trying to alter anyone else’s. This is true both for literal tasks – a piece of work, for example – and for more abstract ideas. For example, how you dress is your task. What someone else thinks of how you dress is theirs. Do not make concessions to their notions (or your perceptions of what their notions might be) and do not be affected by what they think, for it is not your task and therefore not yours to control.
This idea that we allow others to get on with their own tasks is crucial to Adler’s belief in how we can live rounded, fulfilling lives. The philosopher argues that the basis of our interpersonal relationships – and as such our own happiness – is confidence. When the student asks how the philosopher defines the “confidence” of which he speaks, he answers:
It is doing without any set conditions whatsoever when believing in others. Even if one does not have sufficient objective grounds for trusting someone, one believes. One believes unconditionally without concerning oneself with such things as security. That is confidence.
This confidence is vital because the book’s ultimate theory is that community lies at the centre of everything. The awareness that “I am of use to someone” allows one to act with confidence in one’s own life, to have confidence in others, and not to be reliant on the praise of others. The reverse is true too. As Kishimi and Koga state, “A person who is obsessed with the desire for recognition does not have any community feeling yet, and has not managed to engage in self-acceptance, confidence in others, or contribution to others.” Once one possesses these things, the need for external recognition will naturally diminish.
For high-level employees, then, it’s important to set a tone in the workplace that allows colleagues to feel that they are of use. But as the book dictates, do not do this through false praise – all that will do is foster a further need for recognition (“Being praised essentially means that one is receiving judgement from another person as ‘good.’”). Instead, foster this atmosphere by trusting colleagues and showing confidence in them.
The courage to be disliked
The Courage to be Disliked is at odds with many of the accepted wisdoms of the day. The modern cultural milieu suggests that we should at all times be accepting and validating of others’ trauma as well as our own. Many may find solace in that approach and find that it suits them best. But there is no one-size-fits-all solution when it comes to fostering a successful workplace, and even less so when it comes to leading a fulfilling life. For anyone who feels confined by the idea that some past event, or some subjective inferiority harboured too long, sets parameters around what they can achieve, it may be worth examining those interpersonal relationships and finding the courage to be disliked – and, in doing so, seeking a community you’re willing to support as much as it supports you. There is no need to be shackled to whatever mythos you’ve internally created.
As the book states: “Your life is not something that someone gives you, but something you choose yourself, and you are the one who decides how you live…No matter what has occurred in your life up to this point, it should have no bearing at all on how you live from now on.”
References
Kishimi, Ichiro & Koga, Fumitake. The Courage to Be Disliked: How to Free Yourself, Change Your Life and Achieve Real Happiness. Bolinda Publishing Pty Ltd, 2013.

Introduction
Consider a simple yet profound question: What does your work mean to you? Is it merely a task to be completed, or does it resonate with a deeper purpose in your life?
Viktor Frankl, a prominent Austrian psychiatrist and philosopher, grappled with these very questions, evolving them into a broader exploration of life’s meaning. Drawing from his harrowing experiences in Nazi concentration camps, he developed logotherapy—a form of psychotherapy that centres around the search for meaning and purpose. Through logotherapy, Frankl illuminated the idea that life’s essence can be found not just in joyous moments but also in love, work, and our attitude towards inevitable suffering. This pioneering approach underscores personal responsibility and has offered countless individuals a renewed perspective on fulfilment, even in the face of daunting challenges.
In this piece, we delve into the intricacies of Frankl’s teachings, exploring the symbiotic relationship he identified between work and our quest for meaning.
A Holistic Approach to Life and Work
In his seminal work, ‘Man’s Search for Meaning,’ Viktor Frankl delved deeply into the multifaceted nature of human existence. He eloquently described the myriad pathways through which individuals uncover meaning. For Frankl, while work or ‘doing’ is undoubtedly a significant avenue for deriving meaning, it isn’t the only one. He emphasised the value of love, relationships, and our responses to inevitable suffering. Through this lens, he offered a panoramic view of life, advocating for a holistic perspective where meaning is not strictly tethered to our work but is intricately woven through all our experiences and interactions.
Progressing in his exploration, Frankl sounded a note of caution about the perils of letting work become an all-consuming end in itself. He drew attention to the risks of burnout and existential exhaustion when one’s sense of purpose is confined solely to one’s occupation or the relentless chase for wealth. To Frankl, an overemphasis on materialistic achievements could inadvertently lead individuals into what he termed an ‘existential vacuum’ – a state where life seems starkly devoid of purpose. He argued that in our quest for success, we must continually seek a deeper, more intrinsic purpose. Otherwise, we risk being blinded to life’s profound significance and richness beyond material gains.
Delving deeper into the realm of employment, Frankl confronted the psychological and existential challenges of unemployment. He noted that without the inherent structure and purpose provided by work, many individuals grapple with a profound sense of meaninglessness. This emotional and existential void often manifests in a diminishing sense of significance towards time, leading to dwindling motivation to engage wholeheartedly with the world. The ‘existential vacuum’ emerges again, casting its shadow and enveloping individuals in feelings of purposelessness.
Yet, Frankl’s observations were not merely confined to the challenges. He beautifully illuminated the resilience and fortitude of certain individuals, even in the face of unemployment. He showcased how, instead of linking paid work directly with purpose, some found profound meaning in alternative avenues such as volunteer work, creative arts, education, and community participation.
Frankl firmly believed that the essence of life’s meaning often lies outside the traditional realms of employment. To drive home this perspective, he recounted poignant stories, such as that of a desolate young man who unearthed profound purpose and reaffirmed his belief in his intrinsic value by preventing a distressed girl from taking her life. Such acts, as illustrated by Frankl, highlight the boundless potential for a meaningful existence, often discovered in genuine moments of human connection.
Work as an Avenue for Meaning and Identity
Viktor Frankl’s discourse on work transcended the common notions of duty and obligation. For him, work was more than a mere means to an end; it was a potent avenue to unearth meaning and articulate one’s identity. Frankl posited that when individuals align their work with their intrinsic identity—encompassing all its nuances and dimensions—they move beyond merely working to make a living. Instead, they find themselves working with a purpose.
This profound idea stems from his unwavering belief that our work provides us with a unique opportunity. Through it, we can harness our individual strengths and talents, channelling them to create a meaningful and lasting impact on the world around us.
In line with modern philosophical thought, which views work as a primary canvas for self-expression and self-realisation, Frankl also recognised its significance. He believed that work could serve as a pure channel, finely tuned to our unique skills, passions, and aspirations. This deep sense of accomplishment and fulfilment from one’s chosen profession, he asserted, is invaluable. However, Frankl also emphasised the importance of seeing the broader picture. While careers undeniably play a significant role in our lives, they are but a single facet in our ongoing quest for meaning.
Frankl reminds us that while our careers are integral to our lives, the quest for meaning isn’t imprisoned within their boundaries. He believed the core of true meaning emerges from our deep relationships, our natural capacity for empathy, and our virtues. These treasures of life, he asserted, can be manifested both within the confines of our workplace and beyond.
The True Measure of Meaning Through Work
For Viktor Frankl, our professional lives brim with potential for fulfilment. Yet, fulfilment wasn’t solely defined by accolades. Instead, it was about aligning our work with our deepest values and desires. It wasn’t just the milestones that mattered but how they resonated with our core beliefs.
Frankl’s logotherapy reshapes our perception of work, emphasising that even mundane tasks can hold significance when approached with intent. With the right mindset, every job becomes a step in our journey for meaning.
In Frankl’s writings, he weaves together tales of profound significance—a young man’s transformative act of kindness, a narrative not strictly tethered to work’s traditional realm. Yet, these stories anchor a timeless truth: In every endeavour, whether grand or humble, lies the potential for unparalleled meaning. Here, work isn’t just about designated roles—it becomes an evocative stage where profound moments play out. Beyond job titles and tasks, the depth, sincerity, and fervour we infuse into each act truly capture the essence of meaningful work.
Finding Fulfilment in Every Facet
Viktor Frankl’s profound insights into the human pursuit of meaning provide a distinctive lens through which we can evaluate both our daily tasks and life’s most pivotal moments. Through his exploration—whether addressing the ordinariness of daily life or the extremities of crisis—Frankl illuminated the profound interconnectedness of work and personal identity. He posited that our professions, while significant, are fragments of a vast tapestry that constitute human existence.
Navigating the journey of life requires continual adjustments to our perceptions of success and meaning. While our careers and professional achievements are significant, true fulfilment goes beyond these confines. It’s woven into our human experiences, the bonds we nurture, the challenges we face, and the joys we hold dear.
Frankl’s pioneering work in logotherapy urges us to approach life with intention and purpose. He beckons us to see the value in every moment, task, and human connection. As we delve into our careers and strive for success, aligning not just with outward accomplishments but with the very essence of who we are is vital.

Introduction
Humans have always been fascinated by the future. Prior to the era of computers and data, we sought insights from the stars, dreams, and even animal behaviour. The tale of the Delphic Oracle is etched in this tapestry of human curiosity. A simple goat herder named Coretas reportedly stumbled upon a fissure in the earth, releasing ethereal vapours. Drawn by these mysterious emissions, he perceived glimpses of the future. This mystical spot soon became legendary. Word spread and people from distant lands journeyed there, drawn by the allure of prophecy. They came eager to hear the visions of the future, as interpreted by the chosen Pythia, a maiden who acted as the mouthpiece of Apollo. From mystical vapours to celestial patterns, humanity’s thirst for understanding tomorrow has perpetually pushed us to evolve our tools and methods, seeking ever-more sophisticated ways to peer into the future.
Throughout history, cultures around the globe have relied on a myriad of tools for forecasting the future. The Mayans, for instance, constructed elaborate calendars, meticulously tracking celestial bodies. Chinese sages consulted the I Ching, a revered text blending both philosophy and prediction. During the Middle Ages, figures like Nostradamus peered at the cosmos, firmly believing that the stars unveiled the secrets of events yet to unfold. Meanwhile, in their endless pursuit of the Philosopher’s Stone, alchemists hoped that their transformative experiments might also provide windows into future events. As the sands of time flowed, the rigours of science began to play an increasingly pivotal role in this age-old quest. Meteorologists harnessed accumulated data to forecast weather patterns, while demographers, attuned to shifts in population dynamics, used their insights to anticipate future demographic shifts.
Predictive analytics
Now, in this age, we’re navigating a golden era of prediction. Computers, hailed as our contemporary oracles, dive into vast data lakes — with less smoke and more code. With the aid of intricate algorithms and machine learning, they furnish insights about potential future events. Though technologically advanced, these modern tools have a mystique reminiscent of ancient methods. Indeed, their exceptional abilities often blur the lines between the arcane and the technological.
Even though the settings have changed—with glass skyscrapers replacing ancient temples—our innate desire to predict the future remains unwavering. We’ve shifted from seeking guidance from oracles to heeding the insights of modern-day experts: economists, scientists, and statisticians. The unpredictable nuances of geopolitics and the intricate web of global economies underscore the challenges of forecasting. Despite our technological advances, no tool or expert can perfectly predict outcomes, as emphasised by the renowned financier Peter Lynch: “You never can predict the economy. You can’t predict the stock market.”
It’s against this backdrop of prediction challenges that Philip Tetlock’s work shines prominently. Over decades, Tetlock undertook the meticulous task of analysing millions of predictions, unravelling the intricacies of human foresight. He identified the ‘superforecasters’, a rare group that consistently demonstrated superior predictive abilities.
Superforecasters
Superforecasters stand apart from their peers, not simply through the accuracy of their predictions, but through their unique way of understanding and working with probabilities. Instead of confining themselves to somewhat nebulous terms like ‘likely’ or ‘certain’, they delve into a world of precision, where small differences matter. They employ an almost artistic attention to detail, carving out distinctions in probability estimates that most would overlook.
What’s noteworthy isn’t simply that they can perceive a difference between a 56% and a 57% probability, but the mindset this precision reflects. It speaks to a meticulousness and diligence that’s often lacking in forecasting. This ability to finely calibrate their predictions sets them apart, transforming forecasting from a vague art into a refined science.
However, this is but one facet of their skills. Superforecasters also excel at dynamically updating their forecasts as new information comes to light, demonstrating humility in acknowledging and learning from their errors, and cultivating a probabilistic thinking mindset. Taken together, these skills contribute to their exceptional track record in the challenging realm of prediction.
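Tetlock’s forecasting tournaments scored this kind of calibration with the Brier score: the mean squared difference between a probability forecast and the binary outcome that actually occurred. As a rough illustration only (the forecasts and outcomes below are invented for the example, not drawn from Tetlock’s data), here is a minimal sketch in Python:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary outcomes.
    Lower is better; a perfect forecaster scores 0.0."""
    if len(forecasts) != len(outcomes):
        raise ValueError("forecasts and outcomes must be the same length")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A precise forecaster versus a perpetual hedger, scored on three events
precise = [0.9, 0.2, 0.75]     # committed, well-calibrated probabilities
hedger = [0.5, 0.5, 0.5]       # "could go either way" on everything
outcomes = [1, 0, 1]           # what actually happened

print(brier_score(precise, outcomes))   # ≈ 0.0375
print(brier_score(hedger, outcomes))    # 0.25
```

The hedger who says “50/50” to everything scores 0.25 no matter what happens; the precise forecaster is rewarded for committing to well-calibrated probabilities, which is exactly the behaviour that distinguishes superforecasters.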
In the aftermath of the Iraq war, where intelligence missteps around weapons of mass destruction became evident, the US intelligence community sought Tetlock’s expertise. His findings were detailed in his book, “Superforecasting: The Art and Science of Prediction”, serving as an invaluable guide for anyone looking to refine their forecasting skills.
Beyond the realm of politics and global affairs, the implications of Tetlock’s research are profound. His techniques offer practical applications in diverse arenas, from deciphering economic trends to pivotal personal decisions, such as evaluating career trajectories or the potential of a business venture.
Ethical & economical challenges
Yet, while Tetlock’s findings are ground-breaking, they’re not infallible. Even the best predictions are fraught with uncertainties. As we harness these insights, it’s vital to maintain a balanced approach, merging strong convictions with a healthy dose of caution.
By integrating Tetlock’s teachings, we can achieve a heightened awareness of our cognitive biases, enabling more informed decisions and, potentially, a brighter future.
While predictive tools offer remarkable insights, their overreliance introduces both ethical and economic challenges. Ethically, leaning too heavily on predictions can erode our adaptability and critical thinking, luring us into a false sense of security. Economically, this complacency can result in missed opportunities or misguided strategies. Just as the roll of a dice is inherently unpredictable, so too are complex systems like economies. They’re influenced by countless variables, making them vulnerable to unexpected twists and turns. Predictions, while valuable, are best used as guiding lights and not as absolute certainties. After all, at their core, they’re imbued with an intrinsic element of unpredictability.
In the realm of forecasting, we find that with great predictive power comes great responsibility — and the inevitable debate over who truly holds the crystal ball. The craft, while teeming with potential, is not without its boundaries and ethical dilemmas. Foretelling the future transcends the realms of science and art; it’s a weighty task that beckons us to navigate with zeal and caution. Here lies our unparalleled chance to influence humanity’s trajectory, yet we must remember to gracefully balance our acquired wisdom against the vast, ever-present unknowns.
Conclusion
As we conclude, we’re reminded of the timeless rhythm of humanity’s quest: from the ethereal mists of the Delphic Oracle to the digital pulses of algorithms. This cyclical endeavour to decipher tomorrow underscores our unyielding curiosity, a reflection of our innate need to foresee, understand, prepare, and connect with the uncertain embrace of the future.

Introduction
Navigating life’s unpredictability often resembles the exhilarating world of alpine skiing. Mikaela Shiffrin, a superstar of the sport, imparts insights into a high-performance mindset, saying,
“My biggest challenge was to keep my expectations low but my standards high, pushing my skiing, doing my best with my turns, having good tactics and being aggressive but not to expect that I would win the race because anything can happen.”
While taken from the realm of competitive skiing, this guiding principle resonates profoundly beyond sports, offering the transformative potential to shape our personal and professional lives. It emphasises maintaining high quality and performance standards while tempering expectations around future outcomes. So, how can we cultivate this mindset, and what benefits can it give?
Standards vs expectations
Fundamentally, standards are often seen as the internal benchmarks or criteria we set for ourselves, encompassing our definitions of quality, competence, or excellence. They are self-generated and typically align with our values, aspirations, and sense of identity. On the other hand, expectations represent our forecasts or assumptions about future events or outcomes. While our personal beliefs and experiences shape them, they are also susceptible to external influences such as societal norms, peer input, or past results. These predictions can significantly influence our emotional responses and subsequent actions, for better or worse.
Insights from the leadership and strategy expert, Sydney Finkelstein, align well with Shiffrin’s principle. Finkelstein highlights,
“Some of the most successful people and organisations in the world are those that embrace surprise. Embracing, rather than fearing, the unexpected is key to getting ahead and being smarter and more adaptable.”
This mindset promotes the power of adaptability, urging us to expect the unexpected and welcome it with open arms. Finkelstein’s emphasis on embracing surprise complements Shiffrin’s philosophy and brings a new dynamic to it – teaching us that the keys to success lie in our ability to pivot, adapt and thrive amidst life’s most surprising turns.
Maintaining excellence and expectations
We should strive for excellence in our pursuits, whether it’s producing top-quality work or meeting project timelines. However, it’s crucial to remain aware that external factors like market fluctuations, organisational shifts, or managerial decisions could impact our anticipated outcomes.
Applying this perspective across various facets of our professional life can yield significant benefits. The following strategies amalgamate Shiffrin’s principle and Finkelstein’s insights:
- Foster a Growth Mindset: Shift the focus from the final outcome to the effort and process. Emphasise the value of consistent effort and dedication rather than setting unattainable, vague targets. This mindset can be reinforced by celebrating the consistent efforts and hard work involved in achieving professional milestones.
- Encourage Personal Bests: Remind everyone that success isn’t always about outperforming others but about personal growth, continuous learning, and achieving personal bests, irrespective of external markers of success.
- Allow Space for Mistakes: Encourage learning from failures. This approach cultivates resilience and adaptability, essential traits in any professional setting.
- Offer Continuous Support: Extend support during the process, not merely after achieving the outcome. This can involve listening empathetically, providing constructive feedback, or offering resources for professional development.
Striking a balance
Among these strategies, it’s vital to remember that balance is key, particularly when it comes to praise and reassurance. Excessive or unfounded praise can unintentionally communicate low expectations, undermining the motivational power of genuine appreciation and constructive feedback. It’s a delicate act of maintaining high standards and keeping expectations in check — a true testament to the wisdom of Shiffrin and Finkelstein in our professional pursuits.
Shiffrin’s approach to maintaining high standards while tempering expectations, coupled with Finkelstein’s emphasis on embracing surprise and adaptability, provides a robust framework to navigate the complex landscape of the professional world. This balanced methodology promotes growth, resilience, and adaptability amidst life’s unpredictable twists and turns, transforming us from passive observers to active, resilient participants in life’s dynamic game.
Exercise and positive expectations
The integration of this philosophy extends beyond professional life into our approach to exercise and overall well-being. A study by Hendrik Mothes and colleagues at the University of Freiburg highlights that individuals’ expectations and beliefs significantly influence the psychological and neurophysiological benefits arising from a single exercise session. Participants holding positive expectations about exercise’s benefits consistently reported greater psychological benefits, including increased enjoyment, mood enhancement, anxiety reduction, and a rise in alpha-2 brain waves, indicating relaxation.
Summary
Such findings underscore the profound impact our mindset, expectations, and internal narratives can have on our health journeys. In high-pressure environments—whether they’re sporting arenas or corporate boardrooms—the pressure to meet personal and external expectations can be overwhelming. Ambition can motivate and drive progress, but continuous high-pressure situations can lead to mental health issues like anxiety, stress, and depression.
Organisations must balance their success drive with care for their employees’ mental well-being to foster healthier and more productive environments. Initiatives like emotional well-being programmes provide structures to support employees’ mental health, offering varying levels of care and engagement tailored to individuals’ needs.
By embracing a mindset that unifies an understanding of mental health with Shiffrin’s high-standards-low-expectations approach, we can embark on a holistic path towards better physical and psychological well-being. This integrated approach can significantly enhance our quality of life and performance across multiple life spheres.
More on positivity
- We still need to find out how and why optimism influences these outcomes, but the empirical results are clear-cut. (Read more)
- “Regarding and building upon the last point, mindfulness is deriving positives from what is happening now or not allowing the negatives to alter the future detrimentally.” (Read more)

Introduction
In 1930, John Maynard Keynes, the man regarded as the founder of macroeconomics, from whom Keynesian economics takes its name, predicted that in one hundred years’ time the average human workweek would clock in at fifteen hours [1]. We’re still seven years away from that hundred-year milestone, but barring a remarkable turnaround it seems Keynes’ prediction will be proved drastically wrong.
Not only are people generally working between 35 and 50 hours a week – depending on country, role etc. – but many are engaged in the uniquely 21st century phenomenon of the side-hustle. According to research for the Trades Union Congress, one in seven workers in Britain now partake in gig-economy jobs like Uber or Amazon delivery at least once a week, many of them on top of full-time employment [2].
Meanwhile, digital tools have made it possible to work from pretty much anywhere, at pretty much any time. This was supposed to usher in a new age of liberation: the worker, no longer constrained by their office environment or nine-to-five schedule, would be free to live the life they always wanted. In reality, it has just meant that the expectation of swift email correspondence has extended its reach into evenings, weekends and even holidays. That Edenic notion of free time signed off its suicide note with a customary “sent from my iPhone” footer.
The sense of never-ending malaise that occupies the modern employee is perhaps best captured by the TV show Severance. Centred on a fictional procedure that severs the work self from the free-time self, the show darkly and comically skewers the torturous undertakings the zombified worker self is made to endure by the malevolent corporation that employs him in this inescapable labour prison; the ramifications naturally spill out from their office containment to bruise each self equally. It’s not hard to see why viewers are able to relate.
Keynes’ prediction was based on the myriad changes wrought upon 20th-century work culture by technological innovations and societal adjustments in the wake of the industrial revolution. In Keynes’ lifetime, the average workday dropped from fourteen hours to eight [3]. Understanding that greater advancements were yet to come, Keynes posited that the trend would continue.
He was right that further innovations in tech would make working practices substantially easier, with everything from printers to Excel to Zoom obvious examples. But while those advancements reduced the amount of time it takes workers to complete everyday tasks, that simply meant workers were now expected to undertake more tasks within their allotted nine-to-five (or often longer) shifts.
Keynes’ great contemporary, the philosopher Bertrand Russell, diagnosed many of the issues with modern work culture in his 1932 essay “In Praise of Idleness”. Russell wrote, “A great deal of harm is being done in the modern world by belief in the virtuousness of work, and that the road to happiness and prosperity lies in an organized diminution of work” [4].
Perhaps more prudently, with an evergreen tinge, he wrote:
Modern methods of production have given us the possibility of ease and security for all; we have chosen instead to have overwork for some and starvation for others. Hitherto we have continued to be as energetic as we were before there were machines. [5]
With Keynes’ foreseen fifteen-hour week out the window, then, how much should we work, really? Provided we have tasks to fulfil, a sense of pride in our roles that dictates our output should be of a certain quality, and a life outside of work from which we hope to derive pleasure and meaning, what is the optimum time we should give to our professional endeavours? The answer is dependent on our role, abilities, temperaments and life circumstances, of course. But there are those advocating specific solutions, and it’s worth assessing the merits of each.
What a way to make a living
The nine-to-five is very much the status quo when it comes to our working schedules. The phrase has become shorthand for work itself: nine-to-five equals work, even as many employers drag the last of those numbers up and up and up.
The nine-to-five got its start in 1926 under Henry Ford at his namesake Ford Motor Company [6]. At the time, it represented a reduction in working hours and was celebrated for obvious reasons. Ford workers manned the assembly line, and by putting them on eight-hour shifts the company could cover the 24-hour day in three shifts without placing undue demands on staff. Once Ford set the ball rolling and the new schedule proved successful, the system was adopted in many countries around the world and persisted almost unquestioned (in any meaningful sense) until the pandemic in 2020.
Covid disrupted a litany of accepted notions regarding working practices. Once the flexibility of home working was made commonplace (and even governmentally mandated), it was only a matter of time before workers started to question why they couldn’t add a little flexibility to their hours too.
The nine-to-five has some obvious flaws. In 1926, the expectation was that the man of the house would work while his wife stayed home and dealt with domestic and child-rearing duties. Things have obviously progressed since then: nowadays, most households contain two working adults. Juggling parental obligations around an in-office nine-to-five is extremely difficult and often means sacrificing either valuable time with one’s child or professional progress.
The most damning argument against the nine-to-five is that studies show it to be inefficient. A 2016 survey of 1,989 UK office workers found that over the course of an eight-hour workday, the average employee works for two hours and 53 minutes [7]. The rest of the time is spent reading the news, browsing social media, eating, socialising, taking cigarette breaks, and searching for new jobs. Essentially, people are dragging out their tasks to fill the time, and are less fulfilled, less productive, less happy and less healthy for it.
In response to the limitations associated with the traditional nine-to-five five-day week, variations on the formula are becoming increasingly prevalent, as well as increasingly in-demand.
The four-day week
Four-day work weeks are becoming more common. Advocates claim that by providing employees with an extra day of rest, the four-day work week reduces employee anxiety and stress while facilitating better sleep and more time to exercise. Those benefits then pay dividends when it comes to the quality of employee output and increased productivity.
The biggest recent study on the subject was a report by the advocacy groups 4 Day Week Global and 4 Day Week Campaign, with the assistance of researchers from Boston College and the University of Cambridge. The report’s findings show that roughly 40% of respondents said they experienced less work-related stress, and 71% reported lower levels of burnout. More than 40% said their mental health had improved, with significant numbers of employees reporting decreases in anxiety and negative emotions [8].
Nearly half of workers involved said they weren’t as tired as they were before the experiment, and 40% said it was easier to get to sleep. In the end, 96% of employees said they preferred four-day schedules. At the same time, company revenue increased by an average of roughly 1% over the six month period, while employee turnover and absenteeism went down. Almost all of the businesses in the program said they planned to continue with a four-day work week once the experiment was over [9].
The data is striking, and backed up in other studies. In 2019, Microsoft Japan introduced a four day working week and reported a 40% boost in productivity [10]. In Sweden, a two-year government study conducted from 2015-17 on retirement-home workers in Gothenburg found that at the end of the study people were happier, less stressed, and enjoyed work more [11].
Another added benefit of the four-day week is environmental. A study by the University of Massachusetts Amherst found that a 10% reduction in working hours cut an individual’s carbon footprint by 8.6% [12]. Minimising the amount of days workers are commuting can have a drastic environmental impact, and should be a further consideration for those thinking of moving away from the five-day nine-to-five.
The 5-hour workday
Some argue that rather than removing a whole day from the week, it is more efficient to reduce the number of hours worked a day.
Alex Pang, founder of Silicon Valley consultancy Strategy and Rest, visiting scholar at Stanford University, and the author of Rest and The Distraction Addiction, notes that “research indicates that five hours is about the maximum that most of us can concentrate hard on something” [13].
The notion of the five-hour workday gained prominence through Tower Paddle Boards, an online, direct-to-consumer company that sells stand-up paddleboards. In 2015, the company’s CEO Stephan Aarstol offered his employees a deal: if they figured out how to do the same work in less time, they could keep the same salary and leave at 1pm. He also implemented a 5% profit sharing plan, increasing hourly pay [14]. On the day the company announced the change on its website, it broke its previous daily sales record, booking $50,000 in sales for the first time. By the end of the month, it had sold $1.4m worth of paddleboards, breaking its previous monthly sales record by $600,000.
Inspired by what he saw, David Rhoads, CEO of Blue Street Capital, a California-based company that arranges financing for enterprise IT systems, decided to try this new work strategy out for himself. Three months after starting Blue Street Capital’s five-hour workday trial, David found that while they had cut the length of the workweek by three-eighths, the number of calls his employees made per person had doubled. David made the five-hour workday a permanent feature after three months. Three years in, revenues had gone up every year – 30% the first year, 30% the second – while the company grew from nine to seventeen employees [15].
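To see what those Blue Street Capital figures imply for hourly output, the arithmetic can be sketched quickly (assuming a 40-hour baseline week, which the source does not state):

```python
# Illustrative arithmetic only: the 40-hour baseline is an assumption,
# not a figure given in the article.
baseline_hours = 40
new_hours = baseline_hours * (1 - 3 / 8)   # workweek cut by three-eighths
calls_multiplier = 2                       # calls per person doubled

# Calls per hour relative to the baseline
per_hour_gain = calls_multiplier / (new_hours / baseline_hours)

print(new_hours)       # 25.0 hours per week
print(per_hour_gain)   # 3.2
```

In other words, if total calls doubled while hours fell by three-eighths, calls per hour more than tripled — the kind of per-hour productivity gain that five-hour-day advocates point to.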
The five-hour workday, like all approaches, has its flaws. Research shows that people’s creativity fades after five hours of concentration – but not all jobs are creative. Taking the original Ford model as an example, assembly line workers have no reason (efficiency-wise) to shorten their workdays. The same is true for those in administrative roles, those in call centres, and all sorts of other professions.
Jan-Emmanuel de Neve, associate professor of economics and strategy at the University of Oxford’s Saïd Business School, is an advocate of the five-hour workday. He says his research reinforces the argument that five-hour working days lead to greater employee wellbeing, which in turn leads to greater productivity. But he also warns that working in these more limited bursts can actually result in greater employee stress [16].
Associate professor in strategic human resource management at the University of Reading’s Henley Business School Rita Fontinha agrees, saying: “While a shorter work day could result in better time management and promote concentration, individuals may feel an added pressure to complete tasks on time” [17].
The death of leisure
In his aforementioned 1932 essay, Russell observed that, “The idea that the poor should have leisure has always been shocking to the rich” [18]. But in 21st-century society, we seem to have gone one further: it has become far-fetched that anyone at all might have leisure. Free time has been annexed by 24/7 work schedules and commercialised by social media sites, so that even the most lackadaisical of weekend pursuits are increasingly undertaken “for the gram” rather than for the inherent joy of the activity. The self-improvement zeitgeist has similarly snatched away any pastimes that could potentially be filed under ‘trivial’. As Wessie du Toit notes in the New Statesman:
Meditation and exercise look suspiciously like personal optimisation. Artistic vocations centre on tireless self-promotion to a virtual audience. A movement of “homesteaders” churning their own butter and knitting their own jumpers are simply cosplaying older forms of work, and probably posting the results on Instagram. [19]
What to do
In a society that places a premium on work and prizes workaholics, Russell’s praise for idleness feels more needed, and yet further away, than ever. Trends like the Great Resignation and “quiet quitting” demonstrate that worker dissatisfaction is starting to permeate the workforce at large. Shifts to a four-day work week or five-hour workday could be solutions, granting employees autonomy and opportunity for rest at little to no cost to business – potentially even improving productivity and profits.
But given it took a global pandemic to even vaguely move the world away from Henry Ford’s modus operandi first adopted some 97 years ago, it would be optimistic to think such large-scale changes are on their way any time soon.
References
[2] https://www.newstatesman.com/culture/2023/05/work-four-hours-a-day
[3] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[4] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[5] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[6] https://www.wired.co.uk/article/working-day-time-five-hours
[7] https://www.businessinsider.com/8-hour-workday-may-be-5-hours-too-long-research-suggests-2017-9
[8] https://time.com/6256741/four-day-work-week-benefits/
[9] https://time.com/6256741/four-day-work-week-benefits/
[10] https://www.weforum.org/agenda/2023/03/surprising-benefits-four-day-week/
[11] https://www.businessinsider.com/8-hour-workday-may-be-5-hours-too-long-research-suggests-2017-9
[12] https://www.weforum.org/agenda/2023/03/surprising-benefits-four-day-week/
[13] https://www.wired.co.uk/article/working-day-time-five-hours
[14] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[15] https://www.theguardian.com/commentisfree/2020/mar/10/five-hour-workday-shorter-book
[16] https://www.wired.co.uk/article/working-day-time-five-hours
[17] https://www.wired.co.uk/article/working-day-time-five-hours
[18] https://harpers.org/archive/1932/10/in-praise-of-idleness/
[19] https://www.newstatesman.com/culture/2023/05/work-four-hours-a-day

Introduction
It was the summer of 1964 when the members of a burgeoning British band named The Rolling Stones found themselves on American soil. They were halfway through their first stateside tour when they made their way to Chess Studios in Chicago, keen to record the follow-up to their debut album. The studio was the hallowed hub of their musical heroes, the cradle of the blues and rock ‘n’ roll genres that shaped their sound. The anticipation was palpable as they stepped into the studio, the very place where legends like Howlin’ Wolf, John Lee Hooker, Bo Diddley, and Muddy Waters had crafted their biggest hits.
In a serendipitous twist of fate, their first encounter at Chess was not with a studio executive or an eager intern but Muddy Waters himself. But he was not wielding a guitar; he was clad in overalls, perched on a ladder, paintbrush in hand, and whitewash streaming down his face. The Stones were startled, and in the confusion, an opportunity emerged, laying bare the perfect juxtaposition of the seemingly mundane and its grand potential.
Keith Richards and the band did not just meet an idol that day; they built a relationship that would later see them tour and work with Muddy, learning first-hand from one of the greats. The Stones’ deep understanding and appreciation of blues music and readiness to learn propelled their career to unprecedented heights, leading them to their first number-one hit, ‘It’s All Over Now’.
Preparation meeting opportunity
This principle of “Preparation Meeting Opportunity,” often defined as luck, is equally applicable in the world of work. It emphasises that when individuals and organisations are mentally and practically prepared, they are more likely to recognise and capitalise on opportunities.
Much like The Rolling Stones recognised the value in learning from a legend like Muddy Waters, forward-thinking companies understand that their talent is their scarcest resource. According to a McKinsey report titled “Organising for the future: Nine keys to becoming a future-ready company,” successful companies anchor their efforts on the principle that talent is indeed scarcer than capital. They continually ask themselves: What talent do we need? How can we attract it? And how can we manage talent most effectively to deliver on our value agenda?
Inclusion & diversity
Inclusion and diversity have surfaced as critical aspects of this talent strategy. A company that fosters an inclusive employee experience becomes an attractive destination for top talent and benefits from the increased profitability associated with diverse leadership.
The Rolling Stones, who had already seen early success, remained hungry for improvement and open to learning from the best in their field. Similarly, organisations and their employees can foster a culture of continuous learning and development, seeking out opportunities in the most unexpected places.
Summary
The story of The Rolling Stones’ encounter with Muddy Waters and their subsequent rise to global fame is not just a story of music and stardom. It’s a tale of recognising and seizing opportunity, preparation meeting chance, and the power of a creative, curious, and prepared mindset.
Whether you’re a fledgling band walking into a legendary recording studio or a company trying to navigate the rapidly changing business landscape, this story serves as a reminder that opportunity can present itself in the most unpredictable ways. The question is, are you ready to grasp it when it does?

Introduction
Coaching has long been viewed as a premium service, frequently offered only to the upper echelons of organisations, the C-suite executives. The potential benefits of coaching in enhancing leadership skills, strategic thinking, and overall effectiveness are well-documented (Gawande, 2011; Coutu & Kauffman, 2009). However, current research also underscores its broader utility across all tiers of an organisation, promoting it as an indispensable instrument for comprehensive personal and professional development (Grover & Furnham, 2016; Wang et al., 2021).
Contemplating a world where coaching benefits could be accessed by every individual within an organisation, irrespective of their position, is invigorating. Envision a Chief Coaching Officer (CCO) guiding this transformation, meticulously integrating coaching into every facet of the organisational structure. Such progressive thinking could trigger a paradigm shift in the corporate landscape.
Coaching now ranks among the top three development tools for modern organisations. A number of global organisations are actively utilising coaching, and those that do show marked individual and team improvements.
Coaching Beyond Conventional Domains
Atul Gawande’s (2011) illuminating article “Personal Best” and his TED Talk narrate how the power of coaching can extend beyond traditional spaces into unexpected realms like the operating theatre. He invites a retired colleague to observe his surgical techniques and offer coaching, effectively bridging the coaching principles of sports or the performing arts with the medical field. This compelling narrative is a testament to the universality of coaching, emphasising its potential for ongoing self-improvement across professional disciplines.
Dispelling Misconceptions Around Coaching
To achieve an effective rollout of a comprehensive coaching strategy, we need to challenge the pre-existing association of coaching with performance improvement or the resolution of performance issues, particularly outside the C-suite. Coaching should be viewed not as a remedial measure but as a proactive tool for fostering personal and professional growth. This proactive view promotes an organisational culture where coaching becomes a regular aspect of professional development rather than a response to performance deficiencies.
Expanding the Horizon of Coaching
Consider an early-career employee mastering technical skills while being coached to negotiate broader career challenges. Or a mid-level manager augmenting their leadership prowess through a customised development journey. The utility of coaching extends beyond conventional confines, offering numerous benefits, including amplified self-awareness, goal attainment, and improved stress management (Grant, 2013; Bozer & Sarros, 2012).
Introducing the Chief Coaching Officer
The advent of a Chief Coaching Officer (CCO) could revolutionise coaching. By nurturing a coaching culture within the organisation, a CCO can make coaching accessible to all, from entry-level professionals to senior executives. The CCO’s responsibilities would include overseeing the execution of coaching programmes, designing an overarching coaching strategy, and ensuring effective resource allocation. Crucially, the CCO would assess the impact of these initiatives on individual and organisational performance, thereby validating the effectiveness of the coaching interventions.
Addressing Potential Hurdles
The transition towards a coaching culture does not come without its challenges. These range from financial constraints and identifying apt coaches to the potential discomfort of professionals who may be reluctant to expose themselves to scrutiny. Nevertheless, these hurdles are not insurmountable. Retirement, for instance, need not symbolise the end of one’s career; the wealth of experience accumulated by retirees could be channelled into coaching roles. Furthermore, investing in coaching can yield significant returns, not just in the form of avoided mistakes but also through augmented performance (Gawande, 2011).
The Final Word
In our ever-competitive and rapidly evolving world, organisations must recognise the potential benefits of expanding the scope of coaching. Empirical evidence supports its effectiveness as a developmental intervention (Grover & Furnham, 2016; Sharma, 2017; Wang et al., 2021). Adopting an organisation-wide approach to coaching can catalyse individual potential and drive company-wide growth. The appointment of a Chief Coaching Officer can be a strategic move towards fostering a culture of continuous learning and improvement. Ultimately, the goal is to enable every professional to achieve their personal best, regardless of their position or field.
References
Bozer, G., & Sarros, J. C. (2012). Examining the effectiveness of executive coaching on coachees’ performance in the Israeli context. International Journal of Evidence Based Coaching and Mentoring, 10(1), 14–32.
Coutu, D., & Kauffman, C. (2009). What can coaches do for you? Harvard Business Review, 87(1), 92–97.
Gawande, A. (2011). Personal best. The New Yorker, October 3.
Grant, A. M. (2013). The efficacy of executive coaching in times of organisational change. Journal of Change Management, 13(4), 411–429.
Grover, S., & Furnham, A. (2016). Coaching as a developmental intervention in organisations: A systematic review of its effectiveness and the mechanisms underlying it. PLoS ONE, 11(7), e0159137.
Sharma, P. (2017). How coaching adds value in organisations: The role of individual-level outcomes. International Journal of Evidence Based Coaching and Mentoring, 15.
Wang, Q., Lai, Y., Xu, X., & McDowall, A. (2021). The effectiveness of workplace coaching: A meta-analysis of contemporary psychologically informed coaching approaches. Journal of Work-Applied Management.