Introduction
Welcome to the leadership paradox. Let’s start with a scenario. You joined your company many years ago, starting in a junior role, where you proved your skills over and over. Maybe you were a born salesperson, maybe a master at client relations, maybe product focused, perhaps something else entirely. Regardless, with every task and every further year at the company, you demonstrated your value. As such, you were rewarded with a series of promotions. Eventually you found yourself in a management position, the leadership role you’d always wanted. Sounds great. All is well, right?
Not necessarily. Because once you’d reached this position, you discovered that the skills you’d demonstrated to get there weren’t needed anymore. You’d ceased to be the one selling; you’d ceased to be the one fronting the call; you’d ceased to be the one making decisions around specification. You might have still tried to do all those things, to involve yourself heavily and bring yourself back to the fore. Perhaps you rolled up your sleeves, called your management style “hands-on”, and justified your involvement in lower-level projects by saying you were a lead-from-the-front type.
To do so is only natural. After all, you got to where you are by being an achiever, someone who not only got things done but prided themselves on being the one actively doing them. And this is where the paradox lies. Because the skills that served you so well and earned you your promotion might be the very same ones preventing you from succeeding in your new role.
This notion is distilled to its essence in the title of Marshall Goldsmith’s bestselling book: “What Got You Here Won’t Get You There” [1]. Being a thriving part of the workforce and being a leader are two entirely different things. The skills are not the same. To achieve, you need to be able to get the best out of yourself. To lead, you need to be able to get the best out of other people.
Oftentimes, newly promoted leaders try to continue as they were before. They want to get their hands dirty, to micro-manage and ensure that every aspect of a project is marked by their fingerprints. But micro-management is not the answer. As Jesse Sostrin PhD, Global Head of Leadership, Culture & Learning at Philips, puts it, leaders need to be “more essential and less involved.” He adds, “the difference between an effective leader and a super-sized individual contributor with a leader’s title is painfully evident” [2].
For many, the adjustment is difficult and can take time. If a leader is too eager to imprint themselves on every aspect of a project, not only is the leader likely to end up feeling overstretched (according to Gallup research, managers are 27% more likely than individual contributors to strongly agree they felt a lot of stress during their most recent workday [3]) but the project will suffer too. Staff will come to feel constrained and undervalued. They may not feel they have the opportunity to grow or express themselves fully. They will be less likely to try new and innovative ideas with someone breathing down their neck or dictating that they must service a single vision at all times rather than being allowed to bring themselves to the fore.
In other words, micro-management offers a whole lot of downsides in exchange for very few upsides. Sostrin proposes that a useful way for a manager to tell if they are taking on too much responsibility is by answering a simple question: If you had to take an unexpected week off work, would your initiatives and priorities advance in your absence? [4] A well-functioning team run by an effective leader should in theory be able to get by without that leader – for a period of time, at least. An organisation that orbits around the whims of a single figure, by contrast, is likely to stall, and fast. It’s why all good managers practise delegation.
Why delegate?
Delegation is something every business practises but not all do well. Just handing an employee some of the work does not count as delegation in any meaningful sense. Successful delegation involves genuinely trusting the employee and granting them autonomy. That can be a scary prospect for a leader used to having a controlling stake in all output. But there are ways to ensure that even without constant supervision, your team is working in a manner you approve of.
The first is obviously to hire smart, capable workers to whom you feel comfortable delegating responsibility. Oftentimes leaders take on extra workplace burdens out of a lack of faith in their team. They think, “I’m not confident they have the ability to do the task,” and so instead choose to take it on themselves. But trust is paramount to any successful workplace. And to paraphrase Ernest Hemingway, the best way to know if you can trust an employee is to trust them – at least until they give you a reason not to. The best thing a leader can do is give their employees a chance and see what happens.
After all, a leader’s job is to get the best out of their employees. As Forbes writer Cynthia Knapek puts it, “Some people work to show you what their superpower is, but a good leader works to show you yours…you’ll never be a good delegator if you’re holding on to the belief that no one can do it as well as you can” [5].
Trusting your team – and shedding the arrogance of presuming you can do everything better yourself – is pivotal to good leadership. Refusing to cede control is the sign of an insecure leader, one who sees their role and status as proportional to their decision-making authority. They think that any act of delegation would lead to a dilution of their power.
This theory is backed up by a 2017 study on psychological power and the delegation of authority by Haselhuhn, Wong and Ormiston. They ultimately found that, “individuals who feel powerful are more willing to share their decision making authority with others. In contrast, individuals who feel relatively powerless are more likely to consolidate decision making authority and maintain primary control” [6]. Delegation is a sign of strength, not weakness. Consolidation of all authority is the remit of the insecure.
Another thing leaders can do to help ensure their team is working autonomously but towards a clear end goal is to have a solid set of principles in place. These principles shouldn’t just highlight the leader’s values and goals but make clear the approach they want to use to achieve them. Shift Thinking founder and CEO Mark Bonchek calls such a set of principles a company’s “doctrine”. Bonchek argues that, “without doctrine, it’s impossible for managers to let go without losing control. Instead, leaders must rely on active oversight and supervision. The opportunity is to replace processes that control behavior with principles that empower decision-making” [7].
Having a guiding set of principles in place lets you delegate responsibility more freely because you know that even with extensive autonomy, your employees are aware of the parameters they should be working within – it keeps them colouring inside the lines.
Evidently, a pivotal part of leadership is and always will be people management. But if a leader has already clearly defined their principles, they’ll find they need to manage their people much less. Some companies that advocate for principles-based management include Amazon, Wikipedia and Google. The proof is in the pudding.
Effective delegation
How delegation is handled contributes enormously to what kind of company one is running and what kind of leader one is. For example, consider two scenarios. In scenario one, a tired and over-involved leader, seeing that they have bitten off more than they can chew with a deadline fast approaching, tells a member of their team that they no longer have time to write a report they had committed to, and thrusts it on the employee to hastily pick up the slack.
In scenario two, a leader identifies a member of their team who they want to write a report. They talk to the employee and tell them that they’ve noticed the employee’s knack for putting facts across concisely and engagingly, and that they want them to put those skills to use in this latest report. They talk through what they want from the project and why this employee is the perfect person to achieve those goals. They make it known that they are available for support should any be needed.
In both examples, the boss is asking their employee to write a report. But in the first, the work is fobbed off on the employee, a chore the leader no longer wants to do. In the second, the leader is identifying the skills of a member of their team, letting the employee know that these are the skills needed for the task at hand, and thus giving the employee both a clear idea of what’s needed from them and a confidence boost.
Sostrin suggests four strategies for successful delegation [8]. First, start with reasoning. As in the example above, this means telling someone not just what work you want done but why – both why the team is working towards a certain goal and why the employee is the person to do it.
Second, inspire their commitment. This, again, is about communication. When the leader relays the task at hand, the employee’s role in it and why it’s important, the employee can understand the bigger picture, not just their specific part of it. They’re then more able to bring themselves to the project, rather than viewing it as simply a tick-box exercise they’re completing for their boss.
Third, engage at the right level. Of course, delegation doesn’t mean that a leader should hand work over to their employees and then never worry about it again. They should maintain sufficient engagement to offer support and accept accountability, but without stifling their team. The right balance depends on the organisation, the project and the personnel involved, but Sostrin suggests that simply asking staff what level of supervision they want can be a good start.
Fourth, practise saying “yes”, “no”, and “yes, if”. That means saying “yes” to demands that are best suited to you, “yes, if” to those that would be better delegated to someone more suited to the specific task, and an outright “no” to those you don’t deem worthwhile.
For example, Keith Underwood, COO and CFO of The Guardian, said that he doesn’t delegate when “the decision involves a sophisticated view of the context the organisation is operating in, has profound implications on the business, and when stakeholders expect me to have complete ownership of the decision” [9].
Kelly Devine, president of Mastercard UK and Ireland, says, “The only time I really feel it’s hard to delegate is when the decision is in a highly pressurised, contentious, or consequential situation, and I simply don’t want someone on my team to be carrying that burden alone” [10].
On top of these four, it’s worth noting the benefits of communicating high-profile, critical company decisions to your team, whether that means layoffs, new investors, or whatever the case may be. Leaders should want their employees to feel part of the organisation. That means keeping them in the loop on not just what is happening but why. Transparency is highly valued and, in turn, valuable.
In summary
It can be all too easy for managers who rose through the corporate ranks to eschew delegation in favour of an auteur-esque approach – shaping a team in their distinct image, if not actively trying to do all the work themselves. But delegation not only makes life less tiring and stressful for the leader, who cannot possibly hope to cover everyone’s work alone, but it also results in a happier, more productive, and likely more capable workforce, one that feels trusted and free to experiment rather than constrained by fear of failure.
Good ideas come from anywhere. Good organisations are built on trust. Good leaders don’t smother their workers but empower them. And with each empowered collaborator, the likelihood of collective success grows.
References
[1] https://marshallgoldsmith.com/book-page-what-got-you-here/
[2] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well
[4] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well
[6] https://www.sciencedirect.com/science/article/abs/pii/S0191886916311527
[7] https://hbr.org/2016/06/how-leaders-can-let-go-without-losing-control
[8] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well
[9] https://hbr.org/2023/03/5-strategies-to-empower-employees-to-make-decisions
[10] https://hbr.org/2023/03/5-strategies-to-empower-employees-to-make-decisions
Introduction
In the world of investing, Charlie Munger is a legendary figure, celebrated for his sage-like wisdom and insightful aphorisms. As Warren Buffett’s right-hand man, his approach is a testament to the power of effective decision-making and wisdom, which he famously attributes to his ‘multi-disciplinary’ approach—a rich mosaic of insights from various academic disciplines, including applied, organisational, and social psychology.
Munger’s perspective is unique and practical because he harnesses these theories and translates them into real-world applications. His approach forms an interesting amalgamation, merging business acumen with psychological theories—a powerful combination that leads to meaningful, insightful, and profitable decisions.
The power of incentives: An intersection of economics and psychology
Munger emphasises the importance of incentives, an intersection of economics and psychology, in shaping human behaviour. “Show me the incentive, and I will show you the outcome,” he famously said. In applied psychology, the operant conditioning theory by B.F. Skinner aligns with Munger’s philosophy. It suggests that behaviour is learned and maintained through immediate consequences or rewards. In organisations, this theory’s implications are vast. By understanding the impact of incentives—be it financial, social, or psychological—leaders can drive behaviour that aligns with the company’s strategic objectives.
Cognitive biases and decision making: A Mungerian perspective
In his famed address at Harvard University in 1995, Munger laid out 25 standard causes of human misjudgement—a compendium of cognitive biases that he believes significantly impact decision-making. These biases are psychological tendencies that can cloud our judgment and influence our decision-making processes. They include confirmation bias (favouring information that confirms our pre-existing beliefs), social proof (the tendency to see an action as more appropriate when others are doing it), and availability bias (relying on immediate examples that come to mind when evaluating a specific topic or decision), among others.
In addition, Munger also discussed biases such as over-optimism, anchoring, and the contrast effect, highlighting how these can distort our understanding of reality and lead to erroneous decisions.
In the field of organisational psychology, these cognitive biases are recognised as significant barriers to rational decision-making. They create an environment susceptible to phenomena such as groupthink, where a desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. These biases can also engender substantial resistance to change, as individuals often favour the familiar and view potential changes with a degree of scepticism and fear.
To mitigate the effects of these cognitive biases, Munger emphasised the importance of cultivating cognitive flexibility and self-awareness in our thinking patterns. Cognitive flexibility involves shifting our perspective and deploying different thinking strategies in different scenarios. Self-awareness, meanwhile, is the conscious knowledge of one’s own character, feelings, motives, and desires. By being aware of our biases, we can better question our initial judgments and decisions and consider alternatives.
Munger also advocates for the idea of using mental models, drawing from a variety of disciplines, to aid in decision-making. This multidisciplinary approach to thinking helps counteract the narrow-mindedness that can result from over-reliance on a single perspective and encourages a more comprehensive understanding of problems, ultimately leading to better decision-making.
Harnessing social influence: Understanding the psychology of persuasion
Munger often references Robert Cialdini’s principles of persuasion—reciprocity, commitment and consistency, social proof, authority, liking, and scarcity. He asserts that these principles don’t just operate on an individual level but can significantly influence organisational culture and drive business outcomes.
For instance, the principle of commitment and consistency can improve organisational efficiency. When employees commit to a task or goal, they are more likely to follow through. Similarly, the principle of social proof plays a role in shaping corporate cultures. People tend to conform to the behaviours of the majority, which can either drive productive work ethics or create a toxic environment.
Navigating the latticework of mental models
Munger advocates for a latticework of mental models, suggesting that one must understand various disciplines to make effective decisions. This is where interdisciplinary knowledge – specifically a blend of applied, organisational, and social psychology – becomes paramount.
One of the key insights of this approach is the understanding that organisations are not just economic entities but psychological and social entities as well. Leaders who appreciate this complexity are more equipped to drive their organisations towards sustainable success.
Conclusion: The intersection of wisdom and psychology
Munger’s wisdom, grounded in various psychological theories, provides a robust framework for understanding and influencing human behaviour in organisations. By weaving together insights from applied, organisational, and social psychology, he teaches us that wisdom is not just about knowledge but also about understanding human nature and leveraging it for collective progress. His philosophies echo the timeless essence of these psychological theories, reminding us that at the heart of every organisation, the human element counts the most.
Introduction
As the world continues to evolve, so does the way we use technology to improve our lives and workplaces. New York City recently adopted final regulations on the use of AI in hiring and promotion processes, marking a significant step in addressing potential biases and ethical concerns surrounding the use of AI in the workplace. The question now is: will other countries follow suit and implement similar regulations?
As AI increasingly moves from automating drudge work to playing a more prominent role in decision-making, it’s vital that we understand the implications and potential risks. The good news is that some countries have already started to take action in this area.
Global progress on regulations
The European Union, for instance, unveiled its proposed AI regulations in April 2021. While these regulations are still in the proposal stage, they represent a comprehensive approach to governing AI use across various sectors, including hiring and promotions. The EU’s proposed rules are designed to ensure that AI systems are transparent, accountable, and respect fundamental rights.
Japan, another key player in AI development, established the AI Technology Strategy Council in 2016. The Council has since released a series of strategic guidelines that consider the ethical, legal, and social issues surrounding AI use. While these guidelines are not legally binding, they provide a framework for companies and the Japanese government to consider as they develop AI systems and technologies.
Ethical challenges
In contrast, countries like China and Russia have prioritised developing and deploying AI for economic and strategic gains, with less emphasis on ethical considerations. However, as AI becomes more integrated into hiring and promotion processes globally, it’s likely that these countries will also have to address the ethical challenges presented by AI.
So, what are the chances of the NYC regulations being successful? It largely depends on how well they are enforced and how willing companies are to adapt their practices. One of the keys to success will be educating employers about the benefits of ethical AI use and the potential risks of non-compliance.
Biases and discrimination
The impact of AI in hiring and promotion goes far beyond automating menial tasks. By leveraging AI’s ability to analyse vast amounts of data, we can make better, more informed decisions in these areas. However, this also raises the risk of perpetuating biases and discrimination.
As we’ve seen in recent years, AI algorithms can sometimes unintentionally reinforce existing biases due to the data they’re trained on. By implementing regulations like those in NYC, we can help ensure that AI is used responsibly and that it truly serves to benefit all members of society.
The key takeaway is that while the use of AI in hiring and promotion can be hugely beneficial, it’s essential to have regulations in place to ensure ethical practices. Now that New York City has taken this bold step, we may well see more countries and cities follow in its footsteps.
Conclusion
In conclusion, the adoption of AI regulations in New York City is a significant move towards ensuring the responsible and ethical use of AI in hiring and promotion processes. As AI continues to play an increasingly important role in our lives, it’s crucial that governments and businesses alike prioritise transparency, accountability, and the protection of fundamental rights. By doing so, we can harness the power of AI to create a fairer, more inclusive society – and that’s something worth celebrating.
So, will other countries follow New York City’s lead? I believe they will, and it’s only a matter of time before AI regulations become a global norm. Let’s keep the conversation going, stay informed, and make the best decisions.
Introduction
Earlier this month, Elvis Costello played in Dublin, performing without the full line-up of the Attractions and accompanied only by his long-time collaborator Steve Nieve. After journeying together through 45 years of tour buses, dressing rooms, hotel lounges, flights, recording studios, and live performances, the seamless synergy between Costello and Nieve is undeniable. Their bond has evolved into an intuitive language, subtle to an outsider but vividly clear to them. That shared language symbolises the essence of collaboration—a universal phenomenon that crosses fields and industries.
Collaboration: the term is a buzzword in boardrooms, often discussed in strategy meetings and corporate corridors. Morten T. Hansen, in his pivotal book, ‘Collaboration: How Leaders Avoid the Traps, Build Common Ground, and Reap Big Results,’ explains that the core of collaboration isn’t about amassing tangible assets. Rather, it’s about unlocking value through shared knowledge and relationships.
If you’ve ever viewed collaboration as elusive, difficult to implement, or limited to a select few, it’s time to rethink that perspective. Drawing on insights from scholars like Robert Axelrod, we’re making the case that collaboration isn’t just an inherited trait like ‘DNA.’ It’s also influenced by factors such as leadership and vision, which can be actively nurtured to become a potent force for collective action within any organisation.
Collaboration in practice
Public opinion on collaboration varies. While some see it as vital to effective organisational practice, others dismiss it as mere managerial jargon. The truth lies somewhere in between: collaboration offers tangible benefits and value when practised effectively. Given the rapid changes in our world, the importance of collaboration has never been greater. With emerging nations reshaping the global economic landscape and partnerships becoming increasingly essential, is it now a non-negotiable asset?

From the arts and sports to science and business, effective collaboration enriches our collective experiences and is indispensable for leadership. Symbiotic relationships like that between Xavi Hernandez and Andres Iniesta in football, or Michael Jordan and Scottie Pippen in basketball, have redefined standards for teamwork. These duos show that collaboration magnifies individual brilliance to create game-changing moments.

In facing global challenges like climate change, the need for collaboration extends beyond industries to nations and continents. Initiatives like the Paris Agreement represent concerted efforts to combat an existential threat, underscoring the power of collective action.
Collaborative discoveries
In science, the importance of collaboration is ever-present. The International Space Station (ISS) is a testament to what can be achieved through international teamwork, bringing diverse skill sets and perspectives together to reach a common goal. Historical collaborations like that between Albert Einstein and Marcel Grossmann laid the foundation for ground-breaking theories like general relativity.
In the wake of the COVID-19 pandemic, unprecedented levels of global scientific collaboration led to the rapid development and distribution of vaccines. This real-time, high-stakes cooperation among nations, scientists, and pharmaceutical companies demonstrated that extraordinary outcomes are possible when humanity unites for a common cause.
The business of collaboration
In the business world, partnerships have also yielded significant results. Procter & Gamble, which began as a small partnership, has grown into a global giant. The collaborative synergy between William Procter and James Gamble transformed a modest venture into an empire.

Modern workspaces are better designed to facilitate such collaborative endeavours, but more can be done. As organisational psychologist Adam Grant proposes, people may work from home but come to the office to collaborate.

Artificial intelligence is adding a new dimension to team collaboration, evolving from a tool for basic tasks to handling complex roles like data analysis. Integrating AI empowers teams to make agile decisions and fosters a conducive, flexible work environment. In the age of remote work, tools like Slack and Zoom have become indispensable for team collaboration, breaking down geographical barriers and enabling real-time communication and project management.
Practical steps for effective collaboration
As the intricacies of collaboration unfurl, understanding its practical implementation becomes paramount. Begin with a shared vision, ensuring everyone recognises the endgame. Assemble diverse teams, ensuring a mix of expertise and perspectives. Prioritise transparent communication, creating a culture where ideas flow freely. Regular check-ins are essential, not just to track progress but to celebrate milestones. Equip teams with the right tools and training, fostering an environment conducive to collaboration. And remember, genuine feedback, whether praise or constructive critique, is the cornerstone of continuous improvement.
Unpacking the potential of collaboration
Collaboration isn’t a one-size-fits-all endeavour; it’s a nuanced and intricate dance that varies depending on context. In contemporary business settings, traditional hierarchical frameworks make way for more decentralised, cross-functional operations. This shift calls for a managerial approach that goes beyond mere oversight to include motivation and influence. As evidenced by the rise of virtual teams, mastering the complexities of modern teamwork often determines organisational success or failure.
Within this complex landscape, the durability of collaborative relationships is critical. It isn’t just the responsibility of the individuals involved; it must be woven into the fabric of organisational practices. Emerging technologies like blockchain also illustrate the potential of decentralised, collaborative systems. With its network of nodes working together to validate transactions, this technology represents a ground-breaking form of collaborative interaction.
Social psychologists like Debra Mashek outline various levels of collaborative engagement, each requiring its own set of rules based on the degree of trust, commitment, and resource-sharing. Dr. Carol D. Goodheart further emphasises that effective collaboration can significantly amplify organisational resources, an aspect often overlooked due to inadequate training in collaborative practices.
The real challenge lies in integrating the value of collaboration into daily operations. Investments in cultural and behavioural initiatives often dissipate when confronted with the rigid processes of ‘business as usual.’ Existing behavioural assessment tools also fall short, lacking the specificity needed to capture the multifaceted nature of collaboration.
Moving forward, an integrative approach is essential—one that aligns cultural initiatives with business processes and enriches traditional assessments with collaboration-focused metrics. The benefits of collaboration are clear; we can’t afford to leave them to chance. Fostering a genuinely collaborative environment requires a thoughtful convergence of culture, process, and leadership.
Attributes for greater collaboration
Research has shown that the following attributes enable greater collaboration within an organisation:
• Strategically Minded: Individuals can see beyond their immediate roles and consider broader objectives. This fosters cooperative behaviour and long-term value.
• Strong Team Orientation: Crucial for effective collaboration. It enables individuals to focus on common goals, adapt to team dynamics, and foster an inclusive environment.
• Effective Communication: Vital for success, characterised by openness, two-way dialogue, and responsiveness.
• Openness to Sharing: Encompasses a willingness to discuss ideas, accept suggestions, and change one’s mind, thereby encouraging meaningful collaboration.
• Creativity and Innovation: Willingness to think outside the box and find intelligent solutions to complex problems.
• High Levels of Empathy: Demonstrated understanding of others’ perspectives and emotions, thereby enhancing teamwork and customer focus.
• Inspiring Leadership: Effective leaders focus on collaboration and people management, avoiding micromanagement and bossy attitudes.
Conclusion
Collaboration is far more than a corporate buzzword; it is a nuanced, multi-layered approach that fundamentally influences all sectors of human endeavour—from the arts and sciences to sports and business. Partnerships like that of Lennon and McCartney have become legendary in the arts, transforming the music landscape. In science, collaborations like the International Space Station embody the pinnacle of what international teamwork can achieve. In the business world, the symbiosis between William Procter and James Gamble shows how small partnerships can turn into global giants.
As the work landscape shifts, with Adam Grant suggesting the office as a ‘crucible’ for collaboration even in the age of remote work, it becomes evident that we need to understand the complexities and subtleties involved more deeply. Scholars like Debra Mashek and Carol D. Goodheart offer valuable insights into the transformative power of collaboration, urging us to see it not as an optional asset but as a vital force for societal advancement. And in facing global challenges, whether it’s climate change or the complexities of emerging technologies like blockchain, collaboration scales from the individual to the global level, making it a non-negotiable asset for collective progress.
By actively embracing and nurturing the diverse forms of collaborative interaction, we do more than enrich our individual lives; we catalyse collective progress, paving the way for unforeseen possibilities and ground-breaking innovations. This makes it imperative to appreciate the concept of collaboration and invest in creating a culture, adopting processes, and establishing leadership that intentionally fosters collaborative engagement.
As we look toward the future, the question is no longer whether collaboration is beneficial but how we can cultivate it to unlock its full potential. This calls for proactive measures from individuals and organisations to move from mere understanding to actively promoting a collaborative ethos. Our collective progress depends on it.
References:
Axelrod, R. (1984). The Evolution of Cooperation. Basic Books.
Chakkol, M., Finne, M., & Johnson, M. (2017). Understanding the psychology of collaboration: What makes an effective collaborator. Institute for Collaborative Working, March.
Hansen, M. (2009). Collaboration: How leaders avoid the traps, build common ground, and reap big results. Harvard Business Press.
Lipnack, J., & Stamps, J. (2008). Virtual teams: People working across boundaries with technology (3rd ed.). John Wiley & Sons.
Mashek, D. (2016). Collaboration: It’s Not What You Think. Psychology Today, February 26.
Introduction
The UK government caused controversy recently by rolling back its commitments to net zero. Commenting on the backpedalling, Ford’s UK chair Lisa Brankin raised concerns, citing the sweeping upheavals Ford (and every company in the automotive sector) had undergone in order to act in accordance with the now-reversed ban on the sale of new petrol and diesel cars from 2030.
As well as the UK 2030 target being “a vital catalyst to accelerate Ford into a cleaner future,” Brankin stated that, “Our business needs three things from the UK government: ambition, commitment and consistency. A relaxation of 2030 would undermine all three” [1].
Essentially, what the government has done is undermine the trust of Ford and every other leading player in the automotive sector. They said they were committed to something and then showed they were not. They had the whole sector aligned and working at breakneck speed to overhaul old practices, only for those companies to discover they’d been wasting their time and money.
As the old saying goes, trust is hard to earn and easy to lose. And in business, a loss of trust is catastrophic.
The value of trust
Research published by Harvard Business Review found that workers at companies where trust is high report 106% greater energy in the office, 74% lower stress levels, 76% greater engagement, and 50% more productivity than their peers at low-trust businesses [2].
Meanwhile PwC reports that 91% of business executives say their ability to build and maintain trust improves the bottom line (including 50% who strongly agree), 58% of consumers say they have recommended a company they trust to friends and family, and 64% of employees say they recommended a company as a place to work because they trusted it [3].
Trust pays. It builds relationships – both internally and with clients – that only grow stronger with time. It produces happier, more productive employees and pays dividends in profit. Evidently, then, it’s something worth investing in. But to do so, we first need to clarify what we mean by trust.
What is trust?
Writing for Forbes, John Hall, a motivational speaker and co-founder of the time and scheduling management app Calendar, says workplace trust relies on two fundamentals: “First, every team member is making their best effort to further the interests of the company; second, everyone assumes that fact about everyone else on the team unless they see evidence to the contrary” [4].
In lieu of trust falls, office ping pong or other more performative variants of workplace integration, trust boils down to something more fundamental: whether you are doing your best and giving everyone else in your team the courtesy of assuming they’re doing the same.
This second part can prove especially difficult. We can control our own work ethic, not others’. And within almost all office environments there’s a sense of competitiveness: the rate and quality of your output exist in constant competition with those of your colleagues. Who’s in the boss’s good books? Who’s getting the bonus? The promotion?
All these considerations can’t help but cultivate friction. Out of pride, or to build up our own sense of self-worth, we may like to think our colleagues aren’t working as hard or to as high a standard as we are. This is misguided. We need to bestow trust freely and unsparingly. When considering how best to decide who is trustworthy, Ernest Hemingway put the answer most succinctly: “The best way to find out if you can trust somebody is to trust them” [5].
It’s a leap of faith. That’s what trust is at its core. And until somebody gives you a reason not to trust them, your best bet is to give them the benefit of the doubt.
The science of trust
In an era marred by a seemingly endless carousel of corporate jargon and buzzwords, it’s possible to read about the notion of trust and think it’s more of the same – a benevolent, ultimately abstract notion that holds no quantifiable value but makes for a useful throwaway LinkedIn post or hastily churned out blog. But there is a science to trust, as demonstrated by Paul J. Zak, the founding director of the Center for Neuroeconomics Studies and a professor of economics, psychology, and management at Claremont Graduate University, and the CEO of Immersion Neuroscience.
Having seen in rodents that a rise in the brain’s oxytocin levels signified that another animal was safe to approach, Zak wondered if the same was true for humans. He conducted an experiment following the model of Nobel laureate in economics Vernon Smith [6]. In the experiment, a participant would choose an amount of money to send to a stranger via computer, knowing that the amount they chose to send would triple once they’d sent it. The recipient would then have the option of sharing this tripled amount with the sender or keeping all the cash for themselves. It was a trust exercise made of two parts. First, how much do you send? Second, do you share or steal?
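To make the mechanics of the exchange concrete, here is a minimal sketch of the game’s payoff arithmetic in Python – the tripling follows the description above, but the dollar figures and the helper function are illustrative assumptions, not the study’s actual stakes or materials:

```python
# Minimal sketch of the trust game described above (illustrative,
# not the study's actual code). The tripling of the transfer follows
# the experiment's design; the $10 endowment is an assumed figure.

ENDOWMENT = 10   # dollars the sender starts with (assumption)
MULTIPLIER = 3   # every dollar sent is tripled in transit (per the study)

def trust_game(amount_sent: float, share_returned: float) -> tuple[float, float]:
    """Return (sender_payoff, receiver_payoff) for one round.

    amount_sent    -- how much of the endowment the sender transfers (trust)
    share_returned -- fraction of the tripled pot the receiver sends back
                      (trustworthiness)
    """
    pot = amount_sent * MULTIPLIER
    returned = pot * share_returned
    sender_payoff = ENDOWMENT - amount_sent + returned
    receiver_payoff = pot - returned
    return sender_payoff, receiver_payoff

# Full trust met with an even split leaves both parties better off...
print(trust_game(10, 0.5))   # (15.0, 15.0)
# ...trust met with betrayal leaves the sender with nothing...
print(trust_game(10, 0.0))   # (0.0, 30.0)
# ...and sending nothing is "safe" but forfeits the gains from cooperation.
print(trust_game(0, 0.0))    # (10.0, 0.0)
```

The asymmetry of the payoffs is the point of the design: sending money exposes the sender to loss, so the amount sent is a clean measure of trust, and the share returned a measure of trustworthiness.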
To measure oxytocin levels during the exchange, Zak and his colleagues developed a protocol to draw blood from people’s arms before and immediately after they made decisions to trust others (if they were senders) or to be trustworthy (if they were receivers). The participants were not informed as to the content of the study (and even if they had been, they still would have had no control over the amount of oxytocin their bodies released).
They found that the more money people received (denoting greater trust on the part of senders), the more oxytocin their brains produced. The amount of oxytocin recipients produced then also predicted how trustworthy – that is, how likely to share the money – they would be. To prove that this was not just a result of the brain randomly generating chemicals, they performed further tests, administering doses of synthetic oxytocin into the brain through nasal spray and comparing participants who’d had a dose with those who’d had a placebo. They found that giving people 24 IU of synthetic oxytocin more than doubled the amount of money they sent to a stranger.
To ensure that the oxytocin spray did not cognitively impair the participants – and thus that their actions were born of trust rather than brain fog – they performed other tests, this time replacing the money test with a gambling model. They found that increased oxytocin led to no rise in risk-taking. In other words, the sole and genuine effect of increased oxytocin was to reduce the fear of trusting a stranger.
Over the following ten years, during which he conducted various further tests on oxytocin levels, Zak found that stress is a potent oxytocin inhibitor, as well as learning that oxytocin increases a person’s empathy, which of course is a vital tool for any act that requires collaboration.
How to develop trust
There is a gap between how executives see trust in business and how employees and customers see it. According to PwC, 84% of business executives think that customers highly trust their company, yet only 27% of customers say the same. Similarly, 79% of business executives say their employees trust the company, but only 65% of employees agree [7]. Clearly, then, the first step a higher-up can take to improve trust in the company is to be aware that it’s lacking.
Zak’s continued research shows that recognition and attainment of goals are the most proven ways of garnering trust. “The neuroscience shows that recognition has the largest effect on trust when it occurs immediately after a goal has been met, when it comes from peers, and when it’s tangible, unexpected, personal, and public” [8].
Setting goals that are difficult but achievable is crucial. The moderate stress of the task releases neurochemicals, including oxytocin and adrenocorticotropin, that intensify people’s focus and strengthen social connections. However, the challenges have to be achievable and have a clear endpoint. Research shows that vague goals cause employees to give up before they’ve even started.
Pivotal to trust within an organisation are messaging and communication. Internal trust networks are hard to maintain because the flow of communication is so much looser and less restrained than in a strictly employee-client relationship. Organisations send their workers multiple, often contradictory messages every day. Different departments work towards distinct, sometimes contrasting goals. Maintaining alignment to a clear, single message is extremely difficult and does not happen by accident.
Inconsistent messaging, inconsistent standards and false feedback all contribute to the sense of a company unworthy of trust. If one boss is asking workers to pull one way while another boss asks them to pull the other, employees will lose faith in management. This is even more true when it is just one boss flip-flopping on the direction of travel, unsure of their own wants.
Regarding standards, if a boss sets a line, verbal or written, as to what constitutes acceptable behaviour or the demanded standard of work, but then fails to live up to that standard themselves, trust will quickly dissipate. The same is true if they allow others to get away with clear or repeated breaches, especially if the boss is thought to be playing favourites. It is for managers to set the tone and take responsibility for their organisation. A leader’s words and actions are ascribed deep meaning by their employees, and will be scrutinised heavily. Trust starts at the top and filters down.
Former Herman Miller CEO Max De Pree once said, “The first responsibility of a leader is to define reality. The last is to say thank you. In between the two, the leader must become a servant” [9]. That ability to humble oneself is pivotal to good management.
One way leaders can achieve this is by being willing to ask for help from their workers rather than just telling others what to do. This builds trust and connection with employees and is the mark of a secure leader, far more trustworthy than one who pretends to have all the answers. As Steve Jobs said, “It doesn’t make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do” [10]. Ask questions and show a willingness to learn, and you can bet your employees will do the same in turn.
Trust today
Extending trust to employees is of greater importance today than ever before due to the prevalence of home and hybrid working. Employers are not able to see and covertly monitor their employees throughout the day as they can in an office, and so must trust their teams to get the work done in a more autonomous fashion.
People can meet the same standards of in-office productivity from home on their own, less constrained schedule. The numbers back it up [11]. But still some companies are wary. We’ve all seen stories of organisations that want to remotely monitor the usage of their workers’ computers throughout the day to check that they are always at their desk during work hours. This draconian approach shows a total lack of trust. Who would want to work for a company that held them in such low regard? What kind of atmosphere does that cultivate? We talk a lot about company culture. Well, a culture that doesn’t trust its staff is unlikely to get the best out of them, and frankly doesn’t deserve to.
Workers will only grow more remote with time. The traditional 9-5 is unlikely to return. Employers need to bestow the requisite levels of trust to get their employees thriving no matter where they are.
Trust is money
Hall recommends we treat trust like we treat money: “Save it carefully, and spend it wisely. You may not be able to measure it like you can a bank balance, but sooner or later, you’ll see it there, too” [12].
Trust is pivotal to any team endeavour and business is no different. Businesses need to cultivate trust with their consumers. To do so, they must first build it internally, starting from the top. That requires consistent messaging and open communication. It requires humility from leaders, not bullish overconfidence. It requires vulnerability and a willingness to trust someone until they prove you wrong, which inevitably some will. But for companies able to garner a truly trusting environment, one in which every worker is giving their best and working under the assumption that each of their colleagues is doing the same, the rewards are enormous.
References
[3] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html
[6] https://hbr.org/2017/01/the-neuroscience-of-trust
[7] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html
[8] https://hbr.org/2017/01/the-neuroscience-of-trust
[9] https://hbr.org/2017/01/the-neuroscience-of-trust
[10] https://businessfitness.biz/hire-smart-people-and-let-them-do-their-jobs/
[11] https://www.businessnewsdaily.com/15259-working-from-home-more-productive.html
Introduction
As the labour market evolves, organisations have been reconsidering the importance and relevance of degree qualifications in their hiring practices. A trend known as “degree inflation” – an increase in job descriptions requiring degrees even when the roles themselves hadn’t changed – took hold in the early 2000s and accelerated in the aftermath of the 2008-2009 Great Recession. More recently, however, the trend has been reset, with degree requirements reduced across numerous roles.
This shift is particularly noticeable in middle-skill positions, which require some post-secondary education or training but not necessarily a four-year degree. The reset is also evident, though to a lesser extent, in higher-skill positions. Two waves have driven this trend: first, a structural reset that started in 2017, characterised by a move away from degree requirements in favour of demonstrated skills and competencies; second, a cyclical reset that began in 2020, prompted by the Covid-19 pandemic, in which employers temporarily relaxed degree requirements to find skilled workers during the health crisis.
Impact on equality
In the case of Ireland, the shift away from degree requirements has been particularly impactful in increasing female participation in the workforce. According to the latest Labour Market Pulse, published by IDA Ireland in partnership with Microsoft and LinkedIn, skills-based hiring and flexible working conditions are integral to increasing female participation in the Irish labour market. The adoption of a skills-first hiring approach has the potential to increase the overall talent pool in Ireland more than six-fold – and by 20% more for women than for men in traditionally male-dominated occupations.
Hard skills vs soft skills
Despite the promising trends, it’s important to note that the degree inflation reset is a work in progress. A significant percentage of job descriptions still list degree requirements, effectively walling out a vast number of potential employees from the candidate pool. Additionally, while many companies have announced the removal of degree requirements, they often still exhibit a higher-than-average demand for college degrees in practice. This suggests that, while hard skills can be easily confirmed, degrees are still seen as a proxy for soft skills, which are harder to assess.
However, the shift away from degree-based hiring compels companies to think more carefully about the skills they need. More explicit descriptions of desired capabilities in job postings are increasing awareness among applicants about the importance of developing soft skills. This could influence skills providers to consider how they can update their curricula to include these skills.
Diversified talent pool
The elimination of inflated degree requirements is a critical step towards achieving equity in the labour market. Companies should reassess the assumptions underlying their recruitment strategies, reconsidering the use of blunt and outdated instruments in favour of more nuanced, skills-focused approaches. This shift is already opening attractive career pathways for traditionally overlooked workers due to the lack of a four-year degree. The potential result is a win-win situation: greater equity for job seekers and a more robust, diversified talent pool for companies to draw from.
Skills-first approach
This trend is particularly beneficial in the Irish context, where the government has set ambitious targets for gender equality and equal representation in leadership. A skills-first approach could be instrumental in activating the skills of underrepresented groups, including women, people with disabilities, and those without third-level education. Ireland can pave the way for a more inclusive, equitable future by eliminating barriers to well-paying jobs.
If you wish to introduce skills-based initiatives, it is critical to contextualise these ideas within your company’s unique circumstances, set clear objectives, and develop strategies for implementation. Here are some actionable insights based on the above points:
- Develop a Learning & Development Strategy: Understand your company’s current capabilities and identify the areas where there’s a skill gap. Invest in the creation of learning and development programs that target these gaps. These could be in-house training, online courses, or educational partnerships.
- Empower Employees to Shape Their Career Path: Create platforms or mechanisms that allow employees to share their interests and career goals. This could be an annual employee survey, open discussions, or a tool integrated into your HR system.
- Create Cross-functional Opportunities: Make it a point to allow employees to participate in projects or tasks outside their usual scope. This will not only allow them to broaden their skills but also to get a better understanding of the overall company operations.
- Incentivise Learning: Make learning an integral part of your company’s culture. Encourage employees to take time out of their work schedule to engage in training and learning activities. Offer rewards or recognition for those who actively participate in these programs or demonstrate new skills.
- Revamp Your Hiring Process: Transition from a credentials-based hiring approach to a skills-based one. Re-evaluate your job descriptions to focus more on the skills required to perform the job rather than academic or professional credentials.
- Introduce Skills Assessments: Implement mechanisms to objectively measure a candidate’s skills during the hiring process. This could include technical assessments, practical exercises, or situational judgement tests.
- Promote Lifelong Learning During Recruitment: During interviews, discuss the company’s learning and development programs and the opportunities for career growth within the organisation. This can make your company more attractive to potential hires.
Introduction
The role of a CEO, once defined by strategy charts and bottom lines, is undergoing a sea change. With constant technological advances, changing business complexities, and societal expectations, CEOs are required to expand their expertise beyond traditional business acumen. Today, a truly great CEO needs to master the art of social skills, demonstrating a keen ability to interact, coordinate, and communicate across multiple dimensions.
As the business landscape continues to grow more complex, the ability to navigate this intricacy has become a defining factor in effective leadership. This holds true both for large, publicly listed multinational corporations and for medium to large companies operating in a rapidly evolving marketplace. Leaders must possess the skills and acumen to make informed decisions and steer their organisations toward success.
Social skills
Top executives in these firms are expected to harness their social skills to coordinate diverse and specialised knowledge, solve organisational problems, and facilitate effective internal communication. Further, the interconnected web of critical relationships with external constituencies demands that leaders demonstrate adept communication skills and empathy.
The proliferation of information-processing technologies has also played a crucial role in defining a CEO’s success. As businesses increasingly automate routine tasks, leadership must offer a human touch—judgment, creativity, and perception—that can’t be replicated by technology. In technologically-intensive firms, CEOs need to align a heterogeneous workforce, manage unexpected events, and negotiate decision-making conflicts—tasks best accomplished with robust social skills.
Equally, with most companies relying on similar technological platforms, CEOs need to distinguish themselves through superior management of the people who utilise these tools. As tasks are delegated to technology, leaders with superior social skills will find themselves in high demand, commanding a premium in the labour market.
Transparency
The rise of social media and networking technologies has also transformed the role of CEOs. Moving away from the era of anonymity, CEOs are now expected to be public figures interacting transparently and personally with an increasingly broad range of stakeholders. With real-time platforms capturing and publicising every action, CEOs need to be adept at spontaneous communication and anticipate the ripple effects of their decisions.
Diversity & inclusion
In the contemporary world, great CEOs also need to navigate issues of diversity and inclusion. This calls for a theory of mind—a keen understanding of the mental states of others—enabling CEOs to resonate with diverse employee groups, represent their interests effectively, and create an environment where diverse talent can thrive. (See our article on the Chief Coaching Officer for an alternative solution to this issue)
Hiring strategies
Given this backdrop, it is essential for organisations to refocus their hiring and leadership development strategies. Instead of relying on traditional methods of leadership cultivation, companies need to systematically build and evaluate social skills among potential leaders.
Current practices, such as rotating through various departments, geographical postings, or executive development programs, aren’t enough. Firms need to design a comprehensive approach to building social skills, even prioritising them over technical skills. High-potential leaders should be placed in roles that require extensive interaction with varied employee populations and external constituencies, and their performance should be closely monitored.
Assessing social skills calls for innovative methods beyond the traditional criteria of work history, technical qualifications, and career trajectory. New tools are needed to provide an objective basis for evaluating and comparing people’s abilities in this domain. While some progress is being made with the use of AI and custom tools for lower-level job seekers, there is a need for further innovation in top-level searches.
Conclusion
In conclusion, the role of the CEO is more multifaceted than ever. The modern world demands executives to possess exceptional social skills, including effective communication, empathetic interaction, and proactive inclusion. Companies need to recognise this change and adapt their leadership development programs accordingly to cultivate CEOs who can effectively lead in the 21st century.
The persistent pulse of inquiry in history
Throughout history, our innate curiosity has been the heartbeat of progress, driving us from basic questions about nature, like “Why does it rain?” to profound existential inquiries, such as “Do we have free will?”. In today’s fast-paced world, the art of asking questions feels somewhat overshadowed by the avalanche of information available. Yet, recognising what we don’t know often serves as the true essence of wisdom.
One lasting method of exploring knowledge through questioning is the Socratic method, a tool from ancient Greece that aids critical thinking, helps unearth solutions, and fosters informed decisions. Its endurance for over 2,500 years stands as a testament to its potency. Plato, a student of Socrates, immortalised his teacher’s teachings through dialogues or discourses. In these, he delved deep into the nature of justice in the “Republic”, examining the fabric of ideal societies and the character of the just individual.
Questions have not only transformed philosophy but also propelled innovations in various fields. Take, for instance, Alexander Graham Bell, whose inquiries led to the invention of the telephone, or the Renaissance challenges to traditional beliefs that yielded breakthroughs in art, science, and philosophy. With their profound questions about existence and knowledge, the likes of Kant and Descartes have shaped the philosophical narratives we discuss today.
Critical questioning has upended accepted norms in the scientific realm, leading to paradigm shifts. For example, Galileo’s scepticism of the geocentric model, anticipated in antiquity by Aristarchus and Pythagoras and revived by Copernicus, paved the way for the ground-breaking discoveries of Newton and Einstein. At its core, every scientific revolution was birthed from a fundamental question.
On the educational front, the importance of questioning is backed by modern research. Historically, educators have utilised questions to evaluate knowledge, enhance understanding, and cultivate critical thinking. Rather than simply prompting students to recall facts, effective questions stimulate deeper contemplation, urging students to analyse and evaluate concepts. This enriches classroom experiences and deepens understanding in experiential learning settings.
By embracing this age-old method and recognising the power of inquiry, we can better navigate the complexities of our contemporary world.
Questions through the ages: an enduring pursuit of truth
Throughout the annals of time, the act of questioning has permeated our shared human experience. Ancient civilisations like the Greeks laid intellectual foundations with their spirited debates and dialogues, and the sheer depth and diversity of their inquiries still stand out. These questions spanned from the cosmos’ intricate designs to the inner workings of the human soul.
Historical literature consistently echoed this thirst for understanding, whether in the East or West. It wasn’t just about obtaining answers; it celebrated the journey of arriving at them. The process of probing, introspection, and subsequent revelation holds a revered spot in our collective memory. The reverence with which we’ve held questions, as seen through the words of philosophers, poets, and thinkers, showcases the ceaseless human spirit in its quest for knowledge.
In today’s interconnected world, the legacy of these inquiries remains ever-pertinent. We live in an era of information, a double-edged sword presenting knowledge and misinformation. As we grapple with this deluge, the skills of discernment and critical inquiry, inherited from our ancestors, are invaluable. It’s no longer just about seeking answers but about discerning the truths among many voices.
With the current rise in misinformation and fake news, a sharpened sense of questioning becomes our compass, guiding us through the mazes of contemporary challenges. By honouring the traditions of the past and adapting them to our present, we continue our timeless pursuit of truth, ensuring that the pulse of inquiry beats strongly within us.
Understanding the Socratic Method
Having recognised the age-old reverence for inquiry, it becomes imperative to explore one of its most pivotal techniques: the Socratic method. Socrates, widely regarded as a paragon of wisdom, believed that life’s true essence lies in perpetual self-examination and introspection. His approach was unique in its time, as he dared to challenge societal norms and assumptions. When proclaimed the wisest man in Greece, he responded not with complacency but with probing inquiry.
The Socratic method transcends a mere question-answer paradigm. Instead, it becomes a catalyst, prompting deep reflection. This dialectical technique fosters enlightenment, not by spoon-feeding answers but by kindling the flames of critical thinking and understanding. The beauty of this method rests not solely in the answers it might yield, but in the journey of introspection and dialogue it necessitates.
Beyond philosophical discourses, this method resonates powerfully in contemporary educational spheres. It underscores that genuine knowledge transcends rote memorisation, emphasising comprehension and enlightenment. This reverence for knowledge stresses the imperative of recognising our limitations, fostering an ethos where learning is ceaseless and dynamic.
In our information-saturated age, the Socratic method’s principles are not just philosophical musings but indispensable tools. According to Statista, only about 26% of Americans feel adept at discerning fake news, while a concerning 90% inadvertently propagate misinformation. Herein lies the true power of the Socratic approach. It teaches us discernment, evaluation, and the courage to seek clarity continuously. By integrating this method into our lives, we are better equipped to navigate our intricate world, fostering lives marked by clarity, purpose, and profound understanding.
Why the question often surpasses the answer
Having delved into the rich tapestry of historical inquiry and the transformative power of the Socratic method, one may wonder: Why such an emphasis on the question rather than the answer?
Throughout our educational journey and societal conditioning, we are trained to seek definite conclusions. Yet, as Socrates demonstrated through his dialogues, there’s profound wisdom in embracing the exploration inherent in questioning. His discussions rarely aimed for definitive answers, suggesting that the reflective process, rather than the conclusion, held deeper significance.
Imagine a complex puzzle. While the completed picture might offer satisfaction, aligning each piece, understanding its intricacies, and appreciating its nuances truly enriches the experience. Similarly, questions, even those without clear-cut resolutions, can expand our horizons, provoke self-assessment, and challenge our preconceived notions. This process broadens our perspectives and fosters a more holistic understanding of our surroundings.
By valuing the act of questioning, we equip ourselves with the tools to navigate ambiguity, confront our limitations, and engage with the world more thoughtfully and profoundly.
The Socratic Method in contemporary frameworks
Socratic questioning involves a disciplined and thoughtful dialogue between two or more people, and its methodologies, rooted in ancient philosophy, remain instrumental in today’s diverse contexts. In the realm of academia, especially within higher education, this collaborative form of questioning is a cornerstone. Educators don’t merely transfer information; they challenge students with introspective questions, compelling them to reflect, engage, and critically evaluate the content presented.
Beyond the classroom, the applicability of the Socratic method stretches wide. Business environments, such as boardrooms and innovation brainstorming sessions, harness the power of Socratic dialogue, pushing participants to confront and rethink assumptions. Professionals employ the method in therapeutic and counselling settings to guide clients in introspective exploration, encouraging clarity and self-awareness.
Through its emphasis on continuous dialogue, deep reflection, and the mutual pursuit of understanding, this age-old method remains a beacon, guiding us as we navigate the ever-evolving complexities of our modern world.
Conclusion: the timeless art of inquiry
From the cobbled streets of ancient Athens to contemporary classrooms, boardrooms, and counselling sessions, the enduring legacy of the Socratic method attests to the potent force of inquiry. By valuing the exploratory process as much as, if not more than, the final insight, we pave a path towards richer understanding, intellectual evolution, and the limitless possibilities of human achievement.
In today’s deluge of data and information, the allure of swift answers is undeniable. Yet, Socrates’ practice reminds us of the transformative power held in the act of questioning. Adopting such a mindset, as this iconic philosopher once did, extends an open invitation to a life punctuated by curiosity, wonder, and unending discovery.
Depending on who you listen to, working from home is either proof of a declining work ethic – evidence of and contributor to a global malaise that is hampering productivity, decimating work culture and amplifying isolation and laziness – or it’s a much-needed break from overzealous corporate control, finally giving workers the autonomy to do their jobs when, where and how they want to, with some added benefits to well-being, job satisfaction and quality of work baked in.
Three years on from the pandemic that made WFH models ubiquitous, the practice’s status is oddly divisive. CEOs malign it. Workers love it. Like most statements around WFH, though, that analysis is overly simplistic. So what’s the actual truth: is WFH good, bad or somewhere in between?
The numbers
Before the pandemic, Americans spent 5% of their working time at home. By spring 2020 the figure was 60% [1]. Over the following year, it declined to 35%, and it has now stabilised at just over 25% [2]. A 2022 McKinsey survey found that 58% of employed respondents have the option to work from home for all or part of the week [3].
In the UK, according to data released by the Office for National Statistics in February, between September 2022 and January 2023, 16% of the workforce still worked solely from home, while 28% were hybrid workers who split their time between home and the office [4]. Meanwhile, back in 1981, only 1.5% of those in employment reported working mainly from home [5].
The trend is clear. Over the latter part of the 20th century and the early part of the 21st, homeworking increased – not surprising given the advancements in technology over this period – but the increase wasn’t drastic. With Covid, it surged, necessarily, and proved itself functional and convenient enough that there was limited appetite to put it back in the box once the worst of the crisis was over.
The sceptics
Working from home “does not work for younger people, it doesn’t work for those who want to hustle, it doesn’t work in terms of spontaneous idea generation” and “it doesn’t work for culture.” That’s according to JPMorgan Chase CEO Jamie Dimon [6]. People who work from home are “phoning it in” according to Elon Musk [7]. In-person engineers “get more done,” says Mark Zuckerberg, and “nothing can replace the ability to connect, observe, and create with peers that comes from being physically together,” says Disney CEO Bob Iger [8].
Meanwhile, 85% of employees who were working from home in 2021 said they wanted a hybrid approach of both home and office working in future [9]. It seems there’s a clash, then, between the wants of workers and the wants of their employers.
Brian Elliott, who previously led Slack’s Future Forum research consortium and now advises executive teams on flexible work arrangements, puts the disdain for WFH from major CEOs down to “executive nostalgia” [10].
Whatever the cause, and whether merited or not, feelings are strong – on both sides. Jonathan Levav, a Stanford Graduate School of Business professor who co-authored a widely cited paper finding that videoconferencing hampers idea generation, received furious responses from advocates of remote work. “It’s become a religious belief rather than a thoughtful discussion,” he says [11].
In polarised times, it seems every issue becomes black or white and we must each choose a side to buy into dogmatically. Given the divide seems to exist between those at the upper end of the corporate ladder and those below, it’s especially easy for the WFH debate to fall into a form of tribal class warfare.
Part of the issue is that each side can point to studies showing the evident benefits of their own position and the evident flaws in their opponents’. It’s the echo-chamber effect. Some studies show working from home to be more productive. Others show it to be less. Each tribe naturally gravitates to the evidence that best suits its argument. Nuance lies dead on the roadside.
Does WFH benefit productivity?
The jury is still out.
An Owl Labs report on the state of remote work in 2021 found that 90% of respondents working from home said they were at least as productive there as in the office, and 55% said they worked more hours remotely than they had at the office [12].
On the other end of the spectrum, a paper from Stanford economist Nicholas Bloom, which reviewed existing studies on the topic, found that fully remote workforces on average had a reduced productivity of around 10% [13].
Harvard Business School professor Raj Choudhury, looking into government patent officers who could work from anywhere but gathered in-person several times a year, championed a hybrid approach. He found that teams who worked together between 25% and 40% of the time had the most novel work output – better results than those who spent less or more time in the office. Though he said that the in-person gatherings didn’t have to be once a week. Even just a few days each month saw a positive effect [14].
It’s not just about productivity though. Working from home can have a negative impact on career prospects if bosses maintain an executive nostalgia for the old ways of working. Studies show that proximity bias – the idea that being physically near your colleagues is an advantage – persists. A survey of 800 supervisors by the Society for Human Resource Management in 2021 found that 42% said that when assigning tasks, they sometimes forget about remote workers [15].
Similarly, a 2010 study by UC Davis professor Kimberly Elsbach found that when people are seen in the office, even when nothing is known about the quality of their work, they are perceived as more reliable and dependable – and if they are seen off-hours, more committed and dedicated [16].
Other considerations
It’s worth noting other factors outside of productivity that can contribute to the bottom line. As Bloom states, only focusing on productivity is “like saying I’ll never buy a Toyota because a Ferrari will go faster. Well, yes, but it’s a third the price. Fully remote work may be 10% less productive, but if it’s 15% cheaper, it’s actually a very profitable thing to do” [17].
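To make that arithmetic concrete, compare output per unit of cost rather than output alone. The minimal sketch below uses Bloom’s illustrative figures (a 10% output drop against a 15% cost saving); the exact percentages will of course vary by firm and role.

```python
# A minimal sketch of Bloom's cost-vs-productivity point, using his
# illustrative figures: remote output 10% lower, costs 15% lower.
office_output, office_cost = 1.00, 1.00

remote_output = office_output * (1 - 0.10)  # 10% productivity drop
remote_cost = office_cost * (1 - 0.15)      # 15% cost saving

# Output per unit of cost: remote comes out ahead despite lower output.
print(office_output / office_cost)  # 1.000
print(remote_output / remote_cost)  # ~1.059, i.e. ~6% more output per pound spent
```

On these assumptions, a fully remote operation yields roughly 6% more output per pound spent – the essence of Bloom’s Toyota-versus-Ferrari point.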
Other cost-saving benefits of a WFH or hybrid work model include potentially allowing businesses to downsize their office space and save on real estate. The United States Patent and Trademark Office (USPTO) estimated that increases in remote work in 2015 saved it $38.2 million [18].
Minimising the need for commuting also helps ecologically. The USPTO estimates that in 2015 its remote workers drove 84 million fewer miles than if they had been travelling to headquarters, reducing carbon emissions by more than 44,000 tons [19].
A hybrid model
Most businesses now tend to favour a hybrid model. Productivity studies, including Bloom’s, which found the 10% productivity drop from fully remote working, tend to concede there’s little to no difference in productivity between full-time office staff and hybrid workers. Some 47% of American workers prefer to work in a hybrid model [20]. In the UK, it’s 58% [21]. McKinsey’s American Opportunity Survey found that when given the chance to work flexibly, 87% of people take it [22].
However, as Annie Dean, whose title is “head of team anywhere” at software firm Atlassian, notes: “For whatever reason, we keep making where we work the lightning rod, when how we work is the thing that is in crisis” [23].
Choudhury backs this up, saying, “There’s good hybrid – and there’s terrible hybrid” [24]. It’s not so much about the model as the method. Institutions that put the time and effort into ensuring their home and hybrid work systems are well-defined, with room still for discussion, training and brainstorming – all the things that naysayers say are lost to remote working – are likely to thrive.
That said, New Yorker writer Cal Newport points out that firms that have good models in place (what he calls “agile management”) are few and far between. Putting such structures in place is beyond the capability of most organisations. “For those not benefiting from good (“Agile”) management,” he writes, “the physical office is a necessary second-best crutch to help firms get by, because they haven’t gotten around to practising good management” [25].
The future
Major CEOs may want a return to full-time office structures, but a change seems unlikely. You can’t put the genie back in the bottle. Home and hybrid working is popular with employees, especially millennials and Gen Z. As of 2022, millennials were the largest generation in the workforce [26]; their needs matter.
The train is only moving in one direction – no amount of executive nostalgia is going to get it to turn back. It seems a hybrid model is the future, and a healthy enough compromise.
References
[1] https://www.economist.com/special-report/2021/04/08/the-rise-of-working-from-home
[2] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[3] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[4] https://www.theguardian.com/commentisfree/2023/feb/14/working-from-home-revolution-hybrid-working-inequalities
[5] https://wiserd.ac.uk/publication/homeworking-in-the-uk-before-and-during-the-2020-lockdown/
[6] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[7] https://hbr.org/2023/07/tension-is-rising-around-remote-work
[8] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[9] https://www.ons.gov.uk/employmentandlabourmarket/peopleinwork/employmentandemployeetypes/articles/businessandindividualattitudestowardsthefutureofhomeworkinguk/apriltomay2021
[10] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[11] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[12] https://owllabs.com/state-of-remote-work/2021/
[13] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[14] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[15] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[16] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[17] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[18] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement
[19] https://hbr.org/2020/11/our-work-from-anywhere-future#:~:text=Benefits%20and%20Challenges,of%20enhanced%20productivity%20and%20engagement.
[20] https://siepr.stanford.edu/publications/policy-brief/hybrid-future-work#:~:text=Hybrid%20is%20the%20future%20of%20work%20Key%20Takeaways,implications%20of%20how%20and%20when%20employees%20work%20remotely.
[21] https://mycreditsummit.com/work-from-home-statistics/
[22] https://www.mckinsey.com/industries/real-estate/our-insights/americans-are-embracing-flexible-work-and-they-want-more-of-it
[23] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[24] https://www.forbes.com/sites/jenamcgregor/2023/08/19/the-war-over-work-from-home-the-data-ceos-and-workers-need-to-know/
[25] https://www.forbes.com/sites/stevedenning/2023/03/29/why-working-from-home-is-here-to-stay/
[26] https://www.forbes.com/sites/theyec/2023/01/10/whats-the-future-of-remote-work-in-2023/
Introduction
Originally published in 2013, Ichiro Kishimi and Fumitake Koga’s The Courage to be Disliked quickly became a sensation in its authors’ native Japan. Its English-language translation followed suit, with more than 3.5 million copies sold worldwide.
The book is often shelved in the ‘self-help’ category, in large part due to its blandly overpromising subtitle: How to free yourself, change your life and achieve real happiness. In truth it would be better suited to the philosophy or psychology section. The book takes the form of a discussion between a philosopher and an angsty student. The student is unhappy with his life and often with the philosopher himself, while the philosopher is a contented devotee of Adlerian psychology, the key points of which he disseminates to the student over the course of five neatly chunked conversations. His proposed principles offer sound advice for life in general but also prove useful when integrated into a business setting.
Adlerian Psychology
Alfred Adler was an Austrian-born psychotherapist and one of the leading psychological minds of the 20th century. Originally a contemporary and colleague of Freud’s, he soon drifted away from him, and in many ways Adler’s theories can be defined in opposition to those of his former colleague; they are anti-Freudian at their core. Freud firmly believed that our early experiences shape us. Adler held that such sentiments strip us of autonomy in the here and now, seeing Freud’s ideas as a form of determinism. He instead proffers:
No experience is in itself a cause of our success or failure. We do not suffer from the shock of our experiences – the so-called trauma – but instead, we make out of them whatever suits our purposes. We are not determined by our experiences, but the meaning we give them is self-determining.
Essentially, then, the theories are reversed. Adler posits that rather than acting a certain way in the present because of something that happened in their past, people do what they do now because they chose to, and then use their past circumstances to justify the behaviour. Where Freud would make the case that a recluse doesn’t leave the house because of some traumatic childhood event, for example, Adler would argue that instead the recluse has made a decision to not leave the house (or even made it his goal not to do so) and is creating fear and anxiety in order to stay inside.
The argument comes down to aetiology vs teleology. More plainly, assessing something’s cause versus assessing its purpose. Using Adlerian theory, the philosopher in the book tells the student: “At some stage in your life you chose to be unhappy, it’s not because you were born into unhappy circumstances or ended up in an unhappy situation, it’s that you judged the state of being unhappy to be good for you”. He adds, in line with what David Foster Wallace referred to as the narcissism of self-loathing, that: “As long as one continues to use one’s misfortune to one’s advantage in order to be ‘special’, one will always need that misfortune.”
Adler in the workplace: teleology vs aetiology
An example of the difference between these theories in the workplace can be found by examining the sentence: “I cannot work to a high standard at this company because my boss isn’t supportive.” This viewpoint follows the Freudian cause-and-effect notion: your boss is not supportive, therefore you cannot work well. What Adler, and in turn Kishimi and Koga, argue is that you still have a choice to make. You can work well without the support of your boss but are choosing to use their lack of support as an excuse to work poorly (which subconsciously was your aim all along).
This is the most controversial of Adler’s theories for a reason. Readers will no doubt look at the sentence and feel blame being attributed to them. Anyone who has worked with a slovenly or uncaring boss might feel attacked and argue that their manager’s attitude most certainly did affect the quality of their work. But it’s worth engaging with Adler’s view, even if just to disagree with it. Did you work as hard as you could and as well as you could under the circumstances? Or did knowing your boss was poor give you an excuse to grow slovenly too? Did it make you disinclined to give your best?
Another example in the book revolves around a young friend of the philosopher who dreams of becoming a novelist but never completes his work, citing that he’s too busy. The theory the philosopher offers is that the young writer wants to leave open the possibility that he could have been a novelist if he’d tried but he doesn’t want to face the reality that he might produce an inferior piece of writing and face rejection. Far easier to live in the realm of what could have been. He will continue making excuses until he dies because he does not want to allow for the possibility of failure that reality necessitates.
There are many people who don’t pursue careers along similar lines, staunch in the conviction that they could have thrived if only the opportunity had arisen, without ever actively seeking that opportunity themselves. Even within a role it’s possible to shrug off this responsibility, saying that you’d have been better off working in X role in your company if only they had given you a shot, or that you’d be better off in a client-facing position rather than sitting behind a desk doing admin if only someone had spotted your skill sets and made use of them. But without asking for these things, without actively taking steps towards them, who does the responsibility lie with? It’s a hard truth, but a useful one to acknowledge.
Adler in the workplace: All problems are interpersonal relationship problems
Another of the key arguments in the book is that all problems are interpersonal relationship problems. What that means is that our every interaction is defined by the perception we have of ourselves versus the perception we have of whomever we are dealing with. Adler is the man who coined the term “inferiority complex”, and that factors into his thinking here. He spoke of two categories of inferiorities: objective and subjective. Objective inferiorities are things like being shorter than another person or having less money. Subjective inferiorities are those we create in our mind, and make up the vast majority. The good news is that “subjective interpretations can be altered as much as one likes…we are inhabitants of a subjective world.”
Adler is of the opinion that: “A healthy feeling of inferiority is not something that comes from comparing oneself to others; it comes from one’s comparison with one’s ideal self.” He speaks of the need to move from vertical relationships to horizontal ones. Vertical relationships are based in hierarchy. If you define your relationships vertically, you are constantly manoeuvring between interactions with those you deem above you and those you deem below you. When interacting with someone you deem above you on the hierarchical scale, you will automatically adjust your goalposts to be in line with their perceptions rather than defining success or failure on your own terms. As long as you are playing in their lane, you will always fall short. “When one is trying to be oneself, competition will inevitably get in the way.”
Of course, in the workplace we do have hierarchical relationships. There are managers, there are mid-range workers, there are junior workers and so on. The point is not to throw away these titles in pursuit of some newly communistic office environment. Rather, it’s about attitude. If you are a boss, do you receive your underlings’ ideas as if they are your equal? Are you open to them? Or do you presume that your status as “above” automatically means anything they offer is “below”? Similarly, if you are not the boss, are you trying to come up with the best ideas you can, or the ones that you think will be most in line with your boss’s pre-existing convictions? Obviously there’s a balance here – if you solely put forward wacky, irrelevant ideas that aren’t in line with your company’s ethos and have no chance of success then that’s probably not helpful, but within whatever tramlines your industry allows you can certainly get creative and trust your own taste rather than seeking to replicate someone else’s.
Pivotal to this is whether you are willing to be disagreed with and to disagree with others or are more interested in pleasing everyone, with no convictions of your own. This is where the book’s title stems from. As it notes, being disliked by someone “is proof that you are exercising your freedom and living in freedom, and a sign that you are living in accordance with your own principles…when you have gained that courage, your interpersonal relationships will all at once change into things of lightness.”
Adler in the workplace: The separation of tasks
The separation of tasks is pivotal to Adlerian theory and interpersonal relationships. It is how Adler, Kishimi and Koga suggest one avoids falling into the trap of defining oneself by another’s expectations. The question one must ask oneself at all times, they suggest, is: Whose task is this? We must focus solely on our own tasks, not letting anyone else alter them and not trying to alter anyone else’s. This is true both for literal tasks – a piece of work, for example – and for more abstract matters. For example, how you dress is your task. What someone else thinks of how you dress is theirs. Do not make concessions to their notions (or your perceptions of what their notions might be) and do not be affected by what they think, for it is not your task and therefore not yours to control.
This idea that we allow others to get on with their own tasks is crucial to Adler’s belief in how we can live rounded, fulfilling lives. The philosopher argues that the basis of our interpersonal relationships – and as such our own happiness – is confidence. When the student asks how the philosopher defines the “confidence” of which he speaks, he answers:
It is doing without any set conditions whatsoever when believing in others. Even if one does not have sufficient objective grounds for trusting someone, one believes. One believes unconditionally without concerning oneself with such things as security. That is confidence.
This confidence is vital because the book’s ultimate theory is that community lies at the centre of everything. The awareness that “I am of use to someone” allows one to act with confidence in one’s own life, to have confidence in others, and not to be reliant on others’ praise. The reverse is true too. As Kishimi and Koga state, “A person who is obsessed with the desire for recognition does not have any community feeling yet, and has not managed to engage in self-acceptance, confidence in others, or contribution to others.” Once one possesses these things, the need for external recognition will naturally diminish.
For high-level employees, then, it’s important to set a tone in the workplace that allows colleagues to feel that they are of use. But as the book dictates, do not do this through false praise – all that will do is foster a further need for recognition (“Being praised essentially means that one is receiving judgement from another person as ‘good.’”). Instead, foster this atmosphere by trusting them and showing confidence.
The courage to be disliked
The Courage to be Disliked is at odds with many of the accepted wisdoms of the day. The modern cultural milieu suggests that we should at all times be accepting and validating others’ trauma as well as our own. Many may find solace in this approach and find that it suits them best. But there is no one-size-fits-all solution when it comes to fostering a successful workplace, and even less so when it comes to leading a fulfilling life. Anyone who feels confined by the idea that some past event, or some subjective inferiority harboured too long, sets parameters around what they can achieve might look at those interpersonal relationships, find the courage to be disliked, and in doing so hope to find a community that they’re willing to support as much as it supports them. There is no need to be shackled to whatever mythos you’ve internally created.
As the book states: “Your life is not something that someone gives you, but something you choose yourself, and you are the one who decides how you live…No matter what has occurred in your life up to this point, it should have no bearing at all on how you live from now on.”
References
Kishimi, Ichiro & Koga, Fumitake. The Courage to Be Disliked: How to Free Yourself, Change your Life and Achieve Real Happiness. Bolinda Publishing Pty Ltd. 2013.
Introduction
Consider a simple yet profound question: What does your work mean to you? Is it merely a task to be completed, or does it resonate with a deeper purpose in your life?
Viktor Frankl, a prominent Austrian psychiatrist and philosopher, grappled with these very questions, evolving them into a broader exploration of life’s meaning. Drawing from his harrowing experiences in Nazi concentration camps, he developed logotherapy—a form of psychotherapy that centres around the search for meaning and purpose. Through logotherapy, Frankl illuminated the idea that life’s essence can be found not just in joyous moments but also in love, work, and our attitude towards inevitable suffering. This pioneering approach underscores personal responsibility and has offered countless individuals a renewed perspective on fulfilment, even in the face of daunting challenges.
In this piece, we delve into the intricacies of Frankl’s teachings, exploring the symbiotic relationship he identified between work and our quest for meaning.
A Holistic Approach to Life and Work
In his seminal work, ‘Man’s Search for Meaning,’ Viktor Frankl delved deeply into the multifaceted nature of human existence. He eloquently described the myriad pathways through which individuals uncover meaning. For Frankl, while work or ‘doing’ is undoubtedly a significant avenue for deriving meaning, it isn’t the only one. He emphasised the value of love, relationships, and our responses to inevitable suffering. Through this lens, he offered a panoramic view of life, advocating for a holistic perspective where meaning is not strictly tethered to our work but is intricately woven through all our experiences and interactions.
Progressing in his exploration, Frankl sounded a note of caution about the perils of letting work become an all-consuming end in itself. He drew attention to the risks of burnout and existential exhaustion when one’s sense of purpose is confined solely to one’s occupation or the relentless chase for wealth. To Frankl, an overemphasis on materialistic achievements could inadvertently lead individuals into what he termed an ‘existential vacuum’ – a state where life seems starkly devoid of purpose. He argued that in our quest for success, we must continually seek a deeper, more intrinsic purpose. Otherwise, we risk being blinded to the profound significance and richness life holds beyond material gains.
Delving deeper into the realm of employment, Frankl confronted the psychological and existential challenges of unemployment. He noted that without the inherent structure and purpose provided by work, many individuals grapple with a profound sense of meaninglessness. This emotional and existential void often manifests in a diminishing sense of significance towards time, leading to dwindling motivation to engage wholeheartedly with the world. The ‘existential vacuum’ emerges again, casting its shadow and enveloping individuals in feelings of purposelessness.
Yet, Frankl’s observations were not merely confined to the challenges. He beautifully illuminated the resilience and fortitude of certain individuals, even in the face of unemployment. He showcased how, instead of linking paid work directly with purpose, some found profound meaning in alternative avenues such as volunteer work, creative arts, education, and community participation.
Frankl firmly believed that the essence of life’s meaning often lies outside the traditional realms of employment. To drive home this perspective, he recounted poignant stories, such as that of a desolate young man who unearthed profound purpose and reaffirmed his belief in his intrinsic value by preventing a distressed girl from taking her life. Such acts, as illustrated by Frankl, highlight the boundless potential for a meaningful existence, often discovered in genuine moments of human connection.
Work as an Avenue for Meaning and Identity
Viktor Frankl’s discourse on work transcended the common notions of duty and obligation. For him, work was more than a mere means to an end; it was a potent avenue to unearth meaning and articulate one’s identity. Frankl posited that when individuals align their work with their intrinsic identity—encompassing all its nuances and dimensions—they move beyond merely working to make a living. Instead, they find themselves working with a purpose.
This profound idea stems from his unwavering belief that our work provides us with a unique opportunity. Through it, we can harness our individual strengths and talents, channelling them to create a meaningful and lasting impact on the world around us.
In line with modern philosophical thought, which views work as a primary canvas for self-expression and self-realisation, Frankl also recognised its significance. He believed that work could serve as a pure channel, finely tuned to our unique skills, passions, and aspirations. This deep sense of accomplishment and fulfilment from one’s chosen profession, he asserted, is invaluable. However, Frankl also emphasised the importance of seeing the broader picture. While careers undeniably play a significant role in our lives, they are but a single facet in our ongoing quest for meaning.
Frankl reminds us that while our careers are integral to our lives, the quest for meaning isn’t imprisoned within their boundaries. He believed the core of true meaning emerges from our deep relationships, our natural capacity for empathy, and our virtues. These treasures of life, he asserted, can be manifested both within the confines of our workplace and beyond.
The True Measure of Meaning Through Work
For Viktor Frankl, our professional lives brim with potential for fulfilment. Yet, fulfilment wasn’t solely defined by accolades. Instead, it was about aligning our work with our deepest values and desires. It wasn’t just the milestones that mattered but how they resonated with our core beliefs.
Frankl’s logotherapy reshapes our perception of work, emphasising that even mundane tasks can hold significance when approached with intent. With the right mindset, every job becomes a step in our journey for meaning.
In Frankl’s writings, he weaves together tales of profound significance—a young man’s transformative act of kindness, a narrative not strictly tethered to work’s traditional realm. Yet, these stories anchor a timeless truth: In every endeavour, whether grand or humble, lies the potential for unparalleled meaning. Here, work isn’t just about designated roles—it becomes an evocative stage where profound moments play out. Beyond job titles and tasks, the depth, sincerity, and fervour we infuse into each act truly capture the essence of meaningful work.
Finding Fulfilment in Every Facet
Viktor Frankl’s profound insights into the human pursuit of meaning provide a distinctive lens through which we can evaluate both our daily tasks and life’s most pivotal moments. Through his exploration—whether addressing the ordinariness of daily life or the extremities of crisis—Frankl illuminated the profound interconnectedness of work and personal identity. He posited that our professions, while significant, are fragments of a vast tapestry that constitute human existence.
Navigating the journey of life requires continual adjustments to our perceptions of success and meaning. While our careers and professional achievements are significant, true fulfilment goes beyond these confines. It’s woven into our human experiences, the bonds we nurture, the challenges we face, and the joys we hold dear.
Frankl’s pioneering work in logotherapy urges us to approach life with intention and purpose. He beckons us to see the value in every moment, task, and human connection. As we delve into our careers and strive for success, aligning not just with outward accomplishments but with the very essence of who we are is vital.
Introduction
Humans have always been fascinated by the future. Prior to the era of computers and data, we sought insights from the stars, dreams, and even animal behaviour. The tale of the Delphic Oracle is etched in this tapestry of human curiosity. A simple goat herder named Coretas reportedly stumbled upon a fissure in the earth, releasing ethereal vapours. Drawn by these mysterious emissions, he perceived glimpses of the future. This mystical spot soon became legendary. Word spread and people from distant lands journeyed here, drawn by the allure of prophecy. They came eager to hear the visions of the future, as interpreted by the chosen Pythia, a maiden who acted as the mouthpiece of Apollo. From mystical vapours to celestial patterns, humanity’s thirst for understanding tomorrow has perpetually pushed us to evolve our tools and methods, seeking ever-more sophisticated ways to peer into the future.
Throughout history, cultures around the globe have relied on a myriad of tools for forecasting the future. The Mayans, for instance, constructed elaborate calendars, meticulously tracking celestial bodies. Chinese sages consulted the I Ching, a revered text blending both philosophy and prediction. During the Renaissance, figures like Nostradamus peered at the cosmos, firmly believing that the stars unveiled the secrets of events yet to unfold. Meanwhile, in their endless pursuit of the Philosopher’s Stone, alchemists hoped that their transformative experiments might also provide windows into future events. As the sands of time flowed, the rigours of science began to play an increasingly pivotal role in this age-old quest. Meteorologists harnessed accumulated data to forecast weather patterns, while demographers, attuned to shifts in population dynamics, used their insights to anticipate future demographic shifts.
Predictive analytics
Now, in this age, we’re navigating through a golden era of prediction. Computers, hailed as our contemporary oracles, delve into vast data lakes. With the aid of intricate algorithms and machine learning, they furnish insights about potential future events. Though technologically advanced, these modern tools have a mystique reminiscent of ancient methods. Indeed, their exceptional abilities often blur the lines between the arcane and the technological.
Even though the settings have changed—with glass skyscrapers replacing ancient temples—our innate desire to predict the future remains unwavering. We’ve shifted from seeking guidance from oracles to heeding the insights of modern-day experts: economists, scientists, and statisticians. The unpredictable nuances of geopolitics and the intricate web of global economies underscore the challenges of forecasting. Despite our technological advances, no tool or expert can perfectly predict outcomes, as emphasised by the renowned financier Peter Lynch: “You never can predict the economy. You can’t predict the stock market.”
It’s against this backdrop of prediction challenges that Philip Tetlock’s work shines prominently. Over decades, Tetlock undertook the meticulous task of analysing millions of predictions, unravelling the intricacies of human foresight. He identified the ‘superforecasters’, a rare group that consistently demonstrated superior predictive abilities.
Superforecasters
Superforecasters stand apart from their peers, not simply through the accuracy of their predictions, but through their unique way of understanding and working with probabilities. Instead of confining themselves to somewhat nebulous terms like ‘likely’ or ‘certain’, they delve into a world of precision, where small differences matter. They employ an almost artistic attention to detail, carving out distinctions in probability estimates that most would overlook.
What’s noteworthy isn’t simply that they can perceive a difference between a 56% and a 57% probability, but the mindset this precision reflects. It speaks to a meticulousness and diligence that’s often lacking in forecasting. This ability to finely calibrate their predictions sets them apart, transforming forecasting from a vague art into a refined science.
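That precision is also measurable. Tetlock’s forecasting tournaments scored participants with Brier scores – the mean squared difference between stated probabilities and eventual outcomes – and the minimal sketch below, using entirely hypothetical forecasts, shows how committed, well-calibrated estimates beat perpetual hedging at 50%.

```python
# A minimal sketch of Brier scoring, the accuracy measure used in
# Tetlock's forecasting tournaments. Lower scores are better.
def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if the event happened, else 0."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecasts over four questions, three of which came true.
hedger = [0.5, 0.5, 0.5, 0.5]      # vague: everything is a coin flip
calibrated = [0.9, 0.8, 0.2, 0.7]  # precise, committed estimates
outcomes = [1, 1, 0, 1]

print(brier_score(hedger, outcomes))      # 0.25
print(brier_score(calibrated, outcomes))  # 0.045 -- precision pays off
```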
However, this is but one facet of their skills. Superforecasters also excel at dynamically updating their forecasts as new information comes to light, demonstrating humility in acknowledging and learning from their errors, and cultivating a probabilistic thinking mindset. Taken together, these skills contribute to their exceptional track record in the challenging realm of prediction.
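The updating habit lends itself to a similar sketch. One common way to formalise it is Bayes’ rule; the figures below are hypothetical, and superforecasters of course apply this kind of reasoning intuitively rather than in code, shifting estimates in proportion to the strength of new evidence.

```python
# A minimal sketch of Bayesian updating, one way to formalise the
# "revise as new information arrives" habit. All figures are hypothetical.
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of an event after one piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

belief = 0.56  # initial estimate that some geopolitical event will occur

# News arrives that is twice as likely in worlds where the event is coming:
belief = update(belief, p_evidence_if_true=0.6, p_evidence_if_false=0.3)
print(round(belief, 2))  # ~0.72 -- a measured shift, not a lurch to certainty
```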
In the aftermath of the Iraq war, where intelligence missteps around weapons of mass destruction became evident, the US intelligence community sought Tetlock’s expertise. His findings were detailed in his book, “Superforecasting: The Art and Science of Prediction”, serving as an invaluable guide for anyone looking to refine their forecasting skills.
Beyond the realm of politics and global affairs, the implications of Tetlock’s research are profound. His techniques offer practical applications in diverse arenas, from deciphering economic trends to pivotal personal decisions, such as evaluating career trajectories or the potential of a business venture.
Ethical & economic challenges
Yet, while Tetlock’s findings are ground-breaking, they’re not infallible. Even the best predictions are fraught with uncertainties. As we harness these insights, it’s vital to maintain a balanced approach, merging strong convictions with a healthy dose of caution.
By integrating Tetlock’s teachings, we can achieve a heightened awareness of our cognitive biases, enabling more informed decisions and, potentially, a brighter future.
While predictive tools offer remarkable insights, overreliance on them introduces both ethical and economic challenges. Ethically, leaning too heavily on predictions can erode our adaptability and critical thinking, luring us into a false sense of security. Economically, this complacency can result in missed opportunities or misguided strategies. Just as a roll of the dice is inherently unpredictable, so too are complex systems like economies. They’re influenced by countless variables, making them vulnerable to unexpected twists and turns. Predictions, while valuable, are best used as guiding lights and not as absolute certainties. After all, at their core, they’re imbued with an intrinsic element of unpredictability.
In the realm of forecasting, we find that with great predictive power comes great responsibility — and the inevitable debate over who truly holds the crystal ball. The craft, while teeming with potential, is not without its boundaries and ethical dilemmas. Foretelling the future transcends the realms of science and art; it’s a weighty task that beckons us to navigate with zeal and caution. Here lies our unparalleled chance to influence humanity’s trajectory, yet we must remember to gracefully balance our acquired wisdom against the vast, ever-present unknowns.
Conclusion
As we conclude, we’re reminded of the timeless rhythm of humanity’s quest: from the ethereal mists of the Delphic Oracle to the digital pulses of algorithms. This cyclical endeavour to decipher tomorrow underscores our unyielding curiosity, a reflection of our innate need to foresee, understand, prepare, and connect with the uncertain embrace of the future.