Introduction

“The world would be happier if men had the same capacity to be silent that they have to speak.” – Baruch Spinoza

We fear silence. Perhaps not in the same way we fear heights or snakes or *insert debilitating personal phobia here*. Silence does not tend to bring about bodily shakes, dizzy spells or stultifying ramifications of any kind. Yet it is something we frequently work to avoid, now more so than ever.

All one has to do is look around a busy train carriage to see this fear being played out, invariably in the most mundane of ways. Gathered commuters bristle overcoat to overcoat. Some listen to music or podcasts. Others scroll dimly through their phones. The majority half-heartedly do both. In an age of consumption, where our duty is to swallow content reliably, if more often than not indifferently, silence, even in the briefest of doses, has become something with which we are unable to contend.

Of course it’s spilled into the workplace too. Silence is frowned upon. If you are in a meeting, you should be speaking, or at the very least have something to say. In keeping with the indefatigable antagonism of social media, you should have an opinion – a “take”. It’s recommended that you have at least one of these for every subject and are willing, ideally desperate, to express it. Because if you don’t have something to say, why are you in the meeting at all?

“X didn’t add anything to the discussion today, did you notice?” It’s often said (and still more often thought) after a meeting. Someone was not pulling their weight, did not contribute as expected. What it means is they didn’t speak enough. As if value were about quantity rather than quality. Many a worker has ascended the corporate ladder by being nothing more than a vocaliser, demonstrating a willingness to express opinions, unhelpful or outright irrelevant as they may be, with confidence and frequency.

Of course neither being a professional loudmouth nor a professional church mouse is the ideal state of play. And an advocacy of silence does not mean being silent all the time. One should most certainly express opinions when one has them. But silence, already such a rarity, needs more uptake. The amount one says may have become the de facto measure of contribution, but it’s a false economy. It is far more beneficial to have a worker who understands the power of silence, and knows how to wield it well.

The power of silence in…meetings

Obviously how a meeting plays out is dependent on many factors: the parties involved, the topic of discussion, the format and formality. In some cases it may be that you have to speak first and for a prolonged period, due to your role at your company or expertise in whatever is being discussed. But when that’s not the case, there’s a great deal to be said for starting from a quiet place.

In letting others speak first, you learn their priorities and their potential doubts (which you can then work to alleviate), and you get a sense of their demeanour or disposition. When it’s your turn to speak, you can then tailor your points and tone accordingly. How you deal with someone open and relaxed will be different to how you deal with someone fixed and uncompromising, for example. The ability to tailor your approach and play the hand you’re dealt is invaluable.

It should be noted that in the context of a meeting, choosing silence does not have to be some grand gesture. It can consist of as little as giving a few seconds after someone has spoken for their point to land. As well as showing respect for their point – giving it time to be considered rather than jumping in right away and giving the impression that all you’ve been doing is waiting for your turn to speak – it lets you process what they’ve said and respond with considered insight. Too often meetings consist entirely of people who want to make points, none of whom are willing to listen to anyone else’s.

Writing in Harvard Business Review, Allison Shapira, founder/CEO of Global Public Speaking and a professor at the Harvard Kennedy School, suggests some tools for knowing when it’s right to speak in a meeting and when it’s best to choose silence.

Before entering any meeting she suggests you write up bullet points of things you feel you really want to say – not waffle that you feel will validate your being there, but actual points you think are important – as well as asking “why me?” By this she means: why do you care about any of this, given your role, your organisation and so on? Answering that question adds to your sense of purpose and confidence, as well as reminding you that your worth in the meeting comes from your passion and experience, not your word count.

Shapira suggests not speaking if you’re only doing so to show off, either about how much you know or about how willing you are to be part of the room’s vocal contingent. Nor should you speak if you’re doing so only to empower others. Empowering others may sound like a positive, but by stepping up to be the group’s speaker, even if you then delegate conversation to others, serving as a sort of intermediary, you become a crutch for them. It may be helpful in the moment, but not in the long term. Finally, before speaking, ask yourself whether what you’re going to say might be better held back for a one-on-one conversation. There’s nothing to be gained by airing dirty laundry in public, nor by wasting twenty minutes of a whole group’s time talking about something that concerns only one of them.

It’s important to be vigilant about these things. As the old saying goes, “Most of us know how to say nothing; few of us know when” [1].

The power of silence in…leadership

Leaders, in particular, need to pay attention to how they are using silence, or more often failing to. Research conducted by Leigh Plunkett Tost of the University of Washington, Francesca Gino of Harvard Business School, and Richard P. Larrick of Duke University into the relationship between power and leadership found that, “Members of teams with high-power leaders are likely to keep quiet in meetings, both because high-power leaders talk a lot, meaning there’s not much time for others to talk, and because of the perception – fair or not – that powerful people aren’t interested in anyone else’s ideas” [2].

That perception could be wrong, but if your employees think it’s true – and so hold back on sharing ideas as a result – then the company will suffer all the same. Managers may not be aware of the power differential between themselves and their employees, or the impact it has on what employees are willing to say to them. As such, a manager may announce a decision and then assume from their employees’ silence that they are happy with it (because if not, surely they would speak up?). In actuality, the employees may simply see no point in saying anything because they think the boss has already made up their mind. As Kate Donovan, founder of US-based consultancy Equal Pay Negotiation, points out, “That’s a very dangerous difference” [3].

To get a true idea of what their employees think about what they’ve said, a leader should ask their team, “What’s your initial reaction to that idea?” as a starting point, opening the floor for comment without leading the witness.

The power of silence in…speaking

Strange as it may sound, silence is also one of the most valuable tools in our arsenal when speaking. Matthew MacLachlan, from the language and soft skills training provider Learnlight, has some tips for how to use silence in public performance: “Before starting, look at the audience and be silent for a moment because that says, ‘I’m in control. I know what I’m doing. I’m confident.’” [4] Not only that, but it garners more attention for the points you’re making. “Silence makes us nervous,” MacLachlan adds, “our instinctive reaction is that we’d better pay attention, there’s something going on here.”

Ginny Radmall, speaking coach and director of The Ivy Way, is also a proponent of the power of silence in speaking [5]. She notes how we use filler words such as “um” and “ah” to replace silence. Not only do such words lend a sense of uncertainty and lack of confidence, they also interrupt our breathing rhythm, so vital to speaking well. Overlong sentences, too, detract from impact. Silence works as an emphasis. Watch a few minutes of Steve Jobs or Barack Obama delivering a speech. Rather than letting words tumble out in one long stream that the audience must then fish through to find what’s important, silence shows listeners where their attention should be. When speaking, you want to make life as easy as possible for those listening to you. Silence helps.

The power of silence in…negotiation

Research conducted at the University of Groningen in the Netherlands in both Dutch and English found that when a silence in conversation stretched to four seconds, people started to feel unsettled [6]. In contrast, a separate study of business meetings found that Japanese people were happy with silences of 8.2 seconds – twice as long as English speakers [7].

Unsurprisingly, that is a fact global businesspeople are aware of and attempt to use to their advantage. MacLachlan notes how, “Chinese negotiators are very, very aware that Americans like to fill silences and they are trained to stay silent and impassive because that will make the Americans uncomfortable and possibly make concessions without the Chinese having to do anything” [8]. Silence is golden – and gold is worth a lot of money.

Donal Carbaugh, a professor of communication at the University of Massachusetts Amherst, points out that Finns, too, consider silence to be a vital virtue. They are happy to sit in studied thoughtfulness. “No-one is saying anything but everybody’s thinking. They are engaged. The frame around silence at that point can be very positive,” he says [9].

But of course nationality should not be the sole determinant of whether one can use silence to one’s advantage. Whether we’re borrowing from the Chinese, Japanese, Finnish or whoever it may be, any of us can adopt a less talkative, more considered approach. It won’t feel natural at first. As with anything, it takes practice.

Success in silence

In a world overspilling with noise and data, silence is a rarity. But if utilised, it can offer us benefits in life and business. We negotiate with more authority, we learn to listen and engage with what’s being said to us rather than just waiting for our turn to talk, and we speak with greater clarity and emphasis. That’s not to say we must keep tight-lipped on all our thoughts or feelings; we should express anything important to us. But to avoid getting drowned out by the noise, it may be worth cutting out some of the waffle.

If you really want to feel the impact of silence, check out John Cage’s famous “music” piece 4′33″.

More on Silence

The benefits of silence in our professional lives article by Shay Dalton

Introverts, extroverts and leadership podcast with Karl Moore

References

[1] https://www.forbes.com/quotes/556/

[2] https://hbswk.hbs.edu/item/when-power-makes-others-speechless-the-negative-impact-of-leader-power-on-team-performance

[3] https://www.bbc.com/worklife/article/20170718-the-subtle-power-of-uncomfortable-silences

[4] https://www.bbc.com/worklife/article/20170718-the-subtle-power-of-uncomfortable-silences

[5] https://www.linkedin.com/pulse/how-leverage-power-silence-when-pitching-your-business-ginny-radmall/

[6] https://www.rug.nl/staff/n.koudenburg/koudenburgetal.2011.pdf

[7] http://commons.emich.edu/cgi/viewcontent.cgi?article=1052&context=gabc

[8] https://www.bbc.com/worklife/article/20170718-the-subtle-power-of-uncomfortable-silences

[9] https://www.bbc.com/worklife/article/20170718-the-subtle-power-of-uncomfortable-silences

Introduction

In today’s rapidly evolving business environment, the traditional norms of leadership need to be revised. The boardroom, a focal point for strategic decision-making, requires a fresh leadership approach that underscores authenticity, self-awareness, and a careful equilibrium between challenging and supporting CEOs.

Identifying Authenticity Gaps

While gauging a leader’s authenticity and self-awareness isn’t an exact science, certain tools help shine a light on these essential traits. One way to measure authenticity is to assess the alignment between a leader’s self-identified values and their colleagues’ perceptions. When these do not overlap sufficiently, it could indicate a disparity between a leader’s intention and others’ perception, revealing potential authenticity gaps.

Similarly, self-awareness can be evaluated by contrasting a leader’s understanding of their strengths and weaknesses against feedback from their peers. This contrast fosters critical conversations about the role of self-awareness in potential CEOs and how a self-aware leader’s strengths and weaknesses might serve the company’s strategic needs.
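To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how such a value-alignment check could be quantified. The value lists, the Jaccard overlap measure, and the 0.5 threshold are assumptions for demonstration only, not part of any published assessment instrument.

# Hypothetical sketch: quantify an "authenticity gap" as the mismatch between
# the values a leader names for themselves and the values colleagues attribute
# to them, using simple set overlap (Jaccard index).

def authenticity_gap(self_identified, perceived):
    """Return 1 - Jaccard overlap: 0.0 = full alignment, 1.0 = no overlap."""
    if not self_identified and not perceived:
        return 0.0
    overlap = len(self_identified & perceived) / len(self_identified | perceived)
    return 1.0 - overlap

leader_values = {"transparency", "collaboration", "innovation", "integrity"}
colleague_view = {"decisiveness", "innovation", "ambition", "integrity"}

gap = authenticity_gap(leader_values, colleague_view)
print(f"Authenticity gap: {gap:.2f}")  # 1 - 2/6, roughly 0.67, with these inputs
if gap > 0.5:  # illustrative threshold for prompting a conversation
    print("Intention and perception diverge; worth a structured feedback discussion.")

The arithmetic matters far less than the conversation it prompts: a large gap simply signals that the values a leader believes they project are not the ones colleagues report seeing.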

Key Qualitative Attributes

Our work with boards, CEOs, and C-suite teams across various industries gives us a first-hand view of the evolving definitions of effective leadership. It’s becoming increasingly clear that quantifiable metrics don’t solely determine a leader’s success. Instead, success lies in the qualitative attributes behind it: a leader’s behaviour, their ability to build teams and develop talent, and fundamentally, who they are—not merely the numbers they produce.


There are powerful psychological foundations behind this shift in leadership paradigms. The Big Five personality traits—Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism—have been found to correlate with leadership effectiveness (Judge et al., 2002). In a meta-analysis of 222 correlations from 73 samples, Extraversion emerged as the most consistent correlate of leadership across various settings and leadership criteria, while Neuroticism showed a negative correlation. Overall, the Big Five traits had a multiple correlation of .48 with leadership, providing strong support for the leader trait perspective when traits are organised according to this model (Judge et al., 2002). Leaders scoring high in conscientiousness and openness often exhibit heightened self-awareness and authenticity, underlining the value of incorporating these psychological elements into leadership evaluations.

The Role of the Boardroom

However, the success of this new leadership approach largely hinges on the practices adopted within the boardroom. Boards must expand beyond the conventional focus on governance compliance, cultivating an environment that encourages performance excellence. This strategy rests on diverse leadership styles, effective board structures, stakeholder engagement, and fostering a positive organisational culture. Central to this is the board’s ability to model ethical behaviour, uphold core values, and promote equality, diversity, and inclusion.


Moreover, soft skills, such as empathetic listening, clear communication, and emotional intelligence, emerge as vital elements in this context. Assessing these skills through methods like board process simulations can be particularly beneficial. These simulations mimic high-pressure environments, enabling the development and refining of these essential soft skills.

A critical aspect of the board’s role is striking a balance between challenging and supporting CEOs. This dynamic greatly influences the company’s overall performance. Boards must ensure optimal decision-making and performance while also providing a supportive environment for CEOs who often face high-stress roles. Yet, care must be taken to avoid falling into ‘support’ or ‘challenge’ traps. Cognitive biases can lead to overemphasising CEOs’ successes or difficulties based on initial perceptions, often creating a negative cycle of escalating tension and deteriorating performance.

Conclusion

In conclusion, the complexities of modern leadership necessitate a shift away from traditional boardroom practices. Embracing an approach centred on authenticity, self-awareness, and balanced dynamics between the board and CEOs can foster better conversations, higher-quality decisions, and stronger organisational foundations. As we continue grappling with an unpredictable world, it’s critical that our leadership frameworks evolve in tandem, ensuring a more effective and modern boardroom.

More on Authentic Leadership

Emotional Intelligence and Engaging Others

Leadership in Focus: Foundations and the Path Forward

Reference:

Judge, T. A., et al. (2002). Personality and leadership: a qualitative and quantitative review. Journal of Applied Psychology, 87(4), 765–780. https://doi.org/10.1037/0021-9010.87.4.765

Natural Sciences: The World Through Objective Lenses

The field of natural sciences, including disciplines like biology, physics, and chemistry, is celebrated for its precision and objectivity. One of the major strengths of the natural sciences is their reliance on empirical methodologies. These methods, grounded in direct observation or experiment, aim to unravel the laws of the natural world in a consistent, replicable manner. This consistency lends itself to predictions, allowing us to anticipate outcomes based on previous observations. For instance, if one understands the principles of gravity, one can predict the behaviour of objects in free fall.

However, while the rigour of the natural sciences is commendable, it is not without its limitations. The very objectivity that stands as its strength can sometimes limit its scope. The natural sciences primarily seek quantifiable results, which might exclude phenomena that are less tangible or not immediately observable.

Social Sciences: Decoding Human Complexity

Conversely, social sciences, spanning disciplines such as sociology, anthropology, and psychology, delve into the intricate realms of human behaviour and societies. Their strength lies in understanding, interpreting, and sometimes predicting human actions, reactions, and interactions. The social sciences are best equipped to understand the nuances of culture, tradition, and the individual psyche.

Social scientists often employ qualitative research methods, which enable them to explore intricate human emotions, motivations, and behaviours that might not be easily quantifiable. For instance, while a natural scientist can tell us how the human brain reacts to certain stimuli, a social scientist might explain why a certain stimulus is perceived as positive in one culture and negative in another. But just as with natural sciences, the strengths of the social sciences can sometimes be their Achilles’ heel. While offering rich insights, the deep dive into human behaviour and societies can sometimes lack the objective rigour that characterises the natural sciences. The very subjectivity that provides depth can also lead to biases or interpretations that may not be universally applicable.

Bridging the Divide

While the differences between the natural and social sciences are clear, it’s essential to understand that they are two sides of the same coin. The natural sciences provide us with a broad understanding of our world, explaining the ‘how’ behind phenomena. In contrast, the social sciences provide context, delving into the ‘why’ behind human actions and interactions.

To get a holistic understanding of our world, it is not a matter of choosing one over the other but of recognising the value in both. For instance, addressing global challenges such as climate change requires both empirical data from natural scientists and insights into human behaviour from social scientists.

Interdisciplinary Collaboration: A New Dawn

The crux of modern challenges lies in their multidimensionality. Consider public health issues such as the recent global pandemic. Understanding the virus’s biology (a realm of the natural sciences) is as crucial as understanding human behaviour, societal dynamics, and the cultural implications of interventions (all territories of the social sciences). Such complex issues cannot be effectively addressed without a harmonious collaboration between the two domains.

Advantages of Collaboration

  1. Holistic Solutions: Combining the methodologies and findings from both natural and social sciences can lead to comprehensive solutions that account for both the physical and socio-cultural dimensions of a problem. For instance, in environmental conservation, biological insights into an ecosystem can be enriched by anthropological knowledge about the indigenous communities living there.
  2. Innovation Through Integration: Often, ground-breaking discoveries are made at the intersection of disciplines. By understanding and integrating principles from both natural and social sciences, we can pioneer innovative solutions that wouldn’t be conceivable within the confines of a single discipline.
  3. Greater Societal Impact: Recommendations backed by empirical data and socio-cultural insights will likely be more accepted and impactful. Policies, interventions, or solutions that consider both the scientific facts and the human element tend to be more effective and sustainable.

Challenges in Collaboration

However, the path to effective interdisciplinary collaboration isn’t without hurdles.

  1. Differing Methodologies: As previously highlighted, natural sciences primarily employ empirical methods, emphasising quantifiable data and experiments. In contrast, social sciences often lean towards qualitative approaches, focusing on in-depth observations and interviews. Finding a common ground where both sets of methodologies are respected and integrated can be challenging. For instance, imagine a research team tackling an environmental issue. Natural scientists may conduct controlled experiments to measure the impact of pollution on a specific species. In contrast, social scientists may engage in ethnographic studies to understand how the affected community perceives and responds to these changes. Bridging these distinct approaches requires thoughtful coordination and compromise.
  2. Communication Barriers: The terms, words, and fundamental concepts used in these two fields can be quite different. Effective collaboration means bridging this communication gap, which often takes more time and effort. Imagine a biologist and a sociologist teaming up to study how urbanisation affects a local ecosystem. The biologist may use technical language to explain ecological processes, while the sociologist relies on social science terminology. They must clarify their terms and understand each other’s language to work together smoothly.
  3. Institutional Hurdles: Traditional academic and research institutions tend to organise themselves into separate natural and social sciences departments. Encouraging interdisciplinary research may mean changing established academic traditions and structures. Picture a university where departments neatly separate the natural and social sciences. When researchers from these two worlds want to collaborate, they might encounter resistance within the institution. A commitment to breaking down these barriers and creating an environment that supports interdisciplinary work is needed to overcome these obstacles.

Conclusion: Embracing the Future of Interdisciplinary Research

With their empirical precision, the natural sciences have irrefutably made an indelible mark on humanity, laying the foundations for many of our advancements, innovations, and discoveries. These sciences give us a clearer understanding of the physical world, from the infinitesimally small components of an atom to the vastness of the universe. They offer predictions and analyses that have, over centuries, revolutionised our very existence. However, in pursuing knowledge and understanding, we must not lose sight of the vital importance of the social sciences. The social sciences provide us with crucial insights into the human psyche, our cultures, and the intricate tapestry of societies and their evolutions. They arm us with the tools to imagine alternative futures and understand the profound impact of technological advancements on society, as was evident during the rise of steam power and its transformative effects on the world of work and leisure.

Furthermore, the social sciences are instrumental in public health, education, and societal well-being. For example, by examining our eating habits in the context of our environment, social scientists enable us to craft more effective, tailored health interventions. The influence of the social sciences also extends to education, where understanding students’ perspectives leads to more effective schooling practices.

Moreover, in an age of digital transformation, the social sciences stand guard over our democracies, examining the shifts from traditional media to digital platforms. They ensure that despite democratising information dissemination, critical analysis remains at the forefront, safeguarding our societies against misinformation.

Importantly, social sciences challenge our worldviews, offering fresh perspectives on topics ranging from feminism and ecology to broader societal movements. They encourage us to critically engage with our surroundings, whether it’s a museum visit or an online chat, fostering a deeper appreciation and understanding of our global community.

In essence, while the natural sciences provide invaluable insights into the ‘how’ of our world, the social sciences delve into the ‘why’. The tapestry of human knowledge is woven with threads from both domains and to sideline one would be to deny ourselves a holistic understanding of our existence.

As individuals, we can foster collaboration by actively seeking interdisciplinary work opportunities within our fields or professions. By embracing the complementary strengths of the natural and social sciences, we contribute to a more informed, inclusive, and resilient global society better equipped to address the multifaceted challenges of our times.

More on Collaboration

Synergy Over Solo: Navigating the Collaborative Future of Business article by Shay Dalton

Collaboration: The Common Thread in Art, Science, and Business Success article by Shay Dalton

The Power of Team Clusters: A People-Centric Approach to Innovation article by Shay Dalton

1. Modern Leadership: Bridging Tradition and Innovation

Tokyo, a city where centuries-old temples stand alongside cutting-edge skyscrapers, exemplifies the merging of tradition with innovation. It paints a vivid picture of today’s leadership paradigm, where the challenge is to preserve age-old wisdom while embracing the agility demanded by modern times.

Take the example of Indra Nooyi, the former CEO of PepsiCo. Her approach was not just anchored in advanced business strategies but was deeply influenced by her roots and traditional values. By penning personal notes to the parents of her executives, Nooyi demonstrated a unique synthesis of cultural respect and contemporary leadership—suggesting that the two aren’t mutually exclusive but can indeed complement each other.

Now, more than ever, leadership encompasses a broader range of skills and qualities. Cross-cultural understanding, for instance, has emerged as a pivotal asset. It’s not just about an American entrepreneur being fluent in Mandarin but understanding and navigating the nuances of global markets, appreciating cultural subtleties, and forging meaningful partnerships across borders.

Ethical leadership is another domain gaining prominence. Companies like Patagonia, formerly led by visionaries like Rose Marcario, have shown that responsible governance isn’t just about ticking corporate responsibility boxes. In fact, Patagonia has committed to donating 1% of its total sales to environmental organisations through its “1% for the Planet” initiative, amounting to over $89 million in donations since the program’s inception. This move is a testament to genuinely embedding sustainability and transparency into the core business strategy, setting a gold standard for other enterprises to emulate.

In places of innovation like Silicon Valley, the very definition of leadership is evolving. It’s not confined to boardrooms or dictated by tenure. Here, a brilliant idea can propel a young developer into a leadership position, proving that age is becoming less of a determinant. Instead, adaptability, innovative thinking, and a relentless drive are becoming the hallmarks of modern leaders.

This shift in leadership dynamics extends beyond the corporate sphere and into global governance. While individual leaders may have their strategies and legacies debated, certain qualities are universally revered. Steadfastness, principled decision-making, and genuine empathy are essential traits for effective leadership in our interconnected age.

In today’s organisational landscape, leadership is omnipresent, transcending hierarchies. Firms like Google underscore this, promoting a culture where leadership emerges from collaborative efforts, proactive initiatives, and shared responsibilities. As the business world becomes increasingly complex, understanding and adopting these multifaceted leadership approaches isn’t just commendable; it’s imperative for sustainable success.

2. Leadership: A Blend of Nature, Nurture, and Adaptation

In every organisation, each individual brings unique skills and perspectives. While each member’s contribution is vital, the leader, much like a conductor, brings together these diverse talents to create a cohesive and effective outcome. Today’s leaders harness their natural abilities and continually refine and develop new skills to lead effectively.

At its core, leadership is a synergy of inherent traits and cultivated abilities. Determination, decisiveness, and vision may be innate for many, but skills such as emotional intelligence underline the constant evolution and adaptation that the modern leadership landscape demands. The journey of Ratan Tata, who transformed the Tata Group into a global conglomerate, exemplifies this balance. His leadership displayed a mix of inherited business acumen and learned skills, showcasing the essential interplay of nature and nurture in leadership.

In our fast-changing corporate world, leaning solely on inherent strengths or past achievements doesn’t suffice. Leaders like Isabelle Kocher, the former CEO of Engie, one of the world’s largest utility companies, recognised the importance of adaptability and sustainability in modern leadership. Under her direction, Engie embarked on a radical transformation, moving away from fossil fuels and heavily investing in renewable energy sources and infrastructure. This bold shift was not just a business strategy but a reflection of Kocher’s vision for a sustainable future. She spearheaded efforts to divest from coal operations and led Engie to invest in innovative renewable energy projects, embracing the future of clean energy. Effective communication played a crucial role in this transition. Kocher was adept at relaying information and conveying her passion, vision, and purpose to her team at Engie and the broader public, emphasising the company’s commitment to a sustainable and environmentally responsible future.

Diverse approaches to leadership also paint the modern landscape. While some leaders may naturally exude authority, others bring forward the strength of collaboration, collective achievements, and mutual respect. Leadership in the realm of the arts, for instance, as demonstrated by Theaster Gates—a social practice installation artist—shows how leadership can transcend corporate and political boundaries, making waves in cultural and community contexts.

Leadership today is not just about a title or a position. It’s a harmonious blend of what one is born with and what one learns and adopts, all tuned to the evolving needs of organisations and societies. Two prominent leadership styles that have gained traction in this context are ‘laissez-faire’ and ‘transformational’ leadership.

The laissez-faire style, derived from the French for “let do” or “leave alone”, allows team members significant autonomy in their work. Leaders like Steve Jobs and Steven Bartlett are often associated with this style. They trusted in their teams’ inherent creativity and drive, intervening only when necessary. Such an approach has its merits in industries that thrive on innovation and where the creative freedom of individuals is paramount.

On the other hand, transformational leadership, as embodied by figures like Richard Branson, inspires and motivates team members to exceed their own expectations and achieve a collective vision. These leaders are proactive, continuously challenging the status quo and instigating change to better the organisation. They foster an environment where both the leader and the team support each other’s growth and transformation.

Both these styles emphasise the shift in norms surrounding leadership today. It’s no longer about just directing or managing but about inspiring, trusting, and continuously evolving to meet the ever-changing demands of the modern world.

3. Shaping the Future: The Role of Proactive Leadership

Proactive leadership focuses on more than just addressing current challenges; it’s about actively planning and influencing the future. While entrepreneurs like Elon Musk are often highlighted, digging deeper and understanding the foundational principles that enable such forward-thinking actions is important.

One key concept from organisational psychology is ‘Psychological Safety’. Introduced by Harvard Business School professor Amy Edmondson, it describes an environment where team members feel secure in taking risks and expressing their ideas without fear of reprimand. Successful teams, like those at Google, have pinpointed Psychological Safety as a driving factor. When leaders cultivate this safe space, they express organisational values and encourage a culture where innovation can flourish.

This atmosphere of trust and openness is especially crucial in today’s interconnected world, where leadership actions are constantly scrutinised. Every decision and every mistake is magnified in the digital age. It underscores the idea that ethical behaviour isn’t just a commendable attribute—it’s vital. Leaders who prioritise psychological safety invariably pave the way for ethical leadership. In this scenario, proactive leadership revolves around upholding transparency and ensuring that decision-making is always rooted in strong ethics, allowing team members to communicate openly and act with integrity.

However, challenges such as persistent gender biases remind us that there’s still work to be done. Effective leadership recognises such biases and takes deliberate steps to address and overcome them, ensuring that potential is recognised and nurtured regardless of gender or background. For instance, the often-discussed gender pay gap shows that women, on average, earn less than men in nearly every single occupation for which there is sufficient earnings data. This reflects a systemic inequality and can damage psychological safety, as it conveys an implicit message that women’s contributions are less valuable. Proactive leadership recognises such biases and actively works to address and correct them, ensuring that every team member feels valued and heard. This atmosphere of trust and openness directly feeds into the broader principle of psychological safety, where individuals can communicate openly without fear.

In conclusion, proactive leadership is about foresight and action. It means navigating the present while laying strong foundations for the future, driven by a combination of psychological understanding and ethical commitment. Today’s leaders don’t just ride the waves—they help create them.

4. Crafting Your Leadership Path

Leadership is a unique journey, blending inherent qualities, acquired skills, and external influences. Apple’s co-founder, Steve Jobs, advocated for pursuing passions and trusting one’s instincts. However, the leadership voyage extends beyond instinct. Ground-breaking studies, like Daniel Goleman’s work on emotional intelligence, highlight self-awareness as a keystone of effective leadership. Such understanding aids leaders in harnessing their strengths and addressing their vulnerabilities.

Adaptability is pivotal in the current age of rapid technological and societal changes. Management theories such as the Situational Leadership Model, developed by Hersey and Blanchard, emphasise that leaders must adjust their style based on the task at hand and the individual’s maturity. So, while the world moves quickly, aligning personal and organisational values ensures that leadership remains authentic and relevant.

Every leadership story is unique and shaped by personal aspirations, experiences, and trials. Recognising this, there’s a need to move beyond one-size-fits-all strategies. Customised leadership plans, tailored to individual paths and goals, prove more effective than generic formulas. A principle of economics, the Theory of Comparative Advantage, posits that individuals or entities should capitalise on their strengths. In the leadership context, this underscores focusing on one’s unique capabilities and value propositions. Furthermore, leaders aren’t isolated figures; they operate within complex organisational ecosystems. Just as a sailor must consider the sea’s currents and weather patterns, leaders must understand their organisational cultures. An environment fostering open dialogue, feedback, and continuous learning can catalyse a leader’s evolution. Conversely, restrictive cultures might pose challenges. But in both contexts, understanding and adeptly navigating these nuances differentiates good leaders from great ones.

In essence, leadership is not a linear path but a dynamic journey. It combines introspection, adaptation, and understanding of the larger organisational landscape. As the saying goes, it’s not just about the destination but the journey and how one travels it.

5. From Visionaries to Tomorrow’s Leaders

Great leaders throughout history have consistently displayed adaptability, innovation, and a commitment to mentoring the next generation. Larry Page’s leadership at Google exemplified this. Rather than solely focusing on ideas, he emphasised nurturing talent, most notably by mentoring Sundar Pichai. This approach underscored the belief that a true leader’s legacy is in empowering successors. Apple’s resilience and ability to reinvent itself embody the “falling forward” concept — transforming challenges into opportunities. Amazon’s success story is a testament to adaptability, echoing Bruce Lee’s advice to be “like water” — flexible, yet forceful. In the tech realm, Netflix’s pioneering use of AI and Microsoft’s emphasis on cloud computing under Satya Nadella highlight the importance of forward-thinking innovation, drawing parallels to historical visionaries like Tesla and Edison.

6. Cultural Harmony: Crafting the Future of Leadership

Satya Nadella’s transformative journey at Microsoft exemplifies the essence of a growth mindset, teaching us that true success isn’t solely about beginnings but rather the directions we’re willing to explore. With its relentless drive to innovate, Tesla embodies the spirit of pioneers who are never content with the status quo. Adobe’s culture of valuing feedback and continuous improvement is a testament to the belief that “iron sharpens iron,” highlighting the power of collective growth and learning. Similarly, Spotify’s commitment to inclusivity is not just a nod to diversity but a clear indication that the future of leadership mirrors and celebrates the myriad voices of society.

In essence, these examples underline that the modern leadership paradigm thrives on adaptability, continuous growth, and cultural harmony, emphasising that the best leaders not only lead but also listen, learn, and reflect the diverse tapestry of our global community.

7. Nurturing Leadership: Strategies, Collaboration, and Vision

Margaret Heffernan’s concept of “wilful blindness” refers to the deliberate decision to ignore or avoid inconvenient facts or realities, even when they are readily apparent. It underscores the importance of leaders being vigilant, aware, and attentive, breaking from conformity to foresee and address challenges. Salesforce’s culture, which champions innovation and disruption, mirrors the economic principle of ‘creative destruction’ proposed by economist Joseph Schumpeter, where innovative methods and ideas replace old ways of doing things. As the management theorist Peter Drucker emphasised, “Management is doing things right; leadership is doing the right things.” As emerging leaders design their journey, frameworks like Amazon’s leadership principles serve as contemporary iterations of timeless navigational tools — guiding leaders both on well-trodden paths and ventures into the unknown.

8. Future Leadership: Charting New Waters with Timeless Principles

Drawing from an insight often attributed to Charles Darwin, it is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change. In the realm of leadership, this rings especially true. Traditional hierarchical models are yielding to a more collaborative and adaptive approach. The dawn of AI and the intricate dance of globalisation echo the words of economist John Maynard Keynes, emphasising the need to be versatile in the face of “animal spirits” or unpredictable elements in markets. Cybersecurity concerns today might parallel the challenges once posed by maritime pirates during the age of exploration, underscoring that while challenges evolve, the essence of leadership remains in navigating uncharted territories. Tomorrow’s leaders will not only ride the waves of technological change but also harness the diverse strengths of global teams and confront ethical quandaries in a deeply interconnected era guided by principles as old as leadership itself.

9. Leading Forward: Drawing from the Past, Shaping Tomorrow

Much like Rome, which wasn’t built in a day, leadership thrives on a foundation of age-old principles fused with modern foresight. This blend is reminiscent of the principles set forth by legendary strategist Sun Tzu in “The Art of War” – understanding the terrain, knowing oneself, and being fluid in response. Today’s urban jungles, from Tokyo to New York, encapsulate this harmony; they meld historical foundations with skyscrapers of ambition, symbolising the fusion of past wisdom with future vision. Leaders like Indra Nooyi exemplify this duality, resonating with the roots of ancient wisdom while spearheading an era of digital transformation. Leadership, therefore, isn’t a destination but an ongoing odyssey. Much like cities that reinvent while retaining their essence, leaders must be perpetual pioneers with an eye on the horizon and feet grounded in enduring values.

More on Trust

The Importance of Trust article by Shay Dalton

Elite Team Cohesion article by Jonny Cooper

Leadership in Focus: Foundations and the Path Forward article by Shay Dalton

The Importance of Ethics article by Shay Dalton

“Empowering” Workers is More Than a Catchy Phrase article by Shay Dalton

Introduction

Are humanities subjects – and humanities students – doomed? – 1% Extra Article – Rob Darke

“We should cheer decline of humanities degrees.”

So read the headline of a piece by Emma Duncan in The Times. Duncan, who notably studied Politics and Economics at Oxford, thinks that the decline in the number of students enrolling in humanities degrees is a societal positive. And her reasoning, on the surface, is sound. She says that the humanities fail to engender in their students sufficient practical skills to be both employable once they graduate, and able to thrive once they’re part of the workforce. The lack of skills today’s humanities graduates are instilled with, and the subsequent lack of employment they are able to find as a result, is, she says, why so many of today’s youth feel betrayed by their elders (she acknowledges that the bleak state of today’s housing market is also a factor) [1].

It’s a provocative piece, featuring statements like, “Literature is lovely stuff but it’s not a way to earn your bread,” that make one suspect it is deliberately so. It ignited a furious backlash from some corners of the internet and an equally furious backlash to the backlash from others. Such are the times we live in. But Duncan’s argument is nothing new. This debate has been raging since well before her own student days, though the merit of the arguments on each side does tend to depend on the context of the times in which the debate is taking place. The shifting state of employment rates and the in-vogue professional skills of the era are always going to have an impact.

In light of the topic’s re-emergence on the column circuit, it’s worth investigating what the humanities offer, what students and employers want from a University education, and whether things are really moving in the right direction or the wrong one.

Humanities graduates: Penniless and unskilled?

Duncan argues that, “the people who are struggling are those in nice, fluffy jobs like publishing and the creative arts, and in the caring professions” [2]. Leaving caring aside, as this falls outside of the remit of humanities, Duncan is right that humanities and social science graduates are less well off than their STEM (Science, Technology, Engineering and Mathematics) studying counterparts. But not by much.

STEM subjects are generally considered to be the crème de la crème of the practical degrees, all but guaranteeing jobs in engineering, the finance sector, and a whole heap of other lucrative industries. In the UK, however, where Duncan’s article was focused, STEM graduates earn on average £38,272 a year compared with humanities and social science graduates’ £35,360 [3]. The difference is useful cash in the pocket, most certainly, but not such a striking distance apart that STEM might be seen as the educational pinnacle while the humanities are dragged kicking and screaming to the pedagogical chopping block. Similarly in the US, for those aged 25-34, the unemployment rate of those with a humanities degree is 4%. For those with an engineering or business degree? A little more than 3% [4]. The vigour of the anti-humanities debate doesn’t seem to accurately reflect the marginality of these differentials.

Meanwhile, creative industries represent 5.6% of the UK’s GDP. The UK is the largest exporter of books in the world, in large part because of the strength of its publishing industry, and the creative industries are not only growing, but doing so at a faster rate than the economy as a whole [5].

In other words, the humanities are fine. Except, as is plainly apparent to anyone with skin in the game, that’s not really true. For all the detractors of Duncan’s article there is a reason that she wrote it, as well as something intrinsically recognisable in the notion at its core. While we may disagree that the decline in humanities is something worth cheering about, we do understand that the decline she so celebrates is real and worsening. The chances are that everyone reading this either knows a struggling humanities graduate or is one themselves.

A striking statistic from Duncan’s piece notes that, “Looking at higher education as an investment, the Institute for Fiscal Studies calculates that the return for men on a degree in economics and medicine is about £500,000, for English it is zero and for creative arts it is negative” [6]. Meanwhile, a recent report from the British Academy found that, “English Studies undergraduate students domiciled in England fell by 29% between 2012 and 2021” [7].

In the UK, there has been a 20% drop in students taking A-levels in English and a 15% decline in the arts [8]. Across the pond, a 2018 piece in the Atlantic found that the number of University History majors was “down about 45 percent from its 2007 peak, while the number of English majors has fallen by nearly half since the late 1990s” [9]. The piece also noted that the decline was “nearly as strong at schools where student debt is almost nonexistent, like Princeton University (down 28 percent) and the College of the Ozarks (down 44 percent).” In other words, rising tuition fees were not a factor, or at least not a strong one, in these numbers.

Why is it then that we’re seeing such a drop off in the number of students wanting to pursue a degree in the humanities, especially if the unemployment and starting salary figures are as closely aligned as the stats suggest?

What you should be doing…

In the aforementioned Atlantic article, the author, Benjamin Schmidt, notes that in the US there was a large-scale drop off in the number of humanities majors in the wake of the 2008 financial crisis. Schmidt argues that in the wake of the crisis, “students seem to have shifted their view of what they should be studying—in a largely misguided effort to enhance their chances on the job market” [10].

It’s not hard to understand why students would take such an action, or to think that a similar phenomenon is not underway in the wake of the twin crises of Covid and the cost of living. People know that the economy isn’t in great shape. They know that there are more graduates than ever before – by an order of magnitude – with whom they will soon have to compete once they are thrust into the real world. And they know – or think they know – that STEM subjects offer a level of security that more artistic ventures do not. In large part because that’s what people in positions of power have told them.

Put plainly, Schmidt argues that, “Students aren’t fleeing degrees with poor job prospects. They’re fleeing humanities and related fields specifically because they think they have poor job prospects” [11]. The gulf between the real and the speculative here is vital, and damaging. As already noted, Duncan is offering nothing new in her argument other than an oddly fatalistic sense of glee. STEM equals rich, humanities equals poor. That’s the basic conception underpinning most of the public’s school of thought on this matter. But as has already been shown, the numbers on that don’t quite stack up.

One might argue that to attribute such enormous declines in the numbers of humanities students purely to a collective attitudinal miscalculation is short-sighted. To counter that, it would be worth running an experiment in which students were able to sign up to a University where tuition was free and every first-year student had a guaranteed job lined up after education. Under these favourable circumstances, would students still be shunning the humanities or would it turn out that the cost and perceived lack of employability is the real problem? In such a scenario, given that we’ve already ruled out the cost being a key factor, we could say with some credibility that the perceived lack of employability a humanities degree offers was the number one reason for the declining numbers.

Thankfully, we don’t need to run such an experiment as these institutions already exist in the form of US military service academies. Students are granted free tuition and a guaranteed job within the US military upon graduation. And what do the numbers show? That at West Point, Annapolis and Colorado Springs, humanities majors were at roughly the same level in 2018 as they were in 2008 [12]. They were not affected by the colossal drop-offs in History and English majors that were noted earlier in the article.

Hard skills vs Soft skills

Why someone might think a STEM subject offers more than a humanities one is obvious. One offers hard skills, the other soft. STEM students have something tangible to show for their hard work, whether that’s in the form of lab skills or mastery of certain equipment. Stand that up against an English literature graduate and it may look like one has received much more bang for their buck through the University experience than the other. In this regard Duncan’s argument is entirely justified. Humanities students aren’t being taught hard, practical skills that set them up for the workplace. But a skill doesn’t have to be hard to be practical. Indeed, to focus only on hard skills is to massively undervalue the soft skills one learns in the humanities and the vital role they play in a real-life professional environment. Not to mention, as we will show in a moment, what they offer fiscally.

Humanities students are taught to think critically, to engage with arguments and frame their own, to deal with people and empathise with a variety of viewpoints. In stark contrast to scientific or mathematical endeavours, it is far more important in the humanities to be able to step back and understand a range of possible answers, acknowledging the merits and flaws in each, than to arrive at a single, binary, immovable conclusion. As Karan Bilimoria, a member of the House of Lords and Chancellor of the University of Birmingham has said on the subject, “Anyone who thinks [humanities] subjects are of low value [doesn’t] know what they are talking about…They provide many transferable skills — analytical, communication, written — that help students to take on a range of jobs” [13].

James Cole, a software engineer in Bath, wrote to The Guardian in response to Sheffield Hallam’s announcement that they would be dropping their English Literature course. He agreed that English Literature offered something vital, even in his much more technical line of work, saying:

English Literature degrees teach criticism, a form of analysis that suits the workplace very well. What is the truth in a given situation, how does it tie into wider themes, and how can I best communicate that? Deep reading skills, mental organisation, patience. Studying STEM (Science, Technology, Engineering and Mathematics) doesn’t develop these skills in the same way, and I should know because I also have an MPhil in computer science. Almost none of my colleagues have Humanities degrees, and it shows. [14]

James Cole

Writing in the journal Arts and Humanities in Higher Education, Eliza F. Kent argues similarly, saying that: “the most important resource necessary to succeed in today’s competitive marketplace is a clear, eloquent, impassioned voice. The learning exercises at the foundation of excellent humanities-based education may appear to lack any utilitarian benefit, but their long-term effect is the development of each student’s individual voice, which is priceless” [15].

Soft skills: Money in the bank

In direct contrast to the many arguments that humanities are a one-way ticket to poverty, studies have shown that, due to their superior soft skills, including diplomacy and people-management, humanities graduates often go on to find themselves in positions of leadership. 15% of all humanities graduates in the US go on to management positions (more than go into any other role) [16]. Meanwhile, a recent study of 1,700 people from 30 countries found that the majority of those in leadership positions had either a social sciences or humanities degree – this was especially true of leaders under 45 years of age [17]. Perhaps poverty does not this way lie after all.

The Future

What very few University degrees or workplaces are currently prepared for is the colossal impact AI is going to have on the kind of jobs that earn the most money, the kind that can be replaced, and the kind of skills students graduating into an AI-integrated workforce will need to be armed with. The likelihood that today’s students in any sphere are appropriately prepared for the wide-scale changes ahead is slim. That’s probably doubly true for older members of the existing workforce, who are generally less technologically fluent than their younger, tech-savvy counterparts.

Some people who do know about AI’s likely impact going forward are Brad Smith and Harry Shum, top-level executives at Microsoft who wrote in their book, The Future Computed, that:

As computers behave more like humans, the social sciences and humanities will become even more important. Languages, art, history, economics, ethics, philosophy, psychology and human development courses can teach critical, philosophical and ethics-based skills that will be instrumental in the development and management of AI solutions. [18]

Brad Smith and Harry Shum

Emma Duncan may be cheering on the demise of the humanities for the time being, then. But it seems like the soft-skilled graduates of tomorrow may end up having the last laugh.

Sources

[1] https://www.thetimes.co.uk/article/we-should-cheer-decline-of-humanities-degrees-5pp6ksgmz

[2] https://www.thetimes.co.uk/article/we-should-cheer-decline-of-humanities-degrees-5pp6ksgmz

[3] https://www.newstatesman.com/comment/2023/06/university-degrees-students-humanities-economics

[4] https://www.bbc.com/worklife/article/20190401-why-worthless-humanities-degrees-may-set-you-up-for-life

[5] https://www.newstatesman.com/comment/2023/06/university-degrees-students-humanities-economics

[6] https://www.thetimes.co.uk/article/we-should-cheer-decline-of-humanities-degrees-5pp6ksgmz

[7] https://www.thebritishacademy.ac.uk/news/english-degrees-becoming-more-popular-among-students-from-scotland-but-experiencing-decade-long-decline-elsewhere-in-the-uk-british-academy-finds/

[8] https://www.bbc.com/worklife/article/20190401-why-worthless-humanities-degrees-may-set-you-up-for-life

[9] https://www.theatlantic.com/ideas/archive/2018/08/the-humanities-face-a-crisisof-confidence/567565/

[10] https://www.theatlantic.com/ideas/archive/2018/08/the-humanities-face-a-crisisof-confidence/567565/

[11] https://www.theatlantic.com/ideas/archive/2018/08/the-humanities-face-a-crisisof-confidence/567565/

[12] https://www.theatlantic.com/ideas/archive/2018/08/the-humanities-face-a-crisisof-confidence/567565/

[13] https://www.khaleejtimes.com/long-reads/are-humanities-degrees-worthless

[14] https://www.theguardian.com/education/2022/jul/01/it-will-always-have-value-readers-on-whether-english-lit-is-worthwhile

[15] Kent, E. F. (2012). What are you going to do with a degree in that?: Arguing for the humanities in an era of efficiency. Arts and Humanities in Higher Education, 11(3), 273–284. https://doi.org/10.1177/1474022212441769

[16] https://www.bbc.com/worklife/article/20190401-why-worthless-humanities-degrees-may-set-you-up-for-life

[17] https://www.bbc.com/worklife/article/20190401-why-worthless-humanities-degrees-may-set-you-up-for-life

[18] https://www.businessinsider.com/microsoft-president-says-tech-needs-liberal-arts-majors-2018-1

Christopher Nolan’s Oppenheimer, as its name suggests, focuses on the notorious father of the atomic bomb, J. Robert Oppenheimer, who was recruited by the US government to lead the Manhattan Project during World War II. He and his team were in a frantic scientific race against the Nazis to be the first to turn newly discovered breakthroughs in fission into an atomic bomb. The winner would wield more power than ever considered possible. They would be, as Oppenheimer himself famously noted, the “destroyer of worlds.”

Becoming death

Oppenheimer’s quote, originally from the Hindu scripture the Bhagavad Gita – in full, “Now I am become Death, the destroyer of worlds” – hints at the moral quandary at the centre of Nolan’s film and Oppenheimer’s legacy. What he and his team at Los Alamos achieved was an extraordinary, rightly heralded feat of science. And as the film also notes, Oppenheimer did not take part in the project out of some twisted bloodlust, but because he feared what would happen if the Nazis were able to create such a weapon first.

As it happened, the Nazis had surrendered by the time the bomb was complete. In August 1945, the US government instead dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki. Estimates vary, but it’s thought that over the two to four months following the bombings between 90,000 and 146,000 people were killed in Hiroshima and between 60,000 and 80,000 in Nagasaki, with roughly half of those deaths occurring on the first day [1]. Oppenheimer himself wrestled with the weight of what his creation had wrought for the remainder of his life. The debate over whether dropping the bombs was a necessary act to bring the war to an end or a grotesque act of chest-puffing genocide continues to this day.

The Manhattan Project was not a business – though it did employ a whole town’s worth of scientists, engineers and their families. But the film is clear that the ethical implications of the work at hand went unconsidered until it was too late. It may seem tenuous, even trivialising, to compare such an overtly destructive act to the ethical considerations of day-to-day business practice today. But anyone who has read Patrick Radden Keefe’s searing Empire of Pain [2], about the role of Purdue Pharma and the Sackler family in particular in generating the US opioid crisis, for which the death toll stands in the hundreds of thousands and counting, or who has paid close attention to the devastating climate impact wrought by Shell, Exxon, BP and the like over the past decades and its implications for the future of humanity, will recognise that ethics and business cannot be so easily separated. The ethical choices companies make can shape individual lives and entire worlds. It’s vital they’re taken seriously.

Business ethics

Business ethics refers to the standards for morally right and wrong conduct within a business. Businesses are, of course, held to account by law, but as we all know, there are slippery ways around the law. Something can be both legal and wholly unethical.

Business ethics are important for various reasons. As noted, in the most extreme cases, such as that of Purdue Pharma, the decision to downplay the addictiveness of their opioids contributed to or outright caused a deadly epidemic in the US. In less dramatic circumstances, some of the advantages of having a strong ethical practice in place are that it helps build customer trust (thus helping retain customers), improves employee behaviour, and positively impacts brand recognition. The numbers back that up.

Over half of U.S. consumers said they no longer buy from companies they perceive as unethical. On the flip side, three in 10 consumers will express support for ethical companies on social media [3]. A 2021 survey by Edelman found that 71% of people believe that companies should be transparent and ethical [4]. Similarly, a 2020 study on transparency from Label Insight and The Food Industry Association found that 81% of shoppers say transparency is important or extremely important to them [5].

Corporate executives surveyed by Deloitte, meanwhile, stated that the top reasons consumers lose trust in a consumer product company are that the brand is not open and transparent (90%), the brand is not meeting consumer environmental, social and governance expectations (84%), and the brand is engaging in greenwashing (82%) [6]. In case it wasn’t clear, then, ethics matter to consumers – which means they matter to business. And yet a 2018 Global Business Ethics Survey (GBES) found that fewer than one in four U.S. workers think their company has a “well-implemented” ethics program [7].

The businesses that are prioritising ethics, on the other hand, are reaping rewards.

Ethical profits

Ethisphere, an organisation that tracks the ethical conduct of the world’s largest companies, found that the businesses that qualified for its 2022 list of most ethical companies outperformed an index of similar large-cap companies by 24.6% overall [8]. The Institute of Business Ethics similarly found that companies with high ethical standards are 10.7% more profitable than those without [9]. Honorees on the 2021 list of the World’s Most Ethical Companies outperformed the Large Cap Index by 10.5% over a three-year period [10].

Not only do ethical companies make money; unethical ones lose it. 22% of cases examined in the 2018 Global Study on Occupational Fraud and Abuse cost the victim organisation $1 million or more [11]. And plenty of unethical companies have crashed and burned in far more dramatic fashion.

Enron

Perhaps the most striking fall from grace was Enron, whose share price plummeted to a level in accordance with its ethical standards in December of 2001. At its peak, the company had been trading at $90.75; just before filing for bankruptcy, its price was $0.26 [12]. For years Enron had been fooling regulators with fake holdings and off-the-books accounting practices while hiding its eye-watering levels of debt from investors and creditors. From 2004 to 2012, the company was forced to pay more than $21.8 billion to its creditors. Various high-level employees ended up behind bars.

Enron is the high-water mark for unethical disintegration. But there are plenty of moral and ethical quandaries for today’s mega-corporations to take on as well.

Ethics today

In the age of data, corporations, especially social media giants like Meta, have a huge ethical responsibility regarding their users’ data. Extremely sensitive personal information is handed over to these platforms on little more than good faith on the consumer’s part. How that data is being used (or misused) by these companies is one of the defining discussion points of the age. That’s not to mention the targeting of ads and its potential detrimental impact, especially on the young. Rates of anxiety, depression and suicide in teen and tween girls, for example, surged from 2010 onwards, around the time they first started getting iPhones [13]. A few years ago, the Meta whistleblower Frances Haugen confirmed that the company’s internal data showed it was doing severe damage to the mental health of teenage girls. Despite having this information to hand, the company made no attempt to rectify the problem [14].

Meanwhile, energy companies like BP, Shell and Exxon are being held to far greater scrutiny as the effects of their environmental wreckage play out more dramatically each year in the form of unprecedented temperature spikes, wildfires and floods. These companies’ responsibility to their profit lines is being put in conflict with their responsibility to the planet. Record profits last year suggest which one is winning out [15].

Companies like Google and Amazon, too, face ethical questions over their treatment of workers and data security. While neither their profits nor their dominance of their respective markets has dropped, reputational damage has been done. Both will be working to keep the fallout to a minimum.

Implementing ethics

A company’s ethics start at the top. When management acts ethically, employees follow suit. Companies with strong ethical values tend to display their code of ethics publicly, allowing them to be held to account. Key ways companies can create an ethical environment include conducting mandatory ethics training for all employees, integrating ethics into everyday processes, making it easy for employees to raise concerns, clearly defining company values, fostering a culture of transparency, and aligning incentive systems with those values [16].

The importance of ethics

Ethics are vital to an organisation’s longevity. Companies that choose to cut ethical corners may see short-term gain, even thriving for decades, but when the chickens come home to roost, as they did for Enron, there’s no coming back. Climate change and issues around data-mining are placing greater scrutiny on companies that have historically pursued profit-at-all-costs models at the expense of those larger struggles. In cut-throat business environments, morality and ethics can often be sidelined in pursuit of the bottom line. But as the numbers show, ethical companies are often rewarded by customers with repeat business, while reputational damage can prove irreversible.

As the scientists at Los Alamos discovered, it’s possible to be so focused on innovation and achievement that you’re blinded to the ultimate consequences of your ambition. Sow the ethical seeds early so as not to be undone further down the line when your decisions run their course.

References

[1] https://en.wikipedia.org/wiki/Atomic_bombings_of_Hiroshima_and_Nagasaki

[2] https://www.penguinrandomhouse.com/books/612861/empire-of-pain-by-patrick-radden-keefe/

[3] https://www.redlands.edu/study/schools-and-centers/business/sbblog/2019/may-2019/3-reasons-why-business-ethics-important/

[4] https://www.linkedin.com/pulse/importance-business-ethics-why-ethical-conduct-success-manelkar/

[5] https://www.business.com/articles/transparency-in-business/

[6] https://www.business.com/articles/transparency-in-business/

[7] https://www.redlands.edu/study/schools-and-centers/business/sbblog/2019/may-2019/3-reasons-why-business-ethics-important/

[8] https://www.businessnewsdaily.com/9424-business-ethical-behavior.html

[9] https://www.linkedin.com/pulse/importance-business-ethics-why-ethical-conduct-success-manelkar/

[10] https://www.redlands.edu/study/schools-and-centers/business/sbblog/2019/may-2019/3-reasons-why-business-ethics-important/

[11] https://www.redlands.edu/study/schools-and-centers/business/sbblog/2019/may-2019/3-reasons-why-business-ethics-important/

[12] https://www.investopedia.com/updates/enron-scandal-summary/

[13] https://www.theatlantic.com/ideas/archive/2021/11/facebooks-dangerous-experiment-teen-girls/620767/

[14] https://www.npr.org/2021/10/05/1043207218/whistleblower-to-congress-facebook-products-harm-children-and-weaken-democracy

[15] https://www.theguardian.com/environment/2023/feb/09/profits-energy-fossil-fuel-resurgence-climate-crisis-shell-exxon-bp-chevron-totalenergies

[16] https://www.forbes.com/sites/forbescoachescouncil/2016/08/03/12-ways-your-company-can-excel-ethically/?sh=d93ba397f1e1

Introduction

Welcome to the leadership paradox. Let’s start with a scenario. You joined your company many years ago, starting in a more junior role, where you proved your skill sets over and over. Maybe you were a born salesperson, maybe you were a master at client relations, maybe you were product focused, perhaps something else entirely. Regardless, with every task and every further year at the company, you demonstrated your value. As such, you were rewarded with a series of promotions. Eventually you found yourself in a management position, the leadership role you’d always wanted. Sounds great. All is well, right?

Not necessarily. Because once you’d reached this position, you discovered that the skills you’d demonstrated to get there weren’t needed anymore. You’d ceased to be the one selling; you’d ceased to be the one fronting the call; you’d ceased to be the one making decisions around specification. You might have still tried to do all those things, to involve yourself heavily and bring yourself back to the fore. Perhaps you rolled up your sleeves and said your management style was “hands-on”, justifying your involvement in lower-level projects by calling yourself a lead-from-the-front type.

To do so is only natural. After all, you got to where you are by being an achiever, someone who not only got things done but prided themselves on being the one actively doing them. And this is where the paradox lies. Because the skills that served you so well and earned you your promotion might be the very same ones preventing you from succeeding in your new role.

This notion is distilled to its essence in the title of Marshall Goldsmith’s bestselling book: “What got you here won’t get you there” [1]. Being a thriving part of the workforce and being a leader are two entirely different things. The skills are not the same. To achieve, you need to be able to get the best out of yourself. To lead, you need to be able to get the best out of other people.

Oftentimes, newly promoted leaders try to continue as they were before. They want to get their hands dirty, to micro-manage and ensure that every aspect of a project is marked by their fingerprints. But micro-management is not the answer. As Jesse Sostrin PhD, Global Head of Leadership, Culture & Learning at Philips, puts it, leaders need to be “more essential and less involved.” He adds, “the difference between an effective leader and a super-sized individual contributor with a leader’s title is painfully evident” [2].

For many, the adjustment is difficult and can take time. If a leader is too eager to imprint themselves on every aspect of a project, not only is the leader likely to end up feeling overstretched (according to Gallup research, managers are 27% more likely than individual contributors to strongly agree they felt a lot of stress during their most recent workday [3]) but the project will suffer too. Staff will come to feel constrained and undervalued. They may not feel they have the opportunity to grow or express themselves fully. They will be less likely to try new and innovative ideas with someone breathing down their neck or dictating that they must service a single vision at all times rather than being allowed to bring themselves to the fore.

In other words, micro-management offers a whole lot of downsides in exchange for very few upsides. Sostrin proposes a simple test for managers to tell whether they are taking on too much responsibility: if you had to take an unexpected week off work, would your initiatives and priorities advance in your absence? [4] A well-functioning team run by an effective leader should, in theory, be able to get by without that leader – for a period of time, at least. An organisation that orbits around the whims of a single figure, on the other hand, is likely to stall, and fast. It’s why all good managers practise delegation.

Why delegate?

Delegation is something every business practises but not all do well. Just handing an employee some of the work does not count as delegation in any meaningful sense. Successful delegation involves genuinely trusting the employee and granting them autonomy. That can be a scary prospect for a leader used to having a controlling stake in all output. But there are ways to ensure that, even without constant supervision, your team is working in a manner you approve of.

The first is obviously to hire smart, capable workers to whom you feel comfortable delegating responsibility. Oftentimes leaders take on extra workplace burdens out of a lack of faith in their team. They think, “I’m not confident they have the ability to do the task,” and so instead choose to take it on themselves. But trust is paramount to any successful workplace. And to paraphrase Ernest Hemingway, the best way to know if you can trust an employee is to trust them – at least until they give you a reason not to. The best thing a leader can do is give their employees a chance and see what happens.

After all, a leader’s job is to get the best out of their employees. As Forbes writer Cynthia Knapek puts it, “Some people work to show you what their superpower is, but a good leader works to show you yours…you’ll never be a good delegator if you’re holding on to the belief that no one can do it as well as you can” [5].

Trusting your team – and shedding the arrogance of presuming you can do everything better yourself – is pivotal to good leadership. Refusing to cede control is the sign of an insecure leader, one who sees their role and status as proportional to their decision-making authority. They think that any act of delegation would lead to a dilution of their power.

This theory is backed up by a 2017 study on psychological power and the delegation of authority by Haselhuhn, Wong and Ormiston. They ultimately found that, “individuals who feel powerful are more willing to share their decision making authority with others. In contrast, individuals who feel relatively powerless are more likely to consolidate decision making authority and maintain primary control” [6]. Delegation is a sign of strength, not weakness. Consolidation of all authority is the remit of the insecure.

Another thing leaders can do to help ensure their team is working autonomously but towards a clear end goal is to have a solid set of principles in place. These principles shouldn’t just highlight the leader’s values and goals but make clear the approach they want to use to achieve them. Shift Thinking founder and CEO Mark Bonchek calls such a set of principles a company’s “doctrine”. Bonchek argues that, “without doctrine, it’s impossible for managers to let go without losing control. Instead, leaders must rely on active oversight and supervision. The opportunity is to replace processes that control behavior with principles that empower decision-making” [7].

Having a guiding set of principles in place lets you delegate responsibility more freely because you know that even with limitless autonomy, your employees are aware of the parameters they should be working within – it keeps them drawing within the lines.

Evidently, a pivotal part of leadership is and always will be people management. But if a leader has already clearly defined their principles, they’ll find they need to manage their people much less. Some companies that advocate for principles-based management include Amazon, Wikipedia and Google. The proof is in the pudding.

Effective delegation

How delegation is handled contributes enormously to what kind of company one is running and what kind of leader one is. For example, consider two scenarios. In scenario one, a tired and over-involved leader, realising they have bitten off more than they can chew with a deadline fast approaching, tells a member of their team that they no longer have time to write a report they had been meant to produce, and so thrusts it on the employee to hastily pick up the slack.

In scenario two, a leader identifies the member of their team they want to write a report. They talk to the employee, explaining that they’ve noticed their knack for putting facts across concisely and engagingly and want them to put those skills to use in this latest report. They talk through what they want from the project and why this employee is the perfect person to achieve those goals. And they make it known that they are available for support should any be needed.

In both examples, the boss is asking their employee to write a report. But in the first, the work is fobbed off on the employee, a chore the leader no longer wants to do. In the second, the leader identifies the skills of a member of their team, lets the employee know that these are the skills the task at hand requires, and in doing so gives the employee both a clear idea of what’s needed from them and a confidence boost.

Sostrin suggests four strategies for successful delegation [8]. The first is to start with reasoning. As in the example above, this means telling the employee not just what work the leader wants done but why – both why the team is working towards a certain goal and why the employee is the right person to do it.

The second is to inspire commitment. This, again, is about communication. By relaying the task at hand, the employee’s role in it and why it’s important, the leader helps them understand the bigger picture, not just their specific part of it. The employee is then more able to bring themselves to the project, rather than viewing it as simply a tick-box exercise completed for their boss.

The third is to engage at the right level. Of course delegation doesn’t mean that a leader should hand work over to their employees and then never worry about it again. They should maintain sufficient engagement to offer support and accept accountability, but do so without stifling their team. The right balance depends on the organisation, the project and the personnel involved, but Sostrin suggests that simply asking staff what level of supervision they want can be a good start.

The fourth is to practise saying “yes”, “no”, and “yes, if”. That means saying “yes” to demands you think are best suited to you, “yes, if” to those that would be better delegated to someone more suited to the specific task, and an outright “no” to those you don’t deem worthwhile.

For example, Keith Underwood, COO and CFO of The Guardian, said that he doesn’t delegate when “the decision involves a sophisticated view of the context the organisation is operating in, has profound implications on the business, and when stakeholders expect me to have complete ownership of the decision” [9].

Kelly Devine, president of Mastercard UK and Ireland, says, “The only time I really feel it’s hard to delegate is when the decision is in a highly pressurised, contentious, or consequential situation, and I simply don’t want someone on my team to be carrying that burden alone” [10].

On top of these four, it’s worth noting the value of communicating high-profile, critical company decisions to your team, whether that means layoffs, new investors, or whatever the case may be. Leaders should want their employees to feel part of the organisation. That means keeping them in the loop on not just what is happening but why. Transparency is highly valued and, in turn, valuable.

In summary

It can be all too easy for managers who rose through the corporate ranks to eschew delegation in favour of an auteur-esque approach – shaping a team in their distinct image, if not actively trying to do all the work themselves. But delegation not only makes life less tiring and stressful for the leader, who cannot possibly hope to cover everyone’s work alone, but it also results in a happier, more productive, and likely more capable workforce, one that feels trusted and free to experiment rather than constrained by fear of failure.

Good ideas come from anywhere. Good organisations are built on trust. Good leaders don’t smother their workers but empower them. And with each empowered collaborator, the likelihood of collective success grows.

More on Trust

The Importance of Trust

Leadership in Focus: Foundations and the Path Forward

10 Traits of a Great Leader

Unleashing Leadership Excellence with Dan Pontefract (podcast)

References

[1] https://marshallgoldsmith.com/book-page-what-got-you-here/

[2] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well

[3] https://www.forbes.com/sites/forbescoachescouncil/2022/05/11/how-to-master-the-art-of-delegation/?sh=697016a8cb32

[4] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well

[5] https://www.forbes.com/sites/forbescoachescouncil/2022/05/11/how-to-master-the-art-of-delegation/?sh=697016a8cb32

[6] https://www.sciencedirect.com/science/article/abs/pii/S0191886916311527

[7] https://hbr.org/2016/06/how-leaders-can-let-go-without-losing-control

[8] https://hbr.org/2017/10/to-be-a-great-leader-you-have-to-learn-how-to-delegate-well

[9] https://hbr.org/2023/03/5-strategies-to-empower-employees-to-make-decisions

[10] https://hbr.org/2023/03/5-strategies-to-empower-employees-to-make-decisions

Introduction

In the world of investing, Charlie Munger is a legendary figure, celebrated for his sage-like wisdom and insightful aphorisms. As Warren Buffett’s right-hand man, he is a testament to the power of effective decision-making and wisdom, which he famously attributes to a ‘multi-disciplinary’ approach: a rich mosaic of insights from various academic disciplines, including applied, organisational, and social psychology.

Munger’s perspective is unique and practical because he harnesses these theories and translates them into real-world applications. His approach forms an interesting amalgamation, merging business acumen with psychological theories—a powerful combination that leads to meaningful, insightful, and profitable decisions.

The power of incentives: An intersection of economics and psychology

Munger emphasises the importance of incentives, an intersection of economics and psychology, in shaping human behaviour. “Show me the incentive, and I will show you the outcome,” he famously said. In applied psychology, the operant conditioning theory by B.F. Skinner aligns with Munger’s philosophy. It suggests that behaviour is learned and maintained through immediate consequences or rewards. In organisations, this theory’s implications are vast. By understanding the impact of incentives—be it financial, social, or psychological—leaders can drive behaviour that aligns with the company’s strategic objectives.

Cognitive biases and decision making: A Mungerian perspective

In his famed 1995 address at Harvard University, Munger laid out 25 standard causes of human misjudgement: a compendium of cognitive biases that he believes significantly impact decision-making. These biases are psychological tendencies that can cloud our judgment and influence our decision-making processes. They include confirmation bias (favouring information that confirms our pre-existing beliefs), social proof (the tendency to see an action as more appropriate when others are doing it), and availability bias (relying on immediate examples that come to mind when evaluating a specific topic or decision), among others.

In addition, Munger also discussed biases such as over-optimism, anchoring, and the contrast effect, highlighting how these can distort our understanding of reality and lead to erroneous decisions.

In the field of organisational psychology, these cognitive biases are recognised as significant barriers to rational decision-making. They create an environment susceptible to phenomena such as groupthink, where a desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. These biases can also engender substantial resistance to change, as individuals often favour the familiar and view potential changes with a degree of scepticism and fear.

To mitigate the effects of these cognitive biases, Munger emphasised the importance of cultivating cognitive flexibility and self-awareness in our thinking patterns. Cognitive flexibility involves shifting between different thinking strategies as the situation demands. Self-awareness, on the other hand, is the conscious knowledge of one’s character, feelings, motives, and desires. By being aware of our biases, we can better question our initial judgments and decisions and consider alternatives.

Munger also advocates for the idea of using mental models, drawing from a variety of disciplines, to aid in decision-making. This multidisciplinary approach to thinking helps counteract the narrow-mindedness that can result from over-reliance on a single perspective and encourages a more comprehensive understanding of problems, ultimately leading to better decision-making.

Harnessing social influence: Understanding the psychology of persuasion

Munger often references Robert Cialdini’s principles of persuasion—reciprocity, commitment and consistency, social proof, authority, liking, and scarcity. He asserts that these principles don’t just operate on an individual level but can significantly influence organisational culture and drive business outcomes.

For instance, the principle of commitment and consistency can improve organisational efficiency. When employees commit to a task or goal, they are more likely to follow through. Similarly, the principle of social proof plays a role in shaping corporate cultures. People tend to conform to the behaviours of the majority, which can either drive productive work ethics or create a toxic environment.

Navigating the latticework of mental models

Munger advocates for the latticework of mental models, suggesting that one must understand various disciplines to make effective decisions. This is where the role of interdisciplinary knowledge, specifically a blend of applied, organisational, and social psychology, becomes paramount.


One of the key insights of this approach is the understanding that organisations are not just economic entities but psychological and social entities as well. Leaders who appreciate this complexity are more equipped to drive their organisations towards sustainable success.

Conclusion: The intersection of wisdom and psychology

Munger’s wisdom, grounded in various psychological theories, provides a robust framework for understanding and influencing human behaviour in organisations. By weaving together insights from applied, organisational, and social psychology, he teaches us that wisdom is not just about knowledge but also about understanding human nature and leveraging it for collective progress. His philosophies echo the timeless essence of these psychological theories, reminding us that at the heart of every organisation, the human element counts the most.

Introduction

As the world continues to evolve, so does the way we use technology to improve our lives and workplaces. New York City recently adopted final regulations on the use of AI in hiring and promotion processes, marking a significant step in addressing the potential biases and ethical concerns surrounding AI in the workplace. The question now is: will other countries follow suit and implement similar regulations?

As AI increasingly moves from automating drudge work to playing a more prominent role in decision-making, it’s vital that we understand the implications and potential risks. The good news is that some countries have already started to take action in this area.

Global progress on regulations

The European Union, for instance, unveiled its proposed AI regulations in April 2021. While these regulations are still in the proposal stage, they represent a comprehensive approach to governing AI use across various sectors, including hiring and promotions. The EU’s proposed rules are designed to ensure that AI systems are transparent, accountable, and respect fundamental rights.

Japan, another key player in AI development, established the AI Technology Strategy Council in 2016. The Council has since released a series of strategic guidelines that consider the ethical, legal, and social issues surrounding AI use. While these guidelines are not legally binding, they provide a framework for companies and the Japanese government to consider as they develop AI systems and technologies.

Ethical challenges

In contrast, countries like China and Russia have prioritised developing and deploying AI for economic and strategic gains, with less emphasis on ethical considerations. However, as AI becomes more integrated into hiring and promotion processes globally, it’s likely that these countries will also have to address the ethical challenges presented by AI.

So, what are the chances of the NYC regulations being successful? It largely depends on how well they are enforced and how willing companies are to adapt their practices. One of the keys to success will be educating employers about the benefits of ethical AI use and the potential risks of non-compliance.

Biases and discrimination

The impact of AI in hiring and promotion goes far beyond automating menial tasks. By leveraging AI’s ability to analyse vast amounts of data, we can make better, more informed decisions in these areas. However, this also raises the risk of perpetuating biases and discrimination. As we’ve seen in recent years, AI algorithms can sometimes unintentionally reinforce existing biases due to the data they’re trained on. By implementing regulations like those in NYC, we can help ensure that AI is used responsibly and that it truly serves to benefit all members of society.
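
To make the idea of auditing an AI hiring tool a little more concrete, here is a minimal sketch in Python of a disparate-impact check run over hypothetical selection outcomes. The group names and figures are invented for illustration, and the 0.8 cut-off follows the common “four-fifths” rule of thumb rather than the exact methodology any particular regulation prescribes.

    # Hypothetical selection outcomes from an automated screening tool
    outcomes = {
        "group_a": {"selected": 48, "total": 100},
        "group_b": {"selected": 30, "total": 100},
        "group_c": {"selected": 42, "total": 100},
    }

    # Selection rate for each group
    rates = {group: v["selected"] / v["total"] for group, v in outcomes.items()}

    # Impact ratio: each group's rate relative to the best-performing group
    best_rate = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / best_rate
        flag = "review" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")

A real audit would go much further (intersectional categories, statistical significance, historical data), but even this toy version shows how quickly a skewed outcome becomes visible once results are measured by group.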

The key takeaway is that while the use of AI in hiring and promotion can be hugely beneficial, it’s essential to have regulations in place to ensure ethical practices. Now that New York City has taken this bold step, expect more countries and cities to follow in its footsteps.

Conclusion

In conclusion, the adoption of AI regulations in New York City is a significant move towards ensuring the responsible and ethical use of AI in hiring and promotion processes. As AI continues to play an increasingly important role in our lives, it’s crucial that governments and businesses alike prioritise transparency, accountability, and the protection of fundamental rights. By doing so, we can harness the power of AI to create a fairer, more inclusive society – and that’s something worth celebrating.


So, will other countries follow New York City’s lead? I believe they will, and it’s only a matter of time before AI regulations become a global norm. Let’s keep the conversation going, stay informed, and make the best decisions.

Introduction

Earlier this month, Elvis Costello played in Dublin, performing without the full line-up of the Attractions and accompanied only by his long-time collaborator Steve Nieve. After 45 years of shared tour buses, dressing rooms, hotel lounges, flights, recording studios, and live performances, the synergy between Costello and Nieve is undeniable. Their collaboration and bond have evolved into an intuitive language, subtle to an outsider but vividly clear to them. That shared language captures the essence of collaboration: a universal phenomenon that crosses fields and industries.

Collaboration: the term is a buzzword in boardrooms, often discussed in strategy meetings and corporate corridors. Morten T. Hansen, in his pivotal book, ‘Collaboration: How Leaders Avoid the Traps, Build Common Ground, and Reap Big Results,’ explains that the core of collaboration isn’t about amassing tangible assets. Rather, it’s about unlocking value through shared knowledge and relationships.

If you’ve ever viewed collaboration as elusive, difficult to implement, or limited to a select few, it’s time to rethink that perspective. Drawing on insights from scholars like Robert Axelrod, we’re making the case that collaboration isn’t just an inherited trait like ‘DNA.’ It’s also influenced by factors such as leadership and vision, which can be actively nurtured to become a potent force for collective action within any organisation.

Collaboration in practice

Public opinion on collaboration varies. While some see it as vital to effective organisational practice, others dismiss it as mere managerial jargon. The truth lies somewhere in between; collaboration offers tangible benefits and value when practised effectively. Given the rapid changes in our world, the importance of collaboration has never been greater. With emerging nations reshaping the global economic landscape and partnerships becoming increasingly essential, is it now a non-negotiable asset?

From the arts and sports to science and business, effective collaboration enriches our collective experiences and is indispensable for leadership. Symbiotic relationships like that between Xavi Hernandez and Andres Iniesta in football or Michael Jordan and Scottie Pippen in basketball have redefined standards for teamwork. These duos show that collaboration magnifies individual brilliance to create game-changing moments.

In facing global challenges like climate change, the need for collaboration extends beyond industries to nations and continents. Initiatives like the Paris Agreement represent concerted efforts to combat an existential threat, underscoring the power of collective action.

Collaborative discoveries

In science, the importance of collaboration is ever-present. The International Space Station (ISS) is a testament to what can be achieved through international teamwork, bringing diverse skill sets and perspectives together to reach a common goal. Historical collaborations like that between Albert Einstein and Marcel Grossmann laid the foundation for ground-breaking theories like general relativity.

In the wake of the COVID-19 pandemic, unprecedented levels of global scientific collaboration led to the rapid development and distribution of vaccines. This real-time, high-stakes cooperation among nations, scientists, and pharmaceutical companies demonstrated that extraordinary outcomes are possible when humanity unites for a common cause.

The business of collaboration

In the business world, partnerships have also yielded significant results. Procter & Gamble, which began as a small partnership, has grown into a global giant. The collaborative synergy between William Procter and James Gamble transformed a modest venture into an empire.

Modern workspaces are better designed to facilitate such collaborative endeavours, but more can be done. As organisational psychologist Adam Grant proposes, people may work from home but come to the office to collaborate. Artificial intelligence is adding a new dimension to team collaboration, evolving from a tool for basic tasks to handling complex roles like data analysis. Integrating AI empowers teams to make agile decisions and fosters a flexible, conducive work environment. In the age of remote work, tools like Slack and Zoom have become indispensable for team collaboration, breaking down geographical barriers and enabling real-time communication and project management.

Practical steps for effective collaboration

As the intricacies of collaboration unfurl, understanding its practical implementation becomes paramount. Begin with a shared vision, ensuring everyone recognises the endgame. Assemble diverse teams, ensuring a mix of expertise and perspectives. Prioritise transparent communication, creating a culture where ideas flow freely. Regular check-ins are essential, not just to track progress but to celebrate milestones. Equip teams with the right tools and training, fostering an environment conducive to collaboration. And remember, genuine feedback, whether praise or constructive critique, is the cornerstone of continuous improvement.

Unpacking the potential of collaboration

Collaboration isn’t a one-size-fits-all endeavour; it’s a nuanced and intricate dance that varies depending on context. In contemporary business settings, traditional hierarchical frameworks make way for more decentralised, cross-functional operations. This shift calls for a managerial approach that goes beyond mere oversight to include motivation and influence. As evidenced by the rise of virtual teams, mastering the complexities of modern teamwork often determines organisational success or failure.

Within this complex landscape, the durability of collaborative relationships is critical. It isn’t just the responsibility of the individuals involved; it must be woven into the fabric of organisational practices. Emerging technologies like blockchain also illustrate the potential of decentralised, collaborative systems. With its network of nodes working together to validate transactions, this technology represents a ground-breaking form of collaborative interaction.

Social psychologists like Debra Mashek outline various levels of collaborative engagement, each requiring its own set of rules based on the degree of trust, commitment, and resource-sharing. Dr. Carol D. Goodheart further emphasises that effective collaboration can significantly amplify organisational resources, an aspect often overlooked due to inadequate training in collaborative practices.

The real challenge lies in integrating the value of collaboration into daily operations. Investments in cultural and behavioural initiatives often dissipate when confronted with the rigid processes of ‘business as usual.’ Existing behavioural assessment tools also fall short, lacking the specificity needed to capture the multifaceted nature of collaboration.


Moving forward, an integrative approach is essential—one that aligns cultural initiatives with business processes and enriches traditional assessments with collaboration-focused metrics. The benefits of collaboration are clear; we can’t afford to leave them to chance. Fostering a genuinely collaborative environment requires a thoughtful convergence of culture, process, and leadership.

Attributes for greater collaboration

Research has shown that the following attributes enable greater collaboration within an organisation:

• Strategically Minded: Individuals can see beyond their immediate roles and consider broader objectives. This fosters cooperative behaviour and long-term value.

• Strong Team Orientation: Crucial for effective collaboration. It enables individuals to focus on common goals, adapt to team dynamics, and foster an inclusive environment.

• Effective Communication: Vital for success, characterised by openness, two-way dialogue, and responsiveness.

• Openness to Sharing: Encompasses a willingness to discuss ideas, accept suggestions, and change one’s mind, thereby encouraging meaningful collaboration.

• Creativity and Innovation: Willingness to think outside the box and find intelligent solutions to complex problems.

• High Levels of Empathy: Demonstrated understanding of others’ perspectives and emotions, thereby enhancing teamwork and customer focus.

• Inspiring Leadership: Effective leaders focus on collaboration and people management, avoiding micromanagement and bossy attitudes.

Conclusion

Collaboration is far more than a corporate buzzword; it is a nuanced, multi-layered approach that fundamentally influences all sectors of human endeavour—from the arts and sciences to sports and business. We’ve seen how partnerships like Lennon and McCartney have become legendary in the arts, transforming the music landscape. In science, collaborations like the International Space Station embody the pinnacle of what international teamwork can achieve. In the business world, the symbiosis between William Procter and James Gamble shows how small partnerships can turn into global giants.

As the work landscape shifts, with Adam Grant suggesting the office as a ‘crucible’ for collaboration even in the age of remote work, it becomes evident that we need to understand the complexities and subtleties involved more deeply. Scholars like Debra Mashek and Carol D. Goodheart offer valuable insights into the transformative power of collaboration, urging us to see it not as an optional asset but as a vital force for societal advancement. And in facing global challenges, whether it’s climate change or the complexities of emerging technologies like blockchain, collaboration scales from the individual to the global level, making it a non-negotiable asset for collective progress.


By actively embracing and nurturing the diverse forms of collaborative interaction, we do more than enrich our individual lives; we catalyse collective progress, paving the way for unforeseen possibilities and ground-breaking innovations. This makes it imperative to appreciate the concept of collaboration and invest in creating a culture, adopting processes, and establishing leadership that intentionally fosters collaborative engagement.


As we look toward the future, the question is no longer whether collaboration is beneficial but how we can cultivate it to unlock its full potential. This calls for proactive measures from individuals and organisations to move from mere understanding to actively promoting a collaborative ethos. Our collective progress depends on it.

More on Collaboration

Synergy Over Solo: Navigating the Collaborative Future of Business

The Power of Team Clusters: A People-Centric Approach to Innovation

Elite Team Cohesion

References:

Axelrod, R. (1984). The Evolution of Cooperation. Basic Books.
Chakkol, M., Finne, M., & Johnson, M. (2017). Understanding the psychology of collaboration: What makes an effective collaborator. Institute for Collaborative Working: March.
Hansen, M. (2009). Collaboration: How leaders avoid the traps, build common ground, and reap big results. Harvard Business Press.
Lipnack, J., & Stamps, J. (2008). Virtual teams: People working across boundaries with technology (3rd ed.). John Wiley & Sons.
Mashek, D. (2016). Collaboration: It’s Not What You Think. Psychology Today. February, 26.

Introduction

The UK government caused controversy recently by rolling back its commitments to net zero. Commenting on the backpedalling, Ford’s UK chair Lisa Brankin raised concerns, citing the sweeping changes Ford (and every company in the automotive sector) has been making in order to comply with the now-delayed ban on the sale of new petrol and diesel cars by 2030.

As well as the UK 2030 target being “a vital catalyst to accelerate Ford into a cleaner future,” Brankin stated that, “Our business needs three things from the UK government: ambition, commitment and consistency. A relaxation of 2030 would undermine all three” [1].

Essentially, what the government has done is undermine the trust of Ford and every other leading player in the automotive sector. They said they were committed to something and then showed they were not. They had the whole sector aligned and working at breakneck speed to overhaul old practices, only for those companies to discover they’d been wasting their time and money.

As the old saying goes, trust is hard to earn and easy to lose. And in business, a loss of trust is catastrophic.

The value of trust

Research published by Harvard Business Review found that workers at companies where trust is high report 106% greater energy in the office, 74% lower stress levels, 76% greater engagement, and 50% more productivity than their peers at low-trust businesses [2].

Meanwhile PwC reports that 91% of business executives say their ability to build and maintain trust improves the bottom line (including 50% who strongly agree), 58% of consumers say they have recommended a company they trust to friends and family, and 64% of employees say they recommended a company as a place to work because they trusted it [3].

Trust pays. It builds relationships – both internally and with clients – and only grows stronger with time. It produces happier, more productive employees and reaps dividends in profit. Evidently, then, it’s something worth investing in. But to do so, we first need to clarify what we mean by trust.

What is trust?

Writing for Forbes, John Hall, a motivational speaker and co-founder of the time and scheduling management app Calendar, says workplace trust relies on two fundamentals: “First, every team member is making their best effort to further the interests of the company; second, everyone assumes that fact about everyone else on the team unless they see evidence to the contrary” [4].

In lieu of trust falls, office ping pong or other more performative variants of workplace integration, trust boils down to something more fundamental: whether you are doing your best, and whether you give everyone else on your team the courtesy of assuming they’re doing the same.

This second part can prove especially difficult. We can control our own work ethic, not other people’s. And within almost all office environments there’s a sense of competitiveness: the rate and quality of your output exist in constant comparison with those of your colleagues. Who’s in the boss’s good books? Who’s getting the bonus? The promotion?

All these considerations can’t help but cultivate friction. Out of pride, or to build up our own sense of self-worth, we may like to think our colleagues aren’t working as hard or to as high a standard as we are. This is misguided. We need to bestow trust freely and unsparingly. When considering how best to decide who is trustworthy, Ernest Hemingway put the answer most succinctly: “The best way to find out if you can trust somebody is to trust them” [5].

It’s a leap of faith. That’s what trust is at its core. And until somebody gives you a reason not to trust them, your best bet is to give them the benefit of the doubt.

The science of trust

In an era marred by a seemingly endless carousel of corporate jargon and buzzwords, it’s possible to read about the notion of trust and think it’s more of the same – a benevolent, ultimately abstract notion that holds no quantifiable value but makes for a useful throwaway LinkedIn post or hastily churned out blog. But there is a science to trust, as demonstrated by Paul J. Zak, the founding director of the Center for Neuroeconomics Studies and a professor of economics, psychology, and management at Claremont Graduate University, and the CEO of Immersion Neuroscience.

Having seen in rodents that a rise in the brain’s oxytocin levels signified that another animal was safe to approach, Zak wondered if the same was true for humans. He conducted an experiment following the model of Nobel laureate in economics Vernon Smith [6]. In the experiment, a participant would choose an amount of money to send to a stranger via computer, knowing that the amount they chose to send would triple once they’d sent it. The recipient would then have the option of sharing this tripled amount with the sender or keeping all the cash for themselves. It was a trust exercise made of two parts. First, how much do you send? Second, do you share or steal?
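
For readers who want the mechanics spelled out, here is a minimal sketch in Python of the trust game’s payoffs as described above: the sender’s transfer triples in transit, and the receiver then chooses how much of the resulting pot to return. The endowment and the example choices are illustrative only, not figures from Zak’s experiment.

    ENDOWMENT = 10   # hypothetical amount the sender starts with
    MULTIPLIER = 3   # the transferred amount triples once sent, per the description above

    def play_round(amount_sent, share_fraction):
        """Return (sender_payoff, receiver_payoff) for one round of the game."""
        assert 0 <= amount_sent <= ENDOWMENT and 0 <= share_fraction <= 1
        pot = amount_sent * MULTIPLIER        # what the receiver ends up holding
        returned = pot * share_fraction       # the receiver's choice: share or keep
        return ENDOWMENT - amount_sent + returned, pot - returned

    print(play_round(10, 0.5))   # trust rewarded: both finish with 15
    print(play_round(10, 0.0))   # trust exploited: sender left with 0, receiver takes 30
    print(play_round(0, 0.5))    # no trust: sender keeps 10, receiver gets nothing

The asymmetry is the point of the design: the pair is collectively better off the more the sender trusts, but only the receiver’s trustworthiness decides how that surplus is split.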

To measure oxytocin levels during the exchange, Zak and his colleagues developed a protocol to draw blood from people’s arms before and immediately after they made decisions to trust others (if they were senders) or to be trustworthy (if they were receivers). The participants were not informed as to the purpose of the study (and even if they had been, they would still have had no control over the amount of oxytocin their bodies released).

They found that the more money people received (denoting greater trust on the part of senders), the more oxytocin their brains produced. The amount of oxytocin recipients produced then also predicted how trustworthy – that is, how likely to share the money – they would be. To prove that this was not just a result of the brain randomly generating chemicals, they performed further tests, administering doses of synthetic oxytocin into the brain through nasal spray and comparing participants who’d had a dose with those who’d had a placebo. They found that giving people 24 IU of synthetic oxytocin more than doubled the amount of money they sent to a stranger.

To ensure that the oxytocin spray did not cognitively impair the participants – and thus that their actions were genuinely born of trust rather than brain fog or psychosis – they performed other tests, this time replacing the money test with a gambling model. They found that increased oxytocin led to no rise in risk taking. In other words, the sole and genuine effect of increased oxytocin was to reduce the fear of trusting a stranger.

Over the following ten years, during which he conducted various further tests on oxytocin levels, Zak found that stress is a potent oxytocin inhibitor, as well as learning that oxytocin increases a person’s empathy, which of course is a vital tool for any act that requires collaboration.

How to develop trust

There is a gap between how executives see trust in business and how employees and customers see it. According to PwC, 84% of business executives think that customers highly trust their company, yet only 27% of customers say the same. Similarly, 79% of business executives say their employees trust the company, but only 65% of employees agree [7]. Clearly, then, the first step a higher-up can take to improve trust in the company is to be aware that it’s lacking.

Zak’s continued research shows that recognition and attainment of goals are the most proven ways of garnering trust. “The neuroscience shows that recognition has the largest effect on trust when it occurs immediately after a goal has been met, when it comes from peers, and when it’s tangible, unexpected, personal, and public” [8].

Setting goals that are difficult but achievable is crucial. The moderate stress of the task releases neurochemicals, including oxytocin and adrenocorticotropin, that intensify people’s focus and strengthen social connections. However, the challenges have to be achievable and have a clear endpoint. Research shows that vague goals cause employees to give up before they’ve even started.

Pivotal to trust rates within an organisation are messaging and communication. Internal trust networks are hard to maintain because the flow of communication is so much looser and less restrained than in a strictly employee-client relationship. Organisations send their workers multiple, often contradictory messages every day. Different departments work towards distinct, sometimes conflicting goals. Maintaining alignment to a clear, single message is extremely difficult and does not happen by accident.

Inconsistent messaging, inconsistent standards and false feedback all contribute to the sense of a company unworthy of trust. If one boss is asking workers to pull one way while another boss asks them to pull the other, employees will lose faith in management. This is even more true when it is just one boss flip-flopping on the direction of travel, unsure of their own wants.

Regarding standards, if a boss sets a line, verbal or written, on what counts as acceptable behaviour or the demanded standard of work, but then fails to live up to that standard themselves, trust will quickly dissipate. The same is true if they allow others to get away with clear or repeated breaches, especially if the boss is thought to be playing favourites. It is for managers to set the tone and take responsibility for their organisation. A leader’s words and actions are ascribed deep meaning by their employees and will be scrutinised heavily. Trust starts at the top and filters down.

Former Herman Miller CEO Max De Pree once said, “The first responsibility of a leader is to define reality. The last is to say thank you. In between the two, the leader must become a servant” [9]. That ability to humble oneself is pivotal to good management.

One way leaders can achieve this is to be willing to ask for help from their workers rather than simply telling others what to do. This builds trust and connection with employees and signals a secure leader, one far more trustworthy than a leader who pretends to have all the answers. As Steve Jobs said, “It doesn’t make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do” [10]. Ask questions and show a willingness to learn, and you can bet your employees will do the same in turn.

Trust today

Extending trust to employees is of greater importance today than ever before due to the prevalence of home and hybrid working. Employers cannot see and covertly monitor their employees through the day as they can in an office, and so must trust their teams to get the work done in a more autonomous fashion.

People can meet the same standards of in-office productivity from home on their own, less constrained schedule. The numbers back it up [11]. But still some companies are wary. We’ve all seen stories of organisations that want to remotely monitor the usage of their workers’ computers throughout the day to check that they are always at their desk during work hours. This draconian approach shows a total lack of trust. Who would want to work for a company that held them in such low regard? What kind of atmosphere does that cultivate? We talk a lot about company culture. Well, a culture that doesn’t trust its staff is unlikely to get the best out of them, and frankly doesn’t deserve to.

Workers will only grow more remote with time. The traditional 9-5 is unlikely to return. Employers need to bestow the requisite levels of trust to get their employees thriving no matter where they are.

Trust is money

Hall recommends we treat trust like we treat money: “Save it carefully, and spend it wisely. You may not be able to measure it like you can a bank balance, but sooner or later, you’ll see it there, too” [12].

Trust is pivotal to any team endeavour, and business is no different. Businesses need to cultivate trust with their consumers. To do so, they must first build it internally, starting from the top. That requires consistent messaging and open communication. It requires humility from leaders, not bullish overconfidence. It requires vulnerability and a willingness to trust someone until they prove you wrong, which inevitably some will. But for companies able to foster a truly trusting environment, one in which every worker gives their best and works under the assumption that each of their colleagues is doing the same, the rewards are enormous.

References

[1] https://www.reuters.com/business/autos-transportation/ford-uk-slams-potential-relaxation-plans-ban-new-petrol-diesel-car-sales-by-2030-2023-09-20/

[2] https://www.forbes.com/sites/johnhall/2019/12/20/why-trust-is-one-of-the-key-factors-in-a-successful-company/?sh=542c84c75957

[3] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html

[4] https://www.forbes.com/sites/johnhall/2019/12/20/why-trust-is-one-of-the-key-factors-in-a-successful-company/?sh=542c84c75957

[5] https://www.forbes.com/sites/iese/2023/01/24/why-building-trust-benefits-your-company/?sh=1ade9872571e

[6] https://hbr.org/2017/01/the-neuroscience-of-trust

[7] https://www.pwc.com/us/en/library/trust-in-business-survey-2023.html

[8] https://hbr.org/2017/01/the-neuroscience-of-trust

[9] https://hbr.org/2017/01/the-neuroscience-of-trust

[10] https://businessfitness.biz/hire-smart-people-and-let-them-do-their-jobs/

[11] https://www.businessnewsdaily.com/15259-working-from-home-more-productive.html

[12] https://www.forbes.com/sites/johnhall/2019/12/20/why-trust-is-one-of-the-key-factors-in-a-successful-company/?sh=542c84c75957

Introduction

As the labour market evolves, organisations have been reconsidering the importance and relevance of degree qualifications in their hiring practices. “Degree inflation”, a trend in which job descriptions increasingly required degrees even when the underlying roles hadn’t changed, accelerated in the aftermath of the 2008-2009 Great Recession. In recent years, however, the trend has begun to reverse, with degree requirements being reduced across numerous roles.

This shift is particularly noticeable in middle-skill positions, which require some post-secondary education or training but not necessarily a four-year degree. The reset is also evident, though to a lesser extent, in higher-skill positions. Two waves have driven this trend. First, a structural reset that started in 2017 and was characterised by a move away from degree requirements in favour of demonstrated skills and competencies. Second, a cyclical reset that began in 2020, prompted by the Covid-19 pandemic, and involved employers temporarily relaxing degree requirements to find skilled workers during the health crisis.

Impact on equality

In the case of Ireland, the shift away from degree requirements has been particularly impactful in increasing female participation in the workforce. According to the latest Labour Market Pulse published by IDA Ireland in partnership with Microsoft and LinkedIn, skills-based hiring and flexible working conditions are integral to increasing female participation in the Irish labour market. The adoption of a skills-first hiring approach has the potential to increase the overall talent pool in Ireland more than six-fold, with the increase 20% larger for women than for men in traditionally male-dominated occupations.

Hard skills vs soft skills

Despite the promising trends, it’s important to note that the degree inflation reset is a work in progress. A significant percentage of job descriptions still list degree requirements, effectively walling out a vast number of potential employees from the candidate pool. Additionally, while many companies have announced the removal of degree requirements, they often still exhibit a higher-than-average demand for college degrees in practice. This suggests that, while hard skills can be easily confirmed, degrees are still seen as a proxy for soft skills, which are harder to assess.

However, the shift away from degree-based hiring compels companies to think more carefully about the skills they need. More explicit descriptions of desired capabilities in job postings are increasing awareness among applicants about the importance of developing soft skills. This could influence skills providers to consider how they can update their curricula to include these skills.

Diversified talent pool

The elimination of inflated degree requirements is a critical step towards achieving equity in the labour market. Companies should reassess the assumptions underlying their recruitment strategies, reconsidering the use of blunt and outdated instruments in favour of more nuanced, skills-focused approaches. This shift is already opening attractive career pathways for workers who have traditionally been overlooked because they lack a four-year degree. The potential result is a win-win: greater equity for job seekers and a more robust, diversified talent pool for companies to draw from.

Skills-first approach

This trend is particularly beneficial in the Irish context, where the government has set ambitious targets for gender equality and equal representation in leadership. A skills-first approach could be instrumental in activating the skills of underrepresented groups, including women, people with disabilities, and those without third-level education. Ireland can pave the way for a more inclusive, equitable future by eliminating barriers to well-paying jobs.

If you wish to introduce skill-based initiatives, it is critical to contextualise these ideas within your company’s unique circumstances, set clear objectives, and develop strategies for implementation. Here are some actionable insights based on the above points:

  1. Develop a Learning & Development Strategy: Understand your company’s current capabilities and identify the areas where there’s a skill gap. Invest in the creation of learning and development programs that target these gaps. These could be in-house training, online courses, or educational partnerships.
  2. Empower Employees to Shape Their Career Path: Create platforms or mechanisms that allow employees to share their interests and career goals. This could be an annual employee survey, open discussions, or a tool integrated into your HR system.
  3. Create Cross-functional Opportunities: Make it a point to allow employees to participate in projects or tasks outside their usual scope. This will not only broaden their skills but also give them a better understanding of overall company operations.
  4. Incentivise Learning: Make learning an integral part of your company’s culture. Encourage employees to take time out of their work schedule to engage in training and learning activities. Offer rewards or recognition for those who actively participate in these programs or demonstrate new skills.
  5. Revamp Your Hiring Process: Transition from a credentials-based hiring approach to a skills-based one. Re-evaluate your job descriptions to focus more on the skills required to perform the job than on academic or professional credentials.
  6. Introduce Skills Assessments: Implement mechanisms to objectively measure a candidate’s skills during the hiring process. These could include technical assessments, practical exercises, or situational judgement tests.
  7. Promote Lifelong Learning During Recruitment: During interviews, discuss the company’s learning and development programs and the opportunities for career growth within the organisation. This can make your company more attractive to potential hires.