A Research Paper By Joanna Poplawska, Leadership Coach, UNITED KINGDOM
How Would We Define AI Coaching?
In recent years, coaching, like every other area of our lives, has been affected by digital technologies, including artificial intelligence (AI). The recent arrival of ChatGPT has intensified discussions about the dangers and possibilities of technology-driven coaching. It seems a good time for coaching professional bodies, as well as for individual coaches, to pause and reflect on what role we want AI to play in the coaching process.
There are two main digital trends happening in coaching. The first is the use of AI-powered assessments and coaching chatbots. An example of the former is BetterUp’s “Whole Person”, a digital tool that first assesses “where one stands” in the context of career coaching, collects feedback from the client’s colleagues, and then introduces the client to a human coach. Coaching chatbots, in turn, are question-and-answer exchanges between a human being and a program. After completing a short assessment, a client can pick a goal and schedule short suggestions for what to practice and how. Push notifications then suggest fresh ways the client can focus on certain aspects throughout their day, and each e-coaching statement targets the specific needs of each client.
The second development is AI coaching. How would we define AI coaching? I would like to point out that there is no single, commonly shared definition of AI, despite the term now being widely used, while coaching, by contrast, has a definition that is almost universally accepted. The Oxford Dictionary defines Artificial Intelligence (AI) as the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
McCarthy, one of the founding fathers of AI, described it this way: “Every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it” (McCarthy et al., 2006).
“Machines acting in ways that seem intelligent”, “A branch of computer science that studies the properties of intelligence by synthesizing intelligence” and “Intelligence that is not biological” are examples of other definitions used in the world of academia (1).
When we look at the definition of coaching, probably the most widely used is John Whitmore’s, which describes coaching as “unlocking a person’s potential to maximize their own performance. It is helping them to learn rather than teaching them - a facilitation approach.” (2)
AI coaching can therefore be defined as coaching through an algorithm: a machine-assisted process that helps clients unlock their potential and maximise their own performance.
An example of AI coaching currently available on the market is VICI. VICI’s coaching focuses on assisting people with goal achievement. It helps users to identify goals, specify actions to reach those goals, monitor the progress of both goals and activities, and adjust them if necessary. VICI also helps users to distinguish between proximal and distal goals and keeps track of both. It uses two types of coaching conversations: initial goal-setting and progress-tracking. The initial goal-setting conversation follows the GROW coaching model.
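VICI’s own implementation is, of course, proprietary, but a minimal sketch can illustrate what a rules-based, GROW-style goal-setting exchange looks like in principle. Everything below (the prompts, function names, and summary step) is invented for illustration and is not taken from VICI or any other product.

```python
# Minimal illustrative sketch of a GROW-style goal-setting exchange.
# All prompts and names are hypothetical; this is not VICI's code.

GROW_PROMPTS = {
    "Goal":    "What goal would you like to work towards?",
    "Reality": "Where are you now in relation to that goal?",
    "Options": "What options could move you closer to it?",
    "Will":    "What will you do next, and by when?",
}

def goal_setting_session() -> dict:
    """Walk the client through one GROW cycle and collect their answers."""
    answers = {}
    for stage, prompt in GROW_PROMPTS.items():
        answers[stage] = input(f"{prompt}\n> ").strip()
    return answers

if __name__ == "__main__":
    plan = goal_setting_session()
    print("\nSummary of your plan:")
    for stage, answer in plan.items():
        print(f"  {stage}: {answer}")
```

A progress-tracking conversation could follow the same pattern, replaying the stored “Will” commitments and asking what has or has not moved since the last check-in.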
There is no question, though, that the almost $3 billion coaching market is of interest to technology companies and potentially to clients. The arrival of ChatGPT means that the nature of coaching conversations with chatbots has changed: from rules-based, short question-and-answer sessions to far more context-relevant comments and questions generated by the technologically impressive ChatGPT.
The Impact of AI on Coaching
There is no doubt that the pace of digital transformation and the impact of AI on coaching will increase further in the coming years, so how concerned should we, human coaches, be? Will we soon have to compete with AI in finding and serving clients?
According to Serife Tekin, a professor of philosophy at the University of Texas, “The hype and promise are way ahead of the research that shows its effectiveness. Algorithms are still not at a point where they can mimic the complexities of human emotion, let alone emulate empathetic care.” (3)
Intuitively, I think most of us would agree with that statement, as we tend to see the relationship with a human coach as more valuable to the client and more capable of navigating complex human emotions and thought processes. Some initial research into the nature of relationships between AI and humans, however, indicates that this might not be the case.
According to Carolin Grassman and Carsten C. Schermuly (4), initial research into AI-led therapy showed that clients quickly established a bond with their virtual agent, and that the bond grew stronger over time. Participants who used the therapy chatbot “Woebot” reported that the bot felt like “a real person that showed concern”. Some clients even preferred AI to a human therapist.
Woebot listens to the user’s inputs, captured through text-based messaging, to understand whether they want to work on a particular problem. The research indicated that, while AI lacks human intelligence and emotions, positive therapy outcomes are possible even in a practice that has traditionally relied on a strong human connection.
This might also potentially be the case for coaching. Although results from AI therapy cannot be transferred automatically to AI coaching, further research is clearly needed to better understand the nature of the relationship between clients and AI. The question of how and why clients would form a meaningful coaching relationship with AI needs to be answered.
One can see the huge potential of AI coaching: the costs of delivery are lower and scaling up is easy, so a much wider audience could benefit from it. The average fee for executive coaching in the UK currently seems, based on my research, to be around £250 per hour. Employing AI coaching could significantly reduce the cost of coaching and democratize the service, allowing people who would not normally be able to afford it to benefit. AI coaching also offers something human coaches always struggle with: constant availability. A ‘coach on demand’ can indeed be a tempting concept for many busy clients.
According to Nicky Terblanche, Joanna Molyn, Erik de Haan, and Viktor O. Nilsson (5), there are a few possible scenarios of future developments:
- AI coaching could be scaled to democratize coaching;
- AI coaching could grow the demand for human coaching;
- AI could replace human coaches who use simplistic, “model-based” coaching approaches.
The last point refers to the concept of levels of coach maturity described by Megginson and Clutterbuck (6).
At the lowest maturity level, coaches follow a “models-based” approach where they are typically more interested in following a rigid process than in exploring the complexities of the client’s challenges. They are “doing coaching to the client”. This type of coaching is, according to Megginson and Clutterbuck, typical of inexperienced coaches who rely on the coaching techniques they have been trained in.
On the second level, “process-based” coaches follow a slightly more flexible approach using an expanded but limited set of tools and techniques. They are “doing coaching with the client”.
On the third level, “philosophy-based” coaches apply a broader mindset to the client’s situation and practice reflection before and after coaching sessions.
The top, fourth level is called “systemic eclectic” and is acquired through experience. It allows a coach to exhibit a sensitive, intelligent approach to the client’s situation and utilize the most appropriate approach in any given circumstance.
While AI is currently incapable of navigating complex human emotions, values, and beliefs, the fact that it can perform specific tasks at a level comparable with acceptable human competency strongly suggests that the lowest level of coach maturity (models-based) is already within the ability of AI. That means that inexperienced, model-based coaches might not be able to compete with AI.
The use of AI, not only in coaching but in general, raises some serious ethical concerns. The speed of technological development seems to be outpacing our ability to create an ethical and legal framework for AI. There are some fundamental questions we need to answer: how does the gathering, storing, and analysing of highly personal and sensitive data fit into confidentiality requirements? Who will have access to the data, and for what purpose? What data protection will we be able to enforce, and how?
The quality of data used in building AI-related technologies is another compliance issue, because an algorithm in its application can only be as good as the data it uses. Existing AI bias may lead to poor assessments in coaching and, in turn, to discrimination against specific demographic groups: if, for example, a voice recognition system is mainly trained on male voices, it may not perform accurately when used by women. This has been identified as one of the problems in data-supported decision-making.
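One simple way to make such a bias visible is to compare a system’s accuracy across demographic groups rather than reporting a single overall figure. The sketch below is purely illustrative: the data, group labels, and numbers are invented, and a real audit would use proper evaluation sets and fairness metrics.

```python
# Illustrative sketch: per-group accuracy as a basic bias check.
# The records below are invented example data, not real evaluation results.

from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {group: correct[group] / total[group] for group in total}

# Hypothetical results from a voice-recognition component trained mostly on male voices.
results = [
    ("male", "yes", "yes"), ("male", "no", "no"), ("male", "yes", "yes"),
    ("female", "no", "yes"), ("female", "yes", "yes"), ("female", "no", "yes"),
]

print(accuracy_by_group(results))
# {'male': 1.0, 'female': 0.33...} -> a gap like this would warrant investigation
```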
Prevention of harm, lack of guidance on developing ethical AI, respect for and protection of confidentiality and privacy, transparency in the use of algorithms, bias, and data ownership: these ethical challenges must be addressed urgently by the coaching industry. Currently, there is no clear regulatory approval process for AI coaching tools before they go to market. Unfortunately, there is also still no clear regulatory process for the coaching profession. Coaching remains an unregulated profession: the level of professionalism amongst coaches varies, as do levels of training and experience.
Whilst being aware of the risks and of the need for a wide discussion of the framework for AI coaching, we should also be aware that we cannot stop the traction AI is gaining, and we need to think about how to utilize AI in our own coaching. The coaching process is adaptable and open to including artificial intelligence capabilities. An interesting example of the practical and positive use of AI in coaching is its use in communication coaching (7). The AI assistant transcribes sessions, measures the pace of speech, eye contact, smiles, and pauses, and gives the coach insights into filler words and non-inclusive language. AI can also be used to speed up the onboarding process, help us with research, and assist in creating marketing content such as blogs or social media posts.
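Two of the metrics mentioned above, speaking pace and filler-word use, are simple enough to sketch directly from a plain-text transcript. The snippet below is only an illustration: the filler-word list, sample sentence, and duration are invented, and commercial tools work on far richer audio and video signals.

```python
# Illustrative sketch: words-per-minute and filler-word counts from a transcript.
# Filler list, sample text, and duration are invented for illustration only.

import re
from collections import Counter

FILLER_WORDS = {"um", "uh", "like", "you know", "basically", "actually"}

def words_per_minute(transcript: str, duration_minutes: float) -> float:
    words = re.findall(r"[A-Za-z']+", transcript)
    return len(words) / duration_minutes

def filler_word_counts(transcript: str) -> Counter:
    lowered = transcript.lower()
    return Counter({f: len(re.findall(rf"\b{re.escape(f)}\b", lowered))
                    for f in FILLER_WORDS})

sample = "So, um, basically I want to, like, improve how I open meetings."
print(words_per_minute(sample, duration_minutes=0.1))  # rough pace estimate
print(filler_word_counts(sample))                      # um, like, basically each appear once
```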
There is also the option of introducing AI in real time into the coaching conversation, which implies a “coach-AI partnership”. The AI can be an observer or supervisor of the session, a source of information, an additional sounding board, or a supportive resource. Monitoring the tone of voice and body language, and analyzing linguistic and conversational patterns, can create a hugely valuable pool of information. The AI can bridge gaps between a human coach and a client.
That type of human coach-AI alliance sounds exciting and promising, but it needs to be defined jointly by the client and the coach, as the presence of AI might affect the coaching relationship and the trust that needs to be built. AI can also be an excellent monitoring tool for coaches, providing supporting evidence and insights for reviewing coaching sessions.
In summary, the truth is that the AI revolution is already happening, and the coaching industry remains rather silent in the debate about the impact it will bring. Whatever we think about the use of AI in coaching, we should, in my opinion, call on the coaching community to play a significant role in designing the ethical and legal framework for the use of AI in coaching.
The crucial challenge in creating a universal approach is the data protection issue as different countries have different data protection legislation. That implies that coaches must have a good understanding of the standards required in the legal space they operate in. Coaches would also benefit from better education related to the use of AI and current technology developments affecting their jobs.
At the very heart of the debate about the digital future of coaching is simply the fact that human coaches should be role models in their industry and beyond.
In her article ‘AI and digital technology in coaching’ (8), Edith Cohen cites David Peterson, Global Director of Leadership and Coaching at Google, whose call to action for human coaches perfectly summarises the conclusion of my research:
- Do transformational development, not transactional development;
- See the really big picture;
- Get better at understanding art and science;
- Embrace and leverage new technologies;
- Be role models in adaptability.
References
1. www.minddata.org, ‘What is AI’ by Brian Ka Chan
2. ‘Coaching defined and explored’ by Jonathan Passmore, in ‘The Coaches’ Handbook’, 2021
3. www.npr.org, ‘Therapy by chatbot - the promise and challenges in using AI for mental health’ by Yuki Noguchi
4. ‘Coaching With Artificial Intelligence: Concepts and Capabilities’ by Carolin Grassman and Carsten C. Schermuly, https://journals.sagepub.com/doi/pdf/10.1177/1534484320982891
5. ‘Comparing artificial intelligence and human coaching goal attainment efficacy’ by Nicky Terblanche, Joanna Molyn, Erik de Haan, and Viktor O. Nilsson, 2021, https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0270255
6. ‘Further Techniques for Coaching and Mentoring’ by Megginson D. and Clutterbuck D., 2010
7. https://app.yoodli.ai/blog/how-im-using-generative-ai-to-10x-my-executive-coaching-business
8. ‘The Coaches’ Handbook’ edited by Jonathan Passmore, 2021