Personalization has long been considered the holy grail of the employee experience. The promise of delivering experiences tailored to individual preferences or designated personas, while scaling that impact across the organization, has appealed to many leaders in the EX domain. However, true personalization has often fallen short due to technological limitations, limited data availability, and the complexity of catering to diverse employee needs.
Generative AI has already demonstrated its potential to enhance employee experiences through personalization, particularly in HR service centers through chatbots and coaching applications. Yet not all implementations have delivered the expected results. Organizations have faced backlash from employees when AI has been incorporated into high-risk practices such as recruitment, when chatbots have not behaved as expected, or when employees were unaware they were engaging with an AI system.
What can we learn from these cautionary tales? And what should we consider when incorporating AI technologies into EX?
1. AI Needs to Feel Human But Not BE Human
When using AI to deliver personalized experiences, the user interface is critical in creating a "human-like" experience. For users to feel comfortable and build trust, they need to feel as though they're engaging with a human, even though they know they're engaging with an AI system.
Human-like interactions can help users feel more at ease and encourage them to interact more freely with AI. However, when AI tries too hard to mimic a human, it can negatively impact the user experience, breaking down trust between the employee and the organization.
While AI needs to have human-like qualities in its conversation style, transparency is just as important. Users should always be aware that they're interacting with AI, not a human. Balancing these two aspects — human-like interaction and clear transparency — is essential for a positive user experience.
Related Article: Why We Need to Improve the Employee Experience of Our AI Programs
2. AI Is Dependent on the User's Skill to Be Effective
The effectiveness of GenAI in delivering personalized experiences relies heavily on how users engage with it. If users aren't sure how to interact with the tool, they will get frustrated by poor-quality responses and lose interest. While prompt engineering is an essential emerging skill, the technology itself should also be designed to enhance user engagement.
AI should be capable of prompting users to articulate their queries better, for example by offering sample prompts to help them get started or by asking clarifying questions that nudge users to be more specific.
For example, some employees will refer to a benefit statement, while others will refer to a reward statement. AI should ask the user clarifying questions to contextualize the answer appropriately, increasing its chances of a meaningful response.
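To make this concrete, here is a minimal sketch of how such a clarifying step might sit in front of the model. It is purely illustrative: the starter prompts, the "statement" keyword check and the answer_with_llm() placeholder are assumptions, not any specific vendor's API.

```python
# Illustrative sketch only: nudging users toward clearer prompts before the model answers.
# The starter prompts, keyword check and answer_with_llm() placeholder are assumptions.

STARTER_PROMPTS = [
    "Show me my total reward statement for this year",
    "What leave am I entitled to as a part-time employee?",
    "How do I update my pension contributions?",
]


def handle_query(user_query: str) -> str:
    """Return starter suggestions, a clarifying question, or a model answer."""
    query = user_query.strip().lower()

    # Very short or empty queries: offer examples to get the user started.
    if len(query.split()) < 3:
        return "Here are some things you can ask me:\n- " + "\n- ".join(STARTER_PROMPTS)

    # "Statement" on its own is ambiguous; ask which one the user means.
    if "statement" in query and not any(word in query for word in ("benefit", "reward")):
        return "Do you mean your benefit statement or your total reward statement?"

    # Otherwise pass the query on to the underlying model.
    return answer_with_llm(user_query)


def answer_with_llm(user_query: str) -> str:
    # Placeholder for the actual model call, grounded in approved HR content.
    return f"(model answer for: {user_query})"
```

Even this small amount of scaffolding shifts the experience from "guess the right prompt" to a guided conversation.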
3. AI Pretends to Know the Answers, So You Need to Tell It Where to Stop
Hallucination in AI refers to instances where the system generates outputs not based on factual information, often combining unrelated data points in a way that leads to inaccurate conclusions. AI systems, by design, attempt to respond to every query, even when those queries fall outside their expertise, resulting in responses that may be incorrect.
For instance, if asked how a V8 engine works using an HR-focused dataset, the AI would still attempt an answer, searching for patterns in the available information — even though it lacks the relevant knowledge and data to answer the query.
Organizations must implement controls to detect and manage unexpected responses and set clear boundaries for the topics the AI should handle. Educate users on the model's purpose and focus, and train the model to recognize its limitations and identify when a question falls outside its scope. Where the AI cannot provide a meaningful response, the query should be escalated to a human counterpart to ensure the employee feels supported.
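As a rough illustration of what such controls could look like in practice, the sketch below routes every query through a scope check before the model is allowed to answer, and escalates anything out of scope or low confidence to a person. The topic list, confidence threshold and the classify_topic() and escalate_to_hr() helpers are hypothetical placeholders, not a reference to any particular platform.

```python
# Illustrative sketch only: keep the assistant inside its HR scope and escalate
# anything it cannot answer confidently. The topic list, threshold and helper
# functions below are hypothetical placeholders.

IN_SCOPE_TOPICS = {"benefits", "leave", "payroll", "onboarding", "performance"}
CONFIDENCE_THRESHOLD = 0.7  # below this, the model is not allowed to guess


def route_query(user_query: str) -> str:
    topic, confidence = classify_topic(user_query)

    # Out-of-scope or low-confidence queries are never answered by the model.
    if topic not in IN_SCOPE_TOPICS or confidence < CONFIDENCE_THRESHOLD:
        ticket_id = escalate_to_hr(user_query)
        return ("I can only help with HR topics such as benefits, leave and payroll. "
                f"I've passed your question to the HR team (ticket {ticket_id}).")

    return answer_with_llm(user_query)


def classify_topic(query: str) -> tuple[str, float]:
    # Placeholder: in practice this could be an intent classifier or a separate model call.
    for topic in IN_SCOPE_TOPICS:
        if topic in query.lower():
            return topic, 0.9
    return "unknown", 0.2


def escalate_to_hr(query: str) -> str:
    # Placeholder: in practice this would raise a ticket in the HR service desk.
    return "HR-0001"


def answer_with_llm(query: str) -> str:
    # Placeholder for the actual model call, restricted to approved HR content.
    return f"(model answer for: {query})"
```

With this kind of routing in place, the V8 engine question above would never reach the model; it would be politely declined and handed to a person.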
Related Article: Generative AI Results Should Come With a Warning Label
4. A Great Experience Keeps the Human in the Loop
When incorporating AI technology into EX, it can be tempting to remove the human touch altogether to create immediate efficiencies. AI should always be seen as one player in delivering EX, a complementary party that supports the experience and makes it more seamless or meaningful. It is also important to acknowledge where humans need to remain in the loop and build the AI interaction around that.
For example, replacing the manager with AI in a performance conversation isn't a good idea. Performance reviews are a crucial employee experience moment; AI should be used to help collate data, provide managers and employees with talking points and prepare them for a meaningful conversation, with humans taking it from there.
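One way to picture this division of labor is a small helper that drafts a briefing for the manager rather than holding the conversation itself. The data fields and the wording of the prompt below are assumptions for illustration only.

```python
# Illustrative sketch only: AI prepares material for the review; the manager and
# employee hold the conversation. Data fields and prompt wording are assumptions.

from textwrap import dedent


def draft_talking_points(employee: str, goals: list[str], feedback: list[str]) -> str:
    """Ask the model for a draft brief that the manager reviews, edits and owns."""
    prompt = dedent(f"""
        Summarize the following into three to five talking points for a performance
        conversation with {employee}. Flag open questions for the manager to explore.
        Do not write conclusions or ratings.

        Goals: {goals}
        Feedback collected: {feedback}
    """).strip()

    # The draft goes to the manager for review, never directly to the employee.
    return answer_with_llm(prompt)


def answer_with_llm(prompt: str) -> str:
    # Placeholder for the actual model call.
    return f"(draft talking points based on: {prompt[:80]}...)"
```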
Key Takeaways for Enhancing Employee Experience
GenAI can be a game changer for EX. However, we need to approach these opportunities intentionally. AI is part of a process that needs to remain human for EX to thrive. Instead of replacing the human touch, it should enhance and complement the experience in a transparent and engaging manner.