A series of fire escapes on the side of a building, an abstract representation of career ladders.
Feature

AI-Powered Career Development Brings Great Potential. It Also Brings Ethical Risks

By Lisa Rabasca Roepe
AI's potential to democratize and scale learning and development is evident. What may not be as evident are the ethical concerns it raises.

AI is increasingly being used to help employees with career development, doing everything from recommending skills training that supports their career aspirations, to matching them with mentors, to helping onboard new hires.

“The technologies that are coming online have enormous potential for upskilling in education,” said Prasanna Tambe, associate professor of Operations, Information and Decisions at The Wharton School. Tambe believes AI will help democratize education and reskilling in a way that employers haven't seen in decades.

AI can help organizations pinpoint skill gaps and provide personalized upskilling opportunities for employees based on their career path interests, said Carina Cortez, chief people officer at Cornerstone, provider of an AI-powered workforce agility platform. "Companies should be using these technologies to offer reskilling to people who haven't necessarily had access historically to channels to learn and skill," Tambe said.

The Ethical Risks of AI Upskilling

However, despite the potential upside of using AI to further career development, employers should be aware of potential ethical risks, experts say.

The European Union recently passed the EU AI Act, which includes stipulations specific to education. The act classifies AI systems used to determine access, admission or assignment to educational and vocational training as “high risk” and requires stringent risk assessments and procedures.

With no national U.S. legislation or policy regarding AI and adult education, Cortez recommends that companies write their own ethical AI-use statements to establish guardrails against potential negative impacts, including bias, unequal access to the technology and data privacy issues.

Bias is also a concern for Chris McClean, global lead of digital ethics at Avanade. If a company is using large language models to generate content, he said, that content may be biased and potentially inappropriate.

Companies should also pay attention to who has access to AI-training tools, Tambe told Reworked, because the people at the top of the organization will benefit the most from AI-enabled upskilling: the work they do translates much better to these tools. Many languages, cultures, ethnicities and types of workers are not as well represented in the digital archives a language model learns from, he continued. As a result, these tools' ability to upskill people across countries, across cultures and even across different jobs within the same firm will be unequal, because some employees' jobs generate huge amounts of digital data in languages that are well represented in these systems.

Related Article: Who's Responsible for Responsible AI?

How to Fix Potential Ethical Issues

Companies can avoid many of the potential ethical issues related to AI use in career development by doing the following:

1. Monitor Who Uses and Has Access to AI Upskilling

McClean recommends monitoring who is able to access the system versus who is actually using it. For instance, are employees completing the training courses? If not, investigate why, he said. Maybe the training doesn't work on smartphones, or it only works on certain browsers, or the content is biased and offensive to employees.

Include a human review of any information AI provides to ensure accuracy, said Dieter Veldsman, chief scientist of HR and OD at the Academy to Innovate HR. “Individual people should be reviewing the content for things like accuracy and appropriateness so there are several layers of control,” McClean said.

In addition, keep track of who has access to the training, because the employees who complete it are the ones most likely to be up for promotions or the best project assignments, McClean said. Any bias in access will have a potentially negative impact on an employee’s financial opportunity, mental health, and physical health and safety.

2. Be Transparent About AI Use

Tell employees about how the company is using AI and invite them to provide feedback on how it's working, McClean said. Make it clear to employees when they are dealing with AI and when they are working with a human — and allow employees to opt in to AI use, suggested Veldsman.

Also make the benefits of using AI in career development clear to them. “You might say, 'This training is more personalized, it's more accurate, it's easier and it's more accessible,'” McClean added.

Related Article: How to Get Started With an Internal Talent Marketplace

3. Consider Employee Privacy

AI has the ability to quickly gather and analyze employee skills, identify strengths and help align career goals to company needs, Cortez said. It can help employees create a strategy for career development that supports their career path. But to do that, employees need to provide data, which raises a potential privacy issue.

There is growing tension between workers and employers, particularly in the U.S., around who owns employee data, Tambe said. Data privacy laws in Europe differ from those in the U.S., and California recently enacted its own data privacy law to regulate what information businesses can collect.

“When you think about someone working through a reskilling program, they are providing a lot of data back to the organization,” Tambe said. The employee is revealing a lot about themselves in terms of knowledge and abilities. The data is also a valuable resource in terms of training those large language models, he continued.


Revealing personal data related to career development raises several questions, Veldsman said. For instance, should employees share their career preferences and ambitions with their employer if that information is being put into an algorithm? The opportunities and career path being offered to employees are tied to the organization where they work. What if AI is recommending a career that is outside the organization? Should employers flag that for employees, he asked.

There are also questions about employee mental health. Everyone is struggling to keep up with AI, Tambe said. "You're automating all these upskilling systems in a way that employees might feel pressured to constantly be reskilling,” he said. "That seems like something where the pace is a concern and the stress that that might invoke.”

About the Author
Lisa Rabasca Roepe

Lisa Rabasca Roepe is a Washington, D.C.-based freelance writer with nearly a decade of experience writing about workplace culture and leadership. Her work has appeared in The New York Times, Fast Company, Wired, the Christian Science Monitor, Marketplace and HR Magazine.

Main image: Avery Lewis | Unsplash