Should AI be considered an employee? This company thinks so

July 21, 2024


Most workers are worried that AI will replace them. And while many are now using AI at work, few directly refer to the technology as a “coworker” or an “employee.” But the CEO of people management platform Lattice thinks we should change that.


On June 9, Lattice announced that it would become the first company to give AI tools their own employee records. The software unicorn will add the AI tools the organization uses as “digital workers” to its human resources information system.


That means the company will treat these “digital workers” much as it treats human employees. AI, argues Lattice CEO Sarah Franklin, should be held to the same codes of conduct as human workers; be onboarded the way human workers are; be subject to performance management, with explicit goals and regular feedback; and appear on a team’s org chart with clear managers and coworkers.


In a letter to employees, Franklin conceded that “the idea of AI employees, or ‘digital workers,’ is not an entirely comfortable one.” Still, she suggested that “the inevitable arrival of these digital workers raises important questions about their integration, measurement, and impact on human jobs.”


To keep up with advances in AI and the ways the technology is transforming work, leaders like Franklin are searching for a term that accurately describes the professional roles AI holds, and for a way to give AI guardrails.


“Personified technology is being marketed as for hire,” Franklin tells Fast Company. “And we are not here saying that we are for or against AI, we are just recognizing that it is happening.”


Names, personas, and pronouns


To be sure, there are numerous jokes to be made about the idea of calling an AI tool an “employee.” If you fire an AI employee, does that tool qualify for COBRA? Or, if an AI employee shows signs of bias, as many are known to do, should the tool get sent to HR?


But while it may seem outlandish to consider an AI tool an employee, many AI tools are intentionally designed with human-like personas: cognition.ai’s software engineering tool Devin, Qualified’s sales agent Piper, and Salesforce’s service agent Einstein. And, of course, there is IBM’s AI persona, Watson.


“The tech industry has started evolving AI to have a persona. People think of AI as being just a bot, which is really a predefined set of if, then, else rules—like a decision tree—[where] you know exactly what the outcome is going to be,” says Franklin. “But this is very different from what we’re seeing with these personified AI workers that don’t just have personalities, and names, and pronouns, but they have the ability to reason. And the question is, how do you make sure that they are incorporating your values and that they are held accountable to their outcomes?”


The rights of human workers


Franklin sees treating AI tools as digital workers as a preliminary protection against a “dystopian future” in which leaders blindly replace workers without holding the new technology to the same expectations as humans. She cites research suggesting that spending on digital workers will grow from $4 billion to $20 billion over the next three years.


“As a CEO, and whether you’re going to invest headcount dollars into a digital employee, you should be held accountable for what you’re doing. And those employees need to be performing,” she says. “Because you may hire a digital sales agent; they may bring in a lot of leads, but not a lot of good leads that lead to revenue. And so you need to then have a very honest conversation about if [AI] is a good employee.”


Karla Walter, a senior fellow at the Center for American Progress, stresses that human workers’ rights need to be prioritized.


“Obviously companies might be tempted by the idea that AI could be employees, and thus allow them to replace employees, if it helped their bottom line. As we saw with gig work, companies will go through extreme lengths to keep people from being considered employees to save themselves the money and obligation,” says Walter in a statement sent to Fast Company. “It’s workers who need more power in the system to bargain to ensure that technology is deployed in ways that benefits working people.”


Technical definitions


AI researcher Chris Witcher worked for IBM for over 40 years and spent many years working on Watson. He says that AI is, at its root, a powerful “knowledge technology” that can independently complete much of the work that knowledge workers have historically done. 


But Witcher argues that while AI tools should be onboarded, vetted, trained, and given career growth opportunities similar to human workers, there are still distinct differences between human and digital workers. 


In some ways, “AI is great employees,” says Witcher. “They don’t take coffee breaks, they work 24/7, you don’t have to give them benefits but they can retain knowledge and grow their knowledge.” 


However, when it comes to decisions involving variables like human emotion, the importance of empathy, and the value of occasionally making technically irrational choices, AI fails to live up to the capabilities of human workers.


“AI is an unbelievably powerful technology, but it is not human. The human brain is so complex, it’s not understood. And it’s going to be a long time, if ever, that systems can embody or be inspired, not mimic, human characteristics like empathy,” he says. “Deep learning AI, being a knowledge technology, doesn’t mimic human perception—it’s inspired by it. Judgment calls which embody empathy, understanding, compassion, those characteristics don’t exist in AI knowledge workers.” 


Ultimately, Witcher understands the desire to find a term that captures the tremendous capabilities of AI and the ways in which AI can be similar to human workers. But he maintains there should be a more specific word to describe an AI tool that does the work of humans. 


When I ask him what this term could be, he doesn’t have an immediate answer. “You’re going to have to invent the name,” he says. “Don’t ever let an engineer name anything.”

Fast Company
