Digital shadows

Advanced data analytics is making it much easier for HR departments across the world to get a sense of how workers feel about their jobs, their colleagues and superiors.

Rather than have HR personnel pore over thousands of pages of employee surveys, software programs can analyze them almost instantly and provide insights into the language used by workers that may not be apparent to even a seasoned HR professional.
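As a rough illustration of the kind of analysis involved, the sketch below scores free-text survey comments against a tiny hand-built sentiment lexicon and surfaces recurring negative themes. The lexicon, function names, and scoring rule are invented for illustration; commercial tools such as Xander rely on trained language models, not word lists like this.

```python
from collections import Counter

# A tiny, hand-built sentiment lexicon -- purely illustrative; real
# products use trained language models rather than word lists.
NEGATIVE = {"overworked", "ignored", "unfair", "stressful", "micromanaged"}
POSITIVE = {"supported", "flexible", "respected", "fair", "engaged"}

def score_comment(comment: str) -> int:
    """Crude sentiment score: +1 per positive word, -1 per negative word."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def summarize(comments: list[str]):
    """Average sentiment across comments plus the most common negative terms."""
    scores = [score_comment(c) for c in comments]
    negatives = Counter(
        w for c in comments for w in c.lower().split() if w in NEGATIVE
    )
    return sum(scores) / len(scores), negatives.most_common(3)

surveys = [
    "I feel overworked and ignored by my manager",
    "Scheduling is flexible and I feel supported here",
    "Pay decisions seem unfair and deadlines are stressful",
]
avg, themes = summarize(surveys)
print(f"average sentiment: {avg:.2f}; top negative themes: {themes}")
```

Even this crude word-counting approach hints at why software can surface patterns across thousands of responses faster than any human reader could.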

One such tool is Xander, a product made by Ultimate Software Group. A recent Wall Street Journal article illustrated how SPS Companies, a steel processor, used the tool to analyze anonymous employee satisfaction surveys from its 600 workers.

One obvious use for the tool is to provide more precise feedback to supervisors about how their subordinates feel about their leadership and work conditions.

According to a recent analysis by Deloitte, 40 percent of employers around the world have put in place artificial intelligence solutions to better understand their workforce.

Unsurprisingly, when AI begins to factor into decisions on hiring, firing and pay, employees start to grow wary of its use. But helping employers figure out who should work for them and what they should be paid is exactly what companies such as Syndio and HireVue promise their software can provide.

The legal implications of AI tools are still up in the air. While there's likely nothing wrong with using software to better analyze anonymous surveys, the picture gets cloudier if and when employers begin using tools that analyze people's facial expressions or tone of voice in a job interview.

At least one employment attorney tells the Wall Street Journal that he worries algorithm-driven hiring could disproportionately impact minority applicants.

While machines are supposed to make HR more rational by removing human biases from the equation, that doesn't mean that their methods will necessarily be fair. What if a machine begins recommending against hiring people based on their brand of shoe because it has picked up on a pattern of poor performance from those who wear that brand?
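That hypothetical is easy to reproduce. In the minimal, entirely synthetic sketch below, a standard decision tree is trained on fabricated hiring records in which an irrelevant attribute (shoe brand) happens to track past performance ratings; the model learns to penalize the brand, and two otherwise identical candidates get different recommendations. The data, feature names, and outcome rule are all made up for illustration.

```python
import random
from sklearn.tree import DecisionTreeClassifier

random.seed(0)

# Fabricated historical records: (years_experience, wears_brand_x).
# A spurious pattern is baked in: brand-X wearers were rated poorly
# in the past for reasons that have nothing to do with shoes.
X, y = [], []
for _ in range(500):
    experience = random.randint(0, 15)
    wears_brand_x = random.random() < 0.3
    good_performer = experience > 5 and not wears_brand_x  # spurious link
    X.append([experience, int(wears_brand_x)])
    y.append(int(good_performer))

model = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Two candidates identical except for shoe brand get different
# recommendations: the model has latched onto the proxy feature.
print(model.predict([[10, 0], [10, 1]]))  # -> [1 0]
print(dict(zip(["experience", "wears_brand_x"], model.feature_importances_)))
```

The model is behaving exactly as designed, faithfully reproducing whatever correlations sit in its training data, which is precisely why removing humans from the loop does not remove bias.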

So far, however, the Equal Employment Opportunity Commission has not addressed the issue.

The key for HR departments, says Garry Mathiason, an attorney at Littler Mendelson P.C., is not to take humans completely out of the process, and to make sure they disclose how AI is being used and what factors the program is analyzing.
