Do you want a job? Better learn to talk to a computer

The day after her interview for a part-time job at Target last year, Dana Anthony received an email informing her that she didn’t make the cut.

Anthony didn’t know why – a situation that most job seekers have come across at some point. But she also had absolutely no idea how the interview had gone because her interviewer was a computer.

More job seekers, including some professionals, may soon have to accept impersonal online interviews in which they never speak to another person or know whether artificial intelligence systems are influencing hiring decisions behind the scenes. Demand for online recruitment services that interview applicants remotely via laptop or phone has mushroomed during the COVID-19 pandemic and remains high in the face of a perceived labor shortage as the economy reopens.

These systems claim to save employers money, get around the hidden biases that can affect human recruiters, and widen the range of potential candidates. Many are now also using AI to assess candidates’ skills by analyzing their statements.

Anthony likes to look an interviewer in the eye, but all she could see was her own face, reflected on the screen. “Personally, I do better interviewing because I am able to connect with the person,” she said.

However, experts doubt whether machines can accurately and fairly assess a person’s character traits and emotional signals. Algorithms designed to determine who is best suited for a job can entrench bias when they target industries where racial and gender disparities are already widespread.

And when a computer weeds out some candidates and highlights others with no explanation, it’s harder to know whether its assessments are fair. Anthony, for example, couldn’t help but wonder if her identity as a Black woman influenced the decision.

“If you apply for a job and get rejected because of a biased algorithm, you certainly won’t know,” said Aislinn Kelly-Lyth, a researcher at Oxford University. In a face-to-face interview, by contrast, a job seeker might pick up on signs of discrimination from the interviewer, she said.

New rules proposed by the European Union would subject such AI hiring systems to stricter regulation. Advocates in the US have pushed for similar action.

One of the leaders in the field, Utah-based HireVue, has gained prominence over the past few years for using AI technology to assess personality and job skills based on a candidate’s facial expressions during an interview. After fierce criticism centering on the scientific validity of these claims and the potential for bias, the company announced earlier this year that it would end the practice.

But its AI-based ratings, which rank applicants’ skills and personalities to mark the most promising for further review, still take language and word choices into account in their decisions.

The privately held company has helped create an on-demand video interview market. Well-known customers include retailers like Target and Ikea, big tech companies like Amazon, banks like JP Morgan and Goldman Sachs, oil giants, restaurant chains, supermarkets, airlines, cruise lines and school districts. The Associated Press reached out to numerous brand-name employers using the technology; most declined to discuss it.

HireVue CEO Kevin Parker says the company has worked hard to make sure its technology doesn’t discriminate based on factors such as race, gender or regional accent. Its systems, which transcribe speech to text and look for indicators of teamwork, adaptability, dependability and other job skills, can outperform human interviewers, he said.

“What we’re trying to replace is people’s gut instinct,” he said – of course – in a video interview.

HireVue said it interviewed more than 5.6 million people around the world in 2020. Supermarket chains used it to screen thousands of applicants daily amid a pandemic hiring surge for cashiers, warehouse workers and delivery staff, Parker said.

Broader hiring software providers like Modern Hire and Outmatch have started offering their own video interviews and AI assessment tools. On its website, Outmatch advertises its ability to “measure the essential soft skills your candidates and employees need to be successful”.

HireVue notes that most customers do not use the company’s AI-based assessments. The Atlanta school district, for example, has been using HireVue since 2014 but says it relies on 50 human recruiters to evaluate recorded interviews. Target said the pandemic led it to replace in-person interviews with HireVue interviews, but the retail giant told the AP that it relies on its own employees – not HireVue’s algorithms – to view and evaluate the recorded videos.

None of this was apparent to Anthony when she sat in front of a screen last year to apply for a seasonal job. She dressed appropriately for the occasion and made herself comfortable. The only indication of a human presence came in a pre-recorded introduction that set out what to expect – for example, that she could erase a response and start over.

But she had no way of knowing what impression she was making. “We are not in a position to give specific feedback on your candidacy,” the rejection email from Target said. She was rejected again after completing a HireVue interview for another job in December.

“I understand companies or organizations are trying to invest more time and money in recruiting,” said Anthony, who earned a master’s degree in strategic communications from the University of North Carolina at Chapel Hill last year. Still, the one-sided interviews left her unsure of who or what was evaluating her.

This opacity is one of the biggest concerns about the rapid growth of complex algorithms in recruitment and hiring, Kelly-Lyth said.

In one infamous example, Amazon developed a resume-scanning tool to recruit top talent but abandoned it after finding that it favored men for technical roles – in part because it was comparing job applicants against the company’s own male-dominated technical workforce. A study published in April found that Facebook shows different job advertisements to women and men in a way that may violate anti-discrimination laws.

Governments in the US and Europe are considering possible reviews of these recruitment tools, including external audit requirements, to ensure they do not discriminate against women, minorities or people with disabilities. The EU rules presented in April would force AI system providers who screen or assess applicants to meet new requirements for accuracy, transparency and accountability.

HireVue has begun phasing out its face-scanning tool, which analyzed facial expressions and eye movements and was derided by academics as “pseudoscience” reminiscent of the discredited, racist 19th-century theory of phrenology. The Electronic Privacy Information Center filed a complaint with the Federal Trade Commission in 2019, citing a HireVue executive who said that 10% to 30% of a candidate’s score was based on facial expressions.

“The value it added in the context of the controversy it created wasn’t very much,” Parker told the AP.

HireVue has also released portions of a third-party audit examining fairness and bias issues related to its automated tools. A published summary recommended minor changes, such as adjusting the weighting of especially short answers, which minority candidates gave disproportionately often.

Critics welcomed the audit but said it was just a start.

“I don’t think science really supports the idea that speech patterns are a meaningful estimate of a person’s personality,” said Sarah Myers West of New York University’s AI Now Institute, which studies the social effects of AI. For example, she said, such systems have had difficulty understanding women’s voices in the past.

Kian Betancourt, a 26-year-old doctoral student in organizational psychology at Hofstra University, was also rejected after a remote HireVue interview for a consulting position earlier this year. He acknowledged that he may have tried too hard to predict how the system would rate him, adjusting his diction to include keywords that he thought would improve his score.

Betancourt does support “structured interviews” with standard questions, but is bothered by the opacity of automated systems.

“Tell people exactly how we’re rated, even if it’s as simple as ‘This is an AI interview,'” he said. That basic information can affect the way people present themselves, he said.
