Moonshot Insights Bets on Algorithms to Address Biases in Hiring

In recent weeks, there’s been some local buzz about a team of University of Wisconsin-Madison students that’s getting set to participate in the 2018 Hyperloop Pod Competition, which will take place next week in Hawthorne, CA. This year, the team hopes to run its pod on SpaceX’s hyperloop test track, an opportunity it was denied at the contest last year after not being selected as one of the top three teams.

The Badgerloop team isn’t the only group with UW-Madison ties that’s heading to California to pitch its technology against other bright young innovators. The other group is Moonshot Insights, a startup that plans to pitch to a Silicon Valley venture capital firm in hopes of landing an early-stage investment. Moonshot is developing digital tools to help customers make better hiring decisions, says co-founder and CEO Kyle Treige.

Dorm Room Fund, an arm of the VC firm First Round Capital that focuses on student ventures, recently picked Moonshot as one of about 60 startups that will pitch to investors at Dorm Room in the coming weeks, Treige says. About half of them receive an investment from Dorm Room, and a typical check size is $20,000. Moonshot is the first Wisconsin-based business to be selected to pitch to Dorm Room’s investment committee, Treige says.

There are currently four people on Moonshot’s team, Treige says. Three are recent UW-Madison graduates, and the fourth is currently a student at the school.

Moonshot’s software evaluates job candidates to predict how well they’ll perform in a particular role. The startup has developed a set of assessments for Moonshot’s clients to give to their candidates, which Treige says typically take about 25 minutes to complete. These evaluations include objective, position-based skills tests; questions asking respondents how they’d react in different situations; and a psychometric, or personality type, analysis, he says.

After a candidate has completed the battery of tests, Moonshot’s software algorithms analyze his or her responses, as well as some of the information the applicant has provided about previous work experience and education, Treige says. For now, the startup’s software isn’t designed to boil all that data down to a single thumbs-up or thumbs-down hiring recommendation, he says.

“It’s more painting the picture fully of who they are as a person,” Treige says.

He says most of Moonshot’s early customers are small organizations, many of which are also in the tech industry. However, companies with hundreds or even thousands of employees might also benefit by using Moonshot’s tools to evaluate candidates just after they apply for a job, Treige claims.

On average, a hiring manager spends just six seconds looking at a candidate’s resume at the initial screening stage, according to research by the careers website Ladders. Treige says shifting to a more data-driven hiring process makes an organization less susceptible to a pitfall in the tech world that has come under increased scrutiny: managers in charge of extending job offers tend to hire people who remind them of themselves.

“We found 50 to 67 percent of women and minorities are disadvantaged” by the resume-screening process as it exists today, says Treige, adding that there’s a “massive amount of disparity through that one piece of paper.”

Efforts to reduce bias in hiring decisions have become more sophisticated in recent years, but they’re nothing new. In 1978, the U.S. Department of Labor and other federal agencies introduced new hiring guidelines that included what’s become known as the “four-fifths,” or 80 percent, rule. The rule stipulates that at any given organization, candidates belonging to protected minority groups should be hired at a rate that’s at least 80 percent of “the rate for the group with the highest [hiring] rate,” according to the human resources firm Prevue HR.
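The arithmetic behind the four-fifths rule is straightforward: compute each group's selection rate, divide by the highest group's rate, and check whether the ratio clears 0.8. A minimal sketch, using made-up applicant numbers purely for illustration:

```python
def selection_rate(hired, applicants):
    """Fraction of a group's applicants who were hired."""
    return hired / applicants

# Hypothetical applicant pools -- the numbers are illustrative only.
rates = {
    "group_a": selection_rate(48, 100),  # 0.48
    "group_b": selection_rate(30, 100),  # 0.30
}

highest = max(rates.values())
for group, rate in rates.items():
    ratio = rate / highest
    print(f"{group}: rate={rate:.2f}, "
          f"ratio={ratio:.2f}, passes four-fifths: {ratio >= 0.8}")
```

Here group_b's ratio is 0.30 / 0.48 ≈ 0.63, below the 80 percent threshold, so this hypothetical process would flag potential adverse impact.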

More corporate HR departments are using algorithms to automate how they filter candidates into categories after job interviews, and even how the companies make hiring decisions, says Aws Albarghouthi, an assistant professor in UW-Madison’s computer sciences department. He says those shifts have sparked researchers’ interest in what has become a hot topic in tech: the question of whether algorithms are biased, and if they are, how best to address that bias.

Albarghouthi, who is not involved with Moonshot, is one of four UW-Madison researchers who last year received a three-year, $1 million grant from the National Science Foundation to continue studying ways to remove bias from computer algorithms.

The software Albarghouthi and his colleagues are developing, which they call FairSquare, could eventually be used to certify automated decision-making programs as fair. While he says there’s not really a straightforward definition of fairness when it comes to algorithms, Albarghouthi uses a hiring-related example to explain how FairSquare might one day be used.

“If you think of a hiring case where you have black applicants and white applicants, the idea is to make the algorithm oblivious to race by effectively removing it from the data,” he says. Specifically, that could mean “fuzzing the data a little bit, such that the algorithm cannot tell whether someone is black or white—even through proxies like ZIP code,” he says.
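The idea Albarghouthi describes can be sketched in a few lines: drop the protected attribute and any known proxy fields, then add a small amount of noise to the remaining numeric features so residual correlations are blurred. This is a simplified illustration, not FairSquare's actual method; the field names and noise level are assumptions.

```python
import random

# Hypothetical applicant records; field names are illustrative.
applicants = [
    {"skills_score": 82, "years_exp": 4, "race": "black", "zip_code": "53703"},
    {"skills_score": 77, "years_exp": 6, "race": "white", "zip_code": "53562"},
]

PROTECTED = {"race"}
PROXIES = {"zip_code"}  # features correlated with the protected attribute

def anonymize(record, noise=0.05):
    """Remove protected attributes and known proxies, then fuzz the
    remaining numeric fields by up to +/- `noise` (5% by default)."""
    kept = {k: v for k, v in record.items()
            if k not in PROTECTED and k not in PROXIES}
    return {k: (v * (1 + random.uniform(-noise, noise))
                if isinstance(v, (int, float)) else v)
            for k, v in kept.items()}

for a in applicants:
    print(anonymize(a))
```

In practice, identifying proxies is the hard part: ZIP code is an obvious one, but subtler correlations (school attended, hobbies) are exactly what tools like FairSquare aim to audit for.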

Treige, of Moonshot Insights, recently spoke at a conference organized by the Corporation for a Skilled Workforce in Washington, D.C. While he was on stage, several people in the audience asked him whether the algorithms Moonshot and other software companies are developing will perpetuate biases, he says, and worsen what many view as a problematic status quo.

Treige says he can’t speak for all CEOs of companies developing digital tools like his. But his hope is that Moonshot’s products will have the opposite effect, and help reduce the level of bias in organizations’ hiring decisions. For example, he says, instead of seeing a standard resume or recruiter feedback from an interview, a hiring manager could see a de-identified set of information on the candidate listing his or her qualifications for the job, and little else.

“This is a transitional point” in time, Treige says. In his view, the first step toward improving things is to “understand what the current conditions are around bias and the workplace, resumes and interviews being a huge problem at the crux of that.”