This post was originally published on OPEN Forum.
Michael Rosenbaum, president and founder of Catalyst IT Services, began using data and algorithms, rather than traditional interviews, to evaluate job candidates.
We asked Rosenbaum how he did it, what kind of data he compiles and how he determines whether candidates will be a good fit before hiring them.
BI: Why do you think the traditional interviewing method is so imperfect?
MR: The classic way of hiring is subjective because it tends to revolve around hiring someone similar to yourself. When I was 23, the person who hired me for my first job had also gone to The London School of Economics, and I think that's the only reason I got the job. People think they're good at their jobs, so they want to hire someone who reminds them of themselves. However, teams work best when they combine complementary skill sets.
BI: What type of data do you look at?
MR: We collect traditional data, such as what's on an individual's resume and their social network, but we also look at things like how long someone spends filling out their online application, and we analyze their keystrokes. Analyzing those large data sets gives you a better understanding of how someone works and how productive they are.
The time that candidates spend looking at individual questions and the number of keystrokes are only some of the variables that go into our algorithms for predicting high performance in individuals and teams. The complete set of data lets us predict not only high performance but also which teams will have compatible cultures and communication styles with particular kinds of clients. We use this approach for all of our engineering positions.
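Catalyst's actual features, weights, and model are not public, but the general idea of turning application behavior into inputs for a predictive score can be sketched. Everything below is illustrative: the field names, aggregations, and linear scoring are assumptions, not Catalyst's method.

```python
# Illustrative sketch only: Catalyst's real features and model are not disclosed.
# Shows how raw application events could become a feature vector and a score.
from dataclasses import dataclass

@dataclass
class ApplicationLog:
    seconds_per_question: list[float]  # time spent on each application question
    keystrokes: int                    # total keystrokes while applying
    revisions: int                     # times the applicant edited an answer

def feature_vector(log: ApplicationLog) -> dict[str, float]:
    """Aggregate raw events into per-candidate features."""
    n = len(log.seconds_per_question)
    return {
        "mean_time_per_question": sum(log.seconds_per_question) / n,
        "keystrokes_per_question": log.keystrokes / n,
        "revision_rate": log.revisions / n,
    }

def score(features: dict[str, float], weights: dict[str, float]) -> float:
    """A simple linear score; a real system would learn weights from outcomes."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

# Hypothetical candidate data, for demonstration only.
log = ApplicationLog(seconds_per_question=[40.0, 65.0, 55.0], keystrokes=420, revisions=6)
feats = feature_vector(log)
print(feats)
```

In practice the interesting part is the weights: a production system would fit them against observed outcomes (performance reviews, tenure) rather than hand-pick them, which is what distinguishes this approach from a recruiter's intuition.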
BI: What if someone takes a break in the middle of filling out their online application? Do you record that data? And how do you use it?
MR: If it turns out the data we compiled doesn't seem relevant when we compare it to everything else, we probably won't consider it.
BI: What kind of software do you use to collect the data?
MR: When we decided to use this method in our own company, we developed software that would allow us to use algorithms to find the best match. After some success, we spun that software out into its own company, Pegged Software, which launched in 2010 and is available to other employers. We're a software company, so it made sense to develop our own hiring software to predict whether someone's a good fit and how long they'll stay with us.
BI: What made you want to develop your own software? Did you have a string of bad hires?
MR: It’s actually the other way around. Catalyst was founded around the idea of using data, rather than resumes and interviews, to identify high-performing individuals and teams more effectively; the focus on IT services came after. The spark was my observation that resumes communicate a set of signals that may or may not predict success, and they are the core of a labor market that is very inefficient. I felt that if we could use data rather than subjective perception to assemble teams, we could achieve much higher levels of performance. I then received an Irving R. Kaufman Fellowship to help support building the first version of what is now Catalyst’s analytics engine for talent selection and team assembly.
BI: Is hiring all metric-based? What percentage depends on interview or resume?
MR: A large portion of hiring is metrics-based. We will likely hire 150 people in the next year, and in order to do that we will evaluate 10,000 people. Data will narrow that 10,000 down to 300 or so, and the 300 will be interviewed to select 150.
Hiring is not all metric-based ... but incorporating data into the equation makes the entire process much more efficient.
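The funnel Rosenbaum describes implies concrete selection rates; a quick arithmetic check (using only the numbers he gives above):

```python
# Arithmetic check on the hiring funnel described above: 10,000 -> 300 -> 150.
applicants, interviewed, hired = 10_000, 300, 150

screen_rate = interviewed / applicants   # data screen passes 3% of applicants
interview_rate = hired / interviewed     # interviews select half of that pool
overall_rate = hired / applicants        # 1.5% of all applicants are hired

print(f"{screen_rate:.1%}, {interview_rate:.0%}, {overall_rate:.1%}")
```

So the data does almost all of the narrowing (97 percent of applicants never reach an interview), while the interview stage makes a roughly 50/50 call on the remainder.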
BI: How accurate are you at predicting how long someone will stay?
MR: Our voluntary turnover rate last year was 5.8 percent and our involuntary turnover rate was 9 percent. In the software development space, turnover is typically 30 percent.