Data Quality: Invoke's Take
Invoke's Peter Mackey was recently on a panel at the Research Industry Summit in Chicago hosted by IIR and the Research Report's Bob Lederer. In preparation, we were asked to answer a few questions about the data quality issue and we thought the responses were worth sharing.
1) How do you define quality?
There does not appear to be a universal definition of quality. Most of the discussion around quality has focused on the quality of the participant, placing the burden for improving quality on the sample providers. I believe this is a superficial view of the issue.
At Invoke, we define quality in three ways.
First is quality of the participant. We need to ensure that people are who they say they are and that their responses are truthful. Given the Engage Live experience and the transparency into the data it provides, we have a real-time window into respondent integrity. As a result, we make an extra effort to screen properly, and we pay close attention to participant answers as they are given, weeding out anyone who obviously should not be there on the spot.
Second is quality of the participation. We constantly monitor participation levels to ensure people are engaged and answering questions throughout the session. Our environment, with its dynamic, moderator-led questionnaire flow and interactive stimuli, provides an experience that enhances participant engagement, which results in better, more thoughtful and honest responses.
Third, and most importantly, is the quality of information. Does it tell a complete story? Are we able to use it to make specific decisions? Are we asking the right questions that will allow our participants to respond fully and our clients to confidently decide what to do next? Is the survey engaging, interesting and fun to participate in? Our hybrid platform invites full participation of every participant and provides the ability to ask, and have answered, any question you may have.
2) Has the quality versus cost equation been altered with the industry's intense focus on data quality?
I don't think it's an issue of cost. Other than at conferences such as this, the data quality issue is not as pervasive on the client side. Clients want us to deal with it and stand behind what we do, but it has quickly become a cost of entry. The assumption is that our data is good, and clients don't want to pay more for what they should have been getting all along.
That being said, there is a lot of energy around finding alternatives to captive online panels that may offer higher-quality participation: customer advisory panels and communities, LinkedIn's recent announcement that it will sell its member base as sample, the use of social networking sites, even Second Life. The bottom line is that we need to be more creative about finding channels to participants, which may or may not require spending more.
3) Got any good stories involving you and a client or vendor that especially makes whatever point you want to make?
We recently did a study for a technology company talking to IT decision makers, arguably one of the most over-researched targets. It was a multi-national study involving the US and Germany. We had to use five different sample providers to satisfy our requirements: three panels, LinkedIn, and a phone-to-Web recruit. Throughout the screening process, we lost about 15% of possible participants because, based on their responses to the screener, we suspected they weren't who they said they were. In the end, the quality of participant AND participation was excellent.
We needed to push hard, think creatively, and be watchful, but not necessarily spend more money, to ensure high-quality results.
4) Data quality is certainly not a new issue. Why has it remained a hot area for two years now?
I think the conversation has evolved. In the beginning it was a panel issue, but now the focus has rightfully shifted toward the quality of surveys, levels of interactivity, etc. Today's consumers have higher expectations and need to be actively engaged. They have the power, and we need to keep them entertained to get their input. There are companies that have made great strides in this area, but the norm is still well behind where it should be.