Showing posts from March, 2008

Advertising Research Foundation Conference

I'll be at the ARF Conference next week (booth 221) if you want to stop by and discuss any of the topics in this blog. Hope to see you! http://www.thearf.org/events/upcoming/rethink-08.html

Recession Proofing Your Research

With the downturn in the economy, I've been asked by at least a dozen people recently how it will impact research budgets. It's too soon to tell for sure, but Forrester recently released a report with ten ways to "Recession-proof Market Research." Four jumped out at me:

Require a clear business outcome for every primary study... This one I learned years ago from Chadwick Martin Bailey's John Martin. Never conduct a study if you don't know how the information is going to be used. And from a provider standpoint, don't assume an RFP is completely thought through.

Put revenue focus in every survey you field... Another key to successful designs is focusing on key outcome behaviors (e.g., trial, purchase, cross-purchase, advocacy) and the revenue attached to them. It's nice to know what people think, but satisfaction doesn't necessarily drive revenue.

Reduce sample sizes based on the level of confidence needed in the results... Ask yourself how many respondents yo
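As an aside on that last point: once you settle on a confidence level and margin of error, the sample size you need falls out of a standard formula. Here's a minimal sketch for a proportion-style metric under simple random sampling; the function name and the numbers are illustrative, not from the Forrester report.

```python
import math

def sample_size(confidence_z: float, margin_of_error: float, p: float = 0.5) -> int:
    """Respondents needed to estimate a proportion at the given confidence level."""
    # n = z^2 * p * (1 - p) / e^2  (simple random sampling, large population)
    n = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n)

# 95% confidence (z ~ 1.96) with a +/-5% margin needs ~385 completes;
# relaxing to 90% confidence (z ~ 1.645) and a +/-7% margin drops that to ~139.
print(sample_size(1.96, 0.05))   # 385
print(sample_size(1.645, 0.07))  # 139
```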

A Simple Rule for the Data Quality Discussion

Our friends at Greenfield Online (Keith Price specifically) start to examine the issue of data quality in their latest newsletter. Two key things: first, Keith seems to believe the issue of cheating respondents has been a bit overstated (I agree). Second, a bad survey will always deliver bad results. That's the issue that drives us at Invoke - creating a participant experience that delivers great insights. A simple rule: never write or test a survey that ends with you asking "who would take this?" Here's a snippet of the article (click to read the whole thing): Enter 2007. In the US, online surveys have reached high market penetration, and for most survey respondents, taking an online survey is no longer a "new" activity. As an industry we've gone from questioning under-participation of respondents to questioning over-participation. We continue to look into respondents that attempt to cheat or game the system to p

Mothers Still Concerned About Marketers, But Take Responsibility

Last week we used our Engage Live and Open methodologies to conduct a 60-minute interactive research session with over 200 mothers of children under 18. The goal was to get a quantitative and qualitative read on how they feel about marketing to children, what is appropriate, and where the responsibility for protecting their children lies. We covered five main areas: general perceptions, brands and product categories, media exposure, shopping with children, and whose responsibility it is to protect children. Going into the session I wondered if these mothers would take responsibility for protecting their children or if they would pass the blame on to the media or marketers. I was heartened to see that while they are concerned, they realize that in the end it is their job to raise their children even if others could make it easier! Quote of the session: “I don’t think it should be federally regulated. Maybe state regulated. But then again, the advertisers should be responsible and if the

The Little Things Count

Whether you are designing surveys, working in retail, waiting tables, or selling services, the little things count when you are trying to influence someone's behavior. One recent example from my own life: I am a little obsessed with my dog, Baxter, and a few weeks ago we needed a place for him to stay while we went to visit family in Atlanta. In the past we have always had him stay with friends, but this time we needed another option and decided to try out the Petsmart hotel that just opened near our house. It was the closest to a kennel we had ever used, and while I tried to hide it, I was anxious. Sure, I had bought all the add-on activities and treats to keep him busy, but leaving him with strangers in a strange place had me on edge. As I drove over I was already thinking of backup plans in case it didn't "feel" right. It all went away, however, when we walked through the door and they came right over and said "you must be Baxter, you're so cute!" The peop

Automated Interactions Done Right

I was just reading an article in this month's AMP Agency Frequency Newsletter that talks about a new automated customer service strategy being used at Ikea (love that store!). From the newsletter: "IKEA Website users are introduced to Anna, an automated online assistant, who not only is helpful but also has a sense of humor. When I asked about warranties, she automatically changed the browser to the correct section of the site. Then, when asked if she preferred cats or dogs, Anna told me 'Thanks for asking, but I’m only here to answer questions about IKEA so I don’t have any outside interests.'" This is one example of an automated process working correctly. In research I have seen a wide array of automated interactions, and most fail, in my opinion, because they follow a scripted line of questioning that is supposed to feel one-on-one (like Socrates), leaving little margin for error. That's why Invoke's Engage Open platform simulates a group environment, allowing for qu

4 Steps to Successful Research Projects

I've been in this industry long enough to see why lots of projects fail, so here are four quick tips (plus a bonus) for making them successful.

1) Know what decisions you are going to make BEFORE you start a project. This makes it easy to ensure you get everything you need to support those decisions.

2) Have someone not involved in the project test it to see if people who aren't intimately involved understand the questions. Oftentimes when researchers are immersed in a subject, they forget that participants will usually have significantly less knowledge.

3) Take the time to think about the participant experience. It's only good data if people are engaged.

4) Don't assume you know the answers before you start. It's fine to test hypotheses, but include questions that allow your hypotheses to be challenged.

5) (BONUS TIP) Incorporate both qualitative and quantitative questioning into the process. This way you don't assume you know why something gets low or high scores, you know.