Janel Faucher, a research manager with AOL, talked about qualagility. “Our R&D team came to us with over 200 product ideas, an unprecedented research request.” To top it off, the deadline was short, there was no added budget, and the team had only four full-time researchers. How could the team become more proactive than reactive and avoid these fire drills?

“What we decided we had to give up was our involvement in the research process. With traditional research, we are doing it ourselves from start to finish, with a lot of control. This request didn’t allow for that type of involvement, which was a pretty scary thought. I’ve seen plenty of research where unintentionally leading questions were asked, and the results are cringe-worthy. How do we avoid that while fulfilling the request?”

“We decided to work with what we had: an omnibus tool and a remote qual tool.” Janel took the 200 ideas and prioritized them using the omnibus, turning to quant to prioritize the qual. The team created a way to score each idea using three questions:
1. Whether the respondent had performed a particular behavior in the past month (the behavior differed by idea)
2. The level of importance
3. The level of painfulness

Important ideas that relieved pain bubbled to the top.
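The talk did not share the exact scoring formula, but the mechanics are easy to picture. Below is a minimal Python sketch of one plausible way to combine the three answers and rank ideas; the field names, the weighting (incidence times the sum of importance and painfulness), and the sample ideas are illustrative assumptions, not AOL’s actual model.

```python
from dataclasses import dataclass

@dataclass
class IdeaScore:
    """Omnibus results for one product idea (illustrative fields, not AOL's schema)."""
    idea: str
    incidence: float    # share of respondents who performed the behavior in the past month
    importance: float   # mean importance rating, e.g. on a 1-5 scale
    painfulness: float  # mean painfulness rating, e.g. on a 1-5 scale

def priority(score: IdeaScore) -> float:
    # Assumed combination: weight importance and painfulness by how common the behavior is,
    # so "important ideas that relieve pain" bubble to the top.
    return score.incidence * (score.importance + score.painfulness)

def prioritize(scores: list[IdeaScore], top_n: int = 20) -> list[IdeaScore]:
    """Rank all ideas and keep the top candidates for qualitative follow-up."""
    return sorted(scores, key=priority, reverse=True)[:top_n]

# Two hypothetical ideas scored from omnibus data
ideas = [
    IdeaScore("shared photo albums", incidence=0.62, importance=3.8, painfulness=2.9),
    IdeaScore("email scheduling", incidence=0.35, importance=4.1, painfulness=3.4),
]
for s in prioritize(ideas):
    print(f"{s.idea}: {priority(s):.2f}")
```

Whatever the exact weighting, the point is the same: a cheap quantitative pass turns an unmanageable list of 200+ ideas into a ranked shortlist worth qualitative time.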

“To our surprise, this was a misstep for us. When you think about product ideas and how consumers use things, importance and painfulness aren’t the only lenses on what they need or want. They don’t account for products consumers can’t articulate that they might want: a clear example is the iPhone; consumers didn’t clearly articulate the need for an all-in-one device for phone, email, and camera. So this fell short for us. We’re not using it anymore. But it did help us get through the 200 product ideas! A key lesson is that you might not get it right the first time, and that’s okay. It is scary to not know the outcome.”

An experimental approach where you are not sure what the outcome will be requires an honest conversation with the client.

For phase 2, AOL used UserTesting.com for remote interviews about problems. “This is a Swiss Army knife tool that we use for all kinds of things, from website testing to conducting interviews.” It is a way to recruit a specific audience affordably and quickly.

To hit the deadline, Janel brought the client in to take on some of the responsibility. In this case, that meant working through the prioritized ideas to develop them further for testing, using a framework of opportunities and hypotheses. “A really big learning was that we had to optimize the inputs before drafting the research.” The investment in optimizing is quicker in the end; otherwise you can get “wonky” results from half-presented ideas.

Janel’s team developed a project tracker outlining every single step of the problem interviews, including research tasks, client tasks, and joint tasks. “This tracker made it transparent to the client and increased client comfort.”

In the trial and error of developing this methodology, it became clear that the client couldn’t write the survey instrument; research staff would do that. Videos of all interviews were provided to the client, who wrote notes on each video about what they were hearing, while being mentored not to overreact to any single participant’s comments; the researchers did not document the interviews themselves. This took quite a bit of convincing, but it happened because the work couldn’t get done in the time desired otherwise. A joint one-hour debrief call helped pull out conclusions and key findings for a one-page top-line report.
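The write-up does not reproduce the tracker itself. As a rough Python sketch of the structure described above, each row might record a step, its owner, and its status; the data structure is an assumption, and the step names are paraphrased from the process in this account rather than taken from AOL’s actual tracker.

```python
from dataclasses import dataclass
from enum import Enum

class Owner(Enum):
    RESEARCH = "research team"
    CLIENT = "client"
    JOINT = "joint"

@dataclass
class TrackerStep:
    step: str
    owner: Owner
    done: bool = False

# Steps paraphrased from the division of labor described in this write-up
tracker = [
    TrackerStep("Draft opportunity and hypotheses for prioritized ideas", Owner.CLIENT),
    TrackerStep("Write the survey instrument / interview guide", Owner.RESEARCH),
    TrackerStep("Recruit participants via UserTesting.com", Owner.RESEARCH),
    TrackerStep("Take notes on interview videos", Owner.CLIENT),
    TrackerStep("One-hour debrief call", Owner.JOINT),
    TrackerStep("One-page top-line report", Owner.RESEARCH),
]

for t in tracker:
    print(f"[{'x' if t.done else ' '}] {t.step} ({t.owner.value})")
```

However the tracker is actually formatted, listing every task with an explicit owner is what made the shared workload transparent and kept the client comfortable.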

This process produced 5 key results:
1. “We successfully evaluated 200+ product ideas.”
2. Reduced turnaround from 5 days to 24–48 hours per project; speed is paramount to many clients.
3. Enabled budget to be devoted to strategic, focused research.
4. Fostered stronger relationships with clients by talking with them more.
5. The process is now repeatable.

“Clients are now coming more often, and earlier in the process, which we are really thrilled about.”

Author Notes:

Jeffrey Henning

Jeffrey Henning, IPC is a professionally certified researcher and has personally conducted over 1,400 survey research projects. Jeffrey is a member of the Insights Association and the American Association of Public Opinion Researchers. In 2012, he was the inaugural winner of the MRA’s Impact award, which “recognizes an industry professional, team or organization that has demonstrated tremendous vision, leadership, and innovation, within the past year, that has led to advances in the marketing research profession.” In 2022, the Insights Association named him an IPC Laureate. Before founding Researchscape in 2012, Jeffrey co-founded Perseus Development Corporation in 1993, which introduced the first web-survey software, and Vovici in 2006, which pioneered the enterprise-feedback management category. A 35-year veteran of the research industry, he began his career as an industry analyst for an Inc. 500 research firm.