Objectives are carved in stone, but clients always have chisels

Editor's note: Tim Huberty is president of St. Paul-based Huberty Marketing Research. He is also an adjunct professor at the College of Business of the University of St. Thomas.

I have been involved in the marketing research industry for over two decades. During that time, I have conducted hundreds of studies for countless clients. I have also taught in an MBA program for 15 years. Oftentimes, during a presentation or lecture, I will make a "big picture" statement, one not totally supported by the data or related to the topic. When challenged, I simply say, "It's just one of the 10 rules of marketing research." As you might imagine, over time, clients and students have become increasingly vocal about learning what those rules are. When asked where they can find them, I can only hem and haw and promise to "get them a copy." Eventually, one has to make good on all promises. So here goes.

1. People can be stupid. And they also lie.

Newspaperman H.L. Mencken once wrote that nobody ever went broke underestimating the intelligence of the American people. Until I read that quote, I had no idea that Mr. Mencken was also a marketing researcher.

Art Shulman has documented this rule for many years in this magazine. What researcher out there doesn't have his or her favorite War Stories of respondent stupidity? My own examples? I remember moderating a focus group a few years ago when a respondent really, actually, truly wanted to know if Alfredo Fettuccini was an Italian tennis player. Or there was the person in a more recent focus group who asked if distilled water and distilled liquor are the same thing. In fact, over this past weekend, I was supervising a group of students who were conducting telephone interviews. More than one reported a new "ethnic background": Norwegian.

We all know that people lie. When conducting focus groups, I make participants first write down their reactions, whether they are reviewing an ad or responding to a group question. Over time, I have discovered that at least one-third of all participants change their answers - even after writing them down. When challenged on this, respondents will often accuse me of misreading the number they have written. Or they will claim that they didn't understand the scale. Or, in a throwback to political incorrectness, they will point out that "It's a woman's prerogative to change her mind."

2. People's opinions count only to the extent that they agree with those of the people in the back room.

It's amazing how smart - or stupid (or good-looking or ugly) - focus group participants are to back room observers if they happen to agree with what is being presented. A positive reaction to a storyboard can make the difference between being a "good" recruit and a "bad" recruit. It can make the difference between "he's not one of the sheep" and "he sleeps with the sheep."

More often than not, when I come into the back room between focus groups, clients tell me which participants to "concentrate" on. "Those other ones [i.e., the ones who aren't saying what they want to hear] aren't worth crap."

I sometimes invite clients to listen in on interviews as the data is being collected. After one session, a client asked me why I wasn't interviewing any of his "smart customers." (I guess they were busy talking to his "smart competitors.")

3. Selecting focus group locations has absolutely nothing to do with the product.

I discovered long ago that focus groups are only conducted in cities in which clients have relatives. Or old college friends or former associates. Or "big weekends" planned. Forget BDIs. Or CDIs. Or "market strength." Or "a strong distributor network." Ultimately, it all boils down to who the client knows where. Or what is going on the preceding or following weekend. One of my clients demands to do focus groups in the New York City area because his son goes to school there. Another client likes Chicago during the summer because of the Lakefront Jazz Festival. Still another client likes L.A. because she and her husband can "play" the weekend before or after the groups are conducted.

4. The bigger the building, the higher the price.

Overhead means overpriced. Essentially, the data collection interviewers are all getting paid the same wage. Thus, the mark-up has to go somewhere. Like for rent. Hence the corollary to this rule: The more people the supplier employs, the more the project will cost. Last fall, I was able to win a customer satisfaction study from a client because the previous supplier was charging 63 percent more than I had bid. Of course, that research company not only had its own building, but also had an "international reputation" (their interviewers called from Canada). Caution: You don't want to bid too low because then clients suspect that they will get what they pay for. Interestingly, to my knowledge, that axiom has never been proven.

To be fair, there are ways of "justifying" the higher cost. For example, just mentioning the presence of a Ph.D. in the proposal is worth at least another 20 percent mark-up. (Those eggheads seldom even see the data, but are listed as part of the "statistical team.") Also, research on "the coasts" - whether it be quantitative or qualitative - is always more expensive. I've never been able to figure that one out.

5. Sometime during the first meeting, always ask the client what he expects the results will be.

This will save you a whole lot of time - and heartache - later.

Most clients know what they are looking for before they start. In fact, they expect you to prove what they are looking for before you start. Any researcher with more than a month's worth of experience can tell you about all the time wasted doing objective (i.e., clueless) analysis. Furthermore, if you're smart enough to figure out what they're expecting, you'll benefit from a long-term relationship. If not, you'll hear "We're not doing any research at this time" whenever you call looking for business.

I've learned this lesson the hard way. A few years ago, I worked with a government agency which was convinced of the need for a technology center in an out-state area. Unfortunately for me, respondents thought this was the stupidest idea they had ever heard of. Consequently, that study got buried.

Another example concerned a client that had expanded into the Kansas City market. They hired me to conduct focus groups to find out what local residents thought about them. The answer was "not much" and my report said so. The client's reaction was not atypical: "Focus groups are worthless. It's only a few a------s shooting their mouths off."

And speaking of clients, always remember that "Objectives are carved in stone, but clients always have chisels." A project always begins with a proposal, in which the objectives are defined in no uncertain terms. In fact, I end each project launch with a "final" confirmation by stating, "So tell me what it will take for this project to be a success." Or, "This project will be a complete waste of time and money unless you get...."

Despite my efforts, I'm always amazed at how objectives "evolve." I cannot recall many presentations during which a client has not asked, "Why didn't you ask that?" Or, "Who came up with that idea?"

Actually, this leads me to place the "other perspective" on Rule No. 1: "If you think respondents are stupid and they lie, just wait until you present the results to management."

6. Budget determines sample size.

One question students always ask is, "What's a 'good' sample size?" They're really asking the wrong question. The question to ask the moment the phone rings is, "How much money do you have?" That becomes the figure that goes into the proposal - and the figure that determines how much you will actually make on the project.
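If you want to see just how backwards the arithmetic runs, here's a minimal sketch (the dollar figures and the helper function are invented for illustration, not drawn from any real project): the n in the proposal falls out of the client's budget, not out of any power calculation.

```python
# A back-of-the-envelope sketch of "budget determines sample size."
# The $25,000 budget, $2,000 in fixed costs and $45 cost per completed
# interview are hypothetical figures, used only for illustration.

def affordable_sample(budget, fixed_costs, cost_per_complete):
    """Work backward from the client's budget to the n that goes in the proposal."""
    return int((budget - fixed_costs) // cost_per_complete)

n = affordable_sample(budget=25_000, fixed_costs=2_000, cost_per_complete=45)
print(f"Proposed sample size: n = {n}")   # 511 completes - margin of error be damned
```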

Clients always have more money. Incidence lower than expected? There's a stash of cash somewhere. Sample too small to realistically analyze subgroups? Doesn't matter. There's always more money in the kitty. In fact, a rule of thumb is to double whatever a client claims he or she has to spend.

Two months ago, I wrote a questionnaire to reflect the topics that each member of the team insisted be addressed. Unfortunately, this expanded the seven-minute interview to 18 minutes. No problem. The money mysteriously appeared out of the "this is absolutely all we have to spend" budget. Nonprofits are notorious for getting their hands on more money. They've always got some funds squirreled away somewhere.

Finally, the following rule works especially well when estimating your original budget: always bid low. This is an extremely crucial rule when bidding on government work, which is oftentimes awarded to "the lowest bidder." Once you've won the project (and their hearts), let the "add-ons" begin!

7. If you torture a number long enough, it will come around for you.

I learned this one years ago from a guy who had a Ph.D. in statistics. Want to "prove" statistically significant differences - like between those who have seen the advertising vs. those who have not? Just lower that confidence level to 90 percent, 85 percent - or even 80 percent. One copytesting service I know actually does this. They just don't mention it on individual tables - just in the appendix (that graveyard of report minutiae). An industrial client actually instructed me to lower the confidence level so that there were more statistically significant differences between groups. She told me, "If we don't come up with a lot of statistical differences, management will feel they haven't gotten value for their money."
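To see just how much mileage that lever has, here is a minimal sketch in Python (the counts - 150 of 300 advertising-aware respondents agreeing vs. 132 of 300 unaware - are made up for illustration, not taken from any actual copytest): the very same difference is "not significant" at the 95 percent level but magically becomes "significant" once the bar drops to 85 or 80 percent.

```python
# A minimal sketch of the "lower the confidence level" trick described above.
# The counts are invented for illustration; they are not from any real study.
import math

def two_prop_z(successes1, n1, successes2, n2):
    """Pooled two-proportion z-statistic, the kind used in banner-table sig testing."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_prop_z(150, 300, 132, 300)   # aware vs. unaware of the advertising

# Two-tailed critical values for each "confidence level"
critical = {"95%": 1.960, "90%": 1.645, "85%": 1.440, "80%": 1.282}

for level, cutoff in critical.items():
    verdict = "significant!" if abs(z) > cutoff else "not significant"
    print(f"z = {z:.2f} at {level} confidence: {verdict}")
```

With these invented numbers, z comes out to about 1.47: no difference at 95 or 90 percent confidence, but a "statistically significant difference" at 85 and 80. Same data, better headline.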

Results aren't what you'd like them to be? Just keep running the numbers. There's no reason to be hampered by an "unfriendly number." A truly creative (or experienced) researcher will realize that there's always a subsample somewhere which needs to be "re-analyzed." Of course, moderators aren't encumbered at all when reporting the results of qualitative research. No matter how many said what, there's always an opportunity to downgrade a unanimous opinion with "most, but not all, said." This also works the other way - when participants trash an ad, the research spinmeister points out that "a few participants were really enthusiastic about this ad..."

8. When presenting research results, always use the latest buzzwords.

That way, not only will you sound like you're "with it," but the client will recognize that his own project is "cutting-edge." And that he's found a research "partner" who "truly understands the marketplace." Some of the current "must" words include "return on quality" and "plethora" (and especially its cousin, "veritable plethora"). And for some reason, clients are always really impressed if you can throw in the word "passion."

At the same time, all seasoned researchers have learned how important it is to use the "standard research vocabulary." They know the importance of using really big words to describe really abstract concepts. After all, the bigger the word or the more abstract the concept, the less likely it is that the audience will ask about it. They don't want to sound stupid. More importantly, the audience won't try to figure out what you're talking about. They'll just realize that they're getting their money's worth and keep quiet. Who cares if the researcher has no clothes?

Ultimately, however, any researcher worth anything will invoke the patron saint of marketing research, Fred Astaire. They've learned to dance. They've learned to communicate the message "I have absolutely no idea what you're talking about" by saying "That's a very relevant point. I'll have to conduct some further data runs."

9. The better the results, the smarter you are.

This one's for those readers who missed Rule No. 5. Brilliance in marketing research is defined only by how positive the information you provide is. (And believe me, brilliance is a fleeting commodity.) Presenting "good" results just adds a cheeriness to the entire room. It allows your immediate client to show her boss how brilliant she is for selecting you to do the project in the first place.

This also ties into the sub-rule "The better the results, the more they'll like you." Or ultimately, "The better the results, the more likely the client is to invite you to do more work." Several of my peers are "locked into" advertising agency work because they always - just coincidentally - seem to deliver results which demonstrate how "on-strategy" the campaign (and agency) is.

Another sub-rule: "Really good research confirms what the client already knows." Many years ago, I conducted focus groups for a consumer goods manufacturer. Eight groups in four days (the "cross-country road trip"). When I was presenting the results, the client actually praised me for "being perceptive," and for "being as smart as I am."

A good researcher will acquire the gift of stating the obvious. If you tell clients what they already know, it merely proves that you understand their business. This also adds immeasurably to your own credibility.

10. Last, but not least: Recommend more research.

A truly good researcher will always be on the lookout for that next study. (After all, we have to eat! Or pay for that car. Or, at least, pay the rent on that big building.) What piece of research doesn't lead to the opportunity to again prove how brilliant you are?

However, seasoned researchers have to understand that just blatantly recommending further research shows a complete lack of class (and gives the profession a bad name). Instead, the researcher will soften his overtures by enthusiastically stating, "Now, if this were my business, here's what I'd do." Or, "I know budgets are really tight right now, but to get maximum return on your investment, you should do this." It's called "upselling." It's done in the fast-food industry all the time. One independent researcher once shared a "consulting secret" with me: "In order to be in business, you only need three clients. And then milk them for all they're worth."

Rules to live by

So, those are the rules that I've come to live by. I suppose some are tongue-in-cheek, but not many. The fact is, all of the anecdotes are absolutely true. I must confess that my MBA students did help me in putting together this list. That's because, unlike veteran marketing researchers, they're not so close to the industry as to overlook the obvious. Using many of the students' suggestions has also allowed me to take advantage of the First Rule of Marketing: "If it's worth anything, it's worth stealing."