Primary Research: A Crash Course, Part I
Laurie Gelb, MPH

Description: Crash course or refresher for market researchers and their vendors

Page 1: Survey research crash course

Primary Research: A Crash Course

Part I

Laurie Gelb, MPH

Page 2: Survey research crash course

The challenge:

Provide insight that integrates demographics, attitudes and behavior, and turn raw data (numbers or statements) into meaningful, actionable information

Page 3: Survey research crash course

What is marketing?

• Classically, the four P’s: product, price, promotion, place (distribution)

– Now, often characterized as the four C’s: consumer needs and wants, the cost to satisfy these, convenience to buy and communication, which is dialogue rather than monologue

• Market research supports activities and decisions made in support of these overarching functions…from manufacturing through consumption and the target’s evaluation of the transaction

– So marketing does not stop when a purchase is made or a prescription written

– We want decision-makers and influencers to feel good about the decision, and to communicate that feeling to others

Page 4: Survey research crash course

What is marketing research?

• Any formal or informal attempt to ascertain facts, perceptions and/or behavioral data relating to constituents, customers, peers, prospects, influencers or other people of interest

– “Do you want to see a movie tonight?”

– “Why don’t you think that I deserve a promotion?”

– “Do you want fries with that?”

– “What is the major reason for continuing to prescribe Plavix?”

– “In the last year, which of the following drugs have you taken to alleviate the symptoms of your arthritis?”

– “Can I count on your vote for Senator Niceguy?”

Page 5: Survey research crash course

Three data types = two sources of error

• What is your age? [a fact someone is unlikely to grossly lie about or mischaracterize; compare the baseline questions on a polygraph]

• In the last year, how many days did you stay home from work due to illness or family issues? [“fact” for which recall may be poor or biased]

• On a scale of zero to ten, with zero meaning “not at all helpful” and ten meaning “extremely helpful,” how would you rate the GE onboarding program? [“perception” for which answer may be skewed by recency effects, grading pattern, vested interest in outcome, etc.]

– A physician may rate a product, rep and/or message to help a rep, hurt a rep, help a product, help a company, help a colleague, etc.

Page 6: Survey research crash course

A good project:

• Elicits understanding of what our target population thinks it wants, needs, understands, questions and fears

– Which is not the same as reality

• Respects the respondent’s time and viewpoints

– By not being redundant, pedantic, emotional or too long (90% of quantitative studies should take half an hour or less; a TDI or central location interview likewise; a focus group or telegroup 1-2 hours)

– By giving the respondent at least one unstructured question in which to state what s/he hasn’t been asked that is also important

Page 7: Survey research crash course

A good questionnaire:

• Permits and never constrains the respondent’s own language and frame of reference

• Presents clearly and unequivocally the stimuli to which the respondent is responding

• Flows smoothly from topic to topic without the respondent’s caring or noticing

• Could not possibly confuse or annoy the respondent

• Does not in itself affect attitudes or behaviors, only reflects them

• Has been formulated through reference to previous market research, field reports, sales data, clinical literature, peer-to-peer communication and common sense

• Allows us to evaluate the market, not the results of stupid questions

Page 8: Survey research crash course

Study objectives checklist

• What do we know from secondary research, past primary research, clinical experience?

• Are our questions exploratory, well-structured or both?

• Do we need an interviewer?

• Do we need projectability?

• How many purely numeric questions do we have?

• Do we need to have multi-phased or iterative questioning?

• To what sort of stimulus/stimuli, if any, do we need a response?

Page 9: Survey research crash course

Typical marketing issues

• Which order of indications might optimize uptake?

• What barriers to formulary acceptance do we face?

• What messages are prescribers actually taking from our and/or competitors’ details? Which of these is/are most convincing as a reason to prescribe?

• How are efficacy/safety/convenience/”expensive” defined in this category/class/disorder/symptom cluster?

• What is market satisfaction with current tx, on what is it based, and what incremental prescribing could a safer/cheaper/more efficacious/more convenient option generate?

• How important are formulation and route of administration, e.g. TID dosing, IV setup, need to take with food, injectables, etc.?

• Are patient characteristics actually driving prescribing, or are other factors more prominent, e.g. DTC ad-driven requests, a dominant formulary, sample availability, cost, etc.?

Page 10: Survey research crash course

Draw the map before starting the car

• Before doing a study, know as much as possible about:

– Who will use it and how; any differing agendas

– When their big moment is (the forecast is due, the presentation is scheduled, the bigwigs are visiting) and what their deliverable is (slides, charts, report)

– What you are expected to add to the vendor’s analysis, or simply what you can add on your own initiative

• Vendors take orders, professionals are partners

Page 11: Survey research crash course

What can we present?

• Product descriptions or profiles

• Laddered “what if’s”: what if, in addition to what you just saw, this agent’s reduction of MS lesions was 30% greater than Avonex’s?

• Factor, variable or endpoint lists

• Drawings, concepts, diagrams, ads, photos

• Maps

• Posters, banners, billboards, signs

• Direct mail packages

• Detail aids

• Brochures

• Clinical reprints or abstracts

Page 12: Survey research crash course

How can we present it?

• In graphic format

• Interviewer-read

• Audiotaped music and/or spoken word

• Videotape; streaming video

• Graphics that are clickable to reveal more detail, provide feedback or record response (e.g. click on the organ system you’d review first)

• Text that can be highlighted, deleted, edited

• Role-playing: rep/doc, doc/doc, patient/doc, patient/patient

• Free association tasks

Page 13: Survey research crash course

Limitations of data collection methods

• Mail and fax questionnaires limited as to question types, survey logic; cumbersome for respondents to hand write; difficult to verify data or limit honoraria to qualified respondents

• Phone surveys rely on the “human element” and often require pre-scheduling, leading to a stressful context

• Central location or in-office interviews create geographic and sample size limitations, leading to social biases, inconvenience for respondents, and travel time/costs for the interviewer
– However, still the only option if absolute control over materials is needed
• Using the Internet, depending on the country and audience, may skew the sample toward early adopters

Page 14: Survey research crash course

What issues are most important?

• Data quality
– Best choice(s): Only CATI and Web impose computer-aided quality control, but CATI requires a human: “Can you change your answers to total 100%?” (a constant-sum validation sketch appears after this table)
– Worst choice(s): Phone (human intermediary) is worst; mail and fax are self-administered without checks
• Break-offs or incomplete interviews
– Best choice(s): Phone is most reliable due to the politeness factor, but Web is close behind (Cozint studies have <4% break-off rate)
– Worst choice(s): Mail and fax risk break-offs; papers get lost or discarded
• Open ends (unstructured comments)
– Best choice(s): Web captures respondents’ comments as they enter them; mail and fax allow similar freedom, but incur fatigue and the need to decipher
– Worst choice(s): Phone and face-to-face impose interviewers’ limitations and biases
• Relaxed environment
– Best choice(s): Nothing beats 24/7 access and choice of location: Web, hands down
– Worst choice(s): Phone and face-to-face are the most burdensome due to pre-scheduling
• Social biases
– Best choice(s): Web, mail and fax avoid concerns about saying “the right thing”
– Worst choice(s): Phone and face-to-face both risk trying to please or oppose an interviewer
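
The “total 100%” check mentioned in the data quality row is the kind of validation a Web survey (or CATI system) can enforce automatically. Below is a minimal sketch of such a constant-sum check in Python; the product labels and tolerance are hypothetical, not from the deck.

```python
# Minimal sketch of a constant-sum ("must total 100%") check, the kind of
# computer-aided quality control a Web survey or CATI system can enforce.
# Product labels and the tolerance are hypothetical.

def check_constant_sum(allocations, expected_total=100.0, tolerance=0.5):
    """Return (ok, message) for a dict of percentage allocations."""
    if any(v < 0 for v in allocations.values()):
        return False, "Percentages cannot be negative."
    total = sum(allocations.values())
    if abs(total - expected_total) > tolerance:
        return False, (f"Your answers total {total:g}%. "
                       f"Please adjust them so they total {expected_total:g}%.")
    return True, "OK"

# Example: share of prescriptions by blinded product (hypothetical answer)
answer = {"Product L": 40, "Product M": 35, "Product N": 20}
ok, message = check_constant_sum(answer)
print(ok, message)   # False: totals 95%, so the respondent is asked to adjust
```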

Page 15: Survey research crash course

What issues are most important? (cont’d)

• Respondent’s ability to think fully and communicate honestly
– Best choice(s): Web, mail and fax are all self-paced; however, only the Web does not rely on maintaining and returning paper
– Worst choice(s): Phone and face-to-face are high-pressure, “move along” contexts
• Intrusiveness/bias of the method itself
– Best choice(s): Designed properly, Web surveys combine the best of both worlds: “point and click” ease with self-administration control
– Worst choice(s): Mail and fax, since the respondent must think and write within the constraints of a form
• Follow-up with respondent
– Best choice(s): Select findings can be made available to respondents immediately via an e-mail message with a URL
– Worst choice(s): Most other methods rarely offer post-study interaction
• Compliance
– Best choice(s): Very high with Web; can be high with mail given a high honorarium
– Worst choice(s): Declining with phone
• Honorarium fulfillment
– Best choice(s): The shorter the field period, the quicker the checks are mailed: Web
– Worst choice(s): Phone is worst, followed by mail

Page 16: Survey research crash course

Besides Web surveys, other Net-based options exist

• Can e-mail or serve a Word or Excel document (no branching logic, data export may be more difficult), HTML document (still must return to sender), downloadable apps (more expensive; can trigger security, virus concerns)

• For group-think without the face-to-face geographic compromises and expense, consider iterative threaded chat

• Real-time focus groups can suffer from typing, reading, computer speed biases/frustrations; give the platform a real workout first, and consider your audience

• Traditional focus groups can be viewed via the Web; moderators can incorporate your comments in real time via online chat

• Posting to a sufferer, caregiver or professional Usenet group, threaded forum or chat room, if its charter permits, can garner interesting feedback; you can also provide a mailto link on your or another site
– Remember, however, that some AEs (adverse events) must be reported

Page 17: Survey research crash course

Whom and how do we sample?

• How selective or inclusive need we be in sampling?

– Are we interested in all neurologists, Board-certified neuros, only neuros who treat MS or only neuros for whom MS is a large part of their patient load?

• Do we need statistical validity (projectability)?
– If so, from whom to whom?
• Is there any reason to show multiple ads, profiles, designs, etc.?
– If so, do we show everything to everyone, or is there a reason to have multiple arms (e.g. to test order effects or the impact of better data)?
• Basic rules of thumb: an N of 30 is minimal, 50 is standard, 60 is better and 75 is essential for multivariate tools, e.g. conjoint
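
Those rules of thumb can be sanity-checked against the margin of error they imply. A rough sketch, assuming simple random sampling and a worst-case 50% proportion; the calculation is illustrative, not part of the original deck.

```python
# Rough margin-of-error check for the rule-of-thumb sample sizes.
# Assumes simple random sampling and a worst-case proportion of 50%;
# at these small Ns the figures are approximate.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (30, 50, 60, 75):
    print(f"N={n}: +/- {margin_of_error(n):.1%}")
# N=30 gives roughly +/-18 points, N=75 roughly +/-11 points: fine for
# directional reads, thin for subgroup comparisons or multivariate tools.
```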

Page 18: Survey research crash course

Sampling frame: the first driver of actionability

• Does the firm “prepay” respondents to serve on a panel, or pay “by the study”?
– Panels risk unrepresentative “professional respondents”
• Does the vendor recruit by e-mail only if respondents have already consented to e-mail contact?
– Are previous respondents favored in sampling?
– If an e-mail panel is not fully representative, does the vendor have fax access or other means of offline recruitment (if not using a Web site)?
– For patients, is the screening based on specific questions about symptoms, to reconfirm dx, or is it based on a single self-reported diagnosis?

Page 19: Survey research crash course

Sampling frame: you can contribute

• If at all possible, provide a list from a third party database, stratified by the variable(s) you want proportionally represented, e.g. decile, share of a product, etc. (a stratified-sampling sketch follows this list)
– Make sure the user of the list has certified that the list remains your property and will not be merged into any permanent database
– Do not limit yourself to “heavy prescribers” of something unless there is good reason to do so
• If using a phone center’s database to interview or recruit for F2F, to how large a group does the firm have access?
– Routinely, “friendlies” are called first unless you specify otherwise
– Unless your vendor has her own phone center, one phone center can be doing competing projects; make sure that no survey materials have our name or the name of our products (another reason to blind all screening questions and all materials!)
– For central location, this means not only experienced group members but group members who may know each other
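
The stratified-sampling sketch referenced above: a proportional draw from the third-party list before handing it to the vendor. A minimal pandas sketch; the column name ("decile") and file name are hypothetical.

```python
# Proportional stratified draw from a client-supplied list.
# The stratifier column ("decile") and the input file are hypothetical.
import pandas as pd

def stratified_sample(frame: pd.DataFrame, stratum: str, n_total: int,
                      seed: int = 42) -> pd.DataFrame:
    """Sample n_total rows, allocated in proportion to each stratum's share."""
    shares = frame[stratum].value_counts(normalize=True)
    pieces = []
    for level, share in shares.items():
        group = frame[frame[stratum] == level]
        k = min(len(group), max(1, round(share * n_total)))  # at least 1, at most the group
        pieces.append(group.sample(n=k, random_state=seed))
    return pd.concat(pieces).reset_index(drop=True)

# physicians = pd.read_csv("third_party_list.csv")          # hypothetical file
# sample = stratified_sample(physicians, stratum="decile", n_total=150)
```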

Page 20: Survey research crash course

Writing the RFP

• Specify background and objectives, audience(s) and methods, desired insights, deadlines for proposal and deliverables, and the deliverables themselves (e.g. topline, report, presentation, data files)
– What hypotheses are you testing? Are you looking for confirmation or new information?
• Specify that the report/presentation is to be prepared by the primary interviewer/moderator, if applicable
• Do not do everything for the vendor. Leave room for the vendor to demonstrate his/her own knowledge of the disease state or symptoms, therapies and treatments used, audience(s) to be surveyed, most appropriate means for doing so and questions to be asked
• Give vendors a week to respond if at all possible
• Provide contact info and make it clear you are open to questions
• Encourage vendors to improve upon your RFP, not to copy it

Page 21: Survey research crash course

Vendor selection checklist

• Are the experience and credentials of primary project personnel clear?
– Ask for references if need be

• Is the universe of potential participants what the objectives demand? Is it representative of the overall universe?

• Are specified honoraria adequate but not excessive? You have the right to ask that they be changed, but that is a red flag

• Does the proposal demonstrate knowledge of the objectives and disease state beyond the RFP?

• Is the vendor clear about the probability of hitting your deadlines, given the audience and honoraria proposed?

• Do you hear phrases like “we’ll see…we might have to increase the honoraria…extend the timeline…lower our sights…change the design”?
– Is there evidence that the vendor has done something like this before?

Page 22: Survey research crash course

Managing the vendor: GIGO

• Follow the Golden Rule: return calls and e-mail promptly and honestly; consider a kickoff meeting or call with the vendor and key internal customers to ensure all are on the same page

• Do not release information that the vendor does not need to know, but obtain a signed confidentiality agreement

• Read the deliverables, and give yourself time to respond fully; don’t rationalize that the vendor will do it right, or you may be disappointed

• Work with the vendor in partnership to obtain the desired result; provide the vendor with feedback during and after the project

• Do not accept errors in spelling, grammar or logic in any deliverables, and do not sign off on deliverables until you receive a corrected version

• Monitor interviews. Ask for tapes of all qualitative work. Listen to the tapes. Listen in to pretests and vendor debriefings. If necessary, listen in to vendor briefings of subcontractors, e.g. phone centers. Ask for copies of interviewer’s guides, pronunciation guides, all field materials.

Page 23: Survey research crash course

Managing the vendor: be proactive

• Ask for daily dispos (disposition reports) while quant work is being fielded
• Ask for final incidence and compliance rates (a small calculation sketch follows this list)
• As field proceeds, if incidence is not what was budgeted, we must pay more or relax criteria
– This should not happen if criteria were appropriate in the first place
• If response rate is low, check the invitation
– Is it enticing? Are the topic, length and honorarium clear?
– Are you fielding in the middle of a pertinent national meeting?
– If phone recruiters are working, ask to monitor their calls
• Always get a final respondent list to ascertain potential friendlies and geographic bias (sign a CASRO form first)
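
Daily dispos lend themselves to a quick running check of incidence and break-off rates, as referenced above. A minimal sketch with hypothetical counts; vendors define these rates slightly differently, so confirm the formulas with yours.

```python
# Quick field metrics from a daily disposition report.
# Counts and category names are hypothetical; confirm definitions with the vendor.
dispo = {
    "contacted": 600,    # reached and screened
    "qualified": 180,    # passed the screener
    "completed": 95,     # finished the interview
    "break_offs": 12,    # started but did not finish
}

incidence = dispo["qualified"] / dispo["contacted"]        # screener pass rate
completion = dispo["completed"] / dispo["qualified"]       # completes among qualified
break_off_rate = dispo["break_offs"] / (dispo["completed"] + dispo["break_offs"])

print(f"Incidence: {incidence:.0%}, completion: {completion:.0%}, "
      f"break-off rate: {break_off_rate:.0%}")
# If incidence runs well below budget, the criteria (or the list)
# were probably wrong in the first place.
```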

Page 24: Survey research crash course

Ethics are essential

• Although it may be “necessary” to increase honoraria mid-study, it should not be
– Don’t let a vendor play the “inch up” game
– It’s unfair, and actually unethical if you know it’s going to happen
– It also skews your data!
• CASRO: You can’t target, market to or otherwise treat differently a respondent on the basis of his/her answers
– You can, however, model (e.g. compare IMS data with self-reported share of preference)
– And even compare call reports and other field data

Page 25: Survey research crash course

More essential ethics

• Don’t talk to one vendor about another, even to tell her from whom you’ve asked for proposals; if you have to mention a price, never disclose from whom

• Don’t distribute respondent names, even under CASRO guidelines, to those without a “need to know”

• Don’t share contact info for past respondents with any vendor; if the names came from their db originally, they already have them
– If they don’t, the names may have come from another vendor’s db and are not your property
• Don’t share a previous questionnaire or results prepared by vendor A with vendor B without:
– Removing vendor A’s name throughout and not sharing it
– Including a transmittal memo that stresses it is not to be shared with other employees at vendor B and is to be used only for research purposes

Page 26: Survey research crash course

What do you have the right to know?

• To whom the vendor subcontracts any aspect of your study, including methodology, phone work, interviewing, data entry, moderating, logistics, analysis, Web survey programming or hosting

• The software application(s) and procedures that will be used to analyze data, esp. if a conjoint, DCM or segmentation is involved

• The honoraria to be paid, and any changes throughout the study, even if you aren’t asked to pay for them

• What the invitation to participate or the script for the recruitment call looks like

• The sources of the database(s) from which respondents are being recruited, and the sampling algorithm that is being used

Page 27: Survey research crash course

Screening questions: what do you really need?

• NEVER telegraph screening criteria like the following:
– Are you Board certified in egg drop soup preparation?
– Do you treat at least 10 hyperlipidemic patients per month?
– How many MS patients do you treat per month:
• 0-10
• 11 or more
• All questions leading into the main questionnaire should either be terminate points or drive the version of the survey that respondents see; no exceptions! (a screener-logic sketch follows this list)
– With rare exceptions, years in practice, age, participation in clinical trials and/or total patient volume should not be screening criteria
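
The screener-logic sketch referenced above: every question either terminates or versions the survey, and volume is asked with several plausible ranges so the qualifying cutoff is never telegraphed. The specialty, the ranges and the cutoff are hypothetical illustrations, not figures from the deck.

```python
# Sketch of screener logic: every question is either a terminate point or
# drives survey versioning, and the ranges never telegraph the real cutoff.
# Specialty, ranges and the qualifying cutoff are hypothetical.

TERMINATE = "terminate"

def screen(specialty: str, ms_patients_per_month: str) -> str:
    """Return TERMINATE or the survey version this respondent should see."""
    if specialty != "Neurology":             # terminate point, asked without a hint
        return TERMINATE
    # Volume is asked with several plausible ranges (0, 1-5, 6-10, 11-20, 21+),
    # so the respondent cannot guess which range qualifies.
    if ms_patients_per_month == "0":
        return TERMINATE
    if ms_patients_per_month in ("11-20", "21+"):
        return "version_A_high_volume"       # drives versioning, not a hidden screen-out
    return "version_B_low_volume"

print(screen("Neurology", "6-10"))   # version_B_low_volume
print(screen("Cardiology", "21+"))   # terminate
```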

Page 28: Survey research crash course

Screening questions: the minimum is the max

• Re-verify specialty focus or Board certification
– Patient volume for the relevant disorder (total if a rare or infrequently seen disease, e.g. stage IV breast cancer; otherwise per week or per month)
– May need to verify rx of specific drug(s)
• Present a clear patient frame and time frame. Always ask about either current regimens (percentage of patients with X on drugs X-Z) or scripts over a time period, e.g. “in the last month, what percentage of your statin prescriptions for newly-diagnosed hyperlipidemia patients were for each of the following?”

Page 29: Survey research crash course

Honoraria: potential Achilles heel of projectability

• Differences in honoraria among vendors are often large
– Are honoraria adequate? Cash, gifts or points?
– Major third party data sources pay as little as $100 per year for long, immensely detailed forms
– You get what you pay for; lowball potential respondents and response rates may be “OK” but yield skewed results, because you interviewed the curious, the bored and the compulsive…instead of the mainstream
• “Ballpark” honoraria for a 20-minute Web or phone study should be >$80 for PCPs and >$100 for specialists
– Consider topic, complexity, time frame, audience, competing demands and interest level
– For convention work, organizers may restrict the monetary value of the gift for on-site work; off-site groups are reimbursed as normal

Page 30: Survey research crash course

Recruitment begins your corporate exposure

• As the client, whatever the method, understand and approve the vendor’s database source(s), algorithm for sampling, method(s) for inviting persons in that database, honoraria and the means of screening potential respondents
– Your firm is the (legal) catalyst for recruitment
• Online recruitment options: popup windows, banner ads, sidebars, body text, on your site and/or others’
– With phone or fax follow-ups for small universes
• When using third party target lists with e-mail addresses, you must have confirmed evidence of opt-in, or reconfirm/obtain it offline
– Illegitimate research risks restricted access to panelists, disenrollment, low response rates, formal complaints, etc.

Page 31: Survey research crash course

Respecting respondents: make sure it happens

• Invitations to participate in surveys should contain:
– Survey topic
– Length of survey
– Honorarium
– URL, user ID and password (if applicable)
– Toll-free number if respondent needs help
• Once the respondent is doing the survey, s/he should have access to:
– Live chat support function (Web)
– 800 number
– E-mail help

• At completion of survey, respondent should be reminded of honorarium time frame and given contact information should s/he have questions, need to change mailing address, etc.

Page 32: Survey research crash course

What makes a good question?

• Only one adjective or adverb
• Positive phrasing
• Active verbs
• Human-speak, not ad, computer or marketing-speak
• A scale that is based on the question stem
• Brief, precise, concise, unequivocal wording
• Time reference: over the last year, month, week, since residency or fellowship…
• Object reference: you, your prescriptions, your patients’ prescriptions, your patients, family members of your patients…
• Patient reference
– Patients diagnosed with…treated for…seen for…evaluated for…hospitalized for…seen in the office…without insurance coverage for prescriptions…

Page 33: Survey research crash course

What makes a good scale or set of answers?

• 0-10, where 0=“not at all,” “never”… and “10” = “always,” “completely,” “completely effective”

• Disagree…somewhat agree…completely agree
• Much lower/somewhat lower/the same/somewhat higher/much higher…much more likely
• Not at all likely/somewhat likely/very likely
• Clear distinction between scale point labels if used: never “strongly” vs. “very” or “extremely” vs. “very”
• Do not mix answer items or attributes that are very different structurally, e.g. “is dosed once weekly” vs. “efficacious in green dot disease”
• Branch questions to avoid long lists, e.g. first ask about the percentage of pts on mono vs. combo tx, then drill down separately into each (as sketched below)
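
A minimal sketch of that branching drill-down; the question wording, the mono/combo split and the follow-ups are hypothetical.

```python
# Sketch of branching to avoid one long list: ask the mono vs. combo split
# first, then drill down only into the arms the respondent actually reported.
# Question wording and the respondent's answers are hypothetical.

def followup_questions(pct_monotherapy: float) -> list[str]:
    """Return only the drill-down questions that apply to this respondent."""
    pct_combo = 100 - pct_monotherapy
    questions = []
    if pct_monotherapy > 0:
        questions.append("Of your monotherapy patients, what percentage is on each agent?")
    if pct_combo > 0:
        questions.append("Of your combination-therapy patients, what percentage is on each regimen?")
    return questions

print(followup_questions(100))  # only the monotherapy drill-down is shown
print(followup_questions(60))   # both drill-downs are shown
```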

Page 34: Survey research crash course

What can we measure?

• Likelihood of consideration, trial, purchase, further information-seeking

• Frequency, interval, duration, strength of need

• Means and urgency of information seeking

• Interest in various factors, decisions, categories, purchasing

• Media usage and role of word of mouth in purchase process

• What really closed or lost the sale: not necessarily the most recent or longest exposure; may have been something that didn’t happen

• How information is used (did I window shop till I dropped, then pay the first clerk I saw when my feet started to hurt? I listen to my friends’ political views, but do I vote on them?)

• Key drivers of perceived value (how price and quality interact), quality, price competitiveness and other desirable attributes for the product class (e.g. which lipid values are looked at in which patients to determine a statin’s efficacy?)

Page 35: Survey research crash course

More we can measure

• How many X in Y time period (total or frequency)

• Total hours or minutes per time period or typical duration

• Stage or status of need (Are the tires bald? Or are they old and so assumed to be bald? How will the driver judge when it is time to replace them, other than if one blows out?)

• Yes/no: would you consider/have you heard of/do you own/have/rent…have you ever

• If you could improve/change one thing…

• To probe a closed end, try “why do you say that?” or a variant first

• Have you discussed X with Y? What did you decide?

• Aside from X, how do you stay up to date on blue hair disease?

• What kind of information about X would be most useful (yes/no on specific types sought, facts, data, pricing, options, support)

Page 36: Survey research crash course

Typical survey question types

• Check all that apply/ever
• Best/worst/most/least (applicable/likely, etc.)
• Numeric/percentages/proportions of patient types, therapies by subtype or disorder, allocation of time, etc.
• Rankings, ratings or point allocation tasks
• Forced totals and choices (where appropriate; don’t precode away your ability to understand the market)
• Other specifies, which can be piped into other questions
• “Three most potent…” or pseudo-interval scale where “best” and “worst” agents are first identified and others placed between
• Disadvantage-to-advantage scale
• Trends and reasons
• Benchmarking: “at what reduction in mortality would you begin to consider prescribing a therapy for…”
• Customized, open ended probes: “You say that Merck is the most managed care friendly pharmaceutical firm. Why is that?”

Page 37: Survey research crash course

Meaningless questions: ever so tempting

• What is the first anti-diabetes medication that comes to mind?
• Of what other anti-diabetes medications are you aware?
• Have you ever heard of Rebif?
• For what disorder(s) is Zyrtec indicated?
• Without looking at your sample closet, do you think you have enough Nasacort samples? [setting-dependent]
• What were your net practice revenues last year?
• How do nurse practitioners feel about Claritin-D?
• What company uses the most celebrities in its advertising?
• What was the last TV commercial directed toward patients that you saw? [painful recall; ask about the most recent positive or negative and allow “none” and “no recall”]

Page 38: Survey research crash course

Questionnaire format starts with the nits

• Start the blinded product label at “L” or a similarly innocuous point (avoid using a letter that any major comparator begins with, e.g. do not use “L” in a statin study) and never number items (a labeling sketch follows this list)

• It’s “percentage,” not “percent of”
• Scales are “from 0 to 10,” not “an 11-point scale,” and avoid pull-down boxes for rating scales
• Keep the answers away from the question itself, except on the phone
• Ask about percentages, not numbers, except early on
• Reassure as to the validity of a “best estimate,” or “as far as you are aware,” “approximately,” never “best guess”
• Use “most,” “some” or “other(s)” rather than “you” for sensitive issues
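
The labeling sketch referenced above: assign blinded product labels starting at an innocuous letter while skipping the first letters of major comparators. The comparator names here are hypothetical.

```python
# Assign blinded labels to test products, starting at an innocuous letter,
# skipping the first letters of major comparators (hypothetical names) and
# never using numbers.
import string

comparators = ["Lipitor", "Zocor", "Crestor"]            # hypothetical comparators
forbidden = {name[0].upper() for name in comparators}     # {"L", "Z", "C"}

def blinded_labels(n_products: int, start: str = "M") -> list[str]:
    """Return n_products labels, cycling the alphabet from `start`, minus forbidden letters."""
    letters = string.ascii_uppercase
    pool = letters[letters.index(start):] + letters[:letters.index(start)]
    usable = [ch for ch in pool if ch not in forbidden]
    return [f"Product {ch}" for ch in usable[:n_products]]

print(blinded_labels(3))   # ['Product M', 'Product N', 'Product O']
```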

Page 39: Survey research crash course

Don’t let format stand in the way of the answers

• Using decades for age is passé, irritating and waves the question like a red flag (use 25-34 ranges, or, better yet, just ask for the age!)

• Using tens as income ranges does all the same things

– Do you really need to know ethnicity, education, income, gender and/or age? Can these be pulled from another file?

• Numbering answer list items for no reason, e.g. regimens or products, can bias or simply confuse

• Not specifying forced totals or ranges in a question, then displaying an error message, leads to rushed corrections and poor data

• Unnecessarily characterizing open ends (“in a few sentences…”) reads like “what I, your first grade teacher, want you to do”

• Ordering answer lists by perceived popularity or appropriateness, instead of logically or alphabetically, makes it harder for the respondent to find things, and less likely to care enough to do so

Page 40: Survey research crash course

Asking sensitive questions

• Present in terms of requests: “Please stop me when I read the range that best describes your household income before taxes.”

• Introduce demographics: “These final questions will help us group your answers with others.”

• Don’t assume the worst or the best. “How many times did you fall asleep during work today?” is wrong. “During work today, did you sleep at all?” is right. Then if yes, the respondent is asked the quantity question.

• Let the respondent see that your specificity is for her benefit, not yours

• Don’t ask a question to which the respondent knows the “socially correct” answer; it’s an insult and an invitation to mislead:

– Do you usually feed your children nutritious food?

– How important is it to vote for the best-qualified candidate?

Page 41: Survey research crash course

Know thy frame

• The last three…the next two…over the last year… most recent…oldest…most often…

• Patients who have been prescribed an NSAID in the last year…osteoarthritis patients not on rx

• Three most…least important…most often seen…three statins you associate most often with reduction of LDL in diabetics…the most difficult case of RA you have treated…the last patient of whose outcome you are uncertain…most important reason to stock Mavik samples

• NOT: ten reasons to write Mavik, most important reason to write albuterol with a nasal steroid, sources of information about health issues

Page 42: Survey research crash course

Compare answers for robust models

• Paired comparisons, A vs. B

• Rankings

• Allocate 10 or 100 points

• Not at all/somewhat/very important

• Regress potential drivers against preference share or current rx (a regression sketch follows this list)

• Ask in terms of a percentage: “what percentage of your statin scripts are changed due to a formulary restriction?”

• Always make sure each factor or attribute reflects only one thing: what is wrong with measuring agreement with “I would never buy an Acura, because they are too expensive”?
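
The regression sketch referenced above: ordinary least squares of stated preference share on attribute ratings. All data are hypothetical, and in practice you would check collinearity, scaling and sample size before trusting the coefficients.

```python
# Sketch of regressing potential drivers (attribute ratings) against
# self-reported preference share. All data are hypothetical; check
# collinearity, scaling and sample size before trusting coefficients.
import numpy as np

# Columns: efficacy, tolerability, convenience ratings (0-10) per respondent
X = np.array([[8, 6, 5], [9, 7, 4], [5, 8, 7],
              [6, 5, 9], [7, 9, 6], [4, 6, 8]], dtype=float)
y = np.array([55, 65, 30, 25, 45, 20], dtype=float)   # stated preference share (%)

X_design = np.column_stack([np.ones(len(y)), X])       # add an intercept column
coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)    # ordinary least squares

for name, b in zip(["intercept", "efficacy", "tolerability", "convenience"], coef):
    print(f"{name:>12}: {b:+.2f}")
# Larger positive coefficients suggest stronger drivers of stated preference.
```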

Page 43: Survey research crash course

The pitfalls of double-barreled statements

• “I would never buy an Acura, because they are too expensive” means what?

– Acuras are too expensive?

– I would never buy an Acura, period?

– I will buy an Acura while still complaining about the price?

– I won’t buy an Acura but not because of cost?

– I couldn’t afford any Honda; $3K for a used car is my limit?

– I take the train?

– I have no idea what an Acura is?

– I don’t drive?

Page 44: Survey research crash course

Care in crafting attribute lists pays off

• Attributes should be associated either with the respondent or with something else, e.g. a product

• Factors should be controlled by the respondent or not (a patient controls compliance, but not side effects)

• List items should be unrelated to each other causally, e.g. we would not ask respondents to rank order factors including “side effects” in a list along with “incidence of tongue mangling”

• If any of the above rules is violated, the resultant data are meaningless

• Use positive statements, not “trick” double negatives

• Use parallel structure, but mix things up a little (don’t present 5 agreement statements in a row that start with “pancreatic cancer”)

Page 45: Survey research crash course

The “chase” question

• Sometimes a series of branching questions is unnecessary and a waste of time. Ask a direct question in a neutral, non-threatening way.

– Have you ever obtained health care information using the Internet?

– In the last year, have you used the magazine Good Housekeeping to help you purchase makeup?

• Do not cluster chase questions. Ask them sparingly and space them out, except for demographics:

• “Are you: Female/Male”

• “Is your ethnic background best described as:”

Page 46: Survey research crash course

How to insult, patronize and misuse respondents

• Phone-speak on screen: “Next we’ll talk about,” “I’d like to now ask,” “Doctor, now please…”

• Huge intimidating product grids measuring awareness, usage and/or ratings only prompt trigger-clicking (mindless mouse clicks to get on with the survey) or eyes-closed ratings

• Ad-speak: “concept description,” “white space,” “body copy,” “product description,” “story board”

• Insults to respondents’ intelligence: “record below,” “type here,” and, worst of all, “click here”; similarly, do not specify “choose one only” if a likelihood or yes/no question

• Asking for other specifies which will not be analyzed—breaks flow, wastes precious time and respondents often sense that their effort is futile

Page 47: Survey research crash course

More insults and abuse: don’t let it happen

• Over-outlining, setting the stage, introducing…
• “Time remaining” thermometers or comments: misleading, distracting, scream the context, and invite breakoffs, guesses of remaining time and fitting answers into it

– Make sure vendor is accurate about length in the invitation and intro!

• Listing potential answers in both the question and the answer blanks (the more they have to read, the less time for completion)

• Placing demographic questions that are not terminate points or branching criteria at the beginning, thus risking qualified breakoffs

• Failure to use automation, e.g. forcing respondent to restate choices: “you may have seen/heard these before” (Why?)

Page 48: Survey research crash course

Don’t “lead the witness”

• Preceding a specific question with a general intro tempts respondents to skip and misinterpret the real question

• Listing scale points or choices from the “top down,” e.g. starting with “10,” “completely successful” or “increase,” biases responses; the only exception to this rule is yes/no

• Placing performance ratings on the same page/grid as attribute importance ratings increases the likelihood they’ll converge

• Using abstract-speak: “Was the level of samples satisfactory?” “On the following scale, how motivating was the message?” “Was the ad believable?” guarantees less than thoughtful answers to these false dichotomies and oversimplified questions

• Giving the show away: “Are you aware that not all asthma drugs are equally safe for children?” results in one of four responses: (yes, I actually know; yes, because if I say no, I’m an idiot; no, I’m clueless; no, I’m feeling oppositional)

Page 49: Survey research crash course

Open ends: robust feedback for marketing

• Don’t allow an open end that challenges professional judgment: “Why do you not prescribe Clarinex for urticaria?” looks even more negative on screen or in print. Instead, consider:
– What are the major reasons for not prescribing Clarinex for urticaria?
– You do not currently prescribe Clarinex for urticaria. Why is that?
– Under what circumstances might you begin prescribing Allegra for urticaria?
– In what time frame, if any, would you anticipate prescribing Clarinex for urticaria? (where a choice is “not in the foreseeable future”)
• Don’t ask the vendor to fix verbatims’ spelling, grammar or anything else; get the open ends in Excel with the other data, even if you get an SPSS file too (a minimal export sketch follows)
– Use some pithy open ends in your summary, but read them all
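
The export sketch referenced above: keep the verbatims, uncorrected, in the same Excel workbook as the structured data. File, sheet and column names are hypothetical; pandas needs the pyreadstat package installed to read an SPSS file.

```python
# Sketch: keep open-end verbatims alongside the structured data in one
# Excel workbook, unedited. File and column names are hypothetical;
# pd.read_spss requires the pyreadstat package.
import pandas as pd

data = pd.read_spss("study_data.sav")                    # vendor's SPSS file (hypothetical)
verbatims = data[["respondent_id", "q12_open_end", "q25_open_end"]]

with pd.ExcelWriter("study_deliverable.xlsx") as writer:
    data.to_excel(writer, sheet_name="all_data", index=False)
    verbatims.to_excel(writer, sheet_name="open_ends", index=False)
# Read every verbatim; quote the pithy ones in the summary, uncorrected.
```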

Page 50: Survey research crash course

What makes a good qualitative interviewer or moderator?

• Listen for non-linear probing that follows the respondent, not the discussion guide:

– Because…

– Under what circumstances would that happen?

– Can you give me an example?

– Does that happen often?

• The interviewer should not “beat a dead horse”
– If the respondent has already made a blanket statement that section III will not be applicable, it shouldn’t be asked

– This requires that the interviewer be listening, not just talking

• With a hesitant or skeptical respondent, the interviewer reinforces the dialogue:

– It’s whatever you think is important

– Don’t let me put words in your mouth

– We want to know how you perceive Begonex

Page 51: Survey research crash course

Qualitative interviewing: warning signs

• Monitor interviews and debrief the interviewer after each one

– What was interesting to you? What wasn’t? What remained unasked and unsaid?

– Modify the discussion guide if questions aren’t productive, everyone says the same thing, or the issues don’t apply to your sample

– Ask the interviewer for her ideas as well

• Be concerned if:

– The interviewer adheres word for word to the discussion guide

– You hear multiple probes that are expressed in terms of the interviewer’s context or needs: “Yes, but I need a number for that…that’s not what I said…you’re the first one who’s said that…that’s not what I’m looking for”

– You want to follow up interesting comments, but the interviewer charges ahead with the next question

Page 52: Survey research crash course

Leverage qualitative feedback

• Ask some numeric questions, preferably as warmups

– But if you need mostly numbers, you need a quant study

• Don’t assume that the respondent differentiates drugs on efficacy, side effects or cost. Find out what her rx drivers are. In other words, don’t put the cart before the horse.

– Deconstruct: Do you differentiate the statins on the basis of side effects? If so, how? If not, are there any important differences between them? If so, what are they? If not, on what basis do you decide which statin to prescribe for a given patient?

• Listen to the interviews. Even from behind a mirror, it’s not entertainment; it’s vital business information, the lack of which could lose your next blockbuster!

Page 53: Survey research crash course

Researchers’ value: best practices and insight

• Help marketing understand how:
– Benefits, costs and value of products and services are determined by target audiences, and how that’s changing
– New technology, science and products influence expectations
– Your corporate image and brands’ equity can gain and hold value via appropriate marketing
• In survey research and Web development:
– Follow the highest standards
– Work only with vendors who do the same
– Look for ways to enhance and convey standards’ benefits