Comparing Methods to Make Research More Patient Centered

Consuelo Wilkins, MD, MS1; Sarah C. Stallings, PhD2; Victoria Villalta-Gil, PhD2; Mckenzie Houston, MPH2; Yolanda Vaughn, MS3; Alan Richmond, MSW4; Laurie Novak, PhD1; Yvonne Joosten, PhD1; Chris Simpson, MS1; Tiffany Israel, MSW1; Alaina Boyd, PhD2; Margaret Hargreaves, PhD2; Velma McBride Murry, PhD1; Leslie Boone, MPH1; Kenneth Wallston, PhD1

1Vanderbilt University Medical Center, Nashville, Tennessee
2Meharry-Vanderbilt Alliance, Nashville, Tennessee
3Neighborhoods Resource Center, Nashville, Tennessee
4Campus Community Partnerships for Health, Raleigh, North Carolina

Original title: Improving Patient Engagement and Understanding Its Impact on Research Through Community Review Boards
PCORI ID: ME-1306-03342
HSRProj ID: HSRP20143601

_______________________________

To cite this document, please use: Wilkins C, Stallings S, Villalta-Gil V, et al. (2019). Comparing Methods to Make Research More Patient-Centered. Washington, DC: Patient-Centered Outcomes Research Institute (PCORI). https://doi.org/10.25302/12.2019.ME.130603342



Table of Contents

ABSTRACT ................................................................................................................................ 4

BACKGROUND ......................................................................................................................... 7

PARTICIPATION OF PATIENTS AND OTHER STAKEHOLDERS IN THE DESIGN AND CONDUCT OF RESEARCH AND DISSEMINATION OF FINDINGS ...................................................................... 10

Collaborative Level of Engagement .......................................................................................... 10

Designing the Study .............................................................................................................. 10

Conducting the Research ...................................................................................................... 11

Monitoring the Study Progress ............................................................................................. 11

Disseminating the Research Results ...................................................................................... 11

Consultative Level of Engagement ........................................................................................ 11

METHODS .............................................................................................................................. 13

Overview .................................................................................................................................. 13

Figure 1. Study flow CONSORT diagram of study enrollment, randomization, and allocation ............ 14

Methods for Assessing CE Studio Input .................................................................................... 14

Design ................................................................................................................................... 15

Setting and Population .......................................................................................................... 15

Randomization ...................................................................................................................... 15

Data Collection ...................................................................................................................... 16

Qualitative Analysis ............................................................................................................... 17

Development of the Stakeholder Impacts in Research Taxonomy ................................... 17

Content Generation .............................................................................................................. 18

Table 1. Semistructured Interview Materials ....................................................................................... 19

Pilot Testing .......................................................................................................................... 21

Items Included in the Stakeholder Impacts in Research Taxonomy ...................................... 22

Table 2. Potential Areas of Impact for Patient (and Other Stakeholder) Engagement ........................ 22

Figure 2. Visual representation of the stakeholder impacts in research taxonomy: A taxonomy of standard terms for areas of stakeholder impact in research ............................................................... 27

Methods for Developing the Quantitative PCoR Scale ............................................................. 28

Content and Item Generation ............................................................................................... 28

Evaluation of Candidate Items .............................................................................................. 29


Pilot Testing of Initial PCoR Scale .......................................................................................... 29

Table 3. Analysis of Original Version of the PCoR Scale* ...................................................................... 30

Preliminary Data and Scale Analysis ...................................................................................... 32

Testing of Revised PCoR Scale ............................................................................................... 32

RESULTS ................................................................................................................................ 33

Comparison of Input from CE Studios and T2 Studios .............................................................. 33

Figure 3. Differential frequencies of coding counts in CE Studio and T2 Studio transcripts ................ 34

Stakeholder Impacts in Research Taxonomy ............................................................................ 35

Conceptual Overview ............................................................................................................ 35

Pre-research and Infrastructure ............................................................................................ 35

Study Design and Implementation ........................................................................................ 36

Analysis and Dissemination ................................................................................................... 36

Post-research ........................................................................................................................ 37

Ethics and Engagement ......................................................................................................... 38

Process Improvement ........................................................................................................... 38

Communication ..................................................................................................................... 38

Development and Validation of the PCoR Scale ....................................................................... 38

Results from the Development of the First Version of the PCoR Scale .................................. 38

Table 4. Revised PCoR Scale Items, Descriptive Statistics, PCA Loading Solution, and Pearson Correlation Coefficients With Total Score ............................................................................................ 40

Table 5. PCoR Scale (Revised, Final Version) Mean Scores for Abstracts Funded by PCORI ................ 41

DISCUSSION ........................................................................................................................... 42

Limitations ............................................................................................................................... 44

CONCLUSIONS ....................................................................................................................... 45

REFERENCES .......................................................................................................................... 46

PUBLICATIONS ....................................................................................................................... 50

APPENDIX A: Community Partner Organizations Represented in the Community Advisory Council .................................................................................................................................. 51

APPENDIX B: Literature Search for Candidate Item Generation ............................................. 52

APPENDIX C: Qualitative Analysis Codebook ......................................................................... 55


ABSTRACT

Background: Engaging patients and communities in research has emerged as a critical element in advancing health research, improving the relevance of research, and enhancing the implementation of research findings. Although methods of engaging patients exist, variability among methods and a limited understanding of how engagement impacts research may limit their broader use.

Objectives: The overall purpose of this Methods Award was to develop evidence on the impact of stakeholder engagement on research, a critical methodological gap identified by the Patient-Centered Outcomes Research Institute (PCORI). Specifically, we aimed to demonstrate whether input from patients and communities elicited using the Community Engagement Studio (CE Studio, formerly called Community Review Board)1 was more patient-centered than input from a Translation Studio (T2 Studio), which is composed of researchers.2 To evaluate the input from studios, we developed a standard set of terms, the Stakeholder Impacts in Research Taxonomy, to describe potential changes in research due to stakeholder input, and a quantitative scale that would be generally applicable to the assessment and evaluation of stakeholder engagement.

Methods: This methods study randomized 20 researchers to a CE Studio, a structured method of eliciting project-specific input from patients and community stakeholders, or to a T2 Studio, which obtains expert advice from researchers. Any faculty member or research trainee at Vanderbilt University (VU) or Meharry Medical College (MMC) was eligible to be randomized. Researchers who requested a CE Studio through StarBRITE, Vanderbilt's online portal for research support services, and agreed to be in the study were randomized to a CE Studio (a panel of community members or patients) or a T2 Studio (a panel of researchers). Those who were randomized to the T2 Studio were also offered a CE Studio. Randomization to CE Studio or T2 Studio was done using random number generator software (www.r-project.org). Each studio panel was convened to provide project-specific input. The 153 stakeholders who participated as experts on CE Studio panels were patients, caregivers, or patient advocates identified by health status, health condition, or demographic variables based on the project-based needs of the 20 researchers randomized in this project. Stakeholders included individuals with diabetes, heart failure, Parkinson's disease, and sickle cell disease, as well as intensive care unit (ICU) survivors. All stakeholder experts had experience partnering or consulting on a research project or serving on a research advisory board or committee. T2 Studio experts were researchers from VU or MMC. All studios were recorded and transcribed, and experienced qualitative researchers analyzed the data. We compared the project-specific input elicited from CE Studios with that elicited from T2 Studios using qualitative analysis.

To develop the taxonomy (the Stakeholder Impacts in Research Taxonomy) describing where in the research process stakeholder input has the potential for impact and characterizing the types of changes, we used a 3-step approach involving patients in each step: (1) we identified stakeholder impacts from the existing literature and generated standard terms; (2) we evaluated the terms using cognitive interviews and a panel of researchers and stakeholders; and (3) we pilot tested the terms using qualitative analysis.


Because no such measure existed, we created and validated a quantitative instrument measuring patient (person) centeredness of research. We developed the Person-centeredness of Research Scale (PCoR Scale) by (1) content and item generation, (2) item evaluation, (3) initial scale testing, (4) scale revision, and (5) revised scale testing. We tested the initial and revised scales by comparing ratings for abstracts from a PCORI conference with those for abstracts from a national translational research meeting. We determined the scale’s internal consistency reliability with an exploratory factor analysis and Cronbach α computation. We also examined the interrater reliability of the final rating scale.

Results: The CE Studios generated input with more themes consistent with patient-centeredness than the T2 Studios, as well as a striking number of patient experiences expressed in storytelling form. The Stakeholder Impacts in Research Taxonomy has 11 domains (representing categories of research activities) and 59 dimensions, providing a flexible frame for understanding and more formally assessing the impact of stakeholder engagement on translational research. The domains are (1) pre-research, (2) infrastructure, (3) study design, (4) implementation, (5) analysis, (6) dissemination of research findings, (7) post-research, (8) ethics, (9) process improvement, (10) engagement, and (11) communication. The PCoR Scale is a 7-item instrument using a 5-point Likert rating scale that successfully classified as person-centered (based on expert panel assignment) 71.4% of the conference abstracts tested in discriminant analysis (mean total scores: 7.15 vs –2.08; t = 7.37; p < 0.001). An exploratory factor analysis showed the PCoR Scale to be unidimensional, with an eigenvalue of 5.59 that explained 79.8% of the total variance, and we computed Cronbach α to be 0.95, showing high internal consistency reliability.

Conclusions: Patients and community members provided project-specific input across a broad range of research areas, and this input was more person-centered than input from researchers. The Stakeholder Impacts in Research Taxonomy and the quantitative PCoR Scale can be used by others in the field to help standardize this work and evaluate the patient (person) centeredness of research products.

Limitations: Although the CE Studio can be used across many studies, it is a consultative method of engagement and may not elicit feedback reflecting the full range of stakeholder engagement. Because the intent of this study was to provide evidence to support the added value of stakeholder engagement, we compared input from stakeholders with input from researchers. We did not compare CE Studios with a different method of stakeholder engagement, which might result in more or less person-centeredness.

Definitions: In this report, the term stakeholder includes patients, caregivers, patient advocates, and other community members, but not payers, policymakers, or health care product producers. Researchers are also stakeholders in research, by definition, but for the purposes of this report, the term stakeholder is used to distinguish someone who is a community representative stakeholder from others with interest in the research. We have used community/patient stakeholder where that clarification seemed appropriate. A community representative is a person whose primary affiliation is with a nonacademic, nonresearch community-based organization and/or who represents a defined community.3

Note: The term patient-centered, well-defined by PCOR, is being replaced by the broader term person-centered by researchers in the field.42,43 This terminology shift occurred during the course of our project, and we use both terms, reflecting the terms used during the research and moving toward the newer term during development of the quantitative scale, as noted in the relevant methods section.


BACKGROUND

Methods for engaging stakeholders in research exist but are not widely implemented because of barriers and lack of incentives. Further, engagement evaluation and assessment are hampered by a limited understanding of, and lack of measurements for, how engagement impacts research. The emergence of Patient-centered Outcomes Research (PCOR) as an important mechanism to accelerate the translation of research into practice has heightened the need to engage patients, consumers, and other community/patient stakeholders in the research process.4-7 Successful methods of engaging stakeholders include advisory boards, semistructured interviews, focus groups, surveys, and community listening sessions8-10; however, a common challenge to engagement in research specifically is researchers' lack of the experience and training necessary to meaningfully engage stakeholders, members of the broader community, in the research process.11-20 Thus, many skilled researchers are unprepared to implement methods to engage stakeholders in their research. Without appropriate training or experience, attempts to encourage stakeholder participation are often ineffective and burdensome, and leave stakeholders feeling disenfranchised.21-24

Additionally, most research occurs in academic institutions in which the infrastructure and incentives are not explicitly supportive of building partnerships with stakeholders.5,25 This lack of support presents significant challenges to researchers seeking to engage stakeholders, and the process is often resource intensive and time consuming.13,24,26-30 Centralizing the basic infrastructure (eg, policies, procedures, informatics tools) associated with these proven methods, so that it can be shared rather than recreated, would provide a foundation for true advancement of PCOR methods.

In recent years, several approaches have been developed or refined to overcome barriers to engagement. The Community Engagement (CE) Studio is a structured approach to community/patient stakeholder engagement that allows researchers to obtain direct input from representative groups to enhance their research design, implementation, translation, and/or dissemination.1 The CE Studio is modeled after Clinical and Translational Research Studios, an award-winning program that provides researchers with structured, project-specific feedback.2,31-33 A studio is a 1- to 2-hour session in which a group of experts provides feedback on research to an investigator requesting it. The experts are convened by the studio staff, rather than the researcher, and the staff convenes, records, and reports recommendations from the studio meeting. A neutral community facilitator moderates the meeting. This administrative structure of the CE Studio relieves the investigator of the burden of identifying and recruiting stakeholders as well as convening the meeting. The CE Studio has the added benefit of providing consultation with an experienced team that can determine if the CE Studio is the best approach to obtaining stakeholder input, determine the appropriate characteristics of the stakeholder group, and provide researchers with coaching to prepare for the meeting with stakeholders.

A key element of the CE Studio is the guidance provided by a team with substantive experience in patient and consumer engagement. Because the CE Studio is tailored to the specific project, the patients/community panel can be identified based on the stage of the PCOR process and the purpose of the stakeholder engagement. The CE Studio is a consultative approach to engagement and is not ideal for research topic generation, which benefits from a larger sample, nor is it intended for community-based participatory research, which requires long-term relationships between researchers and community partners.

A lack of standardized approaches to assessing the impact of community/patient stakeholder engagement on research, across different methods or approaches, limits the widespread implementation of engagement because data on its effectiveness and quality are lacking. There is a well-recognized need for standardized approaches to assessing stakeholder engagement in research.34-40 Although engaging stakeholders in research is increasingly believed to be useful, there is limited evidence demonstrating the value of engagement or the return on the investment of engaging stakeholders.39,41 A recent systematic review of patient and public involvement across all health and social science research found significant variability and inconsistency in how the impact of engagement was reported, and information related to engagement was often missing.34 Additionally, the review found "none had attempted any quantitative measurement, reflecting the lack of robust tools specifically developed to provide a measure of the extent of impact."34 To advance the field of engagement science and "build an evidence base that is coherent, generalizable, and allows comparison across studies,"39 we must develop tools and methods to allow rigorous evaluation of the impact of engagement on research outcomes.

In designing this study of whether the CE Studio is an effective method for obtaining patient-centered input, we carefully considered the best approach because there are barriers to an optimal study design: (1) Currently there is no gold standard for obtaining stakeholder input, and (2) given that 100% of the researchers from prior CE Studios believed the CE Studio improved their projects,1 researchers were unlikely to agree to randomization to a no-stakeholder-input group. With these limitations in mind, we chose to compare the CE Studios with Translation (T2) Studios, which also use a consultative method to elicit feedback, providing expertise from seasoned academic researchers. The specific aims of this project were to (1) assess the effectiveness of the CE Studio in obtaining person-centered input, (2) develop a taxonomy of changes to research that could be attributed to stakeholder engagement, and (3) create and validate a quantitative instrument to assess the person-centeredness of research.


PARTICIPATION OF PATIENTS AND OTHER STAKEHOLDERS IN THE DESIGN AND CONDUCT OF RESEARCH AND DISSEMINATION OF FINDINGS

The overarching goal of this project is to facilitate methods of community/patient stakeholder engagement that will lead to research that is more patient (person) centered. Our approach is guided by the principles of respect, trust, co-learning, and transparency. Our methods are intended to facilitate stakeholder input relevant to respect for, and responsiveness to, patient preferences, needs, and values.

In total, we engaged 167 community/patient stakeholders in the design and conduct of this project. These included 2 stakeholders who were integral members of the research team and participated in all aspects of the study, including decision making; 153 stakeholders who served as experts in a CE Studio; and 14 stakeholders, reflecting a mix of faculty experts in community engagement and community representatives with expertise in CE, who gave input on the terms used to describe the impact of engagement on research and the codes used in the qualitative analysis. Six different community representatives (2 pilot rounds with 3 coders each) provided feedback on the taxonomy through its use in coding transcripts.

Collaborative Level of Engagement

Designing the Study

Our research team included 2 stakeholder researchers, Yolanda Vaughn and Al Richmond. Both are experienced leaders of community organizations and have partnered with researchers in the past. Ms. Vaughn is a community partner with the Neighborhoods Resource Center. Mr. Richmond is a community partner with Campus Community Partnerships for Health (CCPH). They identify as patients and patient advocates in their work as community partners in this research. They were involved in the conception of this project, writing of the proposal, and design of the study.


Conducting the Research

Both Ms. Vaughn and Mr. Richmond were members of the primary research team and participated in the development of tools, implementation of the study, analysis of the data, and dissemination of the results, including as co-authors on publications. They participated in team meetings with equal voice to the academic team members. Ms. Vaughn and Mr. Richmond were trained in coding and performed some of the qualitative analysis. In addition to Ms. Vaughn's and Mr. Richmond's involvement in data collection and analysis, 6 stakeholders were interviewed and provided input on the taxonomy, and 6 stakeholders served as abstract raters for the Person-centeredness of Research Scale. All stakeholders have been compensated for their time.

Monitoring the Study Progress

The Community Advisory Council, to which our Community Engaged Research Core reports, monitored the progress of this study. This council comprises 18 community and patient stakeholders and meets quarterly. The community partner organizations represented on the Community Advisory Council are listed in Appendix A.

Disseminating the Research Results

We partnered with CCPH, a national organization, and the Neighborhoods Resource Center in Nashville, Tennessee, to assist with disseminating the results of this project to the relevant stakeholder groups. CCPH hosted a webinar to share results with a national audience of community members and researchers. The Neighborhoods Resource Center helped prepare news releases, distributed results via their newsletters, and hosted a local community forum.

Consultative Level of Engagement

The 153 stakeholders who participated in CE Studios represent a range of groups identified by health status, health condition, or demographic variables based on the project-based needs of the 20 researchers randomized in this project. Stakeholders included individuals with diabetes, heart failure, Parkinson's disease, and sickle cell disease, as well as intensive care unit (ICU) survivors.

We are committed to using methods that recognize the added value stakeholders bring to the research process.3 Our intent is to facilitate the capture of the patients' viewpoint, including their experiential knowledge, perspectives, and preferences regarding health and health conditions. We recognize the importance of engaging stakeholders broadly representative of patients, and we are experienced in engaging groups that some refer to as "hard to reach." More than two-thirds of our prior stakeholders are minorities, which is consistent with the specific needs of the researchers requesting CE Studios. From our experience, researchers often seek our services because they have had difficulty engaging stakeholders. This likely contributes to the large number of minorities that have served as stakeholders.

The key personnel on this project include a diverse group of stakeholders and academic faculty and staff who have experience in clinical, translational, community-engaged, and comparative effectiveness research and dissemination, as well as health disparities, health advocacy, community outreach, and facilitation. They are also committed to the principles of community engagement and PCOR.


METHODS

Overview

This is a methods project, and the overarching goal is to create methods and tools that advance stakeholder engagement. The work completed in this project fills a methods gap identified by PCORI and compares feedback from patients and community members with feedback from researchers. The intent is not to prove that one is superior or imply that these are mutually exclusive. Although researchers were randomized to 1 of 2 types of studios, there was no intervention, and this is not a clinical trial. Our report of results includes recruitment and randomization information per CONSORT guidelines (see Figure 1), but not information relevant specifically to clinical trials. The purpose of the randomization was to generate content from different stakeholder types in an unbiased way.

This randomized studio comparison study was reviewed by the Vanderbilt University IRB and determined to be exempt (45 CFR 46.101(b), category 2).

Developing the taxonomy required IRB approval for data collected through semistructured interviews. The approach, design, and interview questions were reviewed and approved by the Vanderbilt University IRB.


Figure 1. Study flow CONSORT diagram of study enrollment, randomization, and allocation

Methods for Assessing CE Studio Input

In determining whether the CE Studio is an effective method for obtaining patient-centered input, we carefully considered the best approach because there are barriers to an optimal study design: (1) Currently there is no gold standard for obtaining stakeholder input against which to measure the accuracy of patient input obtained by researchers; and (2) given that 100% of the researchers from prior CE Studios believed the CE Studio improved their projects, researchers were unlikely to agree to randomization to a no-stakeholder-input group. With these limitations in mind, we chose to compare the CE Studios with T2 Studios, which also use a consultative method to elicit feedback, providing expertise from seasoned academic researchers. The primary difference between the 2 studio types is that patients and/or community stakeholders provide input in CE Studios, whereas researchers provide input in T2 Studios. We hypothesized that the input provided by stakeholders in CE Studios would be more patient-centered than the input provided by researchers in T2 Studios. See Figure 1 for the CONSORT diagram and study flow.

Design

We used a randomized controlled methodological study design.

Setting and Population

Investigators requesting a CE Studio for input on their research were randomized to the intervention group (CE Studio) or the usual condition/control (T2 Studio). Any faculty member or research trainee at VU or MMC was eligible to be randomized. Twenty researchers recruited for this project were randomized to the CE Studio or to both the CE Studio and T2 Studio (usual condition; see Figure 1).

Randomization

We completed randomization to intervention or control using random number generator software (www.r-project.org).
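Simple 1:1 random allocation of this kind can be sketched in a few lines. The study used R's random number generator; the Python snippet below is an illustrative sketch only, and the researcher IDs and seed are hypothetical.

```python
import random

def allocate(researcher_ids, seed=None):
    """Assign each requesting researcher to a CE Studio or a T2 Studio
    with equal probability. Illustrative sketch only; the study itself
    used R's random number generator, not this code."""
    rng = random.Random(seed)  # seeding makes the allocation reproducible
    return {rid: rng.choice(["CE Studio", "T2 Studio"])
            for rid in researcher_ids}

# Hypothetical example: allocate the 20 randomized researchers.
arms = allocate(range(1, 21), seed=2014)
```

Seeding the generator is a common practice so that an allocation can be reproduced for auditing, though the report does not state whether the study did so.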


Data Collection

Methods for convening CE Studios and T2 Studios are similar, the main differences being the following: (1) CE Studios have stakeholder experts, whereas T2 Studios have faculty researcher experts; (2) CE Studios are moderated by a neutral community representative facilitator, whereas T2 Studios are moderated by a senior researcher; and (3) CE Studios include stakeholder orientation and investigator coaching to help facilitate that interaction because it is not a common one, nor one for which investigators are trained.1,2 At either type of studio, the investigator has approximately 15 minutes to present an overview of the study and key questions; then experts provide verbal and written feedback during the session. In our study, studio discussions were recorded and transcribed verbatim. We assessed outcome measures in this study using the feedback received in the CE Studios and T2 Studios, rather than feedback from the investigator participants (see Figure 1).

Key engagement strategies used in CE Studios are the following:

• Creating the infrastructure and providing resources that facilitate patient involvement

o We used an experienced team and a structured approach. Stakeholder experts

were compensated for their time and received free parking and a meal if the

meeting was held during a mealtime. We made every effort to schedule and

locate the CE Studios for the convenience of the stakeholders.

• Advance preparation to meet the needs of both stakeholder experts and researchers

o Stakeholders received a brief orientation and CE Studio handbook and

researchers received coaching prior to the facilitated meeting.

• Supporting participants before, during, and after involvement

o Our navigator provided assistance to stakeholders throughout the process.

Stakeholders received an update about how the CE Studio affected the research.


• Communicating clear expectations of the process

o We provided detailed instructions both in writing and during the orientation. The

facilitator reviewed these expectations at the beginning of the CE Studio session.

Stakeholders were provided the opportunity to ask questions and obtain

additional information.

• Involving a group instead of individuals

o Our stakeholders provided input in a group setting (median number of

stakeholders per CE Studio = 7).

• Empowering patients by using processes that give patients an equal voice

o A trained, neutral community representative facilitator moderated the CE Studio

meeting and monitored the power dynamics in the session so neither the

researcher nor a single stakeholder dominated the discussion.

Qualitative Analysis

Analysis of transcripts from the studios involved reading the text, creating text excerpts,

and labeling each excerpt with one or more codes (themes) using Dedoose, an online tool for

collaborative qualitative data analysis. Coders included faculty and staff on the research team

and trained student workers. Each transcript was coded by one or more coders, then reviewed

by another coder. As the codebook (ie, list of codes applied) was being developed, multiple

coding passes were made over the initial transcripts to ensure that subsequently created codes were

applied as necessary. We compared the frequency of codes in transcripts from CE Studios with

those from T2 Studios.
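The code-frequency comparison amounts to a tally over coded excerpts; a minimal sketch (the example codes come from the taxonomy, but the excerpt data are hypothetical):

```python
from collections import Counter

def code_frequencies(coded_excerpts):
    """Tally how often each code (theme) was applied across a set of excerpts."""
    counts = Counter()
    for codes in coded_excerpts:       # each excerpt carries one or more codes
        counts.update(codes)
    return counts

# Hypothetical coded excerpts from each studio type
ce = code_frequencies([["Methods/Protocol", "Research Population"], ["Methods/Protocol"]])
t2 = code_frequencies([["Ethics"], ["Methods/Protocol"]])
difference = {code: ce[code] - t2[code] for code in ce.keys() | t2.keys()}
```

A positive entry in `difference` means the code appeared more often in CE Studio transcripts; a negative entry means it appeared more often in T2 Studio transcripts.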

Development of the Stakeholder Impacts in Research Taxonomy

One of the 3 primary aims of this Methods Award was to develop a classification system

of the types of changes that may occur in research due to engagement. Examples include


changes in study design, changes in population target, addition of patient-reported outcomes,

changes in recruitment process, and changes in research questions. We used an iterative

approach to generate standard terms and pilot test the taxonomy,11 actively engaging

researchers and community stakeholders at each step. Our team included leaders from 2

community organizations and faculty from 3 institutions with expertise in community

engagement, scale development, and qualitative and translational research.

Content Generation

To generate an initial translational research-based framework for the analysis (draft

taxonomy), we reviewed the literature reporting research in which patient, community, and

provider stakeholders have been involved. This was not a systematic literature review, but a

review of literature intended to generate initial content (see Literature Search for Candidate

Item Generation in Appendix B). Literature review was guided by our team’s expertise in

engagement and a recent comprehensive review of impact.44 Searches in PubMed and Google

Scholar included these keywords: community engaged research, patient and stakeholder

engagement in research, participatory research, patient-centered outcomes research, impact of

community/patient/family/caregiver engagement in research, and evaluation of

community/patient engagement in research.12-14,44 We then designed semistructured interview questions to gather input and feedback on domain nomenclature, arrangement of the domains and dimensions, and the utility and relevance of the taxonomy. We used a "think-aloud" method to probe deeper into interviewees' responses and elicit a richer account of their reasoning, with examples.15 See Table 1 for the initial code structure and questions used in the semistructured

interviews. The semistructured interviews were recorded, transcribed verbatim, and

deidentified by research analysts. Two analysts independently coded the transcripts with the

taxonomy and assessed its fit, classifying the participants’ responses to each domain and

dimension of the taxonomy as “keep,” “remove,” “add,” or “needs improvement.” The analysts’

method was to designate one coder as primary and a second coder to assess the application of codes. Because the goal was consensus rather than independent agreement, interrater agreement was not measured; discrepancies in codes were resolved through discussion and team adjudication. The "needs improvement" classification was used to

highlight sections or wording where there was a need for more clarity. The research team

discussed wording changes to develop more precise terminology. We revised the taxonomy

based on these results.

Table 1. Semistructured Interview Materials

a. Draft Taxonomy

Potential Areas of Impact for Patient (and Other Stakeholder) Engagement

Domains and conceptual statements:

1. Pre-research
• Idea/topic generation
• Identify issues of greatest importance
• Input on relevance/purpose
• Identify stakeholders/potential partners

2. Infrastructure
• Funding source decisions
• Preparation of budget
• Sharing of funds
• Appropriate compensation for stakeholders (patients, consumers, community organizations)
• Time
• Cost
• Process/structure for shared decision making

3. Study design
• Define population
• Selection of patient-centered tools
• Organize ideas and capture the way the research will be applied
• Provide input on research methods
• Grant writing/proposal development
• Framing research questions
• Selection of comparators and outcomes
• Revise the research protocol
• Input on cultural appropriateness

4. Research implementation
• Identify/hire research team members
• Recruitment of research participants
• Identify best approaches to recruitment and retention
• Determine best approaches to data collection (in person vs online vs telephone; survey vs interview; self-report vs caregiver report)
• Assist with data collection

5. Research analysis
• Assist with data analysis (train to do qualitative analysis)
• Provide alternative interpretation of research results (especially those that are counterintuitive)
• Bring attention to factors (confounders) that may not have been measured or documented in literature
• Interpret/assess plausibility of results
• Review results and provide context for relevance to patients and stakeholders

6. Dissemination of research findings
• Provide culturally relevant and appropriate language
• Co-authorship of manuscripts
• Write for nonscientific publication
• Advise on appropriate audiences and nontraditional venues for dissemination
• Convene town hall meetings and other opportunities for dissemination
• Create companion materials for dissemination (eg, videos, newsletters)

7. Ethics
• Consent process
• Acceptability of research
• Protection of individuals vs protection of communities
• Privacy (might be implied in consent process)
• Risks/benefits (ie, health, increased knowledge)

b. Semistructured Interview Questions

Semistructured interview questions, marked by who was asked (researchers, n = 6; stakeholders, n = 6):

• What is your first impression of the taxonomy? What makes sense to you? Are the domains/elements rational? (researchers and stakeholders)
• How would you improve the taxonomy? (researchers and stakeholders)
• Are there domains/elements you would eliminate? Why? (researchers and stakeholders)
• Would you add any domains/elements? What would you add? (researchers and stakeholders)
• Which domains/elements are you most familiar with? (researchers and stakeholders)
• Do you feel you can contribute to any of the domains/elements? If so, which ones and how? (stakeholders)
• Which domains/elements are most beneficial to you when seeking stakeholder input? (researchers)
• How likely are you to provide feedback in these domains/elements? (stakeholders)
• Are there domains/elements that are particularly important to you? (researchers and stakeholders)


Pilot Testing

We piloted the taxonomy by comparing transcripts of consultative feedback from

community stakeholders (CE Studios) and researchers (T2 Studios). Studios, implemented in

2007 in the Vanderbilt Institute for Clinical and Translational Research, are structured, dynamic

90-minute sessions bringing together 2 to 6 content experts to provide guidance intended to

surmount barriers encountered at a specific stage of clinical and translational research.16 T2

Studios, with investigators as experts, address research and proposed research that involve

clinical and comparative effectiveness research. The CE Studio uses a similar roundtable,

dynamic session, but includes stakeholders such as patients and community stakeholders as

experts instead of researchers. The CE Studio recruits and trains stakeholder experts, prepares

investigators to engage with stakeholders, and facilitates an in-person meeting with both.17

Because the experts differ between the 2 studio types, the anticipated differences in the

language and content of the guidance provided a robust test of the taxonomy’s validity. The CE

Studios were an ideal model to test this taxonomy to assess the types of input provided by

nonacademic stakeholders.

During the pilot, we trained 8 coders experienced in qualitative analysis and not

involved in developing the taxonomy. The coders were each provided orientation and training,

including stakeholder engagement background, introduction to the draft taxonomy, coding

software (Dedoose) demonstration, and discussion of guidelines and expectations. Each coded

2 transcripts (1 T2 and 1 CE Studio), blinded to the transcript source. From their codes, we

further revised the taxonomy. Following the coding, each coder was interviewed and asked: "How did the tool work for you? What challenges did you experience? How can the taxonomy be improved? Do you have any overall likes and dislikes about the utility of the system?" From these interviews, we

refined the taxonomy to that presented here. In the refinement, we removed no elements or

codes, but added new elements; we also assigned some elements to multiple domains as

cross-domain concepts.


Items Included in the Stakeholder Impacts in Research Taxonomy

We generated a draft taxonomy with 7 domains and 41 dimensions representing

potential areas of stakeholder impact on translational research (see Table 1). Following the

evaluation, analysis, and refinement, the taxonomy contained 11 domains and 59 dimensions.

We pilot tested and further refined that taxonomy with coders’ feedback to the version that is

presented here. We developed a hierarchy to define areas of stakeholder impact on research

systematically (Table 2) and created a visual representation of the taxonomy (Figure 2). For a

fuller description, please see Results section, Stakeholder Impacts in Research Taxonomy.
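The domain, dimension, and element levels of the taxonomy form a natural nested mapping. A minimal sketch of how the hierarchy might be held and queried in code, abridged to two domains with deliberately incomplete element lists:

```python
# Abridged sketch of the taxonomy hierarchy; domain and dimension names follow
# Table 2, but the element lists here are intentionally incomplete.
TAXONOMY = {
    "Pre-research": {
        "Research question": ["Idea/topic generation", "Framing research questions"],
        "Literature review": [],
        "Proposal development": ["Grant writing/proposal development"],
    },
    "Ethics": {  # overarching meta-domain spanning all phases of research
        "Respect for people": ["Consent/consent process", "Privacy"],
        "Beneficence": [],
        "Justice (moral requirement)": [],
    },
}

def domains_mentioning(keyword):
    """Return the domains whose elements mention the keyword (case-insensitive)."""
    return sorted({
        domain
        for domain, dimensions in TAXONOMY.items()
        for elements in dimensions.values()
        if any(keyword.lower() in element.lower() for element in elements)
    })
```

A lookup such as `domains_mentioning("consent")` then supports classifying an excerpt of stakeholder feedback against the taxonomy.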

Table 2. Potential Areas of Impact for Patient (and Other Stakeholder) Engagement

Columns: Research Spectrum Concepts (Domains) | Research Activity Concepts (Dimensions) | Measurable Impacts from Stakeholders (Elements)

RESEARCH-SPECIFIC CONCEPTS

1. Pre-research: Early stages in the research process when trying to determine what, where, and why.
Dimensions: Research question; Literature review; Proposal development
Elements: a. Idea/topic generation b. Identify issues of greatest importance c. Input on relevance/purpose d. Identify stakeholders/potential partners e. Development of specific aims f. Grant writing/proposal development g. Framing research questions

2. Infrastructure: Involvement in the foundation/logistics of the project, distribution of funds and planning.
Dimensions: Governance; Team member roles; Power balance; Compensation
Elements: a. Funding source decisions b. Preparation of budget c. Sharing of funds d. Appropriate compensation for stakeholders (patients, consumers, community organizations) e. Time f. Cost g. Process/structure for shared decision making h. Scope of work (who is funded, why, and what is being done) i. Payment system for participants (if clinical study, what does insurance cover)


3. Study design: Defining the how and who: Which populations, and how do we study this question within those populations? What is the best approach? Feedback regarding research.
Dimensions: Study population; Methods; Protocols
Elements: a. Define population b. Input on cultural appropriateness c. Provide input on setting for the research d. Generalizability e. Selection of patient-centered tools f. Concerns about tech access g. Organize ideas and capture the way the research will be applied h. Logistics of protocol i. Take advantage of research j. Lack of understanding method k. Provide input on research methods l. Tech preference m. Addition of comparators and outcomes n. Modification to research protocol o. Technology preference p. Take advantage of research q. Need for clarity r. Comfort level assessment s. Payment system as a confounder t. Clinical workflow considerations

4. Implementation: Describing the how, but more specifically, the execution of the project. Developing the steps on how to complete the objective of the research.
Dimensions: Operations; Framing; Community researcher team formation; Data collection
Elements: a. Identify/hire research team members b. Assist with data collection c. Identify best approaches to recruitment and retention d. Provide input on setting for the research e. Identify potential stigmas for condition studied f. Recruitment of research participants g. How to improve overall method of consenting. Ethically, integrity of research h. Optimal data collection methods i. In person vs online vs telephone; survey vs interview; self-report vs caregiver report j. Concerns about technology access

5. Analysis: Steps after data are collected, which could include statistical measures and interpretation.
Dimensions: Data management; Data analysis; Interpretation
Elements: a. Assist with data analysis (train to do qualitative analysis) b. Provide alternative interpretation of research results (especially those that are counterintuitive) c. Bring attention to factors (confounders) that may not have been measured or documented in literature d. Interpret/assess plausibility of results e. Review results and provide context for relevance to patients and stakeholders

6. Dissemination of research findings: Distributing the results/discussing outcomes or work in progress of the research in oral or written forums. Appropriateness of the delivered message/objectives/audience.
Dimensions: Dissemination audience and methods; Health/scientific literacy; Message(s)
Elements: a. Co-authorship of manuscripts b. Write for nonscientific publication c. Advise on appropriate audiences and nontraditional venues for dissemination d. Convene town hall meetings and other opportunities for dissemination e. Identify appropriate community organizations who would benefit from the research f. Provide input on audience (appropriate message delivery) g. How to reach participants h. Provide culturally relevant and appropriate language i. Create companion materials for dissemination (eg, videos, newsletters, brochures, PowerPoint presentations, handouts) j. Social media outreach k. Organize ideas and capture the way the research will be applied

7. Post-research: How study results become actions for improved health or clinical care.
Dimensions: Translation; Health care policy; Research policy
Elements: a. Actionable steps (implementation of what was discovered from the research) b. How to follow up with participants c. Overall impact of the research on the community d. Formulating next steps/convening appropriate audiences for further action/post-research action e. Formulating next follow-up research question f. Policy implications g. Support for research (coverage)


OVERARCHING CONCEPTS

8. Ethics: Universal/overarching meta-dimensions that are relevant to all phases of research.
Dimensions: Respect for people; Beneficence; Justice (moral requirement)
Elements: a. Consent/consent process b. Protection of individuals vs protection of communities c. Privacy (might be implied in consent process) d. Acceptability of research e. Apprehension of tech, the interface between patients f. Education of group or community that the investigator plans to engage g. Risks/benefits (ie, health, increased knowledge) h. Unintended consequences i. Providing feedback that takes into account beliefs/human behaviors of groups and communities j. Education of participants

9. Process improvement: Domain in which interviewees felt they had contributed guidance and oversight.
Dimension: Research quality
Elements: a. How to better serve patients/articulate information more clearly b. Improvements on how to prepare patients before seeing the doctor and making decisions on health c. Input on anticipated reviewer response to design
Dimension: Health care services quality
Elements: a. Improving processes, increasing efficiency (ie, saving consumer/industry money, saving patient time out of schedule, finding ways to "free up" doctors' time) b. Procedural feedback, conceptualizing the implementation (role of research)

10. Engagement: The difference between stakeholder engagement in the governance structure of a research project and engagement with groups when implementing and disseminating the research findings.
Dimensions: Cultural competence; Humility
Elements: a. Identifying stakeholder priorities b. Insight on group behaviors that could affect project goals

11. Communication: Idea/topic generation, protocol development, governance/shared decision making, protocols, interpretation of results, audience and methods of dissemination, message content, and policy change.
Dimensions: Literacy/numeracy; Comprehension; Power balance
Elements: a. Communication in recruitment and consent b. Communication in research procedures/activities c. Effectiveness of written materials (tailoring of materials and language/numeracy appropriateness) d. Role of research staff communicating with participants e. Style and content of face-to-face communications f. Lack of clarity (can include protocol, methods, materials, etc) g. Comprehension barrier


Figure 2. Visual representation of the stakeholder impacts in research taxonomy: A taxonomy of standard terms for areas of stakeholder impact in research


Methods for Developing the Quantitative PCoR Scale

Because there are no reference standard tools to measure the impact of stakeholder

engagement in PCOR, we developed and validated a quantitative scale.45 Our scale

development process used a multistep approach: (1) content and item generation, (2)

evaluation of item candidates, (3) testing of initial scale, (4) revision of scale, and (5) testing of

revised scale. (Each of the subsections in this section corresponds to 1 of these 5 items). Our

approach was guided by a conceptual framework for engagement in comparative effectiveness

research, which includes clear definitions of engagement, types of engagement methods, and outputs of such engagement.46 The framework underscores the significance of seeking

input from all relevant stakeholders about preferences and values, which was prioritized in our

development of the quantitative instrument.

Content and Item Generation

Content validity is critical to scale development, and we used several approaches to

develop a relevant and inclusive initial item pool. First, we searched the medical literature (via

PubMed) using these keywords and phrases: patient-centeredness, patient-centered outcomes

research, community engaged research, stakeholder engagement in research, participatory

research, impact of community/patient/family/caregiver engagement in research, and

evaluation of community/patient engagement in research. Note this was not a systematic

literature review, but a review of literature intended to generate initial content (see Literature

Search for Candidate Item Generation in Appendix B). We also reviewed PCORI merit review

criteria on patient-centeredness. Next, our research team mapped items from existing

engagement and patient-centeredness scales (from clinical settings, as there are no patient-

centeredness scales specific to the research setting) to the conceptual frameworks. We revised

existing questions and wrote new items to fill identified gaps. The initial candidate items were

reviewed by a panel with expertise in PCOR and engagement.


Evaluation of Candidate Items

Researchers testing the taxonomy also reviewed the candidate items for the scale for

clarity and relevance to the conceptual framework. Researchers used the candidate items to

evaluate studio transcripts and answered the following questions: How did the items work?

What challenges were experienced? How can the items be improved? Overall likes and dislikes

in utility of the items?

Pilot Testing of Initial PCoR Scale

We refined the candidate items based on the evaluation, which resulted in an 11-item

scale (see Table 3). To test the scale, a team member with expertise in community-engaged research identified 60 research abstracts: 30 abstracts from a conference devoted to patient-centered research (Advancing the Science of Community Engaged Research

2016 Conference: https://vanderbilt.irisregistration.com/Home/Site?code=engagedresearch)

and 30 abstracts presented at the Association for Clinical and Translational Science (ACTS)

annual meeting (http://www.actscience.org/page/translational-science-2016). We selected

abstracts from these sources based on the expert panel’s advice to represent research more

likely to be patient-centered (PCORI) and less likely to be patient-centered (ACTS). A total of 17

raters used the scale to assess the extent of person-centeredness of the 60 abstracts. Of these

raters, 9 were community representatives, whereas the remaining raters were affiliated with an

academic institution. Raters were provided detailed instructions on using the scale, which

required rating each of the 11 items using a 4-point Likert response scale (“No, completely,”

“No, somewhat,” “Yes, somewhat,” and “Yes, completely”). All were compensated for their

time. Each rater was randomly assigned 7 or 8 abstracts and was blinded to the

abstracts’ origin. Two independent reviewers rated each abstract. Raters provided feedback on

each item in the scale, the overall scale, and the instructions for using the instrument.


Table 3. Analysis of Original Version of the PCoR Scale*

Each item below is followed by 4 counts of abstracts: Δ = 0 | Δ = 1, same range | Δ = 1, different ranges | Δ = 2

1. Beliefs are defined as the state of mind in which a person thinks something to be the case, with or without there being empirical evidence to prove that something is the case with factual certainty. Beliefs can be cultural or faith-based. Does the information reflect beliefs you think are relevant to the population of interest (or to patients/community members in general)?

10 8 5 7

2. Attitudes are defined as “settled/ingrained ways of thinking or feeling about someone or something, typically one that is reflected in a person's behavior, whether consciously or unconsciously.” Attitudes can be positive or negative, mixed or unsure, and are particularly relevant in reference to research/health care. Does the information reflect attitudes you think are relevant to the population of interest (or to patients/community members in general)?

10 10 3 7

3. Concerns are defined as “matters of interest or importance to someone.” Concerns can be related to feelings about how the research is developed and carried out. Does the information reflect concerns you think are relevant to the population of interest (or to patients/community members in general)?

11 11 5 3

4. Values are defined as “a person's principles or standards of behavior; one's judgments of what is important in life.” Values can relate to new medication, treatment opportunities, and/or cost-effectiveness. Does the information reflect values you think are relevant to the population of interest (or to patients/community members in general)?

7 12 1 10

5. Are patient and/or community needs taken into consideration?

12 3 3 12

6. For any suggestions given, are the suggestions relevant to the population of interest (or to patient and/or community members in general)?

8 8 2 10


7. Does the information address patient-centered and/or community-centered outcomes?

10 12 3 5

8. Does the information address research priorities of the population of interest (or patient and/or community members in general)?

8 11 3 8

9. Does the information address health care needs of the population of interest (or patient and/or community members in general)?

10 11 4 5

10. Does the information address the best methods to communicate with the population of interest (or patient and/or community members in general)?

11 7 2 10

11. Does the information address opportunities to engage the population of interest (or patients and/or community members in general) in decision making around the research planning, implementation or results dissemination?

10 10 3 7

*Counts of abstracts sent to a patient-centered conference showing identical scores; scores that differ by 1 point but fall into the same category of values; scores that differ by 1 point but change category of values; and scores that differ by 2 points.
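The four columns of Table 3 bucket the two independent raters' scores for each abstract by how far apart they fall. A sketch of that bucketing, assuming the 4-point responses are coded 1 ("No, completely") through 4 ("Yes, completely"), with 1-2 forming the "No" range and 3-4 the "Yes" range:

```python
def agreement_bucket(rating_a, rating_b):
    """Classify a pair of 4-point ratings into the agreement buckets of Table 3."""
    delta = abs(rating_a - rating_b)
    if delta == 0:
        return "delta 0"
    same_range = (rating_a <= 2) == (rating_b <= 2)   # both "No" or both "Yes"
    if delta == 1:
        return "delta 1, same range" if same_range else "delta 1, different ranges"
    return "delta 2"
```

For example, ratings of 3 and 4 land in "delta 1, same range", whereas ratings of 2 and 3 disagree across the No/Yes boundary and land in "delta 1, different ranges".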


Preliminary Data and Scale Analysis

For data analysis purposes, we averaged the item ratings of the 2 independent raters

into a single score. We conducted a principal components analysis (PCA) to examine the general psychometric properties of the 11-item scale. The aim of this analysis was to explore how many latent dimensions underlay the 11 items on the rating scale. We conducted a

scale analysis to determine the Cronbach α of the scale, an indicator of internal consistency

reliability. The major analysis for this pilot study was a comparison of the total scale score and

individual item scores between the 2 types of abstracts evaluated by the raters. For those

abstracts that resulted from PCORI funding, we also examined interrater consistency in the

ratings of those abstracts.
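Cronbach's α for a k-item scale is k/(k − 1) × (1 − Σ item variances / variance of the total scores). A stdlib-only sketch of that computation (the score matrix shape, one row per abstract, is an assumption for illustration):

```python
from statistics import pvariance

def cronbach_alpha(score_matrix):
    """score_matrix: one row per rated abstract, one column per item rating."""
    k = len(score_matrix[0])
    item_variances = [pvariance([row[i] for row in score_matrix]) for i in range(k)]
    total_variance = pvariance([sum(row) for row in score_matrix])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
```

As a sanity check, perfectly correlated items yield α of 1, and uncorrelated noisy items push α toward 0.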

Each item was discussed independently by the expert panel. Four items in the initial scale (items 4, 5, 6, and 10) showed the greatest interrater unreliability, and we removed these 4 items to produce the final form of the scale.

Testing of Revised PCoR Scale

Feedback gathered from the raters drove the following changes to the initial scale:

Instead of using a 4-point rating scale, each item on the revised instrument was rated using a 5-

point Likert response scale from “Strongly disagree” (–2) to “Strongly agree” (+2), with higher

scores indicating higher person-centeredness. The range of scores for each item is –2 to +2 and

total scale score range is –14 (least person-centered) to +14 (most person-centered). We then

used this revised, 7-item version of the scale to assess 40 of the same abstracts that were rated

in the pilot study. We reduced the number of abstracts by 33% by removing those abstracts

that had consistently extremely high or low ratings by raters in the pilot study, thereby making

this a more stringent test of the PCoR Scale’s ability to discriminate between research products

that theoretically should differ in person-centeredness. Twenty of those research abstracts

were from the PCORI-funded studies, and the remaining 20 abstracts were from the clinical and

translational research conference.
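Scoring under the revised 7-item scale can be sketched as follows. Only the "Strongly disagree" (−2) and "Strongly agree" (+2) anchors are stated above; the three intermediate response labels in this sketch are assumptions:

```python
# Hypothetical label coding for the revised 7-item PCoR Scale; only the
# endpoint anchors ("Strongly disagree" = -2, "Strongly agree" = +2) are
# taken from the report, the middle labels are assumed.
LIKERT = {
    "Strongly disagree": -2,
    "Disagree": -1,
    "Neutral": 0,
    "Agree": 1,
    "Strongly agree": 2,
}

def pcor_total(responses):
    """Sum the 7 item ratings; totals range from -14 to +14."""
    if len(responses) != 7:
        raise ValueError("the revised PCoR Scale has exactly 7 items")
    return sum(LIKERT[r] for r in responses)
```

Higher totals indicate a more person-centered research product, with +14 the maximum and −14 the minimum.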


A total of 12 raters (all of whom had rated the previous version of the scale) used the

revised PCoR Scale to assess the extent of person-centeredness of the 40 abstracts. Seven

raters were community representatives and 5 raters were affiliated with an academic

institution. Each rater assessed 20 abstracts. We created 2 packets of abstracts using computer-

generated randomization assignment with each packet containing 20 abstracts. We chose 2

teams of raters controlling for type of affiliation. Each rater in each team received his or her

packet of abstracts in computer-generated random order to control for order effects in the

ratings. This design allowed us to conduct a true assessment of interrater reliability, something

that was not possible in the pilot study. We also gathered qualitative feedback on this version

of the rating scale from the raters.

RESULTS

Comparison of Input from CE Studios and T2 Studios

Based on studio type, each transcript was classified as either “Community” or

“Translational.” We found substantial differences in code frequencies between the 2 types of

studios (Figure 3). Specifically, the input from patients and community members in the CE

Studios had more excerpts in 5 dimensions: patient experience stories, methods/protocol,

research design, population, and recruitment of research participants. We also found the input

in CE Studios to have a significantly higher narrative quality.

Input from researchers in T2 Studios had more codes in 5 dimensions: grant

writing/proposal development, pre-research/infrastructure, research methods, consent

process, and ethics. Three dimensions (appropriate compensation of stakeholders, trust, and patient experiences) were found only in CE Studios, and one dimension (grant writing/proposal development) was found only in T2 Studios. Person-centeredness of Research scores (range =

-14 to 14) were significantly higher (more person-centered) for CE Studio (8.52; SD = 5.9)

compared with T2 Studio (-5.67; SD = 9.1) transcripts (t = 10.9; p < 0.0001).
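The report does not state which t-test variant produced the statistic above; as an illustration only, Welch's unequal-variance t statistic for two groups of transcript scores can be computed as follows (the sample data are hypothetical):

```python
from math import sqrt
from statistics import mean, variance

def welch_t(group_a, group_b):
    """Welch's t statistic: difference in means over the combined standard error."""
    se = sqrt(variance(group_a) / len(group_a) + variance(group_b) / len(group_b))
    return (mean(group_a) - mean(group_b)) / se

# Hypothetical PCoR totals for two small sets of transcripts
t_stat = welch_t([8, 10, 6, 12], [-6, -4, -8, -2])
```

A positive t statistic indicates the first group's mean exceeds the second's; Welch's form does not assume the two groups share a variance.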


Figure 3. Differential frequencies of coding counts in CE Studio and T2 Studio transcripts

[Bar chart: paired counts of coded excerpts for Community (CE Studio) and Translational (T2 Studio) transcripts across 13 coding categories: Pre-Research/Proposal Development; Grant writing/proposal development; Dissemination of Research Findings; Reasons for not participating in research; Framing research questions; Ethics; Engagement; Research Population; Infrastructure; Quality Improvement; Implementation of Research; Research Design; Methods/Protocol.]


Stakeholder Impacts in Research Taxonomy

We generated a draft taxonomy with 7 domains and 41 dimensions representing

potential areas of stakeholder impact on translational research (see Table 1). Following the

evaluation, analysis, and refinement, the taxonomy contained 11 domains and 59 dimensions.

We pilot tested and further refined that taxonomy with coders’ feedback to that presented

here. We developed a hierarchy to define areas of stakeholder impact on research

systematically (Table 2) and created a visual representation of the taxonomy (Figure 2).

Conceptual Overview

Research is often represented as a linear process that moves from pre-research to

research design, then to implementation, and ending with dissemination. Based on links

deduced in the validation pilot, we propose that research is a cyclical and iterative process with

opportunities for stakeholders to engage at all phases of research and inform next steps (see

Figure 2). We also identify 4 meta-domains that are universal/overarching themes of

participation and do not apply specifically to the translational research process, but rather span

the entire process itself: ethics, process improvement, engagement, and communication.

Pre-research and Infrastructure

This early phase frames the overall study and the potential impact of the outcomes.

These domains have elements indicating that the research planning process could benefit from

stakeholders’ involvement in proposal development and setting research priorities by making

the context of the research topic, questions, and hypotheses more patient or community

centered and relevant to the target group. The infrastructure dimension contains the

governance and policy-making stages of research for which stakeholder feedback on

compensation, time and cost burden on research participants, power balance, and team roles is

valuable.


“If you want more minorities in your research, you need to change the way you tell us

about your studies. Find a way to get our input, not on your terms, but ours. Put something in

place for me to tell you what I want you to study?”

Study Design and Implementation

Many of the elements that are used to describe areas of impact for patient and

community engagement in study design and implementation relate to research logistics or

operations and framing research questions. Stakeholders can influence these phases of

research by identifying relevant comparators and outcomes that may influence data collection

strategies, defining target populations, culturally tailoring recruitment strategies and materials,

developing methods to achieve improved retention, and identifying best communication

practices for the team.

“What do you mean by healthy? Why are you only interested in healthy people? What

about including people like me who have conditions like diabetes or hypertension? We need to

learn how to be healthy too. And don’t you want to see how exercise can benefit people with

these conditions?”

Analysis and Dissemination

In the results and reporting phases of research, patient and community engagement can

impact data analysis, interpretation, and results dissemination. Cultural relevance and

appropriate language in results interpretation and presentation affect uptake of the health

message. Our interview participants stated that these activities benefit greatly from patient and

community stakeholders. Challenges with cross-cultural communication that occur during these


phases can be overcome through appropriate training of investigators and stakeholders

and through peer-to-peer delivery of the results.

Post-research

This domain, identified during the taxonomy evaluation, centers on how study results

become actions for improved health or clinical care. We identified this domain in an interview

with a researcher during which the investigator described questions from research participants

about what happens after completion of a study.

The elements in the post-research domain refer to next steps in the research, such as,

What is the next question? What type of follow-up needs to happen now that the initial

research questions were addressed? What is the overall impact on the community and what

other social constructs that influence an individual’s health are impacted by these results?

“The only thing I can think of potentially, and it could be included in this, is just

kind of post-research dissemination, the follow-up piece. I think a lot of times in my

experience, whether it’s the patients or the homeless, they want to see what's

happening after, not just that it's a study . . . so, kind of the follow-up. . . . It’s the biggest

complaint I have received in my experience, is yeah, we participate in studies all the time

and nothing happens on the back end.”


Ethics and Engagement

The research team identified ethics and engagement as universal/overarching meta-domains that are relevant to all phases of research and developed elements that described

corresponding activities. We further divided engagement into 2 subcodes/child codes that

reflect the difference between stakeholder engagement in the governance structure of a

research project and engagement with groups when implementing and disseminating the

research findings.

Process Improvement

In addition to providing new elements for the ethics domain, the community

interviewees identified process improvement as a domain in which they felt they had

contributed guidance and oversight.

Communication

Communication is a pervasive theme, identified during the taxonomy pilot, that crosses all phases of research. It is particularly evident in these dimensions:

idea/topic generation, protocol development, governance/shared decision making, protocols

(eg, study materials, consent forms/language, recruitment materials, surveys), interpretation of

results, audience and methods of dissemination, message content, translation (ie, community

action, participant follow-up), and policy change.

Development and Validation of the PCoR Scale

Results from the Development of the First Version of the PCoR Scale

The first version of the PCoR Scale (see Table 3) had 11 items on a 4-point Likert scale (“No, completely,” “No, somewhat,” “Yes, somewhat,” and “Yes, completely”). Data analysis showed high internal consistency of the scale items (Cronbach α = 0.931). All items loaded on a single factor with an eigenvalue of 6.54 that explained 59.41% of the variance.

Internal consistency remained high (Cronbach α ranging from 0.919 to 0.932) if we deleted any


item from the scale. Pearson correlation coefficients of each item with the total score ranged

from 0.598 (p < 0.001) to 0.864 (p < 0.001). Data did not allow the calculation of intraclass

correlation coefficients to test for interrater reliability. However, we conducted post hoc

analysis to assess per-item consistency of assessment for the abstracts sent to a person-centered conference. We counted how many abstracts had the same score on each item across raters; how many had scores that differed by 1 point but remained in the same category (both scores 1-2, non–patient-centered, or both 3-4, patient-centered); how many differed by 1 point and changed category (scores of 2, non–patient-centered, and 3, patient-centered); and how many differed by 2 points (extreme category change). Table 3 shows the total counts.
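The post hoc consistency check amounts to a small classification rule over pairs of ratings. The sketch below is an illustrative reimplementation of the logic described above, not the study’s analysis code, assuming item scores of 1-4 with 1-2 counted as non-patient-centered and 3-4 as patient-centered:

```python
def rating_consistency(r1, r2):
    """Classify a pair of raters' item scores (each 1-4) by agreement level."""
    non_centered = {1, 2}  # scores 1-2 = non-patient-centered; 3-4 = patient-centered
    same_category = (r1 in non_centered) == (r2 in non_centered)
    diff = abs(r1 - r2)
    if diff == 0:
        return "same score"
    if diff == 1 and same_category:
        return "differ by 1, same category"
    if diff == 1:
        return "differ by 1, category change"   # only possible for scores 2 and 3
    return "extreme category change"            # scores differ by 2 or more points

print(rating_consistency(3, 3))  # same score
print(rating_consistency(3, 4))  # differ by 1, same category
print(rating_consistency(2, 3))  # differ by 1, category change
print(rating_consistency(1, 3))  # extreme category change
```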

Table 4 shows the items of the scale and the abstracts’ counts by consistency of item

response. Although there was a fair amount of interrater inconsistency overall, 4 questions (4, 5, 6, and 10) stood out as the least consistent (according to the last column in Table 4). To address the inconsistency of ratings in some items, we reduced the total

number of items from 11 to 7 questions by eliminating those items that showed the least

consistency between the 2 raters in the pilot study. Qualitative feedback from the raters

recommended different wording for some of the questions and changing the rating scale from 4 response options to 5, so that raters were not forced to choose between person-centered and non–person-centered. Also, it was recommended that the definitions of terms

such as beliefs, attitudes, concerns, and values be removed from the questions themselves and

placed somewhere else, perhaps in the instructions for raters. It was also suggested that we

rename the scale to assess person-centeredness rather than patient-centeredness, following

newer recommendations.42,43


Table 4. Revised PCoR Scale Items, Descriptive Statistics, PCA Loading Solution, and Pearson Correlation Coefficients With Total Score

1. There is evidence that beliefs relevant to the population of interest or to patients/community members in general are included or addressed in the research. Factor loading: 0.851; mean (SD): 0.22 (1.624); Pearson r: 0.852a

2. There is evidence that attitudes relevant to the population of interest or to patients/community members in general are included or addressed in the research. Factor loading: 0.921; mean (SD): 0.26 (1.611); Pearson r: 0.921a

3. There is evidence that concerns relevant to the population of interest or to patients/community members in general are included or addressed in the research. Factor loading: 0.898; mean (SD): 0.59 (1.552); Pearson r: 0.893a

4. Person/community-centered goals and/or outcomes are included or addressed in the research. Factor loading: 0.892; mean (SD): 0.61 (1.556); Pearson r: 0.888a

5. Research priorities of interest to the patient/community are included or addressed in the research. Factor loading: 0.864; mean (SD): 0.50 (1.522); Pearson r: 0.861a

6. The needs of the patient/community are included or addressed in the research. Factor loading: 0.871; mean (SD): 0.40 (1.557); Pearson r: 0.869a

7. Individuals representing patients and/or communities are engaged in the research as stakeholders, advisors, consultants, or team members (beyond serving as research participants or volunteers). Factor loading: 0.784; mean (SD): –0.18 (1.717); Pearson r: 0.797a

a p < 0.001.


Data analysis for the revised (final) version of the PCoR Scale showed high internal consistency (Cronbach α = 0.945). Table 4 shows the mean scores of each item, their factor loadings, and the correlation of each item with the total scale score. The items loaded on a single factor with an eigenvalue of 5.29, explaining 75.63% of the variance. Discriminant analysis showed the scale correctly classified 71.4% of the abstracts as belonging to either a person-centered conference or a translational science conference. Interrater reliability was high, with an average-measures intraclass correlation coefficient (ICC) of 0.891 (p < 0.001) for one team of raters and 0.950 (p < 0.001) for the other. Table 5 shows the mean scores for the abstracts as well as the discriminant analysis classification results.
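For readers who want to reproduce the internal-consistency statistic, Cronbach's alpha for k items is k/(k-1) multiplied by (1 minus the sum of the item variances divided by the variance of the total scores). The sketch below implements that formula on made-up ratings, not study data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha, given one list of scores per item (columns of a
    respondents-by-items matrix)."""
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):       # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    sum_item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Perfectly parallel items yield the maximum alpha of 1.0.
demo = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
print(round(cronbach_alpha(demo), 3))  # 1.0
```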

Table 5. PCoR Scale (Revised, Final Version) Mean Scores for Abstracts Funded by PCORI

PCORI-funded abstracts: mean total score (SD), 7.15 (7.96); predicted group membership: 88 person-centered, 22 non–person-centered, 110 total.

Translational abstracts: mean total score (SD), –2.08 (9.50); predicted group membership: 37 person-centered, 54 non–person-centered, 91 total.


DISCUSSION

This is the first study to compare person-centeredness of stakeholder input to

researcher input on project-specific questions. In a head-to-head comparison, transcripts of

consultations with patients and community stakeholders showed a different set of research recommendations and priorities from those provided by consultations with

researchers. These findings address a critical gap in the evidence base on the value of

stakeholder engagement.

Prior reports support stakeholder engagement as an approach to increase the

translation, dissemination, and uptake of research findings.34,39,47 Additional evidence supports

the value of stakeholders in prioritizing research and empowering patients to be more engaged.

Although stakeholder engagement in research has been more widely embraced in recent years,

literature demonstrating its value and impact is limited and is often derived from descriptive,

retrospective data. Prospective studies of engagement have been case reports or qualitative analyses of engagement across multiple studies with differing types of engagement strategies

and no comparison or control.35,47,48 In this study, we prospectively assessed input from patient

and community stakeholders quantitatively and qualitatively in 20 studies using a structured

method of engagement.

To conduct this work, we developed tools that can be used broadly to help advance the

field of stakeholder engagement: a quantitative instrument to assess the person-centeredness

of research and a taxonomy of changes in research due to stakeholder engagement. The lack of

tools to guide assessment of stakeholder engagement is widely recognized, and the

development and validation of the PCoR Scale fills a critical gap in the field. This new

instrument has the potential to be used across a broad range of research projects to assess the

extent of person-centeredness of research products.

The taxonomy models the types of impact stakeholder input has on research, for purposes of evaluation. Its domains can categorize stakeholder involvement across the overall research enterprise and characterize the value and type of each contribution, creating a baseline against which the effectiveness of stakeholder input can be measured.

Figure 2 provides a global, dimensional view of the overall taxonomy, offering a spatial reference for where in the research continuum stakeholder engagement activities can take place.

The purpose of any taxonomy is to create a standard set of terms; ours describes the potential impact of stakeholder engagement on research. The lack of standard and common

nomenclature in stakeholder engagement has been identified as a gap in the field, and this

taxonomy helps address this gap. The Stakeholder Impacts in Research Taxonomy contains

standardized global categories and naming structures that could be used to better understand

how stakeholder input changes research across multiple studies.

The PCoR Scale addresses a critical methods gap in patient-centered outcomes

research—namely that of quantitative evaluation of patient-centeredness for purposes of

comparative effectiveness research and standards development in that field. A quantitative tool

with operationalized questions assessing the person-centeredness of research products will

allow individuals to compare any 2 products of research and assess the extent of person-

centeredness of each of them. It will also provide an objective measure for policymakers to

assess extent of person-centeredness. Because no standardized benchmark scores for person-centeredness of research products yet exist, a summary score is not meaningful in isolation; for now, the scale serves as a measurable tool for comparing the extent of person-centeredness between research products.

We believe the measure effectively and reproducibly quantifies multiple aspects of person-centeredness, a complex concept. For example, beliefs, attitudes, and values are related but not equivalent concepts, and the measure treats them as distinct so that each can contribute separately to the level of person-centeredness, even though the factor analysis indicates that responses to the items are statistically correlated. We included definitions of the questions’ key words to facilitate the distinction between them.

Limitations

This study has several limitations. Although the structured, easy-to-replicate CE Studio

model is ideal for generating and evaluating the outputs of engagement, it is a consultative

method of engagement and may not elicit feedback reflecting the full range of stakeholder

engagement, which may result in more or less patient/person-centeredness. Our work

compared input from community and patient stakeholders with input from researchers instead

of stakeholder input using a different method. While it might be presumed that stakeholder

input is more person-centered, we believed it was critical to have evidence to support this

presumption, as the value of engagement is often questioned. Finally, the quantitative scale

does not assess the degree or magnitude of stakeholder engagement, which could impact

patient-centeredness; the scale was validated using only conference abstracts; and we did not

assess its psychometric properties when applied to other research products. We intend to

assess the external validity of the instruments in future work using grant proposals, journal

articles, and lay reports. The lack of a similar instrument assessing a similar construct prevented

us from assessing concurrent validity. Although we used the same abstracts and raters for both

steps of the validation process, raters were not completely blind to all abstracts we provided;

however, they were blind to at least 13 abstracts out of the 20 that they assessed. We also

eliminated the abstracts that were more person-centered (had higher ratings) in the pilot study

from the validation study. Despite these limitations, this work fills a well-recognized gap in

methods of PCOR and provides the foundation for adapting and refining methods for other

types of engagement as needed.


CONCLUSIONS

In this pivotal work, we found input from patients and community stakeholders to be

more person-centered than input from researchers. This evidence supports the value of

stakeholder engagement in research and uses a method of engagement that is easy to

replicate. The qualitative and quantitative tools developed in this project have the potential to

substantially advance the field of engagement. Broad dissemination of these methods should

be considered to help standardize approaches to assessing the impact of stakeholder

engagement in research. Additional studies are needed to understand the value of stakeholder

engagement using other methods of engagement and to assess the impact of engagement on

research.


REFERENCES

1. Joosten YA, Israel TL, Williams NA, et al. Community Engagement Studios: a structured approach to obtaining meaningful input from stakeholders to inform research. Acad Med. 2015;90:1646-1650.

2. Byrne DW, Biaggioni I, Bernard GR, et al. Clinical and translational research studios: a multidisciplinary internal support program. Acad Med. 2012;87:1052-1059.

3. Wilkins CH, Spofford M, Williams N, et al. Community representatives’ involvement in clinical and translational science awardee activities. Clin Transl Sci. 2013;6:292-296.

4. Crowley WF Jr, Sherwood L, Salber P, et al. Clinical research in the United States at a crossroads: proposal for a novel public-private partnership to establish a national clinical research enterprise. JAMA. 2004;291:1120-1126.

5. Krumholz HM, Selby JV; Patient-centered Outcomes Research Institute. Seeing through the eyes of patients: the patient-centered outcomes research institute funding announcements. Ann Intern Med. 2012;157(6):446-447. doi:10.7326/0003-4819-157-6-201209180-00519.

6. Pignone M. Challenges to implementing patient-centered research. Ann Intern Med. 2012;157:450-451.

7. Fleurence R, Selby JV, Odom-Walker K, et al. How the Patient-centered Outcomes Research Institute is engaging patients and others in shaping its research agenda. Health Aff (Millwood). 2013;32:393-400.

8. Guise JM, O’Haire C, McPheeters M, et al. A practice-based tool for engaging stakeholders in future research: a synthesis of current practices. J Clin Epidemiol. 2013;66:666-674.

9. O’Haire C, McPheeters M, Nakamoto E, et al. Engaging Stakeholders to Identify and Prioritize Future Research Needs. Rockville, MD: Agency for Healthcare Research and Quality; 2011.

10. Staley K. Exploring Impact: Public Involvement in NHS, Public Health and Social Care Research – INVOLVE. 2009. https://www.invo.org.uk/posttypepublication/exploring-impact-public-involvement-in-nhs-public-health-and-social-care-research/

11. Saunders C, Girgis A. Status, challenges and facilitators of consumer involvement in Australian health and medical research. Health Res Policy Syst. 2010;8:34.

12. Gagnon MP, Desmartis M, Lepage-Savary D, et al. Introducing patients’ and the public’s perspectives to health technology assessment: a systematic review of international experiences. Int J Technol Assess Health Care. 2011;27:31-42.

13. McLaughlin H. Involving young service users as co-researchers: possibilities, benefits and costs. Br J Soc Work. 2006;36:1395-1410.


14. Rowe A. The effect of involvement in participatory research on parent researchers in a Sure Start programme. Health Soc Care Community. 2006;14:465-473.

15. Savage CL, Xu Y, Lee R, Rose BL, Kappesser M, Anthony JS. A case study in the use of community-based participatory research in public health nursing. Public Health Nurs. 2006;23:472-478.

16. Walter I, Davies H, Nutley S. Increasing research impact through partnerships: evidence from outside health care. J Health Serv Res Policy. 2003;8(suppl 2):58-61.

17. Minkler M, Fadem P, Perry M, Blum K, Moore L, Rogers J. Ethical dilemmas in participatory action research: a case study from the disability community. Health Educ Behav. 2002;29:14-29.

18. Plumb M, Price W, Kavanaugh-Lynch MHE. Funding community-based participatory research: lessons learned. J Interprof Care. 2004;18:428-439.

19. McCormick S, Brody J, Brown P, Polk R. Public involvement in breast cancer research: an analysis and model for future research. Int J Health Serv. 2004;34:625-646.

20. Faulkner A. Changing Our Worlds: Examples of User-controlled Research in Action. 2010. https://www.invo.org.uk/wp-content/uploads/2011/09/INVOLVEChangingourworlds2010.pdf

21. Boote J, Telford R, Cooper C. Consumer involvement in health research: a review and research agenda. Health Policy. 2002;61:213-236.

22. Smith E, Ross F, Donovan S, et al. Service user involvement in nursing, midwifery and health visiting research: a review of evidence and practice. Int J Nurs Stud. 2008;45:298-315.

23. Minogue V, Boness J, Brown A, Girdlestone J. The impact of service user involvement in research. Int J Health Care Qual Assur Inc Leadersh Health Serv. 2005;18:103-112.

24. Wyatt K, Carter M, Mahtani V, Barnard A, Hawton A, Britten N. The impact of consumer involvement in research: an evaluation of consumer involvement in the London Primary Care Studies Programme. Fam Pract. 2008;25:154-161.

25. Workman T, Maurer M, Carman K. Unresolved tensions in consumer engagement in CER: a US research perspective. J Comp Eff Res. 2013;2:127-134.

26. Holmes W, Stewart P, Garrow A, Anderson I, Thorpe L. Researching Aboriginal health: experience from a study of urban young people’s health and well-being. Soc Sci Med. 2002;54:1267-1279.

27. Rhodes P, Nocon A, Booth M, et al. A service users’ research advisory group from the perspectives of both service users and researchers. Health Soc Care Community. 2002;10:402-409.

28. Trivedi P, Wykes T. From passive subjects to equal partners: qualitative review of user involvement in research. Br J Psychiatry. 2002;181:468-472.


29. Wright D, Corner J, Hopkinson J, Foster C. Listening to the views of people affected by cancer about cancer research: an example of participatory research in setting the cancer research agenda. Health Expect. 2006;9:3-12.

30. Broad B, Saunders L. Involving young people leaving care as peer researchers in a health research project: a learning experience. Res Policy Plan. 1998;16:1-8.

31. Harris PA, Swafford JA, Edwards TL, et al. StarBRITE: the Vanderbilt University Biomedical Research Integration, Translation and Education portal. J Biomed Inform. 2011;44:655-662.

32. Pulley JM, Harris PA, Yarbrough T, Swafford J, Edwards T, Bernard GR. An informatics-based tool to assist researchers in initiating research at an academic medical center: Vanderbilt Customized Action Plan. Acad Med. 2010;85:164-168.

33. Pulley JM, Bernard GR. Proven processes: The Vanderbilt Institute for Clinical and Translational Research. Clin Transl Sci. 2009;2:180-182.

34. Brett J, Staniszewska S, Mockford C, et al. Mapping the impact of patient and public involvement on health and social care research: a systematic review. Health Expect. 2014;17:637-650.

35. Forsythe LP, Ellis LE, Edmundson L, et al. Patient and stakeholder engagement in the PCORI Pilot Projects: description and lessons learned. J Gen Intern Med. 2016;31:13-21.

36. Sheridan S, Schrandt S, Forsythe L, Hilliard TS, Paez KA. The PCORI Engagement Rubric: promising practices for partnering in research. Ann Fam Med. 2017;15:165-170.

37. Eder MM, Carter-Edwards L, Hurd TC, Rumala BB, Wallerstein N. A logic model for community engagement within the CTSA Consortium: can we measure what we model? Acad Med. 2013;88:1430-1436.

38. Mockford C, Staniszewska S, Griffiths F, Herron-Marx S. The impact of patient and public involvement on UK NHS health care: a systematic review. Int J Qual Health Care. 2012;24:28-38.

39. Esmail L, Moore E, Rein A. Evaluating patient and stakeholder engagement in research: moving from theory to practice. J Comp Eff Res. 2015;4:133-145.

40. Crocker JC, Boylan AM, Bostock J, Locock L. Is it worth it? Patient and public views on the impact of their involvement in health research and its assessment: a UK-based qualitative interview study. Health Expect. 2017;20:519-528.

41. Selby JV, Beal AC, Frank L. The Patient-centered Outcomes Research Institute (PCORI) national priorities for research and initial research agenda. JAMA. 2012;307:1583-1584.

42. Miles A, Mezzich J. The care of the patient and the soul of the clinic: person-centered medicine as an emergent model of modern clinical practice. Int J Pers Cent Med. 2011;1:207-222.

43. Starfield B. Is patient-centered care the same as person-focused care? Perm J. 2011;15:63.


44. Légaré F, Boivin A, van der Weijden T, et al. Patient and public involvement in clinical practice guidelines: a knowledge synthesis of existing programs. Med Decis Making. 2011;31:E45-E74. doi: 10.1177/0272989X11424401

45. Helfand M, Berg A, Flum D, Gabriel S, Normand S-L. Draft Methodology Report: Our Questions, Our Decisions and Standards for Patient-centered Outcomes Research. Washington, DC: PCORI; 2012.

46. Deverka PA, Lavallee DC, Desai PJ, et al. Stakeholder participation in comparative effectiveness research: defining a framework for effective engagement. J Comp Eff Res. 2012;1:181-194.

47. Barber R, Beresford P, Boote J, Cooper C, Faulkner A. Evaluating the impact of service user involvement on research: a prospective case study. Int J Consum Stud. 2011;35:609-615.

48. Buck D, Gamble C, Dudley L, et al; EPIC Patient Advisory Group. From plans to actions in patient and public involvement: qualitative study of documented plans and the accounts of researchers and patients sampled from a cohort of clinical trials. BMJ Open. 2014;4:e006400. doi:10.1136/bmjopen-2014-006400


PUBLICATIONS

Johnson DA, Joosten YA, Wilkins CH, Shibao CA. Case study: community engagement and clinical trial success: outreach to African American women. Clin Transl Sci. 2015;8(4):388-390.

Joosten YA, Israel TL, Williams NA, et al. Community Engagement Studios: a structured approach to obtaining meaningful input from stakeholders to inform research. Acad Med. 2015;90(12):1646-1650.


APPENDIX A: Community Partner Organizations Represented in the Community Advisory Council

Matthew Walker Comprehensive Health Center

Nashville Health Disparities Coalition

Belmont University School of Nursing

Progreso Community Center

Nashville Latino Health Coalition

YMCA of Middle Tennessee

State Alliance of YMCAs

Healthy Nashville Leadership Council

TSU Center for Health Research

Council on Aging of Greater Nashville

Senior Services Network

NAACP Nashville Branch

Meharry Sickle Cell Center

Tennessee Department of Health

Tennessee Cancer Coalition

National Healthcare for the Homeless

Nashville CARES

Siloam Family Health Center

Nashville Area Metropolitan Planning Organization

Tennessee Obesity Task Force

United Way of Metropolitan Nashville

Metro Nashville Health Department

Metro Nashville Public Schools

Neighborhoods Resource Center

Community Partners Network

New Unity Church


APPENDIX B: Literature Search for Candidate Item Generation

Query:

(((((((((patient-centeredness[All Fields] OR ("patient outcome assessment"[MeSH Terms] OR

("patient"[All Fields] AND "outcome"[All Fields] AND "assessment"[All Fields]) OR "patient

outcome assessment"[All Fields] OR ("patient"[All Fields] AND "centered"[All Fields] AND

"outcomes"[All Fields] AND "research"[All Fields]) OR "patient centered outcomes research"[All

Fields])) OR (("residence characteristics"[MeSH Terms] OR ("residence"[All Fields] AND

"characteristics"[All Fields]) OR "residence characteristics"[All Fields] OR "community"[All

Fields]) AND engaged[All Fields] AND ("research"[MeSH Terms] OR "research"[All Fields]))) OR

(stakeholder[All Fields] AND engagement[All Fields] AND ("research"[MeSH Terms] OR

"research"[All Fields]))) OR (participatory[All Fields] AND ("research"[MeSH Terms] OR

"research"[All Fields]))) OR (("Evaluation"[Journal] OR "Evaluation (Lond)"[Journal] OR

"evaluation"[All Fields]) AND ("residence characteristics"[MeSH Terms] OR ("residence"[All

Fields] AND "characteristics"[All Fields]) OR "residence characteristics"[All Fields] OR

"community"[All Fields]) AND engagement[All Fields] AND ("research"[MeSH Terms] OR

"research"[All Fields]))) OR (("Evaluation"[Journal] OR "Evaluation (Lond)"[Journal] OR

"evaluation"[All Fields]) AND ("patient participation"[MeSH Terms] OR ("patient"[All Fields]

AND "participation"[All Fields]) OR "patient participation"[All Fields] OR ("patient"[All Fields]

AND "engagement"[All Fields]) OR "patient engagement"[All Fields]) AND ("research"[MeSH

Terms] OR "research"[All Fields]))) OR (("Impact (Am Coll Physicians)"[Journal] OR "impact"[All

Fields]) AND ("residence characteristics"[MeSH Terms] OR ("residence"[All Fields] AND

"characteristics"[All Fields]) OR "residence characteristics"[All Fields] OR "community"[All

Fields]) AND engagement[All Fields] AND ("research"[MeSH Terms] OR "research"[All Fields])))

OR (("Impact (Am Coll Physicians)"[Journal] OR "impact"[All Fields]) AND ("patient

participation"[MeSH Terms] OR ("patient"[All Fields] AND "participation"[All Fields]) OR

"patient participation"[All Fields] OR ("patient"[All Fields] AND "engagement"[All Fields]) OR

"patient engagement"[All Fields]) AND ("research"[MeSH Terms] OR "research"[All Fields]))) OR

(("Impact (Am Coll Physicians)"[Journal] OR "impact"[All Fields]) AND ("family"[MeSH Terms]


OR "family"[All Fields]) AND engagement[All Fields] AND ("research"[MeSH Terms] OR

"research"[All Fields]))) OR (("Impact (Am Coll Physicians)"[Journal] OR "impact"[All Fields])

AND ("caregivers"[MeSH Terms] OR "caregivers"[All Fields] OR "caregiver"[All Fields]) AND

engagement[All Fields] AND ("research"[MeSH Terms] OR "research"[All Fields])) AND

("0001/01/01"[PDAT] : "2013/12/31"[PDAT])

Search Results URL: https://tinyurl.com/wilkins-PCOR-PubMedSearch

Search Details:


[Bar chart: number of PubMed results per year returned by the search, by publication year from 1960 through 2013, with yearly counts ranging from near 0 to roughly 9,000.]


APPENDIX C: Qualitative Analysis Codebook

Table 1. Potential Areas of Impact for Patient (and other Stakeholder) Engagement1

Domains/Parent Code* Elements/Subcode* 1. Pre-Research/Proposal Development This describes early stages in the research process when trying to determine what, where and why?

a. Idea/topic generation b. Identify issues of greatest importance c. Input on Relevance/Purpose d. Identify stakeholders/potential partners e. Development of specific aims**

2. Infrastructure* Involvement in the foundation/logistics of the project, distribution of funds and planning

a. Funding source decisions b. Preparation of budget c. Sharing of funds d. Appropriate compensation for stakeholders (patients, consumers, community organizations)* e. Time* f. Cost* g. Process/structure for shared decision making* h. Scope of work (who is funded, why, and what is being done)**

3. Research Design This describes defining the how and who. Which populations and how do we study this question within those populations. What is the best approach?

a. Define population b. Selection of patient-centered tools c. Organize ideas and capture the way the research will be applied. d. Provide input on research methods e. Grant writing/proposal development f. Framing research questions g. Selection of comparators & outcomes h. Revise the research protocol i. Input on cultural appropriateness j. Provide input on setting for the research**

4. Implementation of Research This also describes the how, but more specifically execution of the project. Developing the steps on how to complete the objective of the research.

a. Identify/hire research team members b. Recruitment of research participants c. Identify best approaches to recruitment and retention d. Determine best approaches to data collection (in person vs online vs telephone; survey vs interview; self-report vs caregiver report) e. Assist with data collection f. Identify potential stigmas for condition studied**

5. Analysis of Research This describes the steps after data are collected, which could include statistical measures and interpretation.

a. Assist with data analysis (train to do qualitative analysis) b. Provide alternative interpretation of research results (especially those that are counterintuitive) c. Bring attention to factors (confounders) that may not have been measured or documented in literature d. Interpret- assess plausibility of results e. Review results and provide context for relevance to patients and stakeholders

6. Dissemination of Research Findings Refers to distributing the results/discussing outcomes or work-in-progress of the research in oral or written forums. Appropriateness of the delivered message/objectives/audience.

a. Provide culturally relevant and appropriate language b. Co-authorship of manuscripts c. Write for non-scientific publication d. Advise on appropriate audiences and non-traditional venues for dissemination e. Convene town hall meetings and other opportunities for dissemination f. Create companion materials for dissemination: videos, newsletters, brochures, PowerPoint presentations, handouts, etc.** g. Social media outreach** h. Identify appropriate community agencies who would benefit from the research** i. Provide input on who to reach and how (appropriate message delivery)** j. Organize ideas and capture the way the research will be applied**

7. Translation/Post-Research** a. Next follow-up question (you have this data now what? What are the next steps?) b. Actionable steps (implementation of what discovered from the research) c. How to follow-up with participants d. Overall impact of the research on the community e. Helping formulate next steps/convene appropriate audiences for further action/post-research action

8. Ethics* Assuring the protection of and respect for volunteers/participants in research. Making sure the participants are well informed and risks and benefits are made clear.

a. Consent process b. Acceptability of research c. Protection of individuals vs protection of communities d. Privacy (might be implied in consent process) e. Risks/Benefits (i.e. health, increased knowledge)* f. Education of group or community that the investigator plans to engage**

9. Quality Improvement** a. Improving processes, increase efficiency (i.e. saving consumer/industry money, saving patient time out of schedule, finding ways to “free-up” doctor’s time) b. How better serve patients/articulate information more clearly c. Improvements on how to prepare patients before seeing the doctor and making decisions on health

10. Engagement* Internal Processes for Engagement; External Types and Quality of Stakeholder Input

Wilkins, Boyer, Joosten, Richmond, Vaughn, Boone, Israel; version 3, October 16, 2014.

Note: * Input/domains and elements provided by team. Domains are also highlighted in orange. ** Elements/Domains provided by interviews conducted. Domains are also highlighted in green.

Free Codes Validation Round 1:

• Bias
• Buy-in
• Consent
• Define measures
• Empowerment through knowledge
• Ethics outside of research
• Individualized Care
• Layperson terms
• Operating in Silos

Free Codes Validation Round 2:

• Logistics of research protocol
• Education of Participants
• Tech preference
• Comfort Level
• Concerns about tech access
• Need for clarity
• Role of Research
• Tailoring to Improve
• Language as a barrier
• Language Terminology as Barrier


Copyright © 2019 Vanderbilt University Medical Center. All Rights Reserved.

Disclaimer:

The views, statements, and opinions presented in this report are solely the responsibility of the author(s) and do not necessarily represent the views of the Patient-Centered Outcomes Research Institute® (PCORI®), its Board of Governors or Methodology Committee.

Acknowledgement:

Research reported in this report was [partially] funded through a Patient-Centered Outcomes Research Institute® (PCORI®) Award (#ME-1306-03342). Further information is available at: https://www.pcori.org/research-results/2013/comparing-methods-make-research-more-patient-centered