
SRDMS: Summary Statement Data Import Process and Report Guide
Not applicable for LRP or applications with ‘component’ sub-projects.

QUICK LINKS

SRDMS Website

QVR Project Search Website

Summary Statement Data Import

o General Import Information

o QVR Project Search Criteria

o QVR Code and/or Score Data Report

o Multi-Project Scores Only: Peer Review Module Voter Matrix

o SRDMS Data Import

IAR Pre-SS Report(s) Download and SRDMS Critique Extractor

SS Data Builder

Summary Statement Download Form

Appendix A: Critique Extractor Guide

Appendix B: SS Template Business Rules

SUMMARY STATEMENT DATA IMPORT AND PRE-SS DOWNLOAD PROCESS

General Import Information
The summary statement report requires the QVR custom download report used for the initial import process plus the IMPAC II IAR module’s Pre-SS reports and, for multi-project applications only, the IMPAC II Voter Matrix (Excel). These can be saved in any location.

The custom download QVR report needs to be saved in Microsoft Excel 97-2003 (.xls) format and contain either ADMIN or APP in the file name.

For Multi-Projects Only, the IMPAC II Voter Matrix (Excel) report needs to contain either SCORE or MATRIX in the file name.
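For illustration only (this is not part of SRDMS), the naming rules above can be thought of as a simple substring check on the saved file name; whether the real matching is case-insensitive is an assumption here.

from pathlib import Path

def classify_report(filename: str):
    """Guess which SRDMS import slot a saved report belongs to (illustrative sketch).

    Assumes a case-insensitive substring match on the file name and that the
    QVR report must be a 97-2003 .xls workbook, per the rules above.
    """
    name = Path(filename).stem.upper()
    ext = Path(filename).suffix.lower()
    if "SCORE" in name or "MATRIX" in name:
        return "Peer Review Voter Matrix (multi-project scores)"
    if ("ADMIN" in name or "APP" in name) and ext == ".xls":
        return "QVR app/admin data report"
    return None  # file name matches neither naming rule

print(classify_report("2024_SEP_APP_data.xls"))      # QVR app/admin data report
print(classify_report("2024_SEP_score_matrix.xls"))  # Peer Review Voter Matrix (multi-project scores)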

Once all the reports are downloaded and saved, the rest of the process is completed within the SRDMS.

During the import process these files will be uploaded into the system, data will be re-formatted and then imported into the specific review.

This process can be repeated as many times as needed. There is no need to separate previously imported data from new data – download all relevant review data together. Except for the HS/VA codes & scores, the previously imported data will not be changed or updated as long as the related application number is entered and matches the reports.

Replace the related files each time they are downloaded; there is no need to keep the old ones. Let the SRDMS Help Team know if any data is not imported correctly (the report setup may change from time to time and no longer match what the coding is based on, etc.). Known data oddities:

o The QVR FOA number data is missing the initial RFA, PAR, etc. prefix (e.g. AI16-012).
o Fields that download as ALL CAPS are converted to Title Case. The FOA title is an example of this conversion and may need to be corrected on the related summary statement.
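For example (illustrative only – the SRDMS conversion routine itself is not documented here), Python's built-in title-casing shows why acronyms in an ALL CAPS FOA title need a manual check after import:

# Hypothetical FOA title, used only to show the effect of the conversion.
foa_title = "NIAID RESEARCH PROGRAM ON HIV VACCINES (EXAMPLE FOA TITLE)"
print(foa_title.title())
# 'Niaid Research Program On Hiv Vaccines (Example Foa Title)' - acronyms such
# as NIAID and HIV lose their capitalization, so the title may need to be
# corrected on the related summary statement.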

Notes:
QVR has afternoon and overnight data updates. If application(s) were assigned to a SEP today, they may not be available in QVR until later or the next day.

If an application was imported, then deleted from a specific review, and now needs to be restored/re-imported, an SRDMS admin user can restore the application instead of re-importing it.

QVR Project Search Criteria
1. Log in and open the QVR Project Search Page

2. Use the Primary Search section to enter the criteria necessary to pull the specific application data for the review meeting. Marking the Contact PI Only checkbox is required for all reports. Examples of criteria include:

a. RFAs: Contact PI Only and related RFA number(s) (format is AI##-###)

b. Charter Committee Specific Search Criteria: Contact PI Only, Council (applicable Yr & Month) and Study Section (AIDS, AITC, MID, MID-B, etc.)

c. R13 Conference Grants Specific Search Criteria: Contact PI Only, Council (applicable Yr & Month) and Study Section

d. Investigator-Initiated P01s: Contact PI Only, Study Section and Council (applicable Yr & Month)

Multi-Project Application QVR Criteria Notes:
Do NOT enter a PI name in the QVR PI Name field when exporting these reports. Doing so limits the sub-project data to sub-projects whose leader’s name contains that name (e.g. 2 sub-projects may be imported when there were really 7).

DO mark the Subprojects checkbox (this can be marked for both reports).

3. Mark the Contact PI Only checkbox (this will prevent duplicate records for multiple-PI applications).

QVR Codes and/or Scores Data Report

Notes: The following steps can be used to import either codes or scores or both. To do so, the related data needs to be entered into IMPAC II and available for download. The report can be re-downloaded and imported to add/update these data elements.

1. After selecting the appropriate criteria, click on the Custom Download tab

2. After the Custom Download window appears, click on the Custom Download tab again and select Retrieve Item List

3. When the Retrieve Saved Download Lists window opens, select the Public Column Lists option and then REVIEW from the drop down list as shown below

4. Scroll down and click on the SRDMS-APP+SUB-PROJ DATA list to populate the Custom Download Columns window with the corresponding data fields

5. Click the button
6. When the next window appears, click DOWNLOAD MY RESULTS and then Open to view this report

7. Once the file is open, save it as a Microsoft Office Excel 97-2003 Workbook (*.xls) file with a name that has ADMIN or APP.

8. IF the import is the:
a. initial data upload or SS import for a single-project review, Excel and QVR can be closed
b. SS import for a multi-project review, close the Excel file and follow the instructions below to download the Peer Review Voter Matrix (Excel)

Multi-Project Scores Only: Peer Review Module Voter Matrix (Excel)
1. Log into the IMPAC II Peer Review module: https://apps.era.nih.gov/rev
2. Select the related meeting using the Switch Meeting link/form, as needed

3. Go to the Reports section and select Voter Matrix (Excel)

4. Click Run Report when the next window appears and save the file as an .xls file with a name that has SCORE or MATRIX before closing it

SRDMS Data Import

Note: Multi-Projects Only: Until the next release, for existing reviews (app data already present), upload the app/admin data report used for the VA/HS codes separately from the score matrix. Scores only update when the score matrix is uploaded by itself.

1. Click on the eApplication Import Menu then App/KP/Codes/Score Data Import
2. When the following form opens, either drag and drop files onto the Drop files here box or select the related reports using the Choose Files button. All of the files can be dragged or chosen at the same time.

3. If the HS and VA codes have been entered into IMPAC II or the data in SRDMS can be replaced for ALL of the applications entered in the review, uncheck the ‘Do not import/replace codes’ box to import these.

Helpful hint: Select specific applications in QVR when only a subset have the codes ready to upload.

4. Click on the Upload all button to complete the data import

5. Click OK when the window listing the data import status appears

IAR PRE-SS REPORT(S) DOWNLOAD AND SRDMS CRITIQUE EXTRACTOR

Notes: All critiques should be uploaded and final before downloading this report. However, if any critiques are changed after the initial download, they can be re-downloaded while available in the system.

1. Log into the IMPAC II Peer Review module: https://commons.era.nih.gov/commons/

2. Search for and select the related meeting using the Meeting Search and List of Meetings forms, as needed

3. Click on the By Application link next to the related meeting

4. If necessary (Pre-SS are ‘deleted’ or not present), click on the Generate PSS button. It may take a few minutes for these to be generated; however, the next step can be done and, as needed, re-done to see how many are complete.

5. Click Download Zip of All Pre-SS to download all available MS Word (not the PDF version) files. Do not rename any of these files; the names need to stay exactly as they are exported from IMPAC II so they can be linked to the related application number in the system.

6. Log into SRDMS and open the Post-Review\Pre-SS Critique Extractor form

7. Drag and drop all of the Pre-SS files into the white form area or click the Upload Files button on the Upload Pre-Summary Statement window to start the upload process

a. If using the Upload Files button, click the Add Files button to select or drag and drop all of the previously downloaded Pre-SS reports.

8. After the files are uploaded, check the Status column for any system-identified issues indicated by a red box (shown below)

a. If any are identified or to preview extracted information, click on the blue View icon to open the data reviewer window and look for red Critique arrow icons

b. When the Review SS Detail window opens, click on the related critique down-arrow to expand (or collapse) that section and check data

i. Refer to the Appendix A: Critique Extractor Guide to troubleshoot or learn about the critique extraction function.

c. When issues are found, close the view window and open the related Pre-SS file outside the system to correct any issues and re-save it. Then delete the previously uploaded version before uploading the new one.

9. When the Pre-SS files are uploaded and ready for use in the SS report, select all of the Pre-SS checkboxes (or only those that are ready) and click the Upload to SRDMS button to connect them to the related applications and close the critique extractor form.

SS BUILDER

This data entry form can be used to enter or update the Summary Statement related report data and ‘build’ the headers and footers.

If data is entered in the main application forms, it is brought over into this form and can be updated as needed.

The score field only allows numbers. ND is listed when the data = 0 or is empty.

The BioHazard and Select Agents acceptable/unacceptable data is based on the status of all related records (i.e. if any biohazards or select agents have been entered and are unacceptable, this form’s field will be unacceptable). Comments for the separate entries made in the main Special Issues form are not part of this form yet; only the HS/VA code comments are listed.
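A minimal sketch of the two display rules just described, using hypothetical field names (this is not the SRDMS code):

def display_score(score):
    # ND is listed when the score is 0 or empty (per the rule above).
    return "ND" if not score else str(score)

def overall_acceptability(record_statuses):
    # The form field shows Unacceptable if ANY related biohazard/select agent
    # record has been entered and is unacceptable (per the rule above).
    return "Unacceptable" if any(s == "Unacceptable" for s in record_statuses) else "Acceptable"

print(display_score(0))                                        # ND
print(display_score(35))                                       # 35
print(overall_acceptability(["Acceptable", "Unacceptable"]))   # Unacceptable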

Select the related Application and Sub-Project, as applicable, to limit the results to a single row of data that is easy to view and update.

SS headers and footers are displayed according to the business rules shown in Appendix B: SS Template Business Rules.

Data does need to be entered (a single letter will do) for the SRO Admin Note, Budgetary Overlap and Committee Budget Recommendation headers to appear. After the Summary Statement report is downloaded, more detailed statements can be added.

Upon closing the form, a ‘Do you want to leave this page?’ message will appear to allow a data save function to occur. Click the button that allows you to leave the page.

SUMMARY STATEMENT DOWNLOAD FORM

This form lists applications that have application numbers entered and are not marked withdrawn/unresponsive; it is used to download individual summary statements.

The Upload Status column shows the status of the related critiques.

Clicking on an application number will begin the summary statement report download and allow it to be either saved or opened, depending on the Internet Explorer or Chrome browser function.

APPENDIX A: CRITIQUE EXTRACTOR GUIDE

The SRDMS extraction tool uses the IMPAC II Pre-SS report (MS Word version) to extract the scored review criteria scores and critique information for each application. All information/data elements extracted will be part of the SRDMS summary statement report.

Multiple Pre-SS report files can be uploaded at the same time. The application number listed at the end of the file name will be used to link to the application data present in the system. The extraction tool will not pull critiques if the related application number does not already exist in the system.
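For illustration, the link could be made by matching an NIH application-number-shaped token at the end of the file name; the exact IMPAC II file-name layout and the extractor's real matching logic are assumptions here, not documented behavior.

import re

# Common NIH application number shape, e.g. 1R01AI123456-01A1 (an assumption;
# multi-character activity codes such as UG3 would need a broader pattern).
APP_NUM = re.compile(r"\d[A-Z]\d{2}[A-Z]{2}\d{6}(?:-\d{2}[A-Z]?\d?)?")

def app_number_from_filename(filename: str):
    """Return the last application-number-looking token in the file name, if any."""
    matches = APP_NUM.findall(filename)
    return matches[-1] if matches else None

print(app_number_from_filename("PreSS_1R01AI123456-01A1.docx"))  # 1R01AI123456-01A1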

This extraction tool checks for extraction errors and flags these in the ‘Status’ column. The files will still be able to be uploaded; however, the information connected to an error will not be present in the exported SS report. If errors occur, the Pre-SS can be corrected and re-uploaded as needed.

To extract the correct information, the following ‘business rules’ for the critique templates sent to the reviewers and post-meeting pre-SS reports need to be true.

Boxed templates are not supported by the extraction tool. A separate MS Word macro must be run to convert these to table-less documents before uploading or re-uploading.

Required Pre-SS phrases and punctuation (see last page)
When checking Pre-SS after error flags appear, it may be helpful to turn on the show/hide function. This will show all spaces and tabs, and the hard returns that have to be after the Overall Impact, Description and other phrases.

For multi-project applications, the extractor checks for the Project, Core or Admin-Core words above the “(Description as provided by applicant)” line. If missing, the related sub-project will not be extracted.

It checks for the word Critique and the related number (e.g. Critique 1). If missing, the related critique will not be extracted. If there is no text between 2 critiques on the Pre-SS report (reviewer(s) did not submit a critique), nothing will be extracted. The SS will list the critiques based on the number on the Pre-SS report. If the critiques need to be re-ordered for any reason, change the numbers on the Pre-SS report and upload it.

The standard templates have the blue font/criteria sections set up as images. These will be ignored. If any have been changed to text, the text will be extracted and listed on the SS.

Scored Review Criteria

Each must have a number and a period followed by a single space (not a tab), and a colon after the criteria phrase (e.g. “1. Significance:”). Any criteria word/phrase between the number and the colon will be pulled as the criteria (future FOA/criteria update flexibility).

Text listed other than the following will cause an error.

o “1. Significance:” followed by the <<Image>> and “Strengths” is the expected layout.
o “1. Significance:” with a tab instead of a single space between the number and the criteria will cause an error, and the criteria will not be extracted correctly.
o “1. Significance: XXXX” is acceptable; the extra text is ignored.
o “1. Significance:” with “XXXX” on the next line is acceptable; the extra text is captured in a separate box above the strengths section.
o “1. Significance:” followed by the <<Image>> and then “XXXX” (or a hard return) is acceptable; the extra text is captured in a separate box above the strengths section.

Strengths and Weaknesses (S/W) section

Strengths

Weaknesses

S/Ws must not have a colon (e.g. Strengths:) or anything else listed directly after them. Any comments/notes must be on the next line. Anything between the ‘Strength’ and ‘Weakness’ words will be extracted. If only empty bullets are listed (no actual S/W comments), nothing will be extracted. A ^ symbol in white font has been added under the Additional Review Criteria image to stop pulling data for the last weakness per critique. If it is not present, extra phrasing will be listed in the last weakness box.
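To make the rules above concrete, here is a rough sketch of the kind of pattern matching they imply (illustrative patterns only – not the SRDMS extractor's actual code):

import re

# "Critique 1", "Critique 2", ... on their own line.
CRITIQUE_HEADING = re.compile(r"^Critique\s+(\d+)\s*$", re.MULTILINE)
# Number, period, a single space (a tab would not match), criteria phrase, colon.
CRITERIA_HEADING = re.compile(r"^(\d+)\. ([^:\t\n]+):", re.MULTILINE)
# Strengths/Weaknesses must stand alone - no colon or other text after them.
STRENGTHS = re.compile(r"^Strengths\s*$", re.MULTILINE)
WEAKNESSES = re.compile(r"^Weaknesses\s*$", re.MULTILINE)

sample = """Critique 1
1. Significance:
Strengths
Important problem area.
Weaknesses
None noted.
"""
print(CRITIQUE_HEADING.findall(sample))   # ['1']
print(CRITERIA_HEADING.findall(sample))   # [('1', 'Significance')]
print(bool(STRENGTHS.search(sample)), bool(WEAKNESSES.search(sample)))  # True True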

Additional Review Criteria
By default, additional review criteria will not be extracted. These will remain part of the Pre-SS report, and that report can be used to add any necessary sections and comments to the final summary statement.

IF staff would like the additional review criteria comments extracted, the following markers can be added on the original critique template or after the Pre-SS is exported from IMPAC II:
o Add 2 asterisks (**) in front of the additional review criteria to have the extractor keep the criteria and comments with the related critique.
o Add 4 asterisks (****) in front of the additional review criteria to have the extractor put the criteria and comments after the footer section.

Notes & Helpful Hints
Prior to sending the critique templates to the reviewers, do not modify them in any way that will make the requirements listed above untrue (e.g. adding a colon after the word “Strengths”, etc.).

Commonly occurring template issues were related to reviewers not using the critique template at all (making their own or just writing a paragraph for the overall impact), using boxed critique templates, or changing the template (deleting “Overall Impact:” if they did not enter anything, changing “Strengths” to “Strength”, etc.).

If the blue text criteria pictures in the original critique template file are converted to text, this text will be extracted and placed into the SS (blue color is not kept). This has happened AND reviewers have entered their own responses between the criteria questions.

Reviewers tend to repeat their habits, so a reviewer using a boxed critique template for 1 assignment most likely used it for the rest of his/her assignments. Fix all prior to upload or re-upload.

Let reviewers know that they should not make any modifications to the template other than adding their comments.

All of the requirements listed above also need to be true when the Pre-SS report is exported (reviewer submitted version) to avoid system error flags. If a critique has an error flag, the Pre-SS report can be modified and re-uploaded to clear it, as needed.

Check for errors and compare the results window to the related Pre-SS report critique section.

Check the Pre-SS if the data extracted/listed on the SS report seems very minimal for the type of review.

Missing review criteria or overall impact usually means that punctuation or phrasing is missing. Overall Impact and Description phrases need a colon and a hard return after them (nothing else).

If a Pre-SS file doesn’t upload as expected (blank title and other data): open it in MS Word and re-save it as the .docx file type. A few Pre-SSs seem to have corrupt elements that are fixed by re-saving.

Boxed Critique Template Error: If an overall text error and criteria data related to ‘hyperlink’ (sample shown below) appear after uploading the Pre-SS report into the extractor, there is a boxed template in the report. Run the Pre-SS Boxed Template Revision macro to re-format it as a box-less template.

SRDMS Critique Extractor Required Pre-SS Report Elements

Critique <Number>
(Note: the table below is created by IMPAC II; criteria vary depending on application type)
Significance: 3
Investigator(s): 2
Innovation: 3
Approach: 6
Environment: 3

Principal Investigator (PI):

DESCRIPTION:
<Paragraph1> <Paragraph2> <Paragraph3>

Overall Impact:
Or for multi-projects (Tim will add in white font for those without an Overall Impact statement to stop pulling data for the Description section):
Overall Impact – Overall:
Overall Impact – Individual Research Projects:
Overall Impact – Individual Cores:
<Paragraph1> <Paragraph2> <Paragraph3>

<number>. <Criteria>: <Paragraph1><Paragraph2><Paragraph3>

{Text between criteria phrase and “Strengths” will be captured and listed in SS rpt}
Strengths

<Paragraph1> <Paragraph2> <Paragraph3>

Weaknesses <Paragraph1> <Paragraph2> <Paragraph3>

^ - Caret in white font (invisible for reviewers) is below additional review criteria image to stop pulling data for the last weakness per critique

** in front of additional review criteria to have extractor keep the criteria and comments with the related critique.
<Paragraph1> <Paragraph2> <Paragraph3>

**** in front of the additional review criteria to have extractor put the criteria and comments after the footer section.
<Paragraph1> <Paragraph2> <Paragraph3>

APPENDIX B: SS TEMPLATE BUSINESS RULES

The following template shows the business rules/requirements that are used to generate the summary statement headers, footers and other information. This is what the development team used to create this complex report.

The entire document is in black font, MS Word Arial 11, 0.75-inch margins, with text in sentence case or ALL CAPS as shown. Formatting is very important because this is a document that is released to the applicant/public.

Sort by application score (10 is the best score and should appear 1st; the SROs need to complete these 1st) and then by App #, with 0/streamlined ones in the back.
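A minimal sketch of the sort order just described, assuming a score of 0 or None means the application was streamlined/not discussed (hypothetical application numbers):

def ss_sort_key(app):
    # Scored applications first, best (lowest) score of 10 ascending, then App #;
    # 0/streamlined applications go to the back.
    score = app.get("score") or 0
    return (score == 0, score, app["app_number"])

apps = [
    {"app_number": "1R01AI000003-01", "score": 0},
    {"app_number": "1R01AI000001-01", "score": 25},
    {"app_number": "1R01AI000002-01", "score": 10},
]
for a in sorted(apps, key=ss_sort_key):
    print(a["app_number"], a["score"])
# 1R01AI000002-01 10   (best score appears 1st)
# 1R01AI000001-01 25
# 1R01AI000003-01 0    (streamlined, in the back)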

The 1st section is the ‘header’ section for each application SS; it is used to flag potential app issues that are listed in more detail within the SS. Most ‘headers’ only appear if the data entered is unacceptable – see below for specifics. All headers are text/phrases in ALL CAPS and in alphabetical order.

The last section is the ‘footer’ section for each app. All footers are in ALL CAPS and then have comments in sentence case. Unlike the header section, the footer section is not in alpha order; it has a defined order shown in the related section below.

<<>> surround SRDMS data elements.
For applications that have a score of 0/null/streamlined, the yellow highlighted sections will not need to be displayed. SROs will add them if/when needed. For the remaining sections, use the data/business rules listed.
For applications with an activity code of R13, see the purple font additions (2 extra headers and footers). The final font color should be black; purple is just used to flag it as R13 only.
Applications with a T activity code have 2 additional headers/footers; see the blue font.
Applications with a K activity code have 1 header/footer that is also present for the T activity code apps – the RCR hdr/ftr noted in blue font.
SROs: CHIMPANZEE and REVISED headers will need to be added as needed.

<<APP #>>
<<PI LAST NAME, FIRST INITIAL.>>

AUTHENTICATION OF KEY BIOLOGICAL AND/OR CHEMICAL RESOURCES UNACCEPTABLE (display when the field is any of the 3 versions of unacceptable)

APPROPRIATE REPRESENTATION (only display when R13 activity code and Approp Rep is unacceptable)

BIOHAZARD COMMENT (only display when BioH is unacceptable or if Select Agents are either acceptable or unacceptable)

BUDGETARY OVERLAP (display when comments entered into related field)

COMMITTEE BUDGET RECOMMENDATIONS (display when comments entered into related field)

DATA SHARING PLAN (display when related plan is unacceptable)

FOREIGN INSTITUTION (display when foreign component field = Yes, Justified or Not Justified)

GENOMIC DATA SHARING (display anytime GDS is marked – even when acceptable)

INCLUSION OF CHILDREN PLAN UNACCEPTABLE (display when related code is unacceptable, 1U, 2U, etc.)

INCLUSION OF MINORITIES PLAN UNACCEPTABLE (display when related code is unacceptable, 1U, 2U etc.)

INCLUSION OF WOMEN PLAN UNACCEPTABLE (display when related code is unacceptable, 1U, 2U, etc.)

INTRAMURAL INVESTIGATOR(S) (always display – staff may not mark intra inv checkbox)

MODEL ORGANISMS SHARING PLAN (display when related plan is unacceptable)

PROTECTION OF HUMAN SUBJECTS UNACCEPTABLE (display when related code is unacceptable/44 code)

PROVISION OF FAMILY CARE FACILITIES (only display when R13 activity code and FamCareFac is unacceptable)

RECRUITMENT AND RETENTION PLAN TO ENHANCE DIVERSITY (display when activity code begins with a T and the ‘Diversity Plan’ is marked as Unacceptable)

SCIENTIFIC REVIEW OFFICER’S ADMINISTRATIVE NOTES (display when comments are entered into related field, Bio NC is marked or Authentication field is unacc-missing or overstuffed)

SELECT AGENTS (display anytime Select Agents are either acceptable or unacceptable)

TRAINING IN THE RESPONSIBLE CONDUCT OF RESEARCH (display for T or K activity code and only when any of the RCR (PA reqs form) fields for Format, Subject Matter, Faculty Participation, Duration or Frequency are marked as Unacceptable)

VERTEBRATE ANIMAL UNACCEPTABLE (display when related code is unacceptable/44 code)
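For illustration only, a few of the header rules above could be evaluated as shown below; the field names are hypothetical and the real SRDMS data model is more involved:

def build_headers(app):
    """Return the ALL CAPS header lines for one application, alphabetized,
    using a handful of the rules listed above (hypothetical field names)."""
    headers = []
    if app.get("human_subjects_code_unacceptable"):        # e.g. a 44 code
        headers.append("PROTECTION OF HUMAN SUBJECTS UNACCEPTABLE")
    if app.get("data_sharing_plan") == "Unacceptable":
        headers.append("DATA SHARING PLAN")
    if app.get("budgetary_overlap_comments"):              # display when comments entered
        headers.append("BUDGETARY OVERLAP")
    headers.append("INTRAMURAL INVESTIGATOR(S)")            # always displayed
    return sorted(headers)                                  # headers appear in alphabetical order

print(build_headers({"data_sharing_plan": "Unacceptable",
                     "budgetary_overlap_comments": "Overlap noted"}))
# ['BUDGETARY OVERLAP', 'DATA SHARING PLAN', 'INTRAMURAL INVESTIGATOR(S)']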

RESUME AND SUMMARY OF DISCUSSION: This <<Adjectival (based on score)>> <<Type based on 1-3/1st # of App #>> <<Award Type based on App# Activity code>> application entitled <<App Title>> was submitted in response to <<FOA #: FOA Title>> by the <<App Institution>>, <<City>>, <<State>>, with Dr. <<PI 1st Name>> <<PI Last Name>> as <<Award Type Table PD/PI SS Resume PD/PI>>. <<App Specific Aim/Goal statement if any>> <<MPI Statement if any>> <<Foreign Component Comments if any>>

{SRO adds Strengths and weaknesses to Word doc/Export here, keep ~5 lines of space}

IF multi-project application type (based on Admin User\Award Type\Review Type column), Overall Program Critiques are listed here. Otherwise these are placed under the critique section.

Based upon the evaluation of scientific and technical merit, this application received an Impact/Priority score of <<App-Level Score>>.

DESCRIPTION (provided by applicant): <<App Abstract>>

PUBLIC HEALTH RELEVANCE: <<App Relevance>>

Sub-project descriptive sections (see P01 or U19 SSs for exact setup).

1st section – sub-proj descriptors
Type+Number: Title
Project/Core Leader: PI Last Name, First Initial.
{SRO Paragraph – keep about 3 lines of space here}
Project score sentence (only add if score>0 is present): “Based on the evaluation of scientific and technical merit, Project # received an Overall Impact score of score.”

Core score sentence (only add if score>0 is present): “Type+Number received an Overall Impact score of score.”

2nd section – ‘Critiques’ {disclaimer only added once for whole section}

If scored use
CRITIQUE: The comments in the CRITIQUE section were prepared by the reviewers assigned to this application and are provided without significant modification or editing by staff. They are included to indicate the range of comments made during the discussion, and may not reflect the final outcome. The RESUME and SUMMARY OF DISCUSSION section summarizes the final opinion of the committee after the discussion and is the basis for the assigned Overall Impact score.

If not scored/discussed use
CRITIQUE: The comments in the CRITIQUE section were prepared by the reviewers assigned to this application and are provided without significant modification or editing by staff.

Type+Number: Title, then Project/Core Leader: PI Last Name, First Initial
DESCRIPTION (provided by applicant): Abstract

CRITIQUE 1 and others then next project/core.

{Critique section from IMPAC II file(s) here}

THE FOLLOWING RESUME SECTIONS WERE PREPARED BY THE SCIENTIFIC REVIEW OFFICER TO SUMMARIZE THE OUTCOME OF DISCUSSIONS OF THE REVIEW COMMITTEE ON THE FOLLOWING ISSUES:
IF an application has a score of 0/null/streamlined, use the following statement instead – these apps were never discussed by a review panel:
THE FOLLOWING RESUME SECTIONS WERE PREPARED BY THE SCIENTIFIC REVIEW OFFICER BASED ON INDIVIDUAL REVIEWER COMMENTS OR ADMINISTRATIVE REVIEW BY NIH STAFF:

APPROPRIATE REPRESENTATION: <<R13 APPROP REP>> (only display when activity code is R13)
<<R13 Approp Rep Comments>>

PROTECTION OF HUMAN SUBJECTS: <<HS CODE RELATED SS U/A/NA PHRASE (Admin Table)>>
<<HS Code related SS Default Comment (Admin Table)>> IF 30 Code, Else <<HS Comment Field>>

MP Sub-Proj: List overall application code and phrase (same as single project) next to ftr with comments under it (if entered or acceptable default comment), then group sub-projs based on the code entered with comments listed under each grouping.

DATA SAFETY MONITORING PLAN/BOARD: Only list if the DSMB checkbox is marked
MP Sub-Proj: List if overall application checkbox is marked – same as single project.

INCLUSION OF WOMEN PLAN (Resume): <<WOMEN CODE RELATED SS U/A/NA PHRASE (Admin Table)>>
<<Women Code related SS Default Comment (Admin Table)>> IF “Acceptable” (1A, 2A….) Code, Else <<Women Comment Field>>

MP Sub-Proj: List overall application code and phrase (same as single project) next to ftr with comments under it (if entered or acceptable default comment), then group sub-projs based on the code entered with comments listed under each grouping.

INCLUSION OF MINORITIES PLAN (Resume): <<MINORITIES CODE RELATED SS U/A/NA PHRASE (Admin Table)>>
<<Minorities Code related SS Default Comment (Admin Table)>> IF “Acceptable” (1A, 2A….) Code, Else <<Minorities Comment Field>>

MP Sub-Proj: List overall application code and phrase (same as single project) next to ftr with comments under it (if entered or acceptable default comment), then group sub-projs based on the code entered with comments listed under each grouping.

INCLUSION OF CHILDREN PLAN (Resume): <<CHILDREN CODE RELATED SS U/A/NA PHRASE (Admin Table)>>
<<Children Code related SS Default Comment (Admin Table)>> IF “Acceptable” (1A, 2A….) Code, Else <<Children Comment Field>>

MP Sub-Proj: List overall application code and phrase (same as single project) next to ftr with comments under it (if entered or acceptable default comment), then group sub-projs based on the code entered with comments listed under each grouping.

VERTEBRATE ANIMAL (Resume): <<VERTEBRATE ANIMALS CODE RELATED SS U/A/NA PHRASE (Admin Table)>>
<<Vert Animal Code related SS Default Comment (Admin Table)>> IF 30 Code, Else <<Vertebrate Animals Comment Field>>

MP Sub-Proj: List overall application code and phrase (same as single project) next to ftr with comments under it (if entered or acceptable default comment), then group sub-projs based on the code entered with comments listed under each grouping.

BIOHAZARD COMMENT: <<ACCEPTABLE/UNACCEPTABLE/NOT APPLICABLE based on field value>>
“The plan to prevent risks during handling of biohazard materials or samples is adequate.” IF all “Acceptable”, Else <<BioH SS Comments, if any>> {Add this footer anytime Select Agents or BioHs are present}

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered or acceptable default comment). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

RECRUITMENT AND RETENTION PLAN TO ENHANCE DIVERSITY: <<ACC/UNACC Value>> (only display when activity code starts with a T)
<<Diversity plan comments>>

TRAINING IN THE RESPONSIBLE CONDUCT OF RESEARCH: <<ACCEPTABLE/UNACCEPTABLE based on all entries – if any are unacceptable this will display unacceptable>> (only display this section for T and K activity codes; note – subsections below are all in sentence case, not all caps)
Format: <<Format Acceptable/Unacceptable>> <<Format Comments>>
Subject Matter: <<Subject Matter Acceptable/Unacceptable>> <<Subject Matter Comments>>
Faculty Participation: <<Faculty Participation Acceptable/Unacceptable>> <<Faculty Participation Comments>>
Duration: <<Duration Acceptable/Unacceptable>> <<Duration Comments>>
Frequency: <<Frequency Acceptable/Unacceptable>> <<Frequency Comments>>

PROVISION OF FAMILY CARE FACILITIES: <<R13 FamCareFac>> (only display when activity code is R13)
<<R13 FamCareFac Comments>>

SELECT AGENTS: <<ACCEPTABLE/UNACCEPTABLE/NOT APPLICABLE based on field value>>
<<Select Agent SS Comments, if any>> {add this footer anytime Select Agents are present}

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

RESOURCE SHARING PLANS

DATA SHARING PLAN: <<ACCEPTABLE/UNACCEPTABLE/NOT APPLICABLE value in ALL CAPS>>
<<Data Sharing Plan SS Comments>>

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

MODEL ORGANISM SHARING PLAN: <<ACCEPTABLE/UNACCEPTABLE/NOT APPLICABLE value in ALL CAPS>>
<<Model Organism Plan SS Comments>>

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

GENOMIC DATA SHARING PLAN: comments only, no Acc/UnAcc/NA data
<<GDS SS Comments>>

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

AUTHENTICATION OF KEY BIOLOGICAL AND/OR CHEMICAL RESOURCES: <<ACCEPTABLE/UNACCEPTABLE/NOT APPLICABLE value in ALL CAPS>>
<<Authentication Comments>> {add this footer for every SS}

MP Sub-Proj: This footer only relates to the overall application level – no sub-projects. {This may need to be added for sub-projs (data and SS rpting) in future-new policy so still being developed.}

FOREIGN INSTITUTION: <<JUSTIFIED/NOT JUSTIFIED/NOT APPLICABLE>>
<<Current foreign component comment field>>

MP Sub-Proj: List overall application A/UN/N/A (same as single project) next to ftr with comments under it (if entered). If N/A for all then just put main N/A and skip sub-proj data, otherwise group sub-projs based on the A/UN/NA entered with comments listed under each grouping.

INTRAMURAL INVESTIGATOR(S)
Only list details for any Level-1 KP who have the NIH Intramural Inv checkbox marked, in the following format, in alpha order by KP Last Name:
Dr. <<KP First Name>> <<KP Last Name>>, <<Institution>>, is a <<Role>>. <<KP Comments>>
Next KP….

SCIENTIFIC REVIEW OFFICER’S ADMINISTRATIVE NOTES: NOT APPLICABLE IF no data is entered into field and BioNC is not marked and Authentication value is not Unacceptable-Missing or Unacceptable-Overstuffed. ELSE only add footer with field text & applicable comments below (N/A is not listed).
<<SRO Admin Note>>

If the BioSketch NC checkbox is marked, add the following default statement:
During the review of this application, reviewers and/or NIH staff noted that one or more biosketches did not comply with the required format (NOT-OD-15-032). An electronic notification has been sent to the Signing Official for this application, to ensure that future applications use the correct biosketch format. NIH has the authority to withdraw such applications from review or consideration for funding.

{If Authentication of B/C=Unacc-Missing add statement}
Applications submitted for due dates on or after January 25, 2016 are required to include a PDF attachment describing plans for authentication and/or validation of key biological and/or chemical resources that will be used in the proposed research study (see NOT-OD-16-011). Reviewers consider information provided in this attachment as part of their evaluation of the application. This attachment could not be assessed because it is missing from your application.

{If Authentication of B/C=Unacc-overstuffed add statement}
Applications submitted for due dates on or after January 25, 2016 are required to include a PDF attachment describing plans for authentication and/or validation of key biological and/or chemical resources that will be used in the research study (see NOT-OD-16-011). Information in this attachment must focus only on authentication and/or validation of key resources to be used in the study; all other methods and preliminary data must be included within the Research Strategy section. However, your application has excess information in the attachment. Applications identified as non-compliant with the attachment requirements can be withdrawn from the review process (NOT-OD-15-095). Instead, the decision was made to provide a one-time warning. Please note that future applications must comply with the content requirements for the attachment.

MP Sub-Proj: This footer only relates to the overall application level – no sub-projects.

<<Budgetary Overlap>>

MP Sub-Proj: List overall application N/A (if no comments entered, same as single project) next to ftr or list main application comments under it (if entered). Then list comments next to each sub-proj if entered.

COMMITTEE BUDGET RECOMMENDATIONS: <<Committee Budget Recommendations>>

List footer on all SS (SROs can remove if not wanted/used).

MP Sub-Proj: List overall application comments (if entered), same as single project. Then list comments next to each sub-proj if entered.

If staff put **** in front of an additional review criteria, that criteria and the related comments will appear here under a group hdr that lists the App/Sub-Project and Critique number.

Critique 1 (used for overall program project/single-project version)

Resubmission section and text…..

Additional Comments to the Applicant and text….

Project 1 Critique 1 (sub-project variation)

Resubmission section and text…..

Additional Comments to the Applicant and text….

Project 1 Critique 2

Resubmission section and text…..

Additional Comments to the Applicant and text….

Project 2 Critique 1

Resubmission section and comments….

Additional Comments to the Applicant and text….

Admin-Core Critique 1

Resubmission section and comments….

Additional Comments to the Applicant and text….