EVIDENCE TEAM REPORT CRITICAL THINKING & PROBLEM SOLVING ILO ASSESSMENT
SPRING 2012
Team Members:
Julia Raybould-Rodgers (English), Team Leader
Kathy Headtke (Library), Report Writer/Historian
Larry Manalo (Nursing)
Karen Tait (Mathematics)
Helen Talkin (Art History)
Peggy Warrick (Business)
Support:
Jennie Robertson, Learning Outcomes Analyst
Laurie Pemberton, Director, Institutional Research and Planning (IRP)
Carmela Vignocchi, Title V Director
Kelly Brune, KB Statistical Consulting
English Department Classified Staff
BACKGROUND
Allan Hancock College (AHC) has identified seven institutional learning outcomes (ILOs) that its students completing an associate's degree will achieve. The ILOs cover the areas of communication; critical thinking and problem solving; global awareness and cultural competence; information and technology literacy; quantitative literacy; scientific literacy; and personal responsibility and development. The requisite knowledge, skills, abilities, and attitudes to achieve these outcomes are integrated into the college's academic courses, programs, and student services.

In 2007, the U.S. Department of Education awarded Title V funds to the college for El Colegio de Aprendizaje: The Learning College Project for the purpose of assessing all seven ILOs. Assessing ILOs at AHC is part of our progress as an institution toward proficiency, and then toward sustainability, in student learning outcomes (SLOs), a requirement for effective educational institutions. Guidelines were established and defined by the Accrediting Commission for Community and Junior Colleges (ACCJC) in its Rubric for Evaluating Institutional Effectiveness; AHC has been charged with reaching the third level, Proficiency, by spring 2013. Evidence teams were included in the project as a method of assessing the ILOs, with a Title V benchmark for success of 70% of students meeting or exceeding college-level expectations by fall 2012.

Subsequently, the Learning Outcomes and Assessment Committee (LOAC) drafted an institutional assessment plan that provides for the collection and analysis of ILO data by interdisciplinary evidence teams comprised of faculty, administrators, students, and other relevant staff. LOAC selected the communication ILO as the first to be assessed, and in fall 2009 the college held a series of workshops for all faculty and staff on the communication ILO.
The communication ILO team findings, processes, and reports were presented to faculty and staff as a model for the assessment of the other six ILOs. In early fall 2011, groups of interdisciplinary faculty were formed to assess the remaining ILOs, including the critical thinking & problem solving evidence team.
This report provides the assessment conducted by the critical thinking & problem solving ILO evidence team, which will disseminate the results to LOAC and the Title V project.
PURPOSE
The critical thinking & problem solving ILO states: Upon graduation, Allan Hancock College students will be able to “explore issues through various information sources; evaluate the credibility and significance of both the information and the source to arrive at a reasoned conclusion.” The critical thinking & problem solving evidence team began meeting in September 2011 to examine the ILO. The responsibilities of the committee are as follows:
Develop the evidence research process
Identify courses for possible sources of student artifacts
Create the ILO rubric
Collect research evidence samples from faculty members
Score student artifacts using an ILO rubric
Engage in email group and individual correspondence/discussions
Scan evidence and other documents and place in the drop box
Synthesize and analyze the research process

METHODOLOGY

The critical thinking & problem solving ILO evidence team held five meetings during fall 2011 and nine meetings in spring 2012. The first step in assessing the critical thinking & problem solving ILO was to develop a rubric that could apply to a variety of disciplines and assignments. The assessment rubric was used by members of the evidence team to evaluate artifacts (student work) and determine the strength of Allan Hancock College students' critical thinking & problem solving skills. The team examined a number of rubric models and variations, finally agreeing on a four-point rubric with a rating scale that includes the levels “exceeds expectations,” “meets expectations,” and “fails to meet expectations,” plus a “no evidence” column with a value of 0. In addition, there is a not applicable (NA) column that indicates the dimension does not apply to a particular artifact/assignment. The critical thinking & problem solving ILO rubric developed by the team focused on the four dimensions listed below:
Explore or define issues, problems, or questions.
Identify or evaluate the credibility or significance of sources or information.
Apply critical thinking strategies for solving issues, problems, or questions.
Arrive at reasoned conclusions or solutions.
In addition, each dimension included several descriptors. Table 1 in Appendix A contains the
final rubric developed by the evidence team.
Direct evidence

Members selected two courses each based on the availability of data from two sources. The first was the mapping forms that were in the process of being uploaded to eLumen. The second source was conversations with faculty to determine whether faculty had artifacts that related
to the critical thinking & problem solving ILO. The artifacts assessed were from courses taught in spring 2011 or fall 2011. The team selected a diverse variety of classes to survey, including accounting, anthropology, automotive technology, biology, business, computer business information systems, chemistry, English, two different math courses, nursing, and sociology. A complete record of the twelve courses and artifact rankings is listed in the “Course Based Analysis of Directly Collected Evidence” in Table 1. For each course, a copy of the assignment instructions, a grading rubric (if available), and at least 20 examples of student work were collected. Ten student assignments were then randomly selected from each group of assignments by the responsible team member, for a total of 120 artifacts used in the assessment. Copies of the assignment instructions are included in Appendix B. Evidence team members assessed several course artifacts as a group, after which the rubric was modified to better apply to different types of assignments in various disciplines.
Subsequently, the member responsible for a particular class analyzed the assignment and the artifacts and systematically applied the rubric. At the following meeting, that member explained the assignment to the team, presented a possible strategy for analysis, and offered assessment recommendations, which the team then reviewed. This process worked well for most of the artifacts, except for two assignments where, after initial assessment by the group, team members had to consult again with the course instructor for advice on ways to identify or interpret rubric dimensions in relation to the assignment.
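As a minimal sketch of the sampling step described above, drawing ten artifacts at random from each course's pool of twenty or more could look like the following. The file names and pool sizes are hypothetical placeholders, not the team's actual artifacts.

```python
import random

# Hypothetical artifact pools per course (at least 20 collected for each);
# the file names below are placeholders, not the team's actual files.
pools = {
    "ACCT 160": [f"acct_artifact_{i:02d}.pdf" for i in range(22)],
    "ENG 103": [f"eng_artifact_{i:02d}.pdf" for i in range(25)],
}

random.seed(2012)  # fixed seed so the draw is reproducible
samples = {course: random.sample(pool, k=10) for course, pool in pools.items()}

for course, picks in samples.items():
    print(course, len(picks))  # 10 distinct artifacts sampled per course
```

Sampling without replacement (as `random.sample` does) guarantees that no artifact is rated twice within a course.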
Indirect evidence
The Community College Survey of Student Engagement (CCSSE) survey was conducted during spring 2010. A factor analysis was conducted with the CCSSE data to determine which questions were statistically related and could correspond to any of the college’s ILOs. Five CCSSE questions were determined to be related to the critical thinking & problem solving ILO (Appendix C).
RESULTS

Direct Evidence
Direct evidence for the critical thinking & problem solving ILO was collected in 12 courses. To assure a comparable assessment of the 12 different courses, the evidence team developed a rubric measuring four distinct dimensions of critical thinking and problem solving. For each of the 12 courses, a random sample of 10 direct-evidence artifacts (from 10 different students) was evaluated. The final scores are combined in Table 1. (Brune, Course Based Analysis of Directly Collected Evidence, Appendix D.)
            Explore or define    Identify or evaluate     Apply critical           Arrive at reasoned
            issues, problems,    the credibility or       thinking strategies      conclusions or
            or questions         significance of          for solving issues,      solutions
                                 sources or information   problems, or questions
COURSE      (3) (2) (1) (0)      (3) (2) (1) (0)          (3) (2) (1) (0)          (3) (2) (1) (0)
ACCT 160     7   0   2   1        6   0   4   0            3   1   4   2            4   0   5   1
ANTH 102     3   3   0   4        4   1   3   2            3   4   2   1               N/A
AT 130       1   8   1   0        5   4   1   0            6   2   2   0            4   3   3   0
BIO 150      3   7   0   0        3   6   1   0            3   6   1   0            2   6   2   0
BUS 106      6   4   0   0        4   4   2   0            6   3   1   0            6   2   2   0
CBIS 112     6   3   0   1        7   3   0   0            7   3   0   0            4   6   0   0
CHEM 150     9   1   0   0        1   7   2   0            6   1   3   0            9   1   0   0
ENG 103      5   5   0   0        3   6   1   0            4   4   2   0            5   5   0   0
MATH 105        N/A                  N/A                   8   1   0   1            2   0   7   1
MATH 321        N/A               5   4   1   0            6   3   1   0               N/A
NURS 109     9   1   0   0        7   3   0   0            7   3   0   0            7   3   0   0
SOC 102      6   3   1   0        6   3   1   0            6   4   0   0            5   5   0   0
Total       55  35   4   6       51  41  16   2           65  35  16   4           48  31  19   2

Table 1.
The rubric scores were defined as follows:
(0) – No evidence
(1) – Fails to meet expectations
(2) – Meets expectations
(3) – Exceeds expectations
If the dimension was not applicable to a specific artifact, a rating of ‘NA’ (not applicable) was given. (Brune, Course Based Analysis of Directly Collected Evidence, Appendix D.)
The results of the direct-evidence assessment for all courses using the critical thinking & problem solving ILO rubric are listed in Table 2; the percentages are derived from the totals in Table 1. For Dimension One, 100 artifacts from 10 courses were ranked, and 90% of students scored in the Exceeds or Meets Expectations categories, above the 70% threshold. For Dimension Two, 110 artifacts from 11 courses were ranked, and 83.7% scored Exceeds or Meets Expectations, above the 70% benchmark. Dimension Three had the highest number of artifacts ranked, with 120 samples from 12 courses; of these, 83.4% scored Exceeds or Meets Expectations. Last, for Dimension Four, 100 artifacts from 10 courses were scored, with 79% in the Exceeds or Meets Expectations rankings. All four dimensions scored above the 70% benchmark set by the Title V instructions to indicate proficiency in an ILO.
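The per-dimension rates can be recomputed directly from the Table 1 totals. The following check is a verification sketch, not part of the original analysis; it reproduces the reported figures to within 0.1 percentage point, with the small differences presumably reflecting rounding in the consultant's appendix.

```python
# Table 1 totals per dimension: (exceeds, meets, fails, no evidence).
totals = {
    "Dimension One": (55, 35, 4, 6),
    "Dimension Two": (51, 41, 16, 2),
    "Dimension Three": (65, 35, 16, 4),
    "Dimension Four": (48, 31, 19, 2),
}

for dim, (e, m, f, n) in totals.items():
    ranked = e + m + f + n                 # artifacts ranked (NA cells excluded)
    rate = 100 * (e + m) / ranked          # meets-or-exceeds percentage
    status = "above" if rate >= 70 else "below"
    print(f"{dim}: {e + m}/{ranked} = {rate:.1f}% ({status} the 70% benchmark)")
```

Run as written, this prints 90.0%, 83.6%, 83.3%, and 79.0% for the four dimensions, all above the 70% benchmark.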
Critical Thinking & Problem Solving ILO Course Assessment Results – Cumulative Scores

Dimension                                           Exceeds        Meets          Fails to Meet   No
                                                    Expectations   Expectations   Expectations    Evidence
Explore or define issues, problems,
or questions.                                       55.0%          35.0%           4.0%           6.0%
Identify or evaluate the credibility or
significance of sources or information.             46.4%          37.3%          14.5%           1.8%
Apply critical thinking strategies for
solving issues, problems, or questions.             54.2%          29.2%          13.3%           3.3%
Arrive at reasoned conclusions or
solutions.                                          48.0%          31.0%          19.0%           2.0%

Table 2. Percentages of ranked artifacts per dimension, computed from the totals in Table 1. * Percent totals may not equal 100 due to rounding.

Based on the joint assessment of all student artifacts in the 12 courses examined, the statistical analysis determined that, collectively, 84% of the students meet or exceed the expectations for the critical thinking & problem solving institutional learning outcome.
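The collective figure can be verified from the Table 1 totals. This is a check sketch, not part of the original analysis: 361 of the 430 ratings across the four dimensions fall at or above expectations.

```python
# Table 1 totals per dimension: (exceeds, meets, fails, no evidence).
dims = [(55, 35, 4, 6), (51, 41, 16, 2), (65, 35, 16, 4), (48, 31, 19, 2)]

met = sum(e + m for e, m, _, _ in dims)    # ratings at or above expectations
ranked = sum(sum(d) for d in dims)         # all ratings (NA cells excluded)
print(met, ranked, round(100 * met / ranked))  # 361 430 84
```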
As shown in Table 2, for all four dimensions of the rubric, the directly collected and assessed data show strong evidence of AHC students' proficiency in critical thinking and problem solving skills.
Course Based Analysis of Directly Collected Evidence

                                    Dimension 1:        Dimension 2:             Dimension 3:            Dimension 4:
                                    Explore or define   Identify or evaluate     Apply critical          Arrive at reasoned
                                    issues, problems,   the credibility or       thinking strategies     conclusions or
                                    or questions.       significance of          for solving issues,     solutions.
                                                        sources or information.  problems, or questions.
Students meeting or exceeding
college level expectations          90.0%               83.7%                    83.4%                   79.0%

Table 3. From Brune's “Course Based Analysis of Directly Collected Evidence.” Appendix D.
The percentage of students meeting or exceeding college level expectations surpasses the targeted 70% in all four dimensions. While Dimension One ranked higher than Dimensions Two, Three, and Four, each of the four dimensions ranked well above the target. The criterion for success for an institutional learning outcome identified by the Title V project is for 70% of AHC's students to meet or exceed expectations for each outcome; therefore, based on the findings of the evidence team and Institutional Research and Planning (IRP), this criterion has clearly been met for all four dimensions of the rubric.
Based on the artifacts assessed by the evidence team, there are no recommendations for the improvement of critical thinking & problem solving in the curriculum or services at Allan Hancock College.

CONCLUSIONS
The results of the Community College Survey of Student Engagement indicated that students' perception of their engagement in critical thinking & problem solving activities fell slightly below the national average, except in one dimension, “using information you have read or heard to perform a new skill” (Appendix C). The current review of artifacts, however, revealed that students met the Title V 70% benchmark for the same institutional learning outcome, critical thinking & problem solving.
The committee was able to apply an evidence-based model for ILO assessment. Although the artifacts assessed showed that the AHC students represented scored well above the 70% target level for the critical thinking & problem solving ILO, the team members have the following recommendations and observations:
RUBRIC DEVELOPMENT:
1. Recommendation: Create two separate rubrics, one for problem solving and one for critical thinking.
a. The design of the rubric was problematic, with two different areas of emphasis: critical thinking and problem solving. The team created a Not Applicable column because some artifacts did not address all dimensions.
2. Recommendation: Create an institutional policy on rubric development and standardize the procedure in LOAC. Write a Best Practices Guide for future ILO assessment.
a. There were unclear instructions regarding the development of a rating rubric. Available models of critical thinking & problem solving rubrics were inconsistent, and the best practice for rating scales was unclear.
EVIDENCE COLLECTION:
3. Recommendation: Choose artifacts mapped to the ILO.
a. The process of collecting artifacts required consideration of various factors, including accessible student work, faculty buy-in to the process, and student confidentiality. This posed issues regarding accessible software and demanded inordinate time and effort from faculty and English department staff, who scanned 500 pages of artifacts. The artifacts collected were not necessarily designed to meet this specific ILO.
b. Inter-rater reliability problems and other ranking errors among committee members were possible due to the nature of the courses involved in the assessment of artifacts. Issues related to the ranking process, subject expertise, definitions, and applications of the ILO occurred; these were addressed through reliance on faculty expertise (either as a member of the evidence team or as a contributor of student work/artifacts).
c. The mapping of course SLOs to the ILOs was incomplete at the start of this project, which slowed artifact collection.
4. Recommendation: Develop procedures that integrate qualitative data.
a. The use of quantitative versus qualitative methods was unclear. While the “Course Based Analysis of Directly Collected Evidence” (Appendix D) stressed the results of the quantitative data, a more balanced approach that also includes qualitative data may be valuable; there were nuances in the artifacts that warrant consideration.
5. Recommendation: Involve more faculty with subject expertise in the artifact assessment process.
a. There was a need for subject expertise to review artifacts.
6. Recommendation: Advise LOAC to address procedures on best practices in the ILO assessment cycle.
a. There was a lack of direction about best practices for ILO data collection.
FACULTY COOPERATION IN ASSESSMENT PROCESS:
7. Recommendation: Encourage faculty cooperation across the disciplines. Recognize and reward faculty efforts. Publicize ILO assessments earlier to inform faculty and allow them to prepare appropriate assignments.
a. There was an information gap among faculty about the purpose of ILO assessment, which made some faculty reluctant to give the team artifacts.
8. Recommendation: Present the results of the Critical Thinking & Problem Solving ILO assessment project at professional development activities to better instruct and involve faculty in the assessment development process.
a. There is a need to maintain the faculty-driven assessment process.
9. Recommendation: Establish a clear institutional policy on what artifacts will be used for and what the role of faculty is in ILO assessments.
a. During the artifact collection process, the team heard concerns from some faculty about issues of academic freedom and student confidentiality.
Instructor acknowledgement: Completion of this research would have been impossible without the talented instructors who shared artifacts with the team for this project. The instructors' dedication to genuine student learning is visible in their rigorous, thoughtful, practical, well-designed, and creative assignments.
References

Accrediting Commission for Community and Junior Colleges (ACCJC). 2011. Rubric for Evaluating Institutional Effectiveness. http://www.accjc.org/
APPENDICES

Appendix A: Final Rubric for ILO 2.
Appendix B: Assignment Instructions.
Appendix C: Allan Hancock College, Office of Institutional Research and Planning. 2011. “Results of the Community College Survey of Student Engagement Related to Allan Hancock College's Institutional Learning Outcomes.”
Appendix D: Brune, Kelly, KB Statistical Consulting. April 16, 2012. “Course Based Analysis of Directly Collected Evidence.”
Appendix E: Committee History. Meeting dates and meeting minutes.
APPENDIX A

Table 1. Final Rubric

CRITICAL THINKING & PROBLEM SOLVING ILO: Explore issues through various information sources; evaluate the credibility and significance of both the information and the source to arrive at a reasoned conclusion.

Examples of when students have demonstrated mastery of this ILO include, but are not limited to:
* Apply a variety of critical and creative strategies for solving complex problems.
* Generate and explore questions and arrive at reasoned conclusions.
* Synthesize ideas and information from various sources and media.
* Evaluate the credibility and significance of sources and material used as support or evidence.
* Identify assumptions, discern bias, and analyze reasoning and methods.
Dimension 1: Explore or define issues, problems, or questions.
Exceeds Expectations (3): Clearly and thoroughly defines the issues, problems, or questions.
Meets Expectations (2): Defines most of the components of the issues, problems, or questions.
Fails to Meet Expectations (1): Does not identify the components of the issues, problems, or questions.
No Evidence (0): No evidence of exploring or defining issues, problems, or questions.

Dimension 2: Identify or evaluate the credibility or significance of sources or information.
Exceeds Expectations (3): Thoroughly identifies or evaluates the credibility or significance of sources or information.
Meets Expectations (2): Adequately identifies or evaluates the credibility or significance of sources or information.
Fails to Meet Expectations (1): Poorly identifies or evaluates the credibility or significance of few or no sources or information.
No Evidence (0): No evidence of evaluating the credibility and significance of sources or information.

Dimension 3: Apply critical thinking strategies for solving issues, problems, or questions.
Exceeds Expectations (3): Accurately applies formulae, sources, procedures, or discipline principles.
Meets Expectations (2): Applies appropriate formulae, sources, or discipline principles with minor inaccuracies.
Fails to Meet Expectations (1): Labels formulae, sources, or discipline principles inappropriately or inaccurately, or omits them.
No Evidence (0): No evidence of applying critical thinking strategies for solving issues, problems, or questions.

Dimension 4: Arrive at reasoned conclusions or solutions.
Exceeds Expectations (3): Thoroughly and accurately describes conclusions or solutions.
Meets Expectations (2): Identifies conclusions or solutions with minor inaccuracies.
Fails to Meet Expectations (1): Inaccurately identifies or fails to present conclusions or solutions.
No Evidence (0): No evidence of arriving at reasoned conclusions or solutions.

*NA indicates that the dimension is not applicable for a specific artifact.
APPENDIX B

Course Assignment Instructions for ACCT 160, ANTH 102, AT 130, BIO 150, BUS 106, CBIS 112, CHEM 150, ENG 103, MATH 105, MATH 321, NURS 109, SOC 102.

Artifact: Acct 160 Assignment – Company Financial Statement Report

This course requires completion of a comprehensive project that analyzes a company's financial statements. Using chapter 6 of your text as a guide, you are to perform a thorough analysis of your assigned company's financial statements (using the latest SEC 10-K filing provided). See below for your assigned company (using your initials).

Ford: MA, DA, TB, KC, DC, EC, WD
GM: WE, SE, LE, TF, MF, DG
Apple: HG, LG, EH, KH, EH, MH
Target: JH, MI, FK, GK, IK, DM
Boeing: BP, JT, BV, JW, KW, EW

This project is worth 22% of your grade and is very open ended. The more thorough an analysis you do, the better your grade. Click the link provided below to see links to the .pdf file for your company. Create your project in Microsoft Word and submit one Microsoft Word file before the December 1 deadline. (The syllabus says December 5, but to get them graded on time I need you to submit them by December 1.) To upload your file: click the Browse button, locate your file, and then click Submit.

Artifact: Anth 102 Assignment – Social Aspects of Language
2. Imagine you’re an anthropologist and you are going to study the social aspects of language
within a given culture. Describe a social aspect of language that you would study and
discuss three methods of gathering data that you would use in your research. Provide an
example of each method of gathering data that you would use.
Artifact: AT 130 Assignment – Diagnostic Skills Quiz
AT 130 Diagnostic Skills Quiz

Diagnosing problems with the automobile involves many skills beyond making simple
observations. Remember that we want to identify the symptom, the system, the
component, and the cause. Symptoms are observable and, given the right conditions,
repeatable. Sometimes they are constant and sometimes intermittent. The system could
be the brakes, suspension, engine, etc. The component is the specific part that needs
adjusting or replacement. If possible, the cause should always be found in order to
prevent a reoccurrence of the problem. Sometimes the cause can be normal wear and
other times it can be traced to something happening to the vehicle. In order to find all of
these you must gather information from a variety of sources, analyze that information,
formulate a working theory, and verify your theory. Here is the scenario: You are working in a brake and front-end shop as a technician. One
morning you arrive to find a 2004 Ford Ranger in your stall. You pull the work order and
find the following under "Description of Problem; Customer hears noise coming from right
rear when brakes are applied."
Under "Work to be performed", the service writer has: "Diagnose complaint and
estimate labor and part cost of any needed repairs." You then visually inspect the vehicle to make certain it is road-worthy for a test drive. All
appears to be OK. You then realize that you are blocked in your stall by another car that is
having a smog-check done and cannot move the vehicle for at least 45 minutes. Not
wanting to waste time, you decide to talk to the service writer who wrote the work order.
You find out that the owner dropped the car off right before closing last night and was on
the way to the airport to leave on vacation this morning. After getting permission, you
phone the customer. When they answer and you tell them who you are, they explain that
they are in the airport and their flight is leaving in 2 minutes. What 3 questions would you
ask the customer? What are the reasons for asking each question?

1. Under what conditions did the noise occur?
2. Did the noise happen every time or was it intermittent?
3. Have you had any other work done on the car recently?
Artifact: Bio 150 Assignment – Fermentation, Formal Lab Write-Up

Title and Abstract (8 points)
Title: missing/inappropriate/vague. Missing: purpose, experiment methodology, overview, hypothesis, major results & conclusions. Not on a separate page. Not in past tense. Spelling/grammar errors. Use of “We, Us, I, You” (not in impersonal voice).

Introduction (6 points)
Unclear. Missing: purpose, hypothesis, background info. Not in past tense. Spelling/grammar errors. Use of “We, Us, I, You” (not in impersonal voice).

Materials and Methods (4 points)
Unclear/inaccurate. Missing materials/methods. Lacks organization. Not in past tense. Spelling/grammar errors. Use of “We, Us, I, You” (not in impersonal voice).

Results (6 points)
Missing data table, graph, title. Missing data, units. Analyzes data. Inaccurate/unclear/disorganized. Spelling/grammar errors.

Discussion (10 points)
Missing: analysis of all data, hypothesis, reference. Restates results without analysis. Missing: error possibilities, conclusion paragraph. Inaccurate, confused, disorganized. Not in essay style. Not in past tense. Spelling/grammar errors. Use of “We, Us, I, You” (not in impersonal voice). Missing literature cited, in-text and reference page.
Artifact: Bus 106 Assignment – Business Plan

Business Plan Assignment Rubric

Cover Page
* Name of the business
* Company logo
* Names of the owners (the group members)
* Date
* Any other information that you feel is pertinent

Table of Contents
* Includes each major section
* Includes page numbers
* Organized and neat

Executive Summary
* Each key section briefly summarized
* Specifically answers: What business are you in? Where will you locate and why? What is your product/service? How much capital will be needed? Why should someone invest in your company?
* Length is one to three pages

Business Description
* Name of the person who is the point of contact
* Basic information (mailing address, phone number, website, email, etc.)
* Legal form (partnership, corporation, etc.); be specific
* Mission statement
* Goals & objectives
* Overview of operating history or how the venture was conceived

Product or Service Description
* Give sufficient detail to enable investors to develop a working understanding of what the company sells
* May include photographs, schematic drawings, and descriptive scenarios
* Describe protection from competitors
* Focus on the benefits of each feature (a feature is what the product/service is; a benefit is what the product/service does)

Competitor Analysis
* Review of the firm's top competitors & their relative market share
* Examine markets that competitors serve & the strategies they employ
* Describe barriers to entry (characteristics of an industry that make it difficult to start new ventures) and articulate strategies for overcoming them

Market Analysis
* Provide insights into the target market
* Clearly identify the target market: Who are your customers? (demographics, psychographics, geography)
* How large is the potential market? Are customers' needs changing? Are sales seasonal? Is demand tied to another product or service?

Marketing Plan
* Describe advertising & promotion campaigns
* Media used (reader, viewer, listener profiles), media costs, frequency of usage
* Plans for generating publicity
* Web presence
* Budget for marketing plan

Location & Layout
* Location: demographic analysis of location versus target customer profile; traffic; lease/rental rates; labor needs & supply; wage rates
* Layout: size requirements; Americans with Disabilities Act compliance; ergonomic issues; layout plan (suitable for an appendix)

Management Team
* Emphasize relevant experience that each team member brings to the enterprise (experience, education, duties)
* List & highlight strengths of other stakeholders who will contribute management or consulting expertise: board of directors, advisory board, legal counsel, or other professional service firms such as accounting, management, etc.

Financial Projections
* Accurate, correct format, consistent, realistic; should be conservative but achievable, reasonable yet compelling
* Sales forecast: list the product(s)/service(s) to be sold, the price at which they will be sold, and the number of units of each to be sold each month
* Financial statements (the first year): income statement (profit & loss statement), cash flow statement, balance sheet
* Breakeven analysis

Loan or Investment Proposal
* Amount requested
* Purpose and uses of funds
* Repayment or "cash out" schedule (exit strategy)
* Timetable for implementing plan and launching business

Appendices
The appendices (beginning on a separate sheet at the end of your plan) can include pertinent information about yourself and your business that is not included elsewhere in the plan: management structure, organization chart(s), and resumes; brochures or other published information describing the product(s) and service(s) you provide; layout schematic; details of objectives and goals; catalog sheets, photographs, or technical information.
Artifact: CBIS 112 – Create a Grade Calculator

Mid-Term Exam – Part 2 – Visual Basic Project

Project Overview
This project is assigned as part of your mid-term exam. In order to complete your mid-term exam, you need to:
1. Complete and submit this project in all its parts
2. Complete and submit the mid-term exam in Blackboard

Project – Chapter 4 – Case Programming Assignment #5 – Grade Calculator – page 286
Follow the instructions in the chapter to complete the sample program. Make sure to submit all the files requested below to the instructor.
1. Create a folder for your midterm project and save all the files for the project there
2. Create a use case definition for the program; call it Use_Case_Grade_Calculator.rtf
3. Create an event planning document for the mid-term project; call it Event_Planning_Case_Grade_Calculator.rtf
4. Compress the mid-term project folder with the:
   a. Visual Basic project
   b. Use case documentation
   c. Event planning documentation
5. Submit the compressed folder with all your work for the midterm project to the Blackboard\Mid-Term folder, Project Assignment module
Artifact: Chem 150 Assignment – Experiment #6, Lab Report

Title Written – 5 pts
Objectives (written in paragraph form; -5 if just copied) – 5 pts
PreLab – 30 pts
  Balanced equation – 10 pts
  Gas law calculation – 10 pts
  Gas stoichiometry calculation – 10 pts
Theory – 10 pts
  Combined gas law – 10 pts
Procedure – 30 pts
  Cite laboratory manual – 5 pts
  Summary of procedural steps – 10 pts
  Specific changes made to experiment – 5 pts
  Picture of apparatus – 10 pts
Data – 40 pts
  Gravimetric data – 20 pts
  Volumetric data – 20 pts
Data Analysis/Calculations – 105 pts
  Gravimetric calculations with 3 trials (correct sig figs, average, and StDev reported) – 50 pts
  Volumetric calculations with 3 trials (correct sig figs, average, and StDev reported) – 50 pts
  Within 5% of unknown value – 5 pts
Summary Table of Results – 20 pts
  Results from parts A and B (10 points each table shown in lab manual) – 20 pts
Conclusions – 40 pts
  Discussion of container not being airtight – 10 pts
  Discussion of how this would affect gravimetric and volumetric calculations – 10 pts
  Discussion of confidence between both methods – 20 pts
Artifact: English 103 Assignment
Paper Four – English 103

Paper four:
* Eight to eleven pages
* Research paper, which must use eight to eleven outside sources

While you may choose your own topic, you must get my approval, and this means your topic must be researchable, and you must have a thesis that is arguable. This means you must choose a point on which reasonable people can disagree.
In order for you to have enough time to do sufficient research and to be ready when it comes time to write the paper, there are some preliminary assignments.
You will need an annotated bibliography and note cards. Each of these assignments has been divided into two parts.
An annotated bibliography includes the citation material (author, title, publisher, dates, etc.) and a brief paragraph explaining the source. There is a sample on Blackboard.
Note cards are a place to list individual pieces of information you may want do your paper.
There is a sample on Blackboard. Have a works cited page. The annotated bibliography is not a
works cited page.
Artifact: Math 105 Assignment –Math Quiz
1. (Each part is worth 2 points.) Without computing each sum, answer the questions.

O = 1 + 3 + 5 + 7 + … + 97
E = 2 + 4 + 6 + 8 + … + 98
P = 1 + 3 + 5 + 7 + … + 99

a) Which is greater, O or E? ________ By how much? ________
b) Which is greater, P or E? ________ By how much? ________
2. (2 points) The rule for the table is that the numbers in each row and column must add up
to the same number. Fill in the missing numbers.
6
9
8 3 10
Artifact: Math 321 Assignment (only problem 13 was assessed)
12. A 13-ft ladder is placed against the side of a building with its foot 5 ft from the base of the
building. How far above the ground will the ladder touch the building?
13. Using Figure 4, where D is the midpoint of AC and E is the midpoint of BC:
a. Find m( ).
b. What theorem(s) did you use to answer part (a)?
c. Find m( ).
d. What theorem(s) did you use to answer part (c)?
e. Find m( ).
f. What theorem(s) did you use to answer part (e)?
14. a. Find x in Figure 5.
b. Find x in Figure 5.
Artifact: Nurs 109 Assignment – Respiratory Failure Quiz
Respiratory Failure
Thomas Bach, a 72-year-old farmer, is brought to the ED by private car in acute
respiratory distress. His wife gives the following history: Mr. Bach has a 2 PPD smoking
history for over 40 years and was diagnosed with COPD 15 years ago. 20 yr. history of
Diabetes Type II. 16 yr history of CAD and had an MI 5 years ago. He is currently on the
following meds: Digoxin 0.25 mg daily, Lasix 40 mg daily, Slow-K 2 tabs daily, Advair
100/50 inhaled twice a day and Albuterol inhaler as needed. No known drug allergies.
About two weeks ago, Mr. Bach developed anorexia and dyspepsia, which has increased
steadily. Three nights ago he woke up with a sudden onset of severe shortness of breath,
which was relieved when he sat up in a chair. He has gained 6 pounds over the last two days
despite his loss of appetite. His legs have become swollen and he is concerned about a
decreasing urine output over the past 48 hours.
Assessment overview: Mr. Bach is an elderly gentleman who appears to be in poor health.
His complexion is pale and his lips are dark. His arms and legs are thin with noticeable
edema in his lower legs. He is sitting upright in bed, leaning forward with his hands on his
knees. His respirations are labored with a loud audible expiratory wheeze. He is alert to
person and place but is not sure of the day of the week.
VS: BP = 160/90, P = 114 slightly irregular, R = 34 and labored, T = 97.2 orally, SpO2 = 80%

HEAD: Coloring pale. Mucous membranes, earlobes, tongue, and conjunctiva are cyanotic.
He is oriented to name and place and is cooperative. Positive ND is present while he is
sitting up.
CHEST: Breath sounds are distant and difficult to distinguish. Fine crackles are
auscultated over bilateral bases, with coarse sounds and expiratory wheezes heard
centrally. The A/P diameter of the chest is approximately 1:1 with little chest movement
noted during breathing. Heart sounds are distant, rhythm somewhat irregular.
ABDOMEN: Abdomen distended with complaint of discomfort when the upper right
quadrant is palpated. Bowel sounds are present but hypoactive. No scars are noted on the
abdomen.
EXTREMITIES: Legs are hairless below the knees, with patchy brown areas noted. A
healed ulceration is noted on L ankle. 4+ pitting edema is noted bilaterally on the lower
legs. He has palpable radial pulses but pedal pulses cannot be palpated due to the
edema.
SKIN: Skin warm, dry, and flaky. No areas of breakdown are noted. His general coloring
and nail beds are dusky. His capillary refill is approximately 4 seconds. Mr. Bach is taken to
an ED room.
1. What initial nursing actions are appropriate for Mr. Bach at this time? Include
procedures to be done, appropriate communication and bedside nursing activities.
2. Interpret Mr. Bach’s ABGs. Explain how the ABG and other lab values relate to Mr. Bach’s clinical condition.
3. Interpret the latest blood gas.
4. You notify the physician of the results. He orders the FiO2 to be decreased to 70%, the TV to 500, and the rate to 12. What ABG values would you expect to see changed as a result of these changes?
5. Discuss what the goals might be for the blood gas values (you do not have to give specific numbers, but do indicate whether it is reasonable to expect to fully normalize all values). Consider his history of COPD when answering.
Artifact: Soc 102 Assignment- Sick Around the World Essay
Final Exam Information

Answer the question from a sociological perspective: use the evidence and concepts from the text and any relevant articles/lectures to support your answer, and cite/annotate your usage.

Chapter ten, SICK AROUND THE WORLD, and the Blackboard articles: Based on the film, compare and contrast two countries profiled in the film with the United States. Use the textbook, film, and any other lecture materials in your analysis.
1. Summarize and discuss your findings.
2. What might the US learn from these two countries?
3. What structures or policies would you recommend the US adopt? Why? What policies or structures would not work in the US? Why?

Minimum requirement: Use and cite the textbook (at least 4 in-text citations), film (at least 4 in-text citations), and at least one related class article (at least 3 in-text citations). Clearly address each numeric section of this question. The "take home" question is worth 15 points.
• Cite the textbook by page number
• Cite the film by time
• Cite the article(s) by paragraph number
APPENDIX C
Results of the Community College Survey of Student Engagement Related to Allan Hancock
College’s Institutional Learning Outcomes
In spring 2010, Allan Hancock College (AHC) administered the Community College Survey of Student
Engagement (CCSSE). One use of the information has been to examine the questions included with
respect to AHC’s Institutional Learning Outcomes (ILOs). This report summarizes related findings.
Factor Analysis
During spring 2011, Dr. Kelly Brune conducted a factor analysis with the CCSSE data to determine
which questions appeared to be statistically related and if those relationships might correspond to AHC’s
Institutional Learning Outcomes. Below is a chart with five factors that were identified through this
analysis, indicating only those questions that also met the following criteria:
- Group selection was based on results of the factor analysis
- Only questions that showed a high loading for one factor and small loadings for all the other
factors were kept
- Question content had to be related to the Institutional Learning Outcome
- Factor 5 (QUANT) from the original factor analysis was excluded as it did not produce any
strong factor loadings
- Questions were put into groups according to their factor loadings and each factor was assigned to
an ILO according to initial question-ILO assignment that was done before the analysis
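The selection rule described above can be sketched in code. The following Python snippet is an illustration only, not the procedure Dr. Brune used: the 0.40 "high" and 0.30 "small" cutoffs are assumed for the sketch, and the loading rows are copied from the table below.

```python
# Illustrative sketch of the item-selection rule: keep an item only if it
# loads highly on exactly one factor and weakly on all others.
# The 0.40 / 0.30 thresholds are assumptions for illustration.

# A few loading rows copied from the CCSSE factor analysis table
# (Factors 1, 2, 3, 4, 6).
LOADINGS = {
    "ANALYZE":  [0.12, 0.68, 0.14, -0.14, 0.09],
    "FACGRADE": [0.11, 0.17, 0.51, -0.19, 0.14],
    "GNETHICS": [0.76, 0.06, 0.11, -0.24, -0.02],
    "INTERNET": [-0.05, 0.14, 0.16, -0.19, 0.42],
}

def assign_factor(loadings, high=0.40, small=0.30):
    """Return the 0-based factor index if the item loads >= `high` (in
    absolute value) on one factor and < `small` on all others; else None."""
    abs_loads = [abs(x) for x in loadings]
    best = max(range(len(abs_loads)), key=abs_loads.__getitem__)
    if abs_loads[best] >= high and all(
            v < small for i, v in enumerate(abs_loads) if i != best):
        return best
    return None

assignments = {item: assign_factor(l) for item, l in LOADINGS.items()}
print(assignments)
```

Applied to these rows, the rule assigns ANALYZE to the second factor, FACGRADE to the third, GNETHICS to the first, and INTERNET to the last, matching the groupings in the table.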
CCSSE Question Groups that Match to ILOs
ITEM ITEMNAME FACTOR 1 FACTOR 2 FACTOR 3 FACTOR 4 FACTOR 6
04l FACGRADE 0.11 0.17 0.51 -0.19 0.14
04m FACPLANS 0.21 0.10 0.51 -0.12 -0.01
04n FACIDEAS 0.10 0.14 0.52 -0.21 -0.02
04f CLASSGRP 0.08 0.09 0.43 -0.04 0.11
04g OCCGRP 0.10 0.10 0.56 -0.05 0.11
05b ANALYZE 0.12 0.68 0.14 -0.14 0.09
05c SYNTHESZ 0.22 0.69 0.16 -0.20 0.08
05d EVALUATE 0.13 0.66 0.14 -0.22 0.03
05e APPLYING 0.19 0.75 0.15 -0.17 0.09
05f PERFORM 0.27 0.57 0.19 -0.07 0.07
12j GNSELF 0.73 0.05 0.03 -0.24 0.00
12l GNETHICS 0.76 0.06 0.11 -0.24 -0.02
12m GNCOMMUN 0.66 0.06 0.19 -0.16 -0.05
12b GNWORK 0.56 0.15 0.18 0.02 0.12
12h GNOTHERS 0.69 0.13 0.24 -0.04 0.14
12i GNINQ 0.67 0.16 0.03 -0.10 0.13
12n CARGOAL 0.69 0.09 0.22 -0.08 0.08
12o GAINCAR 0.68 0.03 0.21 -0.08 0.04
SF03 COLLQ1415 0.14 0.18 0.04 -0.65 0.12
SF05 COLLQ1417 0.24 0.16 0.11 -0.47 0.07
SF02 COLLQ1414 0.07 0.13 0.23 -0.59 0.11
SF04 COLLQ1416 0.17 0.14 0.07 -0.64 0.13
SF08 COLLQ1123 0.28 0.05 0.17 -0.47 0.04
04j INTERNET -0.05 0.14 0.16 -0.19 0.42
09g ENVCOMP 0.28 0.06 -0.01 -0.05 0.59
SF14 COLLQ1129 0.17 0.04 0.03 -0.20 0.48
Assigned ILO PERS CRIT COMM GLOBAL INFO
Results of Questions Related to the ILOs
Given the above groupings, the next portion of this summary report provides information about AHC’s
responses to each of the identified questions. For each related CCSSE question, AHC’s frequency
distribution and mean are provided, along with the national mean based on responses from all community
colleges that participated in 2010. The “SF” items were special focus questions added by Allan Hancock
College, and do not have national comparisons.
Communication
“Communicate effectively using verbal, visual and written language with clarity and purpose in
workplace, community and academic contexts.”
Question (AHC responses: Never / Sometimes / Often / Very Often) – AHC Mean | National Mean

04l. Discussed grades or assignments with an instructor: 132 (16%) / 273 (33%) / 233 (28%) / 182 (22%) – 2.33 | 2.54
04m. Talked about career plans with an instructor or advisor: 272 (32%) / 381 (45%) / 128 (15%) / 61 (7%) – 1.97 | 2.04
04n. Discussed ideas from your readings or classes with instructors outside of class: 456 (54%) / 281 (34%) / 69 (8%) / 31 (4%) – 1.61 | 1.74
04f. Worked with other students on projects during class: 104 (12%) / 319 (38%) / 292 (35%) / 128 (15%) – 2.53 | 2.48
04g. Worked with classmates outside of class to prepare class assignments: 322 (38%) / 302 (36%) / 157 (19%) / 63 (7%) – 1.95 | 1.89
Critical Thinking and Problem Solving
“Explore issues through various information sources; evaluate the credibility and significance of both the
information and the source to arrive at a reasoned conclusion.”
Question (AHC responses: Very Little / Some / Quite A Bit / Very Much) – AHC Mean | National Mean

5b. Analyzing the basic elements of an idea, experience, or theory: 62 (7%) / 219 (26%) / 363 (43%) / 195 (23%) – 2.82 | 2.86
5c. Synthesizing and organizing ideas, information, or experiences in new ways: 80 (10%) / 256 (31%) / 340 (41%) / 159 (19%) – 2.69 | 2.73
5d. Making judgments about the value or soundness of information, arguments, or methods: 123 (15%) / 277 (33%) / 282 (34%) / 157 (19%) – 2.56 | 2.57
5e. Applying theories or concepts to practical problems or in new situations: 101 (12%) / 277 (33%) / 295 (35%) / 165 (20%) – 2.63 | 2.67
5f. Using information you have read or heard to perform a new skill: 65 (8%) / 256 (30%) / 323 (38%) / 199 (24%) – 2.78 | 2.78
Global Awareness and Cultural Competence
“Respectfully interact with individuals of diverse perspectives, beliefs and values being mindful of the
limitation of your own cultural framework.”
Question (AHC responses: Never/Not Very / Sometimes/Somewhat / Often/Quite A Bit / Very Often/Extremely) – AHC Mean | National Mean

SF03. In your experience at this college during the current school year, about how often have you examined the strengths and weaknesses of your own views on a topic or issue? 69 (10%) / 267 (35%) / 272 (36%) / 142 (19%) – 2.64 | n/a
SF05. In your experience at this college during the current school year, about how often have you learned something that changed your viewpoint about an issue or concept? 84 (11%) / 307 (41%) / 256 (34%) / 101 (14%) – 2.50 | n/a
SF02. In your experience at this college during the current school year, about how often have you included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or assignments? 135 (18%) / 283 (38%) / 193 (26%) / 129 (17%) – 2.42 | n/a
SF04. In your experience at this college during the current school year, about how often have you tried to better understand someone else’s views by imagining how an issue looks from his or her perspective? 71 (10%) / 247 (33%) / 268 (35%) / 167 (22%) – 2.70 | n/a
SF08. How much does this college include multicultural issues in your coursework? 146 (19%) / 303 (41%) / 210 (28%) / 84 (12%) – 2.32 | n/a
Information and Technology Literacy
“Define what information is needed to solve a real-life issue then use appropriate technologies to locate,
access, select and manage the information.”
Question (AHC responses: Never/Very Little/Strongly Disagree / Sometimes/Some/Disagree / Often/Quite A Bit/Agree / Very Often/Very Much/Strongly Agree) – AHC Mean | National Mean

04j. Used the Internet or instant messaging to work on an assignment: 76 (9%) / 188 (23%) / 268 (32%) / 297 (36%) – 2.95 | 2.91
09g. Using computers in academic work: 61 (8%) / 141 (17%) / 288 (35%) / 323 (40%) – 3.07 | 3.14
SF14. In your experiences at this college during the current school year, when appropriate, faculty effectively incorporate the use of computers and other technology into their teaching: 24 (4%) / 82 (11%) / 339 (48%) / 261 (37%) – 3.18 | n/a
Personal Responsibility and Development
“Take the initiative and responsibility to assess your own actions with regard to physical wellness,
learning opportunities, career planning, creative contribution to the community and ethical integrity in the
home, workplace and community.”
Question (AHC responses: Very Little / Some / Quite A Bit / Very Much) – AHC Mean | National Mean

12j. Learning effectively on your own: 148 (18%) / 215 (27%) / 277 (34%) / 171 (21%) – 2.58 | 2.62
12l. Developing a personal code of values and ethics: 183 (23%) / 247 (30%) / 250 (31%) / 131 (16%) – 2.37 | 2.38
12m. Contributing to the welfare of your community: 307 (38%) / 254 (32%) / 164 (20%) / 80 (10%) – 2.02 | 2.02
12b. Acquiring job or work-related knowledge and skills: 170 (21%) / 238 (30%) / 236 (29%) / 163 (20%) – 2.49 | 2.58
12h. Working effectively with others: 73 (9%) / 269 (33%) / 306 (38%) / 162 (20%) – 2.69 | 2.75
12i. Learning effectively on your own: 74 (9%) / 204 (25%) / 327 (40%) / 205 (25%) – 2.82 | 2.92
12n. Developing clearer career goals: 139 (17%) / 207 (26%) / 262 (33%) / 198 (25%) – 2.64 | 2.68
12o. Gaining information about career opportunities: 140 (17%) / 219 (27%) / 267 (33%) / 182 (23%) – 2.61 | 2.56
Next Steps
The Office of Institutional Research and Planning suggests that the next steps in this project are for
discussions to occur regarding how students responded to the various questions related to AHC’s ILOs,
and what the College should consider setting as target levels for future perceptions. If there are gaps
between current perceptions and expected outcomes, then discussions should consider what changes
might impact student learning and behavior.
The College might benefit from developing its own survey instrument to use in the future rather than
administering CCSSE again. Questions should be designed that relate closely to the Institutional Learning
Outcomes, since the current need is to measure student perceptions about whether ILO expectations are
being achieved.
APPENDIX D
Course Based Analysis of Directly Collected Evidence

From: Kelly Brune, KB Statistical Consulting
To: Critical Thinking & Problem Solving Evidence Team
Date: May 16, 2012
RE: Analysis of Critical Thinking & Problem Solving ILO collected evidence
Background:
Direct evidence for the ILO “Critical Thinking & Problem Solving” was collected in 12 courses. To
assure a comparable assessment of the 12 different courses, 4 separate rubrics measuring 4
different dimensions of critical thinking and problem solving were developed by the evidence
team. For each of the 12 courses, a random sample of 10 direct evidence artifacts (from 10
different students) was evaluated. The final scores are combined in Table 1.
COURSE – rating counts shown as (3)/(2)/(1)/(0) for each of the four dimensions: [D1] Explore or define issues, problems, or questions; [D2] Identify or evaluate the credibility or significance of sources or information; [D3] Apply critical thinking strategies for solving issues, problems, or questions; [D4] Arrive at reasoned conclusions or solutions.

ACCT 160 – D1: 7/0/2/1; D2: 6/0/4/0; D3: 3/1/4/2; D4: 4/0/5/1
ANTH 102 – D1: 3/3/0/4; D2: 4/1/3/2; D3: 3/4/2/1; D4: N/A
AT 130 – D1: 1/8/1/0; D2: 5/4/1/0; D3: 6/2/2/0; D4: 4/3/3/0
BIO 150 – D1: 3/7/0/0; D2: 3/6/1/0; D3: 3/6/1/0; D4: 2/6/2/0
BUS 106 – D1: 6/4/0/0; D2: 4/4/2/0; D3: 6/3/1/0; D4: 6/2/2/0
CBIS 112 – D1: 6/3/0/1; D2: 7/3/0/0; D3: 7/3/0/0; D4: 4/6/0/0
CHEM 150 – D1: 9/1/0/0; D2: 1/7/2/0; D3: 6/1/3/0; D4: 9/1/0/0
ENG 103 – D1: 5/5/0/0; D2: 3/6/1/0; D3: 4/4/2/0; D4: 5/5/0/0
MATH 105 – D1: N/A; D2: N/A; D3: 8/1/0/1; D4: 2/0/7/1
MATH 321 – D1: N/A; D2: 5/4/1/0; D3: 6/3/1/0; D4: N/A
NURS 109 – D1: 9/1/0/0; D2: 7/3/0/0; D3: 7/3/0/0; D4: 7/3/0/0
SOC 102 – D1: 6/3/1/0; D2: 6/3/1/0; D3: 6/4/0/0; D4: 5/5/0/0
Total – D1: 55/35/4/6; D2: 51/41/16/2; D3: 65/35/16/4; D4: 48/31/19/2

Table 1
The rubric scores were defined as follows:
(0) – No evidence
(1) – Fails to meet expectations
(2) – Meets expectations
(3) – Exceeds expectations,
where expectations refer to college level expectations. If the dimension was not applicable to a specific artifact, a rating of ‘NA’ was given.

[Figure 1: bar chart of rating counts for “Explore or define issues, problems, or questions” – no evidence: 6; fails to meet expectations: 4; meets expectations: 35; exceeds expectations: 55]
Assumptions:
A random sample of students is not practical to analyze ILOs; therefore, cluster sampling is used
in this study with clusters being courses at AHC. All assumptions related to cluster sampling
apply to this study. Another assumption in the analysis of ILOs is that the courses included in
the study are a representative sample of all the courses at AHC that are linked to the ILO being
assessed, in this case Critical Thinking and Problem Solving. Also, it is assumed that instructors
randomly selected the student artifacts for assessment for each course and that this sample is
representative of the population of credit students at AHC.
Method of Analysis:
Since this is the first analysis of the Critical Thinking and Problem Solving ILO,
analyses that consider time as a variable are not needed here. In the future, to assess how
students are progressing in critical thinking and problem solving over the years, it may be more
appropriate to consider analyses that assess performance over time. Given that the criteria for
success identified by the Title V project are that 70% of students meet or exceed college level
expectations, simple analyses of central tendency, frequencies, and percentages provide
adequate analyses. The artifact ratings are considered to be scaled, meaning that the
difference between 1 and 2 is the same as the difference between 2 and 3, so that means and
standard deviations may be calculated.
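As a concrete illustration of these simple analyses, the mean, standard deviation, and percentage meeting expectations for a dimension can be recovered directly from the rating counts in Table 1. The Python sketch below is illustrative, not the team's actual computation; it assumes the reported standard deviation is the sample (n-1) form, and reproduces the figures for the "Explore" dimension.

```python
from statistics import mean, stdev

# Rating counts for the "Explore or define issues, problems, or
# questions" dimension, taken from Table 1's totals:
# 55 ratings of 3, 35 of 2, 4 of 1, and 6 of 0.
ratings = [3] * 55 + [2] * 35 + [1] * 4 + [0] * 6

avg = mean(ratings)                      # average rating
sd = stdev(ratings)                      # sample standard deviation
pct_meeting = sum(r >= 2 for r in ratings) / len(ratings)

print(round(avg, 2), round(sd, 2), pct_meeting)   # 2.39 0.83 0.9
```

The output matches the report's figures for this dimension: a mean of 2.39, a standard deviation of 0.83, and 90% of artifacts meeting or exceeding college level expectations.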
Results:
The results are presented separately for each dimension as well as side-by-side so that the
Learning Outcomes & Assessment Committee may examine each dimension on its own, or
compare the dimensions of critical thinking and problem solving. First, each dimension’s
analysis is detailed.
The first dimension is “Explore or
define issues, problems, or
questions.” There were 100 ratings
from 10 courses to assess this
dimension of critical thinking and
problem solving. Figure 1 shows the
percentages and numbers of ratings
given for each rating category (i.e. no
evidence, fails to meet expectations,
[Figure 2: bar chart of rating counts for “Identify or evaluate the credibility or significance of sources or information” – no evidence: 2; fails to meet expectations: 16; meets expectations: 41; exceeds expectations: 51]

[Figure 3: bar chart of rating counts for “Apply critical thinking strategies for solving issues, problems, or questions” – no evidence: 4; fails to meet expectations: 16; meets expectations: 35; exceeds expectations: 65]
meets expectations, and exceeds expectations). Ninety (90%) of the students met or exceeded
college level expectations. Looking back at Table 1, ANTH 102 was a major contributor to the
lower ratings compared to other courses with 40% of students having ‘no evidence’ ratings.
Half of the courses examined for this dimension did not have any ‘fails to meet expectations’ or
‘no evidence’ ratings.
For the second dimension, “Identify
or evaluate the credibility or
significance of sources or
information,” there were 110 ratings
from 11 different courses. Figure 2
shows that 92 (83.7%) artifacts met or
exceeded college level expectations
for this dimension. Again, ANTH 102
contributed a greater percentage of
lower ratings than other courses, but
this difference was not as large as the
first dimension. Two of the courses
examined did not have any ‘fails to
meet’ or ‘no evidence’ ratings for this dimension.
The third dimension is “Apply critical thinking strategies for solving issues, problems, or
questions.” Team members rated
samples from twelve courses for a
total sample size of 120. One
hundred (83.4%) artifacts met or
exceeded college level expectations.
For this dimension, ACCT 160 had a
greater proportion of lower ratings
than other courses with 60% of
artifacts receiving ratings of 0 or 1.
Three of the 12 courses examined
had no ratings of 0 or 1.
[Figure 4: bar chart of rating counts for “Arrive at reasoned conclusions or solutions” – no evidence: 2; fails to meet expectations: 19; meets expectations: 31; exceeds expectations: 48]
There were 10 courses totaling 100
artifact ratings for the dimension,
“Arrive at reasoned conclusions or
solutions.” Seventy-nine (79.0%)
artifacts met or exceeded college level
expectations. ACCT 160 and MATH 105
had the greatest proportion of lower
ratings, with 60% and 80% of artifacts
with ratings of 0 or 1, respectively. Half
of the courses assessed for this
dimension did not have ratings of 0 or 1.
Figure 5 provides a look at all critical thinking and problem solving dimensions. The red line at
70% shows the criteria for success identified by the Title V project. The figure shows that all
dimensions exceed these criteria with the lowest percentage (79.0%) for the ‘Conclude’
dimension and the highest percentage (90.0%) for the ‘Explore’ dimension.
[Figure 5: stacked bar chart, “Critical Thinking and Problem Solving Rating Percentages by ILO Dimension.” Exceeds / meets / fails to meet / no evidence – Explore: 55.0% / 35.0% / 4.0% / 6.0%; Identify: 46.4% / 37.3% / 14.5% / 1.8%; Apply: 54.2% / 29.2% / 13.3% / 3.3%; Conclude: 48.0% / 31.0% / 19.0% / 2.0%. A red reference line marks the 70% target for students meeting or exceeding college level expectations.]
[Figure 6: “Normal Distributions of Ratings by Dimension” – a standard normal curve (mean 0, standard deviation 1) overlaid with normal curves fitted to the ratings for Dimensions 1-4]
Another way to view the data is by creating distributions to examine the means and standard
deviations in graphical form. Figure 6 shows a normal distribution with a mean of 0 and a
standard deviation of 1 (the black curve). If attending has an effect on the critical thinking and
problem solving skills of the general student body, the distributions of the various dimensions
should shift toward the right of the black initial curve. In addition, one would expect that the
dimension curves be narrower than the initial curve, showing that the variation between
students is smaller than the theoretical general population.
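The intuition that a narrower distribution has a taller, more concentrated peak can be checked with the normal density formula. The sketch below is illustrative only (the report's curves were produced separately); it compares the standard normal peak with the peak of a curve using the "Explore" dimension's reported mean (2.39) and standard deviation (0.83).

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (
        sigma * math.sqrt(2 * math.pi))

# Peak height occurs at the mean and equals 1 / (sigma * sqrt(2*pi)),
# so a smaller standard deviation gives a taller peak, i.e. ratings
# clustered more tightly than the theoretical general population.
standard_peak = normal_pdf(0.0)                       # sigma = 1
explore_peak = normal_pdf(2.39, mu=2.39, sigma=0.83)  # Explore dimension

print(round(standard_peak, 3), round(explore_peak, 3))   # 0.399 0.481
```

Because 0.83 < 1, the fitted curve peaks higher than the standard normal, which is exactly the narrowing described in the text.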
Conclusions:
Students seem to perform almost equally well when comparing the dimension ratings of critical
thinking and problem solving. This equality of performance is most evident when looking at
Figure 6. Not only are the peaks of the normal distribution lines at almost the same point,
indicating that the means are almost the same (also see Table 2), but the widths of the distributions
are also very similar, indicating that the students scored similarly across dimensions (i.e. the scores
across dimensions have similar standard deviations). Scoring similarly, in this case, means that
there were roughly the same percentages of 0s, 1s, 2s, and 3s for each dimension (also shown
in Figure 5). In addition, the distributions were narrower than the initial black curve (i.e. the
standard deviations were smaller), meaning that team members rated artifacts more similarly
than one would expect in the population. The smaller the standard deviation (thus, the
narrower the distribution), the more similarly the artifacts were rated. A perfect scenario
would be that all artifacts were given ratings of 3 (exceeds expectations), which would result in
a standard deviation of 0. This scenario illustrates that the higher the average rating and the
smaller the standard deviation, the better the outcome. For all 4 rubrics, the directly collected
and assessed data shows strong evidence for AHC’s student critical thinking and problem solving skills.
The percentage of students meeting or exceeding college level requirements exceeds the targeted 70%
in all 4 dimensions.
Dimension: [D1] Explore or define issues, problems, or questions; [D2] Identify or evaluate the credibility or significance of sources or information; [D3] Apply critical thinking strategies for solving issues, problems, or questions; [D4] Arrive at reasoned conclusions or solutions.

Students meeting or exceeding college level expectations – D1: 90.0%; D2: 83.7%; D3: 83.4%; D4: 79.0%
Average – D1: 2.39; D2: 2.28; D3: 2.34; D4: 2.25
Standard deviation – D1: 0.83; D2: 0.78; D3: 0.84; D4: 0.83

Table 2
Discussion and Future Direction:
One of the assumptions is that the courses included in the study are a representative sample of all the courses at AHC that are linked to the ILO being assessed. The goal of ILO analysis should be to include randomly selected courses linked to the ILO. If the LOAC decides that every course linked to a particular ILO should be included in the ILO analysis once every six years, one approach is to randomly assign a number from 1 to 6 to each course and then assess all 1s in year 1, 2s in year 2, and so on. Then, at the beginning of the next 6-year cycle, randomly assign the numbers 1-6 again so that the same courses aren’t always assessed together every cycle. This process will ensure that the assumption of random selection is met.
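The rotation just described could be implemented simply. The sketch below is a hypothetical illustration (the course names are made up): it shuffles the linked courses and deals them into six balanced year-groups, and re-running it with a new seed at the start of each cycle re-randomizes the groupings.

```python
import random

def assign_cycle_years(courses, years=6, seed=None):
    """Randomly deal courses into `years` groups of near-equal size.

    Re-running with a different seed at the start of each 6-year cycle
    re-randomizes the groupings so the same courses are not always
    assessed together."""
    rng = random.Random(seed)
    shuffled = courses[:]
    rng.shuffle(shuffled)
    groups = {year: [] for year in range(1, years + 1)}
    for i, course in enumerate(shuffled):
        groups[i % years + 1].append(course)
    return groups

# Hypothetical list of courses linked to the ILO.
courses = [f"COURSE {n}" for n in range(1, 21)]
groups = assign_cycle_years(courses, seed=2012)
print({year: len(g) for year, g in groups.items()})
```

Dealing round-robin after the shuffle keeps the group sizes within one of each other, so no single year carries a disproportionate assessment load.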
Another assumption is that students are randomly selected within classes. To ensure that this assumption is met, a team member or the research analyst, not an instructor, should randomly select students from the rosters of the courses included in the study. It may be difficult for an instructor to submit a less than desirable artifact or refrain from submitting an artifact that was exceptional. In addition, some instructors may feel compelled to submit artifacts that represent a range of abilities even though their sample may not be representative of the abilities in their classes. In order to avoid the human aspect of selection, the best method of randomly selecting artifacts would be to randomly select students from the rosters of the classes included in the analysis and give the names to the instructors. The instructors would then give the sample from the selected students to the team for assessment. The artifacts would remain anonymous since the artifacts would not have identifying information on them. Another reason for knowing which students’ artifacts are being assessed is to be able to
compare sample demographics to overall AHC demographics so that readers may decide if the sample is representative of the student population at AHC. Institutional research could provide demographics (i.e. ethnicity, age, gender) for the sample, class (from which samples were collected), and AHC credit students.
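Drawing the sample from rosters, rather than leaving selection to instructors, could look like the following sketch. It is illustrative only: the roster names are hypothetical, and the sample size of 10 reflects the study's design.

```python
import random

def sample_students(roster, n=10, seed=None):
    """Randomly select n students from a course roster.

    Intended to be run by a team member or research analyst, not the
    instructor, so that artifact selection is free of human judgment."""
    if n > len(roster):
        raise ValueError("roster smaller than requested sample")
    return random.Random(seed).sample(roster, n)

# Hypothetical roster for one course in the study.
roster = [f"Student {i:02d}" for i in range(1, 31)]
selected = sample_students(roster, n=10, seed=42)
print(len(selected))   # 10
```

The selected names would then go to the instructor, who submits those students' artifacts with identifying information removed.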
To rate the artifacts, individual team members scored the artifacts and then presented their ratings to the team. Team members then agreed or disagreed with the ratings and either approved them or requested revisions. The rating process the team used was a bit unconventional, but not without merit. Usually, when artifacts are judged, at least two raters from the group rate the artifacts independently and then compare their scores to see if they agree (after going through a norming procedure). This would happen for each class in the study. So, instead of everyone rating one class for the norming, at least two raters would rate each class and compare their ratings to see if they agree. The method used for this study was not necessarily incorrect. In fact, having everyone on the team experience the process of using the rubric in a field in which everyone on the team has some knowledge (such as math or English) is very logical. Once everyone has the experience, though, it would be best to have more than one team member rate the artifacts independently and see if they are in agreement (see the Rater Training Procedures used with the ESL Writing Sample Exam). Having one person rate the artifacts for a class and then present his/her ratings to the team for approval is adequate, but the success of this process really depends on how much detail of the artifact is presented and how much other team members are willing to speak up if they disagree. By members rating the artifacts on their own and comparing the ratings, you are more likely to get an "honest" and uninfluenced measure of agreement. Also, by having at least two team members rating the artifacts independently, you may decide to forgo the presentation and approval of artifacts.
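When two team members rate the same artifacts independently, their agreement can be quantified. The sketch below uses illustrative ratings, not actual team data; it computes simple percent agreement and Cohen's kappa, which corrects for the agreement expected by chance alone.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of artifacts on which the two raters gave the same score."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement: probability both raters pick category k, summed.
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 0-3 rubric scores from two independent raters.
rater_a = [3, 2, 2, 1, 3, 2, 0, 3, 2, 1]
rater_b = [3, 2, 1, 1, 3, 2, 1, 3, 2, 2]

print(round(percent_agreement(rater_a, rater_b), 2))   # 0.7
print(round(cohens_kappa(rater_a, rater_b), 2))        # 0.57
```

Reporting kappa alongside raw agreement guards against overstating consistency on rubrics where most artifacts fall into one or two categories.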
As for the ratings themselves, it is important that, while the team is rating the artifacts, members compare ratings across courses to see if they are similar. If the ratings are not similar for a course or courses, it does not necessarily mean that there is a problem with the ratings. The outlying courses should be checked to see why ratings are much higher or lower than other courses. As mentioned in the results, ANTH 102 had a greater proportion of lower ratings for the first two dimensions than the other classes rated. Also, some courses had no low ratings at all. There are several possibilities why the ratings were lower for ANTH 102. The best the team can do is to make sure that the ratings are not lower for reasons other than the artifacts lacking evidence of meeting or exceeding college level expectations. Other potential reasons for extremely low or extremely high ratings are inconsistent raters (i.e. “easy” or “hard” raters for some class artifacts), artifact type (i.e. subjectively vs. objectively scored), and discrepancy between rubrics and assignment goals. An example of a discrepancy would be an assignment whose goal was to assess student mastery of an accounting concept, with grammar not indicated in the rubric, that was then used to assess grammar as part of the ILO. It is easy to see why student artifacts might not receive good ratings on assignments that were rated with ILO rubrics that are vastly different from assignment rubrics. To overcome potentially poor ratings because of
assignment vs. ILO rubric differences, it is important to provide students with both rubrics so that students know on what they should focus.
Particular strengths of the team’s process included using identical formatting for the course Excel worksheets, which made analysis easier. Also, the team took steps to ensure that the ratings were agreed upon by the team and not decided by one person; though, ideally, team members should rate the artifacts individually. In addition, the team sampled a variety of course types to assess critical thinking and problem solving skills. By looking at various disciplines, the team is able to get a good representation of students at AHC.
APPENDIX E
Meeting Minutes of Evidence Team
Evidence Team Meeting #1 Minutes
Thursday, September 1st, 2011, 8:00-9:30
Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait
Note taker: Margaret Warrick

I. Announcements
II. New Business:
a. The fall deadlines below were reviewed:
   a. October 21, 2011 - Plan of action with rough draft of rubric due to Title V.
   b. November 30, 2011 - All evidence due to Dropbox.
   c. December 1-9, 2011 - Final draft of rubric, collection of data evaluation, and plan for spring 2012 due to Title V.
b. Two additional Thursday meeting times were arranged: October 13 and November 3, 2011, in the Building C conference room.
c. Roles within the team were assigned:
   a. Report writer/historian - Kathy Headtke.
   b. Note taker - Margaret Warrick.
   c. Team members - Larry Manalo, Helen Talkin, Karen Tait.
d. The evidence needs were discussed, including how to:
   a. Identify courses that have SLOs linking to the critical thinking and problem solving ILO, and record their SLOs.
   b. Collect evidence (electronic collection preferred, but copies can be scanned):
      - Two (2) different course assignments, with 20-30 pieces of student artifacts each (10 needed for random sampling).
      - Course assignments should be collected from fall 2010, spring 2011, or fall 2011.
      - Copies of assignments and the rubric or method of grading.
e. The rubric design was discussed. The grading scale for the rubric below was approved by the team:
Dimension | Exceeds Expectations (4) | Meets Expectations (3) | Fails to Meet Expectations (2) | Unacceptable (1) | No Evidence (0)
   a. Kathy will research examples of existing critical thinking and problem solving rubrics at other colleges. She will share the rubrics via Dropbox by October 6, 2011, so the team can discuss the examples at the next meeting.
f. The team will accept their invitations to join the team's Dropbox folder for critical thinking and problem solving, enabling the team to store and share documents.
g. David Brown distributed iPads and familiarized the group with their use.
III. For next time:
a. Start to gather evidence:
   a. Name the instructor(s) and courses being used for evidence.
   b. Identify courses for evidence collection and identify SLOs linking to the ILO.
   c. Ask for a sample of assignment instructions/rubric/scoring sheet for each course.
b. Review the examples of rubrics in the Dropbox after October 6, 2011.
MINUTES Evidence Team Meeting #2
Thursday October 13, 2011 8:00-9:30 Conference Room Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Karen Tait, Helen Talkin
Notetaker: Margaret Warrick
______________________________________________________________________
I. Announcements
   a. The team should complete the missing details on the plan of action.
II. New Business:
   a. Rubric Trial
      i. Peggy's and Larry's rubrics (see attached) were discussed. The team focused on two artifacts.
      ii. The team then tried out both rubrics on the two artifacts below:
Artifact One Assignment: Define marriage and discuss how marriage functions within cultures. Discuss the social and economic advantages and disadvantages of monogamy, polygamy, and polyandry. Is marriage an economic exchange system? Explain why you think it is or isn't. Please use specific examples/facts in your response.

Artifact Two Assignment: Without computing each sum, answer the questions:
   O = 1 + 3 + 5 + 7 + ... + 97
   E = 2 + 4 + 6 + 8 + ... + 98
   P = 1 + 3 + 5 + 7 + ... + 99
Which is greater, O or E? By how much? Answer: E, by 49.
Which is greater, P or E? By how much? Answer: P, by 50.
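As a side note, the stated answers check out: each of E's 49 terms is one more than the matching term of O, so E = O + 49, and P adds the extra term 99 to O, so P - E = 99 - 49 = 50. A quick computation (not part of the minutes) confirms this:

```python
# Direct computation of the three sums from Artifact Two.
O = sum(range(1, 98, 2))   # 1 + 3 + 5 + ... + 97  (49 odd terms)
E = sum(range(2, 99, 2))   # 2 + 4 + 6 + ... + 98  (49 even terms)
P = sum(range(1, 100, 2))  # 1 + 3 + 5 + ... + 99  (50 odd terms)

print(E - O)  # 49: each of E's 49 terms is one more than the matching term of O
print(P - E)  # 50: P = O + 99, so P - E = 99 - 49 = 50
```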
iii. The team discussed how the rubric worked differently for the two artifacts. It was decided that using one rubric to assess both critical thinking and problem solving presents certain challenges, and that not all the categories on the rubric are applicable to all kinds of assignments.
iv. It was decided to ask Kelly Brune whether a "not applicable" column could be added to the rating scale to cover the sections of the rubric that may not apply to a particular artifact.
v. The team scheduled an additional meeting to work on the rubric for November 23, 2011 (rescheduled to November 30, 2011).
III. For next time:
The team will:
   i. Complete the plan of action with the missing information whenever possible.
   ii. Continue to collect artifacts.
   iii. Work on improving the rubric.

Peggy's Revised Rubric
ILO: Explore issues through various information sources; evaluate the credibility and significance of both the information and the source to arrive at a reasoned conclusion.
Scale: Exceeds Expectations (4) | Meets Expectations (3) | Fails to Meet Expectations (2) | Unacceptable (1) | No Evidence (0)

Dimension: Evaluate the quality of reasoning behind information sources, arguments, interpretations, or beliefs.
   Exceeds Expectations (4): Clearly and thoroughly defines the main question, problem, or issue; identifies all relevant assumptions; and thoroughly identifies and evaluates the sources of data and evidence.
   Meets Expectations (3): Defines most of the components of the main question, problem, or issue; identifies most of the relevant assumptions; and adequately identifies and evaluates the sources of data and evidence.
   Fails to Meet Expectations (2): Defines some of the components of the main question, problem, or issue; identifies some of the relevant assumptions; and identifies and evaluates some but not most of the sources of data and evidence.
   Unacceptable (1): Does not identify components of the main question, or does so only minimally; identifies few or none of the relevant assumptions; and inaccurately identifies or fails to evaluate the sources of data and evidence.

Dimension: Examine and/or employ sources to develop solutions.
   Exceeds Expectations (4): Creatively employs sources to develop solutions that are clear, coherent, accurate, and complex.
   Meets Expectations (3): Appropriately employs sources to develop solutions that are clear, coherent, and accurate.
   Fails to Meet Expectations (2): Employs sources to develop solutions that are not clear or may be incorrect.
   Unacceptable (1): Does not apply sources in an accurate or appropriate manner.

Dimension: Formulate, analyze, and solve problems.
   Exceeds Expectations (4): Makes no errors when developing a conclusion or complex solution that is complete and well supported.
   Meets Expectations (3): Makes few errors when developing a conclusion or complex solution that is complete and well supported.
   Fails to Meet Expectations (2): Makes many errors when developing a conclusion or complex solution.
   Unacceptable (1): Makes extensive errors when developing a conclusion or complex solution.
Larry’s Revised Rubric
ILO: Explore issues through various information sources; evaluate the credibility and significance of both the information and the source to arrive at a reasoned conclusion.
I looked at all the information describing the Critical Thinking and Problem Solving ILO, and I have come up with these ideas regarding how we can look at the dimensions:
   - Explore issues through various information sources.
   - Evaluate the credibility and significance of source and information. This is technically similar to identifying assumptions and discerning bias.
   - Arrive at a reasoned conclusion.
   - Apply critical and creative strategies for solving complex problems. This also includes analyzing reasoning and methods, and possibly synthesizing ideas and information from various sources and media.
With these in mind, can we consider these the major dimensions and base the rubric on these "big ideas"?
Scale: Exceeds Expectations (4) | Meets Expectations (3) | Fails to Meet Expectations (2) | Unacceptable (1) | No Evidence (0)

Dimension: Explore issues through various information sources
   Exceeds Expectations (4): Clearly and thoroughly defines the main question, problem, or issue.
   Meets Expectations (3): Defines most of the components of the main question, problem, or issue.
   Fails to Meet Expectations (2): Defines some of the components of the main question, problem, or issue.
   Unacceptable (1): Does not identify components of the main question, or does so only minimally.

Dimension: Evaluate the credibility and significance of source and information
   Exceeds Expectations (4): Identifies all relevant assumptions, and thoroughly identifies and evaluates the sources of data and evidence.
   Meets Expectations (3): Adequately identifies and evaluates the sources of data and evidence.
   Fails to Meet Expectations (2): Identifies some of the relevant assumptions; identifies and evaluates some but not most of the sources of data and evidence.
   Unacceptable (1): Identifies few or none of the relevant assumptions, and inaccurately identifies or fails to evaluate the sources of data and evidence.

Dimension: Arrive at a reasoned conclusion
   Exceeds Expectations (4): Explains accurately and thoroughly multiple solutions, positions, or perspectives that balance opposing points of view.
   Meets Expectations (3): Describes multiple solutions, positions, or perspectives accurately.
   Fails to Meet Expectations (2): Identifies simple solutions or over-simplified positions and perspectives with only minor inaccuracies.
   Unacceptable (1): Names a single solution, position, or perspective, often inaccurately, or fails to present a solution, position, or perspective.

Dimension: Apply critical and creative strategies for solving complex problems
   Exceeds Expectations (4): Employs formulas, sources, procedures, and principles of a discipline accurately, appropriately, and/or creatively.
   Meets Expectations (3): Applies relevant formulas, sources, procedures, and principles accurately and appropriately.
   Fails to Meet Expectations (2): States appropriate formulas, sources, procedures, and principles with only minor inaccuracies or irrelevancies.
   Unacceptable (1): Labels formulas, procedures, sources, and principles inappropriately, inaccurately, or omits them.
MINUTES Evidence Team Meeting #3
Thursday November 17, 2011 8:00-9:30 Conference Room Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Karen Tait, Helen Talkin
Notetaker: Margaret Warrick
______________________________________________________________________
MINUTES Evidence Team Meeting #4
Thursday, November 3, 2011 8:00-9:30 Conference Room Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers
Notetaker: Margaret Warrick
I. Announcements:
   a. Kelly Brune's response to the team's questions:
      The team discussed Kelly Brune's response to the team's questions about the first draft of the rubric (see attached) and made note of her suggestions and comments. It was agreed that the rubric's rating scale will consist of six columns: Not Applicable (N/A), Exceeds Expectations (4), Meets Expectations (3), Fails to Meet Expectations (2), Unacceptable (1), and No Evidence (0).
b. Plan of action: The team agreed to complete the missing information in the plan of action and submit missing artifact examples to the Dropbox.
II. New Business:
   a. The team worked to finalize the rubric. A revised copy is attached.
   b. The team will fine-tune the rubric if necessary next semester. The team also agreed to bring one example of an artifact to the team's first meeting next semester to test the effectiveness of the rubric.
Other:
   a. Meeting times for next semester:
      Julia agreed to look at the team's schedules to identify a time for the team to meet in spring semester. She requested that the team email her with details of their spring availability as soon as possible.
   b. The team discussed the possibility of meeting before spring semester starts. Julia will send out an email to see who is available to meet on Tuesday, January 17, 2012. The plan is to use this day to look at the team's artifacts and assess them together. The team members expressed concern about the difficulties of meeting during the semester, especially in the last four weeks. Meeting before the semester starts would also give the team more time to work on and finish the report after assessing the artifacts. A working lunch could be provided from the $200 budget allocated to the team for refreshments.
III. For next time:
   a. Bring one example of an artifact to test the effectiveness of the rubric.
From: Julia Raybould-Rodgers
Sent: Wednesday, November 16, 2011 11:10 AM
To: Kelly Brune
Subject: Request for help from the Critical Thinking and Problem Solving Evidence Team
Dear Kelly,

Our team is assessing the critical thinking and problem solving ILO. After working with some of our artifacts, we have discovered that not all of them show evidence of both critical thinking and problem solving. We had planned to design a rubric that we could use for all types of artifacts rather than use separate rubrics for critical thinking and problem solving. Now, we are not sure how to proceed.

Our first question is: Can we have a "not applicable" column on our rubric (as in the example below) that we could check if parts of an artifact cannot be measured because there is no evidence of either critical thinking or problem solving shown in the artifact? If we can have an N/A column on our rubric, will it affect the data evaluation? Do you have any other suggestions for how to handle this on our rubric?

Scale: Not Applicable (N/A) | Exceeds Expectations (4) | Meets Expectations (3) | Fails to Meet Expectations (2) | Unacceptable (1) | No Evidence (0)

Dimension: Analyze data, information, ideas, and concepts to draw well-supported conclusions.
   Exceeds Expectations (4): Creates a detailed conclusion or complex solution that is complete, well-supported, logically consistent, and often unique.
   Meets Expectations (3): Produces a conclusion or solution that is complete and consistent with the evidence presented.
   Fails to Meet Expectations (2): Restates an abbreviated conclusion or simple solution that is mostly consistent with the evidence presented and has only minor inaccuracies or omissions.
   Unacceptable (1): Attempts a conclusion or solution that is inconsistent with the evidence presented, that is illogical, or omits a conclusion or solution.
A N/A column is a good idea. That's the tricky part of having two different things (critical thinking AND problem solving) put in the same ILO. (This is why in surveys you don't ask questions like, "Do you use crack and have a television?" You're not sure how to answer the question, and you can't really score it, either.) When you don't have evidence of either critical thinking or problem solving, it's OK to use N/A, but if you have many of those for a piece of evidence, you might want to reconsider using that piece of evidence. Some missing data is OK, but I'd say if half of your dimensions are N/A, then I'd try to find different evidence. If that's all the evidence you can find, then we'll take what we can get.

Our second question is: How many scoring categories should we have on our rubric? Are there any benefits for data analysis of having a five- or four-category scale?
Really, the more categories the better for statistical analysis. More categories allow for increased variance, making it possible to separate students who really know their stuff from those who don't. But with more categories it gets harder to define each one. Plus, what we're most interested in here is where students fall on either side of "fails to meet expectations" and "meets expectations." So, 4 or 5 categories is fine.

Our third question is: Do we need to have a "no evidence" column, which is only used when the artifact shows no evidence of the assignment or question being started, or could we include these types of artifacts in our unacceptable column?

You may never use the "no evidence" column, but I'd keep it just in case a student draws a picture of a flower when he/she was supposed to show evidence of critical thinking in a math class. I can always combine the data from those two columns when I do the analysis.
MINUTES Evidence Team Meeting #5
Tuesday, January 17, 2012, 8:30-11:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements
2. New Business:
   a. Evidence team meetings scheduled for next semester:
      Dates: February 7, 14, 21, 28 and March 6, 13
      Time: 8:00 - 9:30 am
      Place: Building C Conference Room
      Agenda items were approved. The rubric was reviewed.
b. The list of artifacts collected below was reviewed:
Helen – ANTHRO 102; SOC 102
Kathy – CHEM 150, ENG 103
Larry – NURS 109, BIO 150
Karen – MATH 105, MATH 321
Peggy – CBIS 112, BUS 106
Julia – ACCT 160, AT 130
The team agreed that a sufficient number of artifacts had been collected and that the team was ready to start analyzing them.
c. The rubric was tried out on selected artifacts.
An English 103 essay was assessed and scored by the team together using the rubric, and a consensus was reached on a final score. No problems were found with the rubric for this artifact.
A Math 321 problem was assessed and scored using the rubric by the team together. The results were problematic. A decision was made to modify the rubric.
The following revisions were made to the rubric:
   o The second dimension was reworded, adding "identify or evaluate," to make it more applicable across disciplines.
   o The "Meets Expectations" and "Fails to Meet Expectations" columns were combined, and the "Meets Expectations" column was deleted.
The rubric was reapplied to the Math 321 artifact, which was successfully assessed.
The revised rubric was approved by the team. (See attachment).
d. The random sampling process was reviewed. The team will randomly choose 10 artifacts to assess from each course, scan them with the names removed, and put them in the Dropbox. Current artifacts which have not been randomly sampled will be removed from the Dropbox.
i. The team agreed to assess the artifacts together as a team in a series of meetings. A schedule of which artifacts will be assessed next will be agreed by the team at the end of each meeting.
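The sampling step described in item d can be sketched as follows. This is only a minimal illustration of drawing 10 artifacts per course without replacement; the file names and collection size are invented, not the team's actual Dropbox contents.

```python
import random

# Hypothetical list standing in for one course's collected artifacts
# (the team collected 20-30 per assignment; 24 is an invented example).
collected = [f"ENG103_student_{n:02d}.pdf" for n in range(1, 25)]

# A fixed seed makes the draw reproducible, so the same 10 artifacts
# can be re-identified later if the sample needs to be audited.
rng = random.Random(2012)
sample = rng.sample(collected, 10)  # 10 distinct artifacts, no repeats

for name in sorted(sample):
    print(name)
```

Sampling without replacement (as `random.sample` does) matters here: rating the same artifact twice would silently over-weight one student in the course's results.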
3. For next time:
a. The team will put 10 random samples for 2 courses in Dropbox and delete any previous versions.
b. The team will remove the names from the random samples.
c. Karen will do a presentation for the Math 321 artifacts utilizing the rubric.
d. Peggy will do a presentation for the CBIS 112 artifacts utilizing the rubric. She will place the instructions and grading rubric for the artifact in the Dropbox for the team to review.
e. The team will assess the Eng 103 artifacts using the revised rubric.
f. The team will test the rubric with Nurs 109, Chem 150, Soc 102 at the next meeting.
MINUTES Evidence Team Meeting #6
Tuesday, February 7, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Margaret Warrick
Absent: Karen Tait
___________________________________________________________________
Announcements: Karen emailed the group to let us know that she is ill and unable to attend. She emailed her evaluations of the English 103 artifacts for the group to analyze in her absence. She will do her presentation for the Math 321 artifacts utilizing the rubric next time.
I. New Business:
   a. Results of the team's assessment of the Eng 103 artifacts.
      The team combined their results and came to agreement on the 10 randomly selected artifacts. The group grading of the artifacts highlighted a few ongoing issues:
      - The instructor's criteria for grading the assignment, and the requirements for the assignment, differ from the criteria used by the team to assess the ILO. One example of this is the use of the works cited page in the assignment; another example is grammar issues in the assignment.
      - The group agreed that this group assessment was good practice in differentiating between grades and the assessment of learning outcomes.
II. Presentation of the CBIS 112 artifacts utilizing the rubric by Peggy.
   Peggy explained the procedure for scoring the 10 randomly selected CBIS artifacts. As the team's expert in programming, she was able to explain to the team the programming procedures intrinsic to the assignment. Peggy also relied on the grades given by the instructor to assess the dimensions on the rubric.
   The group grading of the CBIS artifacts highlighted a few ongoing issues:
   - There are some artifacts that the team had difficulty assessing due to a lack of expertise in the subject area of the artifact.
   - In these cases the team may rely on the expertise of a single member to assess the artifacts and present the findings to the team for advice and approval.
III. Other: Results for the ILO assessment of English 103 and CBIS 112.
English 103 ~ ILO Group Evaluation of Evidence
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 3 3 12
Student 2 2 2 2 2 8
Student 3 2 2 1 2 7
Student 4 3 2 2 3 10
Student 5 2 2 2 2 8
Student 6 2 1 1 2 6
Student 7 2 2 2 2 8
Student 8 3 3 3 3 12
Student 9 3 3 3 3 12
Student 10 3 2 3 3 11
CBIS 112 ~ ILO Group Evaluation of Evidence
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 3 3 12
Student 2 2 2 2 2 8
Student 3 2 3 3 3 11
Student 4 3 3 3 3 12
Student 5 0 2 2 2 6
Student 6 3 3 3 3 12
Student 7 3 3 3 2 11
Student 8 3 3 3 2 11
Student 9 3 3 3 2 11
Student 10 2 2 2 2 8
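A per-dimension summary of tables like the two above can be computed in a few lines. The English 103 ratings below are copied from the table in these minutes; the summary itself is only an illustration of the kind of analysis the data supports, not the team's documented procedure.

```python
# English 103 ratings (Dim 1-4 per student), copied from the table above.
eng103 = [
    (3, 3, 3, 3), (2, 2, 2, 2), (2, 2, 1, 2), (3, 2, 2, 3), (2, 2, 2, 2),
    (2, 1, 1, 2), (2, 2, 2, 2), (3, 3, 3, 3), (3, 3, 3, 3), (3, 2, 3, 3),
]

# Mean rating on each of the four dimensions.
for d in range(4):
    ratings = [row[d] for row in eng103]
    mean = sum(ratings) / len(ratings)
    print(f"Dimension {d + 1}: mean rating {mean:.2f}")

# Mean total score out of the 12 possible points.
totals = [sum(row) for row in eng103]
print(f"Mean total: {sum(totals) / len(totals):.1f} out of 12")
```

Summaries like this make the cross-course comparison discussed in the appendices concrete: the same few lines run on each course's table yield directly comparable dimension means.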
IV. For next time:
a. Presentations of artifacts utilizing the rubric will continue.
MINUTES Evidence Team Meeting #7
Tuesday, February 14, 2012, 8:30-11:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
I. Announcements:
   A short discussion was held about the assessment of last week's artifacts.
II. New Business:
a. Presentation of MATH 321 artifacts utilizing the rubric by Karen.
   Karen explained the procedure for scoring the 10 randomly selected Math 321 artifacts. The procedure was made easier because the six parts of the question matched easily with the two dimensions (2 and 3) on the rubric that she was using:
   a - Answer
   b - Source
   c - Answer
   d - Source
   e - Answer
   f - Source
   Karen explained the scoring for the artifact on the rubric. Dimension One was not applicable to the math artifact analyzed because the students were given all parts of the problem to solve. Dimension Two was applied to parts b, d, and f, where the theorem used to solve the problem was identified and analyzed. Dimension Three was used in the area of the problem where the student demonstrated a correct solution. Karen explained that the word "applies" in the dimension was a key word in assessing the artifact. She used parts a, c, and e as measures of accuracy for a solution to the problem. Dimension Four was not applied because it proved to be a duplication of Dimension Three for this artifact. Karen also noted that subjectivity is involved when applying the rubric to the artifacts. The team agreed with Karen's scoring procedure and results on the rubric.
III. Presentation of CHEM 150 artifacts utilizing the rubric by Kathy.
   Kathy explained the procedure for scoring the 10 randomly selected Chemistry 150 artifacts. The procedure was made easier because she was able to score the artifacts in consultation with the instructor who was responsible for the assignment. She used all of the dimensions on the artifact, and she also utilized the instructor's scoring sheet for each artifact. Kathy explained how the structure of the assignment fit well with the dimensions: Dimension One was mapped to the title and objective section of the assignment, Dimension Three to the summary section, and Dimension Four to the conclusions section. After a lengthy discussion with the team members, Kathy decided to go back to the instructor and use his expert subject knowledge to see if there is one section of the assignment that can be mapped exclusively to Dimension Two. She will report her findings to the team next time.
The team agreed with Kathy’s scoring procedure and results on the rubric for Dimensions One, Three and Four.
IV. Presentation for NURS 109 artifacts utilizing the rubric by Larry.
Larry used both his understanding as well as the instructor’s rating of the artifacts. He aligned the different components of the artifact, a case study, with the different dimensions for scoring the ten randomly selected samples.
Dimension 1 was mapped to the question that enabled the student to identify the clinical issues by naming the specific clinical condition in the third set of questions.
Dimension 2 was evident in the second set of questions that required accurate interpretation of various clinical information (significance) that led to identification of the clinical condition.
Dimension 3 was mapped using the fourth set of questions where specific critical thinking strategies are required to come up with viable outcomes of possible solutions.
Dimension 4 was mapped mainly to the first set of questions. The scenario was set up to enable determination of timely decisions and solutions to a potentially life-threatening situation, although the dimension was generally evident throughout the artifact.
The team agreed with Larry’s scoring procedure and results on the rubric for Dimensions One, Two, Three and Four.
Other: Results for the ILO assessment of Math 321 and Nursing 109

Math 321 ~ Problem #13
Max Pts 6
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 N/A 3 3 N/A 6
Student 2 N/A 3 2 N/A 5
Student 3 N/A 2 2 N/A 4
Student 4 N/A 3 3 N/A 6
Student 5 N/A 3 3 N/A 6
Student 6 N/A 2 1 N/A 3
Student 7 N/A 2 3 N/A 5
Student 8 N/A 2 3 N/A 5
Student 9 N/A 3 3 N/A 6
Student 10 N/A 1 2 N/A 3
Nursing 109
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 3 3 12
Student 2 3 2 2 2 9
Student 3 3 2 3 3 11
Student 4 3 2 2 2 9
Student 5 3 3 3 3 12
Student 6 3 3 3 3 12
Student 7 2 3 2 2 9
Student 8 3 3 3 3 12
Student 9 3 3 3 3 12
Student 10 3 3 3 3 12
V. For next time:
   Presentations of artifacts utilizing the rubric will continue.
MINUTES Evidence Team Meeting #8
Tuesday, February 21, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
I. Announcements:
   A short discussion was held about the assessment of last week's artifacts.
II. New Business:
1. Presentation of ACCT 160 artifacts utilizing the rubric by Julia.
Julia explained the assignment to the team. She explained the procedure for scoring the 10 randomly selected Acct 160 artifacts.
She was able to match the Summary section of the project with Dimension 4 which outlined the strengths and weaknesses of the company. She was unsure which sections of the report to match up with the other dimensions. As a result of the team’s discussion, it was decided to match the Background section of the report with Dimension One on the rubric. The Key Financial Ratios on the report were matched with Dimension Two and the Analyzing the Data section was linked up with Dimension Three on the rubric.
These artifacts again highlighted the problems of analyzing data in a specialized field.
2. Presentation of AT 313 artifacts utilizing the rubric by Julia.
Julia suggested analyzing the AT artifacts in the following way:
Dimension 1 - Student has opportunity to ask three questions to diagnose the problem given.
Dimension 2 - Student uses three details from customer email to identify three specific symptoms.
Dimension 3 - Student identifies where the problem is, uses statements in customer's email which support diagnosis then uses one simple check to verify diagnosis.
Dimension 4 - Student states the cause of the problem.
After group discussion, the team agreed to adopt this mapping.
Presentation of MATH 105 artifacts utilizing the rubric by Karen.
Karen explained the assignment to the team. She explained the procedure for scoring the 10 randomly selected Math 105 artifacts.
Karen explained her rationale for choosing to assess problems one and two on the quiz. She decided to mark Dimension One as “Not Applicable” on the rubric because the students were given the problems to solve, and Dimension Two as “Not Applicable” because the students did not have to use sources.
The score on the rubric consisted of 50% from problem one and 50% from problem two.
Problem one was assessed using Dimension Four on the rubric and problem two was assessed using Dimension Three on the rubric.
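Karen's scheme (each applicable dimension worth up to 3 points, with N/A dimensions excluded from both the score and the maximum) can be sketched as follows; the function and example data are illustrative only, not part of the team's actual tooling.

```python
def rubric_total(scores):
    """Total a rubric artifact score.

    Each entry is 0-3 for an applicable dimension, or None for N/A.
    N/A dimensions are excluded from both the total and the maximum,
    so an artifact with two applicable dimensions has Max Pts 6.
    """
    applicable = [s for s in scores if s is not None]
    return sum(applicable), 3 * len(applicable)

# Math 105, Student 1: Dims 1 and 2 are N/A; Dim 3 = 3, Dim 4 = 1.
total, max_pts = rubric_total([None, None, 3, 1])  # (4, 6)
```

This reproduces the Max Pts values in the tables: two applicable dimensions give a maximum of 6 (Math 105, Math 321), four give a maximum of 12.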
Revisiting Chem 150 Dimension 2 artifacts by Kathy.
On February 14, the group agreed on the mapping of Dimensions 1, 3, and 4 of the Chem 150 artifacts but needed subject expertise on Dimension 2. Kathy consulted with the Chemistry instructor who said Dimension 2 pertained to the credibility of outside sources required for the assignment. The group agreed and the scoring rubric was completed for Chem 150.
III. Other: Results for the ILO Assessment of Acct 160, AT313, Math 105, and Chem 150.
ACCT 160
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 1 1 3 8
Student 2 1 1 0 1 3
Student 3 3 3 3 3 12
Student 4 3 1 3 3 10
Student 5 3 3 3 3 12
Student 6 3 3 1 0 7
Student 7 0 3 1 1 5
Student 8 1 1 0 1 3
Student 9 3 3 2 1 9
Student 10 3 3 1 1 8
AT 313
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 2 2 3 2 9
Student 2 2 3 3 3 11
Student 3 2 3 3 2 10
Student 4 2 1 1 1 5
Student 5 3 3 2 1 9
Student 6 2 2 1 1 6
Student 7 1 3 3 3 10
Student 8 2 3 2 2 9
Student 9 2 2 3 3 10
Student 10 2 2 3 3 10
Math 105
Max Pts 6 (Problem #2 scored on Dim 3; Problem #1 scored on Dim 4)
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 N/A (none given) N/A 3 1 4
Student 2 N/A (none given) N/A 3 3 6
Student 3 N/A (none given) N/A 3 1 4
Student 4 N/A (none given) N/A 3 1 4
Student 5 N/A (none given) N/A 3 1 4
Student 6 N/A (none given) N/A 3 1 4
Student 7 N/A (none given) N/A 3 3 6
Student 8 N/A (none given) N/A 2 1 3
Student 9 N/A (none given) N/A 3 1 4
Student 10 N/A (none given) N/A 0 0 0
Chem 150
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 3 3 12
Student 2 3 3 3 3 12
Student 3 3 3 3 3 12
Student 4 3 3 2 3 11
Student 5 3 3 3 3 12
Student 6 2 2 1 3 8
Student 7 2 1 1 3 7
Student 8 1 1 2 2 6
Student 9 3 2 1 3 9
Student 10 3 3 3 3 12
IV. For next time: Presentations of artifacts utilizing the rubric will continue.
MINUTES Evidence Team Meeting #9
Tuesday, February 28th, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements:
A short discussion was held about the assessment of last week's artifacts.
2. New Business:
a. Presentation of SOC 102 artifacts utilizing the rubric by Helen.
Helen explained the procedure for scoring the 10 randomly selected Soc 102 artifacts and described the assignment to the team. These artifacts were part of a longer quiz. The students were given study sources and lecture notes, but they were required to define the issue themselves. The team decided that, despite the assistance the instructor provided on the issue and problem, there was enough evidence of problem solving and critical thinking to assess the artifacts, and Dimension One was applied to this part of the artifact. Helen matched Dimension Two with the students’ ability to identify sources as they discussed the issue in the assignment. For Dimension Three, she matched the students’ ability to identify solutions to the issue and problems. Dimension Four was more difficult to apply because the assignment’s instructions did not require the students to come up with a conclusion or solution; however, some students did so even though it was not required.
The team agreed that this artifact had some problematic areas when the rubric was applied.
b. Presentation of BIO 150 artifacts utilizing the rubric by Larry.
Larry explained the assignment to the team and the procedure for scoring the 10 randomly selected Bio 150 artifacts. He matched Dimension One on the rubric with how well the students were able to understand the main question, which posits a hypothesis, and to state the expected outcomes of the experiment they performed. Dimension Two matched up well with the results section of the lab report, Dimension Three with the students’ explanation of the results, and Dimension Four with whether or not the hypothesis was supported. Overall, the Biology 150 lab report artifacts were assessed thoroughly for problem solving and critical thinking using the ILO rubric.
III. Other: Results for the ILO assessment of Biology 150 and Sociology 102
Bio 150
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 2 2 2 2 8
Student 2 2 2 2 2 8
Student 3 2 2 2 1 7
Student 4 2 1 1 1 5
Student 5 2 2 2 2 8
Student 6 3 3 3 2 11
Student 7 2 2 2 2 8
Student 8 2 2 2 2 8
Student 9 3 3 3 3 12
Student 10 3 3 3 3 12
Soc 102
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 3 3 12
Student 2 2 2 2 2 8
Student 3 2 2 2 2 8
Student 4 2 2 2 2 8
Student 5 3 3 3 3 12
Student 6 1 1 2 2 6
Student 7 3 3 3 3 12
Student 8 3 3 3 2 11
Student 9 3 3 3 3 12
Student 10 3 3 3 3 12
IV. For next time: Presentations of artifacts utilizing the rubric will continue.
ILO: Explore issues through various information sources; evaluate the credibility and significance of both the information and the source to arrive at a reasoned conclusion.

Dimension 1: Explore or define issues, problems, or questions.
- Exceeds Expectations (3): Clearly and thoroughly defines the issues, problems, or questions.
- Meets Expectations (2): Defines most of the components of the issues, problems, or questions.
- Fails to Meet Expectations (1): Does not identify the components of the issues, problems, or questions.
- No Evidence (0): No evidence of exploring or defining issues, problems, or questions.

Dimension 2: Identify or evaluate the credibility or significance of sources or information.
- Exceeds Expectations (3): Thoroughly identifies or evaluates the credibility or significance of sources or information.
- Meets Expectations (2): Adequately identifies or evaluates the credibility or significance of sources or information.
- Fails to Meet Expectations (1): Poorly identifies or evaluates the credibility or significance of few or no sources or information.
- No Evidence (0): No evidence of evaluating the credibility and significance of sources or information.

Dimension 3: Apply critical thinking strategies for solving issues, problems, or questions.
- Exceeds Expectations (3): Accurately applies formulae, sources, procedures, or discipline principles.
- Meets Expectations (2): Applies appropriate formulae, sources, or discipline principles with minor inaccuracies.
- Fails to Meet Expectations (1): Labels formulae, sources, or discipline principles inappropriately, inaccurately, or omits them.
- No Evidence (0): No evidence of applying critical thinking strategies for solving issues, problems, or questions.

Dimension 4: Arrive at reasoned conclusions or solutions.
- Exceeds Expectations (3): Thoroughly and accurately describes conclusions or solutions.
- Meets Expectations (2): Identifies conclusions or solutions with minor inaccuracies.
- Fails to Meet Expectations (1): Inaccurately identifies or fails to present conclusions or solutions.
- No Evidence (0): No evidence of arriving at reasoned conclusions or solutions.

Each dimension may also be marked NA.
*NA indicates that the dimension is not applicable for a specific artifact.
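Since ratings on this scale are later summarized statistically (see the discussion of Kelly Brune's analysis in the later minutes), a possible roll-up is sketched below: it labels each 0-3 rating with its rubric level and computes the share of ratings at "Meets Expectations" or above, skipping N/A entries. The names and the aggregation choice are illustrative assumptions, not the team's or the consultant's actual method.

```python
# Level labels per the rubric above (0-3 scale).
LEVELS = {
    3: "Exceeds Expectations",
    2: "Meets Expectations",
    1: "Fails to Meet Expectations",
    0: "No Evidence",
}

def share_meeting_expectations(ratings):
    """Fraction of ratings at 2 or above, ignoring None (N/A) entries."""
    scored = [r for r in ratings if r is not None]
    return sum(1 for r in scored if r >= 2) / len(scored)
```

For example, an artifact rated 3, 2, 1, N/A across the four dimensions would have two of three applicable dimensions at or above "Meets Expectations".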
MINUTES Evidence Team Meeting #10
Tuesday, March 5th, 2012, 8:30-11:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements:
A short discussion was held about the assessment of last week's artifacts.
2. New Business:
a. Presentation of ANTH 102 artifacts utilizing the rubric by Helen.
Helen explained the assignment to the team and the procedure for scoring the 10 randomly selected Anth 102 artifacts.
Helen explained her rationale for assessing the artifacts. She identified a problem with Dimension One on the rubric: the students were given the issue to define and explore. Helen decided that Dimension One applied to the students’ ability to identify the context in which language would be studied, and the team agreed with her. Dimension Two was matched with the students’ ability to choose a social aspect of language and justify its existence. The assessment in Dimension Three was based on the students’ ability to provide examples; the number of examples corresponded to the score on the rubric (for example, three examples were scored as a three). Dimension Four was scored as “Not Applicable” on the rubric because the students were not asked to describe or arrive at conclusions or solutions in the assignment instructions.
b. Presentation of BUS 160 artifacts utilizing the rubric by Peggy.
Peggy explained the assignment to the team and the procedure for scoring the 10 randomly selected Bus 160 artifacts. Dimension One matched up with the information in the artifact describing the business or service. Dimension Two matched up with the identification and evaluation of the location, management team, and financial projection plan. Dimension Three matched up with the students’ ability to identify the competitors and to do a marketing analysis and plan. Dimension Four matched up with the executive summary and the overall comprehensiveness of the report.
III. Other: Results for the ILO assessment of Anthropology 102 and Business 106.
Anthro 102
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 3 3 2 N/A 8
Student 2 3 1 1 N/A 5
Student 3 2 2 0 N/A 4
Student 4 0 1 3 N/A 4
Student 5 2 1 3 N/A 6
Student 6 0 3 3 N/A 6
Student 7 0 3 2 N/A 5
Student 8 2 0 1 N/A 3
Student 9 3 3 2 N/A 8
Student 10 0 0 2 N/A 2
Business 106 - Small Business Management
Max Pts 12
Dim 1 Dim 2 Dim 3 Dim 4 Total
Student 1 2 2 2 2 8
Student 2 2 3 3 3 11
Student 3 3 3 2 2 10
Student 4 3 2 3 3 11
Student 5 2 1 1 1 5
Student 6 2 1 2 1 6
Student 7 3 3 3 3 12
Student 8 3 3 3 3 12
Student 9 3 2 3 3 11
Student 10 3 2 3 3 11
IV. For next time: Presentations of artifacts utilizing the rubric will continue.
MINUTES Evidence Team Meeting #11
Tuesday, April 17th, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements: Timeline for the ILO report and finalizing report details.
2. New Business: There was group discussion about the following topics:
a. Aspects of Kelly Brune’s statistical analysis of the artifacts.
b. Report conclusions regarding faculty participation, accumulation of artifacts, and the ILO evidence team process.
c. Indirect evidence.
d. Qualitative versus quantitative research.
MINUTES Evidence Team Meeting #12
Tuesday, May 1st, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements: None.
2. New Business: Group discussion continued regarding Kelly Brune’s statistical analysis of artifact rankings. Group discussion followed with revisions to the final ILO report, including the conclusions, CCSSE indirect evidence, the faculty-driven assessment process, and the course mapping process.
MINUTES Evidence Team Meeting #13
Tuesday, May 8th, 2012, 8:00-9:30, Conference Room, Building C
Attendees: Kathy Headtke, Larry Manalo, Julia Raybould-Rodgers, Helen Talkin, Karen Tait, Margaret Warrick
___________________________________________________________________
1. Announcements: None.
2. New Business: This last meeting involved group discussion regarding final revisions to the ILO report. The topics of focus included: acknowledging participating faculty contributions, observations regarding the use of data, the ILO process, ACCJC regulations, and conclusions.