Weber State University Annual Assessment of Evidence of Learning
Cover Page Department/Program: Computer Science Academic Year of Report: 2012-13 Date Submitted: 11/15/12 Report author: Dr. Brian Rague, Dept. Chair Contact Information: Phone: 801.626.7377 Email: [email protected]
A. Brief Introductory Statement:
Please review the Introductory Statement and contact information for your department displayed on the assessment site:
http://www.weber.edu/portfolio/departments.html - if this information is current, please indicate as much. No further
information is needed. We will indicate “Last Reviewed: [current date]” on the page.
If the information is not current, please provide an update:
The contact information is correct. There are some minor editorial revisions to the introductory statement, which should be updated as follows:

The Computer Science program employs a technical, scientific approach requiring a solid foundation in mathematics and physics. The program blends scientific and engineering principles implemented through actual, practical, and applications-oriented experience as well as the intellectual study of computation. It is designed to provide a sound fundamental understanding of logic and of digital computer organization, as well as the interaction between hardware, software, and the interconnection of system components. Also emphasized is software engineering, which includes understanding operating systems design, implementing the theory of computing, analysis of algorithms, simulation design, and the development of knowledge-based systems. The objectives of the Computer Science program are to provide students with an education that will help them achieve their academic and career goals while simultaneously meeting the needs of industry partners.
B. Mission Statement
Please review the Mission Statement for your department displayed on the assessment site:
http://www.weber.edu/portfolio/departments.html - if it is current, please indicate as much; we will mark the web page as “Last
Reviewed [current date]”. No further information is needed.
If the information is not current, please provide an update: All information is correct except that it’s not clear why the subtitle “Academic Year 1999-2000” is placed under Mission Statement. I would like to request the removal of “Academic Year 1999-2000” from the webpage unless it has some significance that I’m not aware of.
C. Student Learning Outcomes Please review the Student Learning Outcomes for your department displayed on the assessment site:
http://www.weber.edu/portfolio/departments.html - if they are current, please indicate as much; we will mark the web page as
“Last Reviewed [current date]”. No further information is needed.
If they are not current, please provide an update:
We have updated our learning outcomes as follows:
Measurable Learning Outcomes

At the end of their study at WSU, students in this program will:
1. Students will understand the importance of and will practice professional and ethical behavior, and will understand the
professional, ethical, legal, security, and social responsibilities of computing professionals
2. Students will be able to read and understand manuals, documentation, and technical literature, find and understand sources of
information, and learn on their own what they need to continue to perform professionally after graduation
3. Students will be able to solve new problems and to express their new solutions appropriately
4. Students will be able to function as a team member and carry out assigned tasks
5. Students will have the knowledge and the skills needed to be employable, and to be immediately and continuously productive
6. Students will have a basic understanding of computer theory, software design and operation, project management, databases,
networking, and computer hardware
7. Students will understand algorithm design and how to express and how to implement algorithms using a variety of notation,
programming languages, and paradigms
8. Students will be able to debug computer programs
9. Students will be able to express themselves clearly both verbally and in writing
10. Students will be able to critically evaluate the quality and the features of information from various sources and to make
informed decisions about the design of information systems
11. Students will be prepared for graduate studies in Computer Science and will have the necessary knowledge and skills to be
accepted into and succeed in relevant programs if they desire to continue their education in computer science
D. Curriculum
Please review the Curriculum Grid for your department displayed on the assessment site:
http://www.weber.edu/portfolio/departments.html - if it is current, please indicate as much; we will mark the web page as “Last
Reviewed: [current date]”. No further information is needed.
If the curriculum grid is not current, please provide an update:
Our curriculum map has changed: Curriculum Map
Curriculum Map: Core Courses Articulated with Student Learning Outcomes
I = Introduced   R = Reinforced   E = Emphasized

Core Courses in Department/Program (1), articulated with the Department/Program Learning Outcomes:
1. Professional and ethical behavior
2. Read technical literature and learn on their own
3. Solve problems and express solutions
4. Function in teams and carry out assignments
5. Knowledge and skills needed for employment
6. Theory, design, operation, project management, & DB (2)
7. Understand and use algorithms
8. Debug programs
9. Verbal and writing skills
10. Evaluate information systems
11. Preparation for graduate studies (2)

CS1400 Fundamentals of Programming: I I I I
CS1410 Object-Oriented Programming: R I R R I I R I
CS2350 Web Development: R R R R
CS2420 Introduction to Data Structures & Algorithms: R R R R R R R R
CS2450 Software Engineering I: R R R I R R R R I I
CS2550 Database Design & Application Development: R R R R R R
CS2650 Computer Architecture/Organization: R R R R R R R R R R
CS2705 Network Fundamentals and Design: R R R R R I
MGMT2400 Project Management (3): R R R R R R
CS3100 Operating Systems: R R R R R R R R
CS3130 Computational Structures: R R R R R R
CS3750 Software Engineering II: E R R R R R E E
CS4110 Concepts of Formal Languages and Algorithms: R R R E E E
CS4230 Java Application Development (4): E E E E E E E E E
CS4750 Advanced Software Engineering (4): E E E E E E E E E
CS4790 N-Tier Web Programming (4): E E E E E E E E E

(1) Program improvement statistics are collected for these courses.
(2) This outcome is more fully enabled through elective courses.
(3) This course is taught by qualified CS faculty in support of departmental student learning outcomes.
(4) Students must select one course.
E. Assessment Plan Please review the Assessment Plan for your department displayed on the assessment site:
http://www.weber.edu/portfolio/departments.html - if the plan is current, please indicate as much; we will mark the web page as
“Last Reviewed [current date]”. No further information is needed.
If the plan is not current, please provide an update: The site should contain an up-to-date assessment plan with planning going out a minimum of three years beyond the current year. Please review the plan displayed for your department at the above site. The plan should include a list of courses from which data will be gathered and the schedule, as well as an overview of the assessment strategy the department is using (for example, portfolios, or a combination of Chi assessment data and student survey information, or industry certification exams, etc.).

Here is our current assessment plan:

Program-Level Assessment Data Collection Schedule
Each course's marks run left to right across the academic years 2011-2012 through 2015-2016.

CS1410 Object-Oriented Programming: C A I C
CS2420 Introduction to Data Structures & Algorithms: C A I C
CS2450 Software Engineering I: C A I C
CS3130 Computational Structures: C A I C
CS1400 Fundamentals of Programming: C A I
CS2550 Database Design & Application Development: C A I
CS2705 Network Fundamentals and Design: C A I
CS3230 Internet Multimedia Services and Applications Using Java: C A I
CS2350 Web Development: C A I
CS2650 Computer Architecture/Organization: C A I
CS3100 Operating Systems: C A I
CS4110 Concepts of Formal Languages and Algorithms for Computing: C A I
CS3750 Software Engineering II: C A
CS4230 Java Application Development: C A
CS4750 Advanced Software Engineering: C A
CS4790 N-Tier Web Programming: C A

C – Collect Data   A – Analyze Data   I – Implement Improvements
F. Report of assessment results for the most previous academic year: There are a variety of ways in which departments can choose to show evidence of learning. This is one example. The critical pieces to include are 1) what learning outcome is being assessed, 2) what method of measurement was used, 3) what the threshold for ‘acceptable performance’ is for that measurement, 4) what the actual results of the assessment were, 5) how those findings are interpreted, and 6) what is the course of action to be taken based upon the interpretation.
a. Evidence of Learning: Courses within the Major (duplicate this page as needed)
CS1410 – Assessment Results

The data analyzed for this report were collected and normalized from four sections of CS 1410 (Object-Oriented Programming Using C++) taught during the Fall and Spring semesters of the 2011-2012 academic year. It should be noted that although the department has past experience evaluating individual courses, individual instructors, and students, the 2011-2012 academic year was the first time that the Weber State University Department of Computer Science attempted to collect data for the purpose of evaluating the degree to which students attain a set of outcomes. Clearly, the department learned more about the data collection process than it did about students attaining its outcomes.
Methodology
As a part of the process for achieving ABET accreditation, the Dept. of Computer Science faculty defined a set of Program Educational Objectives, which are “broad statements that describe what graduates are expected to attain within a few years of graduation.” The faculty also defined a set of learning outcomes, which are “what students are expected to know and be able to do by the time of graduation.” The ABET learning outcomes and their evaluations are consistent with the outcomes and evaluations required for Northwest accreditation. Both the objectives and the outcomes are listed in the self-study submitted Fall 2012.
Next, the faculty defined a set of “core classes,” which are required of all students. There are a series of three capstone classes
included in the core set; students must complete one of the capstone courses. Each course in the core was designated as a course that
(a) introduced an outcome topic, (b) reinforced an outcome, or (c) emphasized or evaluated the outcome. The table of core classes (as
defined at the end of the 2011-2012 academic year) is included in the self-study.
Finally, the faculty adopted a schedule for collecting student outcome data from the core classes, for analyzing those data, and for implementing improvements based on the analysis. The data collection schedule is also included in the self-study.
The department used an extension to the generic ADDIE model (an acronym that names the five common phases of instructional
development: Analysis, Design, Development, Implementation, and Evaluation) called work models to specify the articulation
between student assessment and the outcomes. Four work models (for CS 1410, CS 2420, CS 2450, and CS 3130) are included with
the self-study. In the process of collecting and analyzing the assessment data, several deficiencies in the process were identified and
numerous errors, exacerbated by the process deficiencies, were encountered.
Data Collection Deficiencies and Errors 2011-2012
Work models were not designed, nor were they intended, for this use. Consequently, the work model approach was found to be cumbersome and unmanageable. Specifically, work models, in this context, are inappropriate for the following reasons:

- Creating such work models for each class is too much of a burden in time and effort for an already overwhelmed faculty.
- When designing and redesigning a course, it is too difficult to project individual assessments accurately enough to create the work models.
- As a course and the concomitant assessments evolve, it is too much effort to maintain coherency between the work models and the course and its assessments.
- Collecting and reporting this amount of course-level data requires too much faculty time and effort.
- The work-model-based assessments produced too much data of dubious usefulness.
Although the above list highlights the flaws in the 2011-2012 data collection process, the process itself was not completed without errors. The data presented below correspond to four sections of CS 1410 spanning two semesters, but represent only part of the data used for determining student grades. Grades for all four sections were based on ten programming assignments, three programming tests, four midterms, and twelve worksheets. Students uploaded computer programs, for assignments and tests, to Blackboard, and completed the midterms and the worksheets on ChiTester. The instructor neglected to save any ChiTester data between Fall 2011 and Spring 2012. The instructor did attempt to save assessment data at the end of Spring 2012 but encountered problems with ChiTester and stopped. In the rush to prepare for Fall 2012, all data were flushed from ChiTester before they were saved. Therefore, the CS 1410 data analyzed below are based solely on the programming data, assignments and tests, harvested from Blackboard.
Unfortunately, the collected data do not directly reflect student attainment of the outcomes. In retrospect, this is hardly surprising given the underlying design of the data collection model. Work models were designed to ensure that the fine-grained instructional elements arising from an instructional decomposition were presented in an authentic, holistic context rather than in a fragmented, disjoint, and decontextualized fashion. Work models work well for this task, and, indeed, an examination of the work models included with the self-study demonstrates that the instructional elements supporting the outcomes are grouped into realistic and meaningful assignments. Nevertheless, the subcomponents of the assignments do not articulate well with individual outcomes, making any analysis difficult and problematic. These observed flaws and errors lead to the following recommendations.
Procedural Changes Proposed 2012-2013
The Computer Science faculty has recently adopted a set of rubrics for each of the learning outcomes. Each rubric specifies a series of specific performances and four distinct descriptors that measure the level of attainment achieved by a student. The new data collection process is based on this set of rubrics as follows.
1. Assessments measure a range of understanding differentiated by the four distinct attainment categories defined in each rubric.
2. A committee of faculty members who teach a core course oversees the course content. Course committees must identify the
outcomes and the performances for which a given course provides instruction and for which outcome data must be collected.
One or more assessments are specified for each outcome and for each performance.
3. The rubric-based scoring precludes the use of multiple choice questions. Various types of assessment are appropriate and are a
function of the level of treatment for that outcome:
a. Introduced: create one exam assessment evaluated via the accepted rubric.
b. Reinforced: create one exam assessment or one program/project assessment evaluated via the accepted rubric.
c. Emphasized: create two assessments, which are evaluated via the accepted rubric, from the following list
i. exam
ii. program
iii. project
iv. group / team work
4. At the end of the semester, each instructor is responsible for reporting the anonymized data to the assessment coordinator for
analysis and reporting.
CS 1410 Data Analysis 2011-2012
CS 1410 introduces outcomes 6, 7, and 11, and reinforces outcomes 3, 5, and 8. Although the instruction does introduce or reinforce the stated outcomes, the assessments reflect the student’s success in completing the overall tasks and are not able to directly measure the attainment of a specific outcome. This is clearly a flaw in the assessment design.
The following table summarizes the collected data and a very basic analysis of that data. Labs represent programming assignments that the students completed over approximately one week. Programming tests (PT) require the students to solve a simple problem and write a program expressing that solution within 60 minutes. Labs 2 and 3 represent three programs that were divided differently between the two labs in the fall and spring semesters based on the holidays occurring during the respective semesters; the data for the two labs are combined for reporting purposes. Score data > 1 indicate extra credit received.
Students who withdraw from the course at any time during the semester remain on the Blackboard roll. Data for these students skew all scoring data and significantly distort the non-submit data (the number of students who do not submit an assignment). Most of the zero-entries in the raw data represent assignments that were not submitted, but the data fail to note other causes for zero scores, such as withdrawing from the course. The second table normalizes the data by removing the assignments that were not submitted. Together, the two tables represent the extreme ranges of the data; the most reliable indicators probably lie between the extremes.
Raw data (all enrolled students; scores are fractions of possible points):

Lab 1 Lab 2/3 Lab 4 Lab 5 Lab 6 Lab 7 Lab 8 Lab 9 Lab 10 PT 1 PT 2 PT 3
Mean 0.82 0.76 0.80 0.66 0.62 0.57 0.68 0.66 0.57 0.88 0.73 0.73
Median 1.00 0.95 1.00 0.95 0.88 0.87 1.00 0.93 0.00 1.00 0.85 1.00
Mode 1.00 1.00 1.00 1.00 1.00 0.00 1.00 0.00 0.00 1.00 0.90 1.00
Students 116 116 116 116 116 116 116 116 116 116 116 116
Submits 101 104 100 88 84 68 84 81 53 108 98 91
Non-submits 15 12 16 28 32 48 32 35 63 8 18 25
Normalized data (non-submitted assignments removed):

Lab 1 Lab 2/3 Lab 4 Lab 5 Lab 6 Lab 7 Lab 8 Lab 9 Lab 10 PT 1 PT 2 PT 3
Mean 0.95 0.85 0.93 0.87 0.86 0.97 0.94 0.95 1.25 0.94 0.87 0.93
Median 1.00 0.97 1.00 1.00 0.96 1.00 1.00 1.00 1.40 1.00 0.90 1.00
Mode 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.75 1.00 0.90 1.00
Unfortunately but not surprisingly, the two sets of data are contradictory. For the data to be of value, they must be refined by cross-referencing them with student withdrawal data so that the true impact of unsubmitted assignments is reflected in the scores. Teaching commitments and reporting deadlines did not allow the needed cross-referencing.
Outcome 3
Students will be able to solve new problems and to express their new solutions appropriately.
Labs 1 through 6 and 10, and programming tests 1 and 2, entail problem solving. The worst-case data (first table) range from a high of 82% down to a low of 57%, which indicates an unacceptable number of students attaining this outcome. The best-case data (second table) show a low of 85%, which indicates an acceptable number of students attaining outcome 3. It is not possible to reliably determine the attainment of this outcome with the data in its present form.
Outcome 5
Students will have the knowledge and the skills needed to be employable, and to be immediately and continuously productive.
CS 1410 uses Microsoft Visual Studio (VS) as the primary development environment. Although VS is not the only development environment utilized in industry, it is nevertheless quite common and is taught by very few schools in the Utah Higher Education system. This gives WSU CS graduates a significant advantage in the local job market. All programming assignments and tests require practical programming. With the exception of the students who rarely attend class or submit assignments but who nevertheless do not drop the course, all students complete at least some of the assignments and therefore demonstrate a basic ability to use VS. This indicates an acceptable level of attainment for outcome 5.
Outcome 6
Students will have a basic understanding of computer theory, software design and operation, project management, databases,
networking, and computer hardware.
Outcome 6 is quite broad, and CS 1410 only addresses one of its sub-components: software design. Labs 8 and 9, and programming test 3, focus on the skills related to software design by exploring UML class diagrams and how to translate those diagrams into working code. These assignments suffer from the highest non-submission rates (exceeded only by the rate of non-submission for lab 7, the overloaded operators assignment). Many students complete these assignments and earn high scores, but the high number of non-submissions indicates that this outcome is not attained by a satisfactory number of students.
Outcome 7
Students will understand algorithm design and how to express and how to implement algorithms using a variety of notation,
programming languages, and paradigms.
14
Labs 2 and 3 address solving a problem, developing an algorithm to represent the solution, and expressing the algorithm as either pseudocode or a flow chart. The labs also entail translating pseudocode and flow charts into working programs. Even the worst-case data set (the first table) indicates a mean score of 76% and a median score of 95%. These scores demonstrate an acceptable level of attainment for outcome 7.
Outcome 8
Students will be able to debug computer programs.
All programming assignments and tests require students to debug their programs. Unfortunately, it is possible to make two interpretations of the data. As with the argument for outcome 5, it could be argued that most students, excepting those who rarely attend class or submit assignments but do not drop the course, complete at least some of the assignments and therefore demonstrate an attainment of outcome 8. But another plausible and likely interpretation is that the large number of non-submissions may be caused by students who are unable to debug their programs (and for some reason never seek the instructor’s help) and therefore do not submit them for grading. The likelihood of the second interpretation indicates an unacceptable level of attainment for this outcome.
Outcome 11
Students will be prepared for graduate studies in Computer Science and will have the necessary knowledge and skills to be accepted
into and succeed in relevant programs if they desire to continue their education in computer science.
CS 1410 introduces three topics supporting outcome 11: pointers and pointer operations, class relations, and polymorphism. Labs 8
and 9 are the primary assignments covering these topics. The scores vary by approximately 30% between the worst case data (first
table) and the best case data (second table), which does not conclusively demonstrate attainment of this outcome.
Conclusions
Given the current data, it is difficult to claim an acceptable rate of attainment for most of the learning outcomes. The first effort in data collection and analysis provided more data about the process than it did about the efficacy of instruction. Based on what was learned, the process has been thoroughly redesigned for the 2012-2013 academic year.
CS3130 – Assessment Results
Learning Outcome (Curriculum Map Entry), with assessments and results:

1. Professional and Ethical Behavior (R - Reinforced)
   Team Project #1: Qualitative - 95% of team members collaborated and conducted themselves professionally
   Team Project #2: Qualitative - 88% of team members collaborated and conducted themselves professionally
   Team Project #3: Qualitative - 96% of team members collaborated and conducted themselves professionally
3. Solve problems and express solutions (R - Reinforced)
   Exam #1 Questions: Quantitative - 85%
   Exam #2 Questions: Quantitative - 95%
   Exam #3 Questions: Quantitative - 75%
5. Knowledge and skills needed for employment (R - Reinforced)
   Team Project #1: Qualitative - 90% of team members sufficiently demonstrated knowledge-based, presentation, and other workplace skills
   Team Project #2: Qualitative - 85% of team members sufficiently demonstrated knowledge-based, presentation, and other workplace skills
   Team Project #3: Qualitative - 92% of team members sufficiently demonstrated knowledge-based, presentation, and other workplace skills
6. Theory, design, operation, etc. (R - Reinforced)
   Exam #1 Questions: Quantitative - 80%
   Exam #2 Questions: Quantitative - 79%
   Exam #3 Questions: Quantitative - 70%
7. Understand and use algorithms (R - Reinforced)
   Exam #1 Questions: Quantitative - 75%
   Exam #2 Questions: Quantitative - 95%
   Exam #3 Questions: Quantitative - 85%
11. Preparation for graduate studies (R - Reinforced)
   Exam #1 Questions: Quantitative - 75%
   Exam #2 Questions: Quantitative - 90%
   Exam #3 Questions: Quantitative - 73%
CS2420 – Assessment Results
For Summer Semester 2012, almost all proposed CS 2420 assessment methods for assignments and tests were implemented. Gathering data for these is somewhat cumbersome, as ChiTester currently does not have extensive features to provide question-by-question assessment data on essay-style questions. I am currently working with the developers who manage ChiTester to implement this for the future. For this report, a selection of the assessed items is given, with all data manually computed.
4. Hash tables - The final exam had two questions regarding knowledge of hash tables. One concerned the strengths of linked-list-based hash tables compared with array-based hash tables. Another sought to assess a student's understanding of the topic by asking them to propose slight modifications to the overall hash table insertion process to gain further optimizations.

The first question was essay based. Two points were awarded, one for indicating a valid strength of each. The average score was 1.44, with a standard deviation of 0.77. The second question was also essay based. The average score was 0.96, with a standard deviation of 0.84.
5. Algorithmic Efficiency - The final exam contained a number of questions related to Big-O notation. One question asked for the average cases of nine search and sort algorithms. Other questions asked for worst-case Big-O. The results are as follows:
When students were asked a question regarding average efficiency, nine sort and search algorithms were listed. One point was given for each correct response. Out of 25 students, the average score was 6.56, with a standard deviation of 1.98.
7.1 Sorted binary trees - The final exam displayed a sorted binary tree and asked the students to indicate what the traversal would be if done using the pre-order, in-order, and post-order traversal algorithms. A total of 5 points was awarded for the entire question. One point was awarded for getting each item correct. Two additional points were awarded if the answers were at least partially correct (for example, in a sequence of 10 numbers, all but one number may be listed correctly). Out of 25 students, the average score was 3.48 with a standard deviation of 1.7.
7.2 AVL Trees and B Trees - The final exam asked two questions related to this topic. One asked the student to list two
preferable characteristics of a B tree compared to an AVL tree. The second asked the student to describe the state of a
given B tree after a given value was inserted into it.
The first question was essay based. One point was awarded for correctly listing one characteristic. Two points were
awarded for listing the other. The average score was 2.48, with a standard deviation of 0.59. The second question was
also essay based. Three points total were given. All points were awarded for having the correct answer. Partial credit
was given for answers that were either partially correct or displayed partial understanding of the underlying topic. The
average score was 1.92, with a standard deviation of 1.11.
b. Evidence of Learning: High Impact or Service Learning (duplicate this page as needed)

No community-based learning (CBL) or independent study courses were assessed for this current report.
c. Evidence of Learning: General Education Courses (duplicate this page as needed or delete if department does not offer GE courses)
Computer Science offers one (1) General Education course in the Creative Arts (CA) area: CS 1010 CA – Introduction to Interactive Entertainment
The department is currently actively involved in the Gen Ed CA subcommittee charged with assessing Learning Outcomes for all courses in this Gen Ed grouping, and will follow the assessment strategies and directives established by that subcommittee.
G. Summary of Artifact Collection Procedure
Summary Information (as needed)

All artifacts indicated below were collected at the time the project or quiz/exam was administered, as listed on the course syllabus. All artifact results are stored electronically on Canvas and/or Chi-tester.

Source of Assessment Data for CS1410 (CS 1410 Object-Oriented Programming in C++). Each entry lists the instructional content, the student outcomes it supports, and the assessments used:

1. Basics
   1.1. Using Microsoft Visual Studio (outcome 5) - Programs 1-11
   1.2. The compilation process: the preprocessor, the compiler, the linker (outcome 5) - Programs 1-11
   1.3. Multi-file programs (outcome 5) - Programs 4, 6-11
2. Simple programs: variables, constants, operators, & casting (outcomes 1, 3, 5, 6, 8, 11) - Programs 1-11; Exam 1: 1-23; Exam 2: 14-15
3. Program using flow-of-control statements: if, switch, for, while, do, break, and continue (outcomes 1, 3, 5, 6, 8, 11) - Programs 2, 3, 5, 11; Exam 1: 24-58
4. Structures and enumerations (outcomes 1, 3, 5, 6, 8, 11) - Exam 2: question 1
   4.1. Fields / members - Program 4
   4.2. Pointers and references (content vs. address, address-of and indirection operators) - Program 4; Exam 2: 2 & 3; Exam 4: 24
5. Functions (outcomes 1, 3, 5, 6, 7, 8, 11)
   5.1. Definition - Programs 4, 6-11
   5.2. Declaration / prototype - Programs 4, 6-11
   5.3. Calls (pass-by-value, reference, and pointer) - Programs 4, 6-11; Exam 2: 4-6, 18-20; Exam 4: 25
   5.4. Function overloading, recursion, and default arguments - Program 6; Exam 2: 22-23
6. Arrays, array function arguments (outcomes 1, 3, 5, 6, 8, 11) - Program 5; Exam 2: 9-12, 23-25, 27
7. C-strings and string objects (outcomes 1, 2, 3, 5, 6, 8, 11)
   7.1. C-string functions - Programs 5, 11; Exam 2: 7-8, 26
   7.2. string class member functions - Programs 5, 11
   7.3. Command line arguments: argc & argv - Program 5; Exam 2: 15-17
   7.4. ASCII codes
8. Classes and objects (outcomes 1, 3, 5, 6, 8, 11)
   8.1. Encapsulation, member data and functions, access modifiers (public, private, & protected) - Programs 6-10; Exam 2: 16-18; Exam 3: 26-29, 39-41, 43-44
   8.2. Constructors and destructors; the copy constructor; conversion constructors - Programs 6-10; Exam 3: 1-2, 33-35
   8.3. The this pointer
9. Class relations (outcomes 1, 2, 3, 5, 6, 7, 8, 11) - Exam 3: 42
   9.1. UML diagrams - Programs 9 & 10; Exam 2: 11-15
   9.2. Implementing class relations in C++: inheritance, association, aggregation, composition, & dependency - Programs 9 & 10; Exam 2: 19-25
10. Polymorphism (outcomes 1, 3, 5, 6, 8, 11)
   10.1. Virtual functions, casting, and function overriding - Program 10; Exam 4: 12-19, 21-22
   10.2. Pure virtual functions and abstract classes - Program 10
11. Overloaded operators (outcomes 1, 3, 5, 6, 8, 11) - Exam 3: 38
   11.1. Overloading arithmetic operators and >> and << - Programs 7-9; Exam 2: 3-10; Exam 3: 30-32
   11.2. friend functions - Programs 7-9; Exam 3: 36-37
12. Memory management (outcomes 1, 3, 5, 6, 8, 11)
   12.1. Static versus dynamic instantiation - Program 10
   12.2. Stack and heap - Program 10
   12.3. new and delete operators - Program 10
13. I/O stream classes: ifstream, ofstream, fstream (outcomes 1, 2, 3, 5, 6, 8, 11) - Program 11
   13.1. Stream functions - Program 11; Exam 4: 23
   13.2. Text versus binary files - Exam 4: 7-11
   13.3. Manipulators and formatting functions - Program 11
   13.4. Error detection: good, bad, fail - Program 11
14. Templates (outcomes 1, 3, 5, 6, 8, 11) - Program 10; Exam 4: 3-4, 20
15. Exceptions (outcomes 1, 3, 5, 6, 8, 11)
   15.1. The purpose of exceptions - Exam 4: 1-2
   15.2. try / catch blocks - Exam 4: 5-6

*** Source of Assessment Data for CS2420:
Instructional Content CS 2420 Introduction to Data Structures and Algorithms
Student Outcomes
Assessment
1. Review of CS 1410 concepts 2, 3, 5, 6, 7, 8, 11
Two or three challenging homework assignments are given as review. A common assignment is a Big Int calculator class that performs addition, subtraction, multiplication, and division for both negative and positive numbers. Another is a fully functional Roman numeral class with similar mathematical operators. Each assignment reviews five to seven of the nine review items listed under item 1 of the content list. It takes roughly three to four weeks to review all concepts through homework and lecture. Assessment is done with weekly quizzes on these concepts. Homework assignments are also graded. These concepts are all assessed in a midterm.
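The Roman numeral review assignment described above centers on converting between representations. A minimal sketch of one direction, assuming simple subtractive notation; the function name and free-function interface are illustrative, not the assignment's actual class design:

```cpp
#include <cstddef>
#include <string>
#include <unordered_map>

// Convert a Roman numeral string to an int. A smaller symbol appearing
// before a larger one (e.g., the I in IV) subtracts; otherwise it adds.
int romanToInt(const std::string& s) {
    static const std::unordered_map<char, int> values{
        {'I', 1}, {'V', 5}, {'X', 10}, {'L', 50},
        {'C', 100}, {'D', 500}, {'M', 1000}};
    int total = 0;
    for (std::size_t i = 0; i < s.size(); ++i) {
        int v = values.at(s.at(i));
        if (i + 1 < s.size() && v < values.at(s.at(i + 1)))
            total -= v;  // subtractive pair, e.g., IV or CM
        else
            total += v;
    }
    return total;
}
```

The full class version would pair this with the reverse conversion and overloaded arithmetic operators.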
2.2 and 2.3 Singly linked lists and iterators
2, 3, 5, 6, 7, 8, 11
An initial homework assignment has students implement additional methods for a linked list class: deleting nodes by value, deleting all nodes matching a value (in one pass), deleting the smallest item, and finding the kth element and returning its info. Iterators are added to this assignment. Students must make their iterators act similarly to STL list iterators, with a few modifications. The iterators must support operator overloads for +, -, ++, and --, an overloaded * for dereferencing, and an overloaded [] for array-like access. Sample code is given in main() which provides test cases to ensure the student code meets the expected output. Assessment is again done with weekly quizzes on these concepts. The homework assignment is also graded. These concepts are all assessed in a midterm.
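The one-pass delete-all-by-value method mentioned above is the trickiest of these, since it must unlink matching nodes at the front as well as in the middle. A hedged sketch, assuming a bare node of ints; the course's actual class and method names will differ:

```cpp
// Minimal singly linked list node (illustrative, not the course's class).
struct Node {
    int info;
    Node* next;
};

// Remove every node whose info equals value in a single pass over the
// list, and return the (possibly new) head pointer.
Node* deleteAllByValue(Node* head, int value) {
    while (head != nullptr && head->info == value) {  // matches at the front
        Node* dead = head;
        head = head->next;
        delete dead;
    }
    for (Node* cur = head; cur != nullptr && cur->next != nullptr; ) {
        if (cur->next->info == value) {   // unlink the node after cur
            Node* dead = cur->next;
            cur->next = dead->next;
            delete dead;
        } else {
            cur = cur->next;
        }
    }
    return head;
}
```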
2.4 Doubly linked lists and 3. Stacks and queues
2, 3, 5, 6, 7, 8, 11
A homework assignment covering stacks and queues is given, along with a lecture on stacks, queues, and priority queues. The expected implementation is a class which handles all functionality of stacks, queues, and priority queues, but does so internally using a doubly linked list. Students are required to modify their prior singly linked list into a doubly linked list, and then implement all necessary stack, queue, and priority queue methods. Sample code is given in main() which provides test cases to ensure the student code meets the expected output. Assessment is again done with weekly quizzes on these concepts. The homework assignment is also graded. These concepts are all covered in a midterm.
2.5 Circular linked lists
7, 11
This topic is only lectured. Occasionally it is covered in a midterm.
4. Hash tables
2, 3, 5, 6, 7, 8, 11
A homework assignment for hash tables is given. The student must write his or her own hash algorithm. The resulting object must be stored in the hash table, which is internally implemented as an array of linked lists. The homework covers closed hashing. The assignment also ties together multiple review concepts from content list item #1 in ways that students typically have not yet encountered. Specifically, students must learn to work with multiple classes simultaneously, understand how to properly work with pointers as arrays, and know how to create many linked lists in an array. Sample code is given in main() which provides test cases to ensure the student code meets the expected output. Open hashing, array-based concepts, and probing techniques are lectured but not assessed. Assessment is again done with weekly quizzes on these concepts. The homework assignment is also graded. These concepts are all covered in a final exam.
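The array-of-linked-lists layout described above can be sketched as below. Both the class shape and the polynomial hash function are illustrative stand-ins; the assignment requires the student's own hash algorithm and linked list:

```cpp
#include <cstddef>
#include <list>
#include <string>
#include <vector>

// A fixed-size table of buckets; each bucket chains colliding keys.
// std::list stands in for the student-written linked list.
class ChainedHashTable {
    std::vector<std::list<std::string>> buckets;
public:
    explicit ChainedHashTable(std::size_t n) : buckets(n) {}

    // One simple string hash (base-31 polynomial), reduced to a bucket index.
    std::size_t hash(const std::string& key) const {
        std::size_t h = 0;
        for (char c : key) h = h * 31 + static_cast<unsigned char>(c);
        return h % buckets.size();
    }

    void insert(const std::string& key) { buckets[hash(key)].push_back(key); }

    bool contains(const std::string& key) const {
        for (const auto& k : buckets[hash(key)])  // scan only one bucket
            if (k == key) return true;
        return false;
    }
};
```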
5. Algorithmic efficiency
2, 6, 11
This topic is covered in every subsequent lecture. As each new algorithm is described, its efficiency in time and space is analyzed. This is heavily tested in both quizzes and the final exam. One variation of an upcoming sort assignment has students identify which sort algorithms are used by measuring how long each takes to complete.
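The identify-the-sort-by-timing exercise rests on a small measurement harness like the sketch below; the function name and setup are illustrative. Students would plot the measured time against n to see whether it grows like n log n or n^2:

```cpp
#include <algorithm>
#include <chrono>
#include <vector>

// Time one sort of the given data and return elapsed seconds.
// steady_clock is used because it is monotonic (wall-clock adjustments
// cannot make the measurement negative).
double sortSeconds(std::vector<int> data) {  // by value: sort a private copy
    auto start = std::chrono::steady_clock::now();
    std::sort(data.begin(), data.end());
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(stop - start).count();
}
```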
6. Sort and search algorithms
2, 3, 5, 6, 7, 8, 11
Each search and sort algorithm is heavily tested in both quizzes and the final exam. Because textbooks supply these algorithms freely, the assignment does not require students to solve a problem by implementing code. Rather, the student needs to provide a visual display of how sorting actually proceeds. One variation of the assignment has students identify which sort algorithms are used by measuring how long each takes to complete.
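A visualization of this kind typically hooks a redraw into the sort's inner loop. As an illustrative example (not the assignment's required algorithm), selection sort exposes a natural hook point after each swap:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Selection sort: each outer pass extends the sorted prefix data[0..i].
// A visual display assignment would redraw the array where noted.
void selectionSort(std::vector<int>& data) {
    for (std::size_t i = 0; i + 1 < data.size(); ++i) {
        std::size_t min = i;  // index of the smallest remaining element
        for (std::size_t j = i + 1; j < data.size(); ++j)
            if (data[j] < data[min]) min = j;
        std::swap(data[i], data[min]);  // update the visual display here
    }
}
```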
7.1 Sorted binary trees 2, 3, 5, 6, 7, 8, 11
A homework assignment is given which requires the student to take a normal mathematical expression given as a C string, place it into a parse tree, and then compute the solution to that expression. The student also needs to print the expression out again from the tree in pre-order, in-order, and post-order (Reverse Polish notation) fashion. Occasionally functors are included as part of the implementation for this assignment. Traversal methods are frequently tested in both quizzes and in the final exam.
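The traversal half of that assignment can be sketched as below; the node layout and function name are illustrative, not the assignment's required design. Post-order traversal of an expression tree yields exactly the Reverse Polish notation mentioned above:

```cpp
#include <string>

// Minimal expression tree node: a token is either an operator or operand.
struct ExprNode {
    std::string token;
    ExprNode* left = nullptr;
    ExprNode* right = nullptr;
};

// Post-order walk: children first, then the node itself, which produces
// Reverse Polish notation for an expression tree.
std::string postOrder(const ExprNode* n) {
    if (n == nullptr) return "";
    std::string out;
    if (n->left)  out += postOrder(n->left) + " ";
    if (n->right) out += postOrder(n->right) + " ";
    return out + n->token;
}
```

Swapping the order of the three steps gives the pre-order and in-order printouts the assignment also requires.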
7.2 AVL trees and B trees 2, 3, 5, 6, 7, 8, 11
Due to the lack of time typically available at the end of each semester, only one of these two is assessed in a homework assignment. The assignment is fairly straightforward: each tree needs a handful of commonly used methods. The textbook provides code for some and concepts for others; the assignment is to complete the methods for which the book did not provide code. Insertion and deletion algorithms are assessed in both quizzes and the final exam.
8 Graphs 2, 3, 5, 6, 7, 8, 11
A homework assignment is given in which students receive a PDF containing a graph of roughly 20-30 nodes and 50-70 edges. The student then needs to provide a program which asks the user for a starting node, and then lists the shortest path and path sequence to each other node. The student also needs to print out the graph using breadth-first and depth-first traversal to ensure the graph was implemented correctly in code. Breadth-first, depth-first, and Dijkstra's algorithm are covered on the final exam. They are not covered in a quiz, as the semester is drawing to a close.
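The shortest-path portion of the graph assignment is commonly solved with Dijkstra's algorithm; a hedged sketch over an adjacency list is below. The representation and function signature are illustrative, and the small graph in the usage is a stand-in for the 20-30 node graph distributed as a PDF:

```cpp
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// Dijkstra's algorithm: adj[u] lists (neighbor, weight) pairs; returns the
// shortest distance from src to every node (INT_MAX if unreachable).
std::vector<int> dijkstra(const std::vector<std::vector<std::pair<int, int>>>& adj,
                          int src) {
    std::vector<int> dist(adj.size(), std::numeric_limits<int>::max());
    using State = std::pair<int, int>;  // (distance so far, node)
    std::priority_queue<State, std::vector<State>, std::greater<State>> pq;
    dist[src] = 0;
    pq.push({0, src});
    while (!pq.empty()) {
        auto [d, u] = pq.top();
        pq.pop();
        if (d > dist[u]) continue;  // stale queue entry; skip it
        for (auto [v, w] : adj[u])  // relax each outgoing edge
            if (dist[u] + w < dist[v]) {
                dist[v] = dist[u] + w;
                pq.push({dist[v], v});
            }
    }
    return dist;
}
```

Recording each node's predecessor during relaxation would also recover the path sequence the assignment asks for.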
*** Source of Assessment Data for CS2450:
Instructional Content CS 2450 Software Engineering I
Student Outcomes
Assessment
1.1. Steps to problem solving
3, 7, 9
Problem solving consists of six steps:
1. Identify the problem (What is the problem?)
2. Understand the problem (What is involved with the problem? What does the client want? Maybe the client does not know what they want. Make sure you know the client.)
3. Identify alternative ways to solve the problem (Create a list. Maybe talk with others. Make sure they could be acceptable solutions.)
4. Select the best way to solve the problem from the list of alternative solutions (What are the pros and cons of each solution?)
5. List the instructions that enable you to solve the problem using the selected solution (Create a numbered list of instructions.)
6. Evaluate the solution (Did it satisfy the needs of the client with the problem?)
Use these steps to solve a problem such as:
- What to do this evening?
- Where to eat dinner?
1.2. Why projects fail 1, 2, 3, 9, 10 Find a failed software project. Create a PowerPoint with graphics and sources as to why it failed (you can use http://www.codinghorror.com/blog/2006/05/the-long-dismal-history-of-software-project-failure.html as a resource to find a project). There should be one slide describing the project, one slide describing why it failed, and one slide with your source(s).
1.7. Working as a team 4 Fill out a group survey and discuss different personalities. Applied throughout the semester as the professor meets with teams to discuss and re-emphasize personalities.
2.1. System request
2-6, 9, 10 Create a system request similar to the one on page 61, using Professor Anderson as the Project Sponsor. The business need will be to improve the program. Then look at page 58 and create a feasibility analysis including the technical, economic, and organizational aspects, similar to the one on page 63. The economic analysis might be difficult depending on your system request, but try your best. You can also use the project sponsor as a resource for information. There is no page requirement; just make sure you do a thorough job and think about the opportunity costs (if you do this, you can't do something else) and the ROI (return on your investment: is this project better to do than another?).
2.2. Selecting a project
4,10 As a team, think about your Computer Science Department and choose an idea that could improve student satisfaction within your educational experience. Create a system request similar to the one on page 61 using Professor Anderson as the Project Sponsor. The Business need will be to improve the program.
3.0 Managing the project
2,3,9 Chapter 3, questions 2, 5, 7, 11
3.3.2. Project charter 2,4,9 Page 95. Do 3-4, the project charter.
4.3. Requirements strategies 2,4,9 Chapter 4, questions 1-2, 5, 15
4.4. Gathering requirements 3,4,9,10 Create a list of questions for the client (the professor) regarding your system request. Email the list to the client by Jan 31st at midnight. When the client responds, use that information plus all other information you have gathered to create a list of the functional and nonfunctional business requirements for your system request.
5.1. Activity diagrams 3,4,5,9,10 Based upon the current project create an activity diagram and review the diagram as a team
5.2. Use case diagrams 3,4,5,9,10 Based upon the current project create a use case diagram and review the diagram as a team
6.2. CRC cards 3,4,5,9,10 Using the provided template, fill out the CRC cards for your project.
6.3. Class diagrams 3,4,5,9,10 As a team, create a class diagram for your project.
7.1. Sequence diagrams 3,4,5,9,10 Based upon the current project, create a sequence diagram and then review it with your team.
7.2. CRUD analysis 3,4,5,9,10 As a team, perform a CRUD analysis for your system.
8.1. Validating the analysis 3,4,5,9,10 Perform a walkthrough with your peers validating the activity, use case, sequence, and class diagrams.
9.2. Normalization 3,4,5,9,10 As a team, create an ERD.
10. Human computer interface 3,4,5,9,10 For the assigned project, design the graphical user interface to meet the client’s needs within the scope of the project. As a team, review the documentation and confirm that the GUI does indeed meet functional requirements.
11.2. Deployment diagram 3,4,5,9,10 Create a deployment diagram for the current project and then review it with your team
11.3. Security requirements 3,4,5,9,10 Determine any security requirements for the current project.
12.1. Testing plan 3,4,5,9,10 Create a plan to test the project to ensure that it meets all functional and non-functional requirements.
12.2. Maintenance plan 3,4,5,9,10 Create a maintenance plan for the project to ensure that future changes will be handled according to the strategy defined within the scope of the project.
*** Source of Assessment Data for CS3130:
Instructional Content CS 3130 Computational Structures
Student Outcomes Assessment
Outcomes 1 2 3 4 5 6 7 8 9 10 11
1. Discrete Math Structure X X X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1
1.1. Definition X X X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1
1.2. Operations X X X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1
1.3. Properties of Operations X X X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1
2. Application and Theory of Sets X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.1. Set notation and definition X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.2. Elements and member of a Set X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.3. Subsets X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.4. Operations on Sets, including Intersection, Union, Difference, Symmetric Difference X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.5. Algebraic Properties of Set operations X X X X X
Quiz #1,#2/Exam #1/Final Exam
2.6. The Addition Principle and its Application X X X X X Quiz #1,#2/Exam #1/Final Exam
2.7. Computer Implementation of Sets X X X X X X X Programming Assignment #1
3. Functions
X X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1 Programming Assignment #1
3.1. Specialized form of Relation
X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1 Programming Assignment #1
3.2. Functions as a mapping between sets
X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1 Programming Assignment #1
3.3. Domain, Co-Domain, and Range
X X X X X X X
Quiz #1,#2/Exam #1/Final Exam Team Assignment #1 Programming Assignment #1
3.4. Composition of three or more functions X X X X X
Quiz #2/Exam #1/Final Exam
3.5. Properties of Functions X X X X X
Quiz #2/Exam #1/Final Exam Programming Assignment #1
3.5.1. One-to-one correspondence (bijection) X X X X X
Quiz #2/Exam #1/Final Exam Programming Assignment #1
3.5.2. Everywhere defined X X X X X
Quiz #2/Exam #1/Final Exam Programming Assignment #1
3.5.3. Onto X X X X X
Quiz #2/Exam #1/Final Exam Programming Assignment #1
3.5.4. Invertible X X X X X
Quiz #2/Exam #1/Final Exam Programming Assignment #1
3.6. Functions for Computer Science X X X X X X X
Quiz #1/Exam #1/Final Exam
3.6.1. Characteristic Function X X X X X X X Programming Assignment #1
3.6.2. Floor function X X X X X X X Quiz #2
3.6.3. Ceiling function X X X X X X X Quiz #2
3.6.4. Hashing function X X X X X X X Quiz #2
4. Propositions and Logical Operations X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.1. Types of Statements – Declarative, Interrogative, etc. X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.2. Propositional Variables X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.3. Truth Tables X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.4. Negation, Conjunction, Disjunction, Biconditional X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.5. Implications (hypothesis and conclusion) X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.6. Predicates and Quantifiers X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.6.1. Universal Quantifier X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.6.2. Existential Quantifier X X X X X X
Quiz #3,#4/Exam #2/Final Exam
4.7. Properties of Operations on Propositions X X X X X X
Quiz #3,#4/Exam #2/Final Exam
5. Logic Programming X X X X X X X
Individual Assignment #2, #3/Exam #2
5.1. Prolog syntax and relations X X X X X X X Individual Assignment #2, #3/Exam #2
5.2. Application of Prolog Facts and Rules X X X X X X X
Individual Assignment #2, #3/Exam #2
5.3. Modeling Real-world relationships using Prolog X X X X X X X
Individual Assignment #2, #3/Exam #2
5.4. Recursion X X X X X X X
Individual Assignment #2, #3/Exam #2
6. Boolean Algebras and Circuit Design X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.1. Boolean Polynomials X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.2. Lattices and Partially Ordered Sets X X X X X Quiz #3
6.3. Digital Logic Gates X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.3.1. AND gate X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.3.2. OR gate X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.3.3. NOT gate X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.4. Circuit Design X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.4.1. Relationship with Boolean Expressions and Truth Tables X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.5. Sum of Products Expression X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Individual Assignment #3
6.5.1. Minimization of Sum of Products Expression X X X X X X
Quiz #3,#4/Exam #2/Final Exam
6.5.2. Karnaugh Maps for minimizing number of circuit components X X X X X
Quiz #3,#4/Exam #2/Final Exam
7. Algorithms and the Growth of Functions X X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Team Assignment #1
7.1. Computational Complexity X X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Team Assignment #1
7.2. Definition of big-O X X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Team Assignment #1
7.3. Definition of big-Θ X X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Team Assignment #1
7.4. Interpreting algorithms expressed as pseudocode X X X X X X X X X X X
Team Assignment #1
7.5. Recursion X X X X X X X X X
Exam #1 Team Assignment #1 Individual Assignment #2
7.6. Rules for determining the Θ-class of a Function X X X X X X X X
Quiz #3,#4/Exam #2/Final Exam Team Assignment #1
8. Integers and Counting X X X X X X
Quiz #4/Exam #2/Final Exam
8.1. Properties of Integers X X X X X X
Quiz #4/Exam #2/Final Exam
8.1.1. Prime, LCM, GCD X X X X X X X Quiz #4/Exam #2/Final Exam Individual Assignment #4
8.2. Integer Representations (Base n expansions) X X X X X X X
Quiz #4/Exam #2/Final Exam
8.3. Permutations
X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
8.4. Combinations
X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
8.5. The Pigeonhole Principle X X X X X Exam #2
9. Discrete Probability
X X X X X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
9.1. Sample Spaces
X X X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
9.2. Events
X X X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
9.3. Assigning Probabilities to Events
X X X X X X X X X
Quiz #4/Exam #2/Final Exam Individual Assignment #4 Team Assignment #2
10. Boolean Matrices X X X X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #2
10.1.Elements X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.1.1. Zero Matrix X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.1.2. Identity (Diagonal) Matrix X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.2.Operations X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.2.1. Meet X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.2.2. Join X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.2.3. Boolean product X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
10.3.Properties X X X X X X
Quiz #5/Final Exam Individual Assignment #5
11. Relations and Digraphs X X X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.1.Partitions and Coverings X X X X X X X X X
Final Exam Team Assignment #3
11.2.Relations and Sets X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.3.Relations and Functions X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.4.Relations and Boolean Matrices X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.5.Representing relations as Digraphs X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5
38
Team Assignment #3
11.5.1. In-degree of nodes X X X X X X X X
Quiz #5 Team Assignment #3
11.5.2. Out-degree of nodes X X X X X X X X
Quiz #5 Team Assignment #3
11.5.3. Paths and Cycles X X X X X X X X
Quiz #5/Final Exam Team Assignment #3
11.6.Connectivity Relation X X X X X X X X X X
Quiz #5/Final Exam Team Assignment #3
11.7.Properties of Relations X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.7.1. Reflexive and Irreflexive X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.7.2. Symmetric, Antisymmetric, and Asymmetric X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.7.3. Transitive X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.8.Closures X X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.8.1. Reflexive, Symmetric, and Transitive Closures X X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
11.8.2. Warshall’s Algorithm X X X X X X X X X
Quiz #5/Final Exam Individual Assignment #5 Team Assignment #3
12. Trees X X X X X X X X
Quiz #6/Final Exam Team Assignment #3
12.1.Definition of Trees X X X X X X X X
Quiz #6/Final Exam Team Assignment #3
12.2.Tree levels, parents, siblings, leaves, vertex X X X X X X X X
Quiz #6/Final Exam Team Assignment #3
12.3.N-trees X X X X X X X X
Quiz #6/Final Exam Team Assignment #3
12.4.Binary Trees and Complete Binary Trees X X X X X X X X
Quiz #6/Final Exam Team Assignment #3
13. Sequences, Strings, and Regular Expressions X X X X X X
Quiz #6/Final Exam Individual Assignment #6
13.1.Infinite and finite sequences X X X X X X
Quiz #6/Final Exam Individual Assignment #6
13.2.Recurrence relations X X X X X X
Quiz #6/Final Exam Individual Assignment #6
13.3.Sets corresponding to a sequence X X X X X X
Quiz #6/Final Exam Individual Assignment #6
13.4.Regular Expression Alphabet X X X X X X
Quiz #6/Final Exam Individual Assignment #6
13.5.Regular Expression over a Set X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14. Languages and Grammars X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.1.Natural Language vs. Computer Language X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.2.Phrase Structure Grammar X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.3.Terminals and Nonterminals X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.4.Production Rules X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.5.Derivation Trees X X X X X X
Quiz #6/Final Exam Individual Assignment #6
14.6.Regular Grammars and Regular Expressions X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15. Machines and Languages X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.1.Finite State Machines X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.1.1. States and Alphabet X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.1.2. State transition table X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.1.3. Acceptance States X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.2.Language of a Machine X X X X X X X
Quiz #6/Final Exam Individual Assignment #6
15.3.Moore machine X X X X X X X
Quiz #6/Final Exam Individual Assignment #6
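The finite state machine topics above (states, alphabet, transition table, acceptance states) can be tied together with a small executable sketch. The machine here, which accepts binary strings containing an even number of 1s, is an illustrative example, not material from the course:

```cpp
#include <map>
#include <set>
#include <string>
#include <utility>

// Run a two-state machine over a binary input string and report whether it
// halts in an accepting state. State 0 = "even 1s seen", state 1 = "odd".
bool acceptsEvenOnes(const std::string& input) {
    // Transition table: (state, input symbol) -> next state.
    const std::map<std::pair<int, char>, int> delta{
        {{0, '0'}, 0}, {{0, '1'}, 1},
        {{1, '0'}, 1}, {{1, '1'}, 0}};
    const std::set<int> accepting{0};  // acceptance states
    int state = 0;                     // start state
    for (char c : input) state = delta.at({state, c});
    return accepting.count(state) > 0;
}
```

A Moore machine differs only in also attaching an output to each state.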
Please respond to the following questions.
1) Reflecting on this year’s assessment(s), how does the evidence of student learning impact your faculty’s confidence in the program being reviewed; how does that analysis change when compared with previous assessment evidence? To answer this question, compare evidence from prior years to the evidence from the current year. Discuss trends of evidence that increase your confidence in the strengths of the program. Also discuss trends of concern (e.g. students struggling to achieve particular student outcomes).

Because of the CS department’s immediate goal to earn ABET accreditation, a more effective means of gathering assessment data is necessary, as noted in the department’s revised strategy described in 3) below. Because we are in the early stages of developing valid and reliable measures that align with accreditation standards, interpreting the current year’s assessment data and its connection with the department’s stated learning outcomes is problematic, since proper correlations between student performance on projects/exams and learning objectives are not well established. Nonetheless, through regular feedback from industry partners and student interaction over the last several years, the faculty has confidence that the program as presently constituted provides instruction and mentorship that meets and, in some cases, exceeds the needs and expectations of our student population and the employer community.
2) With whom did you share the results of the year’s assessment efforts?
Results are shared with College administrators and advisors.
3) Based on your program’s assessment findings, what subsequent action will your program take?
Current Learning Outcomes Assessments
Section G above contains four example work models that articulate instruction, assessment, and our adopted student outcomes. These
work models were created for our Northwest accreditation evaluation that took place this past Spring Semester. Although the design
showed initial promise, we believe that it is infeasible for the following reasons:
Creating such work models for each class is too much of a burden in time and effort for an already overwhelmed faculty.
When designing and redesigning a course, it is too difficult to project individual assessments accurately enough to create the
work models.
As a course and the concomitant assessments evolve, it is too much effort to maintain coherency between the work models and
the course and its assessments.
Collecting and reporting this amount of course-level data requires too much faculty time and effort.
The work model based assessments will produce too much data of dubious usefulness.
Proposed Learning Outcomes Assessments
We propose the following revised assessment strategy:
1. Assessments should measure a range of understanding (e.g., should have three or four distinct scores differentiated by rubric).
This implies that the questions should not be implemented as multiple choice questions.
2. The faculty shall create and approve a rubric for each learning outcome. Each rubric must be applied fairly based on course
level (e.g., a 1410 student would not be expected to perform at the same level as a 4790 student when assessing “read technical
literature and learn on their own”).
3. From the Curriculum Map, identify the outcomes for which your course provides instruction and create assessments for each
outcome.
4. The level of assessment is a function of the level of treatment for that outcome:
a. Introduced: create one exam assessment evaluated via the accepted rubric.
b. Reinforced: create one exam assessment or one program assessment evaluated via the accepted rubric.
c. Emphasized: create two assessments, which are evaluated via the accepted rubric, from the following list
i. exam
ii. program
iii. project
iv. group / team work
5. At the end of the semester, each instructor is responsible for reporting the anonymous data to the assessment coordinator.