Supplemental Appendices
ABET
Self-Study Report
for the
B.S. in Computer Science
at
Lamar University
Beaumont, Texas
June 30, 2013
CONFIDENTIAL
The information supplied in this Self-Study Report is for the confidential use of ABET
and its authorized agents, and will not be disclosed without authorization of the
institution concerned, except for summary data not identifiable to a specific institution.
Table of Contents
Appendix E – Assessment Methodology 2013-2014.......................................................... 3
E.1 – Procedures for Direct Measure of Student Outcomes .......................................... 4
E.2 – Procedures for Indirect Measure of Student Outcomes ...................................... 25
Appendix F – Indirect Measure Assessment Instruments 2013-2014 .............................. 28
F.1 – Form for Student Course and Instructor Evaluations ......................................... 29
F.2 – Form for Exit Interview ...................................................................................... 31
F.3 – Form for Exit Survey .......................................................................................... 37
F.4 – Form for Alumni Survey .................................................................................... 40
Appendix G – Assessment Results & Analysis 2012-2013 .............................................. 44
G.1 – Direct Measure Results and Assessment Analysis 2012-2013 ........................... 45
G.2 – Direct Measure Results Summary: Student Learning Outcomes 2012-2013 ..... 77
G.3 – Indirect Measure Results: Student Evaluation Summary 2012-2013 ................. 79
G.4 – Indirect Measure Results: Exit Interview Summary 2012-2013 ......................... 85
G.5 – Indirect Measure Results: Exit Survey Summary 2012-2013 ............................. 87
G.6 – Indirect Measure Results: Alumni Survey Summary 2010-2011 ....................... 89
G.7 – Indirect Measure Results: Advisory Board Feedback 2012-2013 ...................... 90
G.8 – Indirect Measure Results: ETS Exams 2012-2013 ............................................. 92
Appendix H – Curriculum Map ....................................................................................... 93
Appendix I – Department Programming Documentation Standard ................................ 109
Appendix J – Meeting Minutes 2012-2013 .................................................................... 111
Appendix K – Course Schedules 2012-2013 .................................................................. 137
Appendix L – Advisement by STARS............................................................................ 144
L.1 – Lamar Enrollment Agreement .......................................................................... 144
L.2 – Advising Communication Timeline – Fall Semester ....................................... 146
L.3 – Lamar Retention Programs ............................................................................... 148
L.4 – Tutor Request Form .......................................................................................... 150
Appendix E – Assessment Methodology 2013-2014
Sources of Assessment Data
Direct Measures
1. Rubrics and Test Questions for evaluating direct performance criteria
Indirect Measures
1. Student Response Questions on Course Evaluations: administered every semester
2. Exit Interviews of Graduating Seniors: administered every semester in Senior
Seminar (COSC 4172).
3. Exit Surveys of Graduating Seniors: administered every semester in Senior
Seminar (COSC 4172).
4. Alumni Surveys: administered every two years
5. Advisory Board Feedback
6. Standardized ETS Exams: administered to graduating seniors every semester in
Senior Seminar (COSC 4172).
E.1 - Procedures for Direct Measure of Student Outcomes
Department of Computer Science, Lamar University
Summer 2013
Criteria Used to Evaluate Rubrics and Test Questions for Direct Measures
The department has decided that instead of using average and STDEV, we will focus
upon the percentage of students who are adequate or better in 2012-2013. The target is at
least 80% of those students who pass a course will meet each performance criterion in
2012-2013. Likewise, the target will be at least 80% for the students in a course to
demonstrate acceptable work on each performance criterion.
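As an illustration only, the 80% target calculation amounts to a simple ratio over the passing students. The cutoff values, score data, and function name in this sketch are hypothetical examples, not part of the department's documented grading scheme:

```python
# Hedged sketch: compute the percentage of passing students who meet a
# performance criterion and compare it against the 80% target.
# The cutoffs and score data are hypothetical, not the actual rubric.

def criterion_met(scores, passing_cutoff=60, adequate_cutoff=70, target=0.80):
    """Return True/False for the 80% target, or None if no student passed."""
    passing = [s for s in scores if s >= passing_cutoff]
    if not passing:
        return None  # no passing students, so nothing to assess
    adequate = [s for s in passing if s >= adequate_cutoff]
    return len(adequate) / len(passing) >= target

# Example: 10 passing students, 9 of whom are adequate or better (90% >= 80%).
print(criterion_met([95, 88, 72, 70, 81, 76, 90, 85, 74, 65]))  # True
```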
Drawing on the feedback from the indirect measures and the results from our direct measures, we present the analysis of our assessment findings, the actions taken, and recommendations for improvement at the end of the tables for each Student Outcome. In addition to the direct measures tabulated below, our analysis includes the following indirect assessment methods: Student Course and Instructor Evaluations, Exit Interviews, Alumni Surveys, and ETS scores.
Note on Tables Below
Courses marked with an asterisk (*) contain material relevant to the performance criteria but are not used in the assessment strategy at this time.
Student Outcome 1 Software Fundamentals

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Apply UML interaction diagrams and class diagrams to illustrate object models. | COSC 1336, COSC 1337, COSC 2336, CPSC 4360 | Selected questions on final exam | CPSC 4360 | Spring and Fall of each year | Dr. Peggy Doerschuk or Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Apply important design patterns to OOD. | COSC 3308, CPSC 4360 | Selected questions on final exam | CPSC 4360 | Spring and Fall of each year | Dr. Peggy Doerschuk or Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Create useful software architecture documentation. | COSC 2336, COSC 3304, CPSC 3320, CPSC 4302, CPSC 4340, CPSC 4360 | Rubric on software architecture documentation on final project | CPSC 4340 | Fall of each year | Dr. Kami Makki | Size = ___; Percentage = ___; The target of 80% was _____. |
| Develop correct and efficient programs. | COSC 1336, COSC 1337, COSC 2336, COSC 3304, CPSC 3320, *CPSC 4302, *CPSC 4340, *CPSC 4360 | Selected questions on assignments | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Debug implemented software in a proficient manner. | COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3304 | Selected questions on assignments | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Design user interfaces appropriate to a large software system. | COSC 1336, COSC 1337, CPSC 3320, CPSC 4360 | Rubric | CPSC 4360 | Fall and Spring of each year | Dr. Stefan Andrei and Dr. Peggy Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Develop user-level documentation for software. | All courses with programming assignments | Rubric | CPSC 4360 and COSC 2336 | Fall and Spring of each year | Dr. Andrei or Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.1 Computer Science Technology Skills – Discrete Mathematics and Structures

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Be able to develop software to support specific operations on frequently used discrete structures such as lists, trees, and graphs. | COSC 2336, COSC 4302, CPSC 3320 | Code development on final exams | COSC 2336 | Fall and Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Be able to use elementary concepts of combinatorics, probability, and statistics to analyze and evaluate the efficiency of algorithms. | COSC 3304 | Selected questions on midterm exam in COSC 3304 | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Be able to use concepts of discrete mathematics, automata, and finite state machines to explain the design of computer hardware. | COSC 2336, COSC 2372, ELEN 3431, COSC 3302 | Selected questions on final exam in COSC 3302 | COSC 3302 | Spring of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.2 Computer Science Technology Skills – Analysis and Design of Algorithms

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Demonstrate basic understanding of asymptotic notations and time complexity. | COSC 2336, COSC 3304 | Questions from midterm exam | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Design efficient algorithms and compare competing designs. | COSC 2336, COSC 3304, COSC 4360 | Questions from midterm exam | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate basic understanding of some design approaches such as greedy algorithms, dynamic programming, and divide-and-conquer. | COSC 2336, COSC 3304 | Questions from midterm exam | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate familiarity with standard searching and sorting algorithms and linear and non-linear structures. | COSC 2336, COSC 3304 | Questions from midterm exam | COSC 3304 | Spring of each year | Dr. Quoc-Nam Tran | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.3 Computer Science Technology Skills – Formal Languages and Computability Theory

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Knowledge of equivalences between various types of languages and corresponding accepting devices, including Turing machines. | COSC 3302 | Exam questions | COSC 3302 | Spring of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
| Knowledge of practical applicability of various types of grammar and of some standard representation forms. | COSC 3302 | Exam questions | COSC 3302 | Spring of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
| Knowledge of limitations of computational capability of computer grammars. | COSC 3308, COSC 3302 | Exam questions | COSC 3302 | Spring of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
| Knowledge of equivalences and normal forms of logical formulas in propositional logic. | COSC 3308, COSC 3302 | Exam questions | COSC 3302 | Spring of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
| Understanding and appreciation of the various essential programming language constructs, paradigms, evaluation criteria, and language implementation issues. | COSC 3308 | Exam questions | COSC 3308 | Fall of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate basic knowledge and skills in programming techniques, with the focus on concepts and not on a particular language. | COSC 3308 | Exam questions | COSC 3308 | Fall of each year | Dr. Hikyoo Koh | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.4 Computer Science Technology Skills – Operating Systems

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Knows the main components of an operating system and their purposes and modes of interaction. | COSC 4302 | Exam questions | COSC 4302 | Fall and Spring of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Knows the structure of device drivers and the interaction between device drivers and operating systems. | COSC 4302 | Exam questions | COSC 4302 | Fall and Spring of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Outlines the basic issues in memory management design and virtual memory. | COSC 4302 | Exam questions | COSC 4302 | Fall and Spring of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Can develop basic system applications based on operating system APIs. | COSC 4302, CPSC 3320 | Exam questions | COSC 4302 | Fall and Spring of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.5 Computer Science Technology Skills – Database Design

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Demonstrate the application of Entity-Relationship diagrams to model real world problems. | CPSC 4340 | Exam questions | CPSC 4340 | Fall of each year | Dr. Kami Makki | Size = ___; Percentage = ___; The target of 80% was _____. |
| Design relations for real world problems, including implementation of normal forms, keys, and semantic constraints for each relation. | CPSC 4340, CPSC 4360 | Exam questions | CPSC 4340 | Fall of each year | Dr. Kami Makki | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate competence in implementations of database applications. | CPSC 4340 | Rubric for final project | CPSC 4340 | Fall of each year | Dr. Kami Makki | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.6 Computer Science Technology Skills – Computer Networks

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Employ the socket API to program applications among independent hosts. | CPSC 3320 | Exam questions | CPSC 3320 | Fall of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Explain common network architectures, services provided by each layer, and protocols required for connecting peer layers. | CPSC 3320 | Exam questions | CPSC 3320 | Fall of each year | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Evaluate network models through simulation and the use of common performance metrics for networks. | CPSC 3320 | Project | CPSC 3320 | Fall semester | Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 2.7 Computer Science Technology Skills – Computer Organization and Architecture

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Understands modern ISA design principles and employs them to evaluate systems. | COSC 2372, ELEN 3431, COSC 4310 | Local exam question | COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu | Size = ___; Percentage = ___; The target of 80% was _____. |
| Know how to measure performance for different computer architectures. | COSC 4310 | Local exam question | COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate knowledge of hardware implementation of numbers and arithmetic operations. | COSC 2372, COSC 4310 | Local exam question | COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 3 Scientific Method**
**Graduates will be able to gather requirements, analyze, design and conduct simulations or other computer experiments in order to evaluate and interpret the data.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Be able to justify why selected research methods were chosen and state the intended outcomes of the study. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and project | CPSC 3320 and COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Identify steps used in a particular study. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and project | CPSC 3320 and COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Be able to outline and explain the key features of the adopted method. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and project | CPSC 3320 and COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Analyze and interpret collected data based on the adopted method and draw appropriate conclusions. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and project | CPSC 3320 and COSC 4310 | Fall and Spring of each year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 4 Societal Awareness**
**Graduates will be aware of and understand the impact of computer technology on society at large, on the workplace environment, and on individuals.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Demonstrate understanding of evolving computer technology applications. | COSC 1172, COSC 3325 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate knowledge of positive social impacts including information globalization, e-commerce, e-learning, and new job creation. | COSC 1172, COSC 3325, CPSC 4340, CPSC 3320 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate knowledge of negative social impacts including internet pornography, privacy violation, health hazards, computer crimes, and dehumanization. | COSC 1172, COSC 3325, CPSC 4340, CPSC 3320, ELEN 3431 | Exam questions | COSC 3325, CPSC 3320 | Spring and Fall of each year | Dr. Stefan Andrei, Dr. Bo Sun | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate basic understanding of intellectual property protection via copyright and patent law and the fair use exception for copyrighted software. | COSC 1172, COSC 3325, CPSC 4340, CPSC 4360 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 5 Ethical Standards**
**Graduates will be able to recognize and understand the importance of ethical standards as well as their own responsibilities with respect to the computer profession.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Know the differences of various philosophical views on ethics such as deontology, utilitarianism, egoism, and relativism. | COSC 3325 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Understand the ACM or a similar professional body's code of ethics and the principles underlying those ethics. | COSC 3325, CPSC 4360 | Exam questions | CPSC 4360 | Fall and Spring of each year | Dr. Stefan Andrei, Dr. Peggy Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Honor the property rights of others, including copyrights and patents. | COSC 1172, COSC 3325, CPSC 4360 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate ability for ethical decision making within the computer profession. | COSC 1172, COSC 3325, CPSC 3320, CPSC 4360 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate knowledge of factors affecting fair resolution of conflicts of interest. | COSC 1172, COSC 3325, CPSC 4360 | Exam questions | COSC 3325 | Spring of each year | Dr. Stefan Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 6 Collaborative Work Skills**
**Graduates will demonstrate the ability to work effectively in teams to conduct technical work through the exercise of interpersonal communication skills.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Demonstrate the ability to work in heterogeneous environments which are diverse in gender, ethnicity, and academic accomplishment. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of each year | Dr. Andrei, Dr. Makki, and Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Attend team meetings and contribute towards solution of technical problems during the meetings. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of each year | Dr. Andrei, Dr. Makki, and Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Make appropriate contributions within their skill set to the completion of the project. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of each year | Dr. Andrei, Dr. Makki, and Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate a sense of interdependence with other team members. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of each year | Dr. Andrei, Dr. Makki, and Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 7 Oral Communications**
**Graduates will demonstrate their verbal ability to communicate clearly.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Ability to communicate in a given situation. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of each year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
| Ability to comprehend what is said and to show an appreciation of the importance of listening. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of each year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
| Communicate clearly, at the level of the audience, the technical material intrinsic to the discipline of computer science. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of each year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
| Demonstrate knowledge of the communication process. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172, CPSC 4360 | Fall and Spring of each year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 8 Written Communication Skills**
**Graduates will demonstrate their ability to write both technical and non-technical materials effectively, with appropriate multimedia aids.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Provide an introduction that grabs the attention of readers. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of each year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Organize documents in terms of a few main points or themes. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of each year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Choose appropriate illustrations, examples, or evidence to support the written documents. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of each year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Write appropriately for specified readers in terms of technical content. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of each year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
| Write organized, grammatically correct reports. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of each year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = ___; Percentage = ___; The target of 80% was _____. |
Student Outcome 9 Continuing Education and Lifelong Learning**
**Graduates will demonstrate that they can independently acquire new computing-related skills and knowledge in order to pursue either further formal or informal learning after graduation.

| Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results |
|---|---|---|---|---|---|---|
| Be able to search scholarly publications to assist in resolving problems. | COSC 3325, COSC 4172, COSC 4302, CPSC 4360 | Rubrics | COSC 3325 and COSC 4172 | Fall and Spring of each year | Dr. Osborne and Dr. Andrei | Size = ___; Percentage = ___; The target of 80% was _____. |
| Intend to engage in additional formal education or participate in employer-related training or research projects. | COSC 4172 | Rubrics | COSC 4172 | Fall and Spring of each year | Dr. Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
| Pursue independent study: participate in the Honors program or in undergraduate research at Lamar (for example, the STAIRSTEP Program), present papers or posters at professional conferences, submit co-op or internship position reports, or found a software design and development company. | COSC 4172 | Rubrics | COSC 4172 | Fall and Spring of each year | Dr. Osborne | Size = ___; Percentage = ___; The target of 80% was _____. |
E.2 - Procedures for Indirect Measure of Student Outcomes
Sources of Data for Evaluations for Each Learning Outcome
Assessment Committee Approved Spring 2013
| Outcome | Course Evaluations: Student Evaluation Questions (done every semester) | Exit Interview Questions (done every semester by graduating seniors) | Exit Survey Questions (done every semester by graduating seniors) | Alumni Survey Questions (partial surveys every two years) | ETS Scores |
|---|---|---|---|---|---|
| 1 | COSC 1336: 27, 28, 29, 31; COSC 1337: 27-31; COSC 2336: 27, 28, 30, 31, 32, 38; COSC 2372: 27, 28, 30, 31, 32; COSC 3304: 27-32; CPSC 3320: 27, 28, 30, 38; COSC 4172: 27; COSC 4302: 25, 27, 28, 30, 31; CPSC 4340: 25, 27-31; CPSC 4360: 25, 27-32 | 1, 2, 3, 6, 12 | | 1, 2, 3, 6, 12 | Overall average score and 3 assessment indicators (Programming; Computer Organization; Algorithms and Theory) |
| 2 | | 15 | | 15 | The 3 assessment indicators (Programming; Computer Organization; Algorithms and Theory) |
| 2.1 | COSC 2336: 27, 28, 29, 30, 31, 40; COSC 3304: 27, 37, 40; COSC 3302: 27, 39, 40 | | | | |
| 2.2 | COSC 3304: 27, 28, 33, 34, 39, 40 | | | | |
| 2.3 | COSC 3302: 30, 39, 40 | | | | |
| 2.4 | COSC 4302: 27, 28, 35, 39, 40 | | | | |
| 2.5 | CPSC 4340: 27, 28, 39, 40 | | | | |
| 2.6 | CPSC 3320: 28, 30, 38, 39, 40 | | | | |
| 2.7 | COSC 2372: 27, 31, 35, 40; COSC 4310: 35, 38, 40 | | | | |
| 3 | COSC 2336: 37, 38, 40; CPSC 3320: 37, 38, 40; COSC 4310: 35, 37, 38, 40 | 3, 4, 6, 7 | | 3, 4, 6, 7 | The 3 assessment indicators (Programming; Computer Organization; Algorithms and Theory) |
| 4 | COSC 1172: 41; COSC 3325: 41; CPSC 4360: 41 | 5, 9 | | 5, 9 | |
| 5 | COSC 3325: 36, 41 | 9 | 16 | 9 | |
| 6 | COSC 1172: 25, 26; COSC 4302: 25, 26, 34, 35; CPSC 4340: 25, 26, 34, 35; CPSC 4360: 25, 26, 34 | 4, 7, 8, 11, 13, 14 | | 4, 7, 8, 11, 13, 14 | |
| 7 | COSC 1172: 25, 26; COSC 3325: 34, 42; CPSC 4360: 25, 26, 34 | 8, 13, 14 | 13 | 8, 13, 14 | |
| 8 | COSC 1172: 26, 34; COSC 3325: 42; COSC 4302: 26, 34; CPSC 4360: 26, 34 | 8, 13, 14 | 12 | 8, 13, 14 | |
| 9 | COSC 3325: 42; COSC 4172: 27, 34, 35, 37, 40, 42 | 1, 10, 11 | 9, 11 | 1, 10, 11 | Overall average score |
Note: An Exit Survey is also administered to students in COSC 4172 (Senior Seminar). It is concerned mainly with overall program issues such
as scheduling, cognate courses, advising, and satisfaction with opportunities for independent study.
Criteria for Satisfactory Performance
Course Student Evaluations: average for each course/semester >= 3.75
Exit Interview Form: average for each question/year >= 3.75
Exit Interview Form: average for each of the overall quality questions/year >= 7.5/year
Exit Survey Form: questions 1-18 >= 3.75/year except for question 3 where the goal is between 2.25 and 4.00/year.
Alumni Survey: average on each curriculum question >= 4.0
Alumni Survey: average for each of the overall quality questions/year >= 8.0
ETS questions: Mean on each assessment indicator each semester >= 50.0;
Overall average/semester >= 160 with minimum >= 140.
Other Sources of Indirect Data
1. Input from our Industrial Advisory Board
Criteria Used to Evaluate Indirect Data
If the average score >= our target criterion, the performance criterion is met; otherwise, it is not met.
If 5 <= sample size < 10, monitor the performance criterion for the next two semesters.
If sample size < 5, the curriculum remains the same, but we will gather data for the next two cycles to produce a larger
sample for analysis.
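The decision rule above can be expressed as a short sketch. The function name and the returned string labels here are illustrative assumptions, not part of the department's documented procedure:

```python
# Hedged sketch of the indirect-data decision rule described above.
# Function name and result labels are illustrative, not official.

def evaluate_indirect(average_score, target, sample_size):
    if sample_size < 5:
        # Curriculum unchanged; gather data for the next two cycles.
        return "sample too small: gather data for two more cycles"
    outcome = "criterion met" if average_score >= target else "criterion not met"
    if sample_size < 10:
        # Small sample (5 <= n < 10): monitor for the next two semesters.
        outcome += "; monitor for next two semesters"
    return outcome

print(evaluate_indirect(3.9, 3.75, 24))  # criterion met
print(evaluate_indirect(3.5, 3.75, 7))   # criterion not met; monitor for next two semesters
```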
Appendix F – Indirect Measure Assessment Instruments 2013-2014
This appendix includes assessment instruments used for indirect measures. Please note
that Alumni Surveys are only administered every two years. The following instruments
are included:
1. Student Course and Instructor Evaluations
2. Exit Interview
3. Exit Survey
4. Alumni Survey
F.1 - Form for Student Course and Instructor Evaluations

Undergraduate Online Course Assessment Form
Course Name: ________________________
Major: ________________  Date: ________________  Course Number: ________________
(Numbers in parentheses are the corresponding University Online Evaluation question numbers.)

Student Assessment of Program Outcomes
Note: Not all of the topics listed below are covered in any one class. Hence, it does not make sense for all of your answers to be the same; it is perfectly reasonable for some of your answers to be "strongly disagree."
Rate each statement on the scale 1 = Strongly Disagree, 2 = Disagree, 3 = Undecided, 4 = Agree, 5 = Strongly Agree.

This course provided you ...
1 (25) the opportunity to work effectively as a member of a software development team.  1  2  3  4  5
2 (26) the knowledge to employ effective teamwork and interpersonal communication skills.  1  2  3  4  5
3 (27) the knowledge to analyze a software development problem and design a software solution.  1  2  3  4  5
4 (28) the ability to implement a software design specification in an appropriate development environment.  1  2  3  4  5
5 (29) the ability to apply appropriate user interface design.  1  2  3  4  5
6 (30) the knowledge to design and apply relevant software testing procedures.  1  2  3  4  5
7 (31) instruction on the proper documentation of source code.  1  2  3  4  5
8 (32) the knowledge needed to develop user-level documentation for software.  1  2  3  4  5
9 (33) the ability to independently acquire new computing related skills (e.g. new computing environment, new programming language).  1  2  3  4  5
10 (34) the ability to communicate technical design and implementation concepts to computing professionals as well as to non-computing personnel, both orally and in writing.  1  2  3  4  5
11 (35) the knowledge to evaluate hardware and software in the context of integrating computing into an environment or defining a computing solution to a particular problem or situation.  1  2  3  4  5
12 (36) the knowledge to conduct yourself in an ethical and professional manner and to assume a leadership role in class projects.  1  2  3  4  5
13 (37) the ability to apply knowledge from computer science and other disciplines to solve computer science problems.  1  2  3  4  5
14 (38) the knowledge to design and conduct simulation or other computer experiments and analyze and interpret data.  1  2  3  4  5
15 (39) with a firm theoretical foundation for the subject of the course.  1  2  3  4  5
16 (40) the knowledge to acquire the required skills in the use of the tools and technology of computer science.  1  2  3  4  5
17 (41) the ability to obtain and use information about the local and global impact of the field on relevant societal issues.  1  2  3  4  5
18 (42) with motivation to establish habits of life-long learning and curiosity.  1  2  3  4  5
Student Assessment of Instruction
Rate each statement on the scale 1 = Strongly Disagree, 2 = Disagree, 3 = Undecided, 4 = Agree, 5 = Strongly Agree.

19 Instructor seemed to have a thorough understanding of subject matter.  1  2  3  4  5
20 Instructor was able to answer student questions effectively.  1  2  3  4  5
21 Instructor made contributions not in assigned material.  1  2  3  4  5
22 Instructor treats all students equally.  1  2  3  4  5
23 Instructor had a reasonable grading system.  1  2  3  4  5
24 Instructor made grading system clear to students.  1  2  3  4  5
25 Instructor was available to students online.  1  2  3  4  5
26 Instructor gave tests that adequately evaluated the understanding of the course material.  1  2  3  4  5
27 Instructor made reasonable assignments.  1  2  3  4  5
28 Instructor returned tests and papers in a reasonable time.  1  2  3  4  5
29 Instructor made the course interesting.  1  2  3  4  5
31 Instructor was able to present concepts so they were understood.  1  2  3  4  5
32 Instructor presented lectures that were carefully planned and were helpful.  1  2  3  4  5
33 Taking this instructor's course was worthwhile.  1  2  3  4  5
Student Information
34 What grade did you expect to receive in this course?
F D C B A
35 What is your grade range in this course?
D-F   C-D   B-C   A-B
36 What is the average number of hours per week you spent on this course?
<2   2 to 7   7 to 12   >12
37 If you dropped or did not pass this course, would you consider taking the course from the same instructor again?
No Yes
38 Would you recommend the instructor to a friend who is considering taking this course?
No Yes
39 Please assign an overall rating to the instructor based on a scale from F (very poor) to A (excellent).
F D C B A
Comments Section
Number of tests given?
Number of assignments given?
F.2 - Form for Exit Interview
Department of Computer Science
Exit Interview Form
Spring 2013
Date: ___________________________________________________________________
Name:__________________________________________________________________
Permanent Address: _______________________________________________________
_______________________________________________________
_______________________________________________________
Circle your degree program: B.S. in CS B.S. in CIS
If you took the SAT test in high school, what was your total score? _________________
What was the most important reason for your coming to Lamar rather than another
university? ______________________________________________________
(Circle one) I have / have not found a position yet.
Do you plan to attend graduate school after graduation? _______________
If the answer is “Yes”, what school are you going to attend?
______________________________.
What degree do you plan to pursue in graduate school?
__________________________________________.
If the answer was “No”, do you plan to attend graduate school in the future?
_______________.
If you do eventually go to graduate school, what degree do you intend to pursue?
___________________
If you have found a position, what is the name of the company, and where is the company
located?
________________________________________________________________
_________________________________________________________________
If you have found a position, what is your job title? _____________________________
If you have found a position, what is the starting salary of your new position? ________
On average, how many hours per week were you employed while enrolled in courses during the last two years before graduation? _______
From what high school did you graduate?______________________________________
What year? ______ If outside the local area, what was the city and state? __________
_____________________________________________________________________
How many years have passed between the time you first enrolled at Lamar and the time
you will graduate? _______________________________
Questions concerning the Quality of the Program in the Department of Computer
Science
1. On a scale of one to ten (with 10 being good), how do you rate the quality of the courses taken within the department?
2. On a scale of one to ten, how do you rate the quality of instruction in computer science courses?
3. On a scale of one to ten (with 10 being easy and 1 being hard), how do you rate the ease of scheduling courses in computer science?
4. On a scale of one to ten (with 10 being very satisfied and 1 being not satisfied at all), how do you rate your overall satisfaction with the program you are graduating in?
Department of Computer Science Objectives
Strongly Disagree   Disagree   Undecided   Agree   Strongly Agree
1. Your education required you to apply critical thinking to solve difficult problems.
1 2 3 4 5
2. Your education ensured that you can design software solutions to different types of problems.
1 2 3 4 5
3. Your education provided a firm theoretical foundation so that you were prepared for future scientific advances.
1 2 3 4 5
4. Your education stimulated an understanding of the role of computer science in interdisciplinary studies, and it increased your interest and abilities in other areas.
1 2 3 4 5
5. Your education fostered an understanding of the impact of the discipline on relevant local and global social issues.
1 2 3 4 5
6. Your education enabled you to develop the ability to analyze and solve computer science problems by applying knowledge from computer science, mathematics, and software engineering.
1 2 3 4 5
7. Your education offered the preparation necessary to design and conduct simulations or other experiments and to analyze and interpret data.
1 2 3 4 5
8. Your education developed your skills in communication and cooperation within workgroups.
1 2 3 4 5
9. Your education fostered an awareness of professional and ethical responsibilities and their application in real situations.
1 2 3 4 5
10. Your education established an understanding of the need for life-long learning and curiosity.
1 2 3 4 5
11. Your education in the CS Department occurred in an environment that facilitated and encouraged learning.
1 2 3 4 5
12. Your education enabled you to understand the process of software development, including specifications, analysis, design, and testing.
1 2 3 4 5
13. Your education provided a sufficient educational foundation for leadership roles along future career paths.
1 2 3 4 5
14. Your education gave you the ability to recognize and value diversity in the world and in intellectual areas.
1 2 3 4 5
15. Your education gave you a strong background in the fundamental technical areas of computer architecture, algorithms, operating systems, database systems, and formal languages.
1 2 3 4 5
Please give your opinion concerning the strengths of your degree program.
Please give suggestions for improvement to your degree program.
Questions Concerning Your Experiences at Lamar
Have you received any awards from the Department, College, or University since you
have been at Lamar? If you have, please list them.
________________________________________________________________________
________________________________________________________________________
Have you used the services of the Career Center since coming to Lamar? _________
If you have, what help did the Career Center provide?
How many group projects do you think you did in computer science courses? _________
How many presentations did you make in computer science courses? _______________
Did you present any course projects outside the classroom at any of the following? (Circle Y or N)
Regional Student Conferences   Y / N
Civic Group (e.g., Chamber of Commerce)   Y / N
Professional Conference sponsored by the ACM or IEEE   Y / N
Annual Lamar Research Conference   Y / N
Other   Y / N
Did you participate regularly in ACM? ________________________________________
What factors caused you to participate or not participate regularly in ACM? __________
________________________________________________________________________
Did you participate in UPE? ________________________________________________
Did you receive any scholarships? If so, what were the sources of the funds?
_______________________________________________________________________
If you received any scholarships, what was the total amount you received over the course
of time that you studied at Lamar? ______________________________________
If you received any scholarships, did the money you receive determine your decision to
come to Lamar and study Computer Science? __________
What were your favorite CS/CIS/ELEN courses? _____________________________
_______________________________________________________________________
Reasons for selections? ____________________________________________________
_______________________________________________________________________
_______________________________________________________________________
What were your least favorite CS/CIS/ELEN courses? _________________________
_______________________________________________________________________
Reasons for selections? ____________________________________________________
_______________________________________________________________________
_______________________________________________________________________
Who were your favorite CS/CIS/ELEN instructors? ___________________________
Reasons for selections? ____________________________________________________
_______________________________________________________________________
_______________________________________________________________________
Who were your least favorite CS/CIS/ELEN instructors? ______________________
Reasons for selections? ____________________________________________________
_______________________________________________________________________
_______________________________________________________________________
What were your favorite Math and/or Physics courses? ________________________
Reasons? _______________________________________________________________
What were your least favorite Math and/or Physics courses? ___________________
Reasons? _______________________________________________________________
With what companies and for how many semesters did you do an Internship in Computer
Science?
With what companies and for how many semesters did you do a COOP in Computer
Science?
What evidence can you point to that proves your ability to design a system, component,
or process to meet realistic constraints?
What are you most proud of learning at Lamar, and in which experiences or courses did
you learn this skill?
F.3 - Form for Exit Survey
Department of Computer Science Exit Survey
2012-2013 Academic Year
The following information is being collected as part of our ongoing self-evaluation. This
survey is designed to obtain feedback from graduating Computer Science and Computer
Information Systems majors with the goal of improving our courses and degree programs.
Your responses to this survey will remain anonymous; results will be analyzed and
reported in terms of group statistics and collected comments. Do not place your name
on the form.
Major:
Computer Information Systems [ ] Computer Science [ ]
Approximate overall GPA: ____ Approximate GPA in major: ____
For each statement that follows, please indicate your level of agreement. Space is
provided for comments that explain or clarify your answer; use the backs of the sheets to
continue comments (labeled by question number). While we are principally interested in
the courses in the major and cognate, you may add comments on other courses at the
university if you wish, but please make clear which courses you are referring to.
1. I have learned a great deal in my major.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
2. I am well prepared for employment in my major.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
3. The work required for my major was:
[ ] Too Easy [ ] Easy [ ] Reasonable [ ] Difficult [ ] Too Difficult
Comment:
4. Faculty are readily available for assistance on course work.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
5. The quality of teaching in the major is good.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
6. The computer labs that support the program are satisfactory for that
purpose.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
7. Departmental academic advisors were readily available for help and met my
needs.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
8. Scheduling is easy because of the availability of courses.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
9. Independent study or research opportunities are satisfactory.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
10. Classrooms are adequate to support the program.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
11. I can analyze, design and implement a computerized solution to a “real life”
problem.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
12. I can write technical documents such as specifications, design and users’
manuals in a specified format.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
13. I can orally present a computerized project.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
14. I am prepared to enter an appropriate graduate program.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
15. I have a good general background in Computer Science.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
16. I am cognizant of ethical issues and local and global societal concerns
relating to computers in society.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment:
17. My math and science courses provided a good background/supplement to my
major.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
18. My math and science courses were well taught.
[ ]Strongly Disagree [ ] Disagree [ ] Not Sure [ ] Agree [ ] Strongly Agree
Comment: (name courses)
19. What did you like best about the major?
20. What did you like least about the major?
21. What would you recommend to improve the advising system?
F.4 - Form for Alumni Survey
Department of Computer Science
Alumni Survey
1. Name Date
(If female, please provide maiden name in addition to married name)
What degree(s) did you earn in the Department of Computer Science at Lamar
University?_______________________________________________
Please give at least one address through which we might best be able to reach you in the
future. For unmarried students, this will probably be the address of your parent(s) or
guardian.
Permanent Home Address:
Present Address :
Phone Number: Email Address:
Year of Graduation: Degree(s) Received from Lamar:
B.S. in Computer Science
B.S. in Computer and Information
Sciences
M.S. in Computer Science
2. If you are employed, please provide the following:
Name of your company:
Your title:
Address of Employer:
Salary:
Less than $40,000
$40,000 - $60,000
$60,000 - $80,000
$80,000 - $100,000
$100,000 - $200,000
More than $200,000
3. I rate the quality of the courses taken in the CS department as:
Poor Excellent
0 1 2 3 4 5 6 7 8 9 10
4. I rate the quality of instruction in the program as:
Poor Excellent
0 1 2 3 4 5 6 7 8 9 10
5. Scheduling of needed courses was:
Very Difficult Reasonable Easy
0 1 2 3 4 5 6 7 8 9 10
6. Overall I am satisfied with the program:
Not at All Somewhat Very
0 1 2 3 4 5 6 7 8 9 10
7. Department of Computer Science Objectives
Strongly Disagree   Disagree   Undecided   Agree   Strongly Agree
1. Your education required you to apply critical thinking to solve difficult problems.
1 2 3 4 5
2. Your education ensured that you can design software solutions for a wide range of problems.
1 2 3 4 5
3. Your education provided a firm theoretical foundation so that you were prepared for future scientific advances.
1 2 3 4 5
4. Your education stimulated an understanding of the role of computer science in interdisciplinary studies, and it increased your interest and abilities in other areas.
1 2 3 4 5
5. Your education fostered an understanding of the impact of the discipline on relevant social issues.
1 2 3 4 5
6. Your education enabled you to develop the ability to analyze and solve computer science problems by applying knowledge from computer science, mathematics, and software engineering.
1 2 3 4 5
7. Your education offered the preparation necessary to design and conduct simulations or other experiments and to analyze and interpret data.
1 2 3 4 5
8. Your education developed your skills in communication and cooperation within workgroups and larger organizations.
1 2 3 4 5
9. Your education fostered an awareness of professional and ethical responsibilities and their application in real situations.
1 2 3 4 5
10. Your education established an understanding of the need for life-long learning and curiosity.
1 2 3 4 5
11. Your education in the CS department occurred in an environment that facilitated and encouraged learning.
1 2 3 4 5
12. Your education enabled you to understand the process of software development, including specifications, analysis, design, and testing.
1 2 3 4 5
13. Your education provided a sufficient educational foundation for leadership roles along future career paths.
1 2 3 4 5
14. Your education gave you the ability to recognize and value diversity in the world and in intellectual areas.
1 2 3 4 5
15. Your education has prepared you, in your opinion, for graduate study in Computer Science.
1 2 3 4 5
16. You have a deep understanding of one or more sub-areas of Computer Science.
1 2 3 4 5
17. Your education gave you a strong background in the fundamental technical areas of computer architecture, algorithms, operating systems, database systems, and formal languages.
1 2 3 4 5
8. Please comment on what you think are the strengths of the CS program:
9. During your job interviews, did the interviewers offer any comments that suggested areas where
they felt our degree was either especially weak or especially strong? Were there topics they
asked you about with which you were unfamiliar?
10. In what ACM/IEEE activities did you participate?
11. Age at graduation? Married? Gender? Ethnicity?
12. How many children do you have?
13. Were you a transfer student?
If so, how many hours transferred toward the degree?
14. Were you a co-op or intern student? How many semesters?
Company Name:
Address:
15. Have you gone to graduate school after leaving Lamar?
If yes, what school(s) did you attend and what degree(s) did you earn?
Please Return Completed Alumni Form to:
Department of Computer Science
Lamar University
P.O. Box 10056
Beaumont, TX 77710
Appendix G – Assessment Results & Analysis 2012-2013
This appendix includes results and analysis of assessment for the 2012-2013 academic
year (which includes the fall 2012 and spring 2013 long semesters). The following are
included:
1. Direct Measure Results and Assessment Analysis 2012-2013
2. Direct Measure Results Summary: Student Learning Outcomes 2012-2013
3. Indirect Measure: Student Course and Instructor Evaluation Summary 2012-2013
4. Indirect Measure: Exit Interview Summary 2012-2013
5. Indirect Measure: Exit Survey Summary 2012-2013
6. Indirect Measure: Alumni Survey Summary 2010-2011
7. Indirect Measure: Advisory Board Feedback 2012-2013
8. ETS Exams 2012-2013
G.1 – Direct Measure Results and Assessment Analysis 2012-2013
Department of Computer Science, Lamar University
Summer 2013
Using the feedback from the indirect measures specified in Appendix E.2 and the results from our direct measures, this document
presents the analysis of our assessment findings, the actions taken, and recommendations for improvement. Note that the selected
questions used on final examinations for each performance criterion are submitted by the faculty and approved by the departmental
Assessment Committee to ensure appropriate depth and consistency of content across time.
Assessment and Evaluation
Student Outcome 1 Software Fundamentals
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Apply UML interaction diagrams and class diagrams to illustrate object models.
Strategies: COSC 1336, COSC 1337, COSC 2336, CPSC 4360
Assessment Method(s): Selected questions on final exam
Context for Assessment: CPSC 4360
Time of Data Collection: Spring and Fall of each year
Assessment Coordinator: Dr. Peggy Doerschuk or Dr. Stefan Andrei
Analysis of Results: Size = 20; Percentage = 83.45; the target of 80% was Met.

Performance Criterion: Apply important design patterns to OOD.
Strategies: COSC 3308, CPSC 4360
Assessment Method(s): Selected questions on final exam
Context for Assessment: CPSC 4360
Time of Data Collection: Spring and Fall of each year
Assessment Coordinator: Dr. Peggy Doerschuk or Dr. Stefan Andrei
Analysis of Results: Size = 20; Percentage = 83.45; the target of 80% was Met.

Performance Criterion: Create useful software architecture documentation.
Strategies: COSC 2336, COSC 3304, CPSC 3320, CPSC 4302, CPSC 4340, CPSC 4360
Assessment Method(s): Rubric on software architecture documentation on final project
Context for Assessment: CPSC 4340
Time of Data Collection: Fall of each year
Assessment Coordinator: Dr. Kami Makki
Analysis of Results: Size = 6; Percentage = 100; the target of 80% was Met.

Performance Criterion: Develop correct and efficient programs.
Strategies: COSC 1336, COSC 1337, COSC 2336, COSC 3304, CPSC 3320, *CPSC 4302, *CPSC 4340, *CPSC 4360
Assessment Method(s): Selected questions on assignments
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 91; the target of 80% was Met.

Performance Criterion: Debug implemented software in a proficient manner.
Strategies: COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3304
Assessment Method(s): Selected questions on assignments
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 100; the target of 80% was Met.

Performance Criterion: Design user interfaces appropriate to a large software system.
Strategies: COSC 1336, COSC 1337, CPSC 3320, CPSC 4360
Assessment Method(s): Rubric
Context for Assessment: CPSC 4360
Time of Data Collection: Fall and Spring of each year
Assessment Coordinator: Dr. Stefan Andrei and Dr. Peggy Doerschuk
Analysis of Results: Size = 19; Percentage = 97.26; the target of 80% was Met.

Performance Criterion: Develop user-level documentation for software.
Strategies: All courses with programming assignments
Assessment Method(s): Rubric
Context for Assessment: CPSC 4360 and COSC 2336
Time of Data Collection: Fall and Spring of each year
Assessment Coordinator: Dr. Peggy Doerschuk or Dr. Stefan Andrei; Dr. Kami Makki
Analysis of Results: Size = 32; Percentage = 92.19; the target of 80% was Met.
* Courses contain material relevant to the performance criteria, but are not used in the assessment strategy at this time.
Date: June 14, 2013.
Results: The sample size was between 12 and 32 for all performance criteria except criterion 3, for which the sample size was 6. This
was better than last year, when the sample sizes were about half as large. The direct results met our targets this year (83%, 83%, 100%,
91%, 100%, 97%, 92%), an improvement over last year, when criterion 1 was not met (79%). Four of the seven criteria were rated
higher this year than last year, with three slightly lower. Overall, this was an improvement over last year.
Actions: By criterion:
1.3 – Since we did not meet the student evaluation targets (indirect data), we will bring this to the attention of the instructor and
monitor it.
1.4 and 1.5 – Indirect results are mixed and therefore inconclusive overall; no actions will be taken. We need more participation from
students in the course and instructor evaluation process. We have informed some faculty, and will inform the others, about how
to attain higher response rates, particularly those faculty whose indirect data did not meet targets.
Second Cycle Results: Since we met the direct and indirect targets, it appears that adding the departmental code documentation
standards has helped.
Student Outcome 2.1 Computer Science Technology Skills – Discrete Mathematics and Structures
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Be able to develop software to support specific operations on frequently used discrete structures such as lists, trees, and graphs.
Strategies: COSC 2336, COSC 4302, CPSC 3320
Assessment Method(s): Code development on final exams
Context for Assessment: COSC 2336
Time of Data Collection: Fall and Spring of each year
Assessment Coordinator: Dr. Kami Makki
Analysis of Results: Size = 11; Percentage = 82; the target of 80% was Met.

Performance Criterion: Be able to use elementary concepts of combinatorics, probability, and statistics to analyze and evaluate the efficiency of algorithms.
Strategies: COSC 3304
Assessment Method(s): Selected questions on midterm exam in COSC 3304
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 83; the target of 80% was Met.

Performance Criterion: Be able to use concepts of discrete mathematics, automata, and finite state machines to explain the design of computer hardware.
Strategies: COSC 2336, COSC 2372, ELEN 3431, COSC 3302
Assessment Method(s): Selected questions on final exam in COSC 3302
Context for Assessment: COSC 3302
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Hikyoo Koh
Analysis of Results: Size = 7; Percentage = 100; the target of 80% was Met.
Date: June 14, 2013.
Results: In each of the previous two years we did not meet all measure targets. This year we met all targets, the first time in three
years that all targets for both direct and indirect measures have been met.
Actions: None.
Second Cycle Results: We added a Discrete Math course as a prerequisite for the COSC 3304 Algorithms course.
Student Outcome 2.2 Computer Technology Skills – Analysis and Design of Algorithms
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Demonstrate basic understanding of asymptotic notations and time complexity.
Strategies: COSC 2336, COSC 3304
Assessment Method(s): Questions from midterm exam
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 83; the target of 80% was Met.

Performance Criterion: Design efficient algorithms and compare competing designs.
Strategies: COSC 2336, COSC 3304, COSC 4360
Assessment Method(s): Questions from midterm exam
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 83; the target of 80% was Met.

Performance Criterion: Demonstrate basic understanding of some design approaches such as greedy algorithms, dynamic programming, and divide-and-conquer.
Strategies: COSC 2336, COSC 3304
Assessment Method(s): Questions from midterm exam
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 83; the target of 80% was Met.

Performance Criterion: Demonstrate familiarity with standard searching and sorting algorithms and linear and non-linear structures.
Strategies: COSC 2336, COSC 3304
Assessment Method(s): Questions from midterm exam
Context for Assessment: COSC 3304
Time of Data Collection: Spring of each year
Assessment Coordinator: Dr. Quoc-Nam Tran
Analysis of Results: Size = 12; Percentage = 83; the target of 80% was Met.
Date: June 14, 2013.
Results: We met the indirect measure targets on the Exit Interviews and Exit Surveys; last year these targets were not all met. We
noted from their responses on the course and instructor evaluations that students felt they did not have a firm theoretical
understanding of algorithms.
Actions: Starting next year, we will discuss successful methods for teaching analysis of algorithms with prospective instructors for
COSC 3304 (the algorithms course). The goal is to see better results on the student course and instructor evaluations on the question
of whether students feel they have a firm theoretical understanding of algorithms.
Second Cycle Results: None.
Student Outcome 2.3 Computer Science Technology Skills – Formal Languages and Computability Theory
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Demonstrate basic knowledge of the equivalences between various types of languages and corresponding accepting devices, including Turing machines.
Strategies: COSC 3302
Assessment Method(s): Exam questions
Context for Assessment: COSC 3302
Time of Data Collection: Spring of every year
Assessment Coordinator: Dr. Hikyoo Koh
Analysis of Results: Size = 7; Percentage = 100; the target of 80% was Met.

Performance Criterion: Demonstrate basic knowledge of the practical applicability of various types of grammars and of some standard representation forms.
Strategies: COSC 3302
Assessment Method(s): Exam questions
Context for Assessment: COSC 3302
Time of Data Collection: Spring of every year
Assessment Coordinator: Dr. Hikyoo Koh
Analysis of Results: Size = 7; Percentage = 100; the target of 80% was Met.

Performance Criterion: Demonstrate knowledge of the limitations of the computational capability of computer grammars.
Strategies: COSC 3308, COSC 3302
Assessment Method(s): Exam questions
Context for Assessment: COSC 3302
Time of Data Collection: Spring of every year
Assessment Coordinator: Dr. Hikyoo Koh
Analysis of Results: Size = 7; Percentage = 86; the target of 80% was Met.

Performance Criterion: Demonstrate basic knowledge of equivalences and normal forms of logical formulas in propositional logic.
Strategies: COSC 3308, COSC 3302
Assessment Method(s): Exam questions
Context for Assessment: COSC 3302
Time of Data Collection: Spring of every year
Assessment Coordinator: Dr. Hikyoo Koh
Analysis of Results: Size = 7; Percentage = 100; the target of 80% was Met.

Performance Criterion: Demonstrate basic understanding and appreciation of the various essential programming language constructs, paradigms, evaluation criteria, and language implementation issues.
Strategies: COSC 3308
Assessment Method(s): Exam questions
Context for Assessment: COSC 3308
Time of Data Collection: Fall of every year
Assessment Coordinator: Dr. Andrei
Analysis of Results: Size = 10; Percentage = 90; the target of 80% was Met.

Performance Criterion: Demonstrate basic knowledge and skills in programming techniques, with the focus on concepts and not on a particular language.
Strategies: COSC 3308
Assessment Method(s): Exam questions
Context for Assessment: COSC 3308
Time of Data Collection: Fall of every year
Assessment Coordinator: Dr. Andrei
Analysis of Results: Size = 10; Percentage = 90; the target of 80% was Met.
Date: June 14, 2013.
Results: We met all direct and indirect measure targets for these six performance criteria, an improvement over last year. The direct
measure targets were met with a sample size of 7 for 2012-2013, a result similar to last year's. The indirect measure targets for this
outcome were also met, with a sample size of 4 for 2012-2013. Note that for 2011-2012, student course and instructor evaluations
were not used to assess Outcome 2.3.
Actions: None.
Second Cycle Results: The instructor wanted to ensure that the limitations of the computational capability of computer grammars
were covered in the class (see last year's Actions), and this year the instructor successfully incorporated that material into the
curriculum for COSC 3302.
Student Outcome 2.4 Computer Science Technology Skills – Operating Systems
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Knows the main components of an operating system and their purposes and modes of interaction.
Strategies: COSC 4302
Assessment Method(s): Exam questions
Context for Assessment: COSC 4302
Time of Data Collection: Fall and Spring of every year
Assessment Coordinator: Dr. Bo Sun
Analysis of Results: Size = 15; Percentage = 86.80; the target of 80% was Met.

Performance Criterion: Knows the structure of device drivers and the interaction between device drivers and operating systems.
Strategies: COSC 4302
Assessment Method(s): Exam questions
Context for Assessment: COSC 4302
Time of Data Collection: Fall and Spring of every year
Assessment Coordinator: Dr. Bo Sun
Analysis of Results: Size = 15; Percentage = 86.80; the target of 80% was Met.

Performance Criterion: Outlines the basic issues in memory management design and virtual memory.
Strategies: COSC 4302
Assessment Method(s): Exam questions
Context for Assessment: COSC 4302
Time of Data Collection: Fall and Spring of every year
Assessment Coordinator: Dr. Bo Sun
Analysis of Results: Size = 15; Percentage = 86.80; the target of 80% was Met.

Performance Criterion: Can develop basic system applications based on operating system APIs.
Strategies: COSC 4302, CPSC 3320
Assessment Method(s): Exam questions
Context for Assessment: COSC 4302
Time of Data Collection: Fall and Spring of every year
Assessment Coordinator: Dr. Bo Sun
Analysis of Results: Size = 15; Percentage = 86.80; the target of 80% was Met.
Date: June 14, 2013.
Results: The performance targets for these four criteria were met for the direct measure, with a sample size of 15 for 2012-2013; we
had similar results last year with a sample size of 11. The performance targets of this outcome were also met for the indirect measure,
with a sample size of 44 for 2012-2013. Note that for 2011-2012, student course and instructor evaluations were not used to assess
Outcome 2.4.
Actions: None.
Second Cycle Results: None.
Student Outcome 2.5 Computer Science Technology Skills – Database Design
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criterion: Demonstrate the application of Entity-Relationship diagrams to model real-world problems.
Strategies: CPSC 4340
Assessment Method(s): Exam questions
Context for Assessment: CPSC 4340
Time of Data Collection: Fall of every year
Assessment Coordinator: Dr. Kami Makki
Analysis of Results: Size = 6; Percentage = 83; the target of 80% was Met.

Performance Criterion: Design relations for real-world problems, including implementation of normal forms, keys, and semantic constraints for each relation.
Strategies: CPSC 4340, CPSC 4360
Assessment Method(s): Exam questions
Context for Assessment: CPSC 4340
Time of Data Collection: Fall of every year
Assessment Coordinator: Dr. Kami Makki
Analysis of Results: Size = 6; Percentage = 83; the target of 80% was Met.

Performance Criterion: Demonstrate competence in the implementation of database applications.
Strategies: CPSC 4340
Assessment Method(s): Rubric for final project
Context for Assessment: CPSC 4340
Time of Data Collection: Fall of every year
Assessment Coordinator: Dr. Kami Makki
Analysis of Results: Size = 6; Percentage = 100; the target of 80% was Met.
Date: June 14, 2013.
Results: The performance targets for these three criteria were met for the direct measure, with a sample size of 6 for 2012-2013. In
2011-2012, criterion 2 was not met, with a sample size of 4. However, the performance targets of this outcome were not met for the
indirect measure, with a sample size of 6 for 2012-2013. Note that for 2011-2012, student course and instructor evaluations were not
used to assess Outcome 2.
The Exit Interview result for Outcome 2 was 3.75 with a sample size of 8, which showed that this outcome has been met (Question
15). For 2011-2012, the result was 4.50 with a sample size of 8.
The Alumni Survey result for Outcome 2 was 4.17 with a sample size of 9, which showed that this outcome has been met (Question
15). For 2011-2012, the result was 4.17 with a sample size of 6.
Actions: Since the direct measure targets were met for these performance criteria and the sample size was small, no actions will be
taken other than notifying the instructor of the results of the student course and instructor evaluations (since the evaluations did not
meet the targets for indirect measures).
Second Cycle Results: None.
Student Outcome 2.6 Computer Science Technology Skills – Computer Networks
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Employ the socket API to program applications among independent hosts. | CPSC 3320 | Exam Questions | CPSC 3320 | Fall of every year | Dr. Bo Sun | Size = 18, Percentage = 83; the target of 80% was Met
Explain common network architectures, the services provided by each layer, and the protocols required for connecting peer layers. | CPSC 3320 | Exam Questions | CPSC 3320 | Fall of every year | Dr. Bo Sun | Size = 18, Percentage = 83; the target of 80% was Met
Evaluate network models through simulation and the use of common performance metrics for networks. | CPSC 3320 | Project | CPSC 3320 | Fall of every year | Dr. Bo Sun | Size = 18, Percentage = 83; the target of 80% was Met
Date: June 14, 2013.
Results: The results for Student Outcome 2.6 are all satisfactory in 2012-2013. In 2011-2012, performance criterion 2 was not met,
with a percentage of 66.67% and a sample size of six students. The department improved on performance criterion 2, with
a percentage of 83.00% and a sample size of 18 students.
Actions: None.
Second Cycle Results: The instructor responded favorably to our recommendation about devoting more course time and attention to
criterion 2.
Student Outcome 2.7 Computer Science Technology Skills – Computer Organization and Architecture
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Understand modern ISA design principles and employ them to evaluate systems. | COSC 2372, ELEN 3431, COSC 4310 | Local Exam Question | COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu | Size = 8, Percentage = 88; the target of 80% was Met
Know how to measure performance for different computer architectures. | COSC 4310 | Local Exam Question | COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu | Size = 8, Percentage = 50; the target of 80% was Not Met
Demonstrate knowledge of hardware implementation of numbers and arithmetic operations. | COSC 2372, COSC 4310 | Local Exam Question | COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu | Size = 8, Percentage = 88; the target of 80% was Met
Date: June 14, 2013.
Results: The results for Student Outcome 2.7 were all satisfactory in 2012-2013, except for direct measure performance criterion 2.
Performance criterion 2 was not met, with a percentage of 50% and a sample size of 8 students in 2012-2013. In 2011-2012,
performance criterion 2 was met, with a percentage of 100% and a sample size of 9 students. Final exam questions were used to
assess this performance criterion. The department will monitor performance criterion 2, and the instructor will give more attention to this
area.
Actions: None.
Second Cycle Results: None.
Student Outcome 3 Scientific Method**
**Graduates will be able to gather requirements, analyze, design and conduct simulations or other computer experiments in order to
evaluate and interpret the data.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Be able to justify why selected research methods were chosen and state the intended outcomes of the study. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and Project | CPSC 3320 and COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = 26, Percentage = 91.69; the target of 80% was Met
Identify steps used in a particular study. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and Project | CPSC 3320 and COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = 26, Percentage = 91.69; the target of 80% was Met
Be able to outline and explain the key features of the adopted method. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and Project | CPSC 3320 and COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = 26, Percentage = 91.69; the target of 80% was Met
Analyze and interpret collected data based on the adopted method and draw appropriate conclusions. | COSC 2336, CPSC 3320, COSC 4310 | Rubric and Project | CPSC 3320 and COSC 4310 | Fall and Spring of every year | Dr. Jiangjiang Liu and Dr. Bo Sun | Size = 26, Percentage = 91.69; the target of 80% was Met
Date: June 14, 2013.
Results: The results for Student Outcome 3 are all satisfactory in 2012-2013, except for some indirect measures from the Student Course and
Instructor Evaluation. The department made improvements in CPSC 3320: the targets of the Student Course and
Instructor Evaluation were not met for CPSC 3320 in 2011-2012 and are all met in 2012-2013. The cumulative averages were 3.33
out of 5.00 points for question 37, 3.67 for question 38, and 3.67 for question 40, with a sample size of six students in 2011-2012. In
2012-2013, the cumulative averages are 4.17 for question 37, 4.22 for question 38, and 4.00 for question 40, with a sample size of 23
students.
The department also made improvements in COSC 4310. The target for question 38 of the Student Course and Instructor Evaluation
was not met, with an average of 3.50, for COSC 4310 in 2011-2012 and is met, with an average of 3.86, in 2012-2013. The sample
size was 8 for both years. The results for questions 35 and 37 of the Student Course and Instructor Evaluation also improved. The
cumulative averages were 3.25 for question 35 and 3.38 for question 37 in 2011-2012. In 2012-2013, the cumulative averages are
very close to our target of 3.75, with 3.71 for question 35 and 3.43 for question 37.
The targets for the Student Course and Instructor Evaluation for COSC 3308 were all met, with an average of 4.09 for questions 38
and 39, in 2011-2012. The targets were not met, with an average of 3.55 for questions 38 and 39, in 2012-2013. However, the
cumulative averages are very close to our target of 3.75.
Actions: None.
Second Cycle Results: None.
Student Outcome 4 Societal Awareness**
**Graduates will be aware of and understand the impact of computer technology on society at large, on the workplace environment,
and on individuals.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Demonstrate understanding of evolving computer technology applications. | COSC 1172, COSC 3325 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 89; the target of 80% was Met
Demonstrate knowledge of positive social impacts including information globalization, E-Commerce, E-learning and new job creation. | COSC 1172, COSC 3325, CPSC 4340, CPSC 3320 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 89; the target of 80% was Met
Demonstrate knowledge of negative social impacts including internet pornography, privacy violation, health hazards, computer crimes and dehumanization. | COSC 1172, COSC 3325, CPSC 4340, CPSC 3320, ELEN 3431 | Exam Questions | COSC 3325, CPSC 3320 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Bo Sun | Size = 21, Percentage = 91.90; the target of 80% was Met
Demonstrate basic understanding of intellectual property protection via copyright and patent law and fair use exception for copyrighted software. | COSC 1172, COSC 3325, CPSC 4340, CPSC 4360 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 93; the target of 80% was Met
Date: June 14, 2013.
Results: The sample size is 17, which is comparable with last year, and the direct results met our targets for all of the performance
criteria of Student Outcome 4 (89%, 89%, 91.9%, 93%). Analyzing question 41 ("Obtain Info./Local & Global impact-societal
issues") from the Student Course and Instructor Evaluation, the target has been met for COSC 3325 and CPSC 4360, with averages of 4.00
and 4.25 out of 5. However, the target has not been met for COSC 1172, as the average was only 3.43 against the minimum of 3.75.
As for the Exit Interview, questions 5 ("Your education fostered an understanding of the impact of the discipline on relevant local and
global social issues") and 9 ("Your education fostered an awareness of professional and ethical responsibilities and their application
in real situations") were both met in 2012-2013, with averages of 4.12 and 3.75, respectively (comparable with the 2011-2012 academic
year).
There is no question on the Exit Survey that reflects this student outcome. As for the Alumni Survey, question 9 ("Your education
fostered an awareness of professional and ethical responsibilities and their application in real situations") was met in 2012-2013
with an average of 4.11 out of 5 (comparable with the 2011-2012 academic year). However, question 5 ("Your education
fostered an understanding of the history of computer science and the impact of the discipline on relevant social issues") did not meet
the target, as it had an average of only 3.89, below our goal of 4.
Overall, we met most targets for both direct and indirect measures this year.
Actions: None.
Second Cycle Results: None.
Student Outcome 5 Ethical Standards**
**Graduates will be able to recognize and understand the importance of ethical standards as well as their own responsibilities with
respect to the computer profession.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Know the differences of various philosophical views on ethics such as deontology, utilitarianism, egoism, and relativism. | COSC 3325 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 92; the target of 80% was Met
Understand the ACM or a similar professional body’s code of ethics and principles underlying those ethics. | COSC 3325, CPSC 4360 | Exam Questions | CPSC 4360 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Peggy Doerschuk | Size = 20, Percentage = 87; the target of 80% was Met
Honor the property rights of others including copyrights and patents. | COSC 1172, COSC 3325, CPSC 4360 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 82; the target of 80% was Met
Demonstrate ability for ethical decision making within the computer profession. | COSC 1172, COSC 3325, CPSC 3320, CPSC 4360 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 85; the target of 80% was Met
Demonstrate knowledge of factors affecting fair resolution of conflicts of interests. | COSC 1172, COSC 3325, CPSC 4360 | Exam Questions | COSC 3325 | Spring of every year | Dr. Stefan Andrei | Size = 17, Percentage = 89; the target of 80% was Met
Date: June 14, 2013.
Results: This is the fourth consecutive year in which the direct measures, the Student Course and Instructor Evaluations, Exit
Interview question 9 ("Your education fostered an awareness of professional and ethical responsibilities and their application in real
situations"), and Exit Survey question 16 ("I am cognizant of ethical issues and local and global societal concerns relating to
computers in society") were consistent in achieving their targets.
As for the Alumni Survey, question 9 ("Your education fostered an awareness of professional and ethical responsibilities and their
application in real situations") was met in 2012-2013 with an average of 4.11 out of 5 (comparable with the 2011-2012
academic year).
Actions: None.
Second Cycle Results: None.
Student Outcome 6 Collaborative Work Skills**
**Graduates will demonstrate the ability to work effectively in teams to conduct technical work through the exercise of interpersonal
communication skills.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Ability to work in heterogeneous environments which are diverse in gender, ethnicity, and academic accomplishment. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of every year | Dr. Andrei, Dr. Makki, Dr. Doerschuk | Size = 27, Percentage = 98.07; the target of 80% was Met
Attend team meetings and contribute towards solution of technical problems during the meetings. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of every year | Dr. Andrei, Dr. Makki, Dr. Doerschuk | Size = 26, Percentage = 94.77; the target of 80% was Met
Make appropriate contributions within their skill set to the completion of the project. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of every year | Dr. Andrei, Dr. Makki, Dr. Doerschuk | Size = 26, Percentage = 98; the target of 80% was Met
Demonstrate a sense of interdependence with other team members. | COSC 1172, CPSC 4360, CPSC 4340, COSC 4302 | Rubrics | CPSC 4340, CPSC 4360 | Fall and Spring of every year | Dr. Andrei, Dr. Makki, Dr. Doerschuk | Size = 27, Percentage = 98.07; the target of 80% was Met
Date: June 14, 2013.
Results: All direct measure targets were met, with high scores on all of the criteria. The sample size of 26 was much higher than last
year’s sample size of 7. However, the student evaluation scores from CPSC 4340 showed that the students did not feel they
understood how to collaborate in teams. This is an improvement from last year in the sense that the indirect measures for the COSC 4302
student evaluations matched the direct measures.
CPSC 4340 showed a flat 3.00 for all of questions 25, 26, 34, and 35. CPSC 4340 also had a small sample size of 6.
The Exit Interview scores matched expectations for all questions (4, 7, 11, 13, and 14), except number 8 ("Your education
developed in you skill in communication and cooperation within workgroups"), which scored 3.63, close to the target of 3.75. The
Assessment Committee agreed to remove the words "and larger organizations" from the question, as it is not what we intended to
measure.
The Alumni Survey scores matched expectations for all questions (4, 7, 8, 11, and 14), except number 13 ("Your education
provided a sufficient educational foundation for leadership roles along future career paths."), which attained 3.89, close to the target
of 4.0.
Actions: None.
Second Cycle Results: On direct measures, the performance has met targets since 2007-2008. No actions were taken last year. Thus,
there are no second cycle results to report.
Student Outcome 7 Oral Communications**
**Graduates will demonstrate their ability to verbally communicate clearly.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Demonstrate the ability to communicate in a given situation. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = 21, Percentage = 80.67; the target of 80% was Met
Demonstrate the ability to comprehend what is said and to show an appreciation of the importance of listening. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = 21, Percentage = 85.43; the target of 80% was Met
Communicate clearly at the level of the audience the technical material intrinsic to the discipline of computer science. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = 21, Percentage = 80.67; the target of 80% was Met
Demonstrate knowledge of the communication process. | COSC 3325, COSC 4172, COSC 1172 | Rubrics | COSC 3325, COSC 4172, CPSC 4360 | Fall and Spring of every year | Dr. Stefan Andrei, Dr. Lawrence Osborne | Size = 27, Percentage = 75.70; the target of 80% was Not Met
Date: June 14, 2013.
Results: All targets for the direct measurement of the performance criteria had been met for the last five years, until this year, when
criterion 4 ("Demonstrate knowledge of the communication process") had only 75.7% of the students doing satisfactory work, with a
sample size of 27. On question 34 of the online student evaluations ("Students had the opportunity to
communicate design and implementation concepts to professionals and non-professionals"), the target was met, with an average of 3.86.
We did not meet targets for Exit Interview question 8 or Alumni Survey question 18.
Actions: In COSC 4172 we will conduct a review of methods for giving an effective presentation.
Second Cycle Results: None.
Student Outcome 8 Written Communication Skills**
**Graduates will demonstrate their ability to write both technical and non-technical materials effectively, with appropriate multimedia
aids.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Provide an introduction that grabs the attention of readers. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of every year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = 36, Percentage = 95.64; the target of 80% was Met
Organize documents in terms of a few main points or themes. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of every year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = 36, Percentage = 94.39; the target of 80% was Met
Choose appropriate illustrations, examples, or evidence to support the written documents. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of every year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = 35, Percentage = 91.46; the target of 80% was Met
Write appropriately for specified readers in terms of technical content. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of every year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = 35, Percentage = 96.26; the target of 80% was Met
Write organized, grammatically correct reports. | COSC 1172, COSC 3325, COSC 4172, CPSC 4360, COSC 4302 | Rubrics | CPSC 4360, COSC 4302 | Fall and Spring of every year | Dr. Sun, Dr. Andrei, Dr. Doerschuk | Size = 36, Percentage = 96.36; the target of 80% was Met
Date: June 14, 2013.
Results: For the sixth consecutive year, all direct measure targets were met, and all indirect measure targets were met except for COSC
1172.
Actions: None.
Second Cycle Results: None.
Student Outcome 9 Continuing Education and Lifelong Learning**
**Graduates will demonstrate that they can independently acquire new computing-related skills and knowledge in order to pursue
either further formal or informal learning after graduation.
Indirect Assessment Methods: Student Course and Instructor Evaluation, Exit Interview, Alumni Survey, ETS Scores
Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Analysis of Results
Be able to search scholarly publications to assist in resolving problems. | COSC 3325, COSC 4172, COSC 4302, CPSC 4360 | Rubrics | COSC 3325 and COSC 4172 | Fall and Spring of every year | Dr. Osborne and Dr. Andrei | Size = 26, Percentage = 84.38; the target of 80% was Met
Intend to engage in additional formal education or participate in employer-related training or research projects. | COSC 4172 | Rubrics | COSC 4172 | Fall and Spring of every year | Dr. Osborne | Size = 9, Percentage = 77.22; the target of 80% was Not Met
Independent study: participate in the Honors program or in undergraduate research at Lamar. This could be done in the STAIRSTEP Program, presentations or posters at professional conferences, or COOP or internship position reports. Students could own a software design and development company. | COSC 4172 | Rubrics | COSC 4172 | Fall and Spring of every year | Dr. Osborne | Size = 9, Percentage = 66.67; the target of 80% was Not Met
Date: June 14, 2013.
Results: We met the first direct measure criterion. We did not meet the other two (77.22%, 66.67%), both of which are covered in
COSC 4172. We note that 77.22% was close to our target of 80%.
The targets for all online course evaluations for this outcome were either met or narrowly missed.
Actions: The Assessment Committee will analyze the rubric used to assess criterion 9.3 to determine if it should be modified to
include other elements that would indicate if students are capable of independent study.
We removed question 35 from the online course evaluations.
Second Cycle Results: None.
G.2 - Direct Measure Results Summary: Student Learning Outcomes
2012-2013
Summary of Student Learning Outcome Results 2012-2013 (Target: >=80% of students pass)
Columns: Student Outcome | Performance Criterion | Sample Size | Mean [0%..100%]
Outcome 1 1 20 83.45%
2 20 83.45%
3 6 100.00%
4 12 91.00%
5 12 100.00%
6 19 97.26%
7 32 92.19%
Outcome 2.1 1 11 82.00%
2 12 83.00%
3 7 100.00%
Outcome 2.2 1 12 83.00%
2 12 83.00%
3 12 83.00%
4 12 83.00%
Outcome 2.3 1 7 100.00%
2 7 100.00%
3 7 86.00%
4 7 100.00%
5 10 90.00%
6 10 90.00%
Outcome 2.4 1 15 86.80%
2 15 86.80%
3 15 86.80%
4 15 86.80%
Outcome 2.5 1 6 83.00%
2 6 83.00%
3 6 83.00%
Outcome 2.6 1 18 83.00%
2 18 83.00%
3 18 83.00%
Outcome 2.7 1 8 88.00%
2 8 50.00% Not Met
3 8 88.00%
Outcome 3 1 26 91.69%
2 26 91.69%
3 26 91.69%
4 26 91.69%
Outcome 4 1 17 89.00%
2 17 89.00%
3 21 91.90%
4 17 93.00%
Outcome 5 1 17 92.00%
2 20 87.00%
3 17 82.00%
4 17 85.00%
5 17 89.00%
Outcome 6 1 27 98.07%
2 26 94.77%
3 26 98.00%
4 27 98.07%
Outcome 7 1 21 80.67%
2 21 85.43%
3 21 80.67%
4 27 75.70% Not Met
Outcome 8 1 36 95.64%
2 36 94.39%
3 35 91.46%
4 35 96.26%
5 36 96.36%
Outcome 9 1 26 84.38%
2 9 77.22% Not Met
3 9 66.67% Not Met
G.3 - Indirect Measure Results: Student Course and Instructor
Evaluation Summary 2012-2013
Columns: Student Outcome | Course | Ques. (u#)* | Fall (Sample Size, Mean [1..5]) | Spring (Sample Size, Mean [1..5]) | Total Sample Size | Avg. [1..5] | >=3.75
Outcome 1
COSC 1336-01
27 12 4.25 1 4.0 13 4.23
28 12 4.08 1 4.0 13 4.07
29 11 3.82 1 4.0 12 3.84
31 12 4.08 1 4.0 13 4.07
COSC 1336-02
27 1 3.00 1 3.00 Not Met
28 1 3.00 1 3.00 Not Met
29 1 3.00 1 3.00 Not Met
31 1 3.00 1 3.00 Not Met
COSC 1336-48F
27 8 3.0 5 3.2 13 3.077 Not Met
28 8 2.75 5 3.2 13 3.077 Not Met
29 8 2.75 5 3.2 13 3.077 Not Met
31 8 3.00 5 3.8 13 3.077 Not Met
COSC 1337-01
27 7 4.29 22 3.725 29 3.861
28 7 3.86 22 3.635 29 3.689 Not Met
29 7 4.14 22 3.913 29 3.981
30 7 3.71 22 3.820 29 3.793
31 7 4.14 22 4.315 29 4.272
COSC 2336-01
27 12 4.33 9 4.000 21 4.189
28 12 4.33 9 4.000 21 4.189
30 12 4.17 9 4.220 21 4.191
31 12 4.58 9 4.440 21 4.503
32 12 4.42 9 4.220 21 4.334
38 12 3.92 9 3.780 21 3.860
COSC 2372
27 13 3.15 13 3.15 Not Met
28 13 3.31 13 3.31 Not Met
30 13 3.46 13 3.46 Not Met
31 13 3.69 13 3.69 Not Met
32 13 3.46 13 3.46 Not Met
COSC 3304
27 9 4.11 9 4.11
28 9 4.00 9 4.00
29 9 3.56 9 3.56 Not Met
30 9 4.00 9 4.00
31 9 3.44 9 3.44 Not Met
32 9 3.44 9 3.44 Not Met
CPSC 3320
27 23 4.17 23 4.17
28 23 4.09 23 4.09
30 23 3.96
38 23 4.22
COSC 4172
27 5 4.6 2 3.0 7 4.14
COSC 4302-01
25 28 3.89 16 4.00 44 3.93
27 28 4.18 16 4.31 44 4.23
28 28 3.93 16 4.06 44 3.98
30 28 4.00 16 4.00 44 4.00
31 28 4.18 16 4.19 44 4.18
CPSC 4340
28 6 3.33 6 3.33 Not Met
29 6 3.67 6 3.67 Not Met
30 6 3.33 6 3.33 Not Met
31 6 3.00 6 3.00 Not Met
CPSC 4360-01
25 6 4.33 6 3.83 12 4.08
27 6 4.83 6 4.33 12 4.58
28 6 4.67 6 4.33 12 4.50
29 6 4.67 6 4.33 12 4.50
30 6 4.50 6 4.50 12 4.50
31 6 4.67 6 4.33 12 4.50
32 6 4.33 6 4.50 12 4.42
Outcome 2.1
COSC 2336
27 12 4.33 9 4.000 21 4.19
28 12 4.33 9 4.000 21 4.19
29 12 4.00 9 4.00 21 4.00
30 12 4.17 9 4.220 21 4.19
31 12 4.58 9 4.440 21 4.50
40 12 4.50 8 4.25 20 4.40
COSC 3304
27 9 4.11 9 4.11
37 9 3.89 9 3.89
40 9 3.89 9 3.89
COSC 3302
27 4 4.00 4 4.00
39 4 4.00 4 4.00
40 4 3.75 4 3.75
Outcome 2.2
COSC 3304
27 9 4.11 9 4.19
28 9 4.00 9 4.00
33 9 3.89 9 3.89
39 9 3.56 9 3.56 Not Met
40 9 3.89 9 3.89
Outcome 2.3
COSC 3302
30 4 3.75 4 3.75
39 4 4.0 4 4.00
40 4 3.75 4 3.75
Outcome 2.4
COSC 4302
27 28 4.18 16 4.31 44 4.23
28 28 3.93 16 4.06 44 3.98
35 28 4.07 16 4.25 44 4.14
39 28 3.82 16 4.06 44 3.91
40 28 3.93 16 4.44 44 4.12
Outcome 2.5
CPSC 4340
27 6 3.50 6 3.50 Not Met
28 6 3.33 6 3.33 Not Met
39 6 3.33 6 3.33 Not Met
40 6 3.50 6 3.50 Not Met
Outcome 2.6
CPSC 3320
28 23 4.09 23 4.09
30 23 3.96 23 3.96
38 23 4.22 23 4.22
39 23 4.13 23 4.13
40 23 4.00 23 4.00
Outcome 2.7
COSC 2372
27 13 3.15 13 3.15 Not Met
31 13 3.69 13 3.69 Not Met
35 13 3.39 13 3.39 Not Met
40 13 3.77 13 3.77
COSC 4310
35 1 5.00 6 3.50 7 3.71 Not Met
38 1 5.00 6 3.67 7 3.86
40 1 5.00 6 3.50 7 3.71 Not Met
Outcome 3
COSC 2336-01
37 12 4.33 9 4.11 21 4.24
38 12 3.92 9 3.78 21 3.86
40 12 4.5 8 4.25 20 4.40
CPSC 3320
37 23 4.17 23 4.17
38 23 4.22 23 4.22
40 23 4.00 23 4.00
COSC 4310
35 1 5.00 6 3.50 7 3.71 Not Met
38 1 5.00 6 3.67 7 3.86
40 1 5.00 6 3.50 7 3.71 Not Met
Outcome 4
COSC 1172
41 44 3.34 21 3.62 65 3.43 Not Met
COSC 3325
41 9 4.00 9 4.00
CPSC 4360
41 6 4.33 6 4.17 12 4.25
Outcome 5
COSC 3325
36 9 3.78 9 3.78
Outcome 6
COSC 1172
25 45 3.29 21 3.19 66 3.26 Not Met
26 45 3.36 21 3.10 66 3.28 Not Met
COSC 4302-01
25 28 3.89 16 4.00 44 3.93
26 28 4.11 16 4.06 44 4.09
34 28 3.93 16 4.25 44 4.05
35 28 4.07 16 4.25 44 4.14
CPSC 4340
25 6 3.00 6 3.00 Not Met
26 6 3.00 6 3.00 Not Met
34 6 3.00 6 3.00 Not Met
35 6 3.00 6 3.00 Not Met
CPSC 4360-01
25 6 4.33 6 3.83 12 4.08
26 6 4.17 6 3.83 12 4.00
34 6 4.33 6 4.00 12 4.17
Outcome 7
COSC 1172
25 45 3.29 21 3.19 66 3.26 Not Met
26 45 3.36 21 3.10 66 3.28 Not Met
COSC 3325
34 9 3.44 9 3.44 Not Met
42 9 4.00 9 4.00
CPSC 4360-01
25 6 4.33 6 3.83 12 4.08
26 6 4.17 6 3.83 12 4.00
34 6 4.33 6 4.00 12 4.17
Outcome 8
COSC 1172
26 45 3.36 21 3.10 66 3.28 Not Met
34 44 2.91 21 3.33 65 3.05 Not Met
COSC 3325
42 9 4.0 9 4.00
COSC 4302-01
26 28 4.11 16 4.00 44 4.07
34 28 3.93 16 4.31 44 4.07
CPSC 4360-01
26 6 4.17 6 3.83 12 4.00
34 6 4.33 6 4.00 12 4.17
Outcome 9
COSC 3325
42 9 4.0 9 4.0
COSC 4172
27 5 4.6 2 3.0 7 4.14
34 5 4.2 2 3.0 7 3.86
37 5 4.4 2 2.0 7 3.71 Not Met
40 5 4.4 2 2.0 7 3.71 Not Met
42 5 4.4 2 2.0 7 3.71 Not Met
G.4 - Indirect Measure Results: Exit Interview Summary 2012-2013
A. Program Quality. Each item is measured on a 10 point scale with a goal of a mean score of at least 7.5.
Columns: Question | Fall (Sample Size, Mean) | Spring (Sample Size, Mean) | Total Sample Size | Average | >=7.5
1 5 8.20 3 8.00 8 8.13
2 5 8.00 3 8.00 8 8.00
3 5 8.60 3 4.00 8 6.88 Not Met
4 5 8.20 3 8.33 8 8.25
B. Department Student Outcomes. Each item is measured on a 5 point scale with a goal of a mean score of 4.0.
Columns: Student Outcome | Question | Fall (Sample Size, Mean [1..5]) | Spring (Sample Size, Mean [1..5]) | Total Sample Size | Average [1..5] | >=3.75
Outcome 1
1 5 4.60 3 4.33 8 4.50
2 5 3.60 3 3.67 8 3.63 Not Met
3 5 4.20 3 4.00 8 4.13
6 5 4.40 3 4.33 8 4.37
12 5 4.20 3 4.00 8 4.13
Outcome 2
15 5 3.60 3 4.00 8 3.75
Outcome 3
3 5 4.20 3 4.00 8 4.13
4 5 4.00 3 4.33 8 4.12
6 5 4.40 3 4.33 8 4.37
7 5 4.20 3 3.67 8 4.00
Outcome 4
5 5 4.00 3 4.33 8 4.12
9 5 4.20 3 3.00 8 3.75
Outcome 5
9 5 4.20 3 3.00 8 3.75
Outcome 6
4 5 4.00 3 4.33 8 4.12
7 5 4.20 3 3.67 8 4.00
8 5 3.60 3 3.67 8 3.63 Not Met
11 5 4.40 3 4.00 8 4.25
13 5 4.00 3 4.00 8 4.00
14 5 4.20 3 4.33 8 4.25
Outcome 7
8 5 3.60 3 3.67 8 3.63 Not Met
13 5 4.00 3 4.00 8 4.00
14 5 4.20 3 4.33 8 4.25
Outcome 8
8 5 3.60 3 3.67 8 3.63 Not Met
13 5 4.00 3 4.00 8 4.00
14 5 4.20 3 4.33 8 4.25
Outcome 9
1 5 4.60 3 4.33 8 4.50
10 5 4.40 3 4.00 8 4.25
11 5 4.40 3 4.00 8 4.25
G.5 - Indirect Measure Results: Exit Survey Summary 2012-2013
A. Program Quality. Each item is measured on a 5 point scale with a goal of a mean score of at least 3.75, except question 3, where the goal is between 2.25 and 4.00/year.
Columns: Question | Fall (Sample Size, Mean [1..5]) | Spring (Sample Size, Mean [1..5]) | Total Sample Size | Average | >=3.75
1 5 4.40 3 3.33 8 4.00
2 5 3.20 3 3.67 8 3.38 Not Met
3 5 3.40 3 3.67 8 3.50
4 5 4.20 3 3.33 8 3.87
5 5 4.20 3 3.33 8 3.87
6 5 4.60 3 4.00 8 4.38
7 5 4.40 3 4.67 8 4.50
8 5 3.80 3 2.67 8 3.38 Not Met
9 5 3.60 3 3.33 8 3.50 Not Met
10 5 4.20 3 3.67 8 4.00
11 5 3.40 3 4.33 8 3.75
12 5 3.60 3 4.33 8 3.87
13 5 4.20 3 4.33 8 4.25
14 5 3.60 3 3.67 8 3.63 Not Met
15 5 3.60 3 4.33 8 3.87
16 5 4.00 3 3.67 8 3.88
17 5 4.00 3 4.00 8 4.00
18 5 4.00 3 3.67 8 3.88
B. Department Student Outcomes. Each item is measured on a 5 point scale with a goal of a mean score of 3.75, except question 3, where the goal is between 2.25 and 4.00/year.
Columns: Student Outcome | Question | Fall (Sample Size, Mean [1..5]) | Spring (Sample Size, Mean [1..5]) | Total Sample Size | Average [1..5] | >=3.75
Outcome 5
16 5 4 3 3.67 8 3.88
Outcome 7
13 5 4.2 3 4.33 8 4.25
Outcome 8
12 5 3.6 3 4.33 8 3.87
Outcome 9
9 5 3.6 3 3.33 8 3.50 Not Met
11 5 3.4 3 4.33 8 3.75
G.6 - Indirect Measure Results: Alumni Survey Summary 2010-2011
Columns: Question | Sample Size | Mean | Standard Deviation | Target
A. Program Quality. Each item is measured on a 10 point scale (scale [0..10]) with a goal of a mean score of at least 8.0.
1 9 8.00 0.87
2 9 7.89 1.05 Not Met
3 9 8 2.12
4 9 8.67 1.01
B. Department Student Outcomes. Each item is measured on a 5 point scale (scale [1..5]) with a goal of a mean score of 4.0.
1 9 4.44 0.73
2 9 4.11 0.78
3 9 4.44 0.53
4 9 4.33 0.73
5 9 3.89 1.05 Not Met
6 9 4.56 0.53
7 9 4.44 1.01
8 9 4.22 0.83
9 9 4.11 0.78
10 9 4.56 0.73
11 9 4.67 0.50
12 9 4.56 0.53
13 9 3.89 0.78 Not Met
14 9 4.22 0.83
15 9 4.17 0.75
16 9 4.33 0.52
17 9 4.67 0.52
G.7 - Indirect Measure Results: Advisory Board Feedback 2012-2013
The Lamar Department of Computer Science Advisory Board met on March 1, 2013 in
the Lamar Library. Several faculty from the department gave presentations to the Board.
Dr. Stefan Andrei made a presentation on the state of the department. Dr. Peggy
Doerschuk made a presentation about the STARS program. Dr. Tim Roden made a
presentation about plans for adding a concentration in computer game development to the
curriculum.
A common theme in the feedback from Board members was a need to teach students
more about mobile application development on a variety of platforms. After the Board
meeting, Dr. Stefan Andrei, Dr. Lawrence Osborne and Dr. Timothy Roden discussed the
feedback received. No actions were taken as a result of the Board meeting. However,
Dr. Roden reaffirmed his commitment to teach mobile development in the upcoming
game development courses (added to the curriculum in spring 2013 and expected to be
offered during the 2013-2014 academic year).
As part of the process for soliciting feedback from Board members, a written survey was
given. The questions and summary of feedback are provided below. There were 12
respondents.
1. Where do you see the biggest growth in technology jobs, requiring a Computer
Science degree, within the next five years?
Responses: energy, healthcare, oil & gas, private space enterprise, green
energy, nanotech, system security, mobile, telecommunications, gaming, data
mining, SaaS
2. What are the top 5 skills you think Computer Science graduates should have today?
Responses: Mobile development, SQL, Microsoft Visual Studio, communication skills,
ethics, teamwork, Scrum/Agile methodology, OOD, 2 or more languages, project
management, database design, analytics, code optimization
3. What additional courses or skills do you think Lamar University should add to its
Computer Science programs?
Responses: game development, forensics, e-commerce systems, mobile development,
iOS, HTML5, J#, embedded systems, security, SaaS, project management
4. What other knowledge and/or skills from other disciplines, besides Computer
Science, do you feel are very important for computing-related jobs?
Responses: sales & marketing, advertising & analytics, project management,
communication, business, basic engineering, math, technical writing, communication and
presentation skills, Python, source code control, physics for quantum computing
5. Do you think Lamar University should have additional technology degrees, besides
the B.S. in Computer Science, that focus a student’s studies in one particular area of
Computer Science? If so, what types of specialized Computer Science Degrees do
you recommend?
Responses: operating systems, telecommunications, information assurance,
security, networking (most respondents said no specialty degrees are needed,
but many recommended adding concentrations to the existing degree)
G.8 - Indirect Measure Results: ETS Exams 2012-2013

Semester     Sample  Mean   Std.    Prog.   Systems  Algor.   Low    High
             Size    Score  Dev.    Fund.                     Score  Score
Fall 2002    4       135.5  8.18    (sample size too small)   124    143
Spring 2003  9       144.2  14.43   41.8    33.2     41.3     131    173
Fall 2003    6       151.0  18.28   48.8    36       44.8     131    169
Spring 2004  5       162.2  14.65   (sample size too small)   139    178
Fall 2004    8       153.8  20.9    56.4    36.6     44.9     125    180
Spring 2005  7       172.7  12.32   78.7    55.3     66.3     159    194
Fall 2005    1       175    0       (sample size too small)   175    175
Spring 2006  5       158.2  14.13   (sample size too small)   154    171
Fall 2006    6       142.5  10.89   56      31       31       130    156
Spring 2007  4       156.5  7.93    67      52       40       148    167
Fall 2007    2       161    9.89    66      53       46       154    168
Spring 2008  6       149    11      67      36       31       130    154
Fall 2008    2       149.2  16.1    66      60       44       145    175
Spring 2009  7       150    12      60      46       33       130    164
Fall 2009    5       148    10      59      50       29       133    159
Spring 2010  3       155.3  10.9    65      44       44       140    164
Fall 2010    3       158.3  13.05   71      36       54       148    173
Spring 2011  2       142.5  7.79    50      26       38       137    148
Fall 2011    4       144.8  18.4    53      35       33       127    170
Spring 2012  4       151.1  3                                 141    165
Fall 2012    5       145.4  10.57                             134    158
Spring 2013  5
Appendix H – Curriculum Map

Legend: I: Introductory course; R: Reinforcing course; S: Summative course;
*: indicates courses that may contain content related to the performance
criteria but do not affect the assessment strategies.
Outcome 1
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Apply UML interaction diagrams and class diagrams to illustrate object models
I R R S
Apply important design patterns to OOD.
R S
Create useful software architecture documentation.
I R R R S R
Develop correct and efficient programs.
I R R S R
Debug implemented software in a proficient manner.
I R R R S
Design user interfaces appropriate to a large software system.
I R R S
Develop user-level documentation for software.
I I S R R R R R R R R R R R S
Outcome 2.1
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Be able to develop software to support specific operations on frequently used discrete structures such as lists, trees, and graphs.
S * *
Be able to use elementary concepts of combinatorics, probability, and statistics to analyze and evaluate the efficiency of algorithms.
S
Be able to use concepts of discrete mathematics, automata, and finite state machines to explain the design of computer hardware.
I R S *
Outcome 2.2
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate basic understanding of asymptotic notations and time complexity.
I S
Design efficient algorithms and compare competing designs.
I S *
Demonstrate basic understanding of some design approaches such as greedy algorithms, dynamic programming and divide-and-conquer.
I S
Demonstrate familiarity with standard searching and sorting algorithms and linear and non-linear structures.
I S
Outcome 2.3
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate basic knowledge of equivalences between various types of languages and corresponding accepting devices including Turing Machines.
S
Demonstrate basic knowledge of practical applicability of various types of grammar and of some standard representation forms.
S
Demonstrate knowledge of limitations of computational capability of computer grammars.
S R
Demonstrate basic knowledge of equivalences and normal forms of logical formulas in propositional logic.
S R
Demonstrate basic understanding and appreciation of the various essential programming languages constructs, paradigms, evaluation criteria, and language implementation issues.
S
Demonstrate basic knowledge and skills in programming techniques with the focus on concepts and not on a particular language.
S
Outcome 2.4
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Knows the main components of an operating system and their purposes and modes of interaction with one another.
S
Knows the structure of device drivers and the interaction between device drivers and operating systems.
S
Outlines the basic issues in memory management design and virtual memory.
S
Can develop basic system applications based on operating system APIs
R S
Outcome 2.5
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate the application of Entity-Relational diagrams to model real world problems.
S
Design relations for real world problems including implementation of normal forms, keys, and semantics constraints for each relation.
S R
Demonstrate competence in implementations of database applications
S
Outcome 2.6
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Employ the socket API to program applications among independent hosts.
S
Explain common network architectures, the services provided by each layer, and the protocols required for connecting peer layers.
S
Evaluate network models through simulation and the use of common performance metrics for networks.
S
Outcome 2.7
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Understands modern ISA design principles and employs them to evaluate systems.
I S *
Know how to measure performance for different computer architectures.
S
Demonstrate knowledge of hardware implementation of numbers and arithmetic operations.
I S
Outcome 3
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Be able to justify why selected research methods were chosen and state the intended outcomes of the study.
I S S
Identify steps used in a particular study.
I S S
Be able to outline and explain the key features of the adopted method.
I S S
Analyze and interpret collected data based on the adopted method and draw appropriate conclusions.
I S S
Outcome 4
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate understanding of evolving computer technology applications.
I S
Demonstrate knowledge of positive social impacts including information globalization, E-Commerce, E-learning and new job creation.
I S R *
Demonstrate knowledge of negative social impacts including internet pornography, privacy violation, health hazards, computer crimes and dehumanization.
I S S * *
Demonstrate basic understanding of intellectual property protection via copyright and patent law and fair use exception for copyrighted software.
I S * S
Outcome 5
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Know the differences of various philosophical views on ethics such as deontology, utilitarianism, egoism, and relativism.
S
Understand the ACM code of ethics or a similar professional body’s code of ethics and principles underlying those ethics.
R S
Honor the property rights of others including copyrights and patents.
I S *
Demonstrate ability for ethical decision making within the computer profession.
I S R *
Demonstrate knowledge of factors affecting fair resolution of conflicts of interests.
I S *
Outcome 6
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate the ability to work in heterogeneous environments which are diverse in gender, ethnicity, and academic accomplishment.
I R S S
Attend team meetings and contribute towards solution of technical problems during the meetings.
I R S S
Make appropriate contributions within their skill set to the completion of the project.
I R S S
Demonstrate a sense of interdependence with other team members.
I R S S
Outcome 7
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Demonstrate the ability to communicate in a given situation.
I S S
Demonstrate the ability to comprehend what is said and to show an appreciation of the importance of listening.
I S S
Communicate clearly at the level of the audience the technical material intrinsic to the discipline of computer science.
I S S
Demonstrate knowledge of the communication process.
I S S
Outcome 8
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Provide an introduction that grabs the attention of readers.
I R R S S
Organize documents in terms of a few main points or themes.
I R R S S
Choose appropriate illustrations, examples, or evidence to support the written documents.
I R R S S
Write appropriately for specified readers in terms of technical content.
I R R S S
Write organized, grammatically correct reports.
I R R S S
Outcome 9
Performance Criteria
Courses (table columns): COSC 1172, COSC 1336, COSC 1337, COSC 2336, COSC 2372, COSC 3302, COSC 3304, COSC 3308, COSC 3325, CPSC 3320, COSC 4172, COSC 4302, COSC 4310, CPSC 4302, CPSC 4340, CPSC 4360, ELEN 3431
Be able to search scholarly publications to assist in resolving problems.
S S * *
Intend to engage in additional formal education or participate in employer-related training or research projects.
S
Independent study. Participate in Honors program or in undergraduate research at Lamar. This could be done in the STAIRSTEP Program, Presentations or Posters at Professional Conferences, COOP or Internship position reports.
S
Appendix I - Department Programming Documentation Standard
Programming Documentation Requirements
I. “External” Documentation (or Program Information): In programming
courses, the comprehensive set of documents that detail the design, development,
and structure of a program is usually condensed into a comparatively brief
“block comment” at the top of the source code. This “external” documentation
will minimally include:
a. Author(s) name, the course name/number, assignment name/number,
instructor’s name, and due date.
b. Detailed description of the problem the program was written to solve,
including the algorithm used to solve the problem.
c. The program’s operational requirements, such as the programming language,
special compilation information, and the input information.
d. Required features of the assignment that author(s) were not able to complete,
and/or information about the existing bugs.
II. Documentation about the “Classes”: In an object-oriented programming
language, the code for a class should be preceded by a block comment minimally
containing the following:
a. The class name, (author(s) name in team projects,) the names of any external
packages upon which the class depends, the name of the package for the
classes containing this class (if any), and the inheritance information.
b. An explanation of the purpose of the class.
c. Brief descriptions of the class and instance constants and variables.
d. Brief descriptions of constructors as well as the implemented class and
instance methods.
III. “Internal” Documentation (or in-program documentation): The details of the
program are explained by comments and placed within the code. The internal
documentation should minimally include the following:
a. A ‘block comment’ which should be placed at the head of every method (also
known as the function or subprogram). This will include the method name; the
purpose of the method; the method’s pre– and post–conditions; the method’s
return value (if any); and a list of all parameters, including direction of
information transfer (into this method, out from the method back to the calling
method, or both), and their purposes.
b. Meaningful identifier names. Traditionally, simple loop variables may have
single letter variable names, but all others should be meaningful. Never use
nonstandard abbreviations. If the programming language has a naming
convention for variables, methods, classes, etc., then those conventions should
be used.
c. Each variable and constant must have a brief comment immediately after its
declaration that explains its purpose. This applies to all variables, as well as to
fields of structure declarations.
d. Complex sections of the program that need some more explanations should
have comments just before or embedded in those program sections.
IV. Miscellaneous / Optional Requirements:
a. Write programs with appropriate modularity; that is, create classes when
appropriate, write methods that accomplish limited, well-defined tasks, etc.
b. Global/public variables should be avoided in programs, unless they are
required.
c. Use “white spaces” (blank lines) to set apart logically related sections of code.
d. Indent bodies of methods, loops, and “if” statements, and do so with a single,
consistent style.
e. Unconditional branching (such as the “goto” statement) should be avoided in
programs unless it is required by that specific language (such as assembly
language).
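As a brief illustration, a minimal program satisfying sections I–III might look like the following sketch. All names (author, course, assignment, class, and method) are hypothetical, and this comment layout is only one of many ways to meet the standard:

```java
/*
 * Author:       Jane Student (hypothetical)   Course: COSC 1337
 * Assignment:   #3 - Temperature Conversion   Instructor: Dr. Example
 * Due date:     2013-02-15
 *
 * Problem:      Convert a Celsius temperature to Fahrenheit using the
 *               formula F = C * 9/5 + 32 (section I.b).
 * Requirements: Java SE; no special compilation steps; input is hard-coded
 *               in main (section I.c). No known bugs (section I.d).
 */

/*
 * Class:    Temperature (section II)
 * Depends:  no external packages; inherits from java.lang.Object.
 * Purpose:  groups temperature-conversion utilities.
 */
class Temperature {

    /*
     * Method:  celsiusToFahrenheit (section III.a)
     * Purpose: convert a temperature from Celsius to Fahrenheit.
     * Pre:     celsius is a finite value.  Post: no state is modified.
     * Returns: the equivalent Fahrenheit temperature.
     * Param:   celsius (in) - the temperature to convert.
     */
    static double celsiusToFahrenheit(double celsius) {
        return celsius * 9.0 / 5.0 + 32.0; // standard conversion formula
    }

    public static void main(String[] args) {
        double boiling = 100.0; // input value in Celsius (section I.c)
        System.out.println(celsiusToFahrenheit(boiling)); // prints 212.0
    }
}
```

Note the meaningful identifier names (section III.b) and the comment on the input variable immediately after its declaration (section III.c).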
Notes. There are a number of standards and tools for program documentation, such
as IEEE 1063-2001, “Standard for Software User Documentation,” published by the
IEEE, and ISO/IEC 18019:2004 and ISO/IEC TR 9294, published by the International
Organization for Standardization (ISO) and the International Electrotechnical
Commission (IEC).
Tools such as Doxygen, javadoc, ROBODoc, and TwinText can be used to auto-
generate code documents, adding further capabilities for document preparation.
For example, they can extract the comments from the source code and create
reference manuals in forms such as text or HTML files.
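For example, a method commented with javadoc tags can be turned into an HTML reference page by the javadoc tool. The sketch below is illustrative only; the MathUtil class and its factorial method are hypothetical, not part of any department course:

```java
class MathUtil {
    /**
     * Computes the factorial of a non-negative integer.
     *
     * @param n the value whose factorial is computed; must satisfy 0 <= n <= 20
     *          so the result fits in a long
     * @return n! as a long
     */
    static long factorial(int n) {
        long result = 1;                  // 0! == 1 by definition
        for (int i = 2; i <= n; i++) {
            result *= i;                  // multiply in each successive factor
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}
```

Running javadoc on such a file extracts the @param and @return descriptions into the generated reference manual.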
References
1. O. McCann, “Toward Developing Good Programming Style,”
http://www.cs.arizona.edu/people/mccann/style.html [accessed Jan 17, 2012].
2. P. DePasquale, http://www.comtor.org/ [accessed Jan 17, 2011].
3. O. Paull, “The Importance of Software Documentation,”
http://www.ehow.co.uk/about_6706857_importance-software-documentation.html
[accessed Jan 17, 2012].
4. D. van Heesch, “Doxygen Documentation. Generate documentation from source
code,” 2012, http://www.stack.nl/~dimitri/doxygen/ [accessed Jan 17, 2012].
Appendix J – Meeting Minutes 2012-2013
Minutes of meetings of Computer Science committees are posted on the Department
website for assessment. Some committee minutes may not be publicly accessible.
This appendix includes minutes from meetings during the 2012-2013 academic year
that were relevant to assessment. The following minutes are included:
Assessment Committee Meetings 2012-2013 Academic Year
1. Assessment Committee, February 11, 2013
2. Assessment Committee, May 23, 2013
3. Assessment Committee, June 3, 2013
4. Assessment Committee, June 10, 2013
5. Assessment Committee, June 14, 2013
Department of Computer Science
Assessment Committee Meeting
February 11, 2013
Maes Building, Room 59A
Committee Members
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
Invited Guest Dr. Doerschuk
In Attendance
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
Minutes Taken By Mrs. Denise Rode, Administrative Associate Sr.
Assessment Committee Agenda
February 11, 2013
I. Approval of Minutes of Last Meeting on August 2, 2012.
II. Request from Dr. Peggy Doerschuk for changes to CPSC 4360 (see attached
memorandum). NOTE: Any changes approved must be voted on during a department
faculty meeting.
III. Other Business.
IV. Adjourn.
Dr. Roden called the meeting to order at 2:00pm.
Approval of Meeting Minutes
Dr. Roden, Chair, asked members whether the Assessment Committee meeting minutes
from the last meeting, held August 2, 2012, were accepted as presented.
Dr. Osborne moved that the minutes be approved as presented. Dr. Makki seconded
the motion.

Dr. Roden stated that it was moved and seconded that the minutes be accepted as
presented.
Dr. Doerschuk Memorandum

Dr. Roden, Chair, asked members to review the memorandum from Dr. Doerschuk
regarding CPSC 4360, Software Engineering, currently a summative course for
ABET evaluation. There are 20 outcomes assigned to CPSC 4360, and Dr. Doerschuk
would like to see some of the outcomes assigned to COSC 4172, Senior
Assessment.

The committee discussed each item thoroughly and reviewed the Curriculum Map
before deciding to assign items 12, 13, 14, and 15 to COSC 4172, Senior
Assessment.
Upon discussion, item number 5 will be removed as an outcome for CPSC 4360,
Software Engineering.
Items 7 and 8 will remain on CPSC 4360, as senior students need to know the
importance of ethics and the ability to work in heterogeneous environments.
All committee members were in agreement with the changes made to the CPSC 4360 for
the summative evaluation for ABET.
A motion was made by Dr. Roden to present the changes at the next Faculty Meeting. Dr.
Bo Sun seconded the motion.
Dr. Roden asked members to say “Aye” if they were in favor of the changes made
to the CPSC 4360, Software Engineering, outcomes and received a unanimous
“Aye”.
Dr. Roden asked members if anyone opposed the changes and no members opposed.
Dr. Roden asked if there was any more business to discuss at this time. Members did not
have anything further to discuss at this time.
Dr. Roden asked for a motion to adjourn and all members said “Aye”. Meeting was
adjourned at 3:00pm.
Department of Computer Science
Assessment Committee Meeting
May 23, 2013
Maes Building, Room 59A
Committee Members
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
In Attendance
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
Minutes Taken By Mrs. Denise Rode, Administrative Associate Sr.
Assessment Committee Agenda
May 23, 2013
I. Approval of Minutes of Last Meeting on February 11, 2013.
II. Assessment for 2012-2013 Academic Year.
III. Other Business.
IV. Adjourn.
Dr. Roden called the meeting to order at 11:00am.
Approval of Meeting Minutes
Dr. Roden, Chair, asked members whether the Assessment Committee meeting minutes
from the last meeting, held February 11, 2013, were accepted as presented.
Dr. Makki moved that the minutes be approved as presented. Dr. Andrei seconded
the motion.
Dr. Roden asked members if anyone opposed and no members opposed.
Dr. Roden asked members to say “Aye” if they were in favor of the minutes as
presented, and all members unanimously replied “Aye”.
Minutes of February 11, 2013 are accepted as presented to the members.
Dr. Roden asked Dr. Osborne to explain what will be necessary to complete the
documentation needed for the ABET audit in October 2013.
I. Dr. Osborne distributed the handout, Procedures for Measuring Each Student
Outcome Indirectly, to all members. He described what each column title meant and what would be necessary for the assessment in the 2012-2013 audit.
II. Dr. Osborne shared that Dr. Jiangjiang Liu calculated and prepared all tables presented for the ABET Assessment Report 2011-2012. He asked Dr. Liu if she would calculate and prepare all tables for indirect measures for the 2012-2013 ABET Assessment Report. She agreed, and Dr. Osborne will email her all templates.
III. Dr. Osborne explained to members that most universities find it hard to
get employer surveys. He uses LinkedIn to review what former students are doing with their education after graduation. He encourages all faculty and staff to use this tool, since the educational objective is not what we want students to be able to do now but what they are able to do five (5) years from now.
IV. Dr. Andrei has collected the Alumni Surveys and will have our Office
Assistant, Jenifar Kallul, secure them in the ABET Binder for 2012-2013.
I. Dr. Osborne distributed the handout, Appendix B Indirect Measure and Direct
Measure: Assessment Methodology 2012-2013. He reviewed the document with all members and explained what responsibilities belonged to which member of the committee.
II. Dr. Liu will be responsible for: Indirect Measures: Exit Interviews, Exit Survey and ETS Scores.
III. Dr. Osborne will be responsible for: Course Evaluations.
IV. Dr. Andrei will be responsible for getting the Official Transcripts of six (6)
students. Dr. Andrei will also check the course syllabus for the strategies to ensure that
the course objectives correspond to the performance criteria.
V. Dr. Makki will be responsible for: Direct Measures and Facilities of the Self-
Study report.
VI. Dr. Roden will be responsible for: Direct Measures: Strategies
I. Dr. Osborne distributed the handout, Assessment Summary of Direct Measures
2011-2012. He reviewed each column to members.
a. Performance Criteria – the specific information auditors are looking for.
b. Strategies – the course(s) in which the data was collected.
c. Assessment Method(s) – where the data was collected.
d. Context for Assessment – the course in which the data was collected.
e. Time of Data Collection – the semester in which the data was collected.
f. Assessment Coordinator – the teacher(s) who teach the course.
g. Analysis of Results – size, percentage, and target.
II. Assessment Summary of Direct Measures 2011-2012
a. Dr. Osborne reviewed the Criteria for Satisfactory Performance – this information can be found on page 37 of the handout.
b. The rule we are currently using is on page 40: the department has decided that the target will be that at least 80% of the students in a course do acceptable work on each performance criterion.
Dr. Osborne suggested that the committee meet again next week. Dr. Roden will arrange
a time for the next meeting. At that time, the committee will have completed their
assignments and the material will be reviewed. The second cycle results will be written
after making comparisons between the 2011-2012 data and the 2012-2013 data. If
another meeting is necessary, it will be arranged at the next meeting.
Dr. Osborne shared that over the last several years, there has been an improvement in the
outcomes related to ethics and social impact, as well as simulation. Simulation is covered
in the courses Data Structures and Computer Architecture.
Dr. Osborne informed members that the “Self-Study” will need to be ready by the end of
June.
Dr. Osborne asked that the Administrative Associate Sr., Denise Rode, order
supplies that will be needed to organize all data for the audit. The order
consists of 20-25 three-inch binders, six five-inch binders, index sheets, and
legal pads. The order was placed on the afternoon of May 23, 2013.
Dr. Andrei formed a Beginning Freshman Course Assessment Committee that will
review COSC 1336 Programming Fundamentals I, COSC 1337 Programming
Fundamentals II, and COSC 2336 Programming Fundamentals III. The purpose of
this committee will be to assess these three (3) courses and to:

I. Confirm the textbook(s) to be used for each course based on its assessment
results.
II. Analyze and list the sequence of topics for each course based on its
assessment results.
The members of the Beginning Freshman Course Assessment Committee are Dr. Makki
as Chair, Dr. Andrei, Dr. Doerschuk, Dr. Roden and Mrs. Wang.
Dr. Roden asked if there was any more business to discuss at this time. Members did not
have anything further to discuss at the time.
Dr. Roden asked for a motion to adjourn and all members said “Aye”. Meeting was
adjourned at 12:15pm.
Department of Computer Science
Assessment Committee Meeting
June 3, 2013
Maes Building, Room 59A
Committee Members
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
In Attendance
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
Minutes Taken By Mrs. Denise Rode, Administrative Associate Sr.
Handouts: Indirect Measure: Exit Interview Summary 2012-2013
Appendix G Direct Measure: Student Learning Outcome Results and
Analyses 2012-2013
Assessment Committee Agenda
June 3, 2013
I. Approval of Minutes of Last Meeting on May 23, 2013.
II. Assessment for 2012-2013 Academic Year
a. Review of Indirect Measures Results (Dr. Liu) and Comparison to
Previous Year.
b. Review of Direct Measures Results (Dr. Makki) and Comparison to
Previous Year.
III. Recommendations for Continued Program Improvement.
IV. Other Business.
V. Adjourn.
Dr. Roden called the meeting to order at 10:20am.
Approval of Meeting Minutes
Dr. Osborne moved to accept the May 23, 2013 Assessment Committee Meeting Minutes
as presented to the members.
Dr. Makki seconded the motion. Dr. Roden asked members if anyone opposed and no
members opposed.
Dr. Roden asked members to say “Aye” if they are in favor of the minutes as presented
and all members unanimously replied “Aye”.
Minutes of May 23, 2013 are accepted as presented to the members.
Dr. Liu presented the handouts Indirect Measure: Exit Interview Summary 2012-2013
and Appendix G Direct Measure: Student Learning Outcome Results and Analyses 2012-
2013 to all members.
Dr. Andrei questioned the results for Outcome 9 on the Appendix G Direct
Measure handout. His concern was that the result was only 66% when it should be
higher. Dr. Osborne explained that it is difficult to assess a one-hour
freshman course. The totals on the report were taken from the Fall 2012 and
Spring 2013 semesters combined.
Dr. Liu’s report was well prepared, and the committee appreciated the hard work
put into it.
Dr. Makki calculated the Direct Measure totals for the report. There were some
questions about how Dr. Makki reached the totals, and he was given a new
formula to use to recalculate the Direct Measure student averages on the
report. The corrected report will be presented at the next Assessment Committee
Meeting.
The committee agreed that it will identify weaknesses from the assessment data
and document the analysis. If our scores are low, or if we barely make the
percentage, we will look at that area and see what is going on in particular
classes.
We may not meet every objective in the report. We have upper-level students’
exit surveys and exit interviews, which are their last chance to give feedback.
We need to remember that a teacher cannot meet everything in every class. The
assessment is not considered a weakness of the teacher, but it does show where
there are program concerns. ABET is looking for consistency in how students are
achieving outcomes rather than for how poorly a teacher is teaching.
Dr. Osborne stated that student evaluations are not our only measure. Many evaluations
are not high if students misunderstand the question.
Question 3 review of report: Dr. Osborne explained that “scheduling” is always a
problem as students complain about not having all courses available to them each
semester. The Computer Science Department does work with students to ensure they are
able to graduate on time.
The committee has concerns that results regarding students’ readiness for
higher education or employment were not good. We will review results for both
the indirect and direct measures to see how we have done and which areas need
more attention.
Review of Exit Interview Questions form.
Committee asked the Administrative Associate Sr. to edit Question 2 as follows: Your
education ensured that you can design software solutions to different types of problems.
The committee unanimously agreed that the question should read as above.
Committee asked the Administrative Associate Sr. to edit Question 8 as follows: Your
education developed in you skill in communication and cooperation within workgroups.
The committee unanimously agreed that the question should read as above.
The updated Exit Interview Questions form will be presented to members at the next
Assessment Committee Meeting.
Dr. Roden asked the members if there was any other business for discussion. No
one had any further business.
Dr. Roden adjourned the meeting at 12:15pm.
Department of Computer Science
Assessment Committee Meeting
June 10, 2013
Maes Building, Room 59A
Committee Members
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
In Attendance
Dr. Roden, Chair Dr. Andrei Dr. Liu Dr. Makki Dr. Osborne
Minutes Taken By Mrs. Denise Rode, Administrative Associate Sr.
Handouts: Appendix C Indirect Measure: Alumni Survey Summary 2012-2013
Appendix F Indirect Measure: Student Evaluation Summary 2012-2013
Appendix G Direct Measure: Student Learning Outcome Results and Analysis 2012-2013
Procedures for Measuring Each Student Outcome Indirectly
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.1
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.2
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.3
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.4
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.5
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.6
Student Learning Outcomes at the PROGRAM Level: Student Outcome 2.7
Table of Contents
Assessment Committee Agenda
June 10, 2013
I. Approval of Minutes of Last Meeting on June 3, 2013.
II. Assessment for 2012-2013 Academic Year.
Review of Direct Measures Results (Dr. Makki) and Comparison to Previous
Year.
Status of Self-Study (Dr. Roden).
III. Recommendations for Continued Program Improvement.
IV. Other Business.
V. Adjourn.
Dr. Roden called the meeting to order at 1:40pm.
Approval of Meeting Minutes
Dr. Liu asked for corrections to the first paragraph, which needed several additions covering
the handouts she presented to all members. The handouts Dr. Liu distributed were: Indirect
Measure: Exit Interview Summary & Exit Survey 2012-2013, as well as the ETS Report.
Minutes for June 3, 2013 will be reviewed for approval at the next Assessment Committee
Meeting.
Review of Direct Measures and Comparison to Previous Year
Dr. Makki presented all members with the updated Summary Page. The committee
individually discussed each area that was "Not Met".
Outcome 2.7 (Your education ensured that you can design software solutions to different
types of problems.) Dr. Roden asked Dr. Liu what could be done to improve in this area.
Outcome 2.7 was met in 2011-2012.
The ABET report indicates that issue #1 is not a problem. We do want to monitor this, and if
it continues, we will need to change the structure that measures the project by itself,
which may justify a big change. The professor will monitor this assignment project
carefully. A high school teacher prepares a student to be a good higher education student;
it is our duty to prepare the higher education student to want to pursue a
Masters or Doctorate.
NOTE: Committee agreed unanimously that …software solutions to a wide range of
problems be changed to software solutions to different types of problems. The
Administrative Associate Sr., Mrs. Denise Rode will make the changes to the Exit
Interview Question form and present it to the Committee at the next Assessment
Committee Meeting.
Outcome 5.2 (Your education fostered an understanding of the impact of the discipline on
relevant local and global social issues.) This outcome was also "met" in 2011-2012. Dr.
Osborne believes there may be a correction to be made to this outcome. Dr. Osborne
asked Dr. Makki to check with Dr. Doerschuk on how many students passed her class. If the
figures are incorrect, Dr. Makki will notify Dr. Roden with corrections.
Outcome 7.4 (Your education offered the preparation necessary to design and conduct
simulations or other experiments and analyze and interpret data.) Calculations will be
refigured for this outcome because the change in the courses used to assess this area
would bring the total over 80%. Outcome 7.4 was met in 2011-2012.
Dr. Makki was asked where he got his data and he informed the Committee that all
information was taken from the system.
Discussion of Oral Reporting: The Committee discussed the various ways in which a
teacher can help students improve their oral reporting: review with students the proper
way to conduct an oral report and the steps to take before giving a successful report.
Outcome 9.1 (Your education fostered an awareness of professional and ethical
responsibilities and their application in real situations.) Dr. Osborne explained that ACM
was taken out before the fall semester started in 2012. Dr. Osborne will have Abu Shufean,
Webmaster, change the percentage for his data on the Internet, and the change will show that the
outcome was met with 80.5%. Outcome 9.1 was met in 2011-2012.
Dr. Roden asked the Committee if they would like to review the rubric he uses. How do we
get students enthusiastic for more training? The students are not attending INSPIRED or
ACM, and even after Dr. Zaloom speaks to them, there is no interest in that area either.
Outcome 9.3 (Independent Study): Dr. Andrei asked Dr. Roden to send him a report for his
area and to send Dr. Liu one for her area.
Dr. Osborne shared with the Committee that the whole purpose of problem areas is to
review the situation(s) at hand and determine what needs to be changed in order to be
able to meet the outcomes. Do we need to talk to the teacher and see what can be changed
to meet the demands of the students or do we need to assign another teacher to the
course? Look for the problem, allocate staff, and improve the situation. We need to
know exactly what the problem is and make some positive changes. We all know that the
students choose the best-liked teacher. This is why we have indirect student evaluations
and direct student measures.
ABET does not allow higher education schools to rate teaching staff by grades. If the
teacher is pressured to give good grades, they will give good grades. We are trying to
assess data to get results and have something to look at. Dr. Osborne shared with the
committee that the students are required to submit a "Life Long Learning" report to him.
Dr. Osborne distributed the handout Appendix C Indirect Measure: Alumni Survey
Summary 2012-2013. We will not be able to utilize the Employer Survey as we did not
receive this data, and we also have only a small amount of data from the alumni.
The Alumni Survey Summary 2012-2013 shows data for schedules in Questions A 1-4.
Questions B 1-17 show data for students' future endeavors. Dr. Osborne utilizes
LinkedIn to obtain graduated students' resumes. According to their resumes, the
students are doing fairly well. Dr. Osborne asked the Committee how they wanted to
evaluate students by using their resumes.
Dr. Osborne distributed the handout Procedures for Measuring Each Student Outcome
Indirectly. He asked the Committee which of questions 27-40 are the most important.
There are some courses which are well taught, such as Operating Systems, Software
Engineering and Database Design. Notice how many courses are repeated in each
outcome; this is because they are related to different outcomes.
What criteria will we use to evaluate Indirect data? We need to remember that our job is
to get students to complete the evaluations. They need to understand what they are being
asked and the evaluation questions need to be reviewed in the classroom setting before
they complete the evaluations. EXAMPLE: Before starting class, explain to students that
the topic will regard “simulations” and that this topic will be on their Exit Interview
Questions.
Procedures for Measuring Each Student Outcome Indirectly
Dr. Roden assigned the following to each Committee member so data will be ready for
discussion on Friday, June 14, 2013. Assignments are as follows:
1, 2.1, 2.2 Dr. Roden
2.3, 2.4, 2.5 Dr. Makki
2.6, 2.7, 3 Dr. Liu
4, 5, 6 Dr. Andrei
7, 8, 9 Dr. Osborne
Dr. Osborne suggested that each member review the indirect and direct measures and see
how they fit. Results will be written first, and then the action(s) will be written. He
reminded members to just write results, not actions; actions will be discussed and then
approved by the members.
Table of Contents
Dr. Roden's handout "Table of Contents" was reviewed with the committee.
SELF-STUDY
Background Information - Data is from 2009
Criterion 1: Students - Dr. Osborne
Criterion 2: Program Educational Objectives - Dr. Roden wrote summary; Dr. Andrei will review summary
Criterion 3: Student Outcomes - Dr. Osborne
Criterion 4: Continuous Improvement - Dr. Roden & Dr. Osborne; justifications needed
Criterion 5: Curriculum - Dr. Bo Sun
Criterion 6: Faculty - Missing classes Dr. Tran taught; Mrs. Rode will provide data
Criterion 7: Facilities - Dr. Makki & Mr. Frank Sun
Criterion 8: Institutional Support - Dr. Andrei
Appendices A - E
Dr. Roden reviewed the appendices with the committee.
Appendix A: Course Syllabi - Dr. Roden is in the process of reviewing syllabi
Appendix B: Faculty Vitae - Dr. Roden has completed this area
Appendix C: Equipment - Mr. Frank Sun provided this data
Appendix D: Institutional Summary - Dr. Andrei & Greg Marsh
Appendix E: Curriculum Map - This has been completed
Signature Attesting to Compliance - This has been completed
Assessment Committee will provide Dean Brenda Nichols a copy of the material by
Friday, June 21, 2013 for her review.
Dr. Roden informed Committee that the next Assessment Committee Meeting will be
held on Friday, June 14, 2013 at 1:30pm in Room 59A.
Dr. Roden asked the members if there was any other business for discussion. No one had
any further business.
Dr. Andrei motioned to adjourn the meeting. Dr. Osborne seconded the motion. Members
unanimously agreed by saying “Aye”. Dr. Roden adjourned the meeting at 3:45pm.
Department of Computer Science
Assessment Committee Meeting
June 14, 2013
Maes Building, Room 59A
Committee Members
Dr. Roden, Chair; Dr. Andrei; Dr. Liu; Dr. Makki; Dr. Osborne
Invited Guest Dr. Doerschuk
In Attendance
Dr. Roden, Chair; Dr. Andrei; Dr. Makki; Dr. Osborne; Dr. Bo Sun
Minutes Taken By Mrs. Denise Rode, Administrative Associate Sr.
Assessment Committee Agenda
June 14, 2013
I. Approval of Minutes of Last Meeting on June 10, 2013.
II. Assessment for 2012-2013 Academic Year.
a. Discussion of Results of Assessment (all committee members)
i. Outcomes 1, 2.1, 2.2 - Roden
ii. Outcomes 2.3, 2.4, 2.5 - Makki
iii. Outcomes 2.6, 2.7, 3 - Liu
iv. Outcomes 4, 5, 6 - Andrei
v. Outcomes 7, 8, 9 - Osborne
b. Status of Self-Study (Roden)
III. Other Business.
IV. Adjourn.
Dr. Roden, Chair called the meeting to order at 1:40pm.
Approval of Meeting Minutes
Dr. Andrei motioned for the June 3, 2013 Assessment Committee Meeting Minutes to be approved. Dr. Makki seconded the motion. Dr. Roden asked members to say "Aye" if they were in agreement, and the members replied "Aye"; they agreed unanimously.
Dr. Osborne motioned for the June 10, 2013 Assessment Committee Meeting Minutes to be approved. Dr. Andrei seconded the motion. Dr. Roden asked members to say "Aye" if they were in agreement, and the members replied "Aye"; they agreed unanimously. Dr. Osborne will forward the Fall 2012 ETS scores to Dr. Liu.
Dr. Roden will be submitting the ABET 2012-2013 Accreditation Report to Dean Nichols on Thursday, June 20, 2013. He will email the updated report to all committee members for their review on Saturday or Sunday.
Discussion of Results of Assessment
Dr. Roden's assignment was Outcomes 1, 2.1 and 2.2.
Student Outcome 1: Software Fundamentals
Performance Criteria:
1. Apply UML interaction diagrams and class diagrams to illustrate object models.
2. Apply important design patterns to OOD.
3. Create useful software architecture documentation.
4. Develop correct and efficient programs.
5. Debug implemented software in a proficient manner.
6. Design user interfaces appropriate to a large software system.
7. Develop user-level documentation for software.
Student Outcome 2.1: Computer Science Technology Skills - Discrete Mathematics
and Structures.
Performance Criteria:
1. Be able to develop software to support specific operations on frequently used discrete
structures such as lists, trees and graphs.
2. Be able to use elementary concepts of combinatorics, probability, and statistics to
analyze and evaluate the efficiency of algorithms.
3. Be able to use concepts of discrete mathematics, automata, and finite state machines to
explain the design of computer hardware.
Results:
All direct measure criteria were met this year, compared to last year when criterion 2
(COSC 3304) was not met (60%). For the first time since 2010, we met measures for
Outcome 2.1.1. We did not meet measures for Outcome 2.1.2 from 2010 through 2012;
however, we did meet measures for the 2012-2013 academic year. Measures were met in
2012-2013 for Outcome 2.1.3.
Indirect data was low and we will bring this to the instructor's attention.
All measures were met for this outcome in the Student Evaluations. Unable to compare to
last year as this was not assessed.
There are no questions on the Exit Interview, Exit Survey or Alumni Survey for this
outcome.
Actions:
Since there was an overall improvement from last year and all targets were met, no
actions are planned for the 2013-2014 academic year.
Second Cycle Results:
Last year we proposed that beginning with Fall 2012, the MATH 2305 Discrete
Mathematics course would become a prerequisite for COSC 3304 Algorithm
Analysis and Design.
This is our second cycle and we met our targets with improvement over last year.
Student Outcome 2.2: Computer Science Technology Skills - Analysis and Design of
Algorithms.
Performance Criteria:
1. Demonstrate basic understanding of asymptotic notations and time complexity.
2. Design efficient algorithms and compare competing designs.
3. Demonstrate basic understanding of some design approaches such as greedy
algorithms, dynamic programming and divide-and-conquer.
4. Demonstrate familiarity with standard searching and sorting algorithms and linear and
non-linear structures.
Results:
All direct measure criteria were met this year as compared to last year when criteria 1 and
2 were not met (both 60%). Students indicated that they did not feel they had a firm grasp
of the topic, “Algorithms”.
In the Student Evaluations, question 39 did not meet the target in COSC 3304.
Question 39: "This course provides you with a firm theoretical foundation for the subject
of the course." Unable to compare to last year as this was not assessed.
In the Student Evaluations, question 40 did meet the target. Unable to compare to last
year as this was not assessed.
There are no questions on the Exit Interview, Exit Survey or Alumni Survey for this
outcome.
Actions: Since the results are mixed (direct measures improved while student evaluations
were not met), no actions are planned for the 2013-2014 academic year except to
continue to use student evaluations in COSC 3304 to measure this outcome.
Second Cycle Results: Last year we proposed that beginning with Fall 2012, MATH
2305 Discrete Mathematics become a prerequisite course for COSC 3304 Algorithm
Analysis and Design.
The measure of online question 39 was 3.56, which does not meet our target but is not
extremely low. We will discuss instructional methods of teaching analysis with
prospective teachers.
Dr. Makki's assignment was Outcomes 2.3, 2.4 and 2.5.
Student Outcome 2.3:
Performance Criteria:
1. Demonstrate basic knowledge of equivalences between various types of languages and
corresponding accepting devices including Turing Machines.
2. Demonstrate basic knowledge of practical applicability of various types of grammar
and of some standard representation forms.
3. Demonstrate knowledge of limitations of computational capability of computer
grammars.
4. Demonstrate basic understanding and appreciation of the various essential
programming languages constructs, paradigms, evaluation criteria, and language
implementation issues.
5. Demonstrate basic knowledge and skills in programming techniques with the focus on
concepts and not on a particular language.
Results:
The performance targets for all of the criteria were met for the direct measure, with a
sample size of 7 for 2012-2013. We had similar results last year. The performance
targets of this outcome have also been met for the indirect measure, with a sample size
of 4 for 2012-2013. However, for 2011-2012, course evaluations were not used to
evaluate Outcome 2.3.
Actions:
No action is mandated by our assessment.
Second Cycle Results:
Last year the instructor was made aware that knowledge of the limitations of
computational capability of computer grammars should be covered in class, and this
year the instructor successfully incorporated that material into the curriculum for
COSC 3302.
Student Outcome 2.4:
Performance Criteria:
1. Knows the main components of an operating system and their purposes and modes of
interaction.
2. Knows the structure of device drivers and the interaction between device drivers and
operating systems.
3. Outlines the basic issues in memory management design and virtual memory.
4. Can develop basic system applications based on operating system APIs.
Results:
The performance targets for all four criteria were met for the direct measure, with a
sample size of 15 for 2012-2013, compared to similar results last year with a sample
size of 11. The performance targets of this outcome have also been met for the indirect
measure, with a sample size of 44 for 2012-2013. However, for 2011-2012, course
evaluations were not used to evaluate Outcome 2.4.
Action:
No action is mandated by our assessment.
Second Cycle Results
None.
Student Outcome 2.5:
Performance Criteria:
1. Demonstrate the application of Entity-Relational diagrams to model real world
problems.
2. Design relations for real world problems including implementation of normal forms,
keys and semantics constraints for each relation.
3. Demonstrate competence in implementations of database applications.
Results:
The performance targets for all three criteria were met for the direct measure, with a
sample size of 6 for 2012-2013. In 2011-2012, criterion 2 was not met, with a sample
size of 4. However, the performance targets of this outcome have not been met for the
indirect measure, with a sample size of 6 for 2012-2013; for 2011-2012, course
evaluations were not used to evaluate Outcome 2.5.
The Exit Interview results were 3.75 for a sample of 8 for this outcome, which showed
that the outcome has been met (question 15). For 2011-2012, the result was 4.50 for a
sample of 8.
The Alumni Survey results were 4.17 for a sample of 9 for this outcome, which showed
that the outcome has been met (question 15). For 2011-2012, the result was 4.17 for a
sample of 6.
Actions:
Since direct measure targets were met for all performance criteria and the sample size
was small, no actions will be taken except to notify the instructor of the results of the
student evaluations (since evaluations did not meet targets for indirect measures).
Second Cycle Results:
None.
Dr. Liu’s assignment was Outcomes 2.6, 2.7 and 3.0.
Dr. Liu motioned to have Student Evaluation Question Number 3 removed from the
report Direct Measure and Indirect Measure Comparison 2012-2013 for COSC 3308-01
question 38 and COSC 3308-01 question 39. All members were in unanimous agreement.
Student Outcome 2.6:
Performance Criteria:
1. Employ the socket API to program applications among independent hosts.
2. Explain common network architectures, the services provided by each layer, and the
protocols required for connecting peer layers.
3. Evaluate network models through simulation and the use of common performance
metrics for networks.
Results:
The results are all satisfactory in 2012-2013.
In 2011-2012, performance criterion 2 was not met, with 66.67% and a sample size of 6
students. The department made improvement on performance criterion 2, reaching
83.00% with a sample size of 18 students.
Student Outcome 2.7:
Performance Criteria:
1. Understands modern ISA design principles and employs them to evaluate systems.
2. Know how to measure performance for different computer architectures.
3. Demonstrate knowledge of hardware implementation of numbers and arithmetic
operations.
Results:
The results are all satisfactory in 2012-2013 except for direct measure performance
criterion 2.
Performance criterion 2 was not met, with 50% and a sample size of 8 students in
2012-2013. In 2011-2012, performance criterion 2 was met, with 100% and a sample
size of 9 students. Final exam questions were used to assess this performance criterion.
Actions:
In 2012-2013, the department will monitor performance criterion 2 and the instructor will
give more attention to this area.
Student Outcome 3:
Performance Criteria:
1. Justify why selected research methods were chosen and state the intended outcomes of
the study.
2. Identify steps used in a particular study.
3. Outline and explain the key features of the adopted method.
4. Analyze and interpret collected data based on the adopted method and draw
appropriate conclusions.
Results:
The results are all satisfactory in 2012-2013 except for some indirect measure Student
Course and Instructor Evaluation results.
The department made improvement on CPSC 3320. The targets of the Student Course
and Instructor Evaluation were not met for CPSC 3320 in 2011-2012 but are all met in
2012-2013. The cumulative averages were 3.67 for question 38 and 3.67 for question 40
with a sample size of 6 students in 2011-2012. In 2012-2013, the cumulative averages
are 4.22 for question 38 and 4.00 for question 40 with a sample size of 23 students.
The department also made improvement on COSC 4310. The target of the Student
Course and Instructor Evaluation question 38 was not met, with an average of 3.50, for
COSC 4310 in 2011-2012 and is met, with an average of 3.86, in 2012-2013. The sample
size is 8 for both years. The results for Student Course and Instructor Evaluation
questions 35 and 38 also improved: the cumulative average for question 35 was 3.25 in
2011-2012, and in 2012-2013 it is 3.71, very close to our target of 3.75.
The targets for the Student Course and Instructor Evaluation for COSC 3308 were all met
with an average of 4.09 for questions 38 and 39 in 2011-2012. The targets are not met
with an average of 3.55 for questions 38 and 39 in 2012-2013; however, the cumulative
averages are very close to our target of 3.75.
Dr. Andrei’s assignment was Outcomes 4, 5 and 6.
Student Outcome 4: Societal Awareness: Graduates will be aware of and
understand the impact of computer technology on society at large, on the workplace
environment and on individuals.
Performance Criteria:
1. Demonstrate understanding of evolving computer technology applications.
2. Demonstrate knowledge of positive social impacts including information globalization,
E-Commerce, E-learning and new job creation.
3. Demonstrate knowledge of negative social impacts including internet pornography,
privacy violation, health hazards, computer crimes and dehumanization.
4. Demonstrate basic understanding of intellectual property protection via copyright and
patent law and fair use exception for copyrighted software.
Results:
The sample size is 17, which is comparable with last year, and the direct results met our
targets for all of the performance criteria of Student Outcome 4 (89%, 89%, 91.9%, and
93%). Analyzing question 41 ("Obtain Information/Local & Global impact-societal
issues") from the Student Course and Instructor Evaluation, the target has been met for
COSC 3325 and CPSC 4360 with 4 out of 5 and 4.25 out of 5. The average was only 3.43
instead of the minimum of 3.75.
As for the Exit Interview, question 5 ("Your education fostered an understanding of the
impact of the discipline on relevant local and global social issues") and question 9 ("Your
education fostered an awareness of professional and ethical responsibilities and their
application in real situations") were both met in 2012-2013 with averages of 4.12 and
3.75, respectively, comparable with the 2011-2012 academic year.
There is no question for the Exit Survey which reflects this student outcome.
As for the Alumni Survey, question 9 ("Your education fostered an awareness of
professional and ethical responsibilities and their application in real situations") was met
in 2012-2013 with an average of 4.11 out of 5, comparable with the 2011-2012
academic year. However, question 5 ("Your education fostered an understanding of the
history of computer science and the impact of the discipline on relevant social issues")
did not meet the target, as it had an average of only 3.89, below our goal of 4.
Indirect measure targets were met for the 2012-2013 academic year.
Actions:
COSC 1172 is only a one-credit-hour course, so as an action we shall consider weighting
the future indirect measure by the number of credit hours (that is, this course will count
only one-third as much as each of COSC 3325 and CPSC 4360).
Second Cycle Results:
We did not initiate any actions last year, so there are no second cycle results to report.
Student Outcome 5: Ethical Standards: Graduates will be able to recognize and
understand the importance of ethical standards as well as their own responsibilities
with respect to the computer profession.
Performance Criteria:
1. Know the differences of various philosophical views on ethics such as deontology,
utilitarianism, egoism, and relativism.
2. Understand the ACM or a similar professional body's code of ethics and the principles
underlying those ethics.
3. Honor the property rights of others including copyrights and patents.
4. Demonstrate ability for ethical decision making within the computer profession.
5. Demonstrate knowledge of factors affecting fair resolution of conflicts of interest.
Results:
This is the fourth consecutive year in which the direct measures, the Student Course
and Instructor Evaluation, Exit Interview question 9 ("Your education fostered an
awareness of professional and ethical responsibilities and their application in real
situations"), and Exit Survey question 16 ("I am cognizant of ethical issues and local
and global societal concerns relating to computers in society") were all consistent in
achieving their targets.
As for the Alumni Survey, question 9 ("Your education fostered an awareness of
professional and ethical responsibilities and their application in real situations") was met
in 2012-2013 with an average of 4.11 out of 5, comparable with the 2011-2012
academic year.
Actions:
No additional actions are planned for academic year 2012-2013.
Second Cycle Results:
We did not initiate any actions last academic year, so there are no second cycle results to
report.
Student Outcome 6: Collaborative Work Skills: Graduates will demonstrate the
ability to work effectively in teams to conduct technical work through the exercise of
interpersonal communication skills.
Performance Criteria:
1. Demonstrate the ability to work in heterogeneous environments which are diverse in
gender, ethnicity, and academic accomplishment.
2. Attend team meetings and contribute towards solution of technical problems during the
meetings.
3. Make appropriate contributions within their skill set to the completion of the project.
4. Demonstrate a sense of interdependence with other team members.
Results:
All Direct Measure targets were met, with high scores on all of the criteria. The sample
size is 26, much higher than last academic year's sample size of 7. CPSC 4340 showed
that the students did not feel they understood how to collaborate in teams. This is an
improvement from last year in the sense that the Indirect Measure for the COSC 4302
Student Evaluations matched the Direct Measure.
CPSC 4340 got a flat 3.00 for all of questions 25, 26, 34 and 35. Also, CPSC 4340 has a
small sample size of 6.
The Exit Interview scores matched the expectations for all questions (4, 7, 11, 13 and 14),
except number 8 ("Your education developed in you skill in communication and
cooperation within groups and larger organizations"), which got 3.63, close to the target of
3.75. The Assessment Committee agreed to remove the words "and larger organizations"
from the question, as that is not what we intended to measure.
The Alumni Survey scores matched the expectations for all questions (4, 7, 8, 11 and 14),
except number 13 ("Your education provided a sufficient educational foundation for
leadership roles along future career paths"), which got 3.89, close to the target of 4.0.
Actions:
No actions seem to be indicated from the Indirect and Direct Measures of performance
for Student Outcome 6.
Second Cycle Results:
On Direct Measures, the performance has met targets since 2007-2008. No actions were
taken last year. Thus, there are no second cycle results to report.
Dr. Osborne's assignment was Outcomes 7, 8 and 9.
Student Outcome 7: Oral Communications: Graduates will demonstrate their
ability to verbally communicate clearly.
Performance Criteria:
1. Demonstrate the ability to communicate in a given situation.
2. Demonstrate the ability to comprehend what is said and to show an appreciation of the
importance of listening.
3. Communicate clearly at the level of the audience the technical material intrinsic to the
discipline of computer science.
4. Demonstrate knowledge of the communication process.
Results:
All targets for the direct measurement of the performance criteria had been met for the last
five years, until this year when criterion 4 ("Demonstrate knowledge of the
communication process") had only 75.7% of the students doing satisfactory work. The
sample size was 27. On question 34 of the online student evaluations ("Students had the
opportunity to communicate design and implementation concepts to professionals and
non-professionals"), the target was met; the average was 3.86. We did not meet targets
for Exit Interview question 8 or Alumni Survey question 18.
Actions:
In COSC 4172, we will conduct a review of methods for giving an effective presentation.
Second Cycle Results:
None.
Student Outcome 8: Written Communication Skills: Graduates will demonstrate their
ability to write effectively both technical and non-technical materials with appropriate
multimedia aids.
Performance Criteria:
1. Provide an introduction that grabs the attention of readers.
2. Organize documents in terms of a few main points or themes.
3. Choose appropriate illustrations, examples, or evidence to support the written
documents.
4. Write appropriately for specified readers in terms of technical content.
5. Write organized, grammatically correct reports.
Results:
All direct measure targets were met and indirect measure targets were met (except COSC
1172). This was the sixth consecutive year.
Actions: None.
Second Cycle Results: None.
Student Outcome 9: Continuing Education and Lifelong Learning: Graduates will
be able to demonstrate that they can independently acquire new computing related
skills and knowledge in order to pursue either further formal or informal learning
after graduation.
Performance Criteria:
1. Be able to search scholarly publications to assist in resolving problems.
2. Participate in ACM and/or UPE.
3. Intend to engage in additional formal education or participate in employer-related
training or research projects.
4. Participate in the Honors Program or in undergraduate research at Lamar. This could be
done in the INSPIRED or STAIRSTEP Programs, presentations or posters at
professional conferences, or COOP or internship position reports.
Results:
The Assessment Committee agreed to remove COSC 4172 question 35, as it is not listed in
the syllabus for this course. All targets for Outcome 9 come from the online Student
Evaluation. The performance targets of this outcome have not been met for the direct
measure, missing the target by 0.04.
Actions: The Assessment Committee will analyze the rubric used to assess criterion 9.3
to determine if it should be modified to include other elements that would indicate if
students are capable of independent study.
Second Cycle Results: None.
Appendix K – Course Schedules 2012-2013
Computer Science Schedule Fall 2012
Instructor | Course | Course Name | Time | Day | Room | CRN
Andrei | COSC 5100-01 | Graduate Seminar | 10:20-11:15 | F | 109 | 90623
Andrei | COSC 2336-01 | Fundamentals III | 12:40-1:35 | MWF | 111 | 92801
Andrei | COSC 3308-01 | Programming Languages | 9:10-10:05 | MWF | 111 | 90636
Beard | COSC 1371-02 | Microcomputers | 9:10-10:05 | MWF | 107 | 90612
Beard | COSC 1371-03 | Microcomputers | 10:20-11:15 | MWF | 107 | 90613
Beard | COSC 1371-04 | Microcomputers | 11:30-12:25 | MWF | 111 | 90614
Beard | COSC 1371-05 | Microcomputers | 12:40-1:35 | MWF | 107 | 90615
Doerschuk | CPSC 4360-01/5360-01 | Intro Software Engineering | 12:45-2:05 | TR | 111 | 90616/90637
Koh | COSC 4301-03 | Programming for Graduate Student | 9:35-10:55 | TR | 111 | 90638
Koh | COSC 5315-01 | Foundations of Computer Science | 3:50-5:10 | MW | 111 | 90631
Liu | CPSC 4330-01/5330-01 | Multimedia Processing | 11:30-12:25 | MWF | 109 | 90622/91637
Makki | COSC 1337-01 | Fundamentals II | 9:35-10:55 | TR | 109 | 90624
Osborne | COSC 5369-01 | Graduate Project | 5:30-6:50 | TR | 109 | 90632
Osborne | COSC 4172-01 | Senior Seminar | 8:00-9:20 | R | 109 | 90627
Roden | COSC 4301-01/COSC 5340-01 | Games for Handheld Devices | 12:40-1:35 | MWF | 109 | 93414/93415
Roden | COSC 1336-01 | Fundamentals I | 11:30-12:25 | MWF | 107 | 90611
Bo Sun | CPSC 3320-01 | Computer Networks | 11:10-12:30 | TR | 111 | 92802
Frank Sun | COSC 1371-01 | Microcomputers | 8:00-8:55 | MWF | 107 | 90633
Tran | COSC 5311-01 | Data Mining | 11:10-12:30 | TR | 109 | 92804
Wang | COSC 1173-01 | Programming Lab | 9:35-10:55 | T | 213 | 92805
Wang | COSC 1173-02 | Programming Lab | 9:35-10:55 | R | 213 | 90639
Wang | COSC 1371-07 | Microcomputers | 11:10-12:30 | TR | 107 | 90619
Web Courses
Instructor Course Course Name Time Day Room CRN
Beard COSC 3321-48F Advanced Microcomputer Apps WEB *Web 90794
Chiou COSC 1371-48F Microcomputers WEB *Web 90783
COSC 3320-48F Web Design/XHTML WEB *Web 90793
Doerschuk COSC 4301-48F/5340-48F Machine Learning WEB *Web 92806/92807
Koh COSC 1371-49F Microcomputers WEB *Web 90785
Liu COSC 1172-48 Thinking, Speaking, Writing WEB *Web 90780
COSC 4310-48F Computer Architecture WEB *Web 92812
Makki COSC 1337-48F Fundamentals II WEB *Web 92808
CPSC 4340-48F/CPSC 5340-48F Database WEB *Web 92809/92810
Osborne COSC 5328-48 Computer Networks WEB *Web 92813
Sun, Bo COSC 4302-48 Operating Systems WEB *Web 92814
Sun, Frank COSC 4332-48 Programming for Mobile Devices WEB *Web 93028
Wang COSC 1336-48F Fundamentals I WEB *Web 90782
COSC 1173-48F Programming Lab WEB *Web 92815
COSC 3306-48F C++/Unix WEB *Web 90792
Computer Science Schedule Spring 2013
Instructor Course Course Name Time Day Room CRN
Andrei COSC 5369-01 Graduate Project 11:30 - 12:25 MWF 111 10259
Beard COSC 1371-02 Microcomputers 9:10 - 10:05 MWF 107 10289
COSC 1371-03 Microcomputers 10:20 - 11:15 MWF 107 10244
COSC 1371-04 Microcomputers 11:30 - 12:25 MWF 107 10245
COSC 1371-05 Microcomputers 12:40 - 1:35 MWF 107 10246
Doerschuk CPSC 4370-01/CPSC 5370-01 Artificial Intelligence 9:35 - 10:55 TR 111 10249/10250
COSC 1336-02 Fundamentals I 12:45 - 2:05 TR 111 14164
Koh COSC 3302-01 Computer Theory 12:45 - 2:05 TR 109 14161
COSC 5313-01 Algorithm Analysis and Design 3:50 - 5:10pm MW 111 10254
Liu COSC 5310-01 Adv. Computer Architecture 10:20 - 11:15 MWF 111 13995
COSC 4310-01 Computer Architecture 12:40 - 1:35 MWF 111 14004
Makki COSC 2336-01 Fundamentals III 10:20 - 11:15 MWF 109 10253
COSC 1337-01 Fundamentals II 12:40 - 1:35 MWF 109 10258
Osborne COSC 5302-01 Adv. Operating Systems 5:30 - 7:00pm MW 109 10260
COSC 4172-01 Senior Seminar 3:50 - 5:15pm M 109 10261
COSC 5100-01 Graduate Seminar 9:10 - 10:05 W 111 14921
COSC 4301-01 Programming for Graduate Student 2:20 - 3:40pm TR 111 14026
Roden COSC 1336-01 Fundamentals I 11:30 - 12:25 MWF 109 14160
COSC 4319-01/COSC 5321-01 Graphics 8:00 - 8:55 MWF 109 13997/13999
Bo Sun COSC 4302-01 Operating Systems 11:10 - 12:30 TR 111 12859
Frank Sun COSC 1371-01 Microcomputers 9:35 - 10:55 TR 107 10266
CPSC 4315-01 System Administration 2:20 - 3:40pm TR 109 12858
Tran COSC 3304-01 Algorithm Analysis and Design 9:10 - 10:05 MWF 109 14002
Wang COSC 1173-01 Programming Lab 2:20 - 3:10pm R 213 14006
COSC 2372-01 Organization & Assembly 9:35 - 10:55 TR 109 14003
Web Courses
Instructor Course Course Name Time Room CRN
Andrei CPSC 4360-48F/CPSC 5360-48F Software Engineering TBA *Web 13346/13347
COSC 3325-48F Computer Law and Ethics TBA *Web 14165
Beard COSC 3321-48F Advanced Microcomputer Application TBA *Web 14059
Chiou COSC 1371-48F Microcomputers TBA *Web 13338
COSC 3320-48F Web Design/XHTML TBA *Web 13345
Haynes COSC 4320-48F Advanced Web Design TBA *Web 14062
Makki COSC 1337-48 Fundamentals II TBA *Web 13339
Jarrell COSC 3323-48F Fundamentals of Digital Media TBA *Web 14060
Liu COSC 1172-48 Thinking, Speaking, Writing TBA *Web 13337
Frank Sun COSC 3301-48F Computer Security TBA *Web 13342
Wang COSC 1336-48F Fundamentals I TBA *Web 13750
COSC 1173-48F CS1 Lab TBA *Web 14058
COSC 1371-49F Microcomputers TBA *Web 13322
Computer Science Two-Year Class Rotation Schedule
o-Spring o-Summer o-Fall e-Spring e-Summer e-Fall Sections Desc Online
COSC class online class online class online class online class online class online
1172 1 1 1 1 4 every long every long
1173 2 1 1 2 1 2 1 1 2 1 14 every
1371 7 2 2 1 7 2 7 2 2 1 7 2 42 every every
1381 1 1 odd summer odd summer
1336 1 1 1 2 1 1 1 2 1 11 every fall
1337 1 1 1 1 1 1 6 every long spring
2336 1 1 1 1 4 every long odd fall
2372 1 1 1 2 spring odd spring
3301 0
3302 1 1 2 spring even spring
3304 1 1 1 spring even spring
3306 1 1 1 1 4 long long
3308 1 1 2 fall odd fall
3320 1 1 1 1 4 every long every long
3321 1 1 2 spring spring
3325 1 1 2 spring odd spring
4172 1 1 1 1 4 every long
4301 0
4302 1 1 1 1 4 long even fall
4307 1 1 2 summer
4309 1 1 odd summer
4310 1 1 1 fall even fall
4319 1 1 2 even summer
4322 1 1 even summer
4324 1 1 odd summer
4341 1 1 1 1 4 every long
4342 1 1 1 1 4 every long
4345 1 1 2 spring
CPSC class online class online class online class online class online class online
3316 1 1 odd spring
3320 1 1 2 fall even fall
4315 1 1 2 summer
4328 1 1 2 summer
4330 1 1 2 fall ???
4340 1 1 1 1 4 fall & summer even fall
4360 1 1 1 1 4 every spring odd spring
4370 1 1 2 spring odd spring
ELEN class online class online class online class online class online class online
3431 "1" "1" 0 fall by EE
22 12 10 1 21 9 21 11 10 2 19 11
Sections 34 11 30 32 12 30 149
Appendix L – Advisement by STARS
L.1 – Lamar Enrollment Agreement
LAMAR UNIVERSITY I WILL Enrollment Agreement
Students who do not meet the requirements for “unconditional admission” to Lamar University
will be considered on an individual approval basis termed I Will admission. Lamar University is
committed to higher educational opportunity and recognizes that traditional formal admission
requirements are imperfect predictors of student success. Effort, dedication, and related
intangible factors do matter; hence, I Will. Lamar is equally committed to student success and
behaviors indicative of future achievement. I Will students begin their college careers within a
structured higher educational environment specifically created with their needs, the needs of their
fellow students, and the requirements of the university in mind. Lamar University is committed
to providing support for success to I Will students through:
Mandatory advisement and registration: I Will students are required to meet with Undergraduate
Advisement Center advisors at least twice every semester to discuss academic and personal
progress, choose classes, and register. Enrollment hours and course selections are subject to
advisor approval, and I Will students may be required to wait until grades post before enrolling
for future semesters or terms. Upon release from the I Will agreement, students may still be
subject to registration restrictions.
Temporary limits on enrollment: I Will students are limited to a maximum of 14 credit hours in
their first semester.
Texas Success Initiative (TSI) remediation (if required): I Will students who did not pass one or
more of the three test areas for college readiness must be enrolled in at least one of those areas
every semester until fully TSI complete.
Support Services: I Will students are required to participate in support programs and services
offered through Lamar’s Center for Academic Success (“STARS” Center). As appropriate, I Will
students must avail themselves of financial assistance and counseling services offered by the
university.
To continue to matriculate at Lamar University, I Will students must complete the following
requirements during the first semester of enrollment (Please initial after each condition
indicating your understanding):
1. Earn nine college-level credit hours. _____ (initial)
2. Earn a grade of “C” or higher in an English or mathematics course. _____ (initial)
3. Earn a grade of “C” or higher in a study skills course (PEDG 1271 or PSYC 2270).
_____ (initial)
4. Earn a grade of “C” or higher in LMAR 1101 (University Success Seminar). _____
(initial)
5. Have an overall (cumulative) Lamar University grade point average of 2.0 or above.
_____ (initial)
6. Not have an outstanding financial obligation (in excess of $50.00) to LU for the
completed semester. _____ (initial)
7. Not have a disciplinary offense, including academic dishonesty (following due process
adjudication). _____ (initial)
8. Meet a minimum of twice a semester with an advisor in the Undergraduate Advisement
Center. _____ (initial)
9. Utilize support programs and services as appropriate and as recommended by an advisor.
_____ (initial)
Student Name (print):
__________________________________________________________________
ID#: ________________________ Semester of entry: ________________________
Attention: Failure to comply with any of the above conditions will result in suspension from
Lamar University without appeal. Students who do not meet I Will conditions may return to
Lamar University only by transferring at least 18 hours with a 2.0 or higher GPA from another
institution. Any exception to admission decisions or conditions requires the approval of the
Associate Vice President for Strategic Enrollment Management. As an I Will student given this
enrollment opportunity, you will be held accountable for the above conditions. Your signature
below indicates that you voluntarily elect to accept enrollment under the guidelines stated in this
agreement.
Student Signature: __________________________________ Date: _____________________
Advisor Signature: __________________________________ Date: _____________________
L.2 – Advising Communication Timeline – Fall Semester
Advising Communication Timeline - Fall Semester
September
Early * Email welcome letter to students:
a. Include list of campus resources
b. Remind what good academic standing means (2.0 GPA)
c. Encourage advisor contact for assistance or questions; include phone
number
Mid * Advisors: begin calling students
- Be supportive in asking how classes are going; discuss course
load/syllabi
- Politely remind students of contract requirements and schedule
appointment
- Remind students of the 12th class day and explain what that means:
a. Students can go to their SSB account and drop a class themselves
b. This drop will NOT count toward the 6-drop rule
c. This is the last day for a full refund of dropped (not withdrawn)
courses
Late * 1st Progress Reports requested
October
Early * 1st Progress Reports requested/obtained
* Advisors: follow-up phone calls/emails regarding progress reports
Mid * Advisors: continue calling students and meet with scheduled
appointments
- Inform students that the Class Schedule will be available online at the
end of October
Late * 2nd Progress Reports requested
* Email letter to students:
a. Encourage students to follow through with contract requirements
b. Schedule a meeting with their advisor; Seek academic assistance
c. Indicate last drop/withdrawal date with academic penalty; spring
advisement begins November 1st; and conditional registration may be
required
November
Early * 2nd Progress Reports requested/obtained
* Advisors: continue follow-up with students; begin Spring Advisement
- Review Progress Reports with students
- Explain conditional registration, if required
- Confirm phone/email contact information for accuracy and ask
students if they have received prior emails
Mid * Advisors: heavy advisement continues and open registration begins
Late * Advisors: follow-up phone calls/emails
* Email letter to students:
a. Remind students of consequences of not fulfilling contract
requirements
b. Encourage students to contact their advisor immediately
December
Early * Advisors: heavy advisement and registration continues
Mid * Email: LU will be closed (list dates); Advisement is mandatory prior to
students being allowed to register; Advisement will resume on (date)
* Begin evaluating grades as they are available
- Contact students about eligibility
* Revise Communication Timeline for the Spring term
L.3 – Lamar Retention Programs
College Program or Unit Name
Year of Inception
Description Target Population Funding
Arts & Sciences Dr. Brenda Nichols, Dean
Biology Dr. Matthew Hoch, Department Chair
Chemistry Dr. Paul Bernazzani, Department Chair
Tutoring Tutoring for chemistry students. We target those who are taking a chemistry course.
Local
Computer Science Dr. Lawrence Osborne, Dept. Chair
INSPIRED / STAIRSTEP
About 10-12 student assistants work on tutoring and outreach for computer science majors, with a special focus on underrepresented populations and minorities within this group. Tutoring programs for science-related majors.
All computer science majors. Multi-discipline target including math, physics, earth & space sciences, chemistry, and computer science.
National Science Foundation (NSF) / NSF
Earth & Space Sciences Dr. Jim Jordan, Department Chair
Informal Tutoring
On a case-by-case basis, the department tries to find upper-level students to tutor a student needing assistance in the specific course.
All students in an earth & space science course who request help from the department.
None
English & Modern Languages Dr. Steven Zani, Department Chair
History Dr. Mary Kelley-Scheer, Dept. Chair
Informal Tutoring
On a case-by-case basis, a graduate student will try to help with any History course as needed.
Any student in a History course.
None
Mathematics Dr. Paul Chiou, Department Chair
Tutoring Lab Mentoring Program
1995 The lab provides free tutoring for students who take lower-level mathematics courses, including the math core courses (College Algebra and Elementary Statistics). Individual faculty members voluntarily serve as mentors for Mathematics majors.
Students who take lower-level mathematics courses, including the math core courses; Mathematics majors.
Local funding (tuition and fees); None
Nursing Dr. Eileen Curl, Department Chair
The Caring Place 2003 Graduate Assistants provide facilitated learning sessions for students who request or need additional assistance in learning concepts and information. Students sign a contract that they will come to The Caring Place prepared (having read the assignments). Our role is to facilitate and support their active learning, but we do not spoon-feed information to them. Our goal is for them to become active learners who know how to learn.
Our resource is open to all nursing students who have been admitted into our undergraduate ADN and BSN programs.
Initial funding from the THECB grant and support from St. Elizabeth’s Hospital. Now funding is internal through the use of Graduate Assistants.