
DETERMINING THE BASELINE OF GENERAL CHEMISTRY

STUDENT PERFORMANCE AT A TIER 1 UNIVERSITY

Gregory Allen†, R. Alan Gamage†,‡, Alberto Guzman-Alvarez‡, Catherine Uvarov†,‡, Chris Pagliarulo‡, Marco Molinaro

†Department of Chemistry, ‡iAMSTEM Hub (part of the Office of Undergraduate Education), University of California Davis, CA 95616, United States

STUDENT ATTITUDES

CLASS-Chem Assessment

Changes in student attitudes can be measured for each class. The Colorado Learning Attitudes about Science Survey for Chemistry (CLASS-Chem) is an assessment with ten constructs that gauges how students' perceptions of chemistry and their approach to learning compare to those of experts in the field.2 The CLASS-Chem assessment was implemented in Spring Quarter 2014, when second- and third-quarter general chemistry (2B and 2C, respectively) were taught in parallel. The rows in green indicate that students in Chem 2C showed two to three times larger gains in expert perspectives than students in Chem 2B. Further analysis is necessary to determine the factors to which these changes can be attributed. Possible factors include differences in content, student maturity, proportion of student majors (physical science vs. life science), and instructional practices.
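As a rough illustration of the kind of per-construct pre/post comparison reported in Table 2, the sketch below runs a paired t-test on one construct's percent-favorable scores. The scores, cohort size, and data layout are invented for the example; the poster does not describe the actual analysis code, and the Table 2 time-by-group p-values would additionally require an interaction model spanning both courses.

```python
# Hypothetical sketch of a per-construct pre/post comparison for
# CLASS-Chem-style data. Scores and cohort size are invented; this
# is not the iAMSTEM analysis pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students = 200  # placeholder cohort size

# Percent-favorable (expert-like) score per student, pre and post.
pre = rng.normal(47, 15, n_students).clip(0, 100)
post = (pre + rng.normal(2, 8, n_students)).clip(0, 100)

# Paired t-test on the within-student pre/post change.
t, p = stats.ttest_rel(post, pre)
print(f"mean pre = {pre.mean():.1f}%, mean post = {post.mean():.1f}%")
print(f"mean change = {(post - pre).mean():+.1f} points (t = {t:.2f}, p = {p:.3f})")
```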

CLASSROOM OBSERVATIONS

COPUS Application

The COPUS (Classroom Observation Protocol for Undergraduate STEM) instrument was designed to systematically observe and record what occurs in classrooms in real time.3 To facilitate the process, GORP, a browser-based mobile app, was created (Figure 5). Every 2 minutes the observer catalogues what is occurring in the classroom based on a range of teaching practices, from common ones such as lecturing to less common ones such as facilitation of group work.

The data generated by the classroom observations are saved on a server and mapped to create visualizations of the learning environment (Figure 6).
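To make that mapping concrete, here is a minimal sketch of how per-interval COPUS codes could be rendered as a Figure 6-style timeline. The interval data and code names below are invented for illustration; the poster does not describe GORP's actual storage format or plotting pipeline.

```python
# Minimal sketch: rendering 2-minute COPUS interval codes as a
# Figure 6-style timeline. Interval data are invented; GORP's real
# data format is not described in the poster.
import matplotlib.pyplot as plt

codes = ["Lecturing", "Writing", "Asking Questions"]
# One entry per 2-minute interval: the set of codes observed in it.
intervals = [
    {"Lecturing"},
    {"Lecturing", "Writing"},
    {"Lecturing"},
    {"Asking Questions"},
    {"Lecturing", "Writing"},
]

fig, ax = plt.subplots(figsize=(8, 2))
for row, code in enumerate(codes):
    # One horizontal bar per interval in which this code was marked.
    spans = [(2 * i, 2) for i, obs in enumerate(intervals) if code in obs]
    ax.broken_barh(spans, (row - 0.4, 0.8))
ax.set_yticks(range(len(codes)))
ax.set_yticklabels(codes)
ax.set_xlabel("Time (min)")
ax.set_title("Instructor Activity (illustrative)")
plt.show()
```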

Real Time Observations

[Figure 6 panels: plots of student and instructor activity vs. Time (min) for three classes. Class 1 student codes: Listening, Asking Questions, Student Question, Working in Groups, Other Group Activity; Class 1 instructor codes: Lecturing, Writing, Asking Questions, Answering Questions, Follow-up of Question. Class 2 student codes: Listening, Student Question; instructor codes: Lecturing, Writing. Class 3 student codes: Listening, Asking Questions, Student Question; instructor codes: Lecturing, Asking Questions.]

COPUS observations from three different general chemistry classes are shown in Figure 6. The activities of Classes 2 and 3 are fairly homogeneous, with lecturing as the main activity and a few questions from students. Class 1, on the other hand, is more mixed, including some lecture, facilitation of group work, and active exchange between students and the instructor. As we become more sophisticated with this tool, COPUS data will be correlated with learning gains to determine whether the effect of different instructional practices can be measured.


Figure 5: GORP app interface displaying the different instructor and student activities from COPUS.

Figure 6: COPUS data from different hour-long lectures. Student activity is shown in the left column and instructor activity in the right column.


LEARNING GAINS

First Quarter Normalized Learning Gains

[Figure 3: "Pre and Post Score Frequency" histogram; x-axis: Score (%), y-axis: Number of Students; series: Pre-Test, Post-Test.]

Comparisons of student learning from class to class can be made. Multiple-choice content exams were developed based on the ACS Examinations Institute ACCM for General Chemistry4 and aligned with UCD chemistry student learning outcomes for each quarter of general chemistry (2A, 2B, and 2C). Exams were implemented in Winter 2014 for Chem 2A and in Spring 2014 for Chem 2B and 2C. The distribution of pre/post scores for all 1472 students in Chem 2A is shown in Figure 3. The performance of students in each Chem 2A class is compared in Figures 1 and 2. Figure 2 shows the comparison after correcting for individual student characteristics (i.e., demographics and SAT scores) in order to reduce the number of confounding variables in the comparison. We can conclude that students in class A did not perform as well as those in the other classes. However, further analysis is needed to determine whether these differences are due to instruction or to the different times of day at which the classes were held.
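The normalized learning gain defined in the Figure 1 caption, (%post - %pre)/(100 - %pre), is straightforward to compute; the sketch below uses invented scores. The demographic correction behind Figure 2 would sit on top of this (for example, a regression of gain on class section plus student covariates), but the specific "robust model" is not detailed in the poster.

```python
# Sketch of the normalized learning gain from the Figure 1 caption:
# g = (%post - %pre) / (100 - %pre). The scores below are invented.
import numpy as np

pre = np.array([30.0, 55.0, 70.0, 40.0])   # pre-test percent scores
post = np.array([50.0, 75.0, 85.0, 45.0])  # post-test percent scores

# A student at 100% on the pre-test has no room to gain; guard
# against division by zero for that edge case.
gain = np.where(pre < 100, (post - pre) / (100 - pre), np.nan)

print(gain)              # first student: (50-30)/(100-30) = 0.286
print(np.nanmean(gain))  # class-level mean normalized gain
```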

Topic Analysis of Learning Gains

[Figure 4: "Percent Pre/Post Difference Per Topic" bar chart; x-axis: Categories Assessed (*p < 0.05, **p < 0.01), y-axis: Percent Difference (-10 to 40); series: Class A, Class D.]

Performance on particular topic areas can be compared from class to class. Student performance can also be broken down by topic area in order to inform future instructional interventions. Figure 4 compares student performance in Chem 2A sections A and D, which Figures 1 and 2 showed to have statistically different overall learning gains. Students in section D performed significantly better on only half of the topics. Although section D shows overall increases in performance, it also shows a greater variance (as measured by standard error), which implies that the effect of instruction is less homogeneous for D than for A. Finally, it is interesting to note that the Bonding Concepts category showed essentially no improvement for either section, even though it is considered one of the most important topics taught in Chem 2A. The power of this assessment approach is that it can not only measure the variability between instructors but also establish whether the instructional program as a whole is meeting the standards that the faculty has set.
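As a sketch of the per-topic comparison behind Figure 4, the snippet below computes each section's mean pre/post difference with its standard error and runs a two-sample test on one topic. The per-student differences, sample sizes, and choice of Welch's t-test are assumptions made for illustration; the poster does not state which test produced the starred p-values.

```python
# Sketch of a per-topic Class A vs. Class D comparison, as in Figure 4.
# Per-student pre/post differences on one topic are invented here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
diff_a = rng.normal(5, 20, 350)   # Class A pre/post differences (%)
diff_d = rng.normal(15, 25, 330)  # Class D pre/post differences (%)

for name, d in (("Class A", diff_a), ("Class D", diff_d)):
    se = d.std(ddof=1) / np.sqrt(len(d))  # standard error of the mean
    print(f"{name}: mean difference = {d.mean():.1f}% (SE {se:.1f})")

# Welch's t-test for a between-section difference on this topic.
t, p = stats.ttest_ind(diff_a, diff_d, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```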

Figure 1: The performance of four first-quarter general chemistry classes that took a pre/post chemistry content exam. Student performance is reported as normalized learning gains, (%post - %pre)/(100 - %pre). Class A was found to be statistically different from the other three classes.

Figure 2: Learning gains from the same four classes using a robust model to correct for differences in student demographics.

Figure 3: Histogram showing the percent score of the entire first quarter general chemistry class on the pre/post content exam (n=1472).

Figure 4: A comparison of student performance for Chem 2A sections A and D broken down into 10 topics. Topics with statistically significant differences of p < 0.01 are highlighted by red boxes.

FUTURE PLANS

This first year of implementation has given us valuable insight into how students' performance in courses changes from quarter to quarter and from instructor to instructor. Not only have we mapped out a baseline of how students perform under current instructional practices, we have also begun to pinpoint areas for program improvement. In the coming years our system for implementing assessments will improve, and we will begin facilitating and reporting the effects of instructional innovations at UC Davis. Future projects include:

• Flipped Classroom: structuring classes to employ active learning strategies and using educational technologies to prepare students to succeed in flipped instructional periods.

• Teaching Assistant Training: formalizing a system for training graduate student assistants to complement ongoing instructor innovations.

• Expanding Scope of Assessment: implementing assessment strategies to measure student learning in general chemistry laboratories and organic chemistry courses.

1. Bretz, S.L. Trajectories of Chemistry Education Innovation and Reform. 2013; Chapter 10, pp 145–153.
2. Adams, W.K.; Wieman, C.E.; Perkins, K.K.; Barbera, J. J. Chem. Educ. 2008, 85 (10), p 1435.
3. Smith, M.K.; Jones, F.H.M.; Gilbert, S.L.; Wieman, C.E. CBE Life Sci. Educ. 2013, 12 (4), pp 618–627.
4. Holme, T.; Murphy, K. J. Chem. Educ. 2012, 89 (6), pp 721–723.

Assessment Tool | Courses Assessed | Number of students/observations
Content Exam    | 2A, 2B, and 2C   | 4118*
CLASS-Chem      | 2B and 2C        | 1181*
COPUS           | 2B and 2C        | 22

INTRODUCTION

Establishing a Scholarship of Teaching and Learning Community at UC Davis

Ongoing instructional innovation in a large research university is not easy. The logistics of large classes, the enormous demands on faculty and administrators' time, and the specialized knowledge necessary to carry out meaningful and long-lasting changes in curriculum often seem like insurmountable barriers. To decrease the activation energy for ongoing innovation, UC Davis has spearheaded a partnership between STEM faculty and a group of discipline-based education practitioners making up the UCD iAMSTEM Hub.

The goal of the partnership is to nurture scholarship in teaching and learning in which faculty can explore methods of improving their teaching practice while the Hub facilitates the logistics of running the educational experiments, developing assessments, analyzing the statistical data, and making recommendations for ongoing improvements. Faculty members will finally be able to innovate in their classrooms without first becoming experts in educational theory and assessment.

Table 1: Courses in which each assessment tool has been administered. *Indicates the number of students who took both the pre- and post-test.

Construct                      | CHE 2B (n=614)          | CHE 2C (n=425)          | Time × Group Interaction
                               | Pre / Post / Change (%) | Pre / Post / Change (%) | p-Value
Overall                        | 47 / 48 / 1*            | 49 / 51 / 2***          | 0.156
Personal Interest              | 47 / 47 / 0             | 49 / 51 / 1             | 0.421
Real World Connections         | 51 / 50 / 0             | 53 / 54 / 1             | 0.469
Problem Solving                | 54 / 54 / 1             | 54 / 58 / 3**           | 0.040*
Problem Solving Confidence     | 62 / 63 / 2             | 60 / 66 / 6***          | 0.016*
Problem Solving Sophistication | 33 / 35 / 2             | 35 / 39 / 4**           | 0.154
Effort                         | 58 / 57 / 0             | 59 / 59 / 0             | 0.921
Conceptual Connections         | 40 / 42 / 2             | 40 / 45 / 5***          | 0.042*
Conceptual Learning            | 30 / 34 / 4***          | 31 / 36 / 5***          | 0.423
Atomic Molecular Perspective   | 45 / 47 / 2             | 50 / 56 / 6***          | 0.009**

* p < .05  ** p < .01  *** p < .001

Table 2: Results of the CLASS-Chem assessment in Spring Quarter 2014. Pre/post changes that are statistically significant within each class are starred. Rows highlighted in green indicate categories whose changes differ significantly between Chem 2B and 2C.

Foundations for Instructional Innovation

Establishing a system for conveniently and holistically assessing student performance across courses and instructors was the first goal of the partnership between the UCD Chemistry department and the UCD iAMSTEM Hub. Without this system in place, it would be difficult to measure the effects of instructional innovation.

We took advantage of the decades of work that education researchers and the ACS Examinations Institute have contributed to developing valid and reliable instruments, which not only assess essential teaching concerns such as student understanding of concepts and ability to solve problems but also assess complementary concerns such as student beliefs, approaches to learning chemistry, and other affective dimensions of learning.1 Described in this poster are initial findings from one year of implementing five assessment tools: three General Chemistry Content Exams (one for each quarter), CLASS-Chem for measuring affective dimensions of learning,2 and COPUS for cataloguing real-time instructional practices in the classroom.3


Acknowledgements: We would like to thank all of the instructors and faculty in the UC Davis Chemistry department, the staff at the iAMSTEM Hub, the Association of American Universities, and all others who helped us develop and administer the various tools used in the general chemistry classes.