
ANNUAL REPORT, IFD, 2010-2011

Author: Heather McGovern, Director of the Institute for Faculty Development

GOALS & SUMMARY OF MAJOR RESPONSIBILITIES/SUPPORT/ACTIVITIES

The primary goals of the IFD are to “support effective pedagogy and productive scholarship for all faculty members.” Assessment of student learning is inexorably linked to effective pedagogy; thus, the IFD also supports faculty development related to assessment and assists with program and college assessment, especially related to student learning. These goals align strongly with the college learning theme. In addition, some support of faculty/student learning aligns with the college engagement theme.

The IFD is supported with a non-salary budget for fiscal year 2010 of $22,680, one full time Senior Clerk Typist, and one faculty with full course release on alternate assignment from teaching to direct the Institute. The faculty director also receives the equivalent of two summer course stipends in compensation for summer work. With these resources, the IFD has supported the following goals in the last year.

GOAL: SUPPORT FACULTY DEVELOPMENT

o Support pedagogy
  - Help faculty interpret their student evaluations, reflect upon their teaching, and revise it
  - Observe faculty teaching
  - Maintain a library with up-to-date publications on pedagogy
  - Share recent publications or advice on teaching
  - Host an annual guest speaker on pedagogical or assessment topics
  - Administer mid-term teaching evaluations, summarize the data, and consult with faculty about the results
  - Support teaching in other ways: e.g., photograph students, videotape teaching, provide access to a laptop and personal response clickers
  - Consult with faculty about pedagogy
  - Plan most of the weekly fall workshops, targeting but not limited to new faculty, to focus on pedagogical issues

o Support faculty as they proceed through the personnel process
  - Provide a two-day summer orientation and weekly fall workshops for new faculty
  - Help faculty develop and revise personnel files
  - Provide guidance about conducting peer observations of teaching
  - Help hold a short adjunct orientation/information workshop near the start of each term
  - Hold a January workshop helping first-year faculty prepare to write their first-year files

o Support faculty scholarship
  - Collaborate with the Grants Office to create a new database of potential mentors/trainers/statisticians/editors for faculty working on research projects
  - Plan some weekly fall workshops to focus on scholarship issues
  - Mentor faculty at all stages about scholarship plans
  - Maintain resources in the IFD library to assist faculty in writing journal articles and with other scholarship plans

GOAL: SUPPORT PROGRAM AND COLLEGE ASSESSMENT

o Help disseminate assessment results across campus and foster college-wide discussion of assessment
  - Schedule and lead meetings of the Assessment Committee
  - Publish Evidence

o Help programs develop assessment plans
  - Consult with programs and coordinators
  - Lead the annual intensive summer Assessment Institute

o Serve as a Student Evaluation Liaison
  - Communicate administrative and other deadlines to faculty and staff
  - Assist in determining when faculty/staff have legitimate exceptions to established policies and procedures
  - Assist in troubleshooting technological and other administrative problems
  - Prepare orders for IDEA Group Summary Reports
  - Conduct internal research using local IDEA results (e.g., analysis of differences in response rate and type at Stockton when using different forms (online or paper) for different kinds of classes (traditional, hybrid, and online))

o Coordinate CLA testing
  - File an Institutional Testing Plan with CLA
  - Recruit CLA participants (faculty/students)
  - Schedule lab space and set up testing sessions online
  - Write and enter local questions
  - Proctor tests
  - Collaborate with IR to send data to CLA
  - Help interpret and disseminate CLA results
  - Collaborate with IR to do local analysis of results

o Provide support for program assessment and for faculty research into areas related to pedagogy and assessment
  - Maintain a Zoomerang account and other research software (NVIVO, SPSS)
  - Create surveys (e.g., for First Year Seminars, dPT, and EDUC)
  - Help analyze survey data
  - Enter program assessment data for CRIM, MATH, BASK, QUAD, Writing, and other programs into Excel or SPSS
  - Conduct basic statistical analysis of program assessment data
  - Interpret and discuss assessment data with programs
  - Purchase, proctor, and help interpret results of standardized tests (COMM)

SUPPORT FOR FACULTY DEVELOPMENT

SUPPORT FOR PEDAGOGY

TEACHING OBSERVATIONS

From July 1, 2010 to June 30, 2011, the Director of the IFD observed thirteen faculty members teaching; each observation involved taking extensive notes during a full class meeting, meeting with the faculty member, and writing both an informal descriptive report and a formal evaluative one. Observations included adjunct and tenure-track faculty and a full professor. These thorough observations each take the Director about one full working day, as the IFD strives to provide both formative feedback and a highly detailed glimpse into the classroom for file readers. The majority of class observations (ten) last year occurred in the fall, probably because of spring file deadlines. The number of teaching observations the Director completes has increased sharply over the past three years as teaching observations have become required or strongly recommended for all faculty files.

Over the past year, the IFD has also created space on the IFD website to guide tenured faculty when they conduct observations. The guidance includes a summary of the college's requirements, advice for the observed and observers, a sample evaluation, and relevant links. This information responds to feedback from a survey done for Middle States this year indicating that faculty would like more guidance about teaching observations: eighteen of 128 respondents (14%) wrote in comments suggesting a rubric or other guidance, and had we asked about it directly, more would likely have indicated a desire for guidance.

MIDTERM TEACHING EVALUATIONS

The IFD sends an email around midterm each fall and spring semester encouraging faculty to conduct midterm evaluations. Prior to last year, the IFD primarily encouraged new faculty or faculty in consultations to conduct them, but starting in Spring 2010 the IFD began to publicly encourage all faculty to do so. The IFD provides a form that faculty may use, adapted from a form Sonia Gonsalves had been using: her seven-point scale was changed to a five-point scale that aligns better with IDEA results, and the document design was updated. Faculty may use that form, modify it, or use their own. More faculty now appear to be using midterm evaluations (many have reported doing so, and many have dropped off evaluations for the IFD to process), but the exact number of faculty conducting them, or doing so since the public urging, is unknown because faculty may administer them privately. The IFD office will administer evaluations for faculty on request, but most faculty choose to administer them on their own. Many faculty bring the raw forms to the IFD office, which compiles quantitative results and qualitative comments into a one- to two-page summary. This spring the IFD experimented with helping faculty offer online midterm evaluations using Zoomerang, but unless professors gave students in-class time to complete them, response rates seemed very low. This is unfortunate, as moving midterm evaluations online would save paper and photocopying resources and give faculty immediate results. It would also relieve the IFD office of producing summaries, as Zoomerang does this automatically.
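As an illustration of this compilation step, a minimal sketch in Python follows; the items, scale values, and comments are invented placeholders rather than the actual form or data:

    # Sketch of compiling midterm evaluations: tally five-point-scale
    # responses per item and gather free-text comments. All data below
    # are invented placeholders, not actual student responses.
    from collections import Counter

    responses = [
        {"Q1": 5, "Q2": 4, "comment": "More examples, please."},
        {"Q1": 4, "Q2": 4, "comment": ""},
        {"Q1": 5, "Q2": 3, "comment": "Love the group work."},
    ]

    for item in ("Q1", "Q2"):
        ratings = [r[item] for r in responses]
        mean = sum(ratings) / len(ratings)
        print(f"{item}: mean {mean:.2f}, distribution {dict(Counter(ratings))}")

    # Qualitative comments are simply collected for the written summary.
    comments = [r["comment"] for r in responses if r["comment"]]
    print("Comments:", comments)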

Following midterm evaluations, many faculty schedule appointments to discuss mid-course changes.

TEACHING CONSULTATIONS

The Director of the IFD meets with faculty in teaching consultations. Some of these take the form of helping faculty interpret student evaluations: in groups at the annual New Faculty Orientation, at the twice-a-year adjunct workshops, in a session for new faculty in the fall weekly workshops (to which other faculty and staff are invited), and through one-on-one consultations. In her first year in the position, the Director attended workshops, watched videos, and read the IDEA website extensively to become an expert interpreter of IDEA results. In the past year, she met with about fifteen faculty in individual, formal meetings about interpreting their IDEA reports (meetings that usually also turn to improving their teaching or building on their already abundant pedagogical strengths). In addition, the Director has had numerous informal conversations with Deans, faculty, and staff about how to interpret IDEA results.

Other teaching consultations focus on teaching more generally—they might involve student evaluations or they might involve faculty discussing recent changes they’ve made to a course, changes they are considering for a future course, how to deal with students who appear to be resistant due to a teacher’s gender, race, religion, or nationality, and more. The Director of the IFD had more than 25 of these meetings in the past year, not counting those directly related to a teaching observation or to interpreting student evaluations.

In some instances, meetings focus on helping faculty communicate about their teaching, but these sessions also often involve subtly urging faculty to recognize their pedagogical strengths and work on their weaknesses. Last year the Director of the IFD read drafts of, or otherwise consulted on the writing of, more than 30 faculty files (some more than once). Many of these meetings become a mix of how to interpret your evaluations, write a teaching philosophy statement, connect to college and program standards, and document teaching excellence. The majority of this work is done in the spring and summer.

SHARING NEW PEDAGOGICAL INFORMATION WITH FACULTY

The IFD also notes and shares information related to pedagogy. One way it does this is by hosting an annual guest speaker. This year's speaker, Anya Kamenetz, spoke about changes in higher education and her new book, DIY U. She was co-sponsored by the Provost's Office and Student Affairs, both for practical budget purposes (her speaker's fee was about four times what the IFD usually covers) and because she had broader appeal to students and the college community. She gave a traditional speech and also interacted with staff and faculty at a lunch and a smaller Q&A session (both hosted by the Provost's Office). These smaller events inspired many participating staff and faculty to read her books, or at least portions of them, and to have productive discussions of her ideas about the present and future of higher education. All sessions were well attended, and a handful of faculty have indicated concrete changes they've made because of her ideas. Next year's guest speaker, Rich Shavelson, has been scheduled for October 2011.

The IFD website also links to articles from the Chronicle of Higher Education and other sources on issues related to pedagogy, and the Director sends targeted emails to faculty or, more rarely, to the whole college community about such issues. The Director follows the Chronicle and other blogs, websites, and journals on pedagogy, teaching, and higher education in order to stay up-to-date.

In addition, the IFD maintains a library with publications on issues related to pedagogy. Last year, the IFD purchased a book on teaching (McKeachie’s Teaching Tips, 13th edition) for all new faculty members and individual copies of books on pedagogy (especially writing assessment and electronic portfolios, but also critical thinking and other issues) for the IFD library. For Fall 2011, the IFD has purchased copies of Academically Adrift for all new faculty.

PEDAGOGICAL WORKSHOPS

Many portions of faculty development at New Faculty Orientation focus on teaching, including sessions on the use of student evaluations and teaching observations at Stockton and a review of recent college-wide assessment results and demographic data about Stockton students, which help new faculty know whom they will teach and understand the larger pedagogical context in which they will teach. Presentations from Computer Services, the Registrar, and Academic Advising prepare new faculty for posting grades, understanding the curriculum, precepting, and using technology in their teaching. A presentation from the Dean of General Studies helps faculty understand the importance of General Studies in the Stockton curriculum and prepares them to participate in it.

In addition, of the fourteen new faculty workshops in fall 2010, twelve focused primarily on teaching: everything from a pep talk from Bill Daly on teaching, to helping students develop writing, quantitative, and information literacy skills, to dealing with disruptive students in the classroom, to advising, to developing General Studies courses. See last year's workshops. Participants assess each workshop with a survey, and the workshop schedule is adjusted each year based on their feedback. For instance, feedback resulted in the Grants Office session, always one of the highest-rated sessions, being moved earlier last year. Feedback on a session two years ago focusing on Student Affairs resulted in that session being canceled and replaced with one by Computer Services this year; feedback on that session is resulting in it being redirected to faculty at large and replaced with full sessions on the IRB and on technology tricks for the classroom, both popular but too brief when sharing a time slot last year.


SUPPORT FOR FACULTY AS THEY PROCEED THROUGH THE PERSONNEL PROCESS

Support for faculty as they proceed through the personnel process begins before they arrive on campus, with work on mentoring, and includes new faculty orientation; workshops for new faculty and for adjuncts; reading and commenting on drafts of faculty first-year, second-year, third-year, tenure, and promotion files and plans; and consulting with faculty about strategies for preparing for promotion to associate or full professor.

MENTORING

The IFD works with the Schools and the Provost's Office to gather a list of mentors assigned to new faculty and invites the faculty and their mentors to a lunch during New Faculty Orientation. With the invitation to mentors last year, the IFD included a letter thanking mentors and reminding them of helpful things they might do for their mentees. In addition, for the first time the IFD gave mentors a copy of a book, Faculty Success through Mentoring, as a thank-you token and a guide.

NEW FACULTY ORIENTATION

The IFD takes primary responsibility for New Faculty Orientation (NFO), which is hosted in collaboration with the Provost's Office. The IFD plans the agenda, contacts and confirms people on the agenda, plans the menu, coordinates room reservations, prepares for several significant professional development sessions led by the IFD Director, and invites faculty to NFO. See the agendas for last summer's and this summer's NFO in Appendix A and a list of last year's participants in Appendix B.

At new faculty orientation, many activities are designed to assist faculty in moving through the personnel process. They meet their Deans, Assistant Deans, mentors, and other important people at Stockton. They learn about the importance of their contributions to General Studies. In addition, they get an introduction to the Grants Office and a workshop on the use of teaching evaluations and observations, including a review of what they will be required to put into their files.

NEW FACULTY WORKSHOPS

The majority of the weekly new faculty workshops occur in the fall term, when new faculty have a course release and are scheduled around a common meeting time. The Director of the IFD plans the agenda for the workshops, schedules speakers, reserves rooms, attends and assists with discussion, assesses speakers with surveys, takes attendance, addresses issues, and more. Last year, the IFD experimented with supporting new faculty through Blackboard (a failed experiment: with a handful of exceptions, new faculty did not complete weekly pedagogical reading assignments or participate in Blackboard discussions of those readings or other new-faculty issues). The previous year, the Director of the IFD coordinated email conversation instead, and next year this method will be used again. Most workshops have a strongly practical, and typically pedagogical, focus. However, they serve multiple purposes, and two strong secondary purposes of most of the workshops are to introduce new faculty to a) Stockton culture and b) "movers and shakers" at Stockton. Bill Daly sets the stage by leading the first workshop and communicating both practical advice and one of the stronger philosophies of teaching popular among Stockton faculty. At least two sessions deal with academic advising (and faculty typically meet all Academic Advising staff), and new faculty are encouraged to read Marc Lowenstein's article on advising, which communicates the Stockton philosophy of advising as dealing with big-picture issues and as a form of teaching. These kinds of sessions, as well as one on creating your own G course, led last year by Rodger Jackson, communicate Stockton philosophy as well as practical knowledge and send strong signals about what faculty and administrators may value in the personnel process. Last year, typically around ten new faculty attended. It would have been easier for faculty at large to attend except that, through mid-October, our meeting location was not predictable due to room-scheduling problems.

RESOURCES FOR ADJUNCTS

Starting in Fall 2010, the College began again to provide an adjunct workshop at the start of each semester. This workshop, organized by the IFD, HR, and the Provost's Office, includes a meet-and-greet with the Provost, a primer on student evaluations, discussion with union representatives on the benefits of union membership, and a mini-workshop on technology and teaching led by the Director of Computer Services. In the 2010-11 academic year, 87 adjuncts attended the workshops. These workshops give adjuncts technological skills for exploring different pedagogical methods in the classroom. They also provide a forum for adjuncts to feel welcome and appreciated at Stockton. See an agenda in Appendix C.

HELP WITH FILE WRITING

Last year, the Director of the IFD read drafts of and/or met with faculty to assist with the writing of files more than 30 times. A large number of these were first-year files or tenure files, but some were second-year files or promotion files. Reading and commenting on these drafts consumes the bulk of the Director's time in January and February, preceding spring file deadlines. In January, before first-year files are due, the IFD hosts a workshop for first-year faculty to assist with writing the first-year file.

SUPPORT FOR FACULTY SCHOLARSHIP

The IFD supports faculty scholarship, though more minimally, as most faculty find discipline-specific support more useful in this area. Nonetheless, in the last year the IFD continued past efforts to support faculty scholarship by including sessions with the Grants Office both in New Faculty Orientation and in the weekly fall workshops, and by including a session on the IRB in the fall workshops. In addition, last year the IFD collaborated with the Grants Office to create a new database of potential mentors/trainers/statisticians/editors for faculty working on research projects, so that both offices can better match faculty who have needs (How do I use NVIVO? Construct a survey? Make a poster presentation?) with faculty willing and able to assist them. The IFD has already used that small database to match faculty looking for help with NVIVO and SPSS. The IFD also purchased several books on scholarship for the IFD library that proved popular with last year's new faculty, including Writing Your Journal Article in 12 Weeks. The Director of the IFD also consults with faculty individually on issues related to scholarship or service, which accounted for about five consultations last year.

Because of her dual roles as Director of the IFD and as a GENS representative to the RPD committee, the Director of the IFD also read and commented on many drafts of RPD proposals.

SUPPORT PROGRAM AND COLLEGE ASSESSMENT

While important facets of the IFD are orienting and mentoring new faculty and helping all faculty hone their teaching and communicate clearly about it, an equally important aspect is helping plan, implement, and report on assessment of student learning at the college. We need to be involved in an ongoing cycle of planning, assessment, and revision of our curriculum and courses, not only to report to outside assessment bodies and stakeholders, but, more importantly, to ensure that we are doing our best to help our students learn.


HELP DISSEMINATE ASSESSMENT RESULTS ACROSS CAMPUS AND FOSTER COLLEGE-WIDE DISCUSSION OF ASSESSMENT

INTERPRET AND DISSEMINATE RESULTS OF NSSE, CLA, AND OTHER SURVEYS AND TESTS WITH FACULTY

The IFD collaborates with IR to help faculty interpret results of NSSE, the CLA, and other surveys and tests. One venue for this is the Assessment Committee, a group of faculty from across the Schools, which met four times last year to examine and discuss CLA results, NSSE results (particularly benchmark reporting looking at trends in Stockton's NSSE scores over time), the local results of a national faculty survey, and more. In addition, the IFD publishes the internal newsletter Evidence, through which the Director of the IFD and others can share summaries of student performance on the CLA or NSSE (last year's fall issue included articles about the most recent CLA results and the IFD's comparison of transfer and native student performance on the CLA). It is also a place for programs and individuals to share their assessment projects: the spring issue of Evidence summarized the results of the General Studies pilot assessment, and the fall issue included articles by faculty about assessment projects. The Assessment Committee and Evidence are intended to help keep faculty at large informed about assessment efforts across the college and in other programs. This spring's Evidence was published for the first time as an e-pub in order to a) align more fully with the college sustainability theme, b) help compensate for the budget cut the IFD experienced last year and increased print-shop costs, and c) be more in line with increased ownership of e-readers and tablets.

HELP PROGRAMS DEVELOP ASSESSMENT PLANS

CONSULT WITH PROGRAMS AND COORDINATORS

Since July 1, 2010, the Director of the IFD has had about eleven meetings with programs or coordinators about assessment plans (not counting work done with participants in last year's or this year's Assessment Institute). Most of this work involves asking questions and listening in order to identify issues or areas of interest where programs might first focus their assessment efforts, or suggesting changes to current assessment efforts that programs find unsatisfactory. See Appendix D for a sample email communication about program assessment. The IFD also provides reading material (online, from the IFD library, or purchased from the IFD budget) to assist programs, and provides hands-on help developing learning outcomes, plans, surveys, etc. The IFD Director has also been involved in an advisory role with the assessment efforts of the General Studies Task Force as it planned assessment (now piloted) of some of the thirteen General Studies Outcomes as they play out in various G-categories. In addition, the IFD is now working with Claudine Keenan to complete a database summarizing and evaluating the status of assessment in programs across campus. Last year's new faculty and participants in last year's and this year's Assessment Institute all received copies of a book about assessment.

LEAD THE ANNUAL INTENSIVE SUMMER ASSESSMENT INSTITUTE

The Assessment Institute has been held annually for five years: three times with a general focus on helping faculty/programs with assessment plans and twice with a focus on developing CLA-like performance tasks for use both pedagogically and for assessment of student learning. For several years, faculty wrote about their experiences at the end of the Institute; feedback was mostly positive. There have been 70 participants (about 44 after correcting for repeat participants) in the Assessment Institute from 2007 to 2011. Past participants have published articles in Evidence: seven reports in the August 2007 issue, four in August 2008, two in November 2008, two in February 2009 (an issue that also summarizes evaluations of the summer 2008 Institute), and three in November 2009. Others have implemented their assessment plans but not yet written about them.


Assessment Institute 2011 participants:

1. Carol Rittner
2. Diane Holtzman
3. Evonne Kruger
4. Karen York
5. Levi Fox
6. Jessica Fleck
7. Mark Berg
8. Donnie Allison
9. Mary Lou Galatino
10. Joyce Welliver
11. Joan Perks
12. Chia Lin Wu
13. Judy Vogel
14. Bruce Hietbrink
15. Elizabeth Elmore
16. Luis Pena

This list includes participants from most schools in the college, including a staff member and an adjunct faculty member. Two accepted participants from CSIS withdrew (one had a family emergency, and they were to work as partners); one of the people above was added at the last minute as a substitute. Three proposals were rejected because their projects were a poor fit or poorly defined. Participants met on May 18 and 19 and June 1 and have completed substantial work on their summer projects (agenda in Appendix E; see the CFP and presentations online). For their work, most received $400.00 stipends and breakfast and lunch for the three meeting days.

COORDINATE CLA TESTING

The IFD collaborates with IR on CLA testing, which has been done annually since 2006; starting next year, it will be done every other year. The IFD takes the lead role in CLA testing, which means that the IFD Director and Senior Clerk Typist complete the following tasks:

- File an Institutional Testing Plan with CLA
- Recruit CLA participants (faculty/students)
- Schedule lab space and set up testing sessions online
- Write and enter local questions
- Proctor tests
- Collaborate with IR to send data to CLA
- Help interpret and disseminate CLA results
- Collaborate with IR to do local analysis of results

CLA results for last year (interpreted in the most positive light as average and, more realistically, as somewhat disappointing) were discussed by the IFD in Evidence last fall and at an Assessment Committee meeting. Claudine Keenan presented results to Deans' Council. New this year were an internal comparison of transfer and native students at Stockton (who scored, overall, similarly), internal analysis done by IR showing that a significant portion of (but not all) the difference in student performance can be explained by the time students spent taking the test, and analysis of results by program for some programs (e.g., PSYCH, HIST, BUSN; see a sample in Appendix G). In addition, the CLA will provide more extensive data when we get last year's results this coming fall. Also, the Director of the IFD this year added local questions for both first-year and senior students, and the IFD administered a simple tool to try to measure student engagement in the CLA test-taking itself. These may provide a richer sense of the context of CLA testing at Stockton and help us better interpret our results.


SERVE AS A STUDENT EVALUATION LIAISON

At the end of the term, most faculty administer student evaluations of teaching. When last year’s MOA for student evaluation of teaching was drafted, the union and administration desired to centralize more decision-making about student evaluation of teaching in response to problems and to feedback that different Schools were making different decisions. Also, the College was increasingly implementing changes to student evaluations (use of a homegrown instrument for evaluating classes with fewer than fifteen students by gathering qualitative data, increased use of online student evaluations) that meant that there would be many more questions. Therefore, the union and administration determined that the Director of the IFD and the Executive Assistant to the Provost would be in good positions to have a big-picture view of student evaluations and to make relatively neutral and hopefully fair decisions about what would be in the best interest of the college, the faculty and the students in individual cases. Specific tasks that fall to the Director of the IFD are listed below:

- Communicate administrative and other deadlines to faculty and staff
- Assist in determining when faculty/staff have legitimate exceptions to established policies and procedures
- Assist in troubleshooting technological and other administrative problems
- Order Group Summary Reports and help programs interpret them
- Conduct internal research using local IDEA results (e.g., analysis of differences in response rate and type at Stockton when using different forms (online or paper) for different kinds of classes (traditional, hybrid, and online))

The change last summer resulted in a more dramatic shift in the workload of the Director of the IFD than anyone anticipated. For example, the IFD Director has had more than 25 administrative meetings about student evaluations in the past year, compared to perhaps five or fewer the year before. In addition, weeks were spent composing email reminders to School staff and faculty about deadlines, helping diagnose or deal with the results of technical problems, and discussing individual requests for exceptions through email, on the phone, or in person (with the other IDEA Liaison, the faculty member, appropriate staff, and/or representatives of the administration and union). The IFD designed tables, calendars, and notices to distribute or post in School offices to help faculty and staff more easily translate the MOA policy language into practical action, and produced an updated copy of instructions for administering student evaluations. The IFD also became involved in helping identify faculty who had not entered data on time or had entered data incorrectly, and reaching out to them so that their evaluations could proceed. This list ends here, before it utterly bores readers; suffice it to say that the importance of student evaluations and the complexity and newness of the current evaluation system at Stockton, combined with the IFD Director's previous lack of experience in the administrative details, made this responsibility account for probably at least one-third of the Director's time in the past year. Changes in policy and the Director's growing familiarity with the process may decrease the time this responsibility requires next year.

In addition to assisting with the administration of student evaluations, a relatively new responsibility for the Director of the IFD (who historically was more peripherally involved, assisting faculty with interpretation, advising about objective choices, and consulting about CIP choice), the IFD has also done a significant amount of work analyzing student evaluation data in the past year. The college has been piloting online evaluations for traditional classes; this led Dennis Fotia and Heather McGovern to conduct a preliminary analysis of several years' worth of student evaluation data at Stockton. This analysis looked at the impact of online evaluations on scores and response rates in the face-to-face classes at Stockton where they were already being used (EDUC). Dennis and Heather presented this data at a national conference on assessment in October 2010 and locally at the Day of Scholarship in Spring 2011. Once IDEA results are in for Spring 2011, that data can be analyzed to prepare a report on the pilot. In the meantime, the Director of the IFD prepared a very preliminary report looking only at response rates of online pilot vs. online non-pilot courses, which indicated that response rates for courses participating in the pilot in Fall 2010 were healthy: well above the national average and close to previous figures for face-to-face classes at Stockton. Response rates for courses participating in the pilot in Spring 2011 are lower (Dennis Fotia reports a history of lower spring response rates), and we cannot yet compare the online response rates to those in face-to-face classes for the same terms.
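As an illustration of that response-rate comparison, a minimal sketch follows; the course records and pilot flags are invented placeholders, not the actual IDEA data, and the real analysis would draw on response and enrollment counts per section:

    # Sketch: compare response rates for pilot vs. non-pilot sections.
    # Each record is (in_pilot, responses, enrolled); all numbers are
    # invented placeholders.
    courses = [
        (True, 22, 25), (True, 18, 24), (True, 20, 28),     # pilot sections
        (False, 15, 26), (False, 12, 25), (False, 17, 30),  # non-pilot sections
    ]

    for label, flag in (("pilot", True), ("non-pilot", False)):
        responded = sum(r for p, r, n in courses if p == flag)
        enrolled = sum(n for p, r, n in courses if p == flag)
        print(f"{label}: {responded}/{enrolled} = {responded / enrolled:.0%}")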

Finally, the IFD orders and helps interpret group summaries for programs. Many programs ordered group summaries this year for use in their five-year program reviews. Ordering the summaries this year took a great deal of time for the Director of the Institute and staff in the IFD and Provost's Office, as changes in our use of IDEA mean that IDEA no longer has on record the course acronyms and numbers for classes using the paper IDEA form. Therefore, we had to identify each individual course in annual spreadsheets and compile a list of CRNs in order to order reports. One can imagine that this takes substantial time when ordering a report for, say, all first-year writing courses for two years or all 2000-level PSYCH classes for five years; a sketch of this kind of filtering appears below. In addition, the IFD, upon request, helps analyze the group summaries for programs (Appendix F).
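The filtering itself is mechanical once the spreadsheets are in hand; the following sketch shows the idea, though the column names ("Course", "CRN") and all records are hypothetical and the real files may be organized differently:

    # Sketch: collect CRNs for all 2000-level PSYC sections across several
    # annual spreadsheets in order to place a Group Summary Report order.
    # Column names and all records below are hypothetical placeholders.
    import pandas as pd

    annual_sheets = {
        2009: pd.DataFrame({"Course": ["PSYC 2241", "HIST 1100", "PSYC 2210"],
                            "CRN": [80123, 80456, 80789]}),
        2010: pd.DataFrame({"Course": ["PSYC 2243", "BIOL 2110"],
                            "CRN": [90123, 90456]}),
    }

    crns = []
    for year, df in annual_sheets.items():
        mask = df["Course"].str.match(r"PSYC 2\d{3}")  # 2000-level PSYC only
        crns.extend(df.loc[mask, "CRN"].tolist())

    print(len(crns), "CRNs to include in the order:", crns)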

PROVIDE SUPPORT FOR PROGRAM ASSESSMENT AND FOR FACULTY RESEARCH INTO AREAS RELATED TO PEDAGOGY AND ASSESSMENT

The IFD also provides other support for assessment and teaching, including the following services:

- Maintain a Zoomerang account and other research software (NVIVO, SPSS)
- Create surveys (e.g., for First Year Seminars, dPT, and EDUC)
- Help analyze survey data
- Enter program assessment data for CRIM, MATH, BASK, QUAD, Writing, and other programs into Excel or SPSS
- Conduct basic statistical analysis of program assessment data
- Interpret and discuss assessment data with programs
- Purchase, proctor, and help interpret results of standardized tests (COMM)

The Director of the IFD also serves on multiple committees due to the position, including this past year helping write two Middle States sub-committee reports and serving on a First Year committee chaired by Tom Grites.

BUDGET

The budget of the IFD supports its activities, with the bulk of money being spent on conference travel and participation, a guest speaker, and the Assessment Institute.

ITEM                                                              COST
Assessment Institute, faculty stipends                           6,350
Guest speaker honorarium                                         6,300
iCritical exam, COMM                                             2,250
Assessment Institute conference, travel and hotel
  for Heather McGovern and Dennis Fotia                          1,644
Copier rental                                                    1,620
Printing Evidence at print shop                                  1,035
Assessment Institute, catering                                     870
New faculty orientation, catering                                  766
Books (teaching and assessment)                                    630
Office supplies (pens, post-its, tabbed dividers,
  nametags, dry erase markers, copier toner, printer
  cartridges, copy paper, stationery, envelopes, etc.)
  for IFD use, workshops, and NFO                                  585
Zoomerang subscription                                             350
Welcome gifts for new faculty for Fall 2011                        280
TOTAL                                                           22,680

Next year the IFD will save money on guest speaker costs, as our speaker will probably cost under $3,000, and the IFD will probably not have to purchase iCritical. Heather McGovern will again travel to the Assessment Institute in Indiana to present a poster, and the IFD will probably provide some registration or travel assistance to other faculty presenting at that conference. Evidence can likely continue as an e-pub. The Zoomerang, office supply, NFO, and Assessment Institute costs should remain relatively stable. This will free some money to support other faculty/staff travel to assessment-related conferences or workshops, other relevant conference fees/travel for the Director of the IFD, or other uses.

FUTURE PLANS

Goals for next year include maintaining work done this year and the following:

Continue to enhance the IFD website with links to pedagogical, assessment, and faculty development reading and resources

Work with IR to determine what should happen with the now mostly historically useful assessment information compiled on the IFD website

Start (spring 2012) a new series of workshops to reinvigorate and inspire recently tenured faculty

Focus more time and energy on working with programs on assessment and on Evidence

Work on developing new prejudice awareness programming

Work on developing new retirement-preparedness programming (with SCOSA)

Continue to build a scholarly support database (with Grants Office)

ASSESSMENT SUMMARIES

A significant challenge for the college has been how to summarize program assessment results. In 2009-2010, the IFD tried to improve the past practice (programs reporting to the IFD for a summary on the IFD website) by suggesting that instead of programs reporting to both their Dean and the IFD, they report to the Dean and the Dean share the information with the IFD. This plan had the unfortunate outcome of schools responding differently: some (EDUC, GENS) created their own assessment summaries posted on their own websites, to which the IFD website now links. Others (HLTH, BSNS, SOBL) provided the IFD with an updated summary of assessment efforts. ARHU and NAMS did not communicate new assessment data to the IFD that year. The IFD then took a backseat in assessment information gathering in Fall 2010, given that multiple Middle States subcommittees were collecting this data for Middle States reporting; the IFD did not want to pester the same people for information and hoped that people would be more forthcoming for Middle States reporting than for internal purposes. However, reading through the subcommittee reports indicated to the IFD Director that reporting by programs was still uneven, sometimes not up-to-date, and sometimes incorrect. Since then, the IFD and IR have been collaborating on a difficult-to-build database summarizing and evaluating assessment efforts by program that is about two-thirds complete as of now. Data from Coordinators' Reports using the new template that asks for assessment data should assist in completing it. Conversations with IR and others should help establish whether the IFD website will continue to host a summary of assessment information or whether that will be hosted elsewhere; any decision made now will need to be revisited once Sedona and other technologies are brought into play to help compile assessment data.

NEW WORKSHOP SERIES

A survey of faculty who next year will be in their 5th or 6th year, along with a meeting with Deans' Council, has informed the goals for a new workshop series for non-first-year faculty that will be piloted in spring 2012. The survey yielded promising results: most respondents (fourteen) are interested in a series of professional development workshops, and most indicate that they are likely to attend (see chart below).

Survey results also show that recently tenured faculty are most interested in sessions that address using a sabbatical and pedagogical issues (experiential learning and backwards course design). They have some interest in service learning and campus leadership and very little interest in issues of larger Stockton culture or higher education. See chart below.

Survey respondents were fairly evenly split about days/times to meet, so scheduling will be a challenge; it will be necessary to invite other faculty and to accept that a full cohort reunion will not be possible. The majority of respondents prefer either monthly or biweekly meetings. Given their interest in the proposed topics, next year's pilot will offer something like the following schedule, targeting fifth- and sixth-year faculty but open to all faculty. Most sessions will be held twice, on different days/modules, to maximize the likelihood that any individual can attend, with the two least popular topics each held once.

Making use of a sabbatical (panel discussion with faculty who’ve had productive sabbaticals recently, reading) (January)

Getting students really involved.  (Panel Discussion with faculty using experiential learning.) (February, x 2)

Radically rethinking a class? Try backwards course design! (Panel discussion with Stockton professors who've redesigned a course using this technique.) (March, x 2)

Interested in campus leadership? A panel discussion about ways to prepare, proceed, and succeed. (April, x 1)

At your service: How can you meaningfully include service learning in your class? (Session led by the new coordinator for service learning) (April, x 1)

PREJUDICE AND DISCRIMINATION AWARENESS ACTIVITIES

As part of separate initiatives, the Director of the IFD has been discussing with a number of faculty the need for the IFD to play a role in helping faculty deal with the very real discrimination that they face from their students, colleagues, and others. Given that newly tenured faculty showed little interest, these activities will be developed separately from the new workshop series. The IFD has gathered scholarship to distribute online to faculty for reading before or after a panel discussion by faculty and students about their experiences and about practical strategies for coping. These initiatives are needed to address the results of the Cultural Audit, which indicate that Stockton's culture needs continued growth in its ability to deal with diversity in a positive and welcoming manner.

PREPARING FOR RETIREMENT ACTIVITIES

As another separate initiative, but part of the same goal of expanding faculty development activities to serve more faculty beyond the first year, the IFD will collaborate with SCOSA on workshops to help faculty think about whether retirement is right for them now and to prepare socially, emotionally, and otherwise for retirement.

SUMMARY

The IFD hopes to continue to respond to college needs. The primary resource limitation the IFD faces is the time available for the Director to complete all desirable activities, and this may remain a limiting factor next year, although if less time is indeed needed for evaluation-of-teaching and Middle States tasks, more time may be available for other activities. The Provost has regularly indicated a willingness to provide more staff or student worker support, and the IFD could also pay for student support out of its budget. However, the Senior Clerk Typist already has the time and skills to complete the kinds of tasks that are easily given to students or other lower-level staff, while conducting teaching observations, analyzing assessment results, consulting with faculty, working on teaching evaluation issues, and other tasks are not easily handed over to students or clerk typists. Nonetheless, the IFD has been able to shuffle priorities (fewer issues of Evidence published, but more time spent on student evaluations; less time learning the position, but time spent on Middle States subcommittees) to accomplish the higher-priority tasks and meet the goals that the College has for the IFD.

APPENDIX A: NEW FACULTY ORIENTATION AGENDAS

NEW FACULTY ORIENTATION, 2010 AGENDA

MONDAY, AUGUST 23

Time  Room  Activity

8:30 am  F-207  Arrival and Refreshments (Photos in F-204)

8:45 am Welcome: Provost, Deans, Assistant Deans

9:00 am Faculty Development: Getting to Know One Another: You and Stockton

10:15 am Break

10:30 am Faculty Development: Evaluation of Teaching

12:00 pm Upper G-wing Mentor Luncheon

1:00 pm F-207 General Education: Dean Jan Colijn


2:00 pm F-206 Break, Meet with ABP Vendors, and HR catch-up

2:30 pm F-207 Faculty Development: Getting Real

4:00 pm TRLC Faculty Senate and Residential Life Reception

TUESDAY, AUGUST 24

Time  Room  Activity

8:30 am  F-114  Arrival and Refreshments

8:45 am Campus Safety: Glenn Miller

9:00 am Computer Services, Linda Feeney

9:30 am e-Classroom Basics, Bob Heinrich

10:00 am Break (Podium Keys & IDs outside F-114)

10:15 am F-111 Student Affairs: Assoc. VP Dee McNeely-Greene; Dean Pedro Santana

11:00 am Grants Office Overview: Beth Olsen

11:30 am Ethics and Affirmative Action Office: Nancy Hicks

12:00 pm Upper G-wing School Lunches with Provost, Deans, and Assistant Deans

1:00 pm F-118 Advising /Precepting: Peter Hagen

3:00 pm F-120 Coffee and Light Refreshments

3:15 pm Stockton Federation of Teachers: Tim Haresign

3:45 pm Registrar: Joe LoSasso

NEW FACULTY ORIENTATION, 2011 DRAFT AGENDA

MONDAY, AUGUST 22

Time  Room  Activity

8:30  CC MR5  Breakfast and Welcome, President, Provost, Deans, Assistant Deans

9:00 CC MR4 Faculty Development: Getting to Know One Another: You and Stockton

10:15 CC MR3 Break and photos in F-204

10:30 F115 Computer Services, Linda Feeney

11:15 F115 e-Classroom Basics, Bob Heinrich (podium keys)


12:00 CC Event Room School Lunches with Provost, Deans, and Assistant Deans

1:00 pm CC MR4 General Education: Dean Jan Colijn

2:00 pm CC MR? Break, Meet with ABP Vendors, and HR catch-up

2:30 CC MR4 Grants Office, Beth Olsen & Jillian Cawley

3:00 CC MR4 Faculty Development: Prepping for the semester (syllabi, dates to know)

4:00 Adjourn

4:30-6:30 Dr. and Mrs. Saatkamp host a reception for new faculty at the President’s home

TUESDAY, AUGUST 23

Time  Room  Activity

8:30  CC MR5  Breakfast and Campus Safety, Chief Glenn Miller

9:00 CC MR5 Student Affairs: Assoc. VP Dee McNeely-Greene; Dean Pedro Santana

9:45 CC MR5 Ethics and Affirmative Action: Nancy Hicks

10:15 Break

10:30 CCMR5 Faculty Development: Evaluation of Teaching

12:00 CC Event Room Mentor lunch with address from Provost

1:00 CC MR5 Advising /Precepting: Peter Hagen

2:00 CC MR5 Registrar: Joe LoSasso

2:30 Break

2:45 CC MR5 Faculty Senate: Mike Frank and SFT: Tim Haresign and Sue Burrows

3:30 Adjourn

APPENDIX B

In 2010, New Faculty Orientation served the following faculty and staff with teaching responsibilities:

1. Jibey Asthappan
2. Guy Barbato
3. Robert Barney
4. Larry Boni
5. Venustiano Borromeo
6. Douglas Deane
7. Jeremy Ervin
8. Susan Fahey
9. Maya Gibbons
10. Philip Hernandez
11. Jessie Jarvis
12. Bill Quain
13. Shelley Scarpino
14. John Theibault
15. Edward Walton
16. Dina Yankelewitz
17. Eliseo Valdez

APPENDIX C: ADJUNCT WORKSHOP AGENDA, FALL 2011

10:00 – 12:00: Electronic classroom session with Linda Feeney (room: TBD)

12:00 – 1:00: Lunch with Dr. Kesselman (campus center, room TBD)

1:00 – 1:30: Heather McGovern

1:30 – 2:00: Chad Parlett

2:00 – 2:30: Kathy Franzese

2:30 – 3:00: meet with Alternate Benefit Providers

APPENDIX D: SAMPLE PROGRAM ASSESSMENT COMMUNICATION

FIRST YEAR SEMINARS

As a sample of assistance to programs, see the data below. The IFD entered student responses on the pre- and posttests the first-year seminar faculty piloted in fall 2010; the Director then prepared the following brief, informal summary report:

Table 1: Percentage of students responding correctly on Academic Honesty Quiz, pre and post, in pilot in fall 2010

[Chart not reproduced: percent of students answering each of questions 1-10 correctly, pre and post, on a 0-100% scale.]

Analysis:

Overall, student performance improved from the pre to the post. However, on two items student performance got worse from pre to post. One of these items is question 3. There, I recommend repeating the wording from question 2 (the scenario) rather than simply referring to question 2, in case students forgot the details and either couldn't easily go back online or chose not to. It is also possible that students became more cautious after receiving more instruction.

The other is question 10. One might check with instructors to see how that information was covered. It is also the only real Stockton-specific procedural question, so it might be the hardest for students.

Note that because some questions were so easy for students, there was little room for upward movement in the posttest. This was true in particular for questions 4 (is using the friend's conclusion a violation), 5 (who can be charged with academic dishonesty if sharing a frat paper), 6 (is a student guilty of academic dishonesty if he or she bought a paper online), and 9 (what is the first step a student should take if accused of academic dishonesty), where nearly or more than 90% of students got the right answer in the pretest. Substituting different questions for those might be more informative. Ideas might include asking whether a photo found online can be used without citing, whether a student can use a paper written in one class for another, or, especially given students' low performance on question 10 in the posttest, additional questions about procedure or possible penalties at Stockton.

Of course, if the first-year seminar group thinks that this group of students may not have been representative, it could administer these questions to the whole group and establish a baseline. And, on most of these items, students' scores did go up after instruction, and these are things we care that students know. Nonetheless, it may not be a good use of four question slots to ask questions to which the vast majority of students know the answer in the pretest.
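A minimal sketch of this kind of item-level screening follows; the pre/post percentages are invented placeholders, not the actual pilot data:

    # Sketch: flag ceiling items (>= 90% correct on the pretest) and items
    # where performance dropped. All percentages are invented placeholders.
    pre  = {1: 55, 2: 60, 3: 70, 4: 92, 5: 95, 6: 90, 7: 50, 8: 65, 9: 96, 10: 80}
    post = {1: 70, 2: 75, 3: 62, 4: 94, 5: 97, 6: 93, 7: 68, 8: 80, 9: 97, 10: 71}

    for q in sorted(pre):
        change = post[q] - pre[q]
        flags = []
        if pre[q] >= 90:
            flags.append("ceiling: little room to improve")
        if change < 0:
            flags.append("performance dropped")
        print(f"Q{q}: pre {pre[q]}%, post {post[q]}%, change {change:+d}", flags)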

COMMUNICATION

Here is the text of two emails to the COMM coordinator, again providing informal analysis. The Director of the IFD also later met with the program to discuss these results.

EMAIL 1:


We have results now. Students saw their individual results yesterday as soon as they finished the exam. Of the 42 test takers, eight passed and 34 failed.

Scores ranged from 0 to 430. Eliminating the zero-score outlier (a student who spent only 15 minutes on the test), the low score was 120 and the mean score was 217; the median score was 210 and the standard deviation was 64. Excluding that same 15-minute test taker, the smallest amount of time spent was 21 minutes, the greatest was 61 minutes, and the average was 45 minutes. The correlation coefficient (r) for time spent and score was .37, so time spent (as one measure of student effort) explains only a small amount of the variation in scores. The company's "cut" score for pass/fail is 260.
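For reference, an r of .37 corresponds to an r-squared of about .14, so time spent accounts for roughly 14% of the variance in scores. A short Python sketch can reproduce these summary statistics; the scores and times below are placeholders, not the actual student records:

    # Sketch: descriptive statistics and the time/score correlation.
    # All values are invented placeholders. Requires Python 3.10+ for
    # statistics.correlation (Pearson r).
    import statistics

    scores  = [120, 180, 210, 260, 430, 217, 190]  # placeholder scores
    minutes = [21, 35, 45, 50, 61, 44, 40]         # placeholder times

    print("mean:", statistics.mean(scores))
    print("median:", statistics.median(scores))
    print("stdev:", statistics.stdev(scores))

    r = statistics.correlation(minutes, scores)
    print("r:", round(r, 2), "r^2:", round(r * r, 2))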

In addition, we have group results for 20 of the test takers (the others did not enter the exam group number or entered it incorrectly; the company claims we cannot put them into a group after the fact, but I will call and press on this issue next week).

Those scores are as follows. The results are reported separately for different versions of the test. Of the 12 test takers in the exam group results who took one version of the test, the mean score was 214; two passed and 10 failed.

The percentage they got correct in each area measured by the test is as follows:

ACCESS 73

COMMUNICATE 47

CREATE 58

DEFINE 63

EVALUATE 41

INTEGRATE 47

MANAGE 46

Of the eight in the exam group report who took the other version of the test, the mean score was 201, and the percentages they got correct in each area measured by the test are as follows:

ACCESS 38

COMMUNICATE 49

CREATE 60

DEFINE 69

EVALUATE 61

INTEGRATE 46

MANAGE 62

Overall, then, Communication students in the group exam (20 of the 42 tested) scored differently on the two versions of the test. In both cases, they scored in the 40%s for integrate, just under 50% correct on communicate skills, close to 60% on create, and in the 60%s for define. The highest scores were for access on the first version of the test, which is probably the only set of student scores that the Communication faculty will be satisfied with.

Right now I can't get skill set break downs for students who weren't in the exam group. I will see if we can get these from the company.

Next week I can also access information we have on how Stockton students did on this test (or a similar test) in the past--which was not so good, if that makes you feel better. Those weren't all communication students.

I'm also attaching the company's pdf with a quick definition of what the different skills are...I would imagine that evaluate, integrate, create, and communicate may be of particular interest to Communications as a program.

As I am bearing mostly bad news, let me note that your program deserves praise for being brave enough to collect data like this when pretty much all of the strong direct data we have on student skills (previous testing with iskills, CLA, and many program-created test results) are fairly depressing. If you would like me to help share this information with your program or meet with you to talk about it in more detail, I'm willing to do so. One thing that might prove useful is to create a curriculum map, a fairly simple process of mapping in which courses most students take they'd work on these skills. Faculty in the program also may have thoughts about whether these results match with their sense of student skills. And, of course, these more general skills do not speak to whether students can give a good speech or make a good video, although in many cases these skills are necessary but not sufficient for those kinds of tasks. We can also look at individual student's scores to see if those match faculty expectations or not.

I would also like to thank the faculty and student volunteers. The test is difficult: students could not simply answer multiple-choice questions but had to read directions and actually complete hands-on tasks on the computer, such as indicating in which order to put information or reading a mock webpage to find particular information.

EMAIL 2

I just came across a summary Sonia wrote of campus-wide results for the iSkills test, which is what later became iCritical. We gave it to 183 lowerclassmen and 359 upperclassmen between 2006 and 2008, and, compared to a national reference group, students scored about 80% in Define, 68% in Access, 63% in Evaluate, 69% in Manage, 70% in Integrate, 72% in Create, and 45% in Communicate. Here's a direct comparison to COMM:

                COMM    Stockton
Access           73%      68%
Communicate      49%      45%
Create           60%      72%
Define           69%      80%
Evaluate         61%      80%
Integrate        47%      70%
Manage           62%      69%


In the above comparison, I used the highest percentage from the two forms of the test for COMM students, not an average, so this paints COMM students in the most positive light possible (in case one form of the test has more bias against students in this major).
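To make that comparison rule concrete, here is a small illustrative sketch that applies it: for each skill, take the higher of the two form percentages for COMM and set it against the campus-wide figure. All values are transcribed from the form reports and the table above.

```python
# Illustrative sketch of the comparison rule described above: for each
# skill area, take the higher percentage from the two test forms (the
# "most positive light" for COMM), then compare to the campus baseline.
# All values are transcribed from the reports and table above.
form1 = {"Access": 73, "Communicate": 47, "Create": 58, "Define": 63,
         "Evaluate": 41, "Integrate": 47, "Manage": 46}
form2 = {"Access": 38, "Communicate": 49, "Create": 60, "Define": 69,
         "Evaluate": 61, "Integrate": 46, "Manage": 62}
stockton = {"Access": 68, "Communicate": 45, "Create": 72, "Define": 80,
            "Evaluate": 80, "Integrate": 70, "Manage": 69}

for skill, campus in stockton.items():
    comm_best = max(form1[skill], form2[skill])
    print(f"{skill:12} COMM {comm_best:3d}%  Stockton {campus:3d}%  "
          f"difference {comm_best - campus:+d}")
```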

You and the program can then see that in some categories (Access, Communicate, and Manage), COMM students (at their best) score similarly to Stockton students. That may, however, be bad news in terms of Communicate, where we'd probably expect COMM students to outscore a broader sampling of Stockton students. (Communicate is defined as tasks like the following: formatting a document to make it more useful to a particular group, transforming an email into a succinct presentation to meet an audience's needs, selecting and organizing slides for distinct presentations to different audiences, or designing a flyer to advertise to a distinct group of users.)

In the remaining areas, COMM students score worse than Stockton students: Create, Define, and Evaluate. The largest discrepancy is in Integrate, where COMM students scored markedly lower than Stockton students (47% vs. 70%).

Integrate is defined as "Interpret and represent information, such as by using digital tools to synthesize, summarize, compare, and contrast information from multiple sources"--for example, comparing materials from competing vendors by summarizing information in a table, entering results from a tournament into a spreadsheet, or summarizing information from different sources according to specific criteria in order to compare it and make a decision.

Define and Access are typical basic research skills (identifying a research question, asking questions to clarify an assignment, collecting and retrieving information using keywords or browsing). Evaluate is mainly evaluating sources. Manage covers sorting and categorizing emails or files. Create involves tasks like editing and formatting a document according to editorial specifications, creating a presentation slide, or creating a data display.

The skills most central to COMM appear, to me, likely to be Create and Communicate, with Integrate potentially the least COMM-specific skill.

APPENDIX E: ASSESSMENT INSTITUTE AGENDA, 2011

AGENDA, ASSESSMENT INSTITUTE SUMMER 2011

DAY ONE:

9:00-9:30: Breakfast and introductions. Write name on nametag.

9:30-11:30: Introduction to Assessment: Lecture, activity, and discussion

11:30-12:00, if time permits: Pick a partner or small group in which to work. Define your project in 1-3 sentences.

12:00-1:00: Lunch

1:00-2:30: Work on one of the following activities:

Drafting or revising learning objectives for the class/major/minor/etc. that you wish to assess.

Creating a curriculum map, connecting existing learning objectives to a student’s experiences in the class/major/minor/etc.

Selecting instrument(s) and/or method(s) for assessing student progress on learning objectives.

Starting or revising instruments/methods.

2:30-3:00: Write a question on your notecard/Q and A session

DAY TWO:

9:00-10:00: Breakfast; schedule Day Three; and, as needed, continue Q and A from Day One.

10:00-12:00: Work on one of the following activities:

Drafting or revising learning objectives for the class/major/minor/etc. that you wish to assess.

Creating a curriculum map, connecting existing learning objectives to a student’s experiences in the class/major/minor/etc.

Selecting instrument(s) and/or method(s) for assessing student progress on learning objectives.

Starting or revising instruments/methods.

12:00-1:00: Lunch

1:00-2:00: Q and A for projects, as needed

2:00-2:30: Overview of college-wide assessment knowledge/results

2:30-3:00: Make a plan for continuing your project between now and Day 3.

APPENDIX F: SAMPLE GROUP SUMMARY ANALYSIS

The IFD orders group summaries, and the Director sometimes provides additional analysis of them for programs. What follows is a sample informal analysis of the group summaries for WGSS, which looked at the introductory and theory (capstone) classes for the minor:

Specifically, we can see that 100% of the capstone classes emphasized objective 2, learning fundamental principles, generalizations, or theories; objective 8, developing skill in expressing myself orally or in writing; and objective 11, learning to analyze and critically evaluate ideas, arguments, and points of view. These seem appropriate for the goals of the class. Other goals were selected about 1/3 of the time--the N here is small, just three classes.

These objective choices can be compared to classes overall at Stockton: 71% of Stockton classes choose objective 2, but only 42% choose objective 8 and 44% choose objective 11. In contrast, 75% of Stockton classes choose objective 1 (learning facts) and 69% select objective 3 (applying course material)--but these are not so much the focus of Theories.

When we look at the distribution of scores for progress on relevant objectives, excellent teacher, and excellent course for the class, we see that the class fell once into the "similar" range and twice into the "higher" range for Progress on Relevant Objectives (PRO) and for excellent teacher, and once into "higher" and twice into "similar" for excellent course. This is also favorable--and remains favorable when compared to Stockton, as well.


Students report progress on the selected objectives for the class at a higher rate than other classes at Stockton or in the IDEA system when those objectives are selected, which is good, of course.

Students also report that the following pedagogical techniques are very frequently used:

Demonstrated the importance of the subject
Stimulated students to intellectual effort
Inspired students to set and achieve goals which really challenged them
Asked students to share ideas and experiences with others whose backgrounds and viewpoints differ from their own
Displayed a personal interest in students and their learning
Encouraged student-faculty interaction outside of class
Encouraged students to use multiple resources
Related course material to real life situations
Involved students in hands-on projects
Made it clear how each topic fit into the course
Explained course material clearly and concisely

The only pedagogical techniques IDEA considers relevant to the goals for the class that are not frequently used are:

Asked students to help each other understand ideas or concepts
Gave tests, projects, etc. that covered the most important points of the course

Even for these, however, the means were 4.0 and 4.1.

We also learn that students report working harder on this class than on most classes they have taken (4.5 mean, with 100% rating it over 4.0, compared to 36% at Stockton and 24% in the IDEA database). They are not excited about taking the course, with lower scores than are typical at Stockton or in the IDEA system. Students report that they do a lot of reading (100% rating 4.0 or above, compared to 20% at Stockton and 15% in IDEA with comparable ratings) and a lot of non-reading work (67% rating 4.0 or above, compared to 21% at Stockton and 18% in IDEA), and they find the material difficult (100% rating 4.0 or above, compared to 22% at Stockton and 18% in IDEA).

So, the class is challenging, but they learn a lot.

The class also improves students' attitude toward the subject (the mean score on a question about this is 4.4, compared to 4.1 at Stockton and 3.9 in the IDEA system). That is excellent news!

Regarding Perspectives, the introductory course, the news is more mixed.

No objective is chosen 100% of the time. The most frequently chosen objective is 3, learning to apply course material, selected 89% of the time. Developing communication skill is selected 67% of the time, and learning to analyze and critically evaluate, gaining factual knowledge, and learning fundamental principles are each selected 56% of the time. These findings suggest that conversation among WGSS faculty teaching the course about which objectives make sense on IDEA would be good--although, of course, some variety here may be desirable given that there are different valid ways to teach the course.

Comparisons of Progress on Relevant Objectives scores are somewhat depressing. When compared to the IDEA database, only 11% of Perspectives classes rank in the top 30 percent, and 44% rank in the lowest 30 percent. Excellent teacher and excellent course are worse, with 44% in the similar range (fine) but 44% in the lowest 10%!


For no selected objective did students report learning at the same rate as students in other courses at Stockton or in the IDEA database. The best learning appeared to be for gaining factual knowledge (for the five classes for which it was selected).

Pedagogies used frequently were forming teams and discussion groups, asking students to share ideas and experiences with others, displaying a personal interest in students and their learning, relating course material to real life, involving students in hands-on projects, and explaining course material clearly and concisely. These are good, but this is a fairly short list, and in many cases students also reported that some of these were used infrequently.

An area for faculty to focus on first might be the student report that giving tests, projects, etc. that covered the most important points of the course was used infrequently (relevant to 8 of 9 classes in the report; student rating of 3.8).

Students reported a low desire to take the class (44% of students rated below 3.0, compared to 8% of all Stockton classes and 16% of all IDEA classes with similar ratings). Students report working about as hard as is typical at Stockton and in the IDEA database and reported the class to include a large amount of reading in comparison to other classes and a bit more non-reading work, with similar difficulty of subject matter. Perhaps most alarming is that few students had a more positive attitude toward the subject after the class (3.5 rating compared to 4.1 for all Stockton courses and 3.9 for IDEA courses).

In other words, students don't want to take the class (overall), don't think they learned that much, don't like the class much, and don't get an improved attitude about the subject. On the bright side, the course appears to have an appropriate level of intellectual rigor for its place in the curriculum and the type of class it is.

Still, these results seem to indicate that faculty may need to discuss what to do to help focus the class and improve student learning and attitudes in the class.

APPENDIX G: SAMPLE CLA PROGRAM RESULTS

Last fall, the IFD completed program-level analyses of CLA results for some active programs. See sample results below:

PSYCHOLOGY

Dear Liz,

I am sending some summary, descriptive data on 24 Psychology seniors who took the CLA in spring 2009 and spring 2010. This is a small group, so be wary of making any major decisions based on this data--CLA requests that you have 100 students in a sample to do any sub-group analysis, and this is well below that. That said, I am sending some data to programs that have had more than 20 participants over the past two years in case this increases the usefulness of our CLA results for change at Stockton.

 Now that I've given caveats, I'll report:

Psychology seniors spent from 3 to 50 minutes on the test, with an average of 32 minutes. The average GPA for these students was 3.17, with a range from 2.14 to 4.0.

Mean CLA score of seniors across schools that took the CLA: performance task=1156/analytic writing=1226 (2010 data)


Mean CLA scores of seniors at Stockton: performance task=1130/analytic writing=1166

Mean CLA scores of Psychology seniors at Stockton: performance task=1090/analytic writing=1194

On both the performance task and the analytic writing task, Psychology seniors, like all Stockton seniors, performed worse than seniors did nationally. On the writing task, Psychology seniors performed slightly better than Stockton seniors overall; on the performance task, slightly worse. However, our overall student data has such high variance (as does your program's: standard deviation of 176.3 for the performance task and 126.6 for analytic writing) that this is not particularly meaningful.
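As a rough illustration of why such gaps are not meaningful at this sample size: with a standard deviation of 176.3 and only 24 test takers, the standard error of the mean is about 36 points, so the 40-point gap between Psychology (1090) and Stockton overall (1130) sits well inside ordinary sampling noise. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope check on why the Psychology vs. Stockton gap is
# not meaningful: compute the standard error of the mean from the SD and
# N reported above, and the rough width of a 95% interval.
import math

sd = 176.3                # performance-task SD reported above
n = 24                    # Psychology seniors who took the CLA
se = sd / math.sqrt(n)    # standard error of the mean
print(f"standard error of the mean: about {se:.0f} points")
print(f"approximate 95% interval: +/- {1.96 * se:.0f} points")
# The observed gap (1130 - 1090 = 40 points) is close to one standard
# error, so it could easily arise from sampling variation alone.
```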

Here's how students scored when their entering academic ability (as measured by the SAT if they are native and the SLE if they are transfers) is taken into account:

Performance task: well below expected=3 / below expected=2 / at=1 / above=3 / well above=1

Analytic writing: well below=1 / below=3 / at=0 / above=5 / well above=1

 Nine of your test takers were native to Stockton and the rest were transfer students. Your native students performed worse than your transfer students did, but the N is so small that this is not meaningful. 

I don't think this data will prove useful to your program, unfortunately. All it (kinda) does is indicate that psychology students don't appear to be dramatically different from other Stockton students. I do hope that my sharing it helps thank members of your program for donating class time or otherwise persuading students to participate in the CLA.

BUSINESS

Dear colleagues,

 I'm sending some summary, descriptive data on 48 Business seniors who took the CLA in spring 2009 and 2010. This is a small group, so be wary of making any major decisions based on this data--CLA requests that you have 100 students in a sample to do any sub-group analysis, and this is below that. That said, I'm sending some data to programs that have had around 20 or more participants over the past two years in case this increases the usefulness of our CLA results for change at Stockton.

 Now that I've given caveats, I'll report:

The 19 of your students who took the test in spring 2010 spent an average of 43 minutes on the test, with a range of 5-78 minutes. The average GPA of your test takers was 3.3 with a range from 2.87-3.98.

Mean 2010 CLA score of seniors across schools who took the CLA: performance task=1156/analytic writing=1226

Mean 2010 CLA scores of seniors at Stockton: performance task=1130/analytic writing=1166

Business 2009 and 2010 senior means: performance task=1086.5/analytic writing=1149.7

Senior Business majors at Stockton, as sampled in spring 2009 and 2010, scored slightly worse than the average student at Stockton on the performance task, and quite a bit worse than the average senior in the national sample on the performance task. They scored slightly worse than the average Stockton student on analytic writing, and quite a bit worse than the average senior in the national sample on analytic writing. However, our student data has high variance. Nonetheless, this data seems in line with national findings that students in Education and Business tend not to show as much improvement in critical thinking and analytic writing skills over their college education. Clearly, Business and Education majors (nationally and locally) could be learning a great deal that is not measured by the CLA.

 Here's how students scored when their entering academic ability (as measured by the SAT if they are native and the SLE if they are transfers) is taken into account:

Performance task: well below expected=7 / below expected=6 / at=3 / above=1 / well above=4

Analytic writing: well below=8 / below=7 / at=3 / above=1 / well above=4

In other words, their performance is fairly low compared to how we'd expect them to perform based on their SAT scores--this measure helps correct for a group with lower overall academic ability. You can also get a sense here of how variable the students' performance was.
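One way to put a number on "fairly low compared to expectations" is to tally what share of the Business test takers fell in the two below-expected bands; a small sketch using the counts above:

```python
# Tally the share of Business test takers in the below-expected bands,
# using the band counts reported above (well below, below, at, above,
# well above expected).
performance = {"well below": 7, "below": 6, "at": 3, "above": 1, "well above": 4}
writing = {"well below": 8, "below": 7, "at": 3, "above": 1, "well above": 4}

for label, bands in (("performance task", performance),
                     ("analytic writing", writing)):
    total = sum(bands.values())
    below = bands["well below"] + bands["below"]
    print(f"{label}: {below} of {total} ({below / total:.0%}) below expected")
# Roughly 60-65% of these students scored below expectation on each task.
```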

I hope this data can help inform Business as faculty consider the learning of their students in relation to critical thinking and analytic writing.  I do hope that my sharing it helps thank members of your program for donating class time or otherwise persuading students to participate in the CLA.

HISTORY

Dear Tom (are you still coordinator? It is so hard to find this information!) and Rob (in thanks for volunteering),

 I'm sending some summary, descriptive data on 19 History seniors who took the CLA in spring 2010. This is a small group, so be wary of making any major decisions based on this data--CLA requests that you have 100 students in a sample to do any sub-group analysis, and this is well below that. That said, I'm sending some data to programs that have had around 20 or more participants over the past two years in case this increases the usefulness of our CLA results for change at Stockton.

 Now that I've given caveats, I'll report:

Your students spent an average of 40 minutes on the test, with a range of 18-58 minutes. The average GPA of your test takers was 3.33, with a range from 2.87 to 3.98.

Mean CLA score of seniors across schools who took the CLA: performance task=1156/analytic writing=1226

Mean CLA scores of seniors at Stockton: performance task=1130/analytic writing=1166

History senior means: performance task=1149/analytic writing=1257

Senior History majors at Stockton, as sampled in spring 2010, scored very slightly better than the average student at Stockton on the performance task, but slightly worse than the average college senior nationally. On analytic writing, they scored much higher, on average, than seniors at Stockton and slightly above the national average. However, our overall student data has such high variance (as does your program's: standard deviation of 131.3 for the performance task and 186.7 for analytic writing), and the number of your students who took each part of the test (10 and 9) is so small, that this is not particularly meaningful.

Here's how students scored when their entering academic ability (as measured by the SAT if they are native and the SLE if they are transfers) is taken into account:

Performance task: well below expected=0 / below expected=2 / at=0 / above=3 / well above=2

Analytic writing: well below=1 / below=1 / at=2 / above=0 / well above=3

We don't have this comparative data for some of your seniors because they took the CLA but not the SLE and, as they were transfer students, we don't have their SAT scores. They also, then, were not included in the official Stockton report. (This is not because of your students, but because yours were the first seniors my office proctored last spring, and we didn't anticipate that so many students would neglect the SLE without additional prompting.) Fourteen of your test takers were transfer students.

I don't think this data will prove useful to your program, unfortunately. About the only thing it (kinda) indicates is that your students' analytic writing skills are reasonably good, although this is a sample of just nine students who completed that part of the CLA, so no strong conclusion can reasonably be drawn. I do hope that my sharing it helps thank members of your program for donating class time or otherwise persuading students to participate in the CLA.