Evaluating Writing Programs in Real Time: The Politics of Remediation

Barbara Gleason

College Composition and Communication, Vol. 51, No. 4 (Jun., 2000), pp. 560-588.

Stable URL: http://links.jstor.org/sici?sici=0010-096X%28200006%2951%3A4%3C560%3AEWPIRT%3E2.0.CO%3B2-9

College Composition and Communication is currently published by the National Council of Teachers of English.

Barbara Gleason

Evaluating Writing Programs in Real Time: The Politics of Remediation

A case study of the evaluation of a three-year pilot project in mainstreaming basic writers at City College of New York suggests that the social and political contexts of a project need to be taken into account in the earliest stages of evaluation. This project's complex evaluation report was virtually ignored by college administrators.

College writing remediation has become increasingly controversial in recent years. David Bartholomae has suggested that we reexamine the term "basic writing" ("The Tidy House"); Mike Rose has argued that an ideology of intellectual inferiority permeates remedial instruction ("The Language of Exclusion"); and a highly influential basic writing curriculum dissolves distinctions between remedial writing and college composition (Bartholomae and Petrosky). During this same era, several writing programs have begun to experiment with enrolling remedial-placed students in full-credit-bearing college composition courses (Grego and Thompson; Rodby; Royer and Gilles).

As names such as Bartholomae and Rose would suggest, this "anti-remediation" movement is being propelled by scholars who are sympathetic to the aspirations of those labeled "remedial writers." At the same time, however, forces opposed to the admission of "unqualified" students have launched their

CCC 51:4 /JUNE 2000


own attack on remediation. The most striking recent manifestation of this attack appeared in May 1998 when the Board of Trustees for the City University of New York (CUNY) first approved a resolution to refuse senior college admission to students who have not passed all three CUNY skills assessment tests in reading, writing, and math (Arenson).1

Although our profession does not endorse placing students in remedial writing courses on the basis of one test,2 all seventeen CUNY colleges have long used the CUNY Writing Assessment Test, a 50-minute impromptu exam, to determine students' initial placement in a writing course sequence. Students who fail the test, even though they have already been accepted into a CUNY senior college3 on the basis of high school grades, class rank, or SAT score, have been generally required to enroll in remedial writing (Basic Writing 1, Basic Writing 2, or an ESL writing course) and have frequently been barred from other required college courses. The same testing conditions and grading rubric have been applied for native English speakers and for students whose first language is other than English, almost one-half of this student population.4

During summer 1993, my colleague Mary Soliday and I received support from the Fund for Improvement in Postsecondary Education (FIPSE) to pilot a mainstreaming project for students of diverse cultural backgrounds and writing abilities at the City College of New York (CCNY), one of CUNY's seventeen colleges. Central to this endeavor was a two-semester writing class that enrolled both students who had placed into Basic Writing 1 or Basic Writing 2 and students who had placed into English 110 (first-year college writing).

During the three-year life of this project, the course of our journey never ran smooth. Not only were we piloting a writing program that was radically different from the existing one (which continued to operate simultaneously), but we were doing so in an environment colored by continued controversy over CUNY's 1970 open admissions policy. During the project's second year, for example, James Traub published a scathing attack on remedial students and courses at City College in one issue of The New Yorker and in his book, City on a Hill.5 Traub's critique appeared close on the heels of journalist Heather MacDonald's "Downward Mobility: The Failure of Open Admissions at City University" and John Leo's condensed version of that critique ("A University's Sad Decline"). In Leo's analysis, CUNY's decline had resulted from letting "the walls between remedial and college-level courses [come down]" (20), a critique that in fact captured a primary aim of our pilot project.


In a review of Traub's book for The Nation, Jon Wiener connected the critiques of Traub, MacDonald, and Leo:

Traub doesn't express the loathing for Harlem's youth that you find in John Leo and Heather MacDonald; he presents himself as sympathetic to their problems and interested in their lives. But although he never makes it explicit, the conclusion he leaves the reader with is pretty much the same one that Leo and MacDonald presented: Sending the youth of Harlem to college is useless; the taxpayers should stop wasting their money. (522)

These journalists' critiques upped the ante for a project like ours, which allowed students to bypass remedial courses. How would remedial students, those whose very admission to CUNY was now being publicly challenged, succeed when they were admitted directly to a full-credit writing course? And how would we best be able to demonstrate the nature and the extent of their successes and failures? Because the question of remediation, and remedial writing in particular, had become a cause célèbre at CUNY, we realized that responsible program evaluation would be particularly critical.

The project and its evaluation: an overview

The thirty-seven mainstreamed sections of first-year writing we piloted shouldered a very heavy burden: The classes had to respond to the needs of a highly diverse student population, nearly 50% of whom were bilingual or wrote English as a second language; to the needs of a specific faculty, 80% of whom were part-time or "adjunct" teachers; and to institutional expectations, i.e., preparation of students for other college courses and for two institutional writing tests.

To meet the needs of students, we developed a curriculum that brought second-language learning and dialect forms into the classroom as topics for discussion, reading, and writing. Classroom tutors worked with teachers to provide the individualized instruction and peer support so badly needed on an urban commuter campus. To meet the needs of the faculty we scheduled faculty workshops and asked instructors to compile teaching portfolios and contribute in various ways to the project's evaluation. To meet the expectations of our college, we promoted writing projects that involved students in analytical thinking and research writing.

To theorize the curriculum we drew heavily on Shirley Brice Heath's study of literacy, on the work of sociolinguists such as Deborah Tannen, and on the curriculum development research of Eleanor Kutz, Suzy Groden, and Vivian Zamel (Groden, Kutz, and Zamel; Kutz, Groden, and Zamel). We proposed that teachers encourage self-reflection about past literacy and language experiences through the writing of autobiographies; we proposed further that teachers promote metalinguistic awareness by designing writing projects that engage students in learning about language from a sociolinguistic perspective, by collecting samples of actual language, and by examining and analyzing language.

The most difficult aspect of this endeavor, as we later came to discover, was program evaluation. The dearth of published resources on writing program evaluation became quickly apparent as we searched the literature for guidance.6 How were other writing programs commonly evaluated? Who conducted the evaluations and wrote the reports? Who were the readers of these reports, and what sorts of evidence did they find most useful or most persuasive? We quickly deduced that no one evaluation approach would answer questions about a pilot writing course, its support structures, students' experiences, and the effects of relevant educational and cultural contexts. And because students who place into remediation on the basis of a writing skills test are usually believed to be unprepared for college composition and sometimes "underprepared" for college courses in general, we found it necessary to evaluate the students' experiences in the curriculum at large as well as in the writing course itself.

We developed an evaluative process consisting of four components: (1) formative evaluation of students' and teachers' experiences in the writing course, which included an "expert judgment" evaluation at the end of the project's first year; (2) a statistical analysis of student progress and achievement both in writing courses and other undergraduate courses; (3) an expert judgment report of the project at the end of its third and final year (summative evaluation); and (4) an evaluation of the writing of twenty-two randomly selected students and self-assessment of their learning. Because we have described the fourth component of this evaluation in detail elsewhere (Soliday and Gleason), I will limit this discussion to the first three components.

Formative evaluation of the first year

The process of evaluating a program in order to understand and improve upon it, generally referred to as "formative evaluation," is often contrasted with "summative evaluation," a product-oriented form of assessment reporting on the quality of a program for an external audience (Davis, Scriven, and Thomas 3). Our own experience, however, was that formative evaluation impacted on summative evaluation in important though unforeseen ways.

We discovered that Suzy Groden's evaluation at the end of the first year, the "expert judgment" approach described by Witte and Faigley, served primarily as a type of formative evaluation. In large part because she evaluated this project at a fairly early stage in its evolution, Groden's assessment informed us about the experiences of students and instructors, about problems with instruction and curriculum, and about issues in constructing a summative project evaluation. Of particular value to us were Groden's insightful comments about the curriculum, both as articulated in program documents and as actualized in classrooms.

The curriculum: what we learned during the first year

The primary goal of our formative evaluation was analysis of the course curriculum as manifested in actual classrooms. By poring through teachers' portfolios and interview transcripts with teachers and tutors, and with the benefit of Groden's expert judgment, we began to discover existing problems with the curriculum as interpreted and implemented by instructors. One of the most serious problems was that some teachers and students did not fully grasp the nature or the purpose of the pilot writing course curriculum. Consequently, the curriculum was not appearing in actual classrooms as completely and consistently as we had hoped.

During that first year, we had presented to teachers written materials outlining principles of language and learning designed to guide teachers' individual interpretations of the curriculum. In the context of the regularly scheduled faculty workshops, we suggested several writing projects that we believed embodied curricular principles, one of the most important of which is that language consciousness underlies literacy development (Pattison 8-9). Among the assignments we suggested were a literacy/language autobiography (Soliday) and an analysis of students' own oral and written stories (Gleason, "Something of Great Constancy"). In addition to these two recommended assignments was one writing assignment that was required of all students: the ethnographic research report.

In analyzing transcripts of interviews with teachers (for which we used a scripted list of questions), we discovered that some teachers were requesting more curricular structure and greater guidance from us. One teacher, a first-time composition instructor, suggested "giving the students much more of a continual kind of rationale" for "what they were embarking on." She went on to say "I wasn't even that clear what I was embarking on" and to suggest "I do think it is true that we need to pay more attention to structure and more support [for teachers] at the beginning." In her final report, Groden enlarged on this theme:

The major problem that was discernible from the April 1994 visit was that the basic premises of the program, the rationale for the curriculum, and the philosophical framework underlying the project were not fully understood by some of the teaching faculty and students. (Evaluation Report 22)

In addition to suggesting that faculty receive a strong orientation prior to teaching the pilot course, Groden noted that "greater consistency in the curricular content of the courses seems desirable" (22).

In discussions of future evaluation plans, Groden argued persuasively for an external review of student writing as one approach to studying the effectiveness of the pilot course curriculum. As a result of these discussions, we began considering some form of project-wide portfolio evaluation and the merits of project-wide required writing assignments to anchor the curriculum firmly in every course section. We suspected that an absence of uniformity in writing topics and genres would make it difficult for external readers to evaluate student writing from several different course sections. We knew that at least some uniformity of portfolio contents had been common in other program-wide portfolio assessments (Elbow and Belanoff) and that the quality of student writing may vary from one genre to another; for example, argumentative writing is more difficult for many student writers than narrative (Engelhard, Gordon, and Gabrielson). We therefore found it desirable that students in different course sections write at least some essays in similar genres.

Our concerns about project evaluation, coupled with the teachers' requests for greater guidance and curricular structure, led us to require five assignments in the second year: a literacy/language autobiography, an analysis of oral and written stories, an ethnography, a research report on some form of popular culture, and an essay for which each student would choose a topic. We also emphasized the importance of self-reflective writing, which might appear in a portfolio cover letter or in other writing experiences. These five required assignments were (as a result of a vote by second-year teachers) reduced to three in the project's third year: the literacy/language autobiography, the ethnography, and a library research report/documented essay.


Our formative evaluation of the pilot course during its first year thus had a direct impact on a summative assessment of student writing at the end of the project's third and final year (Soliday and Gleason). Requiring common writing assignments made the presence of the project curriculum more apparent in all writing classrooms; achieving some degree of uniformity in the contents of the student portfolios that would be assessed at the end of the project's third year made the final summative assessment easier. For external readers to assess the portfolios of students they do not know in the context of an unfamiliar curriculum presents difficulty enough. Adding diversity of writing assignments to the evaluators' reading experience would have complicated a task that was already too demanding and complex.

Using student transcripts to analyze student progress

Another key issue during the first year of this project was for us to learn about the many different sorts of problems students and teachers were encountering as a result of participating in a newly formed, two-semester writing course in which a class of students stayed with one teacher for a full academic year. One of the first ways in which we set out to examine student experiences was to study information provided by student transcripts: course enrollments, grades, writing skills test score, and writing course placement.

In analyzing transcripts, we did not view our aim as one of documenting success or lack of success but rather as one of discovering what was actually going on in classrooms. It soon became apparent that we had much to learn about the many logistical problems created by the new two-semester course and by allowing "remedial" students to enroll in college-level courses.7 During this first year, we were also able to use descriptive statistics culled from transcripts to address some concerns voiced by our colleagues.

We compiled a transcript list that enabled us to keep track of each student's academic progress and grades: for each student in the mainstreamed sections of English 110, we made an entry of all courses enrolled in during the first year, the grades received in those courses, the results of the CUNY Writing Assessment Test and the CUNY Reading Assessment Test, and whether the student enrolled through our college's access program SEEK (Search for Education, Elevation, and Knowledge).
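The transcript list functioned, in effect, as a small hand-built database. The sketch below illustrates the kind of per-student record just described and a summary of a section's placement mix; the field names, helper functions, and sample values are our own illustration, not the project's actual record layout.

```python
# A sketch of the kind of per-student entry the transcript list held.
# Field names and sample values are illustrative assumptions only.

def make_record(name, placement, seek, wat_pass, rat_pass, courses):
    """One transcript-list entry: initial placement, SEEK status,
    CUNY Writing/Reading Assessment Test results, and course grades."""
    return {
        "name": name,
        "placement": placement,  # "English 110", "Basic Writing 1" or "Basic Writing 2"
        "seek": seek,            # enrolled through the SEEK access program?
        "wat_pass": wat_pass,    # passed the CUNY Writing Assessment Test?
        "rat_pass": rat_pass,    # passed the CUNY Reading Assessment Test?
        "courses": courses,      # {course name: grade}
    }

def section_mix(records):
    """Count students in a section by initial placement, as in the
    snapshot description of one section given below."""
    mix = {}
    for r in records:
        mix[r["placement"]] = mix.get(r["placement"], 0) + 1
    return mix

section = [
    make_record("A", "English 110", False, True, True, {"World Humanities 101": "B"}),
    make_record("B", "Basic Writing 2", True, False, True, {"World Humanities 101": "A"}),
    make_record("C", "Basic Writing 1", True, False, True, {}),
]
print(section_mix(section))  # counts by initial placement
```

Keeping the entries in one structure like this is what made it possible to answer administrative questions (placement mix, missing grades, test results) without re-pulling individual transcripts.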


Our transcript list proved invaluable for program administration and formative evaluation during the first two years of this three-year project. Using the list, we could assess the mix of students in individual sections; here, for example, is the information we had on one section:

28 students enrolled in fall 1994
29 students enrolled in spring 1995
11 originally placed in English 110
12 originally placed in Basic Writing 2
5 originally placed in Basic Writing 1
12 students were in SEEK

This snapshot description reveals a fairly balanced mix of students who had placed into college-level English 110 versus students who had placed into remedial writing (Basic Writing 1 and Basic Writing 2). (Although each course section had a unique mix of remedial and college-level students, both remedial and college-level-placed students were enrolled in every class.)

This descriptive data also allowed us to identify and address specific logistical problems created by the pilot course. For instance, although a student's grade for both fall and spring semesters was supposed to have been recorded at the end of the academic year (students received one grade for both semesters), a few students found that their grades for the fall semester had not been recorded. Our transcript list (which we updated periodically) allowed us to know when we needed to request grade changes.

The transcript list also was particularly useful in allaying our colleagues' concerns about the project and in the final project evaluations. We discovered that many of the students who had placed into remedial writing had passed not only the pilot course but also some core curriculum courses for which they ordinarily would not have qualified. Many of our colleagues had been very concerned that remedial-placed students participating in this project might be set up for failure, both in their composition course and in other core courses. Colleagues questioned our decision to allow remedial-placed students enrolled in the pilot writing course to co-enroll in core curriculum courses (for which they otherwise would not have been eligible). At the request of our department chair,8 we had initially agreed to only one restriction on enrollment of our remedial-placed students in core courses: they had to have passed the CUNY Reading Assessment Test. However, if they had passed this reading test but not the writing test, pilot-course students were eligible to enroll in such courses as world humanities, world civilization, and philosophy.

When we compared the grades of remedial-placed students with the grades of students placed directly into English 110, we discovered that our remedial-placed students were passing the core courses at a rate that was even higher than the rate for our pilot course students who had placed into English 110. Our data for pilot-course students' grades in World Humanities 101 are listed in Table 1. The World Humanities 101 course and a second humanities course subsequently became the two core courses used in our consultant's quantitative analysis of student grades and accumulated credits.

During the spring 1994 semester, we appeared at a Core Curriculum Committee meeting to request that committee's approval to continue our experiment in allowing remedial-placed students to enroll in core curriculum courses. Because we had descriptive statistics indicating the success of remedial-placed students in core courses, the committee quickly approved our policy of allowing pilot course students who had placed in remedial writing to enroll in core curriculum courses.

Organizing and reviewing up-to-date transcript files offer a powerful alternative to the sorts of anecdotal evidence that too often influence writing program policy decisions. As U.S. Department of Education research analyst

Table 1: Fall 1993 students who enrolled in World Humanities 101

Eligibility                          Number   Grade
English 110 and core courses             7    A
                                         3    B
                                         4    C
                                        14    A, B, or C
                                         6    D or F
Basic Writing 2 but not normally         3    A
for core courses                         6    B
                                         4    C
                                        13    A, B, or C
                                         3    D or F
Basic Writing 1 but not normally         3    A, B, or C
for core courses                         1    D


Clifford Adelman points out, transcripts "tell us what really happens, what courses students really take, the credits and grades they really earn, the degrees they really finish and when those degrees are awarded" (vii).

The issue of remedial-placed students passing or failing core curriculum courses was to become a key element in our statistical analysis of student grades and credits during the project's third year. Data provided by our transcript list offered an early indication of what sorts of conclusions a statistical analysis of students' grades would yield.
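The comparison of pass rates can be made concrete with the grade counts in Table 1. Treating a grade of A, B, or C as passing, a few lines of arithmetic recover the rates; the counts come from Table 1, while the small helper function is our own illustration.

```python
# Pass rates implicit in Table 1, counting A, B, or C as passing.
# Grade counts are taken from Table 1; the helper is illustrative only.

def pass_rate(abc_count, df_count):
    """Fraction of students earning A, B, or C."""
    return abc_count / (abc_count + df_count)

eng110 = pass_rate(14, 6)   # placed directly into English 110
bw2 = pass_rate(13, 3)      # placed into Basic Writing 2
bw1 = pass_rate(3, 1)       # placed into Basic Writing 1

# Both remedial-placed groups pass World Humanities 101 at a higher
# rate than the English 110-placed group (0.70), consistent with the
# comparison reported above.
print(eng110, bw2, bw1)
```

Simple descriptive ratios of this kind are exactly the sort of evidence that persuaded the Core Curriculum Committee, well before any formal statistical modeling was done.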

The statistical study of student progress and achievement

A central goal of the statistical component of the evaluation was assessment of the performance of students who placed into remedial writing but chose instead the pilot college-level writing course. When we began the project in academic year 1993-94, there was very little relevant research from other institutions that would help us design this evaluation. We did consult Peter Dow Adams' study of the various academic paths taken by basic writers at Essex Community College, which indicates that many self-selected students who place into remedial writing will in fact pass college composition if given the opportunity.9 Although strongly suggestive, this study does not offer a model of program evaluation, for which we had to look elsewhere.

We invested a good deal of our time and resources in this component of project evaluation because we believed that quantitative research would speak far more powerfully to college administrators than would any form of qualitative assessment. As Mina Shaughnessy noted twenty years ago, "the debate about Open Admissions has been and is being carried on in the language of those who oppose it: in the alphabet of numbers, the syntax of print-outs, the transformations of graphs and tables" (401-04).

Measures of predicting student success

Our college's policy of placing students who fail the CUNY Writing Assessment Test into a (reduced credit or no credit) remedial writing course and barring them from required core curriculum courses is underwritten by the assumption that the writing test predicts student success in college courses. A central aim of our project evaluation would be to challenge that premise. We knew that we would need to study the predictive value of initial placement in a remedial writing course. In preparing for this component of project evaluation, we searched for what others had found to correlate positively with student success in college.

One particularly valuable resource we found was What Matters in College? Four Critical Years Revisited, in which Alexander Astin reports on a longitudinal study of 24,847 students at 159 different institutions from 1985-90. The strongest predictor of students' college grades is their high school grades, Astin concludes, with SAT verbal scores coming in second at one-half the relative weight of high school grades. Astin reports that "hundreds of other studies" conclude that "the two most important input predictors of ... students' college grades ... are the high school GPA and SAT verbal scores" (188).

Constructing a database

Our reading of Astin's research led us to speculate that students' high school grades might be a stronger predictor of students' success rates in college than the results of a writing skills test. Even before reading Astin's work we had planned to use a multivariate analysis approach to study the relationships among student success in college and variables known to correlate with student success. In our grant proposal, we had presented a very general outline of this plan as one of four components of project evaluation. That component of our evaluation plan reads as follows:

Development of a quantitative database with the help of a professor from the Department of Education and the CCNY Office of Institutional Research. Current student records will be used to track GPA and retention rates and will be compared with the records of students in the conventional Writing Program at CCNY. Additional student records will be developed by means of a survey which collects information formerly found to correlate with student success (e.g., family responsibilities, full-time work, family income); first and second language information will also be included in the survey. These variables will be analyzed by means of multivariate analysis in order to evaluate the relative effects of experience in this pilot project. (Gleason and Soliday FIPSE Application Proposal 13)

We later came to discover that only information that was already available in existing student records could be used in a statistical study, for we lacked the necessary funding to administer a survey and create a new database for all the students enrolled in writing courses over a period of three years (4,363 students). This lack of information meant that the three variables we had mentioned in our grant proposal (family responsibilities, full-time work, family income) could not be used, nor could any information for which existing student records presented incomplete data. Incomplete records on student language background and ethnicity, for example, prevented us from studying a possible testing bias against second language speakers and particular minorities, a thesis previously advanced by our colleague Ricardo Otheguy.

Our experience in constructing a database from existing student records was so complex and so impeded by unforeseen problems that I would specify this as a primary issue for assessing student performance in a mainstreamed writing course. It is crucial to find someone within the institution who can download specific types of data from student records and then create a database consonant with the goals of a statistical study. In some colleges, creating a database will be a fairly routine, uneventful process, but if our experience is at all indicative, this will not be true in all schools.10

Although we encountered many difficulties creating a database, these problems pale vis-à-vis the difficulties we experienced in finding a consultant to design and conduct the statistical study.

Encountering institutional resistance and hiring an external consultant

Our decision to let students self-select into our pilot writing course had direct consequences for our project's evaluation. This decision precluded the possibility of using the experimental model so strongly favored by the research director in our college's Office of Institutional Research. In a preliminary correspondence, this research director proposed that we randomly assign students to a "control group" and a "treatment group." Because it was neither feasible nor desirable for us to assign students to the pilot course at random, we found it difficult to enlist the services of the research director in our Office of Institutional Research. The full extent of this difficulty was made painfully evident in a first meeting with him, at which a senior (male) professor from the Math Department, who happened to be the college ombudsman, arrived unannounced and prepared to convince us to use random samples and an experimental model.

During this meeting it became apparent that were we to rely on the service of the college's institutional research director, we would have limited input into the design and implementation of the study. Drawing on ideas from Alexander Astin's work and from research on CUNY students by David Lavin, Richard Alba, and Richard Silberstein, we believed we could include all pilot course students in the study and not just the students from the project's third year, as this research director's plan would require, since we had already enrolled students for the first two years at the time of our first meeting with him. By using multivariate analysis statistical methods we would be able to control for differences among students (e.g., effects of high school average, skills test scores, and previously earned college credits) and to avoid the necessity of using randomly selected samples of students. Our college's research director saw the problem differently: he viewed the traditional experimental research model as the most valid approach. He also mentioned that it would be important to control dissemination of any information from the study that would reflect badly on City College, a comment reflecting the growing sensitivity of City College employees to the "CUNY critics."

Having been advised by our funding agency that reports from external consultants are generally more credible than reports from employees of our own institution, we became even more determined to hire a consultant who was not employed by our own college. We wanted this consultant to propose a research design, visit our campus, oversee the creation of a database, perform the analyses, and write a final report. This lengthy, labor-intensive work requires both extensive expertise in statistics and experience with educational research; it also requires a significant time commitment. Not surprisingly, we experienced some difficulty in finding a consultant to conduct this evaluation. The consultant we eventually hired, Matthew Janger, was employed as a research assistant at Social Policies, Inc., a private company we discovered through contact with the American Association of Higher Education.11 We felt ourselves very fortunate to find someone who was competent to perform the analysis and who would engage in meaningful dialogue with us along the way.

In retrospect, I view our difficulties in designing the statistical evaluation as representative of a pervasive institutional resistance to this project. In this case, the resistance manifested itself in the attempt of two men (the institutional research director and the math professor/college ombudsman) to exert their influence in order to gain some control over the statistical study. But this was only one of many forms of resistance to this project. Why would certain factions within our college balk at the idea of a mainstreamed writing course? We were integrating into the regular curriculum students believed to be underprepared for college and blurring distinctions between remedial and college-level courses, or, in the words of John Leo, allowing students out of the remedial "antechamber" (20).

A second "expert judgment" assessment: Keith Gilyard's report

At the end of this project's third and final year, we invited Keith Gilyard to visit our campus and write an evaluative report. The particular forms of expertise that Gilyard brought to this evaluation are worth noting. Witte and Faigley have suggested that the expert judgment approach suffers from our profession's lack of agreement about the types of expertise required of an evaluator (5). However, we found that the nature of our project and our own questions provided a strong index of the sorts of expertise called for in this evaluation.

As a student, Gilyard had direct experience of the New York City public school system (described in his Voices of the Self), which is where most CUNY students have been educated. Gilyard eventually became a professor at CUNY's Medgar Evers College, where he taught for fourteen years in the English Department. More recently, Gilyard has served as Director of the Writing Program at Syracuse University, a private institution very different from any CUNY college, with a strongly supported and highly innovative writing program.12 It was both his insider's knowledge of CUNY writing programs and his own experience as a program administrator that made Gilyard such a strong potential evaluator of our project. He understood the history and the politics of CUNY's skills tests and remediation practices, but he also understood issues in writing program development from the perspective of a broad spectrum of scholarship and research.

Before we had begun planning Gilyard's visit to our campus, we were paying far closer attention to the effects of institutional politics on our work than we had earlier. By the end of the project's second year, it had become patently obvious that powerful forces within our college would block institutionalizing a course allowing "remedial" students directly into the college curriculum. The full impact of this opposition was brought home to us when, in this project's final year, our Faculty Council voted to make English 110 (and no longer just remedial writing) a prerequisite for six core curriculum courses required of all students, a decision that would impede the progress of all students at City College but would slow the progress of remedial writing students by at least two, and in some cases, three semesters. Clearly a wide range of faculty and administrators viewed themselves as stakeholders in our college's writing courses.


The concept of stakeholders is central to an evaluation approach proposed by Egon Guba and Yvonne Lincoln. In their analysis of approaches to evaluation, Guba and Lincoln make a strong case for a dialogic, politically oriented process, arguing that earlier modes of evaluation ignored the dimension of socio-political context and a sociological understanding of knowledge. First generation evaluation is based on the testing of students. An evaluator's role, which is primarily technical, involves selecting from among available tests (or "instruments") or creating a needed test. Contemporary examples of first generation evaluation exist in the form of tests required of students to graduate from high school or tests required for admission to college (Guba and Lincoln 26). Second generation evaluation resulted from the recognition that it is not only students who must be evaluated, but also curricula. The evaluator is construed as one who describes "patterns of strengths and weaknesses with respect to certain stated objectives." Testing of students is retained but viewed as only one of several forms of measurement (28). Third generation evaluation moves beyond an "objectives-oriented descriptive approach" to include judgments. The evaluator is still responsible for technical decisions, e.g., what tests to use, but judgment is considered central to this approach (30-31). The expert judgment approach described by Witte and Faigley falls under the category of third generation evaluation. The evaluator's expertise (in writing program development or writing instruction) is described by Guba and Lincoln as the "connoisseurship qualities" for which the evaluator is chosen (31). Clearly, CUNY's mass assessment of all incoming students' reading, writing, and math skills exemplifies the concept of first generation evaluation.

What distinguishes Guba and Lincoln's fourth generation evaluation from its predecessors is a hermeneutic, "constructivist" epistemology that meshes values with understanding (44). Objectivist or realist epistemologies, which retain the distinction between the knower and the known, are rejected in favor of the theory that knowledge arises from "interaction between observer and observed" (44). Under this type of analysis, evaluation becomes far more a socio-political process than an empirical investigation or an exercise in description and judgment, with the important difference that every "stakeholder group expects and receives the opportunity to provide input into an evaluation that affects it and to exercise some control on behalf of its own interests" (51). Stakeholders are described as "groups at risk" who are "open to exploitation, disempowerment, and disenfranchisement" and who will use the information provided by an evaluation and be educated by it (51-56). This definition of stakeholders might well be broadened to include all those who have a strong interest in a course or program and who might influence its future, but by this definition the most obvious stakeholders in a writing program are part-time and temporary instructors.

The fourth generation approach to evaluation has much to offer writing programs, which are often held accountable by many different departments and sometimes even by groups not directly affiliated with a college (Crowley; Phelps). However, this approach requires involvement of various stakeholder groups from the very inception of an evaluation process and, if possible, from the inception of a new curriculum or course structure. In our case, use of a fourth generation approach could have entailed including certain stakeholders in writing our grant proposal, a tactical move that might have strengthened the likelihood of the mainstreamed writing course surviving its three-year funding from FIPSE, which has not happened.

Despite the late hour at which we discovered the fourth generation theory of evaluation, it was not too late to incorporate the stakeholder concept into our second and final expert judgment evaluation, and this we did do. Rather than limiting the group of people Keith Gilyard would interview to those students, tutors, and teachers directly involved in the pilot project (as we had done for Suzy Groden), we decided to invite some professors known to have voiced objections to the pilot project to speak with Keith. In so doing we were well aware that this inclusion would not lead to the type of investment that an all-out participatory process might have inspired, but this was not our aim. Rather, we hoped that some of our colleagues' objections to the pilot course would be allowed to surface from private, corridor conversations to a place where they could be clearly articulated and would become a matter of public record, which would allow us as well as other interested observers a more informed vantage point from which to analyze the institutional context for this project.

On the basis of interviews with faculty members (with expertise in literature and in reading remediation) who had not taught the pilot writing course, Gilyard drew one primary conclusion about opposition to the pilot course: the opposition appeared to derive from flawed interpersonal relationships rather than disagreements about curriculum:

Some faculty members felt that, though they were ready for change, they had not been properly consulted about FIPSE. "Respect" and "disrespect" were words often mentioned. Some resented the autonomy of composition teachers involved with the FIPSE project. And one faculty member bluntly stated that there would be turf wars if there were attempts to expand the FIPSE curriculum within the English Department. (Report 3)

On the basis of interviews with both participating and non-participating teachers, Gilyard commented further on some of the controversy surrounding this project:

Of course I don't pretend that the FIPSE curriculum is without its critics and controversy, or even that my evaluation is noncontroversial. There were concerns expressed, by both FIPSE instructors and those in the English Department at large, about the emphasis on popular culture. There were concerns expressed about the heterogeneity of classrooms relative to writing abilities. Some teachers outside the FIPSE project were not sold on the efficacy of the two-semester arrangement. Some tutors indicated that there was a need for more consistent tutor-instructor expectations. However, these seem to be minor and resolvable problems beside what is a well-conceived program with the potential for enormous impact. (2)

Thus, although some disagreements about the curriculum and the non-tracking nature of this course were expressed, Gilyard did not view these as critical to decisions about institutionalizing the mainstreamed course. More significant were the problematic interpersonal relationships Gilyard referred to, which resulted at least in part from ideological differences in orientation toward remedial students.

With regard to the pilot course itself, Gilyard wrote of a program that "exemplifies leading-edge work in composition studies" (2). Gilyard reported further that "much of what is theorized in the FIPSE documents actually takes place [in classrooms]," that "the group of FIPSE instructors I met with (and also viewed on tape) were all supportive of the curriculum and described in detail the sorts of assignments they had developed . . . in line with the project's mission" (2). As for the students, Gilyard observed that their writing (in student publications) "reflects work that is admirable for its vigor, clarity, and technical command" (2). Gilyard commented as well on high levels of student satisfaction with the pilot course: "The sheer force of student testimony on behalf of the FIPSE project, [and] the inspiring nature of that testimony . . . is not something I could easily dismiss" (3).

On integrating contexts into program evaluations

The institutional context

What Gilyard ultimately reported on were two very different perspectives on the pilot writing course: the view of insiders (e.g., students, tutors, teachers) who offered information and opinions about what went on in classrooms and the view of outsiders (e.g., faculty and administrators) who contributed not just their individual opinions about the mainstreamed writing course but also important forms of evidence about the impact of the institutional environment on writing instruction. Gilyard concluded that "the present climate of the Department, assuming I am characterizing it fairly, is not conducive to constructing the best composition program for students" (4).

Gilyard's ability to analyze the institutional context, which was certainly facilitated by his own prior CUNY experience, was strengthened by his interviews with non-participating faculty who viewed themselves as having an interest in the future of the college's writing program. This inclusion of non-participating faculty among those interviewed was something we had not done when Suzy Groden had visited our campus two years earlier. But because forces within an institution can have such a strong and sometimes unpredictable influence on a writing program, it is clearly advantageous to integrate institutional context into a program evaluation. And in fact, this evaluation component is one of the five in a framework proposed by Witte and Faigley (40). The important question, it seems, is not whether to include institutional context in a program evaluation but how best this might be accomplished. Are descriptions of such contexts enough? Or might it be necessary to involve people from a wide range of campus interests in the development and the evaluation of a writing program?

In this case, a highly controversial new writing course may well have benefited from the involvement of professors and counselors in the pilot project from the ground up as active participants in formulating and developing policy and curriculum. However, we would have had to have considered two distinct disadvantages to this approach: first, the amount of time required to attend meetings, coupled with the time required for more than two people to make decisions and accomplish necessary tasks, is enormous, and administration by committee is an unwieldy process at best; second, as project directors, we would have had to have been prepared to give up a great deal of authority for a writing course curriculum to colleagues whose expertise is in other disciplines.

The cultural context

Even a highly participatory evaluation process could not have offset the stresses posed on this project by forces outside the college. Just as we were completing and distributing our evaluation report, CUNY's Board of Trustees attempted to impose a new graduation policy on Hostos Community College students: passing the CUNY Writing Assessment Test. Because the Board had imposed this policy ex post facto, Hostos students who were affected won their right to graduate in a court of law. This was a short-lived victory, however, because CUNY won its appeal of that judgment. Meanwhile, journalists who had fanned the flames of this controversy succeeded in softening public response to a far more drastic policy that was to come one year later.

Daily News correspondent Russ Buettner wrote a highly editorialized news report about the Hostos College students: "[CUNY Trustee Herman Badillo] called the [writing test] results 'pathetic' and said they point out the failings of the Hostos administration and faculty" ("Few at Hostos Pass HS Level English Exam" 8). Not explained in this article was the fact that the 95% writing test failure Buettner cited involved a testing of 226 students at all levels of academic progress, not just students prepared to graduate from this two-year college.13 An editorial soon followed ("Will Hostos Ever Learn?"), and, when the students won their lawsuit, the Daily News published a second editorial entitled "Hostos Victory a CUNY Failure." These glib journalistic descriptions overshadowed the substantive scholarly work of Marilyn Sternglass, who, during these same months, published a six-year longitudinal study of City College students demonstrating that acquisition of academic writing occurs over time in complex social and economic contexts.

Conflicting interpretations of evaluation results

The results of Janger's statistical analysis paralleled what we had found during this project's first year: students who had placed into remedial writing but enrolled in the pilot course and simultaneously in a humanities course passed that core course at a rate comparable to that of students who had placed into college-level English 110. This finding holds true both for college-level-placed students who chose the established one-semester writing course and for those who volunteered for the two-semester pilot course.

Of the 134 students who placed in remedial writing but selected the pilot course and also enrolled in World Humanities 101, 105 students, or 78%, passed these courses. This percentage is not significantly different from the 82% pass rate (837 of 1,024 students) for students enrolled concurrently in the established one-semester college-level writing course and in the World Humanities 101 course. For students who placed into the one-semester college composition course but opted instead for the two-semester pilot course, the pass rate of students concurrently enrolled in World Humanities 101 was somewhat higher, 87% (97 of 112 students), but this 5% difference is also not statistically significant (Janger 34-37).
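Janger's report states only the pass rates and the verdict "not statistically significant." For readers who wish to check that verdict against the reported counts, a standard two-proportion z-test reproduces it (this is an illustrative check; the report does not specify which test Janger ran):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Remedial-placed pilot students vs. students in the established
# college-level course, both concurrently in World Humanities 101 (78% vs. 82%)
z1, p1 = two_proportion_z(105, 134, 837, 1024)

# College-level-placed students who opted into the pilot course (87% vs. 82%)
z2, p2 = two_proportion_z(97, 112, 837, 1024)

print(p1, p2)  # both p-values fall well above the conventional 0.05 threshold
```

Neither z statistic approaches the 1.96 cutoff for significance at the 0.05 level, which is consistent with Janger's conclusion that the three groups' pass rates are statistically indistinguishable.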

But what do these comparable pass rates mean? My own naive expectation was that this analysis would provide convincing evidence that one score on the CUNY Writing Assessment Test cannot predict a student's success in college courses. For those already ideologically disposed toward questioning remediation testing, the results of this analysis did offer such evidence. For example, as a direct consequence of hearing about this project and its evaluation results, the former chair of our college's psychology department asked me to co-author an editorial essay on the problem of using skills tests to block student progress (Crain and Gleason).

For those not favorably disposed toward placing "remedial" writing students in core curriculum courses, however, the successful pass rates of remedial-placed students in such courses merely called into question what was going on in those classrooms. One influential professor was heard to say many times that such a finding could only mean that students did very little writing in those courses. A more serious concern was voiced by our college's provost, who expressed the view that journalists and other CUNY critics (including, for example, some of CUNY's own trustees) would use the results of this study to argue that City College "lacks standards."14 A two-page editorial assault entitled "CCNY's Fall from Grace" (Buettner), along with other such critiques, could well be cited in support of the provost's position.


Not surprisingly, Janger's finding that remedial-placed students could pass core curriculum courses from which they would ordinarily be barred found little support from our college's administrators, who were certainly not disposed to let students bypass the very writing test for which Hostos Community College administrators had just been publicly excoriated.

A new admissions policy for CUNY's senior colleges: RESOLVED, that all remedial course instruction shall be phased out

In light of what happened next, barring remedial-placed students from six core curriculum courses (which many could pass) appeared only mildly punitive. On May 26, 1998, CUNY's Board of Trustees passed a highly controversial resolution to bar students from CUNY's senior colleges if they failed even one of the three skills tests, including the writing test. Only students who had attended high school outside the United States, a small part of the ESL student population, would be exempt from this system-wide policy.15

The Board's action was trumpeted as a victory for standards, but it is clear that the real purpose of their resolution was considerably less high-minded. Its chief intent is to downsize the university, rendering it more "cost efficient" along the lines that have become a familiar feature of corporate America. Significant decreases in enrollments have already occurred in many CUNY colleges due to newly restrictive policies on availability of remedial courses, a requirement that students pass skills tests by the time they accumulate sixty credits, and admissions. At City College, for example, undergraduate enrollments declined from 11,700 in fall 1993 to 8,863 in fall 1998, a decrease of 2,837 (25%) (City Facts 1998-1999, 33).

Many students who fail one or more skills tests will be permitted to enroll in CUNY's community colleges, where educational costs are held down by higher teaching loads for full-time faculty (27 hours per year for community college faculty versus 21 hours per year for senior college faculty) and a greater reliance on adjunct faculty. In their recent report, David Lavin and Elliot Weininger predict that when the new admissions policy goes into effect, senior college admissions will be lowered by "somewhere between 53% to 44%" (3). They further predict that the policy change will "shift almost 30 percent of whites out of senior institutions . . . [and] more than half of black, Hispanic, and Asian students" (Lavin and Weininger 4).

Not only will fewer students be admitted to the four-year colleges, but fewer will be tempted to try for fear of rejection. Those who are diverted to CUNY's community colleges may be less likely, in the long run, to complete a bachelor's degree (Dougherty, cited in Lavin and Weininger). Among those who will not be allowed directly into CUNY's senior colleges are "remedial" students such as the ones in our project who demonstrated their ability to do college-level work. This impending exclusion of remedial-placed students from CUNY's senior colleges is all the more disturbing in light of the successes of many remedial-placed students in college-level courses that are documented in this project's evaluation. However, this argument is increasingly difficult to make and even more difficult to hear above the din of high profile news reporting that CUNY "lacks standards" and has "failed."16

The Board's action has been vigorously protested in public hearings and in the local press. Among the most eloquent of these protests was an editorial essay by New York Times education editor Brent Staples, who pointed out the benefits accrued by past open admissions students at CUNY (documented by David Lavin and David Hyllegard in their longitudinal study, Changing the Odds). Among other benefits, there were "significant income differences between students who took advantage of open admissions and their peers who did not" (Staples). A more personal plea was made by Frank McCourt, who testified that he had been admitted to New York University on probation in 1953 but would have been excluded from City College then and from all CUNY senior colleges now on the basis of his high school average. After graduating in four years, McCourt went on to become a very successful high school teacher and the Pulitzer Prize-winning author of Angela's Ashes (McCourt A33).

Many CUNY faculty, staff, students, and administrators testified before CUNY's Board of Trustees on the proposal to eliminate remediation at senior colleges. In a 1998 internal report on remediation issues within CUNY, Judith Watson (then consultant to the Board of Trustees) noted that in the fall 1988 cohort of CUNY students there was only a 6% difference in graduation rates (computed over an eight-year period) between those who successfully completed remedial courses at a senior college (42.8%) and those who required no remedial courses (48.2%) (Watson 4).17

In May 1998, I participated in a groundswell of protest against the proposal to eliminate remedial courses from CUNY's senior colleges. A few days prior to the Board's vote (on May 26, 1998), I attended a public hearing and sent a letter to each of the seventeen CUNY Trustees informing them that many remedial-placed students had passed college-level courses at rates comparable to college-level students in the context of a pilot project at City College. I also informed Board members of our profession's official position statement on writing assessment:

If you vote YES on the CAP proposal you will be casting a vote in favor of a policy that rejects the knowledge and the wisdom of professional educators, including professionals in writing assessment. Their stand on the issues has been carefully articulated in a statement that was published in a central professional journal, College Composition and Communication (October 1995): "One piece of writing, even if it is generated under the most desirable conditions, can never serve as an indicator of overall literacy, particularly for high stakes decisions." (432; emphasis mine)

The well-publicized controversy continued in various forms until the day of the vote, when nineteen protestors of the Board's actions were arrested, some for blocking traffic by kneeling in the street in front of CUNY's central offices (Barry; Herbert; Gonzalez).

The lesson is clear. The empirically verifiable account that we were striving for in this evaluation was fatally compromised by the socio-political forces that had gathered around the issue of remediation. Janger's findings that 62% of the remedial-placed students enrolled in the pilot course (380/609) passed that course and that remedial-placed students who enrolled in World Humanities 101 passed at a rate of 78% (105/134) were lost to view in the heat of the controversy. Nevertheless, this evaluation has placed on the record a fact that is there to be seen by those who wish to see it. That fact is best summarized by Matthew Janger: initial writing test scores are "very weakly related to the students' GPAs in subsequent courses" (33).

Acknowledgments
For their responses to earlier drafts of this essay, I would like to thank Richard Haswell, Teresa Purvis, Joseph Harris, Gerri McNenny, Fred Reynolds, Mary Soliday, William Herman, Barbara Comen, and Karl Malkoff. For particularly careful and insightful editing suggestions, I would like to thank Marilyn Cooper. I especially want to thank Edward Quinn for proposing that my analysis of this program evaluation be situated in the larger CUNY context and for responding to several drafts of this essay.

Notes

1. The Board of Trustees' resolution states that "all remedial instruction shall be phased out of all baccalaureate degree programs at the CUNY senior colleges as of the following dates: January 2000 for Baruch, Brooklyn, and Hunter Colleges; September 2000 for Lehman, John Jay, City, The College of Staten Island, and New York City Technical Colleges; and September 2001 for York and Medgar Evers Colleges." Students with some high school education in other countries will be exempt from the placement policies applied to the general population. Two exemptions were later added to the policy: "Students scoring 500 or more on each of the sections (verbal and math) of the Scholastic Aptitude Test" and "students earning at least 75 on pertinent New York State Regents Exams will be exempt from taking CUNY skills tests" (Lavin and Weininger). The Board's first vote was invalidated by a successful lawsuit by City College Professor William Crain charging that the "open meetings rule" had been violated, but the Board voted in support of the policy change again in January 1999.

2. See "Writing Assessment: A Position Statement," prepared by the CCCC Committee on Writing Assessment. See also the debate between Edward White and Alan Purves on timed writing tests. In "An Apologia for the Timed Impromptu Essay Test," White concedes, "There is no debate about portfolios being superior to essay tests in principle; multiple measures are always better than single measures" (38). And in "Apologia Not Accepted," Purves emphasizes this same point, arguing that "the task represented by the impromptu is but one small part of the domain of written composition" (549) and that "a task score within one segment of the domain of writing (exposition, for example) will not predict the task score for argument" (550).

3. CUNY consists of seventeen senior (four-year) and community (two-year) colleges. A third category, the "comprehensive college," awards both associate degrees and baccalaureate degrees. For a perceptive summary of the shifting roles of community colleges on a nationwide basis, see Teresa M. Purvis's review, "The Two-Year Community College: Into the 21st Century."

4. For a cogent discussion of ESL college students and impromptu writing exams, see Marc Ward, "Myths about College English as a Second Language" (The Chronicle of Higher Education, September 26, 1997). See also Barbara Gleason, "When the Writing Test Fails."

5. There were many published responses to Traub's work. For favorable replies, see A. M. Rosenthal's review in the New York Times (2 Oct. 1994) and Alfred Kazin's review in The New York Observer (7 Nov. 1994). For a critical review, see Jon Wiener's

CCC 51.4 / JUNE 2000

"School Daze" in The Nation (7 Nov. 1994) and Irwin Polishook's response to Traub, MacDonald, and Leo (The Clarion, Oct. 1994).

6. For a review of some of the literature that is available, see Baker and Jolly; Hairston; Lindemann; McCormick and McCormick; Presley; White (Developing); and Witte and Faigley (1983).

7. For example, a special form of financial aid was available to students enrolled in remedial courses. By choosing a college-level writing course instead of a remedial course, a student forfeited this particular form of financial aid.

8. Professor Joshua Wilner, City College English Department Chair, provided a great deal of support for the pilot course, especially during its first year. Humanities Dean Paul Sherwin played a key role in the initial phase of the project.

9. More recently Daniel J. Royer and Roger Gilles have reported on a program in which students are required to select either a remedial writing course or a college-level course.

10. Philip DeBlazio of City College created the database for this study.

11. Louise Wetherbee Phelps, who suggested Keith Gilyard as a project evaluator, provided much helpful advice and support during the three-year life of this project.

12. Soon after we contacted him, Matthew Janger moved to Ann Arbor, Michigan, and then served as a private consultant to this project.

13. Hostos College English Department Chair Mary Williams clarified this issue when she spoke at an open admissions "Teach In" at Hostos College on November 7, 1998.

14. City College's provost expressed this view during a meeting of faculty union rep- resentatives and college administrators during spring 1998.

15. Two other exemptions were later added. See note 1.

16. Just as I completed the editing of this essay, a highly publicized mayoral task force headed by Benno Schmidt released its report, The City University of New York: An Institution Adrift (June 7, 1999). The Daily News ran this front page headline on June 6, 1999: "F for CUNY."

17. Judith Watson informed me in March 2000 that she obtained these data from the CUNY Office of Institutional Research, that these data were reported in a February 1998 remediation report entitled Basic Skills and ESL at the City University of New York: An Overview, and that the 42.8% statistic applies to any student who took a basic skills course during the first semester of college work and passed it.


Works Cited

Adams, Peter Dow. "Basic Writing Reconsidered." Journal of Basic Writing 12 (1993): 22-36.

Adelman, Clifford. New College Course Map and Transcript Files: Changes in Course-Taking and Achievement, 1972-1993. U.S. Department of Education Report, 1995. Washington, D.C.: U.S. Government Printing Office.

Arenson, Karen. "With New Admissions Policy, CUNY Steps Into the Unknown." The New York Times 28 May 1998: A1.

Astin, Alexander W. What Matters in College? Four Critical Years Revisited. San Francisco: Jossey-Bass, 1993.

Baker, Tracey, and Peggy Jolly. "The 'Hard Evidence': Documenting the Effectiveness of a Basic Writing Program." Journal of Basic Writing 18 (1999): 27-39.

Barry, Dan. "For Assemblyman, Hearing Turns Into Night in a Jail Cell." The New York Times 28 May 1998: B6.

Bartholomae, David. "The Tidy House: Basic Writing in the American Curriculum." Journal of Basic Writing 12 (1993): 4-21.

Bartholomae, David, and Anthony Petrosky, eds. Facts, Artifacts, and Counterfacts: Theory and Method for a Reading and Writing Course. Upper Montclair, NJ: Heinemann-Boynton/Cook, 1986.

Buettner, Russ. "Few at Hostos Pass HS Level English Exam." Daily News 17 September 1997: 8.

---. "CCNY's Fall from Grace." Daily News 23 November 1997: 28-29.

CCCC Committee on Assessment. "Writing Assessment: A Position Statement." College Composition and Communication 46 (1995): 430-37.

City Facts 1998-1999. City College of New York Office of Institutional Research, 3.

Crain, William, and Barbara Gleason. "Skills Tests Block Opportunity at CUNY." The Knowledge Factory. Official Newsletter of the CCNY chapter of the PSC-CUNY Dec.-Jan. 1997: 57. Rpt. in The New York Amsterdam News 11 Dec.-17 Dec. 1997: 13, 30.

Crowley, Sharon. "A Personal Essay on Freshman English." PreText 12 (1991): 156-76.

Davis, Barbara Gross, Michael Scriven, and Susan Thomas. The Evaluation of Compo- sition Instruction. 2nd ed. New York: Teachers College P, 1987.

Dougherty, Kevin. The Contradictory College: The Conflicting Origins, Impacts, and Futures of the Community College. Albany: SUNY P, 1994.

Elbow, Peter, and Pat Belanoff. "State University of New York at Stony Brook Portfolio-based Evaluation Program." Portfolios: Process and Product. Eds. Pat Belanoff and Marcia Dickson. Portsmouth, NH: Heinemann-Boynton/Cook, 1991. 3-16.

Engelhard, George, Jr., Belita Gordon, and Stephen Gabrielson. "The Influences of Mode of Discourse, Experiential Demand, and Gender on the Quality of Student Writing." Research in the Teaching of English 26 (1992): 315-36.


Gilyard, Keith. Report on the FIPSE Enrichment Approach Pilot Project at The City College of The City University of New York. February 1997.

---. Voices of the Self: A Study of Language Competence. Detroit: Wayne State UP, 1991.

Gleason, Barbara. "Something of Great Constancy: Storytelling, Story Writing, and Academic Literacy." Attending to the Margins: Writing, Researching, and Teaching on the Front Lines. Eds. Michelle Hall Kells and Valerie Balester. Portsmouth, NH: Heinemann-Boynton/Cook, 1999. 97-113.

---. "When the Writing Test Fails: Assessing Assessment at an Urban College." Writing in Multicultural Settings. Eds. Carol Severino, Juan C. Guerra, and Johnella E. Butler. New York: MLA, 1997. 307-24.

Gleason, Barbara, and Mary Soliday. The City College Writing Program: An Enrichment Approach to Language and Literacy. FIPSE Application Proposal No. P116A30689. 3 March 1993.

---. The City College Writing Program: An Enrichment Approach to Language and Literacy: Three Year Pilot Project, 1993-1996, Final Report. FIPSE Grant No. P116A30689. May 1997.

Gonzalez, David. "History Moves a Professor to Protest." The New York Times 30 May 1998: B1.

Grego, Rhonda, and Nancy Thompson. "Repositioning Remediation." College Composition and Communication 47 (1996): 62-84.

Groden, Suzy Q. Evaluation Report of "An Enrichment Approach to Language and Learning" at The City College of The City University of New York: Year One of a Three-Year FIPSE Pilot Project. November 1994.

Groden, Suzy Q., Eleanor Kutz, and Vivian Zamel. "Students as Ethnographers: Investigating Language Use as a Way to Learn to Use Language." The Writing Instructor 6 (1987): 132-140.

Guba, Egon G., and Yvonna S. Lincoln. Fourth Generation Evaluation. Newbury Park, CA: Sage, 1989.

Hairston, Maxine. "What Freshman Directors Need to Know about Evaluating Writing Programs." WPA: Writing Program Administration 3 (1979): 11-16.

Heath, Shirley Brice. Ways with Words: Language, Life, and Work in Communities and Classrooms. New York: Cambridge UP, 1983.

Herbert, Bob. "Cleansing CUNY." The New York Times 28 May 1998: A29.

"Hostos Victory, a CUNY Failure." Daily News 16 July 1997: 32.

Janger, Matthew. A Statistical Analysis of Student Progress and Achievement in the Pilot Writing Project at City College of New York. May 1997.

Kutz, Eleanor, Suzy Q. Groden, and Vivian Zamel. The Discovery of Competence: Teaching and Learning with Diverse Student Writers. Portsmouth, NH: Heinemann-Boynton/Cook, 1993.

Lavin, David E., Richard D. Alba, and Richard A. Silberstein. Right Versus Privilege: The Open Admissions Experiment at the City University of New York. New York: Free P, 1981.

Lavin, David E., and David Hyllegard. Changing the Odds: Open Admissions and the Life Chances of the Disadvantaged. New Haven: Yale UP, 1996.

Lavin, David E., and Elliot Weininger. New Admissions Policy & Changing Access to CUNY's Senior and Community Colleges: What Are the Stakes? Prepared for Higher Education Committee, The New York City Council. CUNY Graduate School and University Center. May 1999.

Leo, John. "A University's Sad Decline." U.S. News and World Report 15 August 1994: 20.

Lindemann, Erika. "Evaluating Writing Programs: What an Outside Evaluator Looks For." WPA: Writing Program Administration 3 (1979): 17-24.

MacDonald, Heather. "Downward Mobility: The Failure of Open Admissions at City University." City Journal Summer 1994: 10-20.

McCormick, Frank, and Chris McCormick. "The Basic Writing Course at Eastern Illinois University: An Evaluation of Its Effectiveness." WPA: Writing Program Administration 10 (1986): 61-65.

McCourt, Frank. "Hope and Education." The New York Times 21 May 1998: A33.

Otheguy, Ricardo. The Condition of Latinos in the City University of New York: A Report to the Vice Chancellor for Academic Affairs and to the Puerto Rican Council on Higher Education. June 1990.

Pattison, Robert. On Literacy: The Politics of the Word from Homer to the Age of Rock. New York: Oxford UP, 1982.

Phelps, Louise Wetherbee. "The Institutional Logic of Writing Programs: Catalyst, Laboratory, and Pattern for Change." The Politics of Writing Instruction: Postsecondary. Eds. Richard H. Bullock and John Trimbur. Portsmouth, NH: Heinemann-Boynton/Cook, 1991. 155-70.

Presley, John. "Evaluating Developmental English Programs in Georgia." WPA: Writing Program Administration 8 (1984): 47-56.

Purves, Alan C. "Apologia Not Accepted." College Composition and Communication 46 (1995): 549-51.

Purvis, Teresa M. "The Two-Year Community College: Into the 21st Century." College Composition and Communication 46 (1995): 557-65.

Rodby, Judith. "Revising a First-Year Writing Program: Cultural Studies Workshops Replace Basic Writing." Conference on College Composition and Communication, Washington D.C., March 1995.

Rose, Mike. "Remedial Writing Courses: A Critique and a Proposal." College English 45 (1983): 109-28.

---. "The Language of Exclusion: Writing Instruction at the University." College English 47 (1985): 341-59.

Royer, Daniel J., and Roger Gilles. "Directed Self-Placement: An Attitude of Orientation." College Composition and Communication 50 (1998): 54-70.

Shaughnessy, Mina P. "Open Admissions and the Disadvantaged Teacher." College Composition and Communication 24 (1973): 401-04.

Soliday, Mary. "Translating Self and Difference through Literacy Narratives." College English 56 (1994): 511-26.

Soliday, Mary, and Barbara Gleason. "From Remediation to Enrichment: Evaluating a Mainstreaming Project." Journal of Basic Writing 16 (1997): 64-78.

Staples, Brent. "Blocking Promising Students from City University." The New York Times 26 May 1998: A20.

Sternglass, Marilyn S. Time to Know Them: A Longitudinal Study of Writing and Learning at the College Level. Mahwah, NJ: Lawrence Erlbaum, 1997.

Traub, James. "Annals of Education: Class Struggle." The New Yorker 20 September 1994: 76-90.

---. City on a Hill: Testing the American Dream at City College. Reading, MA: Addison-Wesley, 1994.


Ward, Marc. "Myths about College English as a Second Language." Chronicle of Higher Education 26 September 1997.

Watson, Judith. CUNY Remediation/ESL Backgrounder. CUNY Board of Trustees Office. Undated manuscript.

White, Edward M. "An Apologia for the Timed Impromptu Essay Test." College Composition and Communication 46 (1995): 30-45.

---. Developing Successful College Writing Programs. San Francisco: Jossey-Bass, 1989.

Wiener, Jon. "School Daze." Review of City on a Hill by James Traub. The Nation 7 November 1994: 522.

"Will Hostos Ever Learn?" 22 September 1997: 24.

Witte, Stephen, and Lester Faigley. Evaluating College Writing Programs. Carbondale: Southern Illinois UP, 1983.

Barbara Gleason

Barbara Gleason is Associate Professor at the City College of New York, where she teaches in the English Department and at the Center for Worker Education. She has published several essays on basic writing, teaching returning adults, and writing course curriculum. She has co-edited, with Mark Wiley and Louise Wetherbee Phelps, Composition in Four Keys: Inquiring into a Field: Nature, Art, Science, and Politics (Mayfield 1995) and, with Faun Bernbach Evans and Mark Wiley, Cultural Tapestry: Readings for a Pluralistic Society (HarperCollins 1992).
