
Writing Committee Annual Report 2008-2009

 

Faculty Members

Nancy Hinman, Geosciences (Chair), 2010
Kate Zoellner, Mansfield Library, 2011
Heather Bruce, English, 2009
Jean Carter, Pharmacy Practice, 2009
Mark Medvetz, Writing Studies, 2011
Matthew Semanoff, MCLL, 2011
Kathy Kuipers, Sociology, 2010

Student Members

Siri Smillie, ASUM Vice President


Additional Representatives (Ex-Officio)

Arlene Walker-Andrews, Associate Provost

David Micus, Registrar

Kelly Peterson, Director, Writing Center

Kathleen Ryan, Director, Composition Program

Business Items

  • Writing course guidelines, catalog language, and frequently asked questions were presented to the Faculty Senate 10/9/08 and approved 11/13/08 [links to documents]
    • Aligned catalog language with English Placement and Composition information
    • Consulted with Provost Engstrom regarding the guidelines on 10/13/08
    • Additional information literacy instructions and examples were linked to the document as a result of comments from the Faculty Senate and Provost
  • Review of new writing courses - Fall
    • One writing course and four upper-division writing courses in the major were approved.
  • Review of all writing courses according to the revised guidelines - Spring
    • Created writing review forms (writing course and upper-division writing expectation) [links]
    • Deadline memo [link]
    • The committee reviewed courses in three work groups; follow-up communication was required for several courses.  In all, 39 writing courses, 84 upper-division writing courses, and 6 distributed models were approved; three courses were withdrawn, and one upper-division writing course was not approved.
    • Guests were invited to discuss PSC 400, a one-credit add-on.
    • The Department of Sociology submitted a matrix for its SOC 488, Writing in Sociology.
  • Requested a transition year (memo to ASCRC) [link]
    • 17 courses still being taught under the old writing guidelines were identified that had not been submitted for review under the new guidelines
  • Requested upper-division writing requirement catalog language (3/27/09); received responses from 12 departments
  • Report from the Writing Center regarding the UDWPA (appended)
    • Passing rate history
    • Changes to grading and text selection

_____________________________________________________________

 

The University of Montana-Missoula

Intra-campus Memorandum

TO:      Associate Provost Walker-Andrews, Provost's Office
         Sharon O'Hare, Director, The Office for Student Success
         ASCRC Writing Subcommittee

FROM:    Kelly Peterson, Director, The Writing Center

DATE:    May 27, 2009

RE:      UDWPA Report

Attached please find a report outlining the validation of, recent evolution of, and passing rates for the Upper Division Writing Proficiency Assessment.  A version of this report was submitted Autumn 2008 at the request of ECOS.  This updated report is submitted to the ASCRC Writing Subcommittee in order to facilitate the committee's charge of providing oversight for the administration of the UDWPA.

This report is submitted in the interest of transparency and with the hope that all stakeholders will accept an invitation to discuss the UDWPA's design, administration, and role in the writing requirement sequence at The University of Montana. 

Please do not hesitate to contact me with any questions.


Upper Division Writing Proficiency Assessment: 

Validation, Recent Evolution, and Passing Rates

 

April 27, 2009

 

Submitted by:

Kelly Peterson, Director of The Writing Center

 

Introduction

During the 2007-2008 academic year, in an effort to perform ongoing evaluation and fine-tuning of the Upper Division Writing Proficiency Assessment (UDWPA), The Writing Center analyzed issues hampering UDWPA success, both for proficient writers and for students in need of remedial instruction.  As a result, the test administrator focused efforts on two aspects of the UDWPA: appropriate and consistent text selection, and targeted UDWPA student tutoring.  The criteria used to score the UDWPA were not revised and continue to be used in scoring the exams.

In the context of a discussion on the factors that may influence passing rates, it is important to acknowledge that there seems to be no campus consensus on what qualifies as an acceptable passing rate for the UDWPA.  Additionally, because validation of an assessment vehicle must be ongoing, the test administrator and the ASCRC Writing Subcommittee plan to launch a formal investigation into the UDWPA's validity, examining the degree to which the assessment instrument tests what it intends to test:  student preparedness to write for upper-division coursework.

Attached, please find the recent data on passing rates broken down by semester, by attempts required by students who have passed, by attempt number, and by individual exams.  Please also find the scoring criteria and two packets of UDWPA exams.  The UDWPA exam packets include samples of selected exams for comparison:  those that resulted in lower passing rates and those that resulted in higher passing rates, including recent exams. 

Validation

An evaluation of passing rates must also include a communal inquiry into the overall administration and validity of the UDWPA.  The test administrator invites all stakeholders into this conversation, hoping to contextualize the passing rates in an investigation of the UDWPA's design and role in the sequence of general education requirements meant to ensure writing proficiency at The University of Montana.  Although the UDWPA was created to be a collaborative and local assessment of students' writing proficiency, there is no longer formalized, campus-wide responsibility for the design, administration, scoring, and evaluation of the assessment tool.  This fact makes it difficult for the UDWPA to adapt to and reflect the local needs and expectations of campus stakeholders and to ensure its logical position in a curriculum designed to help students develop as writers. 

Acknowledging that the exam is no longer collaborative is critical: validation of the exam depends, in part, on faculty consensus that it is a fair measure of upper-division preparedness, which in turn requires that the design of the exam begin with a shared definition of upper-division preparedness.  Currently, the UDWPA assesses what its originators saw as the salient features of undergraduate writing: the ability to craft a thesis-driven argument on an issue under consideration and to sustain a position in response to others' thinking.  Because the UDWPA is now designed, administered, and isolated in The Writing Center, the degree to which the campus community continues to agree that these are the salient features of undergraduate writing is unknown.  Any legitimate assessment instrument should be collaboratively designed to ensure this consensus and should be contextually situated in the University's local curriculum.  That is, valid writing assessment requires alignment between what is being tested and the intended goals and objectives of the curriculum.

If, in the course of discussing the exam's validity, the campus community reaches consensus on the definition of undergraduate preparedness to write for upper-division coursework, the question of what constitutes an appropriate assessment instrument design remains.  The current instrument relies on a single writing sample, written under a time constraint, to measure student ability; stakeholders must therefore determine whether a single timed sample can represent a student's ability.  Even if stakeholders agree on both what the exam is intended to test and how that testing takes place (the test vehicle), a validity issue remains: students have completed an average of 97.3 credits at the time of their attempt.  In effect, the UDWPA is not being used as a mid-career assessment but as a costly exit exam, which makes it difficult to interpret passing rates as any indication of mid-career ability.

It is important to keep in mind the fact that the UDWPA does nothing to improve students' writing competencies.  The UDWPA is not a formative assessment, and as such, it does not contribute to students' development as writers.  Rather, the UDWPA is a summative assessment intended to gate rather than guide.  The question of whether the campus community wants a formative or a summative assessment is at the heart of the validity discussion.

Finally, the validity discussion must take into consideration the fact that the UDWPA has become the exclusive responsibility of The Writing Center (TWC).  The financial responsibility for administering this General Education Requirement belongs to TWC alone.  The total cost for administering, scoring, and scheduling the exam (including personnel and room rental costs) is between $12,000 and $15,000 during a single academic year.  This amount is equivalent to TWC's entire budget for tutoring and other writing across the curriculum activities, presumably the services that are at the heart of TWC's identity.

Not only does this financial burden strain TWC's ability to deliver the tutoring and writing across the curriculum services it is committed to providing, but the campus community's frequent conflation of TWC with the UDWPA also creates a significant dilemma.  As a space dedicated to promoting students' development as writers and to supporting the growth of writing across the curriculum through faculty support, TWC exists to act in a supportive capacity and to bolster retention efforts by offering a necessary space for collaborative learning.  The UDWPA shifts TWC's identity in students' minds toward a site of high-stakes evaluation and, in some cases, the largest obstacle to graduation, two perceptions that contradict TWC's stated mission.  While students should perceive TWC as a place where they can receive help in preparing for the UDWPA, they should not view it as the one unit on campus responsible for requiring the UDWPA and, in some cases, for delaying their graduation.

Therefore, an evaluation of recent passing rates should be an entry into a larger conversation on the UDWPA in order to determine whether it remains a meaningful and responsive assessment of student writing proficiency in The University of Montana's academic context.  This includes a communal look at whether the assessment criteria continue to reflect faculty expectations, whether there is appropriate collaboration in the design and evaluation of the test vehicle, and whether the test results are in fact being used to improve curriculum and instruction as the originators intended.

Text Selection

In Autumn 2007, UDWPA data from previous tests showed that the tests with higher passing rates used texts that presented a direct and broadly accessible argument, while the tests with lower passing rates used texts that contained only a subtle or implied argument, often combined with a sophisticated literary style.  A particular exam is reliable insofar as it is designed according to key features that remain consistent across exams.  When Henrietta Goodman took over responsibility for choosing texts and composing prompts for the UDWPA in October 2007, she worked to select texts consistently according to established selection criteria, preserving the integrity of the exams and of the scoring criteria.

Both the past texts associated with higher passing rates and the more recently selected texts adhere strictly to the following selection criteria:

  • The text is a published, self-contained essay or excerpt.
  • The text presents a direct argument on a topic accessible to undergraduates in any major, requiring no special expertise or knowledge to craft a response based on observation, reasoning, or experience.
  • The text uses language and style accessible to undergraduates in any major.
  • The text does not rely on cultural knowledge or linguistic devices that would be difficult for a non-native speaker of English to comprehend for the purpose of writing a responsive essay.

In effect, during the 2007-2008 Academic Year, an effort was made to eliminate inconsistencies in text selection.  The texts that most often resulted in lower passing rates were inconsistent with the selection criteria: they did not present accessible, direct arguments and/or tended to be more literary in nature, so their topics, language, or style were sometimes inaccessible to undergraduates outside particular majors.

The effort to select texts that present a direct argument on an accessible topic using appropriate language and style has not been made at the expense of the invitation for students to read and respond critically in an academic context.  Students' ability to read critically continues to be a primary requirement for UDWPA success.  This recent text selection effort suggests that students perform best when the text models the argumentative style and strategies they are expected to use. If students clearly understand the text, they are better able to construct specific thesis statements that respond directly to the prompt, and they are better able to express their own argument through the use of specific examples and logical reasoning.

It is noteworthy that UDWPA text selection has become the exclusive responsibility of The Writing Center.  This is in contrast to the more communal approach originally intended in which a committee of faculty members from across campus participated in final selection of the UDWPA text for each exam, ensuring that this aspect of the test vehicle is responsive to faculty expectations.

Targeted Tutoring

Many students postpone the UDWPA until well after 70 credits, and not all students pass the exam on the first attempt. An unsuccessful first attempt is sometimes due to lack of preparation, a problem which proficient writers can usually correct in the second attempt. A student who makes two or more unsuccessful attempts, however, often is in need of one-on-one instruction. Thus, during the past year The Writing Center (TWC) has attempted to identify and work with students who have taken the UDWPA unsuccessfully more than twice.

To this end, TWC began to query the UDWPA database to produce a report listing the students who failed the UDWPA more than twice.  This report has enabled TWC to flag these students in order to encourage them to engage in one-on-one tutoring at TWC.  Whenever possible, tutors at TWC refer these students to Henrietta Goodman, and she works to provide understandable explanations of the strengths and weaknesses evident in their writing, combined with guided preparation for the next exam. Often, these sessions involve instruction in two areas: composing a responsive thesis statement and employing specific and relevant supporting material.

While the effectiveness of this intensive, individualized tutoring over the past year suggests that targeting and assisting students who are struggling to pass the UDWPA is a necessary aspect of successful test administration, the test administrator cannot isolate the variables to demonstrate conclusively whether this tutoring explains the recent higher passing rates.  The UDWPA database allows TWC to determine the passing-rate percentages of first, second, third, etc. test takers on a single exam; however, the database does not track which of these students received targeted UDWPA tutoring.  Even so, it is clear that targeted tutoring offers struggling students the one-on-one, tailored writing instruction they often need as they prepare for the UDWPA and as they complete writing tasks across the curriculum.

It is notable that Autumn 2001 - Spring 2005 numbers show that 90.5% of students who passed the UDWPA did so on their first or second attempt.  This percentage remained at 90% for Autumn 2005 - Spring 2008.

In terms of students who attempted the UDWPA multiple times, data collected on recent exams show higher passing rates for students who have taken the exam two or more times.  During the 2006-2007 Academic Year, 63% of students taking the exam two or more times passed.  This percentage rose to 78% during the 2007-2008 Academic Year.  The following tables show how these averages break down by first-, second-, third-, fourth-, and fifth-time test takers:

Summary of student performance by attempt, Autumn 2006 - Spring 2007.

Attempt Number    Total Attempting    Passing    % Passing
2                 507                 317        63%
3                 188                 124        66%
4                 60                  40         67%
5                 28                  16         57%

Summary of student performance by attempt, Autumn 2007 - Spring 2008.

Attempt Number    Total Attempting    Passing    % Passing
2                 332                 256        77%
3                 99                  79         80%
4                 27                  19         70%
5                 12                  12         100%

While these figures show that the 2007-2008 Academic Year saw a rise in passing rates for students attempting the UDWPA two or more times, we cannot interpret the data as demonstrating a direct correlation between targeted UDWPA tutoring and higher passing rates, even though the figures may suggest a positive influence.  The total number of students attempting the exam a second, third, fourth, and fifth time during the 2007-2008 year is less than the same total for the 2006-2007 year.  Also, the higher passing rate for the 2007-2008 Academic Year may be due to the renewed efforts to select appropriate texts, as described above.  The test administrator remains cautious in interpreting the data, as a number of variables affect passing rates on a given exam.
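As a quick arithmetic check on the figures above, the by-attempt and combined passing percentages can be recomputed from the raw counts.  This is only a sketch: the counts are transcribed from the tables above, and the variable and function names are my own.

```python
# Counts transcribed from the by-attempt tables above:
# attempt number -> (total students attempting, number passing).
ay_2006_07 = {2: (507, 317), 3: (188, 124), 4: (60, 40), 5: (28, 16)}
ay_2007_08 = {2: (332, 256), 3: (99, 79), 4: (27, 19), 5: (12, 12)}

def pct(total, passing):
    """Passing rate as a whole percent, rounded to the nearest integer."""
    return round(100 * passing / total)

def combined(year):
    """Overall passing rate across all second-and-later attempts."""
    total = sum(t for t, _ in year.values())
    passing = sum(p for _, p in year.values())
    return pct(total, passing)

for label, year in (("2006-2007", ay_2006_07), ("2007-2008", ay_2007_08)):
    rows = ", ".join(f"attempt {a}: {pct(t, p)}%" for a, (t, p) in year.items())
    print(f"{label}: {rows}; combined {combined(year)}%")
```

The combined figures come out to 63% for 2006-2007 and 78% for 2007-2008, matching the year-level percentages reported above.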

Conclusion

The 2007-2008 Academic Year saw a renewed effort to consistently select appropriate UDWPA texts based on clear selection criteria, criteria that ensure undergraduates will encounter a text that presents a direct argument on an accessible topic using appropriate language and style.  This renewed effort has not compromised the requirement that students read and respond critically in an academic context.   The 2007-2008 Academic Year also saw a pointed effort to identify and tutor those students who had failed the UDWPA more than twice, presenting those students with an opportunity for tailored, specific, one-on-one writing tutoring.  Combined, these two efforts may have resulted in higher passing rates.

However, without the ability to isolate the text selection and targeted tutoring variables, the degree to which these two efforts have resulted in higher passing rates is uncertain.  A third factor that may have affected UDWPA passing rates is the June 2007 implementation of a two-tiered scoring method that requires a second reading only of student essays with low, borderline, and high scores.  Previously, all essays were read twice; now, clearly passing but not exceptional essays are read once while essays with low, borderline, and high scores are read twice.  Nancy Mattina, the former Director of The Writing Center, implemented this scoring method in order to streamline the scoring process while continuing to use the existing scoring criteria.  Presumably, she felt comfortable implementing this new scoring procedure because of consistent high inter-rater reliability and because of the financial savings it would mean.  The new scoring procedure will be examined as a part of the ASCRC Writing Subcommittee's general investigation into the exam's validity.

Without campus-wide consensus about what might be an acceptable passing rate for the UDWPA and without cross-campus collaboration on text selection, prompt crafting, and exam scoring, it remains difficult to ensure that the UDWPA is a locally designed assessment instrument that is reflective of faculty expectations and responsive to the local context.  It is the test administrator's opinion that any evaluation of passing rates needs to include a communal look at the reality that the exam has become the exclusive responsibility of TWC, creating a situation which limits the test administrator's ability to perform authentically collaborative writing assessment that responds to the local expectations of faculty, administrators, and students.

Upper Division Writing Proficiency Assessment

Passing Rates as of April, 2009

 

 

Summary of student performance by semester, Autumn 2002 - Spring 2009.

Semester    WPA attempts    WPA passes    WPA fails    % passing
A '02       572             295           277          51.5
S '03       697             474           223          68.0
A '03       1,665           1,076         589          64.6
S '04       537             285           252          53.0
A '04       985             550           435          55.8
S '05       1,654           904           750          54.6
A '05       922             611           311          66.2
S '06       1,649           1,052         597          63.7
A '06       887             602           285          67.8
S '07       1,463           943           520          64
A '07       764             596           168          78
S '08       1,338           1,166         172          87.1
A '08       731             592           139          80.9
S '09*      1,049           862           187          82.1

*Does not include June 2009.
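A minimal consistency check on the semester summary can be run directly on the published counts.  This is a sketch only: the tuples are transcribed from the table above, and the names are my own.  It confirms that passes and fails sum to the attempts in every semester, and that each published percentage agrees with passes divided by attempts to within rounding.

```python
# (attempts, passes, fails, published % passing) per semester,
# transcribed from the summary table above.
semesters = {
    "A '02": (572, 295, 277, 51.5),
    "S '03": (697, 474, 223, 68.0),
    "A '03": (1665, 1076, 589, 64.6),
    "S '04": (537, 285, 252, 53.0),
    "A '04": (985, 550, 435, 55.8),
    "S '05": (1654, 904, 750, 54.6),
    "A '05": (922, 611, 311, 66.2),
    "S '06": (1649, 1052, 597, 63.7),
    "A '06": (887, 602, 285, 67.8),
    "S '07": (1463, 943, 520, 64.0),
    "A '07": (764, 596, 168, 78.0),
    "S '08": (1338, 1166, 172, 87.1),
    "A '08": (731, 592, 139, 80.9),
    "S '09": (1049, 862, 187, 82.1),
}

for name, (attempts, passes, fails, published) in semesters.items():
    assert passes + fails == attempts, name    # counts are internally consistent
    rate = 100 * passes / attempts
    # published figures carry mixed precision, so allow half a point of slack
    assert abs(rate - published) < 0.5, name
```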

________________________________________________________________________

Summary of attempts required by students who have passed the WPA, Autumn 2005 - Spring 2008.*

Passed on Attempt #    Passing Students (n = 4860)    %
1                      3452                           71
2                      907                            19
3                      321                            7
4                      115                            2
5-9                    65                             1

*Does not include June '08 WPA data.
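The report's earlier statement that roughly 90% of passing students succeeded on their first or second attempt (Autumn 2005 - Spring 2008) can be verified directly from this distribution.  A sketch only; the counts are transcribed from the table above and the names are my own.

```python
# Attempts-required distribution for passing students, Autumn '05 - Spring '08,
# transcribed from the table above: attempt number -> passing students.
passed_on = {"1": 3452, "2": 907, "3": 321, "4": 115, "5-9": 65}
n = sum(passed_on.values())    # 4860, matching the table's stated n

first_or_second = passed_on["1"] + passed_on["2"]
share = 100 * first_or_second / n
print(f"{share:.1f}% passed on the first or second attempt")  # about 89.7%, i.e. ~90%
```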

________________________________________________________________________

 

Summary of student performance by attempt number, Autumn 2006 - Spring 2007.

Attempt Number    Total Attempting    Passing    % Passing
2                 507                 317        63%
3                 188                 124        66%
4                 60                  40         67%
5                 28                  16         57%

Summary of student performance by attempt number, Autumn 2007 - Spring 2008.

Attempt Number    Total Attempting    Passing    % Passing
2                 332                 256        77%
3                 99                  79         80%
4                 27                  19         70%
5                 12                  12         100%

The Writing Center has created a database of students who have failed the WPA several times, allowing these students to be contacted for individual help.  Tracking these students enables The Writing Center to identify those students in need of assistance and to provide them with targeted WPA feedback.

________________________________________________________________________


Summary of performance by WPA exam date, Autumn 2005 - Spring 2009.

2005-2006 AY      Total #    fail     pass     % pass
1.  9/23/05       388        159      229      59%
2.  10/22/05      534        152      382      72%
3.  2/11/06       547        214      333      61%
4.  3/10/06       433        130      303      70%
5.  4/15/06       462        168      294      64%
6.  6/24/06       207        85       122      59%
    Total         2,571      908      1,663    65%

2006-2007 AY      Total #    fail     pass     % pass
1.  9/22/06       437        194      244      56%
2.  10/21/06      517        92       425      82%
3.  2/10/07       449        168      281      63%
4.  3/9/07        320        129      191      60%
5.  4/14/07       463        171      292      63%
6.  6/23/07       231        52       179      77%
    Total         2,417      806      1,612    67%

2007-2008 AY      Total #    fail     pass     % pass
1.  9/21/07       315        82       233      74%
2.  10/20/07      449        86       363      81%
3.  2/9/08        467        82       385      82%
4.  3/15/08       410        34       376      92%
5.  4/12/08       251        29       222      88%
6.  6/21/08       210        27       183      87%
    Total         2,102      340      1,762    83.8%

2008-2009 AY      Total #    fail     pass     % pass
1.  9/19/08       370        63       307      83%
2.  10/25/08      361        76       285      79%
3.  2/7/09        446        100      346      78%
4.  3/14/09       300        62       238      79%
5.  4/11/09       303        25       278      92%
6.  6/27/09       -----      -----    -----    -----
    Total         -----      -----    -----    -----


UDWPA Evaluative Criteria/Scoring Rubric

 

Responsiveness (criterion 1); Development (criteria 2-3); Organization (criterion 4); Language (criteria 5-6); Mechanics (criterion 7).

 

Score = 5

1. Responds appropriately, with a nuanced understanding of the text and question.

2. Has a sophisticated, unified thesis that is thoroughly supported.

3. Develops ideas logically with control, clarity, and precision.

4. Has an obvious organization that guides the reader through the essay.

5. Displays care and skill in word choice and sentence structure.

6. Has an appropriate, consistent voice.

7. Uses grammar and mechanics correctly.

Score = 4

1. Responds appropriately, with a clear understanding of the text and question.

2. Has a unified thesis that is supported with details or specifics.

3. Develops ideas logically and clearly.

4. Has an obvious organization marked by transitional words and phrases.

5. Displays competency in word choice and sentence structure.

6. Has an appropriate, consistent voice.

7. Most grammar and mechanics are correct.

Score = 3

1. Responds appropriately, with a sufficient understanding of the text and question.

2. Has a single thesis that is supported by some evidence or details.

3. Develops ideas logically.

4. Has a purposeful organization.

5. Displays adequate word choice and sentence structure.

6. Has an appropriate, consistent voice.

7. Most grammar and mechanics are correct.

Score = 2

1. Responds with partial or unfocused understanding of the text and question.

2. Has a single thesis that is trite or unsupported by evidence or details.

3. Develops ideas with minimal logical consistency or relevance.

4. Uses some organizational tactics.

5. Displays imprecise word choice or awkward sentence structure.

6. Has a voice that is inappropriate or inconsistent.

7. Grammatical or mechanical errors are commonplace.

Score = 1

1. Does not respond to the text and question.

2. Lacks a single thesis.

3. Does not develop ideas logically or in any detail.

4. Has generalized problems with unity, organization, and focus.

5. Displays imprecise word choice or awkward sentence structure.

6. Has a voice that is inappropriate or inconsistent.

7. Grammatical or mechanical errors are commonplace.


WPA Exams with a Lower Passing Rate

(attached)

2/12/05   Didion   5?%

6/24/06   Lawrence   59%

9/22/06   Ascher   56%

2/10/07   Dillard   63%

WPA Exams with a Higher Passing Rate

(attached)

10/21/06   Bird   82%

6/23/07   Moberg   77%

9/21/07   Thomas   74%

10/20/07   Forster   81%

2/9/08  Barthes   82%

3/15/08  Trippett   92%

4/12/08   Goldberg   88%

6/21/08   Cousins   87%