Rationale for Part A:
First implemented in 1999, the Upper Division Writing Proficiency Assessment (hereafter UDWPA) was designed to assess student preparedness for writing in upper-division courses. In 2010, the ASCRC Writing Committee carefully reviewed the UDWPA and determined that it provides neither accurate information about student preparedness nor useful feedback to inform curricular revisions. The Writing Committee delivered a report to ASCRC on April 20, 2010, and ASCRC asked that the Writing Committee recommend a specific course of action. On May 3, 2011, the ASCRC Writing Committee recommended ending the UDWPA and creating a university-wide program-level assessment in its place. ASCRC endorsed the recommendation and asked that a pilot project determine the feasibility of a university-wide program-level assessment. The pilot project has been in operation for two years.
Rationale for Part B:
Program-level assessment is a proven method for studying student writing performance in the context of university courses and for providing curricular and instructional feedback. Program-level assessment does not involve a re-grading of student papers or an assessment of individual instructor teaching. It is a mechanism to gather data and identify patterns of strengths and areas for improvement.
To implement program-level assessment at the University of Montana, instructors in Approved Writing Courses will ask students to submit papers to Moodle. All identifying student and instructor information will be removed. Volunteer faculty, staff, and graduate students will score a random selection of papers using a rubric. Data will be gathered, analyzed, and reported.
In order to determine the feasibility of this program-level assessment of student writing, the ASCRC Writing Committee, in collaboration with Academic Affairs, the Composition Program, and the Writing Center, created and piloted an assessment of student writing sampled from lower-division Approved Writing Courses. In Spring 2012 and Spring 2013, the ASCRC Writing Committee hosted a Writing Retreat in which volunteer faculty, staff, and graduate students scored a random sampling of student papers from these courses. Scorers from multiple disciplines read and evaluated papers using a scoring rubric based on the learning outcomes of Approved Writing Courses. Scorers evaluated the papers, discussed the strengths and weaknesses of the papers, and shared their instructional experiences and techniques. The primary goals of the retreat were (1) to discover how the training/scoring process supports scorers’ ability to assess student papers with validity, reliability, and efficiency; and (2) to consider the feasibility and sustainability of an annual program-level assessment of student writing proficiency in Approved Writing Courses.
The evaluations from these Writing Retreats (2012 and 2013) and the feedback on the pilot project were positive. Because the scoring rubric aligns with the learning outcomes of Approved Writing Courses* and because students demonstrate their proficiency in the context of courses, the Writing Committee strongly believes this program-level assessment of student writing will provide accurate, timely, and useful information to instructors of Approved Writing Courses and to academic programs. In addition, the program-level assessment pilot generated focused conversations about student writing and faculty instruction, thereby establishing a rich context for further work. The Writing Retreat has the potential to serve as an important professional development experience that encourages conversation about writing instruction, assessment, and curriculum and that promotes shared responsibility across disciplines.
For examples of successful university-wide program-level writing assessment models, visit the Council of Writing Program Administrators Assessment Gallery (http://wpacouncil.org/assessment-models). Data from similar types of program-level assessments have shown improvement in student performance over time. A local example is the Montana University System Writing Assessment (MUSWA), which began in 2000 to study college composition readiness among Montana high school students; student proficiency rose from 38% in 2001 to 75% by 2011. (http://mus.edu/writingproficiency/newsletter37.pdf)
If the Faculty Senate approves this motion, the Provost’s Office will work with the Office for Student Success (specifically the Writing Center Director) to provide the funds to implement the assessment.
The Writing Committee recognizes that for university-wide program-level assessment of student writing proficiency to be successful, several steps are critical. These include:
1) cooperation from Approved Writing Course instructors to require students to submit papers for the program-level assessment;
2) voluntary participation of Writing Committee members, Writing Center staff, and faculty and staff interested in student writing proficiency and writing instruction; and
3) at least one annual Writing Retreat at which volunteer faculty, staff, and graduate students read and score student writing from Approved Writing Courses.
The Writing Committee also recommends that a staff member be tasked with gathering student writing submissions from Approved Writing Courses, coordinating logistics for the Writing Retreat, analyzing data, and reporting on the findings of the assessment.
The ASCRC Writing Committee will continue to participate and advise in the promotion, organization, and conduct of the university-wide program-level assessment of student writing proficiency. The Writing Committee will report the writing assessment results annually to ASCRC and other appropriate audiences.
* Throughout, the term Approved Writing Course refers to general education writing courses that are not specifically listed as part of the Upper-Division Writing Requirement of a major. Many Approved Writing Courses are lower-division.
The University of Montana
Missoula, MT 59812