Assessing 2010

[Image: 2010 ad. Image credit: COE 2010 project]

As the first year of the 2010 initiative comes to a close, the COE has seen the introduction of innovative and exciting projects - new courses, service programs, collaborations between departments and universities, certificate proposals, and even a new newsletter (this one). In interviewing several of the people involved in 2010, I have seen amazing enthusiasm around many different inspiring projects.

But, as this assessment series has attempted to point out, enthusiasm is only one part - though an important part - of the whole story. As the academic year draws to a close, we take a longer, more macro view of assessment: How is it possible to assess a program as large and diverse as the 2010 project? And what lessons can individual instructors and programs learn from it?

Colleen Atakpu-Abraham (a graduate of the COE's Industrial and Systems Engineering program and current graduate student in Educational Leadership and Policy Analysis), the project assistant (PA) for the assessment of the 2010 project, and Sandra Courter, director of the Engineering Learning Center, have been working with the COE Task Force to develop a series of assessments to evaluate the impact of the 2010 initiative.

Each 2010 project proposal includes an individual assessment component, to be conducted by the project PIs. "Most of the project leaders are planning attitudinal surveys, and some of them are planning to do a pre-test and a post-test to see what kind of knowledge students gain. We've been encouraging informal assessments like minute papers and concept tests, too," explains Atakpu-Abraham.

At the same time, Atakpu-Abraham and Courter have been working to assess the 2010 program as a whole.

Atakpu-Abraham explains, "We're working on an overarching assessment of the eleven 2010 projects, evaluating the relative impact of each project both on the faculty and on the students. The assessments we're designing consist of two electronic attitudinal surveys, one for undergraduate students and one for faculty who taught at least one undergrad class."

This survey assessment will evaluate the new courses developed under 2010 grants, using the undergraduate student body to form experimental and control groups. "We will be surveying all undergraduates who took a course in the COE, and from there, we'll have an experiment group [students who took a 2010 course] and a control group [students who did not take a 2010 course]. We'll be looking to see what the differences might be," says Atakpu-Abraham.

2010 projects that did not involve the creation of courses, or whose courses will run in future semesters, will be evaluated separately in similar ways, mostly through targeted surveys and focus groups with participants.

"Some of the projects are very different from each other, and not all of them are running at the same time," she notes, "so we've had to be creative in our assessments."

Through her PA position, Atakpu-Abraham has also worked with Courter to conduct assessment workshops for instructors, raising awareness of the many forms of assessment that can be appropriate in different settings.

"We will create a website on assessment that will be linked from the COE homepage. It will talk about the different kinds of assessment that are possible, and we're hoping that instructors will want to use it. There are a lot of resources on campus for assessment, like Delta and WCER, but lots of instructors don't know about those, or about the reasons that they might want to use different kinds of assessment," she says.

Atakpu-Abraham also stresses that attitudinal surveys, like the ones created for the overarching 2010 assessment, can be useful for individual courses or even topical units. "An attitudinal survey is a great assessment tool for a class that is using new material or learning a new topic. The assessment can focus on the pedagogy of the class, as well as the material, and it can include questions like what did you learn from this, what else do you want to know about it, or how can it be improved," she says.

"Course evaluations don't often tell faculty members about specifics like that, and knowing what students think can be really useful for future planning" - both in individual courses, and in large efforts like the 2010 initiative.

*

For more information about the ideas in this article:

Attitude Surveys, from the Field-tested Learning Assessment Guide (FLAG).

This section of the Field-tested Learning Assessment Guide focuses on attitude surveys, examining the situations in which this type of assessment can be useful. The FLAG suggests that attitudinal surveys can provide instructors with feedback about what students know, what they have learned about a certain topic, and what kinds of preferences students have about the ways in which they learn. This type of feedback can help instructors choose effective teaching strategies or decide how to organize future course material.