Assessment of Student Learning: Expanding the Toolbox
Sarah Bunnell, Ohio Wesleyan University

Page 1: Assessment of Student Learning:  Expanding the Toolbox

Assessment of Student Learning:

Expanding the Toolbox

Sarah Bunnell, Ohio Wesleyan University

Page 2: Assessment of Student Learning:  Expanding the Toolbox

A Framework for Thinking about Assessment

• Backward Design (Wiggins & McTighe):
– 1. Identify desired results.

• What do you want to know?

– 2. Determine acceptable evidence and appropriate methods for acquiring evidence.

– 3. Then and only then… develop learning experiences and instruction.

NOTE: It’s called backward design because you start with the goal FIRST and work backward to the instruction.

Page 3: Assessment of Student Learning:  Expanding the Toolbox

Different Types of “What do you want to know?” Questions

1. What works?

Do my students learn better when I do X? How would I know?

2. What is?

What is happening when students are learning or trying to learn? Where are they getting stuck?

3. What could be?

“Visions of the Possible”: What is the ideal outcome, and how might I help students get there?

(From the Carnegie Academy for the Scholarship of Teaching and Learning; CASTL)

Page 4: Assessment of Student Learning:  Expanding the Toolbox

Overarching Goals of EREN

1. Students will be able to apply scientific methodology (including hypothesis generation and experimental design) and recognize the importance of uncertainty for experiments at single and multiple sites.

2. Students will be able to identify factors that vary among sites across geographic or temporal scales, describe how these factors interact, and explain how they may affect ecological processes.

3. Students will be able to describe the value and techniques of scientific collaboration.

4. Students will demonstrate best practices in the accurate collection, recording, and ethical management of multi-site, multi-participant datasets.

5. Students will be able to analyze, interpret, and draw conclusions from data collected in multi-site studies.

Page 5: Assessment of Student Learning:  Expanding the Toolbox

What works? What is? What could be?

1. Students will be able to apply scientific methodology (including hypothesis generation and experimental design) and recognize the importance of uncertainty for experiments at single and multiple sites.

Page 6: Assessment of Student Learning:  Expanding the Toolbox

Revisit the Framework

• Backward Design (Wiggins & McTighe):
– 1. Identify desired results.

– 2. Determine acceptable evidence and appropriate methods for acquiring evidence.

– 3. Then and only then… develop learning experiences and instruction.

Page 7: Assessment of Student Learning:  Expanding the Toolbox

Determining Acceptable Evidence – Where to look?

• Where to find evidence of Product:
– Exams, quizzes, in-class performances
– Pre- vs. post-test shifts in learning
– Group presentations, lab reports

• What about evidence of Process?

Page 8: Assessment of Student Learning:  Expanding the Toolbox

Adapted from Bass & Elmendorf, 2007

• How can we better understand these intermediate, invisible processes?

• How might we capture them? Foster them?

[Diagram: Novice (visible Product) → MIRACLE → Expert (visible Product); the learning processes in between are invisible]

Page 9: Assessment of Student Learning:  Expanding the Toolbox

Dimensions of Assessment

[Diagram: a grain-size axis running from Fine Grain to Holistic, crossed with the three question types (What Works, What IS, What’s Possible) and a Temporal Dimension]

Can be applied to questions of Product as well as questions of Process

Page 10: Assessment of Student Learning:  Expanding the Toolbox

What Works? Example

• Impact of teaching with a “Two-Step Process” on students’ ability to analyze graphical data
– Pre- and post-tests; 4 institutions (240 students)
– Competency rubric (“Improved”; “No change, satisfactory”; “No change, unsatisfactory”; or “Worsened”; see the sketch below)
– Improved ability to describe and create graphs
– Consistent difficulty understanding IVs and DVs, detecting trends in noisy data, and interpreting interactions

Picone et al. (2007). Teaching Issues and Experiments in Ecology, Vol. 5
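The competency binning above lends itself to a simple script. A minimal sketch in Python, assuming a 1-5 competency scale and a cutoff of 3 for “satisfactory”; both are illustrative assumptions, not the authors’ actual scoring scheme:

```python
# Hypothetical pre/post binning in the spirit of the Picone et al. rubric;
# the 1-5 scale and the cutoff below are illustrative assumptions.
SATISFACTORY = 3  # assumed minimum score counted as satisfactory

def categorize(pre: int, post: int) -> str:
    """Bin one student's pre/post competency scores into a change category."""
    if post > pre:
        return "Improved"
    if post < pre:
        return "Worsened"
    return "No change, satisfactory" if post >= SATISFACTORY else "No change, unsatisfactory"

for pre, post in [(2, 4), (3, 3), (1, 1), (4, 2)]:  # illustrative (pre, post) pairs
    print(f"pre={pre}, post={post}: {categorize(pre, post)}")
```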

Page 11: Assessment of Student Learning:  Expanding the Toolbox

What is? Example

• Jim Sandefur, Foundations Mathematics course
• Students were struggling with problem-solving
• Used video-taped think-alouds

• Observed that students who were struggling:
– Got stuck right out of the gate
– Didn’t try multiple strategies
– Didn’t test examples

Page 12: Assessment of Student Learning:  Expanding the Toolbox

What could be? Example
Fostering Uncertainty and Community

• References to not knowing
• Rates and qualities of questions
• Student-to-student discussions

Page 13: Assessment of Student Learning:  Expanding the Toolbox

Scales of Assessment
Fine-Grained Analysis Example

• Lexical Analysis (applied to discussions or journals)
– Discussion Boards vs. Blogs
– Use discussion boards for clarification, and blogs for metacognitive reflection and connection

Linguistic Inquiry and Word Count (LIWC; Pennebaker et al.)

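LIWC itself ships licensed, proprietary dictionaries, but the core idea, counting words per category and normalizing by total word count, is easy to sketch. A minimal illustration with made-up category vocabularies (not LIWC’s real ones):

```python
# Word-category counting in the spirit of LIWC. The category vocabularies
# below are illustrative stand-ins, not LIWC's actual (proprietary) dictionaries.
import re

CATEGORIES = {
    "uncertainty": {"maybe", "perhaps", "might", "unsure", "guess"},
    "insight": {"realize", "understand", "think", "know", "means"},
}

def category_rates(text: str) -> dict:
    """Return each category's share of the total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1  # avoid division by zero on empty text
    return {cat: sum(w in vocab for w in words) / total
            for cat, vocab in CATEGORIES.items()}

post = "I guess I finally understand what the theorem means, but maybe not fully."
print(category_rates(post))
```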

Page 14: Assessment of Student Learning:  Expanding the Toolbox

Scales of Assessment
Mid-Level Analyses

• Secondary Analysis of High- or Low-Stakes Assessments
– On what content area(s) or types of questions did most students perform well? Struggle? (see the sketch below)
– For areas of struggle: Can you identify the bottleneck?
• Functional, Affective, or Conceptual
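A minimal sketch of the item-level tallying such a secondary analysis involves; the topic tags and 0/1 scores are illustrative stand-ins for whatever a real gradebook export provides:

```python
# Per-topic success rates from item-level exam scores. Topic tags and
# 0/1 scores are illustrative; a real export needs its own parsing.
from collections import defaultdict

responses = [  # (topic tag, 1 = correct, 0 = incorrect)
    ("graph-reading", 1), ("graph-reading", 1), ("variables", 1),
    ("variables", 0), ("interactions", 0), ("interactions", 0),
]

totals = defaultdict(lambda: [0, 0])  # topic -> [num correct, num attempted]
for topic, correct in responses:
    totals[topic][0] += correct
    totals[topic][1] += 1

# Print weakest topics first to surface likely bottlenecks.
for topic, (right, n) in sorted(totals.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{topic}: {right}/{n} correct ({right / n:.0%})")
```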

Page 15: Assessment of Student Learning:  Expanding the Toolbox

Scales of Assessment
Holistic Analyses

• Categorical judgments of performance
– Capturing students’ ways of thinking (e.g., Marcia Baxter Magolda)
• Absolute, Transitional, Independent, and Contextual
– Multiple Components of Performance Rubrics
• Often not the same as grading rubrics
• Some very good rubrics already exist for use
– AAC&U VALUE Rubrics

Page 16: Assessment of Student Learning:  Expanding the Toolbox

Creating Your Own Rubrics

• Some helpful initial steps:
– As experts, complete the task to set the high end
– Do an initial sort to clarify categories
• Then, look for similarities within piles

• Recommend focusing on 3-5 dimensions
– Goal is clear, non-double-barreled descriptions of levels of performance, from novice to expert
– Ideal reliability between raters is ~80% (see the sketch below)
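That ~80% figure is simple to check. A minimal sketch computing percent agreement between two raters (the ratings themselves are illustrative); note that percent agreement ignores chance, so a chance-corrected statistic such as Cohen’s kappa is a stricter companion check:

```python
# Percent agreement between two raters scoring the same ten artifacts on a
# 1 (novice) to 4 (expert) rubric; the ratings themselves are illustrative.
rater_a = [3, 2, 4, 1, 3, 2, 4, 3, 2, 1]
rater_b = [3, 2, 3, 1, 3, 2, 4, 3, 1, 1]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"Agreement: {matches / len(rater_a):.0%}")  # 80% here, right at the target

# Percent agreement ignores chance agreement; for a chance-corrected figure,
# use Cohen's kappa (e.g., sklearn.metrics.cohen_kappa_score).
```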

Page 17: Assessment of Student Learning:  Expanding the Toolbox

Some Final Assessment Factors to Consider

• Temporal dimension of assessment
– Rapid experience sampling, semester-long assessments, longer assessments?

• Structure of Comparison Groups
– Within- vs. Between-Groups (see the sketch below)
– Links back to “What do you want to know?” questions

• Not a “clean sample”
– Individual difference variables of interest?
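As a hedged sketch of how that design choice plays out in analysis: within-groups designs call for paired tests, between-groups designs for independent tests. All scores below are placeholders, and a t-test is just one reasonable analysis choice:

```python
# Paired (within-groups) vs. independent (between-groups) comparisons;
# all score lists are illustrative placeholders, not real data.
from scipy import stats

# Within-groups: the same students measured before and after the change.
pre = [55, 62, 48, 70, 66]
post = [61, 64, 55, 74, 65]
print(stats.ttest_rel(pre, post))  # paired t-test

# Between-groups: different students in each condition.
section_a = [71, 58, 66, 80, 62]
section_b = [64, 55, 60, 72, 59]
print(stats.ttest_ind(section_a, section_b))  # independent-samples t-test
```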

Page 18: Assessment of Student Learning:  Expanding the Toolbox

Final Thoughts and Recommendations

• Sustainability and Scale are key
– Not only for faculty quality of life
– “When I changed X, Y changed.”

• Use what you already have
– Meaningful, contextualized assessments
– You already have a TON of data on student learning… don’t throw it away!

Page 19: Assessment of Student Learning:  Expanding the Toolbox

Questions? Comments? Reflections?