
State Assessment Data Meeting for Admin


DESCRIPTION

In this presentation, principals will engage in a process to make sense of assessment data, then lead the process with their staffs.

Learning Target: I can lead analysis of data using a data-driven dialog protocol.
Success Criteria:
* Summarize best practices for data analysis
* Predict what we may see
* Make literal observations
* Draw inferences and ask questions
* Identify possible next steps

Learning Target: I can explain AMOs to my staff.
Success Criteria:
* Describe what AMOs are and how they are calculated
* Interpret AMO calculations


Page 1: State Assessment Data Meeting for Admin

Preliminary Data Learning Meeting

August 2012

Orting School District

Marci Shepard

Page 2: State Assessment Data Meeting for Admin

“Be Present”


Page 3: State Assessment Data Meeting for Admin

Learning Target: I can lead analysis of data using a data-driven dialog protocol.
Success Criteria:
• Summarize best practices for data analysis
• Predict what we may see
• Make literal observations
• Draw inferences and ask questions
• Identify possible next steps

Learning Target: I can explain AMOs to my staff.
Success Criteria:
• Describe what AMOs are and how they are calculated
• Interpret AMO calculations


Page 7: State Assessment Data Meeting for Admin

Update from the State: Online versus Paper and Pencil


Page 8: State Assessment Data Meeting for Admin

More technology difficulties this year

• Technology glitches are reported as irregularities and may have had a negative impact on scores.
  o We have been analyzing differences between students with reported irregularities and those in schools that did not have irregularities, and cannot detect that this had a negative effect.
  o When students were unable to enter an answer or had another technology failure that precluded measuring their skill, a modified scoring table was applied.
• There are already many plans in place to fix the technology difficulties.
  o Districts will have more time with the test engine, so students are not unfamiliar with the tools and functionality.
  o The test vendor (DRC) will develop a mechanism for verifying that each district has a proper set-up for online testing a month prior to testing.


Page 9: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

• Equating, which compares performance on items common to last year's test, shows that the raw scores needed to be at Level 2, Level 3, and Level 4 are the same in each mode. But when we apply those cut scores, the percent of students meeting standard on the paper tests is higher than the percent meeting standard on the online tests in nearly all grades and all content areas.
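To see how identical cut scores can still produce different passing rates, here is a minimal Python sketch (an illustration added to this writeup, not OSPI's equating code; the cut value and all raw scores are made up) that applies one shared Level 3 cut to two hypothetical score distributions:

```python
# Hypothetical raw scores for one grade and content area. Per the equating,
# the Level 3 cut score is identical in both modes; the percent at or above
# it differs only because the two score distributions differ.
paper = [392, 398, 401, 404, 409, 413, 418, 424]
online = [385, 391, 396, 399, 402, 406, 411, 417]
LEVEL3_CUT = 400  # same raw-score cut applied to each mode (assumed value)

for mode, scores in (("paper", paper), ("online", online)):
    meeting = sum(s >= LEVEL3_CUT for s in scores) / len(scores)
    print(f"{mode}: {meeting:.0%} meeting standard")
# paper: 75% meeting standard
# online: 50% meeting standard
```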


Page 10: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

• Districts had concerns in the past two years about online being harder, but the psychometrics showed no difference.

• If we gave identical paper tests, or identical online tests, to two groups of people, one group might do better than the other, and we would conclude that the groups had different abilities (maybe one had more high-performing students). That is what we have attributed the minor mode differences to in previous years.

• But this year brought larger differences, all in favor of paper/pencil tests…

Page 11: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

2012 percent meeting standard, by mode (based only on equating samples):

Grade | % Testing Online | Math (Online / Paper) | Reading (Online / Paper) | Science (Online / Paper)
3     | ~15%             | 64.5 / 65.2           | 57.8 / 67.8              | —
4     | ~25%             | 57.9 / 58.7           | 64.7 / 71.8              | —
5     | ~35%             | 62.8 / 63.6           | 67.6 / 71.7              | 59.2 / 67.0
6     | ~50%             | 61.6 / 62.3           | 63.0 / 70.6              | —
7     | ~50%             | 55.0 / 58.4           | 64.9 / 71.0              | —
8     | ~50%             | 52.9 / 58.7           | 64.6 / 68.9              | 61.5 / 70.9


Page 12: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

• This year the differences between modes are bigger than in the first two years of online testing.

• The biggest differences are in the text-based subjects, where students read passages online (reading and science); the sketch below quantifies the gaps from the unadjusted table above.

• The differences tend to be smaller in the upper grades, but not always.

• Technology irregularities did not explain the differences.
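To make those comparisons concrete, here is a minimal Python sketch (added for this writeup, not part of the original deck) that computes the paper-minus-online gaps, in percentage points meeting standard, from the unadjusted Page 11 table:

```python
# Unadjusted 2012 equating-sample results keyed by grade:
# (math online, math paper, reading online, reading paper,
#  science online, science paper); None marks grades without a science test.
table = {
    3: (64.5, 65.2, 57.8, 67.8, None, None),
    4: (57.9, 58.7, 64.7, 71.8, None, None),
    5: (62.8, 63.6, 67.6, 71.7, 59.2, 67.0),
    6: (61.6, 62.3, 63.0, 70.6, None, None),
    7: (55.0, 58.4, 64.9, 71.0, None, None),
    8: (52.9, 58.7, 64.6, 68.9, 61.5, 70.9),
}

for grade, (mo, mp, ro, rp, so, sp) in table.items():
    gaps = {"math": mp - mo, "reading": rp - ro}
    if so is not None:
        gaps["science"] = sp - so
    print(grade, {subject: round(gap, 1) for subject, gap in gaps.items()})
# Every gap is positive (paper higher), and the reading gaps (4.1 to 10.0
# points) and science gaps (7.8 and 9.4) exceed most math gaps (0.7 to 5.8),
# matching the bullets above.
```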


Page 13: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

• In consultation with our assessment vendors, psychometric experts, and national technical advisory committee, we made an adjustment to online scores as part of our equating process.


Page 14: State Assessment Data Meeting for Admin

Online Testing – Mode Comparability

2012 adjusted percent meeting standard, by mode (based only on equating samples):

Grade | % Testing Online | Math (Online / Paper) | Reading (Online / Paper) | Science (Online / Paper)
3     | ~15%             | 69.0 / 65.2           | 71.8 / 67.8              | —
4     | ~25%             | 65.6 / 58.7           | 69.2 / 71.8              | —
5     | ~35%             | 66.2 / 63.6           | 72.0 / 71.7              | 68.6 / 67.0
6     | ~50%             | 61.6 / 62.3           | 67.3 / 70.6              | —
7     | ~50%             | 61.5 / 58.4           | 72.7 / 71.0              | —
8     | ~50%             | 56.8 / 58.7           | 68.7 / 68.9              | 65.4 / 70.9


Page 15: State Assessment Data Meeting for Admin

This means…

• Any systematic differences in difficulty between modes have already been adjusted in scores reported to districts.

• OSPI will continue to examine mode effects during equating to determine if an adjustment is warranted in future years.




Page 22: State Assessment Data Meeting for Admin

Annual Measurable Objectives


Page 23: State Assessment Data Meeting for Admin

Testing in the ESEA Flexibility Waiver

• AYP rules and procedures are replaced by Annual Measurable Objectives.

• The lowest-performing schools in reading and math need to revise their school improvement plans using up to 20% of district Title I monies.

• Participation in assessments and performance of subgroups (including English language learners, special education, and poverty) are still key.


Page 24: State Assessment Data Meeting for Admin

State-developed differentiated recognition, accountability, and support

o Reward Schools
  • Highest-performing schools
  • High-progress schools
o Priority Schools
  • The 5% lowest-performing Title I and Title I-eligible schools with less than a 60% graduation rate
o Focus Schools
  • The 10% of Title I schools with the highest proficiency gaps
o Emerging Schools
  • The next lowest 10% of schools on the Focus list, and the next 5% of schools on the Priority list


Page 25: State Assessment Data Meeting for Admin

Accountability Evolution with the ESEA Waiver

Up to 2011-12:
• School Improvement: uses AYP calculations to identify schools and districts in a step of improvement (Title I) and to generate the list of Persistently Lowest Achieving Schools
• SBE/OSPI Achievement Index: used to identify Award Schools
• AYP determinations, sanctions, and set-asides

2012-13 and 2013-14:
• ESEA Waiver Application Accountability System: used to identify Reward, Focus, Priority, and Emerging schools
• AMO calculations
• No sanctions (letters, transportation, etc.)
• Up to 20% set-asides for Priority, Focus, and Emerging Schools

2014-15 and beyond:
• ESEA New Accountability System: used to identify Reward, Focus, Priority, and Emerging schools
• AMO calculations
• No sanctions (letters, transportation, etc.)
• Up to 20% set-asides for Priority, Focus, and Emerging Schools


Page 26: State Assessment Data Meeting for Admin

Accountability Evolution with the ESEA Waiver

Up to 2011-12 — AYP Determinations:
• Determinations based on current status of percent meeting standard compared to the Uniform Bar (100% by 2014)
• AYP determinations reported on the Report Card
• Not making AYP results in sanctions for Title I schools and $$$ set-asides

2012-13 and 2013-14, and 2014-15 and beyond — AMO Calculations (identical in both periods):
• Annual targets to close proficiency gaps by half by 2017; uses 2011 as the baseline and adds equal annual increments (1/6 of the proficiency gap) to reach the 2017 target
• Each subgroup, school, district, and state has unique annual targets
• Calculations reported on the Report Card
• No sanctions
• Up to 20% set-asides for Priority, Focus, and Emerging Schools


Page 27: State Assessment Data Meeting for Admin

State-developed differentiated recognition, accountability, and support

o Annual Measurable Objectives
  • Using 2011 as a baseline, OSPI set benchmarks that will cut proficiency gaps in half by 2017 for every WA school.
  • No sanctions are required, but the expectation is that school improvement plans (SIPs) will include strategies to close gaps.
  • N size = 20


Page 28: State Assessment Data Meeting for Admin

Annual Measurable Objectives (AMOs)

WA has opted to establish AMOs as equal increments toward the goal of reducing by half the percent of students who are not proficient, in all AYP subcategories, by fall 2017 (within six years). The sketch below works through that arithmetic.
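Here is a minimal Python sketch of that schedule (an illustration for this writeup, not OSPI's published calculator; the function name and the 70% baseline are made up), reading "equal increments" as one sixth of the half-gap each year from 2012 through 2017:

```python
def amo_targets(baseline_pct_proficient, baseline_year=2011, goal_year=2017):
    """Annual AMO targets: close half of the proficiency gap (the percent
    of students not yet proficient at the baseline) in equal yearly steps."""
    gap_to_close = (100.0 - baseline_pct_proficient) / 2.0
    n_steps = goal_year - baseline_year          # six annual increments
    step = gap_to_close / n_steps                # 1/6 of the gap to close
    return {baseline_year + i: baseline_pct_proficient + i * step
            for i in range(1, n_steps + 1)}

# A subgroup at 70% proficient in 2011 has a 30-point gap; closing half of
# it means gaining 2.5 points per year to reach 85% by 2017.
print(amo_targets(70.0))
# {2012: 72.5, 2013: 75.0, 2014: 77.5, 2015: 80.0, 2016: 82.5, 2017: 85.0}
```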


Page 29: State Assessment Data Meeting for Admin

2012–13 Waiver Tasks

• The Office of Student and School Success will work with Priority, Focus, and Emerging schools to address gaps.

• The State Board of Education (SBE) and OSPI are required to submit a revised accountability system request, which is likely to include growth data.

• The Legislature must pass a law to require 'focused evaluations' to use student growth as a significant factor.

• The State must establish rules regarding use of student growth as a significant factor in teacher and principal evaluation and support systems.



Page 31: State Assessment Data Meeting for Admin

CDs with Resources to Use with your Staff

• This PowerPoint

• Preliminary data from the state

• OSD data comparisons

• How to calculate AMOs

• OSD AMO calculations

• Data-driven dialog protocol


Page 32: State Assessment Data Meeting for Admin

Questions?
