Professional Development to Practice

The contents of this presentation were developed under a grant from the US Department of Education to the Missouri Department of Elementary and Secondary Education (#H323A120018). However, these contents do not necessarily represent the policy of the US Department of Education, and you should not assume endorsement by the Federal Government.

Data-Based Decision Making


Page 1: Data-Based Decision Making


Data-Based Decision Making

Page 2: Data-Based Decision Making


Openings & Introductions
- Session-at-a-Glance
- Introductions
- Training Norms
- Learner Objectives
- Pre-Session Readings and Essential Questions

Page 3: Data-Based Decision Making


Session-at-a-Glance
- Overview of data-based decision making (DBDM) (30 minutes)
- 6 steps of the DBDM process (4 hours)
  - Why? What?
  - Implementation fidelity indicators for each step
  - Team action planning for each step
- Action planning for the entire DBDM process (30 minutes)

Page 4: Data-Based Decision Making


Introductions
[Consultant: add information needed regarding introductions for your session.]
Select and use an inclusion activity at consultant discretion.

Page 5: Data-Based Decision Making


Training Norms
- Begin and end on time
- Be an engaged participant
- Be an active listener: open to new ideas
- Use notes for sidebar conversations
- Use electronics respectfully

Page 6: Data-Based Decision Making


Learner Outcomes
1. Teacher learns how data-based decision making allows for demonstration of Missouri Teacher Standards.
2. Teacher develops knowledge of and applies the steps of DBDM "Cycles" (Data Teams) with example data sets:
- Develop a classroom system of data collection and charting
- Analyze and disaggregate student learning
- Establish student goals based on results
- Select instructional practices
- Determine results indicators for process (cause) and product (effect / social, emotional, and behavioral)
- Design ongoing monitoring of results (monitor, reflect, adjust, repeat)
- Review results indicators
- Review implementation: instructional practices, data cycle

Page 7: Data-Based Decision Making


Learner "Post" Objectives
3. Teacher utilizes the steps of DBDM "Cycles" with their own classroom data:
- Teacher will collect, chart, analyze, and disaggregate student learning data as well as implementation data.
- Teacher will explain results indicators for process (cause) and product (effect).
- Teacher will design ongoing monitoring of results (monitor, reflect, adjust, repeat).

Page 8: Data-Based Decision Making

Preparatory Reading Reflection
Using Student Data to Support Instructional Decision Making
- Review the 5 recommendations in the IES Practice Guide summary.
- Mark with a star which of those recommendations and specific steps, with support, you as a classroom teacher can work to implement in your professional practice.
- When directed, share your starred items with a shoulder partner.

Page 9: Data-Based Decision Making


Preparatory Reading Reflection
First Things First: Demystifying Data Analysis
Mike Schmoker poses two essential questions for educators to answer:
- How many students are succeeding in the subject I teach?
- Within those subjects, what are the areas of strength and weakness?

How do you or your grade-level or departmental team answer these questions now?
How can the answers to these questions efficiently drive instructional decision making at the classroom, grade, and/or departmental level?

Page 10: Data-Based Decision Making


Essential Questions
- How many students are succeeding in the subject I/we teach?
- Within those subjects, what are the areas of strength and weakness?
- How can I/we establish and sustain a culture and process for strategic instructional decision making across our building, teams, and classrooms?

Page 11: Data-Based Decision Making


“Connecting the dots” when you are feeling overwhelmed!

How does data-based decision making allow teachers to simultaneously improve student outcomes while also demonstrating knowledge and fluency with Missouri Teacher Standards?

Page 12: Data-Based Decision Making


Data-Based Decision Making and Missouri Teacher Standards

Standard #1: Content knowledge and perspectives aligned with appropriate instruction
Standard #2: Student learning, growth, and development
Standard #3: Implementing the curriculum
Standard #4: Teaching for critical thinking
Standard #5: Creating a positive classroom learning environment
Standard #6: Effective communication
Standard #7: Use of student assessment data to analyze and modify instruction
Standard #8: Reflection on professional practice to assess the effect of choices and actions on others
Standard #9: Professional collaboration


Page 13: Data-Based Decision Making


Overview of Data-Based Decision Making

Page 14: Data-Based Decision Making


Page 15: Data-Based Decision Making

[Diagram: Missouri Collaborative Work training map, September 2013]

KEY: Core Training Modules | Follow-up Training Modules | Precursors to Training

Precursors to training: Introduction to Missouri Collaborative Work; use the Getting Started Guide to determine starting point and scope of learning.

Core training modules (each ends with Activity: Wrap Up/Overview of Next Steps, followed up based on data through coaching and revisiting PD):
- Collaborative Data Teams (CDT): overview and purpose; collaborative teams; foundational processes (agendas, communication, norms, roles); advanced processes (consensus, collaborative skills, protocols).
- Data-Based Decision Making (DBDM): overview and purpose of DBDM; data team process steps, sequence, and examples: 1. Collect and Chart Data, 2. Analyze and Prioritize, 3. SMART Goal, 4. Instructional Decision Making, 5. Determine Results Indicators, 6. Ongoing Monitoring.
- Common Formative Assessment (CFA): overview and purpose of CFA; developing meaningful learning targets; quality assessment design (performance events, constructed response items, selected response items).
- Effective Teaching/Learning Practices (EP): overview and purpose of EP; focused modules on spaced versus massed practice, feedback, assessment capable learners, and reciprocal teaching.
- School-Based Implementation Coaching: overview and purpose of coaching for supporting school-wide implementation; critical skills of coaching; coaching in practice.

Once teams determine an EP to focus on, they can choose one or more of these focused modules. Each of the EP modules in this section will include implementation guidance with tools and troubleshooting, and using data to determine effectiveness.

Page 16: Data-Based Decision Making


Why Use Data-Based Decision Making?

Using a DBDM process shifts the work of school leadership teams from a reactive or crisis driven process to a pro-active, outcomes driven process, and sets the stage for continuous improvement.

~Gilbert, 1978; McIntosh, Horner & Sugai, 2009

Page 17: Data-Based Decision Making


Why Use Data-Based Decision Making?

School personnel have an opportunity to grow as consumers of data who can transform data reports (e.g., graphs or charts) into meaningful information that drives effective data-based decision making for organizational change and school improvement.

~Gilbert, 1978

Page 18: Data-Based Decision Making

What is Data-Based Decision Making?

Data-based decision making (DBDM) involves small teams meeting regularly and using an explicit, data-driven structure to:
- disaggregate data,
- analyze student performance,
- set incremental student learning goals,
- engage in dialogue around explicit and deliberate classroom instruction, and
- create a plan to monitor instruction and student learning.

(MO SPDG 2013)

Page 19: Data-Based Decision Making

Pre-Requisites for Effective DBDM
- Leadership
- Collaborative culture
- Structured and protected collaborative time
- Consistent process for DBDM cycles
- Efficient data collection and reporting systems
- Fidelity of implementation data
- Research-based instructional practices and strategies
- Additional student data (e.g., gender, race/ethnicity, school/classroom attendance, etc.)

AND…

Page 20: Data-Based Decision Making


Pre-Requisites for Effective DBDM

Academics:
- Curriculum maps
- Identify standard selected for assessment
- Unwrap standard selected for assessment
- Common pre-, formative, and summative assessments
- Common scoring guides and rubrics

Behavior:
- Core academic standards (social behavioral)
- Schoolwide behavioral expectations
- Individual classroom behavioral expectations
- Minor Office Disciplinary Referral (ODR) form
- Major Office Disciplinary Referral (ODR) form
- Minor and major ODR data

Page 21: Data-Based Decision Making

Components of the DBDM Process
1. Collect & Chart Data
2. Analyze & Prioritize
3. SMART Goal
4. Instructional Decision Making
5. Determine Results Indicators
6. Monitor

Page 22: Data-Based Decision Making

Academic DBDM Flow Chart
1. Examine standards.
2. Determine tracking standards.
3. Develop a pacing guide for essential standards.
4. Develop post-, mid-, and pre-CFA.
5. Administer pre-CFA.
6. Follow a consistent DBDM process.
7. Teach students using common instructional practices.
8. Administer the CFA (post instruction).
9. Score the assessment and submit the data to the data team.
10. Meet as a team to determine if the goal was met.
11. Return to the appropriate step.

DBDM process components: 1) Collect & Chart Data, 2) Analyze Data, 3) SMART Goals, 4) Instructional Decision Making, 5) Determine Results Indicators, 6) Monitor

Page 23: Data-Based Decision Making


DBDM Step 1: Collect & Chart Data

Page 24: Data-Based Decision Making


Components of the DBDM Process: Collect & Chart Data → Analyze & Prioritize → SMART Goal → Instructional Decision Making → Determine Results Indicators → Monitor

Page 25: Data-Based Decision Making


Why Collect & Chart Data?
- Data influences decisions that guide instruction for adults and students (Hamilton et al., 2009; Horner, Sugai, & Todd, 2001; Means, Chen, DeBarger, & Padilla, 2011; Newton, Horner, Algozzine, Todd, & Algozzine, 2009).
- Charting data creates visuals that delineate current status in the classroom (Horner, Sugai, & Todd, 2001).
- It leads to higher student achievement (Reeves, 2009).

Page 26: Data-Based Decision Making


Collect & Chart Data: Terms to Know

Common Formative Assessment (CFA): An assessment typically created collaboratively by a team of teachers responsible for the same grade level or course. Common formative assessments are used frequently throughout the year to identify (1) individual students who need additional time and support for learning, (2) the teaching strategies most effective in helping students acquire the intended knowledge and skills, (3) curriculum concerns (areas in which students generally are having difficulty achieving the intended standard), and (4) improvement goals for individual teachers and the team.

Scoring Guide/Rubric: A coherent set of criteria for students' work that includes descriptions of levels of performance quality on the criteria.

Page 27: Data-Based Decision Making


Collect & Chart Data: Overview
1. Teacher administers Common Formative Assessment (CFA).
2. Teacher uses scoring guide to score CFA.
3. Teacher charts classroom CFA data and gives it to the team leader.
4. Team leader compiles group CFA data into chart(s) (grade level or team).
5. Team leader shares charted group data at the DBDM meeting.
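The charting templates used in this step derive the percentage columns from the raw counts. A minimal sketch of that arithmetic in Python (illustrative only: the column mapping is my reading of the template, and summarize_class is a hypothetical helper, not part of the training materials; the guard mirrors the #DIV/0! an empty spreadsheet row would show):

```python
# Illustrative sketch of the teacher-chart arithmetic. Column letters
# (B-H) follow the template's labels as I read them; not official code.

def summarize_class(num_tested, num_proficient, num_close, num_far,
                    num_intervention):
    """Return the template's derived columns for one teacher's class."""
    def pct(part, whole):
        # Guard against empty rows (Excel would show #DIV/0! here).
        return round(100 * part / whole, 2) if whole else None

    return {
        "B_tested": num_tested,
        "C_proficient": num_proficient,
        "D_close": num_close,
        "E_pct_prof_and_close": pct(num_proficient + num_close, num_tested),
        "F_far_to_go": num_far,
        "G_pct_prof_close_far": pct(num_proficient + num_close + num_far,
                                    num_tested),
        "H_intervention": num_intervention,
    }

# The case-study North class: 25 tested, 8 proficient, 6 close, 8 far, 3 intervention.
print(summarize_class(25, 8, 6, 8, 3))
```

With the case-study numbers this reproduces the charted 56.00% (proficient and close) and 88.00% (proficient, close, and far to go) for the North class.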

Page 28: Data-Based Decision Making


DBDM Process (Collect & Chart Data step)
1. Teacher administers CFA.
2. Teacher scores CFA.
3. Teacher charts data and turns it in.
4. Team leader develops chart.
5. Team leader shares charted data.

Page 29: Data-Based Decision Making


Collect & Chart Data: Teacher Chart

[Template: a spreadsheet where each teacher lists student names under each category for one assessment point (pre-assessment, mid-instruction, or post-instruction), with All and SWD (students with disabilities) counts for each column:
A. Teacher; total # of students
B. # who took the assessment
C. # proficient & higher
D. # close to proficient
E. % proficient & close to proficient
F. # far to go
G. % proficient, close, & far to go
H. # intervention]

Page 30: Data-Based Decision Making


Collect & Chart Data: Team Chart

[Template: the team leader's spreadsheet compiles each teacher's counts into one row per class (rows 1-10 plus a totals row), using the same columns B-H as the teacher chart, with All and SWD counts and percentages. A final column lists students who did not take the assessment and why.]

Page 31: Data-Based Decision Making

Collect & Chart Data Process
1. Each teacher administers CFA.
2. Each teacher scores the CFA.
3. Each teacher charts data and gives it to the team leader.
4. Team leader compiles group CFA data into a chart.
5. Team leader shares data at the DBDM meeting.

Page 32: Data-Based Decision Making


Case Study: Pre-Assessment Individual Teacher Charting

[Completed teacher chart for the North class (pre-assessment), with student names listed under each category. Summary row (All / SWD):
Teacher: North. Total students who took the assessment: 25 / 8
C. # Proficient & Higher: 8 / 2
D. # Close to Proficient: 6 / 1
E. % Proficient & Close to Proficient: 56.00% / 37.50%
F. # Far to Go: 8 / 3
G. % Proficient, Close, & Far to Go: 88.00% / 75.00%
H. # Intervention: 3 / 2]

Page 33: Data-Based Decision Making


Case Study: Pre-Assessment Individual Teacher Charting

All teachers complete the DBDM chart given to them (either electronic or hard copy) for each student who participates in the CFA administration.

The teachers then submit the charted data to the individual whose role it is to collate the grade level or departmental data.

Page 34: Data-Based Decision Making

Case Study: Pre-Assessment Team Charting

[Compiled team chart (pre-assessment), with student names listed per category for each class. Summary rows (All / SWD):

Class | Total | C. Proficient & Higher | D. Close to Proficient | E. % Prof & Close | F. Far to Go | G. % Prof, Close & Far | H. Intervention
North | 25/8 | 8/2 | 6/1 | 42.42% / 37.50% | 8/3 | 66.67% / 75.00% | 3/2
South | 22/6 | 2/0 | 8/2 | 35.71% / 33.33% | 8/2 | 64.29% / 66.67% | 4/2
East | 26/5 | 3/0 | 7/1 | 32.26% / 20.00% | 11/2 | 67.74% / 60.00% | 5/2
West | 24/6 | 4/1 | 3/0 | 23.33% / 16.67% | 13/3 | 66.67% / 66.67% | 4/2
Totals | 97/25 | 17/3 | 24/4 | 42% / 28% | 40/10 | 84% / 68% | 16/8

A final column lists students who did not take the assessment and why.]
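The totals row of the team chart sums the class counts and recomputes percentages from those sums rather than averaging the class percentages. An illustrative Python sketch using the case-study "All" counts (hypothetical code, not part of the training materials):

```python
# Illustrative sketch: aggregating teacher charts into the team chart's
# totals row. Counts are the case-study "All" values per class.

classes = {
    # class: (tested, proficient_and_higher, close_to_proficient, far_to_go)
    "North": (25, 8, 6, 8),
    "South": (22, 2, 8, 8),
    "East": (26, 3, 7, 11),
    "West": (24, 4, 3, 13),
}

tested = sum(c[0] for c in classes.values())      # 97 students
proficient = sum(c[1] for c in classes.values())  # 17
close = sum(c[2] for c in classes.values())       # 24
far = sum(c[3] for c in classes.values())         # 40

# Percentages come from the summed counts, not from averaging class rows.
pct_prof_close = round(100 * (proficient + close) / tested)
pct_prof_close_far = round(100 * (proficient + close + far) / tested)

print(tested, proficient, close, far, pct_prof_close, pct_prof_close_far)
```

This reproduces the totals row's 42% (proficient and close) and 84% (proficient, close, and far to go) for all students.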

Page 35: Data-Based Decision Making


Collect & Chart Data: Practice Profile

Missouri Collaborative Work Practice Profile. Foundations present in the implementation of each essential function: commitment to the success of all students and to improving the quality of instruction.

Data-Based Decision Making Process
Essential Function 1: Educators collect, chart, analyze, and disaggregate student learning data.

Exemplary (ideal implementation):
- >80% of teachers administer common formative assessment and use a common scoring method to evaluate student proficiency.
- >80% of teachers share charted class data with the data team.
- Sums and percentages are correct.
- Results are disaggregated according to specific school needs (e.g., specific subgroups).
- Results are available to ALL team members.
- Data is triangulated (multiple sources of data are included that further illuminate students' knowledge of the skill and the area being examined).

Proficient:
- 80% of teachers administer common formative assessment and use a common scoring method to evaluate student proficiency.
- 80% of teachers share charted class data with the data team.
- Sums and percentages are correct.
- Results are partially disaggregated.
- Results are available to ALL team members.
- Data is not triangulated.

Close to Proficient (skill is emerging, but not yet at ideal proficiency; coaching is recommended):
- <80% of teachers administer common formative assessment and use a common scoring method to evaluate student proficiency.
- <80% of teachers share charted class data with the data team.
- Sums and percentages are calculated, but contain errors.
- Results are not disaggregated.
- Results are available only to team members present for the meeting.
- Data is not triangulated.

Far from Proficient (follow-up professional development and coaching is critical):
- Few or no teachers administer common formative assessment and use a common scoring method to evaluate student proficiency.
- Class data is not charted and/or shared.

Data-Based Decision Making Excel/Word Template

Page 36: Data-Based Decision Making


Implementation Fidelity

[Completed case-study template showing the compiled pre-assessment chart (North, South, East, West classes; totals 97 students / 25 SWD) alongside the Step 2 analysis:]

2. Analyze Strengths and Obstacles (Proficient & Higher students)

Academic performance strength: Students were able to identify the main idea and supporting details of both fiction and non-fiction texts in a variety of question types and prompts (main idea, best supports, mostly about).
Inference: Students had prior experience with a variety of vocabulary that was included in the assessment (main idea, best supports, mostly about).

Behavioral performance strength: Student attendance above 95%, low percentage of classroom-managed problem behaviors, low percentage of student removal from academic instruction.
Inference: Students are present at school; students remain in the classroom for academic instruction.

Next steps: Engage students in more difficult text where the main idea may not be so explicit.
Inference: The reading was easy enough that the students could easily comprehend.

Note: List students who did not participate in the assessment and why.

Data-Based Decision Making Implementation Fidelity Checklist

Instructions: This checklist is designed as a format for periodically checking on the fidelity of implementing Data-Based Decision Making. It is recommended that this checklist be completed by a team and that the team reserves time on the agenda to complete the fidelity checklist. Fidelity should be monitored "early and often" (Harn, Parisi, & Stoolmiller, 2013): early in implementation, approximately three months into implementation, then continuing quarterly for the first year. While teams should strive for 100% fidelity, 80% or 8 'Yes' items may be sufficient for achieving positive outcomes. If fidelity is repeatedly less than 80%, the team may benefit from coaching.

Evidence needed: You will need to refer to your completed Data-Based Decision Making program template. (See the template and case study example provided at the Data-Based Decision Making training.)

Rate each item Yes / Partially / No (if partially or no, please explain):
1. Common formative assessment data is collected.
2. Class data is accurately charted.
3. Data is triangulated.
4. Based on the data, the team lists strengths, misconceptions, and inferences.
5. Team considers prerequisite skills and data when prioritizing needs.
6. SMART goals are specific to targeted subject area, grade level, and student population; are measurable; specify how measurement will occur; state achievable percentage gains or increases in terms of expected change; and state when the assessment will take place.
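The checklist's scoring rule (80% or 8 'Yes' items may be sufficient; repeatedly below 80% suggests coaching) can be sketched as follows. This is an illustrative Python sketch, not part of the training materials; fidelity_status is a hypothetical helper:

```python
# Illustrative sketch: scoring the fidelity checklist. The 80% threshold
# is the guideline quoted on the checklist itself.

def fidelity_status(answers):
    """answers: list of 'yes' / 'partially' / 'no', one per checklist item."""
    yes = sum(1 for a in answers if a == "yes")
    pct = 100 * yes / len(answers)
    if pct >= 80:
        return pct, "sufficient for achieving positive outcomes"
    return pct, "below 80%: the team may benefit from coaching"

# E.g., 8 'yes' items out of 10 meets the 80% guideline.
pct, verdict = fidelity_status(["yes"] * 8 + ["partially", "no"])
print(f"{pct:.0f}% yes items: {verdict}")
```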

Page 37: Data-Based Decision Making


Collect & Chart Data: Next Steps
Using the results from the DBDM Practice Profile, dialogue to:
1. Assess your team/building's current knowledge and implementation fluency with Collect & Chart Data.
2. Determine possible next steps:
- Decide what format your team/building will utilize (electronic or hard copy).
- Plan for hands-on training so that all teachers know how to chart their student data.
- Establish who will collate the team data, and consider whether they will need training as well.
- Establish dates for submitting and for sharing collated data.
- Identify specific ways your team will want/need data to be disaggregated.

Page 38: Data-Based Decision Making


Next Steps: Actions = Results

Content focus: Collaborative Data Teams | Effective Teaching/Learning Practices | Common Formative Assessment | Data-Based Decision Making
School: _________________________ Date Next Steps Form Written: _________________________ Teams (e.g., grade level or content): _________________________

Action Planned (What?) | Responsible Person(s) (Who?) | Timeline (When?) | Resources/Support Needed | Results (So What?)

What steps will you take to start implementing?

Page 39: Data-Based Decision Making


DBDM Step 2: Analyze & Prioritize

Page 40: Data-Based Decision Making


DBDM Process: Collect & Chart Data → Analyze & Prioritize → SMART Goal → Instructional Decision Making → Determine Results Indicators → Monitor

Page 41: Data-Based Decision Making


Why Analyze & Prioritize?
The failure to achieve meaningful outcomes during school improvement activities is often due to a poor match between problems and the intensity, fidelity, or focus of the interventions that are required.

~Sprague et al., 2001

Page 42: Data-Based Decision Making


Analyze & Prioritize: Terms to Know

Decision Rules: clear, specific guidelines for making data-driven decisions (e.g., at least 80% of students should be meeting academic benchmarks)

Inference: a possible explanation generated to derive accurate meaning from performance data
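A decision rule like the 80% benchmark example can be expressed directly in code. The sketch below is illustrative Python (meets_decision_rule is a hypothetical helper, not part of the training materials; the threshold is the one quoted in the example):

```python
# Illustrative sketch: applying the example decision rule
# "at least 80% of students should be meeting academic benchmarks."

def meets_decision_rule(num_meeting_benchmark, num_students, threshold=0.80):
    """Return True when the group satisfies the benchmark decision rule."""
    return num_students > 0 and num_meeting_benchmark / num_students >= threshold

# 14 of 25 students at benchmark falls short of the 80% rule,
# flagging this standard for the team's prioritization step.
print(meets_decision_rule(14, 25))
```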

Page 43: Data-Based Decision Making


Analyze & Prioritize: Overview
- Team uses student work to observe and identify strengths and obstacles (errors and misconceptions) as well as trends and patterns.
- Team develops inferences based on data: what is present becomes strengths; what is missing becomes obstacles or challenges.
- Team prioritizes by focusing on the most urgent needs of learners.

Page 44: Data-Based Decision Making


Analyze & Prioritize: Observations
Examine student work that is proficient and higher, and list:
- Strengths
- Consistent skills
- Trends

Examine student work that is not proficient, and list:
- Strengths and obstacles
- Students consistently rated not proficient
- Error analysis: inconsistent skills, misconceptions in thinking
- Trends: trends related to certain subgroups (e.g., ELL, gender, race/ethnicity, school attendance, attendance in classrooms, engagement, etc.)

Page 45: Data-Based Decision Making


Analyze & Prioritize: Inferences
- For each subgroup of students (Proficient and Higher, Close to Proficient, Far to Go, and Intervention), infer what each listed performance strength means (i.e., cause for celebration).
- For students in the Close to Proficient, Far to Go, and Intervention subgroups, infer what each listed performance strength or obstacle means.

Page 46: Data-Based Decision Making


Analyze & Prioritize: Prioritization

For students in the Proficient and Higher subgroup, prioritize what might be a logical Next Step for further instruction to enhance student knowledge and use of the prioritized standard.

For students in the Close to Proficient, Far to Go, and Intervention subgroups, prioritize which of the performance strengths or obstacles should be the logical Next Step for instruction and support to develop and solidify student knowledge and use of the prioritized standard.

Page 47: Data-Based Decision Making


Analyze & Prioritize: Behavioral Data

For each sub-group, identify whether each of the following applies:
Student attendance above 95%
Low percentage of classroom-managed problem behaviors
Low percentage of student removal from academic instruction

If the answer is "YES" to all three conditions, an inference can be made that:
Students are present at school
Students remain in the classroom for academic instruction
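The three-condition screen above can be sketched as a single function. The names and return strings here are illustrative, not taken from the MO SPDG materials:

```python
def behavioral_inference(attendance_above_95: bool,
                         low_classroom_managed_behaviors: bool,
                         low_removal_from_instruction: bool) -> str:
    """Translate the three behavioral screening conditions into the
    inference the team can (or cannot) make."""
    if all((attendance_above_95,
            low_classroom_managed_behaviors,
            low_removal_from_instruction)):
        return ("Students are present at school and remain in the "
                "classroom for academic instruction.")
    return ("At least one condition is unmet: identify which one, and review "
            "whether universal classroom management practices are in place.")

print(behavioral_inference(True, True, True))
```

When any condition fails, the team's follow-up questions on the next slide apply.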

Page 48: Data-Based Decision Making


Analyze & Prioritize: Behavioral Data

For each sub-group, identify whether each of the following applies:
Student attendance above 95%
Low percentage of classroom-managed problem behaviors
Low percentage of student removal from academic instruction

If the answer is "NO" to any of the conditions, the team needs to consider:
Which condition is not met?
Are universal effective classroom management practices in place with the consistency and intensity needed to meet the foundational behavioral support needs of the students under scrutiny?

Page 49: Data-Based Decision Making


Analyze & Prioritize Case Study: Proficient & Higher Sub-group

2. Analyze Strengths and Obstacles

Academic Performance Strengths: Students were able to identify the main idea and supporting details of both fiction and non-fiction texts in a variety of question types and prompts (main idea, best supports, mostly about).

Inferences: The reading was easy enough that the students could easily comprehend it. Students had prior experience with a variety of vocabulary that was included in the assessment (main idea, best supports, mostly about).

Next Steps: Engage students in more difficult text where the main idea may not be so explicit.

Behavioral Performance Strengths: Student attendance above 95%, low percentage of classroom-managed problem behaviors, low percentage of student removal from academic instruction.

Inference: Students are present at school; students remain in the classroom for academic instruction.

Page 50: Data-Based Decision Making


Analyze & Prioritize Case Study: Close to Proficient Sub-group

Page 51: Data-Based Decision Making

Analyze & Prioritize Case Study: Far to Go Sub-group

Academic Performance Strengths: If given the main idea, students were able to determine if a detail supported the given statement.

Performance Errors, Misconceptions or Obstacles: Students had difficulty identifying the main idea and supporting details of non-fiction passages.

Inferences: Students struggled with the vocabulary in the non-fiction texts that pertained to the main idea (e.g., hummingbirds, petunias). Students struggled with the various question prompts (e.g., main idea, best supports, mostly about). They have not had enough exposure to answering a wide variety of questions.

Behavioral Performance Strengths: Student attendance above 95%, low percentage of classroom-managed problem behaviors, low percentage of student removal from academic instruction.

Inference: Students are present at school; students remain in the classroom for academic instruction.

Page 52: Data-Based Decision Making

Analyze & Prioritize Case Study: Intervention Sub-group

Academic Performance Strengths: On the constructed response questions, students knew to find a detail directly from the passage to support the main idea.

Performance Errors, Misconceptions or Obstacles: Students could not accurately read the passages and questions. Students were unable to determine the main idea of a passage and the details to support it.

Inferences: The reading was too difficult. The reading level of the passages was far above the reading level of the students. Students seem to have some recollection of the term "main idea" but are not sure what the vocabulary means.

Behavioral Performance Strengths: Student attendance above 95%, low percentage of classroom-managed problem behaviors, low percentage of student removal from academic instruction.

Inference: Students are present at school; students remain in the classroom for academic instruction.

Page 53: Data-Based Decision Making


Analyze & Prioritize: Practice Profile

Missouri Collaborative Work Practice Profile: Data-Based Decision Making Process
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Essential Function 2: Educators use results to identify learning needs.

Exemplary (Ideal Implementation) / Proficient:
Team lists strengths, misconceptions, inferences, and prioritized needs for all proficiency groups.
Strengths and misconceptions are directly related to the common formative assessment and targeted standard(s).
Prioritized needs are categorized according to a hierarchy of prerequisite skills.

Close to Proficient (Skill is emerging, but not yet at ideal proficiency. Coaching is recommended.):
Team lists strengths, misconceptions, inferences, and prioritized needs for most proficiency groups.
Strengths and misconceptions are mostly related to the pre-assessment and targeted standard(s).
Prerequisite skills are not considered.

Far from Proficient (Follow-up professional development and coaching is critical.):
Strengths and misconceptions, if listed, are not related to the pre-assessment and targeted standard(s).
Learning needs are not prioritized.
Prerequisite skills are not considered.

Evidence: Data-Based Decision Making Excel/Word Template

Essential Function 3: Educators establish SMART goals based on data-identified student learning needs.

Exemplary (Ideal Implementation):
SMART goals contain all key components (as listed in the Proficient column).
Goals reflect a consideration of students "close to proficient" and case-by-case consideration of which other students can reach the goal.
Goals are derived from specific team inferences.
Each goal includes a baseline and an anticipated post-assessment.
Each goal closes achievement gaps for targeted student groups.

Proficient: All SMART goals...
Are specific to the targeted subject area, grade level, and student population.
Are measurable and specify how measurement will occur.
State achievable percentage gains or increases in terms of expected change.
State the time when the assessment will take place.

Close to Proficient (Skill is emerging, but not yet at ideal proficiency. Coaching is recommended.):
SMART goals are written and mostly meet the criteria of a SMART goal.
Goal percentage is not correctly calculated.

Far from Proficient (Follow-up professional development and coaching is critical.):
If complete, SMART goals lack important criteria.
Goal percentage is not correctly calculated.

Evidence: Data-Based Decision Making Excel/Word Template

Page 54: Data-Based Decision Making


Analyze & Prioritize: Next Steps

Using the results from the DBDM Practice Profile dialog to:
1. Assess your team's/building's current knowledge and implementation fluency with Analyze & Prioritize
2. Determine possible next steps:
Identify academic strengths for each student sub-group
Identify academic obstacles for each sub-group
Identify behavioral strengths and/or obstacles for each sub-group
Develop possible next instructional steps (academic or behavioral) for each sub-group that directly connect to inferences made for each sub-group

Page 55: Data-Based Decision Making


DBDM Step 3: SMART Goals

Page 56: Data-Based Decision Making


DBDM Process: Collect & Chart Data → Analyze & Prioritize → SMART Goal → Instructional Decision Making → Determine Results Indicators → Monitor

Page 57: Data-Based Decision Making


Why Develop a SMART Goal

"According to research, goal setting is the single most powerful motivational tool in a teacher's toolkit. Why? Because goal setting operates in ways that provide:
Purpose
Challenge
Meaning
Goals are the guideposts along the road that make a compelling vision come alive. Goals energize people. Specific, clear, challenging goals lead to greater effort and achievement than easy or vague goals do."

(Blanchard, 2007, p. 150)

Page 58: Data-Based Decision Making


"The lack of clear goals may provide the most credible explanation for why we are still only inching along in our effort to improve schooling for U.S. children."

~ Mike Schmoker

Page 59: Data-Based Decision Making


SMART Goals: Terms to Know

Specific: Says exactly who the learner is and what the learner will be able to do

Measurable: Objective definition such that the behavior can be observed and counted

Attainable: A skill that learners can master within the given period of time

Results-Oriented: Must be something learners can do to demonstrate growth; relevant to the learner

Time-Bound: Achievable by the time frame set

Page 60: Data-Based Decision Making


Measurable Goals

"Clear, measurable goals are at the center of the mystery of a school's success, mediocrity or failure."

~ S. J. Rosenholz

Page 61: Data-Based Decision Making


For SMART Goals to make a difference to teachers…

Teachers have to be engaged in the process of developing the goal so they own the goal.

Teachers have to look at the data and design a goal that makes sense to them.

The goal becomes powerful when teachers use it to inform their practice.

Page 62: Data-Based Decision Making


SMART Goal: Overview

The percentage of (Name Student Group) scoring proficient or higher in (Name the Content Area) will increase from (Current Status Percentage) to (Goal Percentage) by the end of (Month, Quarter or Date) as measured by (Assessment Tool) administered on (Specific Date).

Page 63: Data-Based Decision Making


SMART Goal: Example

First grade students enrolled in the District and on IEPs scoring proficient or higher in reading comprehension will increase from 45% to 60% by the end of the third quarter grading period as measured by a teacher-made 10-question comprehension test administered two days prior to the end of the third quarter.
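The SMART goal template maps naturally onto a format string. This is an illustrative sketch only (the template constant and field names are assumptions), filled with the first-grade example above:

```python
SMART_TEMPLATE = (
    "The percentage of {group} scoring proficient or higher in {content} "
    "will increase from {current}% to {goal}% by the end of {deadline} "
    "as measured by {tool} administered on {date}."
)

# Fill the template with the example values from the slide above
goal = SMART_TEMPLATE.format(
    group="first grade students enrolled in the District and on IEPs",
    content="reading comprehension",
    current=45,
    goal=60,
    deadline="the third quarter grading period",
    tool="a teacher-made 10-question comprehension test",
    date="the date two days prior to the end of the third quarter",
)
print(goal)
```

Keeping the template in one place helps a team check that every goal names all six SMART components.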

Page 64: Data-Based Decision Making


SMART Goal Case Study

3. S.M.A.R.T. Goal

What group of students are you discussing? 4th grade students
What content area? CCSS RI.4.2 ELA (Apply post-reading skills to respond to text: main idea and supporting details)
What assessment are you discussing? Main Idea & Supporting Details CFA, administered on September 13th
What is your deadline? End of 1st Quarter
When will you re-assess? October 8th

The percent of 4th grade students scoring proficient or higher in CCSS RI.4.2 ELA will increase from 18% to 84% by the end of the 1st Quarter as measured by the Main Idea & Supporting Details CFA.

Met Goal of 84%? Yes / No
If the goal was not met, record the margin short of the goal. A new goal is set only if the original goal was not met.

Page 65: Data-Based Decision Making


Setting a Goal: Example

Pre-Instructional CFA Data (assessment of student knowledge of the priority standard BEFORE instruction):

Currently there are 97 students in the group analyzed.
17 students are proficient or higher (17.5% of students, whose learning the team will want to maintain and enrich).
24 students are close to proficiency (24.7% of students, whose learning may be most readily moved toward proficiency with instruction).
41 students in total (17 + 24) are most likely to meet proficiency goals at unit end (42% of all students).

If the team has a goal of at least 83% of students at proficiency, then instruction will need to be designed and implemented to accelerate the learning of 40 additional students who are far to go but likely to make it (41%), since 81 / 97 = 83% proficient.
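The goal arithmetic above can be checked directly. A minimal sketch; the variable names are illustrative:

```python
import math

total = 97
proficient_or_higher = 17   # ~17.5%: maintain and enrich
close_to_proficient = 24    # ~24.7%: most readily moved to proficiency

# Students projected to reach proficiency without acceleration
on_track = proficient_or_higher + close_to_proficient  # 41 students, ~42%

goal_rate = 0.83
students_needed = math.ceil(goal_rate * total)  # smallest count reaching 83%
additional = students_needed - on_track         # "far to go but likely to make it"
print(on_track, students_needed, additional)    # 41 81 40
```

The same calculation generalizes to any group size and goal percentage the team chooses.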

Page 66: Data-Based Decision Making

SMART Goal: Implementation Example

The percentage of 4th grade students scoring proficient or higher in Comm. Arts will increase from 17% to 88% by the end of six weeks as measured by Common Formative Assessments administered on (Date).

Page 67: Data-Based Decision Making


SMART Goal Practice Profile

Missouri Collaborative Work Practice Profile: Data-Based Decision Making Process
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Essential Function 3: Educators establish SMART goals based on data-identified student learning needs.

Exemplary (Ideal Implementation):
SMART goals contain all key components (as listed in the Proficient column).
Goals reflect a consideration of students "close to proficient" and case-by-case consideration of which other students can reach the goal.
Goals are derived from specific team inferences.
Each goal includes a baseline and an anticipated post-assessment.
Each goal closes achievement gaps for targeted student groups.
Goals are few and prioritized.
Scheduled time is set for formal analysis of results.

Proficient: All SMART goals...
Are specific to the targeted subject area, grade level, and student population.
Are measurable and specify how measurement will occur.
State achievable percentage gains or increases in terms of expected change.
State the time when the assessment will take place.

Close to Proficient (Skill is emerging, but not yet at ideal proficiency. Coaching is recommended.):
SMART goals are written and mostly meet the criteria of a SMART goal.
Goal percentage is not correctly calculated.

Far from Proficient (Follow-up professional development and coaching is critical.):
If complete, SMART goals lack important criteria.
Goal percentage is not correctly calculated.

Evidence: Data-Based Decision Making Excel/Word Template

Page 68: Data-Based Decision Making


SMART Goal: Next Steps

Using the results from the DBDM Practice Profile dialog to:
1. Assess your team's/building's current knowledge and implementation fluency with SMART Goals
2. Determine possible next steps:
Identify how many students are in each group and by IEP status (Step 1)
Calculate the percentage of students in each group by IEP status (Step 2)
Develop an ambitious, yet achievable goal for the percentage of students who can, with strategic instruction, meet the criterion score on the Common Formative Assessment written for the Priority Learning Target under analysis (Step 3)

Page 69: Data-Based Decision Making


DBDM Step 4: Instructional Decision Making

Page 70: Data-Based Decision Making


DBDM Process: Collect & Chart Data → Analyze & Prioritize → SMART Goal → Instructional Decision Making → Determine Results Indicators → Monitor

Page 71: Data-Based Decision Making


Effective Teaching/Learning Practices & Strategies

Instructional Practices: Effective teaching/learning practices at the classroom level are research-based, effective methods that are not content related and that, when practiced regularly and with fidelity, improve teaching and learning in all content areas through direct application or through transfer of knowledge and skill.

Instructional Strategies: Effective teaching/learning strategies at the classroom level are actions that are content related and are used to help improve a particular step or steps within a content standard; they are discrete.

Page 72: Data-Based Decision Making


Traditional Instructional Assessment Model

Pretest → TEACH → TEACH → TEACH → TEACH → Posttest → Assign Grades

Page 73: Data-Based Decision Making


Revised Instructional Assessment Model with Data Analysis

Pre-Assess All Students with CFA →
Steps 1, 2 & 3 of DBDM: Collect & Chart CFA Results; Analyze & Prioritize Results; Set SMART Goal →
Steps 4 & 5 of DBDM: Plan for instruction using a specific Instructional Strategy in conjunction with the schoolwide ETLP, and select Results Indicators →
TEACH using the schoolwide selected ETLP and the selected Instructional Strategy →
Step 6 of DBDM: Monitor, Reflect & Adjust (mid-course adjustment) →
TEACH using the schoolwide selected ETLP and the selected Instructional Strategy →
POST-ASSESS

Spanning the entire model: schoolwide selection, professional development, and implementation of an Effective Teaching and Learning Practice (ETLP).

Page 74: Data-Based Decision Making


Effective Teaching/Learning Practices

The MO SPDG Collaborative Work has selected four ETLPs from the meta-analysis work of John Hattie and developed training packets for school use:
Assessment Capable Learners
Feedback
Spaced vs. Massed
Reciprocal Teaching

Each school will select one ETLP for schoolwide professional learning and implementation.

Page 75: Data-Based Decision Making


4. Select Instructional Practices/Strategies

Proficient & Higher Students
Prioritized Next Step: Engage students in more difficult text where the main idea may not be so explicit.

Selected Instructional Practice/Strategy: Instructional Practice: Feedback; Instructional Strategy: Advanced Cues, Questions, and Organizers. Common Thread:

Learning Environment: Students will meet with the teacher to learn strategies for identifying implicit main ideas from short passages. Ms. Battles will make copies of the passages (from ReadWorks.org) for the team.

Time (Frequency and Duration): 2 times a week during Daily 5 rotation.

Materials for Teachers and Students: Students will use the "Implied Main Idea" organizer to record their thoughts during teacher modeling. Ms. North will make copies of this for the team. Ms. West will make copies of the passages (from ReadWorks.org) for the team.

Assignments & Assessments (Where will students be required to use the practice?): Summaries R Us: Using passages from the Science and Social Studies texts, students will identify the main idea of paragraphs in the reading and respond to the section by restating, reacting, remembering, or responding with a question.

Page 76: Data-Based Decision Making


4. Select Instructional Practices/Strategies

Close to Proficient Students
Prioritized Next Step: Students will have practice verbalizing main ideas, which they will then transfer into written language.

Selected Instructional Practice/Strategy: Instructional Practice: Feedback; Instructional Strategy: Summarizing and Note Taking.

Learning Environment: Teachers will guide students as they complete a Listen-Sketch-Draft. The text will be the daily read-aloud.

Time (Frequency and Duration): 3 times a week (Monday, Wednesday, and Friday) during small-group rotation with the teacher.

Materials for Teachers and Students: Houghton Mifflin Reading series, beginning with Unit 4, Week 1. Mr. EAST will make copies of the Listen-Sketch-Draft page for the team.

Assignments & Assessments (Where will students be required to use the practice?): Class Main Idea Book: Using a topic of interest, the class will create a main idea book, beginning with the details and moving to the main idea. Lesson at www.liketowrite.com.

Page 77: Data-Based Decision Making


4. Select Instructional Practices/Strategies

Far to Go Students
Prioritized Next Step: Given text passages on their instructional level, students will begin to identify main idea and supporting details.

Selected Instructional Practice/Strategy: Instructional Practice: Feedback; Instructional Strategy: Cooperative Learning.

Learning Environment: Students will engage in the cooperative learning strategy Fan-N-Pick guided by the teacher. Review the steps of the strategy using the Kagan Structures software.

Time (Frequency and Duration): 3 times a week during Daily 5 small-group rotation with the teacher.

Materials for Teachers and Students: Mrs. SOUTH will create a set of 30 cards for each teacher using questions from Study Island.

Assignments & Assessments (Where will students be required to use the practice?): Brown Bag Book Club: During literature circles, students will complete the story elements organizer individually. They will then engage in conversation around the main idea, details, characters, setting, and author's purpose of the story while enjoying popcorn from their brown bag.

Page 78: Data-Based Decision Making


4. Select Instructional Practices/Strategies

Intervention Students
Prioritized Next Step: (Type step here)

Selected Instructional Practice/Strategy: Instructional Practice: Feedback; Instructional Strategy: Advanced Cues, Questions, and Organizers (Stated Main Idea Umbrella Organizer).

Learning Environment: Teachers will directly instruct students that the main idea of a paragraph is often the topic sentence. Teachers will meet each Friday during lunch to discuss the progress of these students and identify if adjustments to the instructional strategy are needed.

Time (Frequency and Duration): Students will meet daily with the teacher during Daily 5 small-group rotations.

Materials for Teachers and Students: Mr. EAST will gather passages on a 1st and 2nd grade reading level and distribute them to the team. Mrs. SOUTH will make copies of the Stated Main Idea Umbrella graphic.

Assignments & Assessments (Where will students be required to use the practice?): Main Idea Exit Ticket: At the end of lessons, students will create an exit ticket stating the main idea of the lesson and 2-3 details that support the learning.

Page 79: Data-Based Decision Making

Instructional Decision Making Practice Profile

Missouri Collaborative Work Practice Profile: Data-Based Decision Making Process
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Essential Function 4: Educators use data to select a common effective teaching/learning practice to implement with fidelity.

Exemplary (Ideal Implementation) / Proficient:
Selected effective teaching/learning practice(s)/strategy(s) target prioritized needs and are research based.
Selected practice(s)/strategy(s) have the greatest potential impact on student growth.
Selected practice(s)/strategy(s) are described in enough detail to allow for replication.

Close to Proficient (Skill is emerging, but not yet at ideal proficiency. Coaching is recommended.):
Selected practice(s)/strategy(s) target prioritized needs and are research based.
Selected practice(s)/strategy(s) have moderate potential to impact student growth.
Selected practice(s)/strategy(s) lack a full description to allow for replication.

Far from Proficient (Follow-up professional development and coaching is critical.):
Selected practice(s)/strategy(s) do not target prioritized needs and are not research based.
Selected practice(s)/strategy(s) do not have potential to impact student growth.
Selected practice(s)/strategy(s) lack a description to allow for replication.

Evidence: Data-Based Decision Making Excel/Word Template

Page 80: Data-Based Decision Making


Effective Teaching/Learning Practices: Next Steps

Using the results from the DBDM Practice Profile dialog to:
1. Assess your team's/building's current knowledge and implementation fluency with Effective Teaching and Learning Practices
2. Determine possible next steps:
Identify an ETLP for schoolwide implementation
Identify instructional strategies that are proven effective for the academic domain of the Priority Learning Target under consideration
Match the prioritized instructional next steps for each sub-group with the appropriate instructional practices or strategies

Page 81: Data-Based Decision Making


DBDM Step 5: Results Indicators

Page 82: Data-Based Decision Making


DBDM Process: Collect & Chart Data → Analyze & Prioritize → SMART Goal → Effective Teaching & Learning Practices → Determine Results Indicators → Monitor

Page 83: Data-Based Decision Making


Why Results Indicators

They allow us to monitor progress of:
the implementation of our strategies/practices
the effectiveness of our strategies/practices

They facilitate the planning for sustaining or revising our strategies/practices.

Page 84: Data-Based Decision Making


Results Indicators: Terms to Know

Cause Data: data that measure adult behaviors

Effect Data: data that measure student outcomes

"Look fors": indicators in student work which demonstrate change in proficiency

Page 85: Data-Based Decision Making


Results Indicators: Overview

Results indicators include:
Adult behavior (Cause)
Student behavior (Effect)
"Look fors" in student work (Effect)

Results indicators are articulated for each instructional group and directly linked to prioritized needs and the strategies/practices selected. They are specific and clear enough to allow for:
prediction of student outcomes prior to the next assessment
replication of practice

Page 86: Data-Based Decision Making


US Airways Flight 1549: Results Indicator "Look Fors"

Page 87: Data-Based Decision Making


Cause – Effect Activity

Cause:
% of teachers teaching the social skills lessons on a weekly basis
Using the supplemental questions to practice the format of the test
# of teachers adhering to the allocated instructional minutes for literacy
# of teachers using bell-to-bell activities to review the science objectives

Effect:
% of students passing the formative quiz given on Friday
Results on the fluency screening assessment in January
% of students who have 3-5 office referrals for the year
Scores on the math chapter test

Page 88: Data-Based Decision Making


Results Indicators: Process

Identify a prioritized need for each instructional group and select an evidence-based practice or strategy
Develop descriptors of what should be observable if adults implement the practice or strategy with fidelity
Develop descriptors of what should be observable in student behavior if the adults implement the practice or strategy with fidelity
Develop descriptors of what should be observable in the student work if the practice or strategy is implemented effectively
Establish a cause/effect relationship between the practice or strategy and the results

Page 89: Data-Based Decision Making


Results Indicators: Implementation Example

Results indicators complete the sentence: "If teachers do _____________, then students will _________________."
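One way to keep the cause, the effect, and the "look fors" together is a small record type. This is a hypothetical sketch; the class and field names are not part of the training materials:

```python
from dataclasses import dataclass, field

@dataclass
class ResultsIndicator:
    cause: str                  # adult behavior
    effect: str                 # student behavior
    look_fors: list = field(default_factory=list)  # evidence in student work

    def sentence(self) -> str:
        """Render the indicator in the 'If teachers..., then students...' form."""
        return f"If teachers {self.cause}, then students will {self.effect}."

# Example drawn from the comparison-matrix slide that follows
ri = ResultsIndicator(
    cause="model a comparison matrix in a mini lesson",
    effect="extend the comparison matrix as they read further in the text",
    look_fors=["students describe how different words affect them as readers"],
)
print(ri.sentence())
```

A team could collect one such record per instructional group and review them alongside the assessment data.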

Page 90: Data-Based Decision Making


Results Indicators: Implementation Example

If a teacher models using a comparison matrix in a mini lesson, then the students will extend the comparison matrix as they read further in the text, which demonstrates evaluation-level thinking.

"Look fors": Students will be able to describe in the comparison matrix how different words affect them as readers.

Page 91: Data-Based Decision Making


Results Indicators: Implementation Example

If a teacher models syllable patterns, blend patterns, and word chunks daily during guided reading, then students will apply those strategies to their leveled guided reading text.

"Look fors": Students will read more fluently and with greater comprehension.

Page 92: Data-Based Decision Making


Results Indicators: Implementation Example

If a teacher models whole-group behavior expectations during a mini lesson for classroom behaviors, then the students will follow whole-group behavior expectations during whole-group instruction.

"Look fors": Students will be able to:
Stay in personal space
Keep all hands, feet, and objects to self
Respond appropriately to the thoughts of others
Raise a hand to indicate they have something to say

Page 93: Data-Based Decision Making


5. Results Indicators: Proficient & Higher Students

Prioritized Next Step: Engage students in more difficult text where the main idea may not be as explicit.

Adult Behaviors: Teachers will model an implied-main-idea web graphic organizer, identifying the most important details from the text during guided reading. Using the details, teachers will model a think-aloud to construct the implied main idea of the passage.

Student Behaviors: Students will read a passage with an implied main idea and ask clarifying questions about details from the passage.

What to Look For in Student Work: Students will be able to identify the most important details from the passage and use the details to form a main idea in a constructed-response format.

Page 94: Data-Based Decision Making

5. Results Indicators: Close to Proficient Students

Prioritized Next Step: Students will have practice verbalizing main ideas, which they will then transfer into written language.

Adult Behaviors: Teachers will read a text split into 3 smaller parts. While reading each part, teachers will think aloud to complete a Listen-Sketch-Draft, identifying the main idea of each small section before creating a summary statement of the entire passage.

Student Behaviors: Students will begin to use the essential vocabulary “main idea” and “supporting details” when having conversations about text.

What to Look For in Student Work: As students begin to speak the main idea when talking with others, they will write it down, transferring their verbal language into written language.

Page 95: Data-Based Decision Making

5. Results Indicators: Far to Go Students

Prioritized Next Step: Given text passages on their instructional level, students will begin to identify main ideas and supporting details.

Adult Behaviors: Teachers will model first and then participate in Fan-N-Pick with small groups of students, reading a main idea and choosing which of 3 details best supports it.

Student Behaviors: Students will begin to use the essential vocabulary “main idea” and “supporting details” when having conversations about text.

What to Look For in Student Work: Students will be able to eliminate the details that are irrelevant to the main idea.

Page 96: Data-Based Decision Making

5. Results Indicators: Intervention Students

Prioritized Next Step: Type step here

Adult Behaviors: Teachers will directly instruct students that the main idea of a paragraph is often the topic sentence. Through modeling, teachers will complete the graphic organizer, identifying the stated main idea and supporting it with 3 details from the text.

Student Behaviors: When students are reading a passage on their instructional reading level, they will be able to identify the main idea from the topic and concluding sentences.

What to Look For in Student Work: Students will become more confident in their ability to identify the main idea and will begin to transfer their strategies to more difficult text.

Page 97: Data-Based Decision Making

Results Indicators: Practice Profile

Missouri Collaborative Work Practice Profile
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Data-Based Decision Making Process
Rating levels: Exemplary (Ideal Implementation); Proficient; Close to Proficient (skill is emerging, but not yet at ideal proficiency; coaching is recommended); Far from Proficient (follow-up professional development and coaching is critical).

Essential Function 5: Educators explain results indicators for process (cause) and product (effect)

Exemplary: Quarterly, the team discusses expected implementation data (teacher behavior) in relation to expected student results, with sufficient detail for replication. Implementation data indicate that fidelity occurs at the desired rate. Quarterly, discrepancies in student results are examined in relation to differences in implementation data. Semi-annually, based on data, improved implementation processes are recommended or an alternative effective teaching/learning practice is chosen.

Proficient: Semi-annually, the team discusses expected implementation data (teacher behavior) in relation to expected student results, with sufficient detail for replication. Implementation data indicate that fidelity occurs at less than the desired rate. Semi-annually, discrepancies in student results are examined in relation to differences in implementation data. Annually, based on data, improved implementation processes are recommended or an alternative effective teaching/learning practice is chosen.

Close to Proficient: Team discussion about expected implementation data and student results occurs but is limited by team understanding of cause/effect or by incomplete data. Fidelity of implementation is less than desired. Hypothesizing improved implementation processes or alternative effective teaching/learning practices is limited by team understanding of the correlation or by incomplete data.

Evidence: Data-Based Decision Making Excel/Word Template

Essential Function 6: Educators design ongoing monitoring of results (monitor, reflect, adjust, repeat)

Exemplary: Visual representation of growth is included in results once the post-assessment is scored. Reflection questions are thoroughly discussed and recorded using the visual representation.

Proficient: Visual representation of growth is included in results once the post-assessment is scored. Most reflection questions are discussed and recorded using the visual representation.

Close to Proficient: Visual representation of growth is included in results once the post-assessment is scored. Some or few reflection questions are discussed and recorded using the visual representation.

Far from Proficient: Visual representation of growth is not included in results. Reflection questions are not discussed and/or recorded.

Evidence: Data-Based Decision Making Excel/Word Template

Page 98: Data-Based Decision Making

Results Indicators: Next Steps

Use the results from the DBDM Practice Profile dialogue to:
1. Assess your team/building's current knowledge and implementation fluency with Results Indicators
2. Determine possible next steps:
Identify teacher behaviors for implementation fidelity of the practice/strategy selected
Identify student behaviors that demonstrate knowledge or application of the Priority Learning Target
Identify “look fors” in student work that will demonstrate knowledge of and ability to apply the Priority Learning Target

Page 99: Data-Based Decision Making

DBDM Step 6: Monitor

Page 100: Data-Based Decision Making

DBDM Process (cycle):
1. Collect & Chart Data
2. Analyze & Prioritize
3. SMART Goal
4. Instructional Decision Making
5. Determine Results Indicators
6. Monitor

Page 101: Data-Based Decision Making

Monitoring is an ongoing process, carried out by educators throughout the entire data-based decision making cycle, in which student performance (effect data) and adult behaviors (cause data) are observed, measured, and recorded in order to make decisions about progress, successes, and challenges and to provide feedback about next steps.

Page 102: Data-Based Decision Making

Why Monitor?

To engage in a continuous improvement cycle.
Monitoring allows educators to reflect on their professional practice.
Monitoring allows for mid-course corrections.
Monitoring allows for short-term wins.
Through lessons learned, monitoring leads to next steps.
Monitoring ensures fidelity of implementation.
Monitoring must consider both cause and effect data.

Page 103: Data-Based Decision Making

Monitor: Process

Cause:
Teachers administer the CFA with fidelity.
Teachers collect and chart data appropriately.
Teachers analyze and prioritize the results of the CFA.
Teachers develop a S.M.A.R.T. Goal.
Teachers determine the Effective Teaching and Learning Practice.
Teachers support each other in the use of the practice.
Teachers describe the implementation of the practice (frequency, effectiveness, feedback, celebrations/challenges).

Effect:
Teachers examine student work samples to provide evidence of implementation of the practice and to determine its impact.
Teachers discuss the effectiveness of the practice (continue, modify, or stop).

Page 104: Data-Based Decision Making

Monitoring Components

Monitoring Process & Practices: Sources of Data to Monitor; Individual(s) Responsible; Timeline
Evaluate the DBDM Process: We Planned These; We Achieved These
Apply What Was Learned: We Learned; We Will Replicate

Page 105: Data-Based Decision Making

Page 106: Data-Based Decision Making

Monitor: Practice Profile

Missouri Collaborative Work Practice Profile
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Data-Based Decision Making Process

Essential Function 6: Educators design ongoing monitoring of results (monitor, reflect, adjust, repeat)

Exemplary (Ideal Implementation): Visual representation of growth is included in results once the post-assessment is scored. Reflection questions are thoroughly discussed and recorded using the visual representation.

Proficient: Visual representation of growth is included in results once the post-assessment is scored. Most reflection questions are discussed and recorded using the visual representation.

Close to Proficient (skill is emerging, but not yet at ideal proficiency; coaching is recommended): Visual representation of growth is included in results once the post-assessment is scored. Some or few reflection questions are discussed and recorded using the visual representation.

Far from Proficient (follow-up professional development and coaching is critical): Visual representation of growth is not included in results. Reflection questions are not discussed and/or recorded.

Evidence: Data-Based Decision Making Excel/Word Template

Page 107: Data-Based Decision Making

Monitoring: Next Steps

Use the results from the DBDM Practice Profile dialogue to:
1. Assess your team/building's current knowledge and implementation fluency with Monitoring
2. Determine possible next steps:
Establish standard reflection questions for each team to use as they monitor the implementation of the DBDM process and the teacher and student outcomes that result from it
Establish timelines for team sharing with the schoolwide leadership team

Page 108: Data-Based Decision Making

Coming Full Circle

Revisiting DBDM Essential Questions
Developing Action Steps
Embedding DBDM into Professional Practice

Page 109: Data-Based Decision Making

Essential Questions

How many students are succeeding in the subject(s) I/we teach?
Within those subjects, what are the areas of strength and weakness?
How can I/we establish and sustain a culture and process for strategic instructional decision making across our building, teams, and classrooms?

Page 110: Data-Based Decision Making

Practice Profile to Action Steps (Implementation Drivers)

Review your scoring on the Practice Profile for the 6 steps of DBDM:
Where are there strategic opportunities for your team/faculty to implement action steps that can move your DBDM process forward efficiently and effectively?
What are your team's/faculty's goals for DBDM during the current school year?
What job-embedded professional learning will need to take place once your team/faculty returns to your building?

Page 111: Data-Based Decision Making

Practice Profile to Action Steps (Implementation Barriers)

Review your scoring on the Practice Profile for the 6 steps of DBDM:
What action steps will your team/faculty need to implement within Step 6 (Monitoring) to increase the likelihood of fidelity of implementation?
How will your team/faculty respond to resistant team/faculty members?

Page 112: Data-Based Decision Making

Putting DBDM into Professional Practice

How to make DBDM the way your building/team does instructional decision making on a consistent basis.

Page 113: Data-Based Decision Making

DBDM Cycle

Pre-assess with a CFA, then follow DBDM steps 1-6.
Administer a mid-instruction CFA, then follow DBDM steps 1-6.
Administer a post-assessment CFA, then follow DBDM steps 1-6.
Select a new Learning Standard and repeat the DBDM Cycle.

Page 114: Data-Based Decision Making

Sample DBDM Team Schedule

Meeting 1
At the team meeting (first this): Select a common formative pre-assessment; agree on a common grading method.
In the classroom (then this): Administer the pre-assessment; evaluate it and sort students into proficiency groups.

Meeting 2 (Steps 1-5)
At the team meeting: Use pre-assessment results to chart data (collated before the meeting); prioritize student needs; set a SMART goal; select common instructional practices/strategies and determine results indicators; model implementation of the strategies.
In the classroom: Begin implementation of the Effective Teaching & Learning Practice and common instructional strategies; monitor progress; provide feedback to students.

Meeting 3 (Step 6)
At the team meeting: Examine student work samples; discuss implementation of the practices/strategies and make adjustments as needed; confirm the post-assessment date.
In the classroom: Continue implementation of the Effective Teaching & Learning Practice and common instructional strategies; monitor progress; provide feedback to students; administer the post-assessment.

Meeting 4 (Step 6 continued)
At the team meeting: Evaluate results, reflect on the cycle, celebrate growth; plan next steps.
In the classroom: Offer post-assessment feedback to students.

* Possible time frame to complete one full cycle of the PLC/Data Teams process: 1 month for DBDM teams that meet once per week; 2 months for DBDM teams that meet twice a month.

Page 115: Data-Based Decision Making

DBDM “Placemat”

Student Achievement Goal:

DATA TEAM TIMELINE (Team Actions / Date to be Completed):
Choose Learning Standard
Post- and Pre-Assessment Created
Pre-Assessment Administered
Data Team 6-Step Cycle; Teach the Learning Standard
Administer Formative Assessment
Data Team Steps 1-6 on Formative
Post-Assessment Administered
Data Team Step 6 on Post-Assessment

Celebration when goal is achieved:

Page 116: Data-Based Decision Making

DBDM “Placemat”

1st (first-ever meeting): Establish norms, understand the purpose of the data meeting, understand the 6 steps.

2nd (before instruction & the Data Team cycle): Establish norms, understand the purpose of the data meeting, understand the 6 steps.

3rd (before instruction, collaborate on these 5 steps):

1. Collect & chart data. Select your standard; create or select the pre-assessment; determine the criteria/rubric for the 4 performance groups (Proficient, Close, Far but Likely, Intervention); administer and score. Teams gather and display data from the formative assessment results by teacher and by the 4 performance groups. Student names are included. Through the disaggregation in this step, teams will be able to plan for the acceleration of learning for all students.

2. Analyze strengths & obstacles (performance errors & misconceptions). Identify the strengths and needs of student performance and then form inferences based on the data. Teams also prioritize by focusing on the most urgent needs of the learners. Include analysis for: Students Proficient or Higher (strengths and inferences / next steps and inferences); Students Close to Proficient (strengths and inferences / obstacles-errors and inferences); Students Far to Go (strengths and inferences / obstacles-errors and inferences); Intervention Students (strengths and inferences / obstacles-errors and inferences).

3. Establish goals. Teams collaboratively set incremental goals. These short-term goals are reviewed and revised throughout the data cycle. The percentage of ____________ (student group) scoring proficient or higher in _______ (content area) will increase from ________ (current % proficient or higher) to __________ (goal %) by the end of _________ (month, unit, quarter) as measured by _________ (assessment tool) administered on ________ (specific date).

4. Select common instructional practices. Teams collaboratively identify research-based instructional strategies. The determination is based on the analysis in step 2.

5. Determine results indicators.
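As a hypothetical illustration of how a team might fill in the SMART-goal sentence frame from the placemat, the sketch below plugs invented values into each blank (the student group, content area, percentages, period, tool, and date are all made up for this example, not taken from the slides):

```python
# Hypothetical example: filling in the SMART-goal sentence frame.
# All of the values below are invented for illustration.
SMART_FRAME = (
    "The percentage of {group} scoring proficient or higher in {content} "
    "will increase from {current}% to {goal}% by the end of {period} "
    "as measured by {tool} administered on {date}."
)

goal_statement = SMART_FRAME.format(
    group="3rd-grade students",        # student group
    content="reading comprehension",   # content area
    current=43,                        # current % proficient or higher
    goal=65,                           # goal %
    period="the quarter",              # month, unit, quarter
    tool="the post-assessment CFA",    # assessment tool
    date="May 15",                     # specific date
)
print(goal_statement)
```

A filled-in frame like this gives the team a single measurable sentence to revisit at each meeting of the data cycle.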

Alternate Meeting (between Meetings 3 and 4)

6. Monitor and evaluate: Discuss the strategies (are they working or not); bring student work to show evidence of the strategies working or not; share and model the strategies for fidelity; begin to create the next pre-assessments.

4th (after instruction, collaborate): Review post-assessment data. Step 6 is Monitor and Evaluate Results. If the goal was met, create or select the next pre-assessment and start the cycle again!

(Placemat data chart: for each teacher, the # and % of students in each performance group.)

Consider strategies for each performance group and/or misconception, and consider the components below:
Selected Instructional Strategy
Learning Environment
Time: frequency and duration
Materials for Teachers & Students
Assignments & Assessments: Where will students be required to use the strategy?

Page 117: Data-Based Decision Making

Antecedents
* ALL teachers trained in the selected Effective Teaching & Learning Practices
* Time for collaborative teams allocated and protected
* CFAs developed for Prioritized Standards by collaborative teams and used with fidelity by ALL teachers
* ALL teachers have “permission” to stop ineffective practices

Behaviors
* ALL teachers consistently implement the schoolwide-selected Effective Teaching & Learning Practices and the DBDM-team-selected instructional strategies with fidelity
* ALL teachers are part of a DBDM team and actively participate in “cycles” of DBDM

Consequences
* ALL students have equal access to high-quality curriculum and effective instruction, increasing their individual opportunities to achieve academic and behavioral success

A strategic process for improved instructional data-based decision making leads to increased student achievement.

Page 118: Data-Based Decision Making

Pre-Instruction Assessment and Mid-Assessment

Intention: Students growing in proficiency with Prioritized Learning Targets through strategic instructional decision making.

Page 119: Data-Based Decision Making

Pre-Assessment to Mid-Assessment (Data-Based Decision Making Excel template)

Header fields: District; School; Grade Level; Total Class Size; # SWD in Class; # Teachers; Content Area; Core Standard Addressed; # of SWDs not taking the CFA and why; Consultant Name; Number/Name of RPDC; Date Submitted to RPDC; Pre-Assessment CFA date administered and total possible score; Mid-Instruction CFA date administered and total possible score; 1st Instructional Practice/Strategy Used; Reteaching Instructional Practice/Strategy Used. PLEASE ATTACH THE FORMATIVE ASSESSMENT AND SCORING GUIDE.

Sample results (total possible score 10; 97 students, 25 SWD):

GROUP (score for level) | Pre: # All / # SWD / % All / % SWD | Mid: # All / # SWD / % All / % SWD | % Change: All / SWD
Proficient and Higher (9-10 points) | 17 / 3 / 18% / 12% | 14 / 3 / 14% / 12% | -3% / 0%
Close to Proficiency (7-8 points) | 24 / 4 / 25% / 16% | 18 / 5 / 19% / 20% | -6% / 4%
Far to Go, Likely to Become Proficient (5-6 points) | 40 / 10 / 41% / 40% | 35 / 12 / 36% / 48% | -5% / 8%
Intervention Students, Not Likely to Become Proficient (4 points or below) | 16 / 8 / 16% / 32% | 30 / 5 / 31% / 20% | 14% / -12%
TOTALS | 97 / 25 | 97 / 25 |

Notes:
Total possible score is the maximum points possible on the assessment.
SCORE FOR LEVEL is the score range it takes to be in that group (e.g., 3-4 on a rubric, 80-100%, or whatever your district has determined for each level).
# All Students is all students taking the assessment, including students with IEPs.
SWD (Sp Ed) = students with IEPs only, NOT 504, MELL, Title I, etc.
The #s on the 1st and reteaching assessments may differ, because students scoring Proficient and Above on the 1st may not be retested.
Shaded areas do NOT need to be filled.
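As an aside (not part of the official template), the group percentages and the pre-to-mid changes in a chart like the one above can be computed directly from the raw counts. A minimal Python sketch using the sample numbers from this slide; the group labels and function names are my own. Note that the change is taken on the unrounded percentages and rounded once at the end, which is how a move from 17 to 14 students out of 97 reads as 18% → 14% but a -3% change:

```python
# Illustrative sketch only: proficiency-group shares and pre-to-mid change.
GROUPS = ["Proficient and Higher", "Close to Proficiency",
          "Far to Go", "Intervention"]

def group_percentages(counts, total):
    """Each group's share of tested students, as unrounded percentages."""
    return [100 * c / total for c in counts]

def percent_change(pre_counts, mid_counts, total):
    """Percentage-point change per group; subtract first, round once."""
    pre = group_percentages(pre_counts, total)
    mid = group_percentages(mid_counts, total)
    return [round(m - p) for p, m in zip(pre, mid)]

pre_all = [17, 24, 40, 16]   # pre-assessment counts, all students (n = 97)
mid_all = [14, 18, 35, 30]   # mid-instruction counts, all students
for name, change in zip(GROUPS, percent_change(pre_all, mid_all, 97)):
    print(f"{name}: {change:+d}%")
```

Run on the slide's counts, this reproduces the chart's change column (-3%, -6%, -5%, +14% for all students).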

Page 120: Data-Based Decision Making

In the chart, green indicates movement up, red indicates movement down, and black indicates that some students moved up and some moved down into the group.

Page 121: Data-Based Decision Making

Page 122: Data-Based Decision Making

Pre-Instruction Assessment to Post-Assessment

Intention: Students growing in proficiency with Prioritized Learning Targets through strategic instructional decision making.

Page 123: Data-Based Decision Making

Pre-Assessment to Post-Assessment (Data-Based Decision Making Excel template)

Header fields: District; School; Grade Level; Total Class Size; # SWD in Class; # Teachers; Content Area; Core Standard Addressed; # of SWDs not taking the CFA and why; Consultant Name; Number/Name of RPDC; Date Submitted to RPDC; Pre-Assessment CFA date and total possible score; Post-Assessment date and total possible score; 1st Instructional Practice/Strategy Used; Post Instructional Practice/Strategy Used. PLEASE ATTACH THE FORMATIVE ASSESSMENT AND SCORING GUIDE.

Sample results (total possible score 10; 97 students, 25 SWD):

GROUP (score for level) | Pre: # All / # SWD / % All / % SWD | Post: # All / # SWD / % All / % SWD | % Change: All / SWD
Proficient and Higher (9-10 points) | 17 / 3 / 18% / 12% | 63 / 20 / 65% / 80% | 47% / 68%
Close to Proficiency (7-8 points) | 24 / 4 / 25% / 16% | 5 / 1 / 5% / 4% | -20% / -12%
Far to Go, Likely to Become Proficient (5-6 points) | 40 / 10 / 41% / 40% | 2 / 3 / 2% / 12% | -39% / -28%
Intervention Students, Not Likely to Become Proficient (4 points or below) | 16 / 8 / 16% / 32% | 27 / 1 / 28% / 4% | 11% / -28%
TOTALS | 97 / 25 | 97 / 25 |

Notes:
Total possible score is the maximum points possible on the assessment.
SCORE FOR LEVEL is the score range it takes to be in that group (e.g., 3-4 on a rubric, 80-100%, or whatever your district has determined for each level).
# All Students is all students taking the assessment, including students with IEPs.
SWD (Sp Ed) = students with IEPs only, NOT 504, MELL, Title I, etc.
The #s on the 1st and reteaching assessments may differ, because students scoring Proficient and Above on the 1st may not be retested.
Shaded areas do NOT need to be filled.

Page 124: Data-Based Decision Making

Page 125: Data-Based Decision Making

Practice Profile

Missouri Collaborative Work Practice Profile
Foundations present in the implementation of each essential function: Commitment to the success of all students and to improving the quality of instruction.

Data-Based Decision Making Process

Essential Function 6: Educators design ongoing monitoring of results (monitor, reflect, adjust, repeat)

Exemplary (Ideal Implementation): Visual representation of growth is included in results once the post-assessment is scored. Reflection questions are thoroughly discussed and recorded using the visual representation.

Proficient: Visual representation of growth is included in results once the post-assessment is scored. Most reflection questions are discussed and recorded using the visual representation.

Close to Proficient (skill is emerging, but not yet at ideal proficiency; coaching is recommended): Visual representation of growth is included in results once the post-assessment is scored. Some or few reflection questions are discussed and recorded using the visual representation.

Far from Proficient (follow-up professional development and coaching is critical): Visual representation of growth is not included in results. Reflection questions are not discussed and/or recorded.

Evidence: Data-Based Decision Making Excel/Word Template

Page 126: Data-Based Decision Making

Closing thoughts…

Page 127: Data-Based Decision Making

References

Blanchard, K. (2007). Leading at a higher level: Blanchard on leading and higher performing organizations. Upper Saddle River, NJ: Prentice Hall.

Gilbert, T. F. (1978). Human competence: Engineering worthy performance. New York, NY: McGraw-Hill.

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/

Horner, R. H., Sugai, G., & Todd, A. W. (2001). “Data” need not be a four-letter word: Using data to improve schoolwide discipline. Beyond Behavior, 11(1), 20-22.

Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers' ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Retrieved from http://www2.ed.gov/about/offices/list/opepd/ppss/reports.html

McIntosh, K., Horner, R. H., & Sugai, G. (2009). Sustainability of systems-level evidence-based practices in schools: Current knowledge and future directions. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 327-352). New York, NY: Springer.

Page 128: Data-Based Decision Making

References (continued)

Newton, S. J., Horner, R. H., Algozzine, R. F., Todd, A. W., & Algozzine, K. M. (2009). Using a problem-solving model to enhance data-based decision making in schools. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 551-580). New York, NY: Springer Science & Business Media.

Reeves, D. B. (2009). Leading change in your school: How to conquer myths, build commitment, and get results. Alexandria, VA: Association for Supervision and Curriculum Development.

Rosenholtz, S. J. (1991). Teachers' workplace: The social organization of schools. New York, NY: Teachers College Press.

Sprague, J., Walker, H., Golly, A., White, K., Myers, D. R., & Shannon, T. (2001). Translating research into effective practice: The effects of a universal staff and student intervention on indicators of discipline and school safety. Education & Treatment of Children, 24(4), 495-511.