MDE Program Evaluation Tool January 13, 2015 MAISD




MDE Program Evaluation Process

MDE Program Evaluation Tool
January 13, 2015
MAISD

Agenda
- Welcome
- The Why, When, What, & How of Program Evaluation
- Implementation Science, School Improvement, and Program Evaluation
- Practice with the Program Evaluation Tool
- Q & A
- Finding the Good!


Program Evaluation Tool
Find your cartoon partner
- What's your self talk?
- What's your affirming talk with your staff?
- What affirmations might your staff need?


Learning Outcomes for Schools and Districts
- Understand the role of program evaluation within the continuous improvement process
- Understand the purpose, benefit, and process of program evaluation
- Learn about the MDE Program Evaluation Tool and resources to help evaluate a strategy/program/initiative/reform strategy

These are the outcomes for the trainings for schools and districts. Review with participants and check for clarification.

Making Connections
At the top of an index card, identify a hobby, sport, or activity in which you enjoy participating. Then identify the following:
1. What would you have to do to be ready to participate?
2. What knowledge and/or skills would you need?
3. What opportunity would need to be present?
4. How would you know if you were carrying out the activity in the way it was intended?
5. What would be the result if you were skilled at the activity?

Distribute large index cards or something else on which participants can write. Have them follow the directions on the slide. Once participants have answered the questions, have them get up and form groups of three or four. In the groups, have them share their answers to the individual questions, then look for common themes among their answers. Have participants return to their tables and give them an opportunity to share themes with the large group. Have participants save their cards for discussion at the end of the training.

Before moving on, ask participants to identify the key words/phrases in each of the five questions:
- Ready
- Knowledge/skills
- Opportunity
- Carry out in the way it was intended
- Results
These key words are also integral to the evaluation process.

Activity: Why, How, What?

Why:
- Why is it important to strategically implement, monitor, and evaluate the strategy/program/initiative/reform strategy?

How:
- How will we communicate the plan to all stakeholders so that they clearly understand and own their roles in implementation?
- How will we ensure implementation with fidelity?
- How are the implementation and impact of your plan monitored and evaluated?

What:
- What will each strategy/program/initiative/reform strategy look like when it is implemented with fidelity?
- What is the expected impact on student achievement?

https://www.youtube.com/watch?v=IPYeCltXpxw

While it is important for all stakeholders to be able to answer the How and the What about any strategy/program/initiative we are undertaking, it is more important that they all be able to answer Why you are doing what you are doing. Play the first five minutes of the Simon Sinek video. Then have participants individually read and reflect on the Why question on the screen and briefly discuss it in their groups. Following group discussion, bring the conversation back to the large group and have participants share out. Optional: repeat the process for the How and the What.

Why:
- Why is it important to strategically implement, monitor, and evaluate the School Improvement Plan?

How:
- How will we communicate the plan to all stakeholders so that they clearly understand and own their roles in implementation?
- How do we build ownership of the strategy and the plan?
- How will we ensure implementation with fidelity?
- How is the implementation of your plan monitored and evaluated?
- How is the impact of your plan monitored and evaluated?

What:
- What will your school look like when this plan is implemented with fidelity?
- What is the expected impact on student achievement?

Why is MDE Requiring Program Evaluation?
- To positively impact student achievement and close gaps for subgroups
- To ensure that high-quality planning, implementation, and evaluation are part of the Continuous Improvement Process
- To ensure ongoing engagement of multiple stakeholders (students, teachers, parents/community, administrators) in the planning and evaluation process
- To maximize the use of resources to impact student learning
- To provide documentation of program implementation to inform future decision-making
- To meet state and federal requirements
- To align with the School Improvement steps and processes that are essential to continuous improvement

Improve the planning and implementation of strategies/programs/initiatives/reform strategies. Examine the impact of the strategy/program/initiative/reform strategy on student achievement, as well as the process used to achieve that impact (Readiness, Knowledge/Skills, Opportunity, Fidelity of Implementation).

Enable stakeholders to identify and document successful activities/processes and the reasons why they are successful. Enable stakeholders to identify and document unsuccessful activities/processes and to understand the reasons that implementation might not have been successful. Involve and engage all stakeholders (students, teachers, parents/community, administrators) in the evaluation process.

Improve management of resources and demonstrate program effectiveness to all stakeholders (formula as well as competitive funding).

Document program development, implementation and outcomes to help ensure successful replication.

State and Federal Requirements
- Annual evaluation of the implementation and impact of the School Improvement Plan
- Modification of the plan based on evaluation results
- Annual evaluation of all federal programs' effectiveness and impact on student achievement, including subgroups
- Modification of the plan based on evaluation results

ISDs/RESAs are required by PA 25 to provide technical assistance to schools and districts to develop annual evaluations. ESEA requires annual evaluations of programs funded by federal programs such as Title I, Parts A, C, and D; Title II; and Title III.

Michigan | Federal
Share the federal mandates listed below with participants and use the citations as needed:

Guided by evidence-based practices, ESEA requires all states and their districts (subgrantees) to progress-monitor and evaluate the effectiveness of strategies/initiatives/programs/reforms funded by federal monies in order to examine their impact on students' academic achievement. For example:

- Schoolwide programs must include annual evaluation as part of the annual cycle for examining successful implementation and impact on the success of all subgroups;
- Plans of targeted assistance schools must be progress-monitored and evaluated for impact on the achievement of targeted students.

[Title I, Part A: 34 CFR 200.26(c); Title I, Part A, Sec. 1115(c)(2)(B) (Targeted Assistance); Sec. 1114(b)(1)(B)(iii)(II) (Schoolwide)]

Title III English Language Acquisition Program (English learners) requires districts to fulfill the program improvement responsibilities, annually evaluate the English learner program, and use ELs' academic and language outcomes to determine action steps for restructuring, reforming, and improving all relevant EL programs, activities, and operations relating to language instructional education programs and academic content instruction. [ESEA Sec. 3115(a)(3) and 3121]

Title I, Part C Migrant Education Program requires conducting a comprehensive needs assessment and establishing clear goals whose implementation is monitored to ensure impact on migrant students' achievement. Local migrant programs are required to conduct an annual program evaluation to determine the impact of programs and services on the academic achievement of migrant students. [Sections 1301(4); 1304(b)(1, 2); 1306(a, c, d)]

Title I, Part D, Subpart 1 or 2: Neglected and Delinquent Programs. Entities receiving these funds are required to be evaluated annually for impact on eligible students' achievement and program outcomes. [Section 1432(a)(1-5)]

Title II: The district must have a written process in place to evaluate how Title II, Part A activities will impact student achievement; Title II program services must be evaluated annually for effectiveness and impact on student achievement. [ESEA Section 2122(b)(9)(D)] The Title II, Part A Class-Size Reduction initiative must follow Michigan Department of Education policy requirements, and the district must have a process in place to evaluate the effectiveness of the initiative. [ESEA Sections 2122(b)(1)(B) and 2122(b)(2)]

If asked whether priority schools are required to use the MDE Evaluation Tool, respond by saying that Years 2 and up priority schools are required to use the Tool. Year 1 schools could use it more as a planning tool, as they haven't implemented their plan yet. SRO is looking at it as a way to assess the readiness of schools to implement their plans.

Note on slide: The picture to the left is an insert with a summary of the NCLB requirement and an excerpt of Michigan's Revised School Code.

Program Evaluation Timeline
District/School Improvement Plans for 2014-2015:
- Include program evaluation activities to support Program Evaluation as part of the Continuous Improvement Process
- Implement Program Evaluation activities throughout the 2014-2015 school year

Summer 2015 and Beyond
- Sustain professional learning by reconvening trainers to discuss successes and challenges and to develop the required follow-up training materials and support systems

June 30, 2015: Program Evaluation submitted in ASSIST
A completed program evaluation using the MDE Program Evaluation Tool will be required for submission of the Consolidated Application for 2015-2016.

Read through the timeline and address any items that need clarification. Spell out the acronyms:
- OFS = Office of Field Services
- ISD = Intermediate School District
- SIFN = School Improvement Facilitators Network
- OEII = Office of Education Improvement and Innovation
- MICSI = MI Continuous School Improvement
- LEA = Local Educational Agency
- SRO = School Reform Office

What to Evaluate?
Schools are required to select one:

- strategy/reform strategy
- program
- initiative

that would have the greatest impact on student achievement and close the achievement gaps.

Any school and/or district receiving state or federal funds is required to do a program evaluation. The suggestion is that schools evaluate the program/initiative that has the greatest impact on student achievement and on closing the achievement gap.

Notes from FAQ #4: What are districts/schools required to evaluate? Districts/PSAs and schools are required to submit the completed evaluation of one evidence-based strategy/program/initiative that would make the greatest impact on student achievement and close achievement gaps. When schools/districts choose a program to evaluate, they must select a program that is funded with one or more of the following supplemental federal or state funds (Title I, Part A; Title I, Part C; Title I, Part D; Title II; Title III; Title VI; Title X, Section 31a). The inclusion of general funds to support the program is most appropriate; however, the program submitted for evaluation should not be funded solely with general funds.

5. How does the district/school determine what to evaluate? The decision on what should be evaluated is made at the district level for district initiatives and at the school level for school initiatives. The district or school should ask the following question to make this decision: What is the most important initiative that we are undertaking to accelerate achievement and close achievement gaps for our unique student population? The decision to select the evidence-based strategy/program/initiative that would most likely make the greatest impact on student achievement should be based on the triangulation of multiple sets of data, including local or state student achievement. School and district teams are highly encouraged to coordinate their state, local and federal funds to support the program of their choice to ensure its successful implementation and, therefore, impact on students. If the district/school decides, based on achievement data, that a different initiative would make the greatest impact, the evaluation would be submitted for the newly-selected strategy/program/initiative.

What to Evaluate?
Districts are required to select one:

- strategy/reform strategy
- program
- initiative

that would have the greatest impact on student achievement and close achievement gaps.


Districts must do an evaluation if they have a district Title I set-aside for districtwide initiatives, or if they receive Title II, Title III, and/or Title I, Part C funds.

If districts do not receive federal funds, they are encouraged to use the Tool as best practice. Schools would still be required to evaluate under PA 25, and districts not receiving federal funds are responsible for submitting all school-level plans.

What to Evaluate

Longstanding or New
District- and School-Specific

Resource Allocation: Time, Effort, Cost

The MDE Program Evaluation Tool can be used with a wide variety of strategies, programs, initiatives, or reform strategies. It can be used with those that are longstanding or new, schoolwide or content/grade-specific, and purchased or locally developed, as long as they are research-based. It is also suggested that schools and districts evaluate one strategy, program, initiative, or reform strategy at a time.

If asked: If a strategy, program, initiative, or reform has a number of components, you might consider evaluating the individual/unique components within the steps and sub-questions listed under the 4 questions while completing the program evaluation.

Where is program evaluation in the Continuous Improvement Process?

First, let's look at where the MDE Program Evaluation Tool fits into the School Improvement Process. This tool fits into the Do stage of the School Improvement Process. It should be noted that evaluation is not a one-and-done task. The process is cyclical, and evaluation data should inform the next cycle of planning.

Think about the Continuous Improvement Process

This short video clip provides schools with an opportunity to view what has frequently been the practice of school improvement, or "school improvement on the fly." How might this compare to what is meant by continuous school improvement? How might this compare to their experience in their own buildings/districts?

Note on slide: Click on the graphic to start the hyperlink while in slide show mode: http://goanimate.com/videos/0rTPd5VVqIFo?utm_source=emailshare&refuser=0l2Xsy2JuwlE

Slide: High-Quality Evaluation depends on High-Quality Planning!

The evaluation process actually begins with the planning process. Strategies/programs/initiatives/reform strategies that are well-planned will be much easier to evaluate. This training will examine the elements of high-quality planning before reviewing the actual Program Evaluation Tool.

Are the RIGHT PEOPLE doing the RIGHT THINGS in the RIGHT WAY at the RIGHT TIME for the benefit of students?

High quality planning is designed to ensure that the answer to the question on this slide is Yes.

Where is program evaluation in the Continuous Improvement Process?

Planning is the third stage of the Continuous Improvement Process, following Gather and Study and preceding Do.

Planning for Powerful Implementation
Implementation Science: Stages and Drivers

The 5 Stages of Implementation

Focus: Should we do it? Why are we doing it?
- Stage: Exploration/Adoption. Description: Decision regarding commitment to adopting the program/practices and supporting successful implementation.

Focus: Work to do it right!
- Stage: Installation. Description: Set up infrastructure so that successful implementation can take place and be supported. Establish team and data systems, conduct audit, develop plan.
- Stage: Initial Implementation. Description: Try out the practices, work out details, learn and improve before expanding to other contexts.

Focus: Work to do it better!
- Stage: Elaboration. Description: Expand the program/practices to other locations, individuals, times; adjust from learning in initial implementation.
- Stage: Continuous Improvement/Regeneration. Description: Make it easier, more efficient. Embed within current practices.

The 5 Stages of Implementation are defined here. During Exploration, teams are asking, "Should we do it? Why are we doing it?" During Installation and Initial Implementation, you are working to do it right. When you reach the stages of Elaboration and Continuous Improvement/Regeneration, you are working to do it better. From exploration to elaboration takes 3-5 years.

Important for MPS: you have buildings at various points on this continuum. The elementaries were all MiBLSi schools and had Reading First grants, which is an MTSS approach to teaching literacy. All schools will need to be aware of where they are in implementing MTSS and be mindful of the focus of their tasks: to explore, to work to do it right, or to work to do it better.

RtI/MTSS implementation is not an event; it is a continuous process. We are never really there and done, because the needs of students and staff are always changing, and we continue to engage in the problem-solving process in order to meet needs.


[Diagram: the What, Why, and How of implementation]
- What: Program/Initiative (a set of practices implemented with fidelity)
- Why: Successful Student/Family Outcomes
- How: Staff competency to support students/families with the selected practices

[Diagram: Implementation Drivers (Competency, Capacity, Leadership; Technical and Adaptive; Integrated and Compensatory), adapted from Fixsen & Blase, 2008. Leadership: ability to provide direction/vision of the process. Capacity: organizational capacity to support staff in implementing practices with fidelity.]

There are three categories of Implementation Drivers: Leadership, Capacity, and Competency. When these core components are in place, they provide the support for a successful implementation that will be sustained.

Leadership Drivers are mechanisms in place to provide direction/vision of process. These drivers include vision, management coordination, and facilitative leadership.

Capacity Drivers are mechanisms that support the organizational capacity to support staff in implementing practices with fidelity. These drivers include decision support data systems, information, resources, and incentives.

Competency Drivers are mechanisms that help to develop, improve, and sustain one's ability to implement an intervention to benefit students. Competency Drivers include Selection, Training, and Coaching.

**This slide is animated! Once on the slide, click once and the why, what, and how will come up. Describe for participants that the what is MiBLSi/RtI, and the why we do it is to increase successful student outcomes. The Implementation Drivers support the how of accomplishing the work. Click again and the explanation for competency will come up. The third click will bring up the explanation for capacity, with the fourth click bringing up the explanation for leadership. Click again and a blue circle will appear around the words Adaptive and Technical. This requires some explanation. With these drivers, we need to provide both adaptive and technical support. Technical support ensures that those implementing the innovation (MiBLSi/RtI) have the necessary technical knowledge and skills to effectively carry out the practice. Adaptive support addresses the development of adaptive skills to manage the change and feelings of loss, incompetence, or disloyalty.

A final click will bring up a blue circle around integrated and compensatory. All of these components are integrated, meaning one driver influences another (to maximize their influence, as stated in the NIRN brief). They are also compensatory, meaning that if one driver is weak, another can support and compensate for that weakness, at least for a short time.

[Diagram: Implementation Drivers and the MDE Program Evaluation questions. Driver labels: LEADERSHIP (Technical, Adaptive); COMPETENCY (Selection, Training, Coaching, Performance Assessment); ORGANIZATION (Data-Driven Decision Making, Leadership Support, System Support)]

MDE PROGRAM EVALUATION:
- What is/was the readiness for implementing the program/initiative/strategy or activity?
- Do/did the participants have the knowledge and skills to implement the program?
- What is/was the program's impact on students?
- Is/was the program implemented as intended?
- Is/was there opportunity for high-quality implementation?

It is important, when planning for the implementation of any strategy, program, initiative, or reform, to keep in mind the three implementation components, or drivers. Research tells us that these three components (leadership, competency, and organization) are critical elements of any effective implementation and must be considered throughout the planning and implementation process. They must also be considered when making adjustments to implementation so that fidelity is maintained, resulting in positive evaluation results.

These drivers correspond with the five questions of the MDE program evaluation process. Specifically, from this graphic you can see that, to determine the overall impact of a strategy/program/initiative/reform strategy on students, you must consider whether or not it was implemented as intended. If not, you then address competency, leadership, and organization by asking the remaining three questions: Was there readiness for implementation? Did the participants have the knowledge and skills to implement the program? Was there the opportunity for high-quality implementation? These questions, considered within the context of the three implementation drivers, will help you determine your next steps.

Does your plan include activities to monitor and evaluate?

- Adult Focused: Monitor Implementation; Evaluate Implementation
- Student Focused: Monitor Impact; Evaluate Impact

It is important to remember that we must monitor and evaluate both adult implementation and student impact.

Slide: Planning: How will we ensure...? (plan forward)
Get Ready:
1. Readiness?
2. Knowledge/Skills?
3. Opportunity?
Implement:
4. Implement with Fidelity?
Monitor/Evaluate:
5. Impact on Students?
Evaluation: To what extent was there/did we...? (looking back from "Here now!" to evaluate impact)

This slide again reminds us of the connection between planning and evaluation. During the planning phase, we identify activities that will ensure readiness, knowledge and skills, opportunity for implementation, fidelity of implementation, and positive impact on students, so that when we ask during the evaluation phase to what extent we did these things, the results will be positive.

Break: Please be ready to learn in 10 minutes.

Questions for Evaluation

- Impact on students?
- Knowledge and skills?
- Readiness?
- Opportunity?
- Implemented as intended?

The main questions used in the MDE Program Evaluation Tool are:

- What was the impact of the strategy/program/initiative on students?
- What was the readiness for implementing the strategy/program/initiative?
- Did participants have the knowledge and skills to implement the plan?
- Was there opportunity for high-quality implementation?
- Was the strategy/program/initiative implemented as intended?

The Program Evaluation Tool has 5 sections and a set of conclusions:

IMPACT: What is the IMPACT of the strategy/program/initiative on STUDENT ACHIEVEMENT?
1. What is the READINESS for implementing the strategy/program/initiative?
2. Do participants have the KNOWLEDGE AND SKILLS to implement the program?
3. Is there OPPORTUNITY for implementation?
4. Is the program IMPLEMENTED AS INTENDED?

The MDE Program Evaluation Tool has five sections: Impact, followed by Readiness, Knowledge & Skills, Opportunity, and Implemented as Intended.

Impact
What was the impact of the strategy/program/initiative on student achievement?

IN AN IDEAL STRATEGY/PROGRAM/INITIATIVE, the school's achievement results on state or district-wide assessments meet proficiency standards. Achievement gaps between each of the relevant subgroups and their counterparts have been narrowed as proposed in the School Improvement Plan's measurable objectives. Interim assessment results indicate progress toward proficiency for all students to the satisfaction of all stakeholders.

For evaluating a strategy/program/initiative after implementation, users start with IMPACT, i.e., What was the impact of the strategy/program/initiative on student achievement? The document includes a description of the ideal impact.

There are also three sub-questions about the evidence that school/district teams have collected and are required to complete in this section of the Tool in ASSIST. The Tool requests users to identify both the evidence used and what the evidence shows. There are sample evidence items in the electronic version of the Evaluation Tool for district/school teams to refer to.

Note on slide: Each box transitions in with a mouse click.

Impact
- What is the evidence and what does it show regarding achievement of the measurable objective for all students when compared to baseline state and local data?
- What is the evidence and what does it show regarding achievement of the measurable objective for subgroups and their counterparts when compared to baseline state and local data?
- What is the evidence and what does it show regarding stakeholder (staff, parents, students) satisfaction with the results?


Note on slide: Each box transitions in with a mouse click.

Now What? (If Objectives Were Met)
Conclusion: Were objectives met?
- Yes: Determine if the strategy/program/initiative should be continued or institutionalized.
- No: Analyze further using the other 4 questions.

Conclusion: Were objectives met?
- Yes: Determine if the strategy/program/initiative should be continued or institutionalized, and use the 4 questions for further study.
- No: Complete the conclusion section and analyze further using the other 4 questions.

Based on the evidence from the first question, decide whether the objectives were met. Note that the term "objectives" refers to the measurable objectives in the continuous improvement plan. If the measurable objectives were met, the tool asks about conclusions that might be drawn: the school/district teams would need to decide if the initiative should be institutionalized. School and district teams should identify the practices that led to their success in achieving the objectives in order to sustain that success.

Note on slide: The overlay for the Yes box will zoom out by clicking the mouse. As the zoom is occurring, both the yes/no responses will be hidden by a white rectangle. It is set up as an untimed animation. View from Slide Show, From Current Slide to see the full effect.

If Objectives Were Met
Conclusion: If the objectives were met, should the strategy/program/initiative be continued or institutionalized?
- What is the evidence and what does it say regarding whether this was the right strategy/program/initiative to meet your needs?
- What is the evidence and what does it say regarding whether the benefits of the strategy/program/initiative are sufficient to justify the resources it requires?
- What adjustments, if any, might increase its impact while maintaining its integrity?
- What is needed to maintain momentum and sustain achievement gains?
- How might these results inform the School Improvement Plan?

Now What?
Conclusion: Were objectives met?
- Yes: Determine if the strategy/program/initiative should be continued or institutionalized, and use the 4 questions for further study.
- No: Complete the conclusion section AND analyze further using the other 4 questions.

If the measurable objectives were not met, the tool requires respondents to complete the Conclusion section AND use the other FOUR questions to analyze the possible reasons for not meeting the objectives, as specified in the next slide.

School/district teams are required to address the 4 questions in order to identify reasons for not meeting the objectives. Some ISD members suggested that users ask these questions in sequence, from 1 to 4 (Readiness, Knowledge & Skills, Opportunity, and Implemented as Intended); others suggested addressing these questions backward, starting with question 4 (implementation) and ending with question 1.

Note on slide: #4 to #1 fly in one after the other with a click of the mouse.


If Objectives Were Not Met
Conclusion: Should the strategy/program/initiative be continued or institutionalized?
- What is the evidence and what does it say regarding whether this was the right strategy/program/initiative to meet your needs?
- What is the evidence and what does it say regarding whether the benefits of the strategy/program/initiative are sufficient to justify the resources it requires?
- What adjustments, if any, might increase its impact while maintaining its integrity?
- What is needed to maintain momentum and sustain achievement gains?
- How might these results inform the School Improvement Plan?

Each section begins with a description of an ideal program.
1. What is the READINESS for implementing the strategy/initiative/program?

IN AN IDEAL PROGRAM, stakeholders are well-prepared to implement the program. They have read and can articulate the research foundation, and regularly use the terms in conversation with each other, students, and parents. Staff, students, and parents express a high level of interest in, support for, and commitment to the program. Specific concerns have been identified and solutions have been planned/implemented. Staff are able to seamlessly integrate the program within the context of other building/district initiatives.

Each section has an ideal scenario about the program

Key elements:
- Knowing and articulating the research
- Commitment to the initiative
- Solutions to challenges that may arise
- Integration within current initiatives by all stakeholders

Each section has 3-5 sub-questions that ask for relevant evidence.
What is the READINESS for implementing the strategy/initiative/program?

What evidence do you have that stakeholders can articulate and believe the research behind the decision to implement the program?

What evidence do you have that stakeholders are committed to the program with both hearts and minds?

What evidence do you have that stakeholder (staff, parent, student) concerns about the program have been identified and addressed?

What evidence do you have that staff are able to integrate this program with other existing initiatives?

Each question has sub-questions. Users are required to respond to each sub-question with evidence.

Each section suggests possible data sources.
What is the READINESS for implementing the strategy/initiative/program?

What evidence do you have that stakeholders can articulate and believe the research behind the decision to implement the program?

What evidence do you have that stakeholders are really committed to the program with both hearts and minds?

Possible Evidence:

- data collection plan
- data analysis work
- stakeholder survey results
- suggestion box ideas collected
- SI team agendas
- focus group interviews
- meeting agendas/minutes
- books/papers about the program
- staff surveys
- SI Plan elements
- professional development materials
- conference/workshop attendance

District and school teams are required to include examples of evidence items and data sources they used to address each sub-question. Examples of data sources are listed in the electronic copy of the MDE Program Evaluation Tool.

Finally, the section asks for a self-rating and for action steps the data suggest.
What is the READINESS for implementing the strategy/initiative/program?

4. Stakeholders are fully prepared.
3. Support and commitment are generally high, but some concern or work remains.
2. Some promising signs are mixed with major gaps in knowledge or confidence.
1. Interest and/or commitment are low so far.

What action steps are needed to increase readiness to undertake the program?

Based on the identified evidence supported by data sources, the school/district teams are asked to rate themselves on a scale of 4 (high) to 1 (low), then complete action steps that are guided by the evidence items and data sources they identified.

Sample Program Evaluation:
At tables, read and discuss your assigned section:
#2 Knowledge & Skills
#3 Opportunity
#4 Implemented as Intended

In comparing the 2 examples:
- What do you notice about the evidence, ratings, and next steps between the 2 examples?
- What might be the conversations and thought processes that need to occur to identify the evidence and to make the decisions about ratings and conclusions you are seeing?
- How might this section help identify SIP activities?
Be ready to report out!

Did Not Meet the Objectives
What was the READINESS for implementing the strategy/program/initiative?

Readiness?

Again, review and reiterate how the 4 additional questions flow, starting with the first question: What was the READINESS for implementing the strategy/program/initiative? This slide (57), along with the next three slides (58-60), highlights the main questions in the tool. It should be pointed out that the tool follows a similar pattern for all questions and sub-questions, as explained in slides 53-56.

Did Not Meet the Objectives
Did participants have the KNOWLEDGE AND SKILLS to implement the strategy/program/initiative?

Readiness? Knowledge and skills?

Secondly, did participants have the KNOWLEDGE AND SKILLS to implement the strategy/program/initiative?

Did Not Meet the Objectives
Was there OPPORTUNITY for implementation?

Readiness? Knowledge and skills? Opportunity?

Next, was there OPPORTUNITY for implementation?

Did Not Meet the Objectives
Was the strategy/program/initiative IMPLEMENTED AS INTENDED?

Readiness? Knowledge and skills? Opportunity? Implemented as intended?

Finally, was the strategy/program/initiative IMPLEMENTED AS INTENDED?

Questions for Evaluation

- Impact on students?
- Knowledge and skills?
- Readiness?
- Opportunity?
- Implemented as intended?

The main questions used in the MDE Program Evaluation Tool are:

- What was the impact of the strategy/program/initiative on students?
- What was the readiness for implementing the strategy/program/initiative?
- Did participants have the knowledge and skills to implement the plan?
- Was there opportunity for high-quality implementation?
- Was the strategy/program/initiative implemented as intended?

Jigsaw Activity
Using Program Evaluation Samples:
- Extended Day
- Parent Engagement
- Math
- Academic Language

Different sample at each table.

Jigsaw Activity
With your table group, read through the sample Program Evaluation Tool:
- What did you notice?
- What are your AH-HAs?
- How do the responses compare to the first sample we examined?
- What questions might you have about the tool and/or responses?

Alignment to School Improvement Plan
- How might the questions and responses in each section inform your school improvement planning?
- What might be the implications for strategies/activities for the building/district improvement plan?

Activities
Connection to SSR/DSR, Interim SA/SA
Getting Ready to Implement

Implement
Monitoring and Evaluating Implementation and Impact

How will we address the targeted areas in your Process Data (SPP)?

What areas in your process data have been identified as challenge areas during your comprehensive needs assessment process?

How will we ensure readiness for implementation?

How will we ensure that staff and administrators have the knowledge and skills to implement?

POSSIBLE ACTIVITIES
- Professional development around the strategy
- Purchase materials
- Planning for implementation: identify schedule for strategy use, personnel, mechanism for monitoring, rollout, etc.
- Communication vehicles

How will we ensure successful opportunity for and implementation of the strategy?

POSSIBLE ACTIVITIES
- Communication: to whom? How?
- Ongoing coaching?
- Observations?
- Instructional technology utilized?
- Activities to support at-risk students (for Title I students)*
- Parent Involvement*

*Required Components

How will we ensure the strategy is implemented with fidelity?

How will we monitor the impact on student achievement?

How will we evaluate the fidelity of implementation and impact on student achievement?

POSSIBLE ACTIVITIES
- Walkthroughs
- PLC/CASL meetings
- Documentation of effective implementation
- Documentation of impact
- Demonstration classrooms, videos, self-assessments
- Gathering achievement data

To ensure that planning actually takes place, it is important to include these as activities in your continuous improvement plan.

Slide: When is the Evaluation Submitted?

Required to Complete the Tool by June 30, 2015

Evaluation is an everyday occurrence. In education, not only is assessment in the classroom important but so is assessing the strategy/program/initiatives being implemented to help improve student achievement. Additionally, evaluation is a requirement for school improvement and federal Title programs; the evaluation process should be done with fidelity in order to obtain data to make the necessary adjustments and not just to meet the requirement.

Avoid These Pitfalls
- Evaluating federally-funded programs separately
- Inclusion of many strategies/unclear on strategy
- Selecting weak, non-robust action steps
- Not addressing questions 1-4 when the initiative did not meet the objective
- No evidence to support a high self-rating on the scale
- A list of assessments rather than actual data with pre-post analysis supporting progress or lack of progress
- Unclear, contradictory, or confusing conclusions
- Confusion regarding subgroups

When reviewing evaluations that have been submitted by local districts using the MDE Evaluation Tool, OFS consultants have noticed the need to clarify the following for school/district teams:
- Schools and districts are highly encouraged to coordinate efforts and funds when selecting an initiative that would make the greatest impact on student achievement. They can submit one evaluation for the one initiative that may be funded by several funding sources. Schools and districts are still required to meet the intent of, and serve eligible students for, each funding source.
- Include only 2-3 strategies in order to ensure success and avoid confounding elements, and be specific in identifying the strategies; avoid including many strategies or leaving the strategy unspecified.
- Action steps should be specifically aligned to the questions (1-4) and should be robust; avoid action steps that are not robust.
- Address questions 1-4 when the initiative did not meet the objective.
- Provide evidence to support a high self-rating on the scale.
- List the actual data used, with pre-post analysis results and figures supporting progress made toward achieving the objective or lack of progress.
- The conclusion following Impact must be clear, convincing, and quantifiable.
- Subgroups are part of every school community, since they include gender, ethnicity/race, Economically Disadvantaged, English Learners, and Students With Disabilities.
- Avoid lack of specificity in selecting a strategy. Example: "We are evaluating RtI."

Making Connections (reference to first activity)
At the top of an index card, identify a hobby, sport, or activity in which you enjoy participating. Then identify the following:
1. What would you have to do to be ready to participate?
2. What knowledge and/or skills would you need?
3. What opportunity would need to be present?
4. How would you know if you were carrying out the activity in the way it was intended?
5. What would be the result if you were skilled at the activity?

This is a follow-up activity to the one shared at the beginning of the session.

What if your hobby was kind of a flop? Think of a scenario in which that happened. At what stage did it fall apart? What might be the action steps you need to take to make the next time more successful?

Work Time
www.Advanc-Ed.org (ASSIST)
TEMPLATE: www.Advanc-Ed.org/partnership/MDE
- What program are you choosing to evaluate, and why? OR
- Skim back through the template, OR
- Start entering information, OR
- Create an action plan to accomplish tasks, OR . . .

Slide: Embedding PET into School Improvement Processes

Month: Action
- February 2015: Program/Activity refresher PD; student assessment collection; monitoring fidelity of program; release survey to stakeholders (staff, parents, students)
- March 2015: Student assessment collection; monitoring fidelity of program; collect and analyze survey results (possibly include in SPR)
- April 2015: Student assessment collection; monitoring fidelity of program
- May/June 2015: Final student assessment collection; analyze data; monitoring fidelity of program; review Program Evaluation Tool and progress; upload Program Evaluation Tool in ASSIST

This is a copy of the 2014-15 Program Evaluation Calendar. Take a moment to review what you should be doing in your district/schools to prepare for the required evaluation to be recorded in ASSIST by June 30, 2015.

Month: Action
- May/June 2015: Identify program/activity to evaluate for the 2015-16 school year; review tasks from the October-June list below; create strategies/activities for the 2015-16 School and District Improvement Plans to accomplish tasks
- October 2015: Student assessment collection; monitoring fidelity of program
- November 2015: Student assessment collection; monitoring fidelity of program; build stakeholder surveys based on guiding questions in the Program Evaluation Tool

- December 2015-January 2016: Student assessment collection; monitoring fidelity of program; review Program Evaluation Tool and progress; continue building surveys

Month: Action
- February 2016: Program/Activity refresher PD; student assessment collection; monitoring fidelity of program; release survey to stakeholders (staff, parents, students)
- March 2016: Student assessment collection; monitoring fidelity of program; collect and analyze survey results (possibly include in SPR)
- April 2016: Student assessment collection; monitoring fidelity of program
- May/June 2016: Final student assessment collection; analyze data; monitoring fidelity of program; review Program Evaluation Tool and progress; upload Program Evaluation Tool in ASSIST

It is also critical that the School Improvement Team structure opportunities to celebrate successes, no matter how small.

Celebrating successes reinforces valued performance and reminds the school community that, however challenging, school improvement results in improved academic performance.

One Voice One Plan

It is critical for teams to take the time to celebrate what is going well and to share these accomplishments with stakeholders.

Slide: One Voice One Plan
"However noble, sophisticated, or enlightened proposals for change and improvement might be, they come to nothing if teachers don't adopt them in their own classrooms and if they don't translate them into effective classroom practices." (unknown)

This quote is meant to emphasize that the true test of School Improvement is not whether the plan is a quality written document (although that is also important), but whether the plan is implemented, monitored, and evaluated in a high-quality way. School Improvement Plans that sit on the shelf and do not impact classrooms are of little value.

Slide: "The most important thing about assessment is that it promotes dialogue among faculty." -Mary Senter

"The important question is not how assessment is defined but whether assessment information is used." -Palomba & Banta

Thank you for participating today!

Trainers choose from several quotes and pictures.

Other options for quotes:
- "If you don't know where you are headed, you'll probably end up someplace else." -Douglas J. Eder, Ph.D.
- "Assessment of student learning demonstrates that the institution's students have knowledge, skills, and competencies consistent with institutional and program goals and that graduates meet appropriate higher education goals. The systematic assessment of student learning outcomes is essential to monitoring quality and providing the information that leads to improvement." -Middle States Standard XIV
- "The most important thing about assessment is that it promotes dialogue among faculty." -Mary Senter
- "The important question is not how assessment is defined but whether assessment information is used." -Palomba & Banta
- "One of the great mistakes is to judge policies and programs by their intentions rather than their results." -Milton Friedman

We appreciate all the hard work you do to have a positive impact on your students.
