Episode #336: From Math Busywork to Results: How to Monitor What Matters in Math Improvement

Jan 8, 2025 | Podcast | 0 comments

LISTEN NOW HERE…

WATCH NOW…

Are your district improvement plans for mathematics creating real impact, or are you just spinning your wheels without knowing if you’re getting closer to your goals?

If you’re tired of waiting until the end of the year to find out whether your professional development efforts and action plans worked in the area of mathematics, this episode shows you how ongoing monitoring can transform your approach. 

Discover how to pivot effectively, gather meaningful evidence, and truly align your efforts with measurable outcomes.

What you’ll learn:

  • Learn the secret to creating focused, high-impact action items that drive meaningful change in your district for mathematics improvement.
  • Discover practical strategies for gathering evidence that supports your math professional development goals throughout the year.
  • Find out how to honour educator and stakeholder voices while building trust and accountability in your math initiatives.

Tune in now to learn how to strengthen your district’s monitoring processes and ensure every effort leads to measurable success!

Attention District Math Leaders:

Not sure what matters most when designing math improvement plans? Take this assessment and get a free customized report: https://makemathmoments.com/grow/ 

Ready to design your math improvement plan with guidance, support, and structure? Learn how to follow our 4-stage process. https://growyourmathprogram.com 

Looking to supplement your curriculum with problem based lessons and units? Make Math Moments Problem Based Lessons & Units

Be Our Next Podcast Guest!

Join as an Interview Guest or on a Mentoring Moment Call

Making Math Moments That Matter Podcast

Apply to be a Featured Interview Guest

It will take less than two (2) minutes to book your Math Mentoring Moment call.

Book a Mentoring Moment Coaching Call

Take two (2) minutes to book your Math Mentoring Moment call and let’s work together to shake that math pebble out of your shoe!

Are You an Official Math Moment Maker?

Ensure that you follow, rate, and review on Apple Podcasts, Spotify and other platforms to show your support and ensure other math educators can find the show.
DOWNLOAD OUR HOW TO START THE SCHOOL YEAR OFF RIGHT GUIDE
Start your school year off right by downloading the guide that you can save and print to share with colleagues during your next staff meeting, professional learning community meeting or just for your own reference!

FULL TRANSCRIPT

Yvette Lehman: Alright Jon, we talk a lot about the importance of measuring. Something that’s really critical to our philosophy around supporting district improvement plans is the ability to measure our impact at the end of the year, to help us understand whether we’ve gotten closer to the goals we set out and the change we want to see.

 

But something we haven’t talked about as much on the podcast is this idea of ongoing monitoring. So we’re going to dig into that today and draw some parallels between the ongoing monitoring we need in our district improvement plans and the ongoing assessment we have in the classroom.

 

Jon Orr: Yeah, it’s an important part of the process we go through when we’re making our improvement plans. We’ve talked about the power of your mathematics vision, having this ultimate picture: this is what I want our math classes to look like, sound like, what our students are going to be doing. Like having this clear puzzle box.

 

Because imagine all these puzzle pieces are the things we do: we do this and we do that, we’ve been making these moves, we’ve got this resource, we were trying this routine. All of these are puzzle pieces. But having the completed puzzle box is really important, because it gives everyone a pathway, a vision of what we’re trying to get closer to. A lot of times we don’t have that, we don’t have a clear, this is what we’re all working towards, and all of a sudden you’re trying to put puzzle pieces together that may not fit.

 

But you also might be building different puzzles: you think you’re building this puzzle, but this teacher thinks they’re building that puzzle, and this administrator thinks they’re building another one. So for a sustainable program it’s important to all be working towards the same puzzle box. That’s why we have a vision, and also why we have some objectives, not too many, a few objectives we’re working on this year to keep moving towards this completed puzzle

 

and puzzle box that we’re all building towards. And then we often encourage, and insist, that the people we support on a regular basis create key results: smaller, very detailed, very specific goals we want to set for the year. By the end of this year, this is the change we’re looking to see in the area of these objectives, these focus zones. So we have that.

 

And our districts work towards creating that. It does take some process of asking, what do I want that change to look like? Because most times we don’t think that way. We just say, I’m going to slide these puzzle pieces around this year, and maybe we’ll get closer to this puzzle box, if I’ve even defined it. But what we’re saying is that key results get very, very detailed: what change needs to happen, and what do I want that change to look like

 

Jon Orr: by the end of the year. What outcomes do I want, specifically, around these zones? The key results are not checklist items. They’re not, I’m gonna do this, and I’m gonna have a training on this, and a training on that. It’s, this is the change I’m looking for in the area of fluency, or this is the change I’m looking for in the area of problem solving in my classrooms, or for resiliency. It’s the change from here to here, and I’m trying to define that change so that when I get there, I can see it.

 

And that’s kind of the key results. But what we’re gonna talk about here is even a little more detailed, because you can set your goals and say, I’m gonna measure this goal at the end of the year, and I know what we’re gonna look for when we get there. But during the year, what’s telling you that you’re on the right path or not? Or when you decide, I’m going to host this professional development session on this half day,

 

because I have teachers on that half day, and I decide this is the thing I’m gonna do because it helps me get closer to my goal. But are we measuring the impact of that thing? Usually we’re not capturing data around that. And that’s what we wanna talk about here specifically: why and how do we do that? What are we looking for? And how much data do we really have to collect to make sure we’re on the right pathway? Yvette, let’s get into it.

 

Yvette Lehman: So like you said, our key results at the end of the year are our way to measure the change we’re hoping to see, usually through observable behaviors, shifts in mindset, those types of things. But waiting until the end of the year to measure the change is like waiting until the end of the semester to give a summative assessment, having done no formative assessment throughout the entire learning journey.

 

And that’s what we’re talking about today: in order to create change, you’re going to put action steps into place. You’re going to say, for example, we’re going to host these PD sessions, we’re going to run coaching cycles, we’re going to have administrator sessions, we’re going to do walkthroughs, whatever your commitments are to support the change.

 

How can we put a process in place to have ongoing monitoring of the impact of each of those action steps? Because at the end of the day, just like formative assessment in the classroom, yes, of course, formative assessment is for the purpose of guiding instruction, but it’s also a check to make sure that our instruction is effective. So at the end of my lesson, if I do a consolidation journal and

 

nobody got the point, you know, the entire class reflects on my lesson and they didn’t get it, that’s probably an indication to me as the instructor that I need to change my approach, to shift my entry point. If my instruction is not reaching my target audience and having the intended impact, then I need to pivot and adjust.

 

Jon Orr: And again, we’re still talking about your classroom here: it might not necessarily mean that you need to pivot. What it might mean is, I need more time, or I need more emphasis in this area, or I need to do this one piece again. Even though it was the right move at the right time, doing it once wasn’t enough, and I need to reinforce it in this way, maybe using this technique now, because

 

I want my students to get to this learning outcome and show that they have this skill or this understanding, but I now realize that what I did there didn’t have exactly the intended outcome. A one-time shot didn’t give me the outcome I was looking for, and I now have to plan to do something else or something different, which is the same thing we’re saying about professional development. So if I decide that I’m going to

 

you know, put coaches into classrooms one day a week. They’re gonna get into a school and they’re gonna meet with teachers and it’s gonna be voluntary and they’re gonna get into that classroom and they’re gonna be helping that teacher. We have to be able to say like.

 

What is the intended outcome that that’s going to achieve? And how will I know? I have to put that formative assessment piece in place, because I need the data that tells me that technique, that structure I’m putting into place, is actually going to help me get closer to the end goal or the key results I’m after. And how will I know that if I wait until the end of the year? Is it worth waiting until the end of the year?

 

And this is what we help the districts we support do: think about whether the moves you’re making are actually creating the impact you want, or whether we’re just gonna wait and see if it happened. A lot of the decisions we make, especially about structure, which is really stage two of our four-stage process, come down to reusing the old structures everyone else used. It’s like, what have we done here? Well, we have coaches. Okay, how are we gonna implement those coaches? They’re gonna get into classrooms

 

Jon Orr: one day a week, maybe two days a week, back to back, I don’t know, maybe not, maybe so. And then it’s like, okay, we have two half days that we’re gonna get in front of teachers. And we just keep rolling, but we don’t actually measure the impact of these structures and what we’re doing with them. What we usually do, and what we hear districts usually do, is look at the student data at the end of the year to see their impact, which is,

 

in our opinion, the wrong way to do this. You’re wasting time, and you’re wasting years of real impact you could be having, if you’re not measuring throughout the year the things you want to see changed. When you say, I’m going to use this coaching cycle, you have to define how you will know that this coaching cycle, this structure, is working. And that’s one example, right? We have to do that with everything. Which all of a sudden is like, my gosh, how much measuring am I doing? But if we’re not answering that question, why are we doing it?

 

Yvette Lehman: Well, it’s interesting, you just reminded me, I was having a conversation with a district leader recently, and she had an aha moment when we were engaged in this conversation: this is backwards design. Professional development is no different. You just said we have to know if it worked, which means we need to know what success looks like. We have to have clearly defined it.

 

So going into our professional development session, let’s say a district-wide professional development day, we have to clearly define what success will look like at the end of that day. By the end of today, teachers will, participants will… can you write that criteria? And then, how will you know? That, I think, is the perfect question to ask when it comes to monitoring. If we engage in this work, this protocol, this structure,

 

how will we know that it has had the intended impact? We’re gonna talk about some examples of what this could look like, but at the end of the day, it’s really just about gathering voice from stakeholders. And that has two benefits. One, it’s going to help you monitor the impact of the choices you’ve made to support professional development in your district. Two, it’s going to honor your stakeholders.

 

Educators should have a voice in the professional development journey. If we’re not asking them, was this meaningful? Was this relevant? What was your big takeaway? What was your aha moment? Did you feel valued and seen? If we’re not asking stakeholders to give feedback on the process, then we’re really not including them. And you talk all the time about the trust we need to build with educators and stakeholders, parents, leaders. If we’re not asking questions and asking for feedback, what does that tell stakeholders about how we value their voice and their experience?

 

Jon Orr: Yeah, for sure. You hit the nail on the head there about making sure those voices are included, because if you’re not doing that, you’re just driving a wedge between the downtown office, with the professional development we create and choose to create, and the people participating in that professional development. And we wonder why teachers keep saying, I don’t have time for that, or, that doesn’t work for my students. And it’s really

 

because they see the disconnect between what you’re providing and what they’re actually doing in the classroom. And we do need to close that gap. Yvette, give us some examples. Specifically, if we want to talk about a structure, we’re talking about an if-then-how structure for monitoring the action items we’re doing throughout the year to create change in the zones we want to change. This is assuming we’ve already created

 

our key results: by the end of the year, we’re going to see this change, and we’re going to measure that change in this way. Which means you’ve already defined what success looks like for the year’s work on specific things. Now you’re going to ask, what am I doing during the year to create that change? Which brings up a list of what we call action items. Action items are things like professional development sessions,

 

PLC time, staff meetings, email newsletters, things you do in your job every single day to support teachers in the areas where you wanna see change. We need to answer the if-then-how statement for, I would say, everything, because otherwise why are you doing it? This applies even to curriculum choice. I’m gonna choose this curriculum to bring into my system or my district. We have to say,

 

if I choose this curriculum, then I will see this outcome. And the how is the part we’re really talking about today: how will I know? Because a lot of times we stop at the if-then. Okay, this is the curriculum, let’s choose it, let’s bring it in. We’re gonna have to do this, this and this to support it, we’re gonna have to help teachers with their scope and sequences around it, we’re gonna have to unpack it for teachers.

 

Yvette Lehman: How will I know for sure?

 

Jon Orr: You know, you’re listening right now going, I know all the stuff that has to happen when I do something like that. But then we have to go, okay, if we do all of that, how will I know that it worked? And we have to define that. Like you’re saying, we have to clearly define what it is we’re trying to achieve, and then reverse engineer: how will that one thing work? So let’s talk through some examples. I was just throwing out a bunch of things

 

just to give everyone an idea. Those are some of the things you’re doing, and that’s not all of them, but for everything you do, you have to be able to say, I have a piece of data, it could be anecdotal or it could be measurable, that tells me this thing is working in the ways that I want.

 

Yvette Lehman: Well, not to get off track here, Jon, but you just said all the things. It makes me think that we should be doing fewer things well, right? I would rather have three action items that I am fully implementing, with the monitoring and feedback loop, refining and improving them, than a hundred action items where I have no idea if any of them have an impact, because I’m not monitoring them.

 

So that’s a good point you brought up. Let’s get specific with a scenario. Let’s say that you have an objective around improving fluency in your district, and you have decided as a district to invest in, let’s say, a digital tool or platform that specifically supports improving student fluency. So basically your if-then is: if we purchase this digital tool,

 

and we support the implementation of this tool with fidelity and integrity in K to 5, then we’re going to see improved fluency for students. Okay, so let’s say that’s what we’re doing. One thing you would wanna monitor there is usage data. You need to know if it’s even being used in your district, and if it’s not being used, the question isn’t about

 

teachers not doing what they’re supposed to, that’s not the point. The point is, if it’s not being used, why is it not being used? Is there an issue with access to technology? Are students not finding it engaging? Have teachers not received sufficient support in knowing how to implement it? That goes back to the idea that monitoring is not about being punitive or holding teachers accountable. It’s about making sure the supports are in place to ensure the success of whatever initiative you’ve put into place.

 

So I would want to be monitoring usage data. I’d also want to be gathering feedback: talking to teachers, getting into classrooms, observing while students are using it, talking to students, asking them, are you engaged in this? How much time do you spend on it? I really need to gather that information from stakeholders to know whether this is a tool we should continue to invest in.

 

Jon Orr: Yeah. Now you just said a bunch of things there, right? One thing you’d recommend measuring is usage data, because it can help you decide whether we need to adjust some of our techniques, or whether this particular tool was even the right choice. And that’s also

 

assuming you’ve already established that this tool makes sense because it actually gets us closer to our key results and closer to our vision of mathematics. So having said all of that, you’ve already made that choice, made the connection between this tool and the results. But then you were saying you’re talking with people, maybe gathering interviews, and I’m imagining all of these different data points. Because when we think of, how do I collect data, we think numbers, but

 

you could collect an almost infinite number of things. So when you’re making your if-then-how statement, for the how part, get clear on what success looks like: if I use this tool, what does success look like? Then clearly outline what pieces of evidence you are going to look for to support

 

the how. The how tells you whether you’re on the right track. It might be, I am going to gather feedback from this many teachers, and I’m going to use that feedback to decide whether we’re on the right path for alignment, maybe that’s the outcome we’re after. Usage data supports whether we’re on the right track for implementation. So,

 

think of the different types of data you can collect to support or disprove your original hypothesis. Because I think that’s where we get caught in the weeds, right? I could make an if-then-how, but the how has to be really, really specific, otherwise you won’t do it. You won’t do the how, because you’ll say, if this, then we’ll do that, and okay, I’ll look for these things, but

 

the end of the year is going to come and you did none of those things, and you didn’t look for anything, because you didn’t clearly outline: when I use this tool to capture this voice, I’m going to look for statements that say blank. And when I see statements that say blank, I’m gonna grab them and put them in this report over here, or add them to my document that says, yes, this was a success.

 

Yvette Lehman: Well, you just made a really good point. So let’s imagine you get to the end of the year and you have your key results, the measures of your big overarching objectives. Let’s go back to the fluency one, and let’s say our key result for fluency is that we wanted to see actual improvement in student achievement, and we’re monitoring that through a pre-post assessment. And imagine the growth isn’t as significant as we hoped it would be.

 

There are always factors that contribute to the changes we see over time. So we go to report on the work we did this year, and maybe our key results aren’t as dramatic as we were hoping, or the change isn’t significant. But then you can go back and say,

 

look at all of the things that we did, that we accomplished, and we know they had an impact. Maybe we just need more time, more investment; we need to double down. These are the things we know had the intended impact, and we want to continue them through next year, because we can see that they’re effective. We know this is what teachers are saying about it, this is what students are saying about it, this is how we know the outcome is positive. We just need

 

more of it. And now we’re talking about frequency, duration, all of those things we apply to intervention cycles. So at the end of the year, you’re going to gather your key results, and they’ll tell a story, they’ll tell one story, but you also want to have evidence to, I don’t even want to use the word justify, but provide, I guess evidence is the best word, to say that the action steps we took

 

helped us get closer. Even if we didn’t maybe see the impact we were hoping for in this school year, we know that they’re working and we want to continue to invest in them.

 

Jon Orr: Yeah, this is a tool that we use with our districts; we call it our progress monitoring tool. It outlines the objectives, the key results, the action items, and then the evidence collection for those action items. So for each action item: what evidence am I looking for to say that this particular action had the intended outcome, to support all of these things? It’s one document that tracks

 

the work you’re doing. So by the end of the year, if you’ve used your document well, if you’ve tracked and captured the evidence as you went through the year, your one document really is what you’re describing. You’ve got this collected evidence that tells you whether you got closer, or maybe you didn’t get closer. And it’s nice to see that, because at the end of the year, normally,

 

you get to the end of the year like, I don’t know if we had any impact on the things we set out to change this year, because we just looked at student data and never actually tracked and monitored progress on any of the things we did to get closer to the changes we wanted to see. And it could be as simple as grabbing a Google Doc or a piece of paper and saying, I’m gonna look for these pieces of evidence on this action item.

 

And when I see them, I’m just going to write them down. Then I know what I’m looking for to confirm that that action item hit and did what it was supposed to do.

 

Yvette Lehman: I think that’s a great summary, Jon. Again, thinking back to our classroom practice, we can see this very clear parallel between district improvement assessment, let’s call it, or monitoring, and this idea that we have to engage in ongoing assessment to drive our professional development plan. So maybe having 20 action items is not the way to go. Maybe it’s five

 

action items throughout the year, with a very, very clear plan for how we’re going to gather evidence that these action items had the intended impact. And we gave a few examples today. That could be exit tickets from educators who participated, something really quick, their big takeaway from that session. It could be interviews, and it doesn’t need to be an interview with every single teacher; let’s do a sampling of educators and quickly interview them. It could be…

 

usage data, like we discussed in our example with the digital tool. There’s a variety of ways. It could be a walkthrough. It could be, yeah.

 

Jon Orr: Yeah, a look-fors list, kind of like a rubric for the implementation you’ve been looking for, right? So you have that look-fors list that says, hey, we’re looking for these things, we’re not looking for these things, and it’s a way to document. One of our districts is using that particular tool in their coaching cycles. So, how will we know the coaching cycles are working? We’re going to use our implementation look-fors rubric

 

while we work with teachers on a regular basis to keep pointing them towards: these are the moves we’re trying to make, these are the understandings we’re trying to create with our students, where do you see yourself on this level of implementation? You’re using it as a teaching tool, a reminder tool, a feedback tool, and now also as a reporting tool to say, yep, we’re seeing growth in our coaching cycles with the teachers we’re coaching

 

in these areas that are actually tied to our key results, tied to our objectives, tied to our mathematics vision. It’s this documentation throughout the year. And I think you’ve said this a few times today: go with less, right? We had another podcast episode saying we need to focus on less. And when we start to focus on less, we wonder, what am I supposed to do? Okay, if I only make three or four action items this year,

 

what do I do with my time in the middle? I think, one, it takes a lot of time and effort and planning and thought to create this system. And it’s worthwhile, because you know that everything’s aligned, you know you’ll have data by the end of the year to support the work you’re doing, and it will show impact. Because when you measure things, they get done. And what happens is, and I heard this great quote, I can’t remember from who,

 

and I think I’ve said it here on the podcast before: patience is figuring out what to do while you’re waiting. And this is where I think we run into trouble: we’re not actually patient. We might set these things up and go, we’re going to do this, this, and this, and now I just need to support those things, and that’s it. But what we actually do is, okay, I set it up, it’s out there, and then you don’t work to support them or monitor them or measure them or

 

get into the weeds and get your hands dirty to really support them. So then you start finding other things to do. And when you start finding other things to do, because we’re impatient, you add another list of action items that you never monitor and never measure. And then you did stuff just to fill time throughout the year. You get to the end, and it’s like, well, we did some of these things, and I didn’t measure or create impact on those things, and we just spent way too much time on all the things. And when we spend too much time on all the things, we spend

 

no time on anything. So shorten your list, measure the things that matter, and work on those things that matter. Dedicate your time. You might have to get in there and do some of the coaching yourself, because that’s the move you think is the high-yield strategy.

 

Yvette Lehman: Yeah, I had a district leader the other day say to me, you know, I’ve been focusing on the work and not on the impact. And I think that’s a really powerful statement. We can always find more things to do; there’s never a shortage of things we could be doing. But we need to be really accountable

 

in ensuring that the things we are doing are actually helping us get closer to those goals, that they’re responsive to our audience, that we’re honoring our stakeholders, their voice and their experience, and that we’re not making decisions without really gathering that evidence. So, thinking about your own district improvement plan: are there ways for you to strengthen your monitoring? Just like in the classroom, are you asking yourself, did this hit the mark?

 

Am I getting closer to my goal through the actions I’m taking? And if not, what changes might we need to make to ensure that by the end of the year we are getting closer to those changes we want to see, measured through those overarching key results connected to our main objectives for math improvement?

 

Jon Orr: Wouldn’t it be wonderful if you got to the end of the day and thought, the things I did today actually had impact on the areas I want to impact, and I know that they did. Otherwise you’re getting to the end of the day going, I did stuff today, but was it impactful? Was it spent on the impactful things?

 

Yvette Lehman: And I know, I know they do.

Alright, so listeners, answer the question: how will you know? That's the big one today. In everything you do, how will you know that the choices you made, the work you did, the PD you led, the coaching you engaged in, had the impact you were hoping it would?

Your Customized Improvement Plan For Your Math Classroom.
Take the 12 minute assessment and you'll get a free, customized plan to shape and grow the 6 parts of any strong mathematics classroom program.
Take The Free Assessment
District leader/math coach? Take the District Assessment

Thanks For Listening

To help out the show:


The Making Math Moments That Matter Podcast with Kyle Pearce & Jon Orr
Weekly interviews, strategy, and advice for building a math classroom that you wish you were in.

DOWNLOAD THE 3 ACT MATH TASK TIP SHEET SO THEY RUN WITHOUT A HITCH!

Download the 2-page printable 3 Act Math Tip Sheet to ensure that you have the best start to your journey using 3 Act Math Tasks to spark curiosity and fuel sense making in your math classroom!

3 Act Math Tip Sheet

LESSONS TO MAKE MATH MOMENTS

Each Make Math Moments Problem Based Lesson consists of a Teacher Guide to lead you step-by-step through the planning process to ensure your lesson runs without a hitch!

Each Teacher Guide consists of:

  • Intentionality of the lesson;
  • A step-by-step walk through of each phase of the lesson;
  • Visuals, animations, and videos unpacking big ideas, strategies, and models we intend to emerge during the lesson;
  • Sample student approaches to assist in anticipating what your students might do;
  • Resources and downloads including Keynote, Powerpoint, Media Files, and Teacher Guide printable PDF; and,
  • Much more!

Each Make Math Moments Problem Based Lesson begins with a story, visual, video, or other method to Spark Curiosity through context.

Students will often Notice and Wonder before making an estimate to draw them in and invest in the problem.

After student voice has been heard and acknowledged, we will set students off on a Productive Struggle via a prompt related to the Spark context.

These prompts are given each lesson with the following conditions:

  • No calculators are to be used; and,
  • Students are to focus on how they can convince their math community that their solution is valid.

Students are left to engage in a productive struggle as the facilitator circulates to observe and engage in conversation as a means of assessing formatively.

The facilitator is instructed through the Teacher Guide on what specific strategies and models could be used to make connections and consolidate the learning from the lesson.

Oftentimes, animations and walk-through videos are provided in the Teacher Guide to assist with planning and delivering the consolidation.

A review image, video, or animation is provided as a conclusion to the task from the lesson.

While this might feel like a natural ending to the context students have been exploring, it is just the beginning as we look to leverage this context via extensions and additional lessons to dig deeper.

At the end of each lesson, consolidation prompts and/or extensions are crafted for students to purposefully practice and demonstrate their current understanding. 

Facilitators are encouraged to collect these consolidation prompts as a means to engage in the assessment process and inform next moves for instruction.

In multi-day units of study, Math Talks are crafted to help build on the thinking from the previous day and build towards the next step in the developmental progression of the concept(s) we are exploring.

Each Math Talk is constructed as a string of related problems that build with intentionality to emerge specific big ideas, strategies, and mathematical models. 

Make Math Moments Problem Based Lessons and Day 1 Teacher Guides are openly available for you to leverage and use with your students without becoming a Make Math Moments Academy Member.

Use our OPEN ACCESS multi-day problem based units!

MMM Unit - Snack Time Fractions Unit

SNACK TIME!

Partitive Division Resulting in a Fraction

Shot Put Multi Day Problem Based Unit - Algebraic Substitution

SHOT PUT

Equivalence and Algebraic Substitution

Wooly Worm Race - Representing and Adding Fractions

WOOLY WORM RACE

Fractions and Metric Units

 

Scavenger Hunt - Data Management and Finding The Mean

SCAVENGER HUNT

Represent Categorical Data & Explore Mean

Downloadable resources including blackline masters, handouts, printable Tip Sheets, slide shows, and media files do require a Make Math Moments Academy Membership.

ONLINE WORKSHOP REGISTRATION

Pedagogically aligned for teachers of Kindergarten through Grade 12, with content-specific examples from Grades 3 through 10.

In our self-paced, 12-week Online Workshop, you'll learn how to craft new lessons and transform your current ones to Spark Curiosity, Fuel Sense Making, and Ignite Your Teacher Moves to promote resilient problem solvers.