Mathematics Dynamic Assessment
Purpose
The purpose of the MDA is to evaluate what your students understand about the targeted mathematical concepts you teach, what misconceptions they hold about a particular concept, and at what level of understanding (concrete, representational, or abstract; receptive or expressive) they are working.
What is it?
Mathematics Dynamic Assessment (MDA) is a reliable process that provides teachers with in-depth information about their students’ mathematical understandings, that can be practically completed in a classroom context, and that is flexible enough to be used with any mathematics curriculum. The MDA integrates four research-supported effective assessment approaches in mathematics: 1) determining student interests for the purpose of embedding instruction in meaningful and authentic contexts; 2) CRA (concrete–representational–abstract) assessment; 3) error pattern analysis; and 4) flexible mathematics interviews.
What are the critical elements of this strategy?
Student Interests – The Mathematics Student Interest Inventory
The Mathematics Student Interest Inventory has two steps and would most likely be used at the beginning of the school year, and perhaps again during the year if your class experiences several new student arrivals.
Step 1: Ask students to describe the kinds of things they do in various situations (e.g., at home, with friends, in church/synagogue/temple, in their neighborhood/apartment complex). Each student in the class writes this information or dictates it to a teacher or peer using the Individual Student Interest form.
Step 2: Review all student responses and select those interests and experiences that best represent all of the students in the class. List interests and experiences on the Class Interest form.
• In column one, include the selected interests.
• In column two, identify the specific mathematics concepts or objectives you teach that relate to those interests.
• Think about what type of authentic context can be created for a given mathematics concept (second column) and the related student interests/experiences (first column), and briefly describe how it could be implemented in your classroom.
This tool can help you develop a powerful database for creating mathematics learning experiences that are embedded in interesting and meaningful contexts for your students.
How Do I Implement the Strategy?
A teacher can implement an MDA that integrates CRA assessment, error pattern analysis, and flexible mathematics interviews by following these steps:
1. Identify the mathematics concept you want to assess.
Example: Comparing Fractions with Like and Unlike Denominators
2. Select a relevant authentic context (from Mathematics Student Interest Inventory) to which you can relate assessment items.
Example: The hometown college football team playing a game against their big rival.
3. Introduce the authentic context to which assessment items will relate (the selected authentic context can then be used the following day when instruction begins, thereby providing a link for students from the previous day’s assessment).
Example:
During the second half of the Florida – Florida State football game on Saturday, the Gators began to move the ball both on the ground and in the air. In the fourth quarter, the Gators gained 5/8 of the football field while the Seminoles gained 2/3 of the football field. The TV announcer said that Florida really outgained the Seminoles during the quarter.
Example: Story Problem Written on Dry-Erase Board
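As a worked check of this key problem (an illustration added here, not part of the original assessment), the two gains can be compared by rewriting both fractions with the common denominator 24:

```latex
\frac{5}{8} = \frac{15}{24}, \qquad \frac{2}{3} = \frac{16}{24},
\qquad \text{and since } \frac{15}{24} < \frac{16}{24}, \quad \frac{5}{8} < \frac{2}{3}.
```

So the Seminoles actually gained the larger share of the field, and whether students can evaluate the announcer’s claim reveals their understanding of comparing fractions with unlike denominators.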
4. Develop three to five receptive-level assessment tasks and three to five expressive-level assessment tasks at each of the concrete, representational, and abstract levels of understanding, all incorporating a relevant authentic context. At the receptive (recognition) level, students are given a response task and several choices and select the choice that best responds to the task (e.g., a student is shown three concrete examples of different fractions and is asked to identify the example that represents 3/5). At the expressive level, the student performs the task without any prompts or choices given (e.g., the student represents 3/5 using concrete materials).
• A key problem (e.g., a word or story problem) can be developed that incorporates the identified authentic context; the assessment items then relate directly to it.
• A method for students to record their responses is determined (e.g., a response sheet where students write (abstract level) or draw (representational level) their answers; a digital camera for students to take pictures of concrete representations; etc.).
5. The teacher structures three centers: a concrete center, a representational center, and an abstract center. Each center contains the appropriate number of response sheets and the necessary materials. A relevant independent learning activity is made available for students when they are not working at one of the three assessment centers (e.g., an instructional game or self-correcting material).
6. Introduce the purpose of the MDA activity and provide directions to students.
7. Review the Key Problem (authentic context) with the whole class and tell students that the problems at each center relate directly to this Key Problem.
8. The teacher briefly models how to complete the tasks at each center.
9. Students progress through the centers in three different groups.
• Grouping should be random (i.e., do not group students based on skill level) so that students with learning problems do not feel stigmatized.
• Stagger when each group starts.
• Students start at the abstract center, move to the representational center, then to the concrete center, progressing at their own pace as space allows at the next center. This progression is suggested so that a more accurate picture of students’ abstract-level understanding can be obtained; that is, abstract responses are not biased by working at the concrete and representational levels first. For example, after seeing and manipulating fraction pieces to compare the fractions 1/8 and 1/3, students may remember the visual comparison when confronted with 1/8 and 1/3 at the abstract center, given the short time interval between the concrete and abstract centers.
• Students record their responses (e.g., on response sheets or with a digital camera) and place their response sheets in a designated place after each center.
• Students work at the independent activity when not working at the assessment centers (e.g., a maintenance activity where they practice something with which they are already familiar).
• The teacher monitors student activity, probes students as needed, and notes significant misconceptions, ideas, errors, etc. The teacher can record notes about specific students on a pad of paper, a PDA, or by another method. This information can be used later to inform flexible interviews and whole-class instruction.
10. Conduct brief flexible interviews with particular students as they respond, as determined by your observations.
11. Examine students’ work, noting:
• common error patterns;
• whether students are at the mastery, instructional, or frustration level for each level of understanding (concrete, representational, abstract);
• whether student responses reflect receptive or expressive understanding.
Additional flexible interviews with particular students can be conducted during the following instructional period if needed.
12. Develop an instructional hypothesis that explains your students’ current understandings and guides instructional planning. An instructional hypothesis has four simple components: 1) context; 2) what students can do/understand; 3) what students can’t do/understand; and 4) reason.
Example: Comparing fractions
Context: Given two fractions,
What students can do: students can determine >, <, = with like denominators at the concrete, representational, and abstract levels;
What students can’t do: students can’t do so with unlike denominators at all three levels;
Reason: because they lack conceptual understanding of the area that fractions represent (proportionality).
Additional Information
Research Support for the Instructional Features of this Strategy: Bryant (1996); Carpenter, Fennema, Franke, Levi, & Empson (1999); Gersten (1998); Ginsburg (1987); Howell, Fox, & Morehead (1993); Kamii (2000); Kennedy & Tipps (1994); Liedtke (1988); Mercer & Mercer (2005); Schumm et al. (1995); Wehmeyer, Palmer, & Agran (1998); Van de Walle (2005); Zigmond, Vallecorsa, & Silverman (1981)
Videos
Clip 1: Teacher introduces the Mathematics Dynamic Assessment
Clip 2: Teacher describes the student interest and CRA assessment components
Clip 3: Implementing the Mathematics Dynamic Assessment
Clip 4: What was learned about students and how it informs teaching
