Tag Archives: assessment

Is a careless error really careless?

In a 1982 study by John Clement, 150 freshman engineering students were given the following problem:

Write an equation using the variables S and P to represent the following statement:

“There are six times as many students as professors at this university.”

Use S for the number of students and P for the number of professors.

37% of the students answered incorrectly and wrote: 6S=P.

Why did they make that error?   Take some time and see if you can figure out why.

At first, Clement thought the mistake was due to carelessness. These were engineering students and this was a simple algebra problem.  They must have just been careless, right?

Clement wanted to check his hypothesis.  Using clinical interviews, he asked students to think aloud as they solved the problem.  At first, he thought he was correct in assuming it was a careless error: it appeared that students were using what he called a “word-order-matching approach.”  They incorrectly mapped the order of the key words in the problem directly to the order of the symbols in the equation.

However, as he continued his analysis of the interviews, he realized there was another explanation.   Students who incorrectly answered the question were doing something that made sense to them.   Their intuition was to place the multiplier next to the letter associated with the larger group.   Although incorrect, it was meaningful to students.

The correct answer, S=6P, was not meaningful to them.  It did not describe the situation in the problem.  Professors are not multiplying.  It did not make sense.

Yet it makes sense to us and to the students who could answer it correctly.   Clement explains that we understand it for two reasons:

1. We see the variables as representing numbers rather than objects

2. We are able to invent a hypothetical operation on the variables that creates an equivalence.  We know the equation represents what would happen if you made the group of professors six times larger than it really is.  Then the number of professors would be equal to the number of students.
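Clement’s hypothetical operation can also be checked numerically.  As a quick illustration (my own sketch, not part of the original study, using a made-up campus of 100 professors), here is the contrast between the correct equation and the common error:

```python
# Hypothetical campus with the 6:1 ratio from the problem statement.
P = 100      # number of professors
S = 6 * P    # six times as many students -> 600 students

# Correct equation, S = 6P: the letters stand for NUMBERS, and
# multiplying the professor count by 6 creates the equivalence.
print(S == 6 * P)   # True: 600 == 6 * 100

# Common wrong answer, 6S = P: it matches the word order of the
# sentence ("six ... students ... professors") but fails numerically.
print(6 * S == P)   # False: 3600 != 100
```

Plugging in numbers like this is exactly the kind of check the incorrect equation cannot survive, which is one way to see that the variables represent quantities rather than objects.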

The problem doesn’t seem so simple now, does it?   If we attribute students’ mistakes on it to a careless error, we won’t diagnose the problem correctly, and if we don’t diagnose the problem correctly, we have little chance of figuring out how to remediate it.

This happens a lot in classrooms.  We see a student make what looks like a careless error and we tell him to check his work assuming he just didn’t think carefully.  Then the student changes the answer because “check your work” is teacher code for “you got the answer wrong.”  Sometimes the error is a case of not being careful.  Sometimes, like here, it’s not.

Instead of assuming it’s a careless mistake, assume that the incorrect answer makes sense to the student and try to figure out why.  Then try to figure out why the correct answer makes sense to you.  The contrast may help uncover the understanding you have that they still need to develop.

Read more here:

Clement, J.  (1982).  Algebra word problem solutions:  Thought processes underlying a common misconception.  Journal for Research in Mathematics Education, 13(1), 16-30.

Clement, J. (2000) Analysis of clinical interviews: Foundations and model viability.  In Lesh, R. and Kelly, A. (Eds.), Handbook of research methodologies for science and mathematics education (pp. 341-385).  Hillsdale, NJ:  Lawrence Erlbaum.

Data data everywhere…

During school visits, I rarely make it five minutes without hearing about “the data.”  I am shown big binders containing spreadsheets of data.  I sit in data meetings where teachers furiously highlight lines in spreadsheets and calculate percentages.  Principals tell me how data are being used to evaluate students and teachers.

Some of this is great.  I’m all for assessing children regularly, being clear on what they know and then using the results to decide what to do next.  I would be willing to bet that good teachers have been doing this for years without spreadsheets and binders and meetings.

I also think it’s good to systematize this school-wide and to learn good practices for looking at student work and how to use data to analyze and improve your teaching.  Sadie Estrella is doing some great work on helping schools do this.

However, I think there is a VERY important piece that often gets lost in the rush to analyze data and use results to plan instruction.

Data are only as good as the assessments used to gather them.

That’s one of the most important things I’ve taken away from being a researcher.  As researchers, we can spend just as much time designing the tools to gather our data as we do analyzing them.   If the wrong tool is used, it completely changes the conclusions we can make.

A lot of data I see collected in schools are from multiple choice tests that are supposed to reveal who mastered a standard and who didn’t.

There are two potential problems with this.  One is that there is an assumption that choosing the right answer means a student understands the standard.  That may not be true.  Did the student answer the question correctly by guessing?  Did the student answer the question incorrectly because he or she didn’t understand what the question was asking?

The second problem is that knowing how many students in your class answered a question incorrectly is important, but knowing why they didn’t answer it correctly is more important.   It’s impossible to tell from an Excel spreadsheet why some students are struggling and others aren’t.

So what tools might help us figure out why students are struggling?

In my line of research, we do a lot of clinical interviews, where we ask students questions as they are solving a problem.  I find data collected that way far more useful in figuring out why students are struggling.   I’ll talk more about clinical interviews and how to use them in your class next time, but if you want to know more now check out the resources below.

It’s important to collect data on our students and analyze what we collect, but we need to spend just as much time thinking carefully about the tools we use and what kind of information we are gathering.

Buschman, L. (2001). Using student interviews to guide classroom instruction. Teaching Children Mathematics, 8(4), 222-227.

Ginsburg, H. P., Jacobs, S. F., & Lopez, L. S. (1998). The teacher’s guide to flexible interviewing in the classroom: Learning what children know about math. Boston, MA: Allyn and Bacon.

Assessing what students really understand about fractions

Do the following tasks look familiar?

What fraction of the circle is shaded?

[Image: a circle with a fraction of it shaded]

Shade in 2/3 of the rectangle below.

[Image: a rectangle partitioned into thirds]

Are most of your students successful at these types of tasks?  I would be willing to bet that they are after trying a couple of them.

However, I’d also guess that once you move on to more complicated concepts like comparing fractions or adding fractions, suddenly students are confused.

Why?

I think there is a misconception that if students can name a fraction in a picture and shade in a fractional amount in a picture, then they understand what a fraction is.  But being able to do these tasks does not necessarily mean they have a strong conceptual understanding of what a fraction is.  And if they don’t have a strong foundation, topics like comparing or adding fractions are incredibly difficult.

Why can students do the tasks above yet not really understand fractions?

1.   The tasks do not require students to pay attention to the equal size of the parts because they are already equally partitioned for the student.  Students didn’t create the pieces themselves.

When asked to shade 2/3 of a bar that is already partitioned into 3, students can do this just by coloring in 2 boxes.  They do not need to pay attention to the fact that the boxes are equal.

In fact, if given unevenly sized boxes, students will often shade in 2 and say that it is 2/3 because 2 are shaded and there are 3 boxes in total.

2.  Young children will often focus on the shape of the pieces being the same and not on the size of the pieces being the same.    Those students then have problems identifying the fraction when the pieces are equal in size but different in shape.

Van de Walle offers a great assessment problem to see what your students really understand about fractions.

Which of the shapes below are correctly partitioned into fourths? Why?  Which are not correctly partitioned into fourths?  Why?

[Image: a set of shapes labeled a through g, each partitioned into four pieces in different ways]

Items b and c will help assess if students are focused on the number of pieces and not the size, while e and g will help assess if students are focused on the shape and not the size.

How can you avoid these misconceptions?  What tasks might help develop a strong understanding of what a fraction is?

I’ll share my ideas about that in my next post, but I’d love to hear what you think.

Want to know more?  Read Chapter 15 of  Van de Walle, J. A., Karp, K. S., & Bay-Williams, J. M. (2007). Elementary and middle school mathematics: Teaching developmentally.

Key words aren’t the key to understanding math

Students in NYC recently finished taking the state math test. As a result, I’ve spent the past couple of weeks watching a lot of test prep going on in classrooms. One of the strategies I see being used over and over is the use of key words.

I admit that I was guilty of using this strategy when I started teaching. I was puzzled when my students could flawlessly perform computations but struggled with word problems. Teaching them to look for key words seemed like an easy fix to this. I had charts in my room listing all the key words and their corresponding operations. Yet, the key words didn’t seem to help them.

So why don’t key words work?

It’s because they don’t allow students to use what they already know to make sense of a situation.

The research backs this up. Drake and Barlow (2007) gave a student the problem below.

There are 3 boxes of chicken nuggets on the table. Each box contains 6 chicken nuggets. How many chicken nuggets are there in all?

Guess what a student who looked for key words answered? 9 chicken nuggets. The student saw the words: “in all” as a signal to add 6 and 3. I would bet that the student could have made sense of this situation and arrived at the correct answer if he drew a picture or reasoned about it. However, using key words led him to an incorrect answer.
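To make the contrast concrete, here is a small sketch (my own illustration, not from Drake and Barlow) of the two readings of the problem:

```python
boxes = 3
nuggets_per_box = 6

# Key-word shortcut: "in all" is read as a signal to ADD the
# two numbers that appear in the problem.
keyword_answer = boxes + nuggets_per_box
print(keyword_answer)       # 9, the student's incorrect answer

# Sense-making: the situation is 3 equal groups of 6, so the
# total calls for multiplication.
sense_making_answer = boxes * nuggets_per_box
print(sense_making_answer)  # 18, the correct total
```

The shortcut and the situation point to different operations; only reasoning about the equal groups reveals which one fits.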

Key words encourage students to take a shortcut instead of making sense of a situation.  If students think about what makes sense, they don’t need shortcuts or key words.  They don’t need to worry about what happens when there aren’t any key words or when there are multiple key words in a story.

If we believe that doing mathematics should have meaning for students and make sense to them, then teaching key words doesn’t support those goals. Teaching students to reason about a situation and know why they are performing an operation does.

Have you used key words with your students? What was your experience?

Drake, J. M., & Barlow, A. T. (2007). Assessing Students’ Levels of Understanding Multiplication through Problem Writing. Teaching Children Mathematics, 14(5), 272-277.